Self-Driving Cars and a Moral Dilemma
Posted 6/18/15

Morbidhanson wrote:
If the road is being shared by self-driving vehicles and human-operated vehicles, I personally believe the self-driving vehicles should be programmed to protect their passengers. When everything is automatic, however, and no more human-operated vehicles are on the road, that's when I'm not sure what the CPU should be programmed to do in case there is a lose-lose emergency situation.


Honestly, having all the cars automated is better than having half and half.
The easiest way to prevent these moral decisions is to have the cars all talk to each other.
That way, in an emergency, they all react (avoid/slow down), thus preventing most accidents.
As an engineer, you need to prevent the random events before you worry about the moral decisions.
But for the moral decisions, I think the simplest answer is usually best, and that would be self-survival, because that is human nature.
That way we are all equal, instead of a 2-person car counting for more than a 1-person car, or those 4 McDonald's workers counting for less than 1 Nobel Peace Prize winner.
Of course, you could give school buses extra privilege in survival.

*And I would have cars maintain the given path (straight) in the event of a tire blowout or ice, because that's the easiest way to program them. As long as the car has some control of the steering, it could maintain its initial path, and it would tell the other cars to keep their distance. Thus avoiding the moral decision altogether.
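To make that concrete, here's a rough Python sketch of the blowout rule: hold the pre-failure heading, bleed off speed gently, and broadcast a keep-distance warning to nearby cars. Every name here (Vehicle, V2V_PORT, the message format) is made up for illustration; real vehicle-to-vehicle stacks are far more involved.

```python
# Hypothetical sketch of the "hold the path and warn the other cars" rule
# described above. All names are invented for illustration; no real
# autonomous-driving API is implied.

import json
import socket

V2V_PORT = 47000  # assumed UDP port for vehicle-to-vehicle broadcasts


class Vehicle:
    def __init__(self, vehicle_id, heading_deg, speed_mph):
        self.vehicle_id = vehicle_id
        self.heading_deg = heading_deg  # current compass heading
        self.speed_mph = speed_mph
        self.target_heading = heading_deg

    def handle_blowout(self):
        """On a tire blowout: hold the pre-failure heading, slow down
        gradually, and tell nearby cars to keep their distance."""
        # 1. Lock steering to the heading we had before the failure,
        #    rather than swerving: the simplest, most predictable rule.
        self.target_heading = self.heading_deg

        # 2. Decelerate gently; hard braking on a blown tire can spin the car.
        self.speed_mph = max(0, self.speed_mph - 5)

        # 3. Broadcast a keep-distance warning to surrounding vehicles.
        msg = json.dumps({
            "type": "KEEP_DISTANCE",
            "from": self.vehicle_id,
            "heading": self.target_heading,
        })
        with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
            sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
            sock.sendto(msg.encode(), ("255.255.255.255", V2V_PORT))
```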
Posted 6/19/15
It would cut drunk-driving deaths down to about zero. I don't see a moral dilemma outside the simple acceptance of the technology. What is really needed in the US is an effective mass transit system.
Posted 6/19/15

Morbidhanson wrote:

http://www.sciencedaily.com/releases/2015/06/150615124719.htm

"How will a Google car, or an ultra-safe Volvo, be programmed to handle a no-win situation -- a blown tire, perhaps -- where it must choose between swerving into oncoming traffic or steering directly into a retaining wall? The computers will certainly be fast enough to make a reasoned judgment within milliseconds. They would have time to scan the cars ahead and identify the one most likely to survive a collision, for example, or the one with the most other humans inside. But should they be programmed to make the decision that is best for their owners? Or the choice that does the least harm -- even if that means choosing to slam into a retaining wall to avoid hitting an oncoming school bus? Who will make that call, and how will they decide?"


Whichever impact would be the slowest, with preference given to the barrier. However, in the case of a blown tire, even a computer likely won't be able to do much. Steering response would be all wonky even if it were a rear tire. So even if the computer chose the wall and made a valiant effort for it, the car might end up in traffic anyway.

Now, for the contrived trolley dilemma... although this one is a bit different in that you're the one who gets killed.



The day when autonomous cars, all on full autopilot, take us everywhere may not be too far in the future. Here's an interesting example of the problem mentioned here.

You are in one of these self-driving vehicles and the traffic is flowing at about 70 MPH. Suddenly, a large group of stupid pedestrians is in front of you. The car is going too fast to stop, so its only option is to swerve into a barrier that may be destroyed, sending you down a dangerous slope. However, if it does this, you will almost certainly be critically injured or killed. What should the programmers of the CPU have it do in this situation? Try to stop and probably kill a few pedestrians? Or run into the barrier and only kill you?


I think the only choice for the computer is to try to stop. If the meatbag in the car wants to do otherwise, he can just override, but the car should protect itself and its occupant first. It's possible the stupids might survive if the car can slow down enough. Car makers these days have pedestrian safety as one of their design considerations, after all, and they get rated on it. A self-driving car might even be held to stricter standards.

Also, you would be there to render aid to any wounded. Meanwhile, there would be no one there to render aid to you if the car jumped off the cliff.
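As a toy illustration of that priority order (human override first, then brake as hard as needed), here's a short Python sketch. The action names and the stopping-distance comparison are my own inventions, not anything from a real car's software.

```python
# Toy sketch of the "brake first, let the human override" policy argued
# for above. Action names and inputs are invented for illustration.

def choose_action(obstacle_ahead, stopping_distance_m, distance_to_obstacle_m,
                  human_override=None):
    """Return the maneuver the car should attempt.

    Policy: always try to stop; never voluntarily leave the roadway.
    A human override, if present, wins.
    """
    if human_override is not None:
        return human_override  # the meatbag gets the final say

    if obstacle_ahead and distance_to_obstacle_m < stopping_distance_m:
        # Can't stop in time: brake as hard as possible anyway, since
        # every MPH shed improves the pedestrians' survival odds.
        return "MAX_BRAKE"
    if obstacle_ahead:
        return "CONTROLLED_STOP"
    return "CONTINUE"


# Example: obstacle 40 m ahead, but 55 m needed to stop -> brake hard.
print(choose_action(True, 55.0, 40.0))  # MAX_BRAKE
```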
Posted 6/19/15
I think self-driving autos' knowledge of reality should be limited to roads. The larger the scope of reality an AI has, the more difficult and complicated it becomes to make it do what humans want. So limiting the cars to operate within narrower parameters makes them more predictable and probably safer in general. There is no way that it is okay to run over pedestrians, but a car kept within the scope of normal road operation probably gets into fewer crashes of any type in the first place.

IMO some humans can get the best possible outcome in a very bad situation where a self-driving car would fail. But in many cases a self-driving car would prevent the bad situations most humans get into through poor driving, speeding, or inattention. Hypothetically, of course. I don't know how safe they are at the moment.
Posted 6/19/15
Self-driving cars aren't exactly new tech. Self-driving trucks were demoed in the early 90s. They have come a long way, but they still have problems; driving in heavy rain, for example, really affects the optics and lasers that check distances and locate objects. So the solution Google's cars use is: don't use it in the rain. That is, the human drives the car when the weather bogs down the system.


The whole moral dilemma requires a lot of things to go wrong simultaneously, and they won't ever all go wrong simultaneously. It is interesting to think about in a philosophical sense, or even as a trope, but not in a real sense. The old question, "If you were at the train switch and had to choose between a bus of old ladies dying or one child," is basically the same thing, as is the button question you listed.


The next problem is that such a question also raises the question of "What would a human do in the same situation?" But it is complicated a bit. A person driving a car at 70 MPH might not react the same way every time. What the person had for breakfast, whether they are distracted, their reaction time, etc. will all factor into it. There's a very real possibility a human might hit the pedestrians AND slam into the barrier, or flip their vehicle in a knee-jerk oversteering reaction. In contrast, the self-driving vehicle will make the same decision every time, and make the decision much quicker than a human would. So as a rule, I find it unlikely a self-driving car would have it in its rules to drive off the permitted roadway or cross the center divider. I do find it likely it would be in the rules to try to stop and possibly change lanes, and all of this would likely happen before the human could react and try to override the computer.
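Here's a hedged Python sketch of that deterministic rule set, assuming a planner that is simply never offered off-road or cross-divider options; the data structures are my own invention, not any vendor's API.

```python
# Sketch of the deterministic rules described above: stop if possible,
# change lanes if one is clear, and never leave the permitted roadway.

def plan_emergency_maneuver(can_stop_in_time, adjacent_lanes):
    """adjacent_lanes: dict like {"left": "clear", "right": "occupied"}.
    Lanes across the center divider or off the roadway are simply never
    offered to the planner, so they can never be chosen."""
    if can_stop_in_time:
        return ("STOP", None)
    for lane, status in adjacent_lanes.items():
        if status == "clear":
            return ("CHANGE_LANE", lane)
    # No escape lane: brake as hard as possible in the current lane.
    return ("MAX_BRAKE", None)


# The same inputs always give the same answer, unlike a human driver.
print(plan_emergency_maneuver(False, {"left": "occupied", "right": "clear"}))
# -> ('CHANGE_LANE', 'right')
```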


That all said, there's another wrinkle in the self-driving car problem. If the computer chooses to slam you into death via concrete barrier, you'd never know. It's safer to have seats face the rear, so it is highly likely that once cars go completely self-driving, seats will face the rear.
Posted 6/19/15
Self driving cars? Yeezus Chrysler. That takes all the fun out of driving
Posted 6/19/15

IShouldBeStudying wrote:

Self driving cars? Yeezus Chrysler. That takes all the fun out of driving


I hate driving; self-driving cars would be a lifesaver for me.
Posted 6/19/15 , edited 6/19/15
I'm all for self-driving cars. It's actually pretty amazing how smoothly traffic can flow on a two-lane highway when people don't drive like nuts and stay right when not passing. Like... really amazing. I could be doing 90 comfortably in some areas (and did), even though that's speeding. But this isn't about whether self-driving cars should be accepted. The answer to that is a resounding YES, and there really isn't much debate going on over that.

It's about how a CPU will deal with a lose-lose emergency situation that involves loss of life in both options. Get creative. A lot of ridiculous things can go wrong on the road. They may be unlikely, but they are not at all impossible. It doesn't have to be the exact situation I described. Chances are that it WON'T be. What if the vehicle you are about to hit is a transport for volatile substances? Should the car not turn aside? Plenty of other hazards might be present, and we may actually be unable to imagine all of them.

And, again, the pyramids weren't built in a day. There's going to be trial and error before the tech rises to the level of sophistication necessary to truly minimize the likelihood of deadly emergencies.
Posted 6/19/15
Frankly, I am inclined to believe the AI is incapable of dealing with every possibility, much less resolving the moral quandaries present.
I imagine if it came out that Google or whoever had programmed their car to sacrifice its occupants in a fatal crash, the lawsuits would be crippling.
Posted 6/19/15
Currently, the law will ding you if you sacrifice someone to save two, even if all three would have died if you did nothing.

This may change in the future as law evolves, though.
Posted 6/19/15 , edited 6/19/15
Do you think the police or cities would allow these cars, which should be essentially ticket-less? They get a lot of revenue from fleecing people. Then again, they are like highway robbers in places. I'm sure they'll still remotely stop vehicles and arrest people and take their stuff for looking suspicious. Or just start fleecing people in the streets on any charges they can find, more than they already do.
Posted 6/19/15

Morbidhanson wrote:

Currently, the law will ding you if you sacrifice someone to save two, even if all three would have died if you did nothing.

This may change in the future as law evolves, though.


In some places the law will demand retribution if you do nothing, as well as if you do something. Also, I think Florida has some interesting moral hazards enshrined in law, like the penalty for brandishing a weapon being more severe than the penalty for actually shooting and killing someone.
Posted 6/19/15

biscuitnote wrote:

OP, it's likely these cars would have advanced sensors and would be able to anticipate such a problem before it even occurs, or the braking system would be state of the art. Honestly, I believe the problem you propose can be easily remedied.


I will tell you these sensor systems are far, far, far from being foolproof. Without going into too much detail, they have issues with everything from smaller animals to people on motorcycles. I'll tell you right now, being kicked into a panic stop at interstate speeds because the system has a mental breakdown at the wrong time when there is no one around you is a great way to give people heart attacks.

Posted 6/19/15
Hmm. Interesting topic, as I'm a programmer. There's a dichotomy here. Do you save the many, or the one? The code is the code. The computer only does what it's told. I'd like to think that other programmers would have thought to institute traffic controls to prevent this scenario, but as a coder, I can't count on that. I suppose, if it came down to it, my program would sacrifice the one for the many. I'd try to implement as many safety protocols as I could: airbags/foam, speed reduction, course change if possible. It's a tough situation, but one of many that programmers face. Even simple things like road construction throw a wrench into the works. We've got a long way to go, but we are working on it.
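For illustration only, here's a minimal Python sketch of that "save the many, then stack every mitigation" logic; the option format and the mitigation names are hypothetical, not any real control system.

```python
# Minimal sketch of "sacrifice the one for the many, then pile on every
# available safety protocol." All names here are invented placeholders.

def resolve_no_win(options):
    """options: list of dicts like
    {"maneuver": str, "expected_casualties": int}.
    Pick the maneuver that harms the fewest people, then stack every
    available mitigation on top of it."""
    best = min(options, key=lambda o: o["expected_casualties"])
    mitigations = ["REDUCE_SPEED", "PRETENSION_BELTS", "DEPLOY_AIRBAGS"]
    return best["maneuver"], mitigations


options = [
    {"maneuver": "CONTINUE_STRAIGHT", "expected_casualties": 4},
    {"maneuver": "SWERVE_TO_BARRIER", "expected_casualties": 1},
]
print(resolve_no_win(options))
# -> ('SWERVE_TO_BARRIER', ['REDUCE_SPEED', 'PRETENSION_BELTS', 'DEPLOY_AIRBAGS'])
```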