Post Reply Self-Driving Cars and a Moral Dilemma
27250 cr points
27 / M
Posted 6/18/15 , edited 6/18/15
http://www.sciencedaily.com/releases/2015/06/150615124719.htm

"How will a Google car, or an ultra-safe Volvo, be programmed to handle a no-win situation -- a blown tire, perhaps -- where it must choose between swerving into oncoming traffic or steering directly into a retaining wall? The computers will certainly be fast enough to make a reasoned judgment within milliseconds. They would have time to scan the cars ahead and identify the one most likely to survive a collision, for example, or the one with the most other humans inside. But should they be programmed to make the decision that is best for their owners? Or the choice that does the least harm -- even if that means choosing to slam into a retaining wall to avoid hitting an oncoming school bus? Who will make that call, and how will they decide?"

The day when autonomous cars on full autopilot take us everywhere may not be too far in the future. Here's an interesting example of the problem described in the article.

You are in one of these self-driving vehicles and traffic is flowing at about 70 MPH. Suddenly, a large group of stupid pedestrians appears in front of you. The car is going too fast to stop, so its only option is to swerve into a barrier that may be destroyed, sending you down a dangerous slope. However, if it does this, you will almost certainly be critically injured or killed. What should the programmers have the CPU do in this situation? Try to stop, probably killing a few pedestrians? Or run into the barrier and kill only you?
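For a sense of scale on "going too fast to stop," here's a rough back-of-the-envelope sketch (the function name and the ~0.9 g deceleration figure are illustrative assumptions, not from any real system):

```python
# Rough stopping distance for a car braking from highway speed.
# Assumes constant deceleration on dry pavement; real-world
# distances are longer once reaction time and road conditions count.

def stopping_distance_m(speed_mph: float, decel_ms2: float = 8.8) -> float:
    """Distance in meters to brake from speed_mph to a full stop."""
    speed_ms = speed_mph * 0.44704  # mph -> m/s
    return speed_ms ** 2 / (2 * decel_ms2)

print(round(stopping_distance_m(70), 1))  # ~55.6 m
```

So even with essentially perfect braking, a car at 70 MPH needs over 50 meters of clear road to stop — pedestrians appearing closer than that leave only the swerve-or-hit choice.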


This reminds me of a similar discussion I was having in a law class. I suppose these are just different variants of the classic Trolley Problem.

If you push a button, you will kill a person. However, if you leave it alone, two people will die. What do you do?

Now, consider if that one person is a genius or a person of great influence. What if the two people are mentally-ill and disabled? What if they are homeless old people? What if they are terrible criminals?


It seems that these are difficult decisions even for people. Will computers be able to handle it? Should they be allowed to?

Edit: Since people seem to like focusing on unimportant details, I've changed the speed to 70. And these are hypothetical questions, the essence of which is the classic Trolley Problem in philosophy, in case I was being unclear. Please answer instead of trying to change the scenario lol
11622 cr points
40 / M / USA
Posted 6/18/15
Supposedly Google can do it already. I know they've been doing this shit for years now.
43982 cr points
37 / M / USA
Posted 6/18/15
If traffic is going 90 mph, which I don't think would be permitted to happen, there had better not be any pedestrians allowed anywhere near that street. There would have to be physical barriers to prevent it. But again, I don't think traffic would be allowed to run at that speed.
2047 cr points
24 / M / USA
Posted 6/18/15
If you're going 90, you're most likely on a highway and not a city street, so it is highly unlikely that this scenario would occur.
2047 cr points
24 / M / USA
Posted 6/18/15
OP, it's likely these cars would have advanced sensors and would be able to anticipate such a problem before it even occurs, or the braking system would be state of the art. Honestly, I believe the problem you propose can be easily remedied.
27250 cr points
27 / M
Posted 6/18/15 , edited 6/18/15

biscuitnote wrote:

OP, it's likely these cars would have advanced sensors and would be able to anticipate such a problem before it even occurs, or the braking system would be state of the art. Honestly, I believe the problem you propose can be easily remedied.


I'm not proposing a problem; this problem has been around for a long time and will continue to exist unless the cars are absolutely perfect. The situations don't have to be common to be possible. Even if they're rare, they are certainly foreseeable, and programmers will have to choose what the vehicle should do should such a situation arise.

1240 cr points
M
Posted 6/18/15
Hypothetically, just as if a large group of pedestrians walked onto a highway where human drivers were going 70 mph, I suspect some would swerve to avoid killing said pedestrians and some would not stop in time and run them over. Anyone smart enough to walk out onto a highway with vehicles going 70 mph deserves their fate.
2047 cr points
24 / M / USA
Posted 6/18/15

madmejis wrote:

Hypothetically, just as if a large group of pedestrians walked onto a highway where human drivers were going 70 mph, I suspect some would swerve to avoid killing said pedestrians and some would not stop in time and run them over. Anyone smart enough to walk out onto a highway with vehicles going 70 mph deserves their fate.


Natural selection!
35191 cr points
25 / M
Posted 6/18/15
Speaking as someone with a degree in computer science, self-driving cars should not be programmed with the capability to even consider deliberately driving themselves off the road, especially based on such a vague criterion as "are the obstacles in front of me humans who will die if I do not deliberately place my driver in danger?"

And just speaking as a person, I will never get behind the idea of sacrificing innocent people to save the lives of a bunch of idiots who put themselves in a dangerous situation.
501 cr points
42 / M / A Mile High
Posted 6/18/15
Since cars are engineered to best absorb impacts head on, the programming would probably default to avoid swerving in the event of an imminent impact. Collisions with animals are relatively common occurrences on the roads here, so the scenario of colliding with something/someone crossing the road would definitely have to be anticipated in the development.

Eznik
58067 cr points
34 / M / Providence, RI
Posted 6/18/15
I don't think this is exactly the trolley problem. Although the scenario is similar, this comes down to collision detection and artificial intelligence. One simple solution is to space out all the cars so that they travel at a consistent speed and have a set braking distance for situations where a tire gets blown out.

For the situation where "stupid pedestrians" are crossing a road, you are assuming the car can detect actual people crossing the road and not just physical objects. If the car could detect something crossing, I would program the car to slam on the brakes.

If the car could detect actual people then I think this becomes too much of a hypothetical situation as I do not want to speculate about the technology advances required for that.
17264 cr points
34 / F / Earth
Posted 6/18/15
As for handling a blown tire, the car would probably just slow down and pull off to the shoulder of the road. One blown tire isn't severe enough to cause a major loss of steering, and there's always the option of solid rubber tires, which never blow out; they just require a very good suspension so the ride isn't bumpy. At speeds greater than 40 mph, you're most likely not somewhere you'd find people crossing the road, but there is always a chance of wildlife running out into the road (say, a deer... that would suck to hit). As for driving around the city, anyone who chooses to cross the street somewhere other than a stop sign or a stoplight/crosswalk has knowingly made the choice to put themselves at risk by playing a real-life version of Frogger... and we all know what happens to the little green frog when he doesn't get out of the way of the car fast enough...
What I wonder most, living in a northern climate, is how would the car handle hitting a patch of black ice during the winter? Has anyone thought about that one?
2461 cr points
31 / M / Minnesota, USA
Posted 6/18/15
I love how people are freaking out about this automated vehicle thing. It's a car that takes stupid humans out of the driving part. The technology might take some maturing and some trial and error, but eventually it's going to be amazing. I'm all for it.
27250 cr points
27 / M
Posted 6/18/15 , edited 6/18/15
Yes, I understand the technology will eventually develop to the point where it's really amazing and sophisticated and there's almost no chance of problems occurring. And I also understand that there are ways to minimize the likelihood of these things happening to the point where they'll almost be a non-issue.

But what about the time before that? The pyramids weren't built in a day; there is inevitably going to be some trial and error. There's also the eternal truth that nothing is perfect, and there are things beyond our control and ability to predict. What if a sinkhole opens up, or an earthquake throws everything into disarray? I don't think anyone has actually gotten to the meat of the issue yet lol

If the road is shared by self-driving and human-operated vehicles, I personally believe the self-driving vehicles should be programmed to protect their passengers. When everything is automated, however, and no human-operated vehicles remain on the road, that's when I'm not sure what the CPU should be programmed to do in a lose-lose emergency situation.