Reading 12: We’re Cruisin.

I can see the motivation for building self-driving cars. It is all about innovation. Everyone wants to be the first to create “the next big thing.” Although many companies are working on this problem, and some have even put cars on the road, no one has perfected it yet. Driving a car can be a very dangerous thing. The first person to perfect a vehicle that can drive itself and avoid the risk of human error will be very rich. One thing I find interesting is that we do not yet know whether self-driving cars will actually be safer; we just know that human drivers cause accidents and that self-driving cars “might” eliminate or reduce them.

Aside from the possibility of being safer, another pro of self-driving cars is that they will allow the “driver” to become a passenger, giving them more time to do other things. Imagine having an hour commute to and from work every day. That means a typical 8-hour workday turns into a 10-hour day. Now imagine being able to work while in your car on the way to the office. If you could start working on your commute in and keep working on your way back home, you could cut that 10-hour day back to an 8-hour day and still get all of your work done. Another pro is that other self-driving cars could communicate with your car. Imagine knowing exactly what the car in front of you is going to do! That could help you determine which lane you want to be in to make your commute faster. Other pros include eliminating “driver distractions” and drunk driving, allowing higher speed limits, and easing heavy traffic.

Important cons of self-driving cars include cost, whether they will actually increase safety, determining who is at fault in the case of an accident, and whether the sensors and cameras will actually be effective on all roads in all conditions. People also fear losing the ability to actually drive a car. What if something malfunctions, the car must default to manual control, and the driver does not actually know how to drive? Also, some people genuinely enjoy driving cars. Probably the biggest concern with self-driving cars is the “social dilemma.” Who is at fault, and what should the car do in a life-and-death situation? Many people use the trolley problem to discuss the morality of autonomous cars. Another way to think about it: if your car is driving next to a cliff and it has the option to drive off the cliff or run into a child, what should it do? What if there is also a child in your car? How do we value one person’s life over another’s? As human drivers, our natural reaction in an accident is often to save ourselves. We usually do not have time to weigh the value of the other person’s life against our own before a collision occurs. But what does that mean for a computer? And who is liable when this happens?

I honestly do not have a great answer to this problem. Every human is different, and every human is going to have a different opinion. It is impossible to make a “perfect” self-driving car that reflects the moral behavior of every human, because we will not all agree on the answer to the trolley problem. We also will not all agree on who is liable. The software engineers are trying to “mimic” human reactions. They are just doing their jobs, and they are not the computer itself, so how can they be liable? The driver is not really in control, so how can they be liable? In the recent case where an Uber self-driving car killed a pedestrian in Arizona, police are putting the blame on the victim, saying the collision would have been difficult to avoid even if the car were not autonomous. But what if the pedestrian had not been a pedestrian and was instead another self-driving car? Then who gets the blame?

Personally, I am not interested in owning a fully self-driving car at this time. I think some of the autonomous features are cool, like the car being able to parallel park on its own, brake for you to avoid an accident, or even make sure your car stays in its lane. But I am not ready to give total control of my vehicle to a computer. In my opinion, cars are very dangerous things. I am a tiny human in total control of something that weighs multiple tons, and I take that very seriously. For now, I feel safer being in control of the car than trusting a computer to be in control. Maybe someday the technology will advance enough, and there will be enough proof that giving control to a computer is actually safer, but for now I do not have enough evidence to trust a computer to totally drive my car.