Reading 12: Self-Driving Cars

  • Would you want a self-driving car? Explain why or why not.

When I first saw this question, I thought, well, of course, because they seem so cool. However, it got me thinking about a possible future scenario: if consumers see that AVs are safer and more and more people buy them, what happens to the consumers who like to drive themselves? Would it be unethical for a person to drive a car that’s not autonomous, because they are the outlier dragging down the safety record of AVs? Would the culture around AVs shift to recognize people not driving an AV as a hazard on the roads? Personally, I think the culture of driving one’s own car would die out in the end, and everyone would eventually drive an AV because of the social pressure put on those who don’t.

  • What is the motivation for developing and building self-driving cars? What are the arguments for and against self-driving cars? Would they make our roads safer?

Developing and building self-driving cars always seemed like a good idea to me, in terms of both convenience and safety. However, the argument that having more autonomous vehicles on the road would save more lives assumes that the majority of vehicles on the road would be autonomous, and even then the system wouldn’t be perfect. I think of it more as a system: if the majority of cars follow the same set of protocols, then driving on the roads would be a lot safer, since all the cars follow the same policy. Then again, it’s hard for a system to be perfect when there is so much variation in driving conditions.

  • How should programmers address the “social dilemma of autonomous vehicles”? How should an artificial intelligence approach life-and-death situations? Who is liable when an accident happens?

I don’t think programmers should be completely in charge of such a huge question. Instead, a combined anticipatory design process is needed for the ethics of autonomous vehicles, because it’s hard both to place blame and to make these decisions. It’s hard to place blame because each situation is so circumstantial, and hard cases make it difficult to pin responsibility on indirect parties such as the manufacturers. It’s hard to make decisions on behalf of the autonomous vehicle or the customer, because in a high-stakes situation the customer usually cannot provide their own input in the moment.

  • What do you believe will be the social, economic, and political impact of self-driving cars? What role should the government play in regulating self-driving cars?

The impact that AVs will have is tremendous, but I hope that the people whose jobs depend on the driving industry aren’t screwed over. As for which government body should set regulatory or advisory traffic rules for autonomous vehicles, I’m not sure what should happen. Experts in this field, along with ethicists and others, should definitely contribute to the conversation before we consider legislation.