What are the ethical issues with self-driving cars?

Short answer:

Some of the main issues:

  1. In an accident, who is responsible: the driver or the manufacturer?
  2. Is it appropriate to hand control back to the driver at the last second?
  3. Who should make these ethical decisions: manufacturers or the government?
  4. Is it ethical to test self-driving cars on public roads?
  5. Who is accountable in the case of a cybercrime?
  6. How should the value of a life be calculated?

One might conclude that self-driving cars are more dependable than their human-driven counterparts. Even so, self-driving vehicles still confront several ethical dilemmas today, and these concerns make us ask whether we are genuinely ready for a world without drivers.


Because the ethical difficulties surrounding autonomous cars involve judgments that might result in someone's life or death, these topics are complicated and controversial, and people's personal moral codes strongly shape their attitudes. Even though moral codes vary significantly between individuals and across places, a consistent, unified code of ethics and legal framework is essential. Before fully autonomous cars become available to customers, the firms creating AVs must work together to produce a code of ethics for their vehicles. As the debate heats up with the growing popularity of self-driving vehicles, there are high hopes for stringent rules and regulations that will, at long last, answer these concerns in a way that is both right and reasonable.

A significant challenge for ethical considerations in the research and development of self-driving cars is whether robots can make moral judgments at all. People themselves face an ethical dilemma when forced to make moral judgments in the event of an accident. Without agreement on a global moral code, it would be very difficult to design a vehicle that universally satisfies the ethical frameworks accepted by all the world's populations. Nevertheless, before these vehicles are sold to consumers, automobile manufacturers, politicians, and society need to reach a consensus on how vehicles ought to behave in the most challenging of circumstances: a collision that might result in death.[1][2]

Here are a few ethical issues/moral dilemmas that self-driving cars face:

1. When a self-driving car crashes and kills a pedestrian, who is legally and morally responsible for the death? Is it the car itself, the manufacturer, the owner/passenger, or the software programmer?

Illustration of a collision warning system.

Liability is a crucial aspect of the ethical concerns raised by accidents involving autonomous vehicles. Who is legally and ethically liable when a collision caused by a self-driving car results in the death of a pedestrian? Is it the maker of the car? The person who programmed the software? Or perhaps the driver or the other passengers in the vehicle?

There is no straightforward, correct answer to this question. Perhaps for that reason, most people believe that accidents should unfold naturally rather than having a machine or piece of software select who will survive and who will die in a given situation.[3]

2. Is it appropriate to give the driver control of the vehicle at the last second? Who is accountable for the destruction of life and property in the event of an accident? Is it the vehicle, the driver, a careless pedestrian, or a driver in a different vehicle?

Driver in a self-driving car.

One of the most challenging questions about self-driving cars is whether it is ethical to hand control of the vehicle back to the driver at the very last possible moment. Not only does this call the morality of autonomous cars into question, it also calls into question the morality of human drivers.

Even when the car is operating in its most hands-off mode, the driver of a self-driving car must keep their hands on the steering wheel and pay attention to their surroundings, and they are responsible for being ready to take control at any moment. The sketch below illustrates why a last-second handover is so fraught.
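
As a minimal illustration, consider a hypothetical takeover check: if the predicted time to collision is shorter than the time a distracted human plausibly needs to re-engage, handing back control cannot improve the outcome; it mostly relocates responsibility. Every name and threshold below is an assumption made for illustration, not any manufacturer's actual logic.

```python
from dataclasses import dataclass

# Hypothetical takeover-request logic illustrating the timing dilemma.
# Every name and threshold here is an illustrative assumption, not a real AV stack.

HUMAN_REACTION_BUDGET_S = 3.0  # rough time a distracted human needs to re-engage (assumed)

@dataclass
class Situation:
    time_to_collision_s: float  # predicted seconds until impact
    driver_attentive: bool      # from a (hypothetical) driver-monitoring system

def should_request_takeover(s: Situation) -> bool:
    """Hand control back only if the driver plausibly has time to act.

    If the time to collision is below the human reaction budget, a takeover
    request cannot help the driver; it can only relocate responsibility.
    """
    if not s.driver_attentive:
        return False  # an inattentive driver cannot meaningfully take over
    return s.time_to_collision_s > HUMAN_REACTION_BUDGET_S

# Example: with 1.2 s to impact, even an attentive driver has no real choice.
print(should_request_takeover(Situation(time_to_collision_s=1.2, driver_attentive=True)))  # False
```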

But even in these situations, if an accident occurs, who is accountable for the loss of life and property? Was the self-driving car at fault? Was it the driver, who could not make the right choice in such a short time? Or was it a careless pedestrian or the driver of the other vehicle?[4]

3. Who is the right person to decide the ethics of self-driving cars? Is it the engineers who work on the car’s technology or the government?

In most cases, the engineers who work on the vehicle's technology are the ones who decide what constitutes ethical behavior for self-driving cars. What they consider appropriate or inappropriate dictates how the car behaves in critical circumstances, such as accidents.

However, there is much debate over who should evaluate whether self-driving vehicles adhere to ethical standards. Should it be the engineers who developed the car's technology, or the government of the nation where the vehicle will be driven?

One may also argue that no outside party can judge the ethics of every individual circumstance involving an autonomous vehicle, and that the person operating the vehicle at that moment must be the one to make the choice.[5]

4. The dilemma of calculating the value of a life to determine who should be saved, weighing variables such as youth or physical fitness, group vs. individual, and human vs. animal

Graph of the value of saving a human life.

If a car cannot come to a complete stop in time to avert an accident and casualties are unavoidable, it must be pre-programmed to choose who is spared and who is struck. This dilemma raises the prospect of an innate need to compute a life's relative worth, which might depend on a range of characteristics such as youth or physical fitness. Should a self-driving vehicle even consider such a value when deciding who would be injured or killed in an accident? Suppose the car is forced to choose between sparing a child and a group of senior citizens: a rule that simply maximizes the number of lives saved would prioritize the group. Similarly, preserving human life would take precedence over the lives of animals. Deciding who should live in this way creates an ethical dilemma and raises hard questions about any method for judging the worth of human life; the sketch after this paragraph shows how crude such a rule looks once it is written down.[6][7]
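
To make the dilemma concrete, here is a deliberately naive sketch of the kind of "minimize total harm" rule described above. Every category and weight in it is an invented assumption, not a real policy; the point is that any concrete weighting scheme amounts to pricing lives, which is exactly the ethical objection.

```python
# A deliberately naive "minimize total weighted harm" rule, written only to
# expose the problem: any concrete weighting scheme amounts to pricing lives.
# All categories and weights below are invented assumptions, not real policy.

HARM_WEIGHTS = {
    "human": 1.0,   # every human counted equally here...
    "animal": 0.1,  # ...though even this line already encodes an ethical judgment
}

def expected_harm(struck_group: list[str]) -> float:
    """Sum the assumed harm weights for everyone in a potential impact zone."""
    return sum(HARM_WEIGHTS[kind] for kind in struck_group)

def choose_trajectory(strikes_on_a: list[str], strikes_on_b: list[str]) -> str:
    """Pick the trajectory whose impact zone has the lower total weighted harm."""
    return "A" if expected_harm(strikes_on_a) < expected_harm(strikes_on_b) else "B"

# The example from the text: a pure head count favors sparing the larger group,
# so the rule steers toward the lone person. Whether that is *right* is exactly
# the open question.
print(choose_trajectory(["human"], ["human", "human", "human"]))  # "A"
```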

5. The Hacking Dilemma: Who is accountable for an accident and casualties if a cybercriminal manipulates a car's system?

Illustration of the hacking dilemma.

There is always the possibility that a cybercriminal could break into a car's systems to obtain private information or to commit a crime. What happens if a hacker gains access to an autonomous vehicle and instructs it to cause an accident in order to pin the blame on the driver? In situations like these, who is to blame for an accident that results in the loss of lives?

Is it the cybercriminal? Is it the driver? Or is the automobile manufacturer at fault for failing to prevent such attacks on the vehicle? Given these growing hazards and the lack of clear answers, society may come to see self-driving cars as immoral.

In addition, terrorists and other criminals could hack a car and cause it to crash, or deploy autonomous vehicles for other purposes, such as transporting explosives to their intended sites of detonation or wreaking havoc on a nation's transportation network.[8]

6. Is it ethical to test self-driving cars on public roads?

In recent years, car manufacturers working on autonomous vehicles have conducted research and development on public roads. Although these vehicles have the potential to make roadways safer, testing them on public roads can endanger other road users, which underscores the ethical dilemma faced by the public, regulators, and automobile manufacturers.[9]

References

  1. "The moral dilemmas behind developing self-driving vehicles". KrASIA. 2022-04-17. Retrieved 2022-11-07.
  2. Ampe, Teagan (2020-12-12). "Autonomous Accidents: The Ethics of Self-Driving Car Crashes". Viterbi Conversations in Ethics. Retrieved 2022-11-07.
  3. Hevelke, Alexander; Nida-Rümelin, Julian (2015). "Responsibility for Crashes of Autonomous Vehicles: An Ethical Analysis". Science and Engineering Ethics. 21 (3): 619–630. doi:10.1007/s11948-014-9565-5. ISSN 1353-3452. PMC 4430591. PMID 25027859.
  4. "Adoption of Self-Driving Cars". Drishti IAS. Retrieved 2022-11-07.
  5. Joshi, Naveen. "5 Moral Dilemmas That Self-Driving Cars Face Today". Forbes. Retrieved 2022-11-07.
  6. Perez, Isabel Yarwood (2020-12-12). "The Ethics of Self-Driving Cars". Viterbi Conversations in Ethics. Retrieved 2022-11-07.
  7. Martinho, Andreia; Herber, Nils; Kroesen, Maarten; Chorus, Caspar (2021-09-03). "Ethical issues in focus by the autonomous vehicles industry". Transport Reviews. 41 (5): 556–577. doi:10.1080/01441647.2020.1862355. ISSN 0144-1647.
  8. "Cyber Security of Autonomous Machines and Systems". University of North Dakota Online. 2020-01-27. Retrieved 2022-11-07.
  9. Furchgott, Roy (2021-12-23). "Public Streets Are the Lab for Self-Driving Experiments". The New York Times. ISSN 0362-4331. Retrieved 2022-11-07.