Are self-driving cars really safer?

2023 © Wikiask
Main topic: Tech
Other topics: Self-driving cars
Short answer:
  • No.
  • Accident rates are currently higher for autonomous vehicles than for vehicles with human drivers.
  • Self-driving cars are involved in 9.1 accidents per million miles traveled, compared with 4.1 incidents per million miles for conventional cars.
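The comparison in the short answer is a simple rate ratio. A minimal sketch, using only the two figures quoted above (9.1 and 4.1 incidents per million miles, from the cited Carsurance statistics):

```python
# Rates quoted in the short answer above (incidents per million miles).
AV_RATE = 9.1      # self-driving cars
HUMAN_RATE = 4.1   # conventional cars

# Ratio of the two rates: how many times more often self-driving
# cars are involved in accidents, per mile traveled.
ratio = AV_RATE / HUMAN_RATE
print(f"Self-driving cars are involved in about {ratio:.1f}x as many accidents per mile")
```

This works out to roughly 2.2 times the accident rate per mile traveled.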

A self-driving or autonomous vehicle is a car capable of driving itself without input from a human driver. Self-driving cars are divided into categories according to their degree of automation.

The Society of Automotive Engineers (SAE) defined these levels, and the six of them were adopted by the United States Department of Transportation. They range from Level 0 (totally manual) to Level 5 (fully autonomous).
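The six SAE levels can be sketched as a simple lookup table. This is an illustrative encoding only; the level names here paraphrase the SAE J3016 terminology rather than quote it:

```python
# The six SAE driving-automation levels described above,
# as a dictionary mapping level number to a short description.
SAE_LEVELS = {
    0: "No automation (totally manual)",
    1: "Driver assistance",
    2: "Partial automation",
    3: "Conditional automation",
    4: "High automation",
    5: "Full automation (fully autonomous)",
}

def describe(level: int) -> str:
    """Return a short description for an SAE automation level (0-5)."""
    return SAE_LEVELS[level]

print(describe(0))  # No automation (totally manual)
print(describe(5))  # Full automation (fully autonomous)
```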

The ability of autonomous cars to "understand" their surroundings and respond appropriately is enabled by artificial intelligence (AI) and machine learning (ML) technologies. Sophisticated sensors generate a constantly updated map of the vehicle's surroundings; combined with computer vision, this map is used to identify nearby cars and pedestrians, measure distances, and detect uneven surfaces in roadways and sidewalks.[1]

The following is a list of safety concerns that are associated with autonomous/self-driving cars:

Self-driving vehicles give drivers the impression that they are entirely driverless, which is not true

Autonomous car crash

It is deceptive to refer to these vehicles as "driverless," since none of them can operate without a human behind the wheel. Drivers are expected to remain alert and be prepared to take control of the vehicle immediately. There is presently no fully automated or "self-driving" car available anywhere in the world, and the safe operation of any car sold in the United States requires the driver's complete attention at all times. Although a growing number of vehicles offer automated features that assist the driver in certain circumstances, these vehicles are not fully automated.[2]

Danger of lithium-ion (Li-ion) batteries catching fire and exploding due to hydrogen gas

Car fire aftermath

Lithium-ion (Li-ion) batteries are known to have a high propensity to catch fire. Burning lithium produces a metal fire that can reach 3,632 degrees Fahrenheit (2,000 degrees Celsius). Using water to extinguish such a fire can generate hydrogen gas and cause an explosion.[3]
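The two temperatures quoted above are the same figure in different units; a quick arithmetic check using the standard conversion F = C × 9/5 + 32:

```python
def c_to_f(celsius: float) -> float:
    """Convert degrees Celsius to degrees Fahrenheit: F = C * 9/5 + 32."""
    return celsius * 9 / 5 + 32

# The 2,000 C lithium-fire temperature quoted in the text:
print(c_to_f(2000))  # 3632.0, matching the 3,632 F figure
```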

The National Transportation Safety Board warns that there is a potential for "uncontrolled rises in temperature and pressure, sometimes known as thermal runaway," if a collision causes damage to a battery. It results in an explosion of hazardous fumes, the discharge of projectiles, and fire, all of which pose an additional risk to those who react to calls for help.[4]

An accident involving a Tesla vehicle on 17 April 2021 caused a fire that burned for four hours and required more than 30,000 gallons of water to extinguish. A typical car fire is extinguished within minutes.[5][6]

The technology is still under development and not 100% accurate when making decisions in edge cases

More technical work must be done on autonomous vehicles to handle difficult circumstances, especially edge cases. We may still need a few more decades of development before we can confidently call them "safer" than humans.

In real-world driving, cars equipped with active driving assistance systems experienced a problem once every eight miles, on average. It was also found that active driving assistance systems, which integrate a car's acceleration, braking, and steering, often disengage with little to no warning, forcing the driver to retake control of the vehicle rapidly. If the driver is even temporarily distracted or places excessive trust in the system's capabilities, it is not difficult to see how this situation could end in tragedy.[7]
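The "once every eight miles" figure implies that the expected number of problems grows linearly with trip length. A minimal sketch of that arithmetic; the trip lengths below are arbitrary illustrations, not from the cited AAA study:

```python
# AAA figure cited above: one problem every eight miles, on average.
MILES_PER_PROBLEM = 8.0

def expected_problems(trip_miles: float) -> float:
    """Expected number of assistance-system problems over a trip."""
    return trip_miles / MILES_PER_PROBLEM

# Illustrative trip lengths (assumed, not from the source):
for miles in (8, 40, 400):
    print(f"{miles} miles -> ~{expected_problems(miles):.0f} expected problems")
```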

Real-life driving scenarios, such as making split-second judgments, driving in quickly changing weather, and making eye contact with another driver at a crossroads, are best left to fully engaged human drivers. Driving is difficult: roads, lanes, and conditions constantly change, and the same action is not always appropriate in every circumstance. When used correctly, several of the latest driver-assistance technologies can save lives in certain situations. Computers, however, lack the intuitive and instinctive capabilities of humans, which may cause them to react differently in some circumstances. Adverse weather can also disrupt the car's sensors and systems, making mistakes more likely.[8]

In 2016, a semi-tractor trailer crossed a highway in Florida as a Tesla approached at full speed. The driver of the Tesla died as a consequence of the injuries sustained. The vehicle's Autopilot did not apply the brakes because it could not distinguish the white side of the truck from the brightly lit sky. The National Highway Traffic Safety Administration concluded that the occupant was to blame for the accident, since he should have been able to brake to avoid the collision but was most likely distracted at the time.[9][10]

Risk of a cybercriminal manipulating a car's systems to perform an unlawful act

Cybercrime - keyboard and handcuffs

There is always a chance that a cybercriminal could hack into a vehicle's systems to obtain private information or perform an unlawful act. In 2015, hackers gained remote control of a Jeep and brought it to a halt on a highway in St. Louis while it was traveling at 70 miles per hour. By hacking into the onboard entertainment system, the cybercriminals gained access to the vehicle's brakes and steering controls.[11]

Hackers, whether terrorists or other criminals, could bring a vehicle to a stop. They could also use autonomous cars for other purposes, such as delivering bombs to the places where they are meant to detonate or creating havoc on a country's transportation network.

Lack of standard self-driving regulations

The vast majority of safety standards for autonomous cars are voluntary, and authorities are uncertain how to harmonize legislation across jurisdictions. Safety advocates have raised concerns about relying on voluntary rules for makers of autonomous vehicles.


  1. Milenkovic, Damjan (2022-02-20). "24 Self-Driving Car Statistics & Facts". Carsurance. Retrieved 2022-11-08.
  2. "Automated Vehicles for Safety | NHTSA". Retrieved 2022-11-08.
  3. "Lithium-Ion Batteries and Electrical Fires | Envista Forensics". Retrieved 2022-11-08.
  4. Huetter, John (2021-01-18). "NTSB report educates on electric vehicle safety risks, practices". Repairer Driven News. Retrieved 2022-11-08.
  5. Pietsch, Bryan (2021-04-18). "2 Killed in Driverless Tesla Car Crash, Officials Say". The New York Times. ISSN 0362-4331. Retrieved 2022-11-08.
  6. Kolodny, Lora. "'No one was driving' in Tesla crash that killed two men in Spring, Texas, report says". CNBC. Retrieved 2022-11-08.
  7. Edmonds, Ellen (2020-08-06). "AAA Finds Active Driving Assistance Systems Do Less to Assist Drivers and More to Interfere". AAA Newsroom. Retrieved 2022-11-08.
  8. "Tata Elxsi - 5 Challenges in the adoption of Autonomous vehicles". Retrieved 2022-11-08.
  9. Yadron, Danny; Tynan, Dan (2016-06-30). "Tesla driver dies in first fatal crash while using autopilot mode". The Guardian. ISSN 0261-3077. Retrieved 2022-11-08.
  10. Hawkins, Andrew J. (2019-05-01). "Tesla sued by family of man killed in Autopilot-related crash". The Verge. Retrieved 2022-11-08.
  11. Greenberg, Andy. "Hackers Remotely Kill a Jeep on the Highway—With Me in It". Wired. ISSN 1059-1028. Retrieved 2022-11-08.