The Risk of Self-Driving Cars
As computer and vehicle technology continues to improve, more car manufacturers are opting to include some form of automated driving-assist tech in their vehicles. When the famous electric car company Tesla first announced its project to create fully autonomous vehicles, the announcement raised safety concerns.
This didn’t stop other car brands from jumping in and developing their own systems in the hope that one day the roads would be full of vehicles that pretty much drove themselves while we attended to other tasks during our travels.
Self-driving cars have been a dream for every automotive manufacturer ever since the 1920s, when the first radio-controlled experimental vehicle was revealed to the masses. While it never saw widespread commercial use, it paved the way for the autonomous driving systems we have right now. The same can be said for the famous cruise control system that many car models have in place today.
Nevertheless, the more self-driving technology improves, the more concerns are raised over its safety. With car manufacturers misleading the public with techno-jargon to promote widespread adoption and boost sales, we must educate ourselves on the truth about so-called autonomous vehicles.
What is a self-driving car?
First, we want to dispel the myth that there are fully self-driving and truly autonomous cars on the road. As we said before, car manufacturers use these terms as nothing more than PR speak. No car on the road today is fully autonomous.
Misleading nomenclature like self-driving cars, automated cars, or autonomous vehicles can put drivers at risk and has even led to fatal accidents. To understand where these vehicles fall on the spectrum of driving autonomy, we should look at SAE International’s (formerly the Society of Automotive Engineers) classification of car automation:
- Level 0: The automated system may issue warnings to the driver but does not control any key systems in the car. We can see level 0 automation in things like the car’s computer flashing the check engine light or telling us we are low on gas.
- Level 1: The driver and the vehicle share driving duties. Examples of this level of automation include cruise control, where the car maintains a set speed, and Adaptive Cruise Control, where acceleration and braking are handled by the vehicle’s computer systems while the driver remains in charge of the wheel.
Level 1 automation still requires the human driver to take control of the vehicle to ensure the car behaves correctly.
- Level 2: In level 2 autonomous cars, the onboard systems can control acceleration, braking, and steering simultaneously. Nevertheless, the human driver should keep their hands on the steering wheel and be prepared to take over if the onboard computer makes a mistake, such as missing an obstacle or veering out of its lane.
SAE advises that level 2 automation is not truly hands-off. The driver should always be prepared to switch to manual control in an emergency and remain aware of their surroundings. Currently, most if not all “self-driving” vehicles fall into the level 2 category.
- Level 3: With a level 3 automation system, the driver can safely turn their attention away from the road and focus on other tasks, be it texting, posting on social media, or even watching a movie or playing video games in the vehicle.
The vehicle should be able to fully monitor its surroundings and react to situations that arise on the road, be it an obstacle or another car cutting in front of it. Still, minimal attention from the driver is required: they must take control when the vehicle asks them to. A level 3 system treats the human as a co-driver, and while no level 3 cars are on the market today, level 3 technology is being developed, such as the Traffic Jam Chauffeur.
- Level 4: Like level 3 automation, but at this level the driver is not expected to monitor or take control of the vehicle. A level 4 system would even let the driver fall asleep during the trip, with the car handling every driving task.
However, these systems would operate only in certain areas or on certain routes, with the vehicle parking and aborting the trip if the driver does not take control outside the geofenced area. A level 4 system could incorporate level 3 systems for travel outside its limited area in the case of commercial vehicles, and could see widespread use in so-called robot taxis or delivery drones.
- Level 5: A true self-driving car: no human input is required, and the driver drives manually only if they want to. A level 5 vehicle would not be limited to a set area like a level 4 vehicle but would work anywhere, at any time, and in any weather condition.
Car manufacturers love to tout their vehicles as if they were level 5 platforms, but this technology is not expected to be available until around 2030, even if technological and scientific advances continue at their current pace.
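The SAE levels above can be summarized in a small lookup table. The sketch below (illustrative only; the level names follow SAE J3016, but the field names and function are our own, not part of any SAE API) encodes the key question at each level: must the human still supervise the car?

```python
# SAE J3016 driving-automation levels, as described in the list above.
# "must_monitor" reflects the article's point: at levels 0-2 the human
# must stay attentive; at level 3+ they may turn their attention away
# (level 3 still expects a takeover when the car requests it).
SAE_LEVELS = {
    0: {"name": "No Automation", "must_monitor": True},
    1: {"name": "Driver Assistance", "must_monitor": True},
    2: {"name": "Partial Automation", "must_monitor": True},
    3: {"name": "Conditional Automation", "must_monitor": False},
    4: {"name": "High Automation", "must_monitor": False},
    5: {"name": "Full Automation", "must_monitor": False},
}

def driver_attention_required(level: int) -> bool:
    """Return True if the human driver must supervise at this SAE level."""
    return SAE_LEVELS[level]["must_monitor"]

# Most "self-driving" cars on sale today are level 2, so the driver
# must still watch the road:
print(driver_attention_required(2))  # True
print(driver_attention_required(3))  # False
```

The table makes the article's central warning concrete: despite the marketing, every system sold today sits on a row where `must_monitor` is `True`.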
The dangers of a self-driving vehicle
As we said earlier, there are no truly autonomous vehicles on the road today, but the systems car manufacturers offer can still assist you with semi-hands-off driving; examples include General Motors’ newer Super Cruise system and Tesla Autopilot’s Full Self-Driving (FSD), which is expected to expand its testing in the coming weeks. Even so, all of these driving-assist systems carry safety risks. Below are some of the most prevalent dangers of a self-driving vehicle:
Obstacle Detection: Autonomous vehicles use a variety of sensors to detect obstacles, traffic lights, road conditions, and the speed and direction of surrounding vehicles. Yet dangerous weather conditions, heavy traffic, weathered road signs, or damaged vehicle tags can prevent the car’s sensors from working properly, which can lead to accidents, as evidenced by the numerous Tesla models that failed to detect highway barriers and even struck parked vehicles.
Machine Learning Fallacy: There is no doubt that machine learning and artificial intelligence are becoming more prevalent in our everyday lives, from Netflix giving us recommendations based on the shows and movies we watch to smart refrigerators that notify us when the pantry is empty. Autonomous cars use this same technology to recognize potentially hazardous situations.
But with the technology still in its relative infancy, cars are unable to react the way humans do when confronted with an unexpected situation. Until autonomous vehicles can match a human’s reaction time in an anomalous situation, self-driving cars pose a potential threat to pedestrians, other vehicles, and even the drivers themselves, should drivers grow too confident in the systems now in place.
Cybersecurity Concerns: With all the data that self-driving cars must collect, there are concerns about how secure these systems are against intrusion by third parties. If self-driving cars become prevalent on the open road, they will have to communicate not only with each other but with the manufacturer or technology provider as well. This could make them vulnerable to cyberattacks. With other areas of computing already suffering cyber-intrusions, it is only a matter of time until someone decides to hack not just one vehicle but a whole fleet of them for nefarious purposes.
Complexities with Insurance Companies: Insurance for autonomous vehicles will become a complicated matter. Normally, one pays insurance in case of an accident, but what happens when an accident is caused not by the driver but by the vehicle itself?
Should self-driving cars become a mainstay, who would be held liable in an accident? If the fault falls on the manufacturer of the vehicle, it could drive up vehicle prices and force a complex reform of how insurance works for the end user.
True self-driving cars are still a long way from becoming part of our daily lives, but as more manufacturers try to outdo each other with this new technology, there is no doubt that more autonomous vehicles will be hitting the road in the coming years. For now, self-driving vehicles still face both safety concerns and public-acceptance problems.
The technology is still far from being completely viable, and for it to work as intended, most cars on the road would need to be autonomous. Experts estimate that only about 10% of vehicles will be fully autonomous by 2034. Not only would our vehicles change, but so would how roads are constructed: we would have to move to some version of smart roads that operate in tandem with self-driving vehicles to ensure total safety for the passengers of the autonomous car.
We recommend waiting until the technology is perfected before getting a self-driving car, and continuing to enjoy your traditional vehicle with all the safety precautions needed to ensure your experience on the road is as good as it can be. If you ever get into an accident, whether with a traditional vehicle or one of those futuristic self-driving cars, don’t hesitate to contact Farahi Law Firm, APC for a free consultation and review of your case at (844) 824-2955. We are available 24/7, and your wellbeing is our TOP priority.