Driverless cars are coming to a road near you, and sooner than you may think. Collision-avoidance technology improves with every new model: some cars warn you if you begin to drift out of your lane, and some will automatically brake if you are about to rear-end someone. A few, like Tesla's Autopilot-equipped models and Google's own autonomous car, can drive themselves.
Whether autonomous cars can greatly improve road safety is still up for debate. Google boasts that its vehicles have had no at-fault collisions. But a recent study found that self-driving cars get into five times as many accidents as regular human-driven cars.
One of the major problems appears to be that self-driving cars are too cautious, and are struck by not-so-cautious human drivers. A careful, prudent human driver might bend the rules to stay safe, such as by accelerating out of the way of a collision, but self-driving cars can struggle to weigh their options in ambiguous situations. Following the road rules to the letter, for example, can be a problem in Calgary, especially in winter.
After a snowstorm, drivers don't always stay in the lanes painted on the road (assuming the lines are even visible), but everyone remains safe as long as everyone follows the snow lanes carved out by the cars ahead. Self-driving cars will have to improve their algorithms to face this winter challenge, perhaps along the lines sketched below.
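To make the winter problem concrete, here is a minimal sketch of the kind of fallback heuristic such an algorithm might need: trust the painted markings only when they are clearly visible, and otherwise follow the de facto lanes worn into the snow by other cars. This assumes a hypothetical perception system that reports both cues with confidence scores; the names, types, and thresholds are all illustrative, not taken from any real autonomous-driving stack.

```python
from dataclasses import dataclass


@dataclass
class LaneEstimate:
    """A hypothetical perception output for one lane cue."""
    center_offset_m: float  # lateral offset from the vehicle's current path
    confidence: float       # 0.0 (not detected) to 1.0 (clearly detected)


def choose_lane_target(painted: LaneEstimate,
                       tire_tracks: LaneEstimate,
                       min_confidence: float = 0.6) -> float:
    """Return the lateral offset (in metres) the vehicle should steer toward.

    Prefer painted lane markings when they are detected with high
    confidence; otherwise follow the consensus path left by other cars,
    which is what human drivers actually do after a snowstorm.
    """
    if painted.confidence >= min_confidence:
        return painted.center_offset_m
    if tire_tracks.confidence >= min_confidence:
        return tire_tracks.center_offset_m
    # Neither cue is reliable: hold the current path (and, in a real
    # system, slow down until better information arrives).
    return 0.0
```

The point of the sketch is the ordering: a strictly rule-following car would steer only by the painted lines, while a human-like one treats the painted lines as just one signal, to be overridden when the road's actual usage pattern says otherwise.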
From a legal perspective, if you're a passenger in a human-driven car and the driver gets you into an accident, or fails to avoid one, you have a legal case against the driver. But if your self-driving car gets you into an accident, or fails to avoid one, who is to blame? The manufacturer? The software? The person in the driver's seat? These are important legal questions that will need to be answered before autonomous cars see widespread adoption.
