Two days after the California DMV pulled the taxi license of autonomous car company Cruise, the company has decided to pause all of its driverless trials.
The company posted on X its intention to reexamine its processes, systems, and tools to help create a safer experience for riders.
(2/3) In that spirit, we have decided to proactively pause driverless operations across all of our fleets while we take time to examine our processes, systems, and tools and reflect on how we can better operate in a way that will earn public trust.
— cruise (@Cruise) October 27, 2023
The announcement once again shines a spotlight on autonomous cars and the big questions surrounding their wider use on public roads.
Each incident calls into question the decision-making abilities of the technology, and every failure gains national headlines.
In the latest case, a Cruise autonomous taxi trapped a woman who had been hit by a human-driven vehicle in San Francisco.
The vehicle initially stopped when it detected the collision, but then followed its protocol to pull over to the side of the road.
Unfortunately, the woman was still trapped under the car and was dragged an additional 20 feet by the Cruise vehicle.
In an earlier incident, a Cruise robotaxi entered an intersection on a green light without noticing an approaching fire truck that was on its way to an emergency.
Proponents of autonomous vehicles are quick to point out that human-driven cars are much more dangerous and likely to cause accidents.
Detractors counter that in unexpected situations like the ones above, autonomous vehicles are currently incapable of assessing the safest course of action.
New autonomous car companies appear regularly, and major brands are embedding more self-driving features into their cars every day.
Tesla in particular was one of the first to tout autonomous driving and has been involved in several major incidents leading to multiple deaths.
The latest setback for GM’s Cruise brings to light a fundamental truth about AI technology like self-driving cars: mistakes by drivers are replaced by mistakes by coders.