Elon Musk has long used his powerful Twitter megaphone to amplify the idea that Tesla’s automated driving software is not only safe, it’s safer than anything a human driver can achieve.
That campaign gained momentum last fall when the electric-car maker expanded its Full Self-Driving "beta" program from a few thousand people to a fleet that now numbers more than 100,000. The $12,000 feature purportedly allows a Tesla to drive itself on highways and neighborhood streets, changing lanes, making turns, and obeying traffic signs and signals.
As critics chastised Musk for testing experimental technology on public roads without trained safety drivers as backups, Santa Monica investment manager and vocal Tesla booster Ross Gerber was among the allies who came to his defense.
"There hasn't been a single accident or injury since the launch of the FSD beta," he tweeted in January. "Not one. Not one."
To which Musk replied with one word: "Correct."
In fact, by that time dozens of drivers had already filed safety complaints with the National Highway Traffic Safety Administration over incidents involving Full Self-Driving, and at least eight of them involved crashes. The complaints are publicly available in a database on the NHTSA website.
One driver reported FSD "self-steering towards a tractor-trailer" before accelerating into a median post, causing a wreck.
“The car pulled into the wrong lane ‘with the FSD engaged’ and I was hit by another driver in the lane beside my car,” another said.
YouTube and Twitter are full of videos exposing FSD's bad behavior, including a recently posted clip that appears to show a Tesla steering itself into the path of an oncoming train. The driver yanks the steering wheel to avoid a head-on collision.
It's nearly impossible for anyone other than Tesla to say how many FSD-related crashes, injuries, or deaths have occurred; NHTSA is investigating several recent fatal crashes in which FSD may have been involved. The agency recently ordered automakers to report serious crashes involving automated and semi-automated technologies, but it has yet to release details of each crash.
Robotaxi companies such as Cruise, Waymo, Argo and Zoox equip their vehicles with software that reports crashes to the company immediately. Tesla pioneered this kind of software in passenger cars, but the company, which does not have a media relations office, did not answer questions about whether it receives automated crash reports from cars using FSD. Automakers without such software must rely on public reports and communications with drivers and service centers to judge whether a report to NHTSA is necessary.
Attempts to reach Musk were also unsuccessful.
Gerber said he was unaware of any crash reports in NHTSA’s database when he posted his tweet, but believed the company would have been aware of any crashes. “Because Tesla records everything that happens, Tesla is aware of every incident,” he said. He said it was possible the drivers were responsible for the crashes, but he had not reviewed the reports himself.
There are no accurate public statistics on self-driving-car crashes, because police who write crash reports have only drivers' statements to rely on. "We are not experts on how to extract this type of data," said Amber Davis, spokeswoman for the California Highway Patrol. "Ultimately, we ask for best recollections of how [a crash] happened."
Exactly what data a Tesla vehicle's automated driving system collects and transmits to corporate headquarters is known only to Tesla, said Mahmood Hikmet, head of research and development at self-driving shuttle company Ohmio. He said Musk's definition of a crash or accident might differ from how an insurance company or the average person would define it. NHTSA requires crash reports for fully or partially automated vehicles only if someone is injured, an airbag deploys, or the car must be towed.
The FSD crash reports were first uncovered by FSD critic Taylor Ogan, who runs Snow Bull Capital, a China-focused hedge fund. The Times separately downloaded and assessed the data to verify Ogan’s findings.
The data — covering the period from Jan. 1, 2021, to Jan. 16, 2022 — shows dozens of safety complaints about FSD, including numerous reports of phantom braking, in which a car's automatic emergency braking system slams on the brakes for no apparent reason.
Below are excerpts from the eight accident reports in which FSD was involved:
- Southampton, NY: A Model 3 traveling at 60 mph crashed into an SUV parked on the shoulder of the freeway. The Tesla drove itself "straight through the side of the SUV, ripping the car's rearview mirror off." The driver called Tesla to report that "our car had gone crazy."
- Houston: A Model 3 was traveling at 35 mph "when suddenly the car jumped over the curb, causing damage to the bumper, wheel and a flat tire." The crash "appeared to have been caused by a discolored spot on the road which gave the FSD the false perception of an obstacle it was trying to avoid." A Tesla service center denied a warranty claim, charged $2,332.37 and said it would not return the car until the bill was paid.
- Brea: "While taking a left turn, the car moved into the wrong lane and I was hit by another driver in the lane beside my car." The car "took control of itself and forced itself into the wrong lane … putting everyone involved at risk. The car is badly damaged on the driver's side."
- Collettsville, North Carolina: "The road curved to the left, and when the car took the turn it swung too wide and veered off the road … The right side of the car went up and over the beginning of the rocky slope. The right front tire burst, and only the side airbags deployed (both sides). The car traveled approximately 500 yards along the road and then died." Estimated damage was $28,000 to $30,000.
- Troy, Missouri: A Tesla was rounding a curve when "suddenly, about 40% into the turn, the Model Y straightened the wheel and crossed the centerline into the direct path of the oncoming vehicle. When I attempted to steer the vehicle back into my lane, I lost control and skidded into a ditch and through the woods, causing significant damage to the vehicle."
- Jackson, MO: A Model 3 "pulled right towards a semi, then pulled left towards the center posts as it accelerated, and the FSD wouldn't turn off … We had owned this car for 11 days when our accident happened."
- Hercules, California: The “phantom braking” caused the Tesla to suddenly stop and “the vehicle behind me did not react”. A rear-end collision caused “serious damage to the vehicle”.
- Dallas: "I was driving with full self-driving assistance … a car was in my blind spot, so I tried to take control of the car by pulling on the steering wheel. The car sounded an alarm that I was going to hit the left median. I was struggling with the car to regain control and I ended up hitting the left median, which ricochet[ed] the car to the right, hitting the median."
Critics say the name Full Self-Driving is a misnomer, and that no car available for sale to an individual in the United States can drive itself. FSD "is entirely a fantasy," said New York University professor Meredith Broussard, author of the book "Artificial Unintelligence," published by MIT Press. "And it's a safety nightmare."
California regulations prohibit a company from advertising a car as fully self-driving when it is not. The state Department of Motor Vehicles is conducting a marketing review of Tesla, a review well into its second year.
DMV Chief Steve Gordon has declined to speak publicly about the issue since May 2021. On Wednesday, the department said, "The review is ongoing. We'll let you know when we have something to share."