New York Times ad warns of Tesla’s ‘Full Self-Driving’

A full-page ad in Sunday’s New York Times took aim at Tesla’s ‘Full Self-Driving’ software, calling it ‘the worst software ever sold by a Fortune 500 company’ and offering $10,000, the same price as the software itself, to the first person who can name “another commercial product from a Fortune 500 company that has a critical malfunction every 8 minutes.”

The ad was taken out by The Dawn Project, a recently founded organization that aims to ban unsafe software from safety-critical systems that can be targeted by military-style hackers, as part of a campaign to remove Tesla’s Full Self-Driving (FSD) from public roads until it has “1,000 times fewer critical malfunctions.”

The advocacy group’s founder, Dan O’Dowd, is also the CEO of Green Hills Software, a company that builds operating systems and programming tools for embedded safety and security systems. At CES, the company said BMW’s iX vehicle is using its real-time operating system and other safety software, and it also announced the availability of its new over-the-air software product and data services for automotive electronic systems.

Setting aside The Dawn Project founder’s potential competitive bias, Tesla’s FSD beta software, an advanced driver assistance system that Tesla owners can use to handle certain driving functions on city streets, has come under intense scrutiny in recent months after a series of YouTube videos showing flaws in the system went viral.

The NYT ad comes just days after the California Department of Motor Vehicles told Tesla it would be “revisiting” its opinion that the company’s testing program, which uses consumers rather than professional safety operators, does not fall under the department’s autonomous vehicle regulations. The California DMV regulates autonomous vehicle testing in the state and requires other companies like Waymo and Cruise that are developing, testing, and planning to deploy robotaxis to report crashes and system failures called “disengagements.” Tesla has never issued those reports.

Tesla CEO Elon Musk has since responded vaguely on Twitter, claiming that Tesla’s FSD has not resulted in any accidents or injuries since its launch. The U.S. National Highway Traffic Safety Administration (NHTSA) is investigating a report from a Tesla Model Y owner who said his vehicle went into the wrong lane while making a left turn in FSD mode, resulting in the vehicle being struck by another driver.

While this appears to be the first reported FSD crash, Tesla’s Autopilot, the automaker’s ADAS that comes standard on its vehicles, has been involved in around a dozen crashes.

Alongside the NYT ad, The Dawn Project published a fact check of its claims, referring to its own FSD safety analysis, which studied data from 21 YouTube videos totaling seven hours of drive time.

The videos analyzed included beta versions 8 (released in December 2020) and 10 (released in September 2021), and the study avoided videos with significantly positive or negative titles to reduce bias. Each video was graded according to the California DMV’s Driver Performance Evaluation, which is what human drivers must pass to earn a driver’s license. To pass the driving test, California drivers must have 15 or fewer scoring maneuver errors, like failing to use turn signals when changing lanes or failing to maintain a safe distance from other moving vehicles, and zero critical driving errors, like crashing or running a red light.

The study found that FSD v10 committed 16 scoring maneuver errors on average in under an hour and one critical driving error about every 8 minutes. Errors did improve over the nine months between v8 and v10, according to the analysis, but at the current rate of improvement, “it will take another 7.8 years (per AAA data) to 8.8 years (per Bureau of Transportation data) to achieve the accident rate of a human driver.”

The Dawn Project’s ad makes some bold claims that should be taken with a grain of salt, particularly because the sample size is far too small to be taken seriously from a statistical standpoint. If, however, the seven hours of footage is indeed representative of an average FSD drive, the findings could point to a larger problem with Tesla’s FSD software and speak to the broader question of whether Tesla should be allowed to test this software on public roads with no regulation.

“We did not sign up for our families to be crash test dummies for thousands of Tesla cars on public roads…” the ad reads.

Federal regulators have begun taking action against Tesla and its Autopilot and FSD beta software systems.

In October, NHTSA sent two letters to the automaker targeting its use of non-disclosure agreements for owners who gain early access to the FSD beta, as well as the company’s decision to use over-the-air software updates to fix an issue in the standard Autopilot system that should have been a recall. In addition, Consumer Reports issued a statement over the summer saying the FSD version 9 software upgrade didn’t appear to be safe enough for public roads and that it would independently test the software. Last week, the organization published the results of its tests, which revealed that “Tesla’s camera-based driver monitoring system fails to keep a driver’s attention on the road.” CR found that Ford’s BlueCruise, by contrast, issues alerts when the driver’s eyes are diverted.

Since then, Tesla has rolled out many different versions of its v10 software – 10.9 should be here any day now, and v11 with “a single city/highway software stack” and “many other architectural upgrades” is coming out in February, according to CEO Elon Musk.

Reviews of the latest version 10.8 are mixed, with some online commenters saying it’s much smoother, and many others saying they don’t feel confident using the tech at all. A thread reviewing the newest FSD version on the Tesla Motors subreddit shows owners sharing complaints about the software, with one even writing: “Definitely not ready for the general public yet…”

Another reviewer said the car took too long to turn right onto “a completely empty straight road… Then it had to turn left and continued to hesitate for no reason, blocking the lane coming in the opposite direction, only to then suddenly accelerate once it had made it to the next street, followed by an equally sudden deceleration because it changed its mind about the speed and now thought a 45 mph road was 25 mph.”

The driver said he eventually had to disengage entirely because the system completely ignored an upcoming left turn, one that was to occur at a standard intersection “with lights and clear visibility in all directions and no other traffic.”

The Dawn Project’s campaign highlights a warning from Tesla that its FSD “may do the wrong thing at the worst time.”

“How can anyone tolerate a safety-critical product on the market which can do the wrong thing at the worst time,” the advocacy group said. “Isn’t that the definition of defective? Full Self-Driving must be removed from our roads immediately.”

Neither Tesla nor The Dawn Project could be reached for comment.
