The latest version of Tesla's Full Self-Driving (FSD) Autopilot beta has a bit of a kink: it doesn't appear to notice child-sized objects in its path, according to a campaign group.
In tests performed by The Dawn Project using a Tesla Model 3 equipped with FSD version 10.12.2 (the latest, released June 1), the vehicle was given 120 yards (110 meters) of straight track between two rows of cones with a child-sized mannequin at the end.
The group says the "test driver's hands were never on the wheel." Crucially, Tesla says that Autopilot is not a fully autonomous system and only provides assistance and cruise control functionality to the driver. You're supposed to keep your hands on the wheel and be able to take over at any time.
Traveling at approximately 25mph (about 40kph), the Tesla hit the dummy each time.
Commenting on the results, The Dawn Project said 100 yards of distance is more than enough for a human driver to notice a child, stating: "Tesla's Full Self-Driving software fails this simple and safety critical test repeatedly, with potentially deadly results."
"Elon Musk says Tesla's Full Self-Driving software is 'amazing.' It's not... This is the worst commercial software I've ever seen," said the project's founder, Dan O'Dowd, in a video he tweeted out along with the results.
O'Dowd, who also founded Green Hills Software in 1982 and advocates for software safety, has been an opponent of Tesla for some time, even launching a bid for US Senate in California that centered on policing Tesla as a way to talk about broader cybersecurity issues. O'Dowd's Senate bid ended in June when he lost the Democratic party primary.
The Dawn Project's stated goal is "making computers safe for humanity." Tesla FSD is the Project's first campaign.
It's worth noting that The Dawn Project's tests of FSD 10.12.2, which took place on June 21 in Rosamond, CA, consisted of only three runs. That's a small sample size, but in light of other Autopilot tests and statistics, the outcome isn't unexpected.
Malfunctions in Tesla Autopilot have been cited as allegedly being a factor in several fatal accidents involving both drivers and pedestrians over the years. Last year Tesla rolled back FSD software releases after bugs were discovered that caused trouble with left turns, something Tesla is still working on.
In early June, the US National Highway Traffic Safety Administration upgraded a probe of Tesla Autopilot after it found reasons to look into whether "Autopilot and associated Tesla systems may exacerbate human factors or behavioral safety risks." The investigation is ongoing.
A week after announcing its probe, the NHTSA said Tesla Autopilot (operating at level 2, not FSD) was involved in 270 of the 394 driver-assist accidents - around 70 percent - it cataloged as part of an investigation into the safety of driver assist technology.
Most recently, the California Department of Motor Vehicles filed complaints against Tesla alleging it misrepresented claims the vehicles can drive autonomously. If Tesla doesn't respond to the DMV's claims by the end of this week, the case will be settled by default and could lead to the automaker losing its license to sell cars in California.
The Dawn Project said that the NHTSA has acted quickly to issue recalls on Tesla features in the past, pointing to the NHTSA-spurred recalls of FSD code that allowed Teslas to roll past stop signs, and of the Tesla Boombox feature, which was disabled.
The Dawn Project says its research "is far more serious and urgent." ®