Tesla employees exposed a dangerous safety flaw in Autopilot

Tesla's misstep can put road users at risk.

Sep 11, 2024 - 20:30
Tesla's Autopilot driver assistance system is one of the automaker's most well-known technologies, promising drivers a hands-free driving experience in equipped vehicles.

However, Tesla's (TSLA) technology differs from most other autonomous driving systems, such as the radar- and LiDAR-based systems used by companies like Waymo. Tesla relies on a system of cameras that covers every corner of its vehicles, as well as machine learning software that makes on-the-fly decisions about how to respond to hazards like road signs and parked cars.

Related: Park next to a crime? Police say your Tesla could be a star witness

Additionally, Tesla employs a team of researchers and programmers who continuously analyze what the Autopilot cameras see and adapt the software to respond to the various conditions its vehicles encounter.

However, while Tesla owners and other road users might expect that the engineers behind Autopilot are intent on keeping their promise to make roads safer, a new report suggests otherwise.

A driver rides hands-free in a Tesla Model S equipped with Autopilot hardware and software in New York on Sept. 19, 2016.

Bloomberg/Getty Images

No road rules at Tesla

Per a recent report from Business Insider, 17 current and former workers on the Tesla data annotation team, at offices in Buffalo, N.Y., Palo Alto, Calif., and Draper, Utah, revealed a whole host of important and damning information about their jobs, which primarily revolved around reviewing 30-second clips recorded by the cameras responsible for keeping Autopilot functioning.

Tasked with interpreting the footage in light of the road rules in each of the different parts of the world where Tesla vehicles equipped with Autopilot are sold, seven of the employees noted that, at times, the automaker took a more relaxed stance on those rules.

The extent of this stance, some workers revealed, went as far as being told not to teach Autopilot to follow certain traffic signs like "No Turn On Red" or "No U-Turn" in an effort to make its systems drive the cars more "human-like."

"It be a driver-first mentality," a former Tesla employee told BI. "I think the basis is we want to train it to drive like a human would, now not a robot it could actually probably possibly be only following the foundations."

More Business of EVs:

  • Waymo finds new ways to bring chaos to quiet city streets
  • Gavin Newsom's 'EV mandate' is under U.S. Supreme Court threat
  • BMW's clever, new EV app is a privacy nightmare

Additionally, much like Facebook content moderators, these workers viewed some pretty disturbing footage day in and day out, which was not limited to Teslas getting into accidents and near misses. Some workers even disclosed to BI that a fellow employee shared, as a joke, a disturbing video of a Tesla vehicle hitting a young boy on a bicycle.

Similarly, Tesla employees told the publication that the company monitors the data annotation personnel with surveillance cameras, as well as software that tracks their speed and keystrokes. On any given shift, they could spend five to seven-and-a-half hours annotating videos.

"Once in a while it could actually probably possibly be going to get monotonous," an ex-Tesla employee said. "Which you'd spend eight hours a day for months on end just labeling lane lines and curbs across thousands of videos."

Related: Tesla rival is not playing games with self-driving safety

A flawed 'safety system'

In late July, a report released by the Wall Street Journal found many foundational flaws and shortcomings related to Autopilot and FSD.

The outlet combed through details of more than 200 crashes involving Teslas equipped with Autopilot and FSD and found that most of the flaws could be attributed directly to the systems' overreliance on cameras and machine learning software.

Though Tesla's data annotators told BI that they do their best to train the cameras to spot objects like road signs, stopped cars, trucks, or animals, there are still major gaps in what the software behind the cameras can recognize.

“The kind of things that tend to go wrong with these systems are things like it was not trained on the images of an overturned double trailer – it just didn’t know what it was,” Phil Koopman, associate professor of electrical and computer engineering at Carnegie Mellon University, told the Journal.

“A person would have clearly said ‘something big is in the middle of the road,’ but the way machine learning works is it trains on a bunch of examples. If it encounters something it doesn’t have a bunch of examples for, it may have no idea what’s going on.”

The risks that systems like Autopilot and FSD present could be addressed with the addition of radar and LiDAR systems; however, Tesla CEO Elon Musk is not a fan.

He has consistently said that such technology is "unnecessary" and that installing LiDAR on Tesla's cars would be like fitting them with a "whole bunch of expensive appendices."

Tesla, Inc., which trades on the Nasdaq under the ticker TSLA, is up 4.58% today, trading at $226.17 at the time of writing.

Tesla did not immediately respond to a request for comment.

Related: Veteran fund manager sees world of pain coming for stocks
