Engineering:Tesla Autopilot

From HandWiki
Short description: Suite of advanced driver-assistance system features by Tesla
Tesla Autopilot in operation

Tesla Autopilot is an advanced driver-assistance system (ADAS) developed by Tesla that amounts to partial vehicle automation (Level 2 automation, as defined by SAE International). Tesla provides "Base Autopilot" on all vehicles, which includes lane centering and traffic-aware cruise control. Owners may purchase an upgrade to "Enhanced Autopilot" (EA), which adds semi-autonomous navigation on limited-access roadways, self-parking, and the ability to summon the car from a garage or parking spot. The company claims the features reduce accidents caused by driver negligence and fatigue on long drives.[1][2] Collisions and deaths involving Tesla cars with Autopilot engaged have drawn the attention of the press and government agencies.[3]

Full Self-Driving (FSD) is Tesla's branding for its beta testing program aimed at achieving fully autonomous driving (SAE Level 5). The name is controversial, because vehicles operating under FSD remain at Level 2 automation and are therefore not "fully self-driving", requiring active driver supervision. FSD adds semi-autonomous navigation on city streets and the ability to respond to visible traffic lights and stop signs. As of February 2023, Tesla has about 360,000 participants in the FSD program.[4] Industry observers and academics have criticized Tesla's decision to use untrained consumers to validate beta features as dangerous and irresponsible.[5][6][7][8]

The company's stated intent is to offer fully autonomous driving at a future time, acknowledging that technical and regulatory hurdles must be overcome to achieve this goal.[9] Since 2013, Tesla CEO Elon Musk has made repeated inaccurate predictions for Tesla to achieve Level 5 autonomy,[10] most recently predicting the end of 2023.[11]

History

Tesla Autopilot engaged on I-80 near Lake Tahoe

Elon Musk first discussed the Tesla Autopilot system publicly in 2013, noting that "Autopilot is a good thing to have in planes, and we should have it in cars."[12] Over the ensuing decade, Autopilot went through a series of hardware and software enhancements, gradually approaching the goal of full autonomy, which, as of January 2024, remains a work in progress. Autopilot, as initially introduced in 2014, referred to automatic parking and low-speed summoning on private property,[13] using sensor and computing hardware developed by Mobileye. By 2016, the Mobileye-based Autopilot had added automatic emergency braking, adaptive cruise control, and lane centering;[14] Tesla and Mobileye dissolved their partnership in July of that year.[15] Enhanced Autopilot (EA) was announced later in 2016 as an extra-cost option built on a new hardware suite developed by Tesla.[16] Its key distinguishing feature, "Navigate on Autopilot", which uses the new hardware suite to guide the vehicle on controlled-access roads from on-ramp to off-ramp, was delayed until 2018.[17] Alongside EA, Tesla also offered Full Self-Driving (FSD) in 2016 as an upgrade option that would extend machine-guided driving capabilities to local roads.[16] FSD beta testing started in October 2020.[18]

Hardware 1 and Autopilot (Mobileye)

In October 2014, Tesla offered customers the ability to pre-purchase Autopilot[13][19][20] that was not designed for self-driving.[21] Initial versions were built in partnership with Mobileye,[22] but Mobileye ended the partnership in July 2016 because Tesla "was pushing the envelope in terms of safety".[23][24]

Tesla cars manufactured after September 2014 had the initial hardware (hardware version 1 or HW1) that supported Autopilot.[25] The first Autopilot software release came in October 2015 as part of Tesla software version 7.0.[26] Version 7.1 removed some features to discourage risky driving.[27]

Version 8.0 processed radar signals to create a point cloud similar to lidar to help navigate in low visibility.[28][29] In November 2016, Autopilot 8.0 was updated to encourage drivers to grip the steering wheel.[30][31] By November 2016, Autopilot had operated for 300 million miles (500 million km).[32]

Hardware 2 and 3, and Navigate on Autopilot

In October 2016, Autopilot sensors and computing hardware transitioned to hardware version 2 (HW2) for new cars;[33] the upgraded hardware was collectively called Autopilot 2.0 to distinguish it from the original Autopilot/HW1 vehicles.[34] At launch, Autopilot 2.0 vehicles with HW2 actually had fewer features than HW1 vehicles; for example, HW2 vehicles could not be summoned in 2016.[35][36]

Tesla also used the term Enhanced Autopilot (EA) to refer to planned capabilities that would be coming to HW2 vehicles; the signature EA feature announced in December 2016 was "Navigate on Autopilot", which allows machine-controlled driving on controlled-access highways from on-ramp to off-ramp, including the abilities to change lanes without driver input, transition from one freeway to another, and exit.[37] HW2 vehicles were updated in January and February 2017 with software version 8.0, which included traffic-aware cruise control and autosteer (lane-centering) on divided highways and 'local roads' up to a speed of 45 miles per hour (72 km/h).[38][39] Software version 8.1 for HW2 arrived in March 2017, providing HW2 cars feature parity with HW1 cars, but not "Navigate on Autopilot".[40]

Autopilot version 9 enabled autonomous lane changes to pass vehicles moving below the set cruising speed; the most aggressive mode was named "Mad Max" after the media franchise.[41]

In August 2017, Tesla announced hardware version 2.5 (HW2.5), which upgraded the on-board processor and added redundant systems.[42] Software version 9.0 was released in October 2018,[43] in preparation for the release of "Navigate on Autopilot" for HW2/HW2.5 vehicles with EA, which was implemented later that month.[44] At the same time, Tesla removed the option to purchase the "Full Self-Driving" upgrade.[45] In a November 2018 test drive, The Verge reporter Andrew J. Hawkins called the beta "Navigate on Autopilot" system "the feature that could give Tesla an edge as it grows from niche company to global powerhouse".[46] As initially released, "Navigate on Autopilot" would suggest lane changes, but could not change lanes until the suggestion had been confirmed by the driver through the turn signal stalk.[47]

In March 2019, Tesla transitioned to hardware version 3 (HW3) for new cars.[48] Completely automated lane changes without requiring driver confirmation using "Navigate on Autopilot" were added as an option in an April software update,[49] although Consumer Reports called it "far less competent" than a human driver.[50] To comply with the new United Nations Economic Commission for Europe regulation related to automatically commanded steering function,[51] Tesla provided an updated Autopilot in May, limited to Europe.[52] In September, Tesla released software version 10 to Early Access Program (EAP) testers, citing improvements in driving visualization and automatic lane changes.[53]

Full Self-Driving video promotion

In October 2016, at the same time as the release of HW2,[54] Tesla released a video entitled "Full Self-Driving Hardware on All Teslas"[55] that claimed to demonstrate Full Self-Driving, the system designed to extend automated driving to local roads.[56][57] (CEO Elon Musk tweeted a link to a longer version in November 2016.[58]) In the video, the person in the driver's seat does not touch the steering wheel or pedals during the demonstration. The video also shows perspectives from the vehicle's cameras and image recognition system.[59] At Musk's suggestion, the title card states "The person in the driver's seat is only there for legal reasons. He is not doing anything. The car is driving itself."[60] It was nicknamed the "Paint It Black" video, after the 1966 Rolling Stones song used as its soundtrack.[56]

Former employees who helped produce the 2016 video were interviewed by The New York Times in 2021.[61] They stated that the vehicle was following a route that had been mapped in advance with detailed scanning cameras, a technology that was not, and still is not, available in Tesla production cars. Even with these augmentations in place, human drivers had to intervene to take control; the vehicle allegedly struck "a roadside barrier" on the Tesla grounds during filming, requiring repairs, and crashed into a fence while trying to park automatically.[62] In January 2024, Bloomberg published an exposé based on internal Tesla emails revealing that Musk personally oversaw the editing and post-production of the video.[63]

Motor Trend and Jalopnik compared what Tesla had showcased to the deceptive video depicting a Nikola One EV truck that was actually powered by gravity;[64] Jalopnik commented "[the Tesla video] may be worse, because this video was used to deceptively suggest capabilities of a system deployed into real people's hands and used on public roads."[65] In June 2022, Ashok Elluswamy, director of Autopilot software, gave a deposition in a civil lawsuit filed against Tesla by the family of a driver who was killed in 2018 when the Model X he was driving on Autopilot crashed into a concrete barrier in Mountain View, California. Elluswamy stated that the video was not originally intended "to accurately portray what was available for customers in 2016. It was to portray what was possible to build into the system," while the final video carried no such disclaimer.[66] A Florida circuit court judge also cited the video as part of Tesla's marketing strategy in rejecting Tesla's motion to dismiss a lawsuit over a 2019 death, writing that "absent from this video is any indication that the video is aspirational or that this technology doesn't currently exist in the market."[67]

Full Self-Driving options

At the time the "Paint It Black" video was released in 2016, FSD was acknowledged to be "some way off in the future."[37] The option to purchase the FSD upgrade to EA was removed from Tesla's website in October 2018; Elon Musk tweeted that the upgrade was "causing too much confusion". Technology analyst Rob Enderle called the removal of the upgrade option "incredibly stupid", adding "don't release a system that doesn't work and make it hard to order."[68] During a January 2019 earnings call, Elon Musk reiterated "full self-driving capability is there", referring to "Navigate on Autopilot", an EA feature limited to controlled-access highways.[45] The EA option was replaced by FSD in 2019 without offering "Navigate on Autopilot"-like functionality for local roads; autosteer and traffic-aware cruise control were transferred to the basic Autopilot feature set, which was made standard on all new Teslas.[69][70]

In September 2020, Tesla reintroduced the term Enhanced Autopilot to distinguish the existing subset of features (high-speed highway travel plus low-speed parking and summoning) from FSD, which would add medium-speed city road travel.[71] Tesla released a "beta" version of its FSD software, which extended "Navigate on Autopilot"-like machine-controlled driving and navigation to local roads, in the United States in October 2020 to EAP testers.[72][73] The EA option tier was made available to all buyers by June 2022,[69] and the FSD beta had expanded to approximately 160,000 testers in the United States and Canada by September 2022.[74] In November 2022, the FSD beta was extended to all owners in North America who had purchased the option.[75] As of February 2023, Tesla has about 360,000 participants in this program.[4]

In February 2023, 362,758 vehicles equipped with the FSD beta were recalled by the U.S. National Highway Traffic Safety Administration (NHTSA), and the company stopped adding new participants.[76][77] Tesla said it would issue a software update to address the issues identified by the regulator.[78] FSD Beta v11, which also merged the Autopilot and FSD software stacks, was released in March to fix the issues.[79] In July, NHTSA asked Tesla to clarify which changes had been made, and when they were implemented.[80]

Pricing

In 2016, the initial price of Autopilot (then called "Enhanced Autopilot") was $5,000, with FSD available as a $3,000 add-on.[81] In April 2019, basic Autopilot became standard on every Tesla car,[82] and FSD cost $5,000, rising to $10,000 in October 2020 and $15,000 in September 2022.[83] As the price of FSD increased, the fraction of buyers who purchased it steadily declined, from an estimated 37% in 2019 to 22% in 2020 and 12% in 2021.[84] Since 2021, the company has offered an FSD subscription for $199 per month, or $99 per month for customers who have already purchased Enhanced Autopilot.[85]

Team executive turnover

There has been high turnover in Autopilot lead roles.[86] Some of the executives leaving were:

  • 2015–2016: Sterling Anderson (Autopilot Director)[87] left to start a competing company.
  • 2017–2017: Chris Lattner (Vice President of Autopilot Software) left after six months due to cultural fit issues.[88]
  • 2017–2018: Jim Keller (Vice President, Autopilot Hardware Engineering) left to join Intel.[89]
  • 2018–2022: Andrej Karpathy (Senior Director, Artificial Intelligence) left to revisit "long-term passions around technical work in AI, open source and education".[90][91]

As of November 2022, executives included:

  • Ashok Elluswamy (Director, Autopilot Software)[92]
  • Milan Kovac (Director, Autopilot Software Engineering)[92]
  • Pete Bannon (Vice President, Hardware Engineering)[93]

Regional availability

UK and Europe

Autopilot capabilities in the United Kingdom (UK) differ from those available in the United States. The UK product is a suite of technologies such as adaptive cruise control (ACC), lane-keeping assistance (LKA) and automatic emergency braking (AEB), similar to what is available in other European vehicles.[94]

Euro NCAP compared the Tesla Model 3 Autopilot ADAS suite with those available in other European vehicles. It criticized Tesla for disseminating misleading and confusing information about the system's capabilities, and noted a risk of driver overreliance.[95]

The handbook correctly indicates the limitations of the system capabilities.
—EuroNCAP[95]

Other countries

In China, reports on availability conflict: some indicate that basic Autopilot and FSD are not available,[96][97] while others indicate that Basic Autopilot, Enhanced Autopilot and FSD are available.[97][98][96]

In Australia, Basic Autopilot and Enhanced Autopilot are available, while the US FSD beta is not; lane change is available as part of Enhanced Autopilot.[99] An FSD offering that differs from the US beta is available; for instance, turning on city streets may not be available in the Australian version,[100] although it may stop at stop signs.[101]

Full Self-Driving

Full Self-Driving (FSD) capability[102] is a pre-release upgrade package to Autopilot offering additional ADAS features, such as traffic light recognition. As of February 2023, 363,000 users in North America have access to the FSD beta.[103] In April 2023, Musk announced that users had driven 150 million miles on the FSD beta.[104]

Approach

Tesla's approach to achieving SAE Level 5 is to train a neural network on the behavior of more than three million Tesla drivers,[105] chiefly using visible-light cameras together with information from components that serve other purposes in the car (the coarse-grained two-dimensional maps used for navigation, the ultrasonic sensors used for parking, etc.[106][107]). Tesla has made a deliberate decision not to use lidar, which Elon Musk has called "stupid, expensive and unnecessary".[108] This makes Tesla's approach markedly different from that of companies like Waymo and Cruise, which train their neural networks on the behavior of a small number of highly trained drivers[109][110] and additionally rely on highly detailed (centimeter-scale) three-dimensional maps and lidar in their autonomous vehicles.[107][111][112][113][114]
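At its core, training on the recorded behavior of human drivers is supervised imitation learning (behavior cloning): the steering commands drivers actually applied serve as training labels for a model that sees the camera input. The sketch below is illustrative only; Tesla's actual pipeline uses large multi-camera convolutional networks, and all names, shapes, and the linear model here are assumptions for the sake of a runnable example.

```python
# Minimal behavior-cloning sketch: fit a model to imitate human steering
# commands recorded alongside camera frames. Everything here (shapes, the
# linear model, the synthetic data) is an illustrative assumption, not
# Tesla's actual training pipeline.
import numpy as np

rng = np.random.default_rng(0)

# Fake "fleet data": 256 frames, each reduced to a 64-value feature vector,
# paired with the steering angle the human driver actually applied.
frames = rng.normal(size=(256, 64))
true_w = rng.normal(size=64)                       # hidden "driver policy"
human_steering = frames @ true_w + rng.normal(scale=0.01, size=256)

# Behavior cloning = ordinary supervised regression on (frame, steering) pairs.
w = np.zeros(64)
for _ in range(500):                               # gradient descent on MSE
    pred = frames @ w
    grad = frames.T @ (pred - human_steering) / len(frames)
    w -= 0.1 * grad

mse = float(np.mean((frames @ w - human_steering) ** 2))
print(f"imitation MSE: {mse:.5f}")                 # small: model mimics drivers
```

The same structure scales up in the real setting: richer inputs (multi-camera video), a deeper network in place of the linear map, and billions of labeled miles in place of 256 synthetic frames.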

According to Elon Musk, full autonomy is "really a software limitation: The hardware exists to create full autonomy, so it's really about developing advanced, narrow AI for the car to operate on."[115][116] The Autopilot development focus is on "increasingly sophisticated neural nets that can operate in reasonably sized computers in the car".[115][116] According to Musk, "the car will learn over time", including from other cars.[117]

As of April 2020, Tesla's software had been trained on 3 billion miles driven by Tesla vehicles on public roads.[118][119] Competitors, alongside tens of millions of miles on public roads,[120] had trained their software on tens of billions of miles in computer simulations as of January 2020.[121] In terms of computing hardware, Tesla designed a self-driving computer chip that has been installed in its cars since March 2019,[122] and also designed and built in-house a neural network training supercomputer ("Tesla Dojo");[123][124] other vehicle automation companies, such as Waymo, regularly use custom chipsets and neural networks as well.[125][126]

Predictions and deployment

I don't think we have to worry about autonomous cars because it's a sort of a narrow form of AI. It's not something I think is very difficult. To do autonomous driving that is to a degree much safer than a person, is much easier than people think. [...] I almost view it like a solved problem.

 — Elon Musk, Opening keynote, Nvidia conference (March 2015)[127]

In December 2015, Musk predicted that "complete autonomy" would be implemented by 2018.[128] At the end of 2016, Tesla expected to demonstrate full autonomy by the end of 2017,[129][130] and in April 2017, Musk predicted that in around two years, drivers would be able to sleep in their vehicle while it drives itself.[131] In 2018 Tesla revised the date to demonstrate full autonomy to be by the end of 2019.[132]

I think we will be feature complete, full self-driving, this year. Meaning the car will be able to find you in a parking lot, pick you up and take you all the way to your destination without an intervention. This year. I would say I am certain of that, that is not a question mark. However, people sometimes will extrapolate that to mean now it works with 100% certainty, requiring no observation, perfectly, this is not the case.

 — Elon Musk, Podcast (February 2019)[133][134]

In 2019[135][136] and 2020,[137] Tesla's order page for "Full Self-Driving Capability" stated:

Coming later this year:
  • Recognize and respond to traffic lights and stop signs
  • Automatic driving on city streets.

In January 2020, Musk claimed the FSD software would be "feature complete" by the end of 2020, adding that feature complete "doesn't mean that features are working well".[138] In August 2020, Musk stated that 200 software engineers, 100 hardware engineers and 500 "labelers" were working on Autopilot and FSD.[139] In early 2021, Musk stated that Tesla would provide SAE Level 5 autonomy by the end of 2021.[140][141] In a March 2021 conference call between Tesla and the California Department of Motor Vehicles (DMV), Tesla's director of Autopilot software revealed that Musk's comments "did not reflect engineering reality." Details of the call were made public via a Freedom of Information Act request by PlainSite.[142]

Full Self-Driving beta

In October 2020, Tesla first released a beta version of its FSD software to early access program testers, a small group of users in the United States.[143][72][73] Musk stated that the testing of FSD beta "[w]ill be extremely slow [and] cautious" and "be limited to a small number of people who are expert [and] careful drivers".[72] The release of the beta program renewed concern regarding whether the technology is ready for testing on public roads.[144][145] In January 2021, the number of employees and customers testing the beta FSD software was "nearly 1,000"[146] expanding in May 2021 to several thousand employees and customers.[147]

In October 2021, Tesla began the wide release of the FSD beta to approximately 1,000 more drivers in the US, making the beta accessible to Tesla drivers who achieved a perfect 100/100 on a proprietary safety scoring system.[148] By November 2021 there were about 11,700 FSD beta testers[149] and about 150,000 vehicles using Tesla's safety score system;[150] participation grew to 60,000 users by January 2022[151] and 100,000 users by April 2022.[147] In November 2022, the FSD beta was opened to all North American owners who had purchased the option, regardless of safety score.[75] In August 2023, Musk livestreamed a 45-minute demo of the upcoming version 12 of FSD, which he claimed uses only a neural network and no human-written code.[152] There was one manual intervention: halfway through, the car misinterpreted a green left-turn arrow as allowing forward traffic and almost ran the red light before Musk intervened.[152]

Tesla Dojo

Main page: Tesla Dojo

Tesla Dojo is a supercomputer designed from the ground up by Tesla for computer vision video processing and recognition. It will be used to train Tesla's machine learning models to improve FSD.

Dojo's goal is to efficiently process millions of terabytes of video data captured from real-life driving situations from Tesla's cars.[153] This goal led to a considerably different architecture than conventional supercomputer designs.[154][155]

The defining goal of [Dojo] is scalability. We have de-emphasized several mechanisms that you find in typical CPUs, like coherency, virtual memory, and global lookup directories just because these mechanisms do not scale very well... Instead, we have relied on a very fast and very distributed SRAM [static random-access memory] storage throughout the mesh. And this is backed by an order of magnitude higher speed of interconnect than what you find in a typical distributed system.

 — Emil Talpes, Tesla hardware engineer, 2022 The Next Platform article[155]

Dojo was first mentioned by Musk in April 2019,[156][157] and again in August 2020.[157] It was officially announced by Musk at Tesla's AI Day on August 19, 2021.[158] In September 2021, a Tesla Dojo whitepaper was released. In August 2023, Tesla said that it had started production use of Dojo, configured with 10,000 Nvidia chips.[159]

Dojo consists of multiple cabinets. Each cabinet holds multiple vertically arranged training tiles.[160] Each tile holds multiple Tesla-designed D1 processing chips with associated memory.[161] According to Tesla's senior director of Autopilot hardware, Ganesh Venkataramanan, "Tesla places 25 of these chips onto a single 'training tile', and 120 of these tiles come together... amounting to over an exaflop [a million teraflops] of power".[162] (As of August 2021, Nvidia stated that the pre-Dojo Tesla AI-training center used 720 nodes of eight Nvidia A100 Tensor Core GPUs (5,760 GPUs in total) for up to 1.8 exaflops of performance.[163]) Dojo supports software programming using PyTorch, rather than C, C++, or CUDA. The static random-access memory (SRAM) presents as a single address space.[160]
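Venkataramanan's tile figures can be sanity-checked with simple arithmetic. The per-chip throughput used below (roughly 362 teraflops at BF16/CFP8 for each D1 chip) is an assumption drawn from Tesla's 2021 AI Day presentation, not a figure stated in this article:

```python
# Back-of-the-envelope check of the "over an exaflop" claim quoted above.
# Assumption: each Tesla D1 chip peaks at ~362 TFLOPS (BF16/CFP8),
# per Tesla's 2021 AI Day presentation.
CHIPS_PER_TILE = 25        # "25 of these chips onto a single 'training tile'"
TILES = 120                # "120 of these tiles come together"
TFLOPS_PER_CHIP = 362      # assumed BF16/CFP8 peak per D1 chip

total_tflops = CHIPS_PER_TILE * TILES * TFLOPS_PER_CHIP
total_exaflops = total_tflops / 1_000_000   # 1 exaflop = one million teraflops
print(f"{total_exaflops:.3f} exaflops")     # ~1.086, i.e. "over an exaflop"
```

Under that assumption, each tile delivers about 9 petaflops, and 120 tiles land just above one exaflop, consistent with the quoted claim.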

Criticism

Tesla's self-driving strategy has been criticized as dangerous and obsolete, as it was abandoned by other companies years ago.[164][165][166] Most experts believe that Tesla's approach of trying to achieve autonomous vehicles while eschewing high-definition maps and lidar is not feasible.[167][168][169] Auto analyst Brad Templeton has criticized Tesla's approach by arguing, "The no-map approach involves forgetting what was learned before and doing it all again."[170] In a May 2021 study by Guidehouse Insights, Tesla was ranked last for both strategy and execution in the autonomous driving sector.[171] Some news reports in 2019 stated "practically everyone views [lidar] as an essential ingredient for self-driving cars"[172] and "experts and proponents say it adds depth and vision where camera and radar alone fall short."[173]

An August 2021 study by Missy Cummings et al. found that three Tesla Model 3 cars exhibited "significant between and within vehicle variation on a number of metrics related to driver monitoring, alerting, and safe operation of the underlying autonomy... suggest[ing] that the performance of the underlying artificial intelligence and computer vision systems was extremely variable."[174]

In September 2021, legal scholars William Widen and Philip Koopman argued that Tesla's advertising of FSD as an SAE Level 2 system was misleading to "avoid regulatory oversight and permitting processes required of more highly automated vehicles".[175] Instead, they argued FSD should be considered a SAE Level 4 technology and urged state Departments of Transportation in the U.S. to classify it as such since publicly available videos show that "beta test drivers operate their vehicles as if to validate SAE Level 4 (high driving automation) features, often revealing dramatically risky situations created by use of the vehicles in this manner."[175]

Driving features

Tesla's Autopilot is classified as Level 2 under the SAE six levels (0 to 5) of vehicle automation.[176] At this level, the car can act autonomously, but requires the driver to monitor the driving at all times and be prepared to take control at a moment's notice.[177][178] Tesla's owner's manual states that Autopilot should not be used on city streets or on roads where traffic conditions are constantly changing;[179][180][181] however, some FSD capabilities ("traffic and stop sign control (beta)"), and future FSD capabilities ("autosteer on city streets") are advertised for city streets.[182]

Overview of features[182] (all packages are classified at SAE Level 2)

  Feature                          Base Autopilot   Enhanced Autopilot (EA)   Full Self-Driving (FSD)
  Traffic-Aware Cruise Control     Yes              Yes                       Yes
  Autosteer                        Yes              Yes                       Yes
  Navigate on Autopilot            No               Yes                       Yes
  Auto Lane Change                 No               Yes                       Yes
  Autopark                         No               Yes                       Yes
  Summon                           No               Yes                       Yes
  Smart Summon                     No               Yes                       Yes
  Traffic and Stop Sign Control    No               No                        Yes
  Autosteer on City Streets        No               No                        Yes
Feature history (year introduced, supported hardware, and required package; descriptions per [183])

  • Over-the-air updates (2014; HW1/HW2/HW3; AP, EA or FSD): Autopilot updates are received as part of recurring Tesla software updates.
  • Safety Features (2014; HW1/HW2/HW3; AP, EA or FSD): If Autopilot detects a potential front or side collision with another vehicle, bicycle or pedestrian within a distance of 525 feet (160 m), it sounds a warning.[184] Autopilot also has automatic emergency braking that detects objects that may hit the car and applies the brakes, and the car may also automatically swerve out of the way to prevent a collision.
  • Visualization (2014; HW1/HW2/HW3; AP, EA or FSD): The system generates a visualization of what it sees around it, including lane lines and vehicles in front, behind and on either side (in other lanes). It also displays lane markings and speed limits (via its cameras and map data). On HW3, it displays stop signs and traffic signals. It distinguishes pedestrians, bicyclists/motorcyclists, small cars, SUVs, pickup trucks, buses, and large semi-trucks.
  • Speed Assist (2014; HW1/HW2/HW3; AP, EA or FSD): Front-facing cameras detect speed limit signs and compare them against map data to display those limits on the dashboard center display even if no signs are detected.[184]
  • Obstacle Aware Acceleration (2018;[185] HW2/HW3; AP, EA or FSD): Reduces acceleration when an obstacle is detected in the path of travel while driving at low speeds.
  • Blind Spot Monitoring (2019;[186] HW2/HW3; AP, EA or FSD): Sounds a warning chime when an obstacle is detected while changing lanes.
  • Traffic-Aware Cruise Control (2014;[21][187][182] HW1/HW2/HW3; AP, EA or FSD): Also known as adaptive cruise control; maintains a safe distance from the vehicle in front by accelerating and braking as that vehicle speeds up and slows down. It also slows on tight curves, on interstate ramps, and when another car enters or exits the road ahead. It can be enabled at any speed between 0 and 90 mph. By default, it sets the limit at the current speed limit and adjusts its target speed according to changes in speed limits. If road conditions warrant, Autosteer and cruise control disengage, and audio and visual signals indicate that the driver must assume full control.
  • Autosteer (2014;[21][187][182] HW1/HW2/HW3; AP, EA or FSD): Steers the car to remain in its current lane (lane keeping). It can safely change lanes when the driver taps the turn signal stalk.[188] On divided highways, HW2 and HW3 cars limit the feature to 90 mph (145 km/h); on non-divided highways, the limit is five miles per hour over the speed limit, or 45 mph (72 km/h) if no speed limit is detected.[189] If the driver ignores three audio warnings about controlling the steering wheel within an hour, Autopilot disables until a new journey is begun.[190]
  • Emergency Lane Departure Avoidance (2019;[191] HW2/HW3; AP, EA or FSD): Steers to prevent departing from a lane when a collision may result.
  • Lane Departure Avoidance (2019;[191] HW1/HW2/HW3; AP, EA or FSD): Steers to maintain lane centering.
  • Lane Departure Warning (2014;[192][193][194] HW1/HW2/HW3; AP, EA or FSD): Warns the driver when the vehicle begins to drift out of its lane.
  • Navigate on Autopilot (2016;[195][187][196][182] HW2/HW3; EA or FSD): A set of features that automatically guides the car through highway interchanges and exits, including lane changes on certain roads.[197] As of 2019, it navigates freeway interchanges fully from on-ramp to off-ramp, including automatic lane changes.[196][198][199]
  • Automatic Lane Change (2014;[199] HW2/HW3; EA or FSD): Lane changing without driver initiation.[200][201]
  • Autopark (2015;[202][203][204][187][182] HW1/HW2/HW3; EA or FSD): Parks the car in perpendicular or parallel spaces on either side, nose or tail facing in or out, without driver supervision.[205][206]
  • Summon (2014;[21][187][182] HW1/HW2/HW3; EA or FSD): Moves the car forward or backward in tight spaces using the key fob or the Tesla app, without the driver in the car.[207][208]
  • Smart Summon (2019;[209] HW2/HW3; EA or FSD): Enables line-of-sight 150 ft (46 m) remote car retrieval on private property (parking lots) using the key fob or Tesla phone app.[210][211][212]
  • Traffic Signs Aware (2019;[213] HW3; FSD only): Traffic light, stop sign and yield sign recognition.
  • Traffic Light and Stop Sign Control (2020;[182][214] HW3; FSD only): When using Traffic-Aware Cruise Control or Autosteer, stops for stop signs and red traffic lights[215] and proceeds through green lights.[216] Even when Autopilot is not engaged, the system can chime when the traffic light turns green.[217]
  • Autosteer on City Streets (2023;[218] HW3; FSD only): Enables in-city navigation under FSD.
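The traffic-aware cruise control behavior described above (hold the set speed on an open road, otherwise keep a safe following distance behind a lead vehicle) can be illustrated with a toy gap-keeping controller. This is a simple proportional controller under assumed gains and a two-second time gap, not Tesla's actual control law; all function and parameter names are hypothetical.

```python
# Toy adaptive-cruise-control logic: track the set speed when the road is
# clear, otherwise regulate a time-based following gap behind a lead car.
# Illustrative proportional controller only, not Tesla's implementation.
def acc_command(ego_speed, set_speed, gap=None, lead_speed=None,
                time_gap=2.0, k_gap=0.3, k_speed=0.5, a_max=2.0):
    """Return a longitudinal acceleration command in m/s^2 (all speeds m/s)."""
    if gap is None:
        # No lead vehicle: plain cruise control toward the set speed.
        a = k_speed * (set_speed - ego_speed)
    else:
        desired_gap = time_gap * ego_speed            # "two-second rule"
        a = k_gap * (gap - desired_gap) + k_speed * (lead_speed - ego_speed)
        a = min(a, k_speed * (set_speed - ego_speed)) # never exceed set speed
    return max(-a_max, min(a_max, a))                 # comfort/actuator limits

# Open road below the set speed: accelerate.
print(acc_command(ego_speed=25.0, set_speed=30.0))           # positive
# Closing fast on a slower lead car 30 m ahead: brake.
print(acc_command(25.0, 30.0, gap=30.0, lead_speed=15.0))    # negative
```

Real systems layer much more on top (sensor fusion, curve and ramp speed limits, cut-in prediction, driver-monitoring disengagement), but the core feedback structure is the same.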

Hardware

Summary

  • Autopilot hardware 1 (2014): Computer: Mobileye EyeQ3.[222] Forward radar: 160 m (525 ft).[183] Forward camera: one monochrome camera with unknown range. Side cameras: none. Sonars: 12 surrounding, 5 m (16 ft) range.
  • Enhanced Autopilot hardware 2.0 (October 2016):[lower-alpha 1] Computer: Nvidia DRIVE PX 2 AI computing platform.[223] Forward radar: 170 m (558 ft).[183] Front/side camera color filter array: RCCC.[183] Forward cameras: three (narrow, 35°: 250 m / 820 ft; main, 50°: 150 m / 490 ft; wide, 120°: 60 m / 195 ft). Forward-looking side cameras: left and right (90°), 80 m (260 ft). Rearward-looking side cameras: left and right, 100 m (330 ft). Sonars: 12 surrounding, 8 m (26 ft) range.
  • Enhanced Autopilot hardware 2.5 (August 2017):[lower-alpha 2] As hardware 2.0, but with the Nvidia DRIVE PX 2 secondary node enabled[220] and an RCCB front/side camera color filter array.[183]
  • Full self-driving computer hardware 3 (April 2019):[lower-alpha 3] Computer: two identical Tesla-designed "FSD 1 Chip" (12-core) processors. Forward radar: none on cars built from May 2021.[225][lower-alpha 4] Cameras and sonars otherwise as hardware 2.5.
  • Full self-driving computer hardware 4 (October 2022 / February 2023): Computer: three identical Tesla-designed "FSD 2 Chip" (20-core) processors.[224] Forward radar: Model S & X, 300 m (984 ft); Model 3 & Y, none. Forward cameras, all models: main (50°) 150 m (490 ft) and wide (120°) 60 m (195 ft); Model S & X add a bumper camera. Side cameras: 5 MP. Sonars: none.[229]
Notes
  1. All cars sold after October 2016 are equipped with Hardware 2.0, which includes eight cameras (covering a complete 360° around the car), one forward-facing radar, and twelve sonars (also covering a complete 360°). Buyers may choose an extra-cost option to purchase either the "Enhanced Autopilot" or "Full Self-Driving" to enable features. Front and side collision mitigation features are standard on all cars.[219]
  2. Also known as "Hardware 2.1"; includes added computing and wiring redundancy for improved reliability.[220]
  3. Tesla described the previous computer as a supercomputer capable of full self-driving.[221]
  4. Radar hardware is not installed in Model 3 and Model Y vehicles built for the North American market and delivered in May 2021 or later, while Model S and Model X vehicles retain radar hardware.[226] In November 2021, Tesla China also removed radar.[227] Vehicles operating with Tesla FSD Beta do not use radar even if the vehicle has the hardware.[228]

Hardware 1

Vehicles manufactured after late September 2014 are equipped with a camera mounted at the top of the windshield, forward looking radar[230][231] in the lower grille, and 12 ultrasonic acoustic location sensors in the front and rear bumpers that provide a 360-degree view around the car.[232] The computer is the Mobileye EyeQ3.[233] This equipment allows suitably-equipped Tesla Model S and Model X vehicles to detect lane markings, obstacles, and other vehicles, enabling advanced driver-assistance functions branded Autosteer (automatic lane-keeping), Auto lane change, Autopark (parallel parking robot), and Side-collision warning.[232]

The driver initiates an automatic lane change by turning on the turn signal when it is safe to do so (the driver must judge safety because the ultrasonic sensors have a range of only 16 feet); the system then completes the lane change.[201] In 2016 the system did not detect pedestrians or cyclists,[234] and while Autopilot detects motorcycles,[235] there have been two instances of HW1 cars rear-ending motorcycles.[236]

Tesla released a new version of Autopilot in September 2016 that changed the object detection algorithm to more fully use the radar sensor; previously, primary obstacle detection responsibilities fell to the cameras and the radar was used in a secondary role to confirm their presence, but was not given the authority to initiate emergency braking alone. After the update, the radar data was given an equal role in object detection and made capable of identifying "dense obstacles" including "other vehicles, moose, or even alien spaceships", according to Musk. He added that Tesla "believe it would have" prevented the fatal May 2016 underride accident in Williston, Florida, in which Autopilot failed to detect a white trailer against the sky.[237]

Mobileye ended its partnership with Tesla in 2016, stating that Tesla was "going to hurt the interests of [Mobileye] and hurt the interests of an entire industry, if a company of our reputation will continue to be associated with this type of pushing the envelope in terms of safety".[238] Tesla responded that Mobileye backed away after learning Tesla was developing its own vision-based sensor system.[239] Upgrading from Hardware 1 to Hardware 2 is not offered as it would require substantial work and cost.[240]

Hardware 2

Tesla HW2 camera and radar coverage as shown on the company's website

HW2, included in vehicles manufactured after October 2016, includes an Nvidia Drive PX 2[241] GPU for CUDA based GPGPU computation.[242][243] Tesla claimed that the hardware was capable of processing 200 frames per second.[244] Elon Musk called HW2 "basically a supercomputer in a car", referring to its capacity of up to 12 trillion operations per second.[245] The Autopilot computer hardware, housed just above the glovebox, is replaceable to allow for future upgrades.[246][247] Tesla claimed the HW2 suite of sensors and computation provided the necessary equipment to allow FSD at SAE Level 5.[248]

The hardware includes eight cameras covering an aggregate view of 360° around the car and 12 ultrasonic sensors, in addition to forward-facing radar with enhanced processing capabilities.[248] The radar can bounce signals beneath the vehicle directly ahead, letting it observe the vehicle in front of that one;[249] the radar can also see vehicles through heavy rain, fog or dust.[250] The eight cameras are mounted in various locations around the vehicle: three forward-facing, next to the central rearview mirror mounted on the windshield; two front/side cameras, one each mounted in the left and right B-pillars; two rear/side cameras, mounted in the left and right front fender turn-signal repeaters; and one rear camera, above the license plate.[245]

When "Enhanced Autopilot" was enabled in February 2017 by the v8.0 (17.5.36) software update, testing showed the system was limited to using one of the eight onboard cameras—the main forward-facing camera.[251] The v8.1 software update released a month later enabled a second camera, the narrow-angle forward-facing camera.[252] With all eight cameras enabled, data extracted from Autopilot in debugging mode showed the cameras provide a black-and-white feed to the computer, possibly to improve image processing speed.[253]

The Tesla Model 3, introduced in 2017, and the related Model Y, introduced in 2019, are equipped with an additional driver-facing in-cabin camera. The camera was disabled at launch; it was intended to monitor the cabin remotely once the vehicle could operate as an autonomous robotaxi,[254] but it was activated in May 2021 to monitor driver attentiveness while using Autopilot in vehicles without radar sensors.[255]

Hardware 2.5

Tesla HW2.5 (top) and infotainment (bottom) boards

In August 2017, Tesla announced that HW2.5 included a secondary processor node to provide more computing power and additional wiring redundancy to slightly improve reliability; it also enabled dashcam and "sentry mode" capabilities.[256] During this time, the supplier for the system's radar components was changed from Bosch to Continental,[257] using the ARS4-B unit.[258]

Hardware 3

According to Tesla's director of Artificial Intelligence (AI) Andrej Karpathy, Tesla had trained large neural networks as of Q3 2018, but they could not be deployed to the vehicles built up to that time because the onboard computers lacked sufficient computational resources. HW3 was designed to run these neural networks.[259] Overall, Tesla claims HW3 has 2.5× the performance of HW2.5, while drawing 1.25× the power and costing 20% less.[260]

HW3 is based on a custom Tesla-designed system on a chip called "FSD Chip",[261] fabricated by Samsung using a 14 nm process.[262] Jim Keller and Pete Bannon, among other architects, led the project starting in February 2016.[263] The FSD Chip features twelve ARM Cortex-A72 CPU cores operating at 2.6 GHz, two neural network arrays operating at 2 GHz, and a Mali GPU operating at 1 GHz.[260] Tesla claimed that the FSD Chip processes images at 2,300 frames per second (fps), a 21× improvement over the 110 fps image processing capability of HW2.5.[263][264] The firm described the FSD Chip as a "neural network accelerator" custom-designed for Tesla AI processing.[244] Each of the two neural network arrays on a single FSD Chip is capable of 36 trillion operations per second,[261] and there are two FSD Chips for redundancy.[265]

Each of the eight cameras supplied with HW3 uses the same AR0136A image sensor from Onsemi, which has a maximum resolution of 1280×960 (1.2 megapixels) and a 3.75 μm pixel size.[266] Initial versions of HW3 also included a Continental ARS4-B radar module.[266]

The company claimed that HW3 was necessary for FSD, but not for "enhanced Autopilot" functions.[267] The first availability of HW3 was April 2019.[268] Customers with HW2 or HW2.5 who purchased the FSD package are eligible for an upgrade to HW3 without cost.[269] The system board for HW3 is the same physical size as the HW2.5 board, but carries more components.[266]

Tesla Vision

Tesla Vision camera coverage

In late May 2021, Elon Musk posted to Twitter that "Pure Vision Autopilot" was starting to be implemented.[270] The system, which Tesla brands "Tesla Vision", eliminated the forward-facing radar from the Autopilot hardware package on Model 3 and Model Y vehicles built for the North American market starting in May 2021.[271] The Washington Post reported in March 2023 that the immediate result was "an uptick in crashes, near misses and other embarrassing mistakes by Tesla vehicles suddenly deprived of a critical sensor."[272] For vehicles without the forward radar, temporary limitations were applied to certain features such as Autosteer, and other features (Smart Summon and Emergency Lane Departure Avoidance) were disabled, but Tesla promised to restore the features "in the weeks ahead ... via a series of over-the-air software updates".[226]

In response, Consumer Reports delisted the Model 3 from its Top Picks in 2021, and the Insurance Institute for Highway Safety (IIHS) announced plans to delist the Model 3 as a Top Safety Pick+,[273][274] but after further testing, both organizations restored those designations.[275] Also, NHTSA rescinded the agency's check marks for forward collision warning, automatic emergency braking, lane departure warning, and dynamic brake support, applicable to Model 3 and Model Y vehicles built on or after April 27, 2021,[276] but (As of October 2022), those check marks have been restored.[277][278]

In December 2021, the New York Times reported that Musk had made a unilateral decision to pursue the camera-only approach and had "repeatedly told members of the Autopilot team that humans could drive with only two eyes and that this meant cars should be able to drive with cameras alone." Several autonomous vehicle experts have denounced the analogy.[61] According to former employees, Tesla engineers attempted to convince Musk that removing the radar could lead to crashes if the cameras became obscured, but Musk "was unconvinced and overruled" their objections.[272] Brad Templeton noted that lidar "will never fail to see a train or truck, even if it doesn't know what it is. It knows there is an object in front and the vehicle can stop without knowing more than that."[272] In contrast, Tesla Vision relies on the "Autopilot labeling team",[279] hundreds of Tesla employees who view short video clips recorded by the cameras and label visible signs and objects, which trains the machine vision interpreter.[272][280]

Initially, data labeling was handled by a non-profit outsourcing company named Samasource, which provided 400 workers in Nairobi by 2016, but Andrej Karpathy later stated the "quality [of their work] was not amazing" and Tesla began hiring employees for data labeling in San Mateo, California instead.[281] In April 2023, it was revealed that San Mateo Autopilot employees had shared clips internally, including recordings of privately owned areas such as garages, crashes, road-rage incidents, and meme videos annotated with "amusing captions or commentary". Former Tesla employees described the San Mateo office atmosphere as "free-wheeling" and noted "people who got promoted to lead positions shared a lot of these funny [clips] and gained notoriety for being funny." In one case, the submersible Lotus Esprit prop featured in the James Bond film The Spy Who Loved Me, which had been purchased by Elon Musk in 2013 and stored in his garage, was recorded and shared by Autopilot data labeling team members. Because sharing these clips was apparently for entertainment and not related to Autopilot training, Carlo Plitz, a data privacy lawyer, noted "it would be difficult to find a legal justification" for doing so.[281] After the San Mateo office was closed in June 2022, the Autopilot data labeling teams moved to Buffalo, New York,[280] where Tesla employs approximately 675 people.[281]

Tesla announced in October 2022 that it would remove ultrasonic sensors by 2023; vehicles without the ultrasonic sensors would initially ship without the Autopark, Park Assist, Summon, and Smart Summon features. However, Tesla promised these features would be restored in the future through a software update.[229]

Hardware 4

Samsung will make the processor for Hardware 4 at Hwasung, South Korea, on a 7 nm process. The custom SoC is called "FSD Computer 2".[282]

In June 2022, Tesla filed an application with the FCC to use a new radar in future vehicles;[283] based on filings with the Chinese government and a prototype Tesla Model 3 spotted in December 2022 while undergoing testing in Santa Cruz, it is believed that HW4 will include high-resolution cameras and radar. In addition, fans and/or heaters have been fitted to the cameras to prevent fogging.[284] The Continental ARS4-B previously used with HW2.5 and HW3 had a total capacity of eight transmit/receive channels, while updated automotive radars have up to 200 channels, giving them capabilities similar to lidar.[258] However, an updated filing with the FCC in 2023 indicates that any new radar included with HW4 is not likely to be high-resolution.[285]

Comparisons

  • In 2018, Consumer Reports rated Tesla Autopilot as second best out of four (GM, Tesla, Nissan, Volvo) "partially automated driving systems".[286] Autopilot scored highly for its capabilities and ease of use, but was worse at keeping the driver engaged than the other manufacturers' systems.[286] Consumer Reports also found multiple problems with Autopilot's automatic lane change function, such as cutting too closely in front of other cars and passing on the right.[287]
  • In 2018, the Insurance Institute for Highway Safety compared Tesla, BMW, Mercedes and Volvo "advanced driver assistance systems" and stated that the Tesla Model 3 experienced the fewest incidents of crossing over a lane line, touching a lane line, or disengaging.[288]
  • In February 2020, Car and Driver compared GM's Super Cruise, comma.ai and Autopilot.[289] They called Autopilot "one of the best", highlighting its user interface and versatility, but criticizing it for swerving abruptly.
  • In June 2020, Digital Trends compared GM's Super Cruise self-driving and Tesla Autopilot.[290] The conclusion: "Super Cruise is more advanced, while Autopilot is more comprehensive."
  • In October 2020, the European New Car Assessment Program gave the Tesla Model 3 Autopilot a score of "moderate".[291]
  • Also in October 2020, Consumer Reports evaluated 17 driver assistance systems, and concluded that Tesla Autopilot was "a distant second" behind GM's Super Cruise, although Autopilot was ranked first in the "Capabilities and Performance" and "Ease of Use" categories.[292][293]
  • In February 2021, a MotorTrend review compared GM's Super Cruise and Autopilot and said Super Cruise was better, primarily due to safety.[294]
  • In May 2021, consulting firm Guidehouse Insights ranked Tesla Full Self-Driving last in strategy and execution among 15 companies.[171]
  • In January 2023, Consumer Reports rated 12 active driving assistance systems and ranked Tesla Autopilot at 7th. The Full Self-Driving package was not tested.[295]
  • In December 2023, TechCrunch ranked Full Self-Driving last out of five systems evaluated, saying "it's pretty easy to choose a loser. Three years after its initial beta release, Tesla's supposed Full Self-Driving still doesn't live up to its name", adding "the FSD beta software [was] frequently confused on urban and rural streets" and "Tesla's driver monitoring was by far the most lax of those tested".[296]

Safety statistics and concerns

Safety statistics

Millions of miles driven between accidents with and without Autopilot engaged, according to Tesla's self-reported quarterly data[2]

In April 2016, Elon Musk stated, without citing supporting data, that the probability of an accident was at least 50% lower when using Autopilot. At the time, it was estimated that Teslas had collectively been driven 47 million miles in Autopilot mode.[297] After the first widely publicized fatal Autopilot crash, which occurred in Williston, Florida, in May 2016, Tesla acknowledged the death in a June blog post comparing the average fatality rate in the United States (at the time, one per 94 million miles) and worldwide (one per 60 million miles) with that of Tesla Autopilot (one per 130 million miles).[298] Tesla stated in July that "customers using Autopilot are statistically safer than those not using it at all", that "the [Autopilot] system provided a net safety benefit to society", and that "the 'better-than-human' threshold had been crossed and robustly validated internally".[299] Tesla's statistical approach was criticized for comparing two different datasets: while Autopilot is limited to highway driving, the overall death rate for the United States includes more varied driving conditions. In addition, Tesla's vehicles were larger and more expensive than most vehicles on the road, making them generally safer in a crash.[300] Other factors that could have affected the data include weather conditions and Tesla owner demographics.[301]

Fortune criticized Tesla's sale of US$2 billion in stock, noting that the sale occurred less than two weeks after Tesla "immediately" reported the fatal early-May crash to NHTSA, but before Tesla posted its public acknowledgement of the crash in late June; the article stated that "Tesla and Musk did not disclose the very material fact that a man had died while using an auto-pilot technology that Tesla had marketed vigorously as safe and important to its customers." Musk responded to the article with a statistical argument in an email to the reporter: "Indeed, if anyone bothered to do the math (obviously, you did not) they would realize that of the over 1M auto deaths per year worldwide, approximately half a million people would have been saved if the Tesla autopilot was universally available. Please, take 5 mins and do the bloody math before you write an article that misleads the public."[302]

Following the Williston crash, NHTSA released a preliminary report in January 2017 stating "the Tesla vehicles' crash rate dropped by almost 40 percent after Autosteer installation."[303][304]:10 NHTSA did not release the data until November 2018. A private company, Quality Control Systems, released a report in February 2019 analyzing the same data, stating the NHTSA conclusion was "not well-founded".[305] Part of the data verification included scrutiny of the 43,781 vehicles NHTSA claimed had Autosteer installed; of those, only 5,714 had both an exact odometer reading at the time Autosteer was installed and airbag deployment data. Collectively, the data for those 5,714 vehicles showed 32 airbag deployments in the 42,001,217 mi (67,594,407 km) traveled before installation, and 64 deployments in the 52,842,182 mi (85,041,249 km) traveled after.[306] The crash rate, as measured by airbag deployments per million miles of travel, therefore actually increased from 0.76 to 1.21 after the installation of Autosteer,[307]:9 an increase of 59%.[306]
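The rate figures in the Quality Control Systems analysis can be reproduced directly from the reported deployment counts and mileages:

```python
# Reproducing the Quality Control Systems figures: airbag deployments
# per million miles, before and after Autosteer installation.
def deployments_per_million_miles(deployments: int, miles: float) -> float:
    return deployments / (miles / 1_000_000)

before = deployments_per_million_miles(32, 42_001_217)  # pre-Autosteer
after = deployments_per_million_miles(64, 52_842_182)   # post-Autosteer
increase_pct = (after / before - 1) * 100

print(f"before: {before:.2f}/M mi, after: {after:.2f}/M mi, "
      f"increase: {increase_pct:.0f}%")
# → before: 0.76/M mi, after: 1.21/M mi, increase: 59%
```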

Tesla accident rates according to its self-published report[2] (reported as millions of miles driven per accident, so higher numbers imply a lower rate of accidents)

| Year | Quarter | Autopilot engaged | No Autopilot |
|---|---|---|---|
| 2018 | Q3 | 3.35 | 1.92 |
| 2018 | Q4 | 2.84 | 1.23 |
| 2019 | Q1 | 2.88 | 1.24 |
| 2019 | Q2 | 3.24 | 1.40 |
| 2019 | Q3 | 3.85 | 1.56 |
| 2019 | Q4 | 3.11 | 1.48 |
| 2020 | Q1 | 4.86 | 1.45 |
| 2020 | Q2 | 5.08 | 1.63 |
| 2020 | Q3 | 5.09 | 1.78 |
| 2020 | Q4 | 3.76 | 1.26 |
| 2021 | Q1 | 4.64 | 0.98 |
| 2021 | Q2 | 4.94 | 1.36 |
| 2021 | Q3 | 5.54 | 1.58 |
| 2021 | Q4 | 4.35 | 1.52 |
| 2022 | Q1 | 6.57 | 1.21 |
| 2022 | Q2 | 5.10 | 1.54 |
| 2022 | Q3 | 6.26 | 1.71 |
| 2022 | Q4 | 4.85 | 1.40 |

Starting in 2018, Tesla began publishing safety statistics on a quarterly basis,[308] using the values to demonstrate decreased accident rates while using Autopilot.[309] The data have been difficult to interpret, as the actual crash counts, baseline mileage, and basic definition of a crash are not available.[310]:3 Because of this lack of accurate reporting, Green Car Reports noted that "While these updated numbers for Autopilot are encouraging, it's clear that Tesla's claims of its vastly superior safety—at least in terms of fatal accidents—are still vapor. It's way too soon to come to any firm conclusions about Autopilot safety."[309]

In February 2020, Andrej Karpathy, Tesla's head of AI and computer vision, stated that Tesla cars had driven 3 billion miles on Autopilot, of which 1 billion were driven using Navigate on Autopilot; Tesla cars had performed 200,000 automated lane changes; and 1.2 million Smart Summon sessions had been initiated.[311] He also stated that Tesla cars were avoiding pedestrian accidents at a rate of tens to hundreds per day.[312] The company abruptly stopped publishing safety statistics after 2021, leading to speculation that after NHTSA issued Standing General Order 2021-01 in June 2021, requiring manufacturers to report detailed crash statistics, the number of reported crashes was growing "far faster than Tesla's sales growth".[308] Tesla resumed publishing statistics in January 2023.[313] The first comparable safety statistics using Full Self-Driving were released in March 2023; Tesla stated that vehicles operating under FSD experienced a crash with airbag deployment approximately every 3.2 million miles, compared with police-reported crashes involving airbag deployment, which occur approximately every 0.6 million miles.[314]
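Put on a common footing, the two figures Tesla cited imply roughly a five-fold difference, although the dataset-mismatch caveats discussed above apply to this comparison as well:

```python
# Converting "miles per airbag-deployment crash" into crashes per
# million miles, to compare Tesla's March 2023 FSD figure with the
# police-reported baseline it was set against.
fsd_miles_per_crash = 3_200_000      # Tesla-reported, FSD engaged
baseline_miles_per_crash = 600_000   # police-reported, all vehicles

fsd_rate = 1_000_000 / fsd_miles_per_crash  # crashes per million miles
baseline_rate = 1_000_000 / baseline_miles_per_crash

print(f"FSD: {fsd_rate:.2f}, baseline: {baseline_rate:.2f}, "
      f"ratio: {baseline_rate / fsd_rate:.1f}x")
# → FSD: 0.31, baseline: 1.67, ratio: 5.3x
```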

Additionally, a statistical analysis first published as a preprint in 2021[310] and in final form in 2023[315] criticized the self-reported Tesla crash rate data for failing to account for vehicle owner demographics and the types of roads on which Autopilot was operated.[310]:3 While the government baseline crash rate drew from vehicles of all ages operating on highways, rural roads, and city streets alike, Tesla vehicles are relatively new and Autopilot is limited to operation on freeways and highways.[308] When adjusted for driver age and road type, Autopilot crash rates were found to be nearly the same as using "active safety features" only.[310]:Fig.4

An MIT study published in September 2021 found that Autopilot is not as safe as Tesla claims and that its use leads to drivers becoming inattentive.[316][317]

General concerns

The National Transportation Safety Board (NTSB) criticized Tesla's lack of system safeguards in a fatal 2018 Autopilot crash in California,[318] and for failing to foresee and prevent "predictable abuse" of Autopilot.[319][320] Following this collective criticism amid increased regulatory scrutiny of ADAS systems, especially Tesla Autopilot,[321] in June 2021, the NHTSA announced an order requiring automakers to report crashes involving vehicles equipped with ADAS features in the United States.[322]

Driver monitoring

Musk stated in October 2015 that "we're advising drivers to keep their hands on the wheel [when Autopilot is engaged]". Despite this, multiple videos were posted to YouTube at the time showing drivers using Autopilot hands-free, including Musk's ex-wife Talulah Riley.[323] Other drivers have been found sleeping at the wheel, driving under the influence of alcohol, and performing other inappropriate tasks with Autopilot engaged.[324][325] As initially released, the Autopilot system used a torque sensor to detect whether the driver's hands were on the steering wheel[326][327] and gave audible and visual warnings for the driver to take the wheel when no torque was detected, but several owners confirmed they could drive for several minutes hands-free before receiving a warning.[323] At least one device designed to defeat the torque sensor was ordered by NHTSA to discontinue sales in 2018.[328] Initially, Tesla decided against adding more advanced driver monitoring options to ensure drivers remained engaged with the driving task.[329]

In late May 2021, a new software version enabled the driver-facing camera inside new Model 3 and Model Y cars (the first vehicles in the switch to Tesla Vision) to monitor driver attentiveness while using Autopilot.[330] Model S and Model X cars made before 2021 lack an interior camera and therefore cannot offer such monitoring, although the refreshed versions are expected to have one.[331] A review of the in-cabin camera-based monitoring system by Consumer Reports found that drivers could still use Autopilot while looking away from the road or using their phones, and could enable FSD beta software "with the camera covered."[332]

In 2022, Musk agreed to a proposal on Twitter that "users with more than 10,000 miles on FSD Beta should be given the option to turn off the steering wheel nag",[333] saying the system would be updated in January 2023.[334][335] In April, Musk confirmed the nag was being reduced gradually.[336] That June, a hacker discovered that FSD Beta had an undocumented mode which disables all driver monitoring.[337] NHTSA wrote a letter to Tesla under the authority of EA 22-002 on July 26, noting the new mode "could lead to greater driver inattention and failure of the driver to properly supervise Autopilot". Attached to the letter was a Special Order requesting the dates of the software updates containing the hidden mode, the detailed steps or conditions required to unlock it, and Tesla's reasons for issuing the updates.[338] Tesla responded by August 25;[339] the response was deemed confidential and no public version is available.[340]

A "nag elimination" module sold as an aftermarket accessory periodically adjusts the audio volume through the steering wheel controls, which registers as steering wheel input and allows drivers to take their hands off the wheel. Anecdotal evidence suggests the module is effective only for Tesla vehicles sold in the United States and Canada, leading to speculation that the driver monitoring software differs by region.[341]

Detecting stationary vehicles at speed

Autopilot may not detect stationary vehicles; the manual states: "Traffic-Aware Cruise Control cannot detect all objects and may not brake/decelerate for stationary vehicles, especially in situations when you are driving over 50 mph (80 km/h) and a vehicle you are following moves out of your driving path and a stationary vehicle or object is in front of you instead."[342] This has led to numerous crashes with stopped emergency vehicles.[343][344][345][346]

Dangerous and unexpected behavior

In a 2019 Bloomberg survey, hundreds of Tesla owners reported dangerous behaviors with Autopilot, such as phantom braking, veering out of lane, or failing to stop for road hazards.[347] Autopilot users have also reported the software crashing and turning off suddenly, collisions with off ramp barriers, radar failures, unexpected swerving, tailgating, and uneven speed changes.[348]

Ars Technica notes that the brake system tends to initiate later than some drivers expect.[349] One driver claimed that Autopilot failed to brake, resulting in collisions, but Tesla pointed out that the driver had deactivated the car's cruise control prior to the crash.[350] The automatic emergency braking (AEB) system has also initiated sooner than some drivers expect; a software error causing false activation of the AEB system led to a recall in 2021.[351]

Ars Technica also noted that while lane changes may be semi-automatic (with Autopilot on, the car may change lanes without driver input if it detects slow-moving cars or needs to stay on route), the driver must show the car that they are paying attention by touching the steering wheel before the car makes the change.[352] In 2019, Consumer Reports noted that Tesla's automatic lane-change feature is "far less competent than a human driver".[353]

Data collection

Most late-model vehicles, including Teslas, are equipped with event data recorders that collect approximately five seconds of data to aid crash investigations, including speed, acceleration, brake use, steering input, and driver-assistance feature status; Tesla vehicles permanently record this data as "gateway log" files onto a microSD card in the Media Control Unit, at a rate of approximately five samples per second (5 Hz). Gateway log files are uploaded to Tesla when the vehicle connects to a Wi-Fi network.[354]
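As a rough illustration (not Tesla's actual implementation, and with hypothetical field names), a recorder that always holds the most recent five seconds of 5 Hz samples can be modeled as a fixed-size ring buffer:

```python
from collections import deque

# Illustrative sketch only: five seconds of data sampled at 5 Hz means
# 25 samples; the deque discards the oldest sample as each new one
# arrives, so a snapshot always covers the last five seconds.
RATE_HZ = 5
WINDOW_SECONDS = 5

buffer = deque(maxlen=RATE_HZ * WINDOW_SECONDS)  # 25 slots

for tick in range(100):  # simulate 20 seconds of driving data
    buffer.append({"t": tick / RATE_HZ, "speed_mph": 55, "brake": False})

snapshot = list(buffer)  # what would be persisted on a trigger event
print(len(snapshot), snapshot[0]["t"], snapshot[-1]["t"])
# → 25 15.0 19.8
```

The `maxlen` argument makes the deque drop the oldest entry automatically, which is why the saved snapshot spans only the final five seconds of the simulated trip.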

The Autopilot computer stores images and video (images only for model year 2015 and earlier vehicles) along with driving data similar to that captured in gateway log files, at a higher temporal resolution (up to 50 Hz), and uploads these to Tesla periodically as well. These "snapshots" are deleted locally after being uploaded. Tesla has been silent about its data retention policies.[354] Snapshot data are always captured when the vehicle crashes (defined as deploying the airbags) and uploaded via a 4G cellular network.[354] Snapshots are sometimes also captured for other events defined by Tesla. Even when Autopilot is not actively providing steering, throttle, and brake controls, 2016 and later model year Teslas operate Autopilot in "Shadow Mode".[355] When the control inputs generated by the shadow mode Autopilot do not match those of the human driver, the vehicle may record a snapshot to assist in training the system, after which the data may be reviewed by the Autopilot team.[356] As explained by Karpathy, Tesla can deploy additional software "detectors" triggered by specific situations identified by snapshot data, which then upload camera and other data to Tesla when similar situations are detected. These data are used to revise the existing detectors.[356]

For Teslas built after mid-2017, the Autopilot computer also records "trail" data, including the car's route as determined by GPS "breadcrumbs" for the entire trip. A trip starts when the vehicle shifts from Park to Drive, and ends when shifted back to Park. Trail data also includes vehicle speed, road type used, and Autopilot status. Like the snapshots, these trail data are deleted locally after being uploaded to Tesla.[354] The trail data are meant to be anonymized by stripping the vehicle identification number (VIN) and assigning a temporary ID, but the same temporary ID can be assigned to a single vehicle for several weeks.[357]
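A pseudonymization scheme with the weakness described above can be sketched as follows; the hashing scheme, rotation period, and function names are assumptions for illustration, not Tesla's actual method.

```python
import hashlib
from datetime import date

def temporary_id(vin: str, trip_date: date, rotation_weeks: int = 4) -> str:
    """Replace the VIN with a pseudonymous ID that rotates every few weeks.
    Because the ID is stable within a rotation window, all trips in that
    window remain linkable to one another -- mirroring the reported
    weakness of a temporary ID persisting for several weeks."""
    window = trip_date.toordinal() // (7 * rotation_weeks)  # hypothetical scheme
    return hashlib.sha256(f"{vin}:{window}".encode()).hexdigest()[:16]
```

Stripping the VIN prevents trivial identification, but a stable per-window ID still allows trajectory data within the window to be correlated, which is why rotation frequency matters for anonymity.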

Under Tesla's privacy policies,[358] the company does not sell customer and vehicle data, but may share the data with government entities.[357]

Regulatory and legal actions

Regulation

In 2015, a spokesman for the NHTSA said that "any autonomous vehicle would need to meet applicable federal motor vehicle safety standards" and the NHTSA "will have the appropriate policies and regulations in place to ensure the safety of this type of vehicles".[359] On February 1, 2021, Robert Sumwalt, chair of the NTSB, wrote a letter to NHTSA regarding that agency's "Framework for Automated Driving System Safety", which had been published for comment in December 2020.[360][361][362] In the letter, Sumwalt recommended that NHTSA include user monitoring as part of the safety framework, and reiterated that "Tesla's lack of appropriate safeguards and NHTSA's inaction" on the NTSB's recommendation "that NHTSA develop a method to verify that manufacturers of vehicles equipped with Level 2 [automation] incorporate system safeguards that limit the use of automated vehicle control systems to the conditions for which they were designed" were contributing causes of a fatal crash in Delray Beach, Florida in 2019.[361]:7

NHTSA announced Standing General Order (SGO) 2021-01 on June 29, 2021. Under this General Order, manufacturers and operators of vehicles equipped with advanced driver assistance systems (ADAS, SAE J3016 Level 2) or automated driving systems (ADS, SAE Level 3 or higher) are required to report crashes.[322] An amended order was issued and became effective on August 12, 2021.[363] Reporting is limited to crashes in which the ADAS or ADS was engaged within 30 seconds prior to the crash and which involve an injury requiring hospitalization, a fatality, a vehicle towed from the scene, an air bag deployment, or a "vulnerable road user" (e.g., pedestrian or bicyclist); such crashes must be reported to NHTSA within one calendar day, and an updated report is required within 10 calendar days.[364]:13–14 On August 16, 2021, after reports of 17 injuries and one death in crashes involving emergency vehicles, U.S. auto safety regulators opened a formal safety probe (PE 21-020) into Tesla's driver assistance system Autopilot.[365]
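The one-day reporting threshold amounts to a two-part test: an engagement window plus at least one severity criterion. A minimal sketch, simplifying the order's actual legal definitions:

```python
def must_report_within_one_day(adas_engaged_within_30s: bool,
                               hospitalization: bool,
                               fatality: bool,
                               towed_from_scene: bool,
                               airbag_deployed: bool,
                               vulnerable_road_user: bool) -> bool:
    """Sketch of the SGO 2021-01 Level 2 reporting trigger as summarized
    in the text. Simplified for illustration; the order's real criteria
    carry precise legal definitions not captured here."""
    if not adas_engaged_within_30s:
        # ADAS/ADS not engaged within 30 s of the crash: no SGO duty.
        return False
    # At least one severity criterion must also be met.
    return any([hospitalization, fatality, towed_from_scene,
                airbag_deployed, vulnerable_road_user])
```

Note that both conditions are necessary: an engaged-ADAS fender-bender with no injury, tow, airbag, or vulnerable road user falls outside the one-day reporting duty.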

Initial data from SGO 2021-01 were released in June 2022; 12 manufacturers reported 392 crashes involving ADAS (Level 2) between July 2021 and May 15, 2022. Of those 392 crashes, 273 were Tesla vehicles, out of approximately 830,000 Tesla vehicles equipped with ADAS. Honda had the next highest total, with 90 crashes reported out of approximately 6 million Honda vehicles equipped with ADAS.[366] The NHTSA said Tesla's numbers may appear high because it has real-time crash reports, whereas other automakers do not, so their crash reports may be delivered more slowly or not reported at all.[367] Collectively, five people were killed and six more were seriously hurt in the 392 ADAS crashes that were reported.[366] According to updated data, by June 2023, Tesla drivers using Autopilot had been involved in 736 crashes and 17 fatalities cumulatively since 2019; 11 fatalities had occurred since May 2022.[368][369]

SGO 2021-01 also applied to manufacturers of vehicles equipped with ADS (Levels 3 through 5); 25 ADS manufacturers reported 130 crashes in total from the initial data release in June 2022, led by Waymo (62), Transdev Alternative Services (34), and Cruise LLC (23). In most cases, these crashes involved the ADS vehicle being struck from the rear; only one serious injury was reported, and 108 of the 130 crashes resulted in no injury.[366]

Court cases

Tesla's Autopilot was the subject of a class action suit brought in 2017 that claimed the second-generation Enhanced Autopilot system was "dangerously defective".[370] The suit was settled in 2018; owners who in 2016 and 2017 paid $5,000 (equivalent to $5,327 in 2019) to equip their cars with the updated Autopilot software were compensated between $20 and $280 for the delay in implementing Autopilot 2.0.[371]

In 2020, a German court ruled in a lawsuit brought in 2019 by The Center for Combating Unfair Competition [de] that Tesla had violated advertising regulations with its marketing of Autopilot.[372][373][374] Upon appeal, that decision was reversed in 2021 by a higher court under the condition that Tesla clarify the capabilities of Autopilot on its website.[375][376]

In July 2022, a German court awarded a plaintiff most of the €112,000 that she had paid for a Model X, based in part on a technical report that demonstrated Autopilot did not reliably recognize obstacles and would unnecessarily activate its brakes, which could cause a "massive hazard" in cities; Tesla's lawyers argued unsuccessfully that Autopilot was not designed for city traffic.[377]

In September 2022, a class action lawsuit was filed in the U.S. District Court (Northern California) alleging that "for years, Tesla has deceptively and misleadingly marketed its ADAS technology as autonomous driving technology under various names, including 'Autopilot,' 'Enhanced Autopilot,' and 'Full Self-Driving Capability'", adding that Tesla represented "that it was perpetually on the cusp of perfecting that technology and finally fulfilling its promise of producing a fully self-driving car", while "Tesla knew for years its statements regarding its ADAS technology were deceptive and misleading, but the company made them anyway."[378][379] Tesla filed a motion in November 2022 to dismiss the case, defending the company's actions as "mere failure to realize a long-term, aspirational goal [of a fully self-driving car] [and] not fraud", basing the motion on the private arbitration clause in the purchasing contract signed by each buyer.[380][381]

A second class action lawsuit was filed in the same court by Tesla shareholders in late February 2023.[382] The complaint alleges the defendants "had significantly overstated the efficacy, viability, and safety of [Tesla's] Autopilot and FSD technologies" and those same systems "created a serious risk of accident and injury", which "subjected Tesla to an increased risk of regulatory and governmental scrutiny and enforcement action", linking multiple specific accidents to documented decreases in share prices.[383]

In April 2023, Tesla was found not liable in a lawsuit filed in 2020 by a driver who sued for damages after she claimed the Autopilot system guided her Tesla Model S into a curb, resulting in an airbag deployment and facial injuries.[384] Jurors explained in post-trial interviews that "Autopilot never confessed to be self pilot. It's not a self-driving car ... [Tesla] were adamant about a driver needing to always be aware."[385]

Additional lawsuits have been filed by the estates of two drivers killed in 2019 while using Autopilot, one in California and one in Florida. In the California case, which had not previously been reported, Tesla has argued the driver had consumed alcohol and it is not clear that Autopilot was engaged;[386] the plaintiff's lawyers alleged that a known defect in the Autopilot system had caused the vehicle to veer off a highway at 65 mph (105 km/h) and strike a palm tree.[387] Tesla prevailed in that case, with the jury voting 9–3 in October 2023 that there was no manufacturing defect.[388] For the Florida case, the judge rejected Tesla's motion to dismiss, concluding that he could not "imagine how some ordinary consumers would not have some belief that the Tesla vehicles were capable of driving themselves hands free",[389] citing "reasonable evidence" demonstrating that Tesla had "engaged in a marketing strategy that painted the products as autonomous" and that Musk's statements "had a significant effect on the belief about the capabilities of the products".[67]

False or misleading advertising

I don't think that something should be called, for example, an Autopilot, when the fine print says you need to have your hands on the wheel and eyes on the road at all times.

 — Pete Buttigieg, U.S. Secretary of Transportation, Interview with Associated Press, May 2023[390]

The Center for Auto Safety and Consumer Watchdog wrote to the Federal Trade Commission (FTC) in 2018, asking them to open an investigation into the marketing of Autopilot. The letter stated "the marketing and advertising practices of Tesla, combined with Elon Musk's public statements, have made it reasonable for Tesla owners to believe, and act on that belief, that a Tesla with Autopilot is an autonomous vehicle capable of 'self-driving'".[391] The groups renewed their appeal to the FTC and added the California DMV in 2019,[392] noting that "Tesla continues to be the only automaker to describe its Level 2 vehicles as 'self-driving' and the name of its driver assistance suite of features, Autopilot, connotes full autonomy."[393] U.S. Senators Ed Markey (D-MA) and Richard Blumenthal (D-CT) echoed these concerns to the FTC in 2021.[394]

A 2019 IIHS study showed that the name "Autopilot" causes more drivers to misperceive behaviors such as texting or taking a nap to be safe, versus similar level 2 driver-assistance systems from other car companies.[395] In 2020, UK safety experts called Tesla's Autopilot "especially misleading".[396]

Tesla's use of the terms Autopilot and FSD were criticized in a May 2020 report published on ScienceDirect titled "Autonowashing: The Greenwashing of Vehicle Automation".[397]

In 2021, following more than a dozen Autopilot crashes (some fatal), the U.S. Department of Justice (DOJ) started a criminal investigation to determine if Tesla misled consumers, investors, and regulators about Autopilot.[398] Tesla confirmed the DOJ had requested Autopilot and FSD-related documents in its 10-K filing for 2022.[399] The Securities and Exchange Commission also opened an independent civil probe into statements made by Tesla and its executives about Autopilot.[400][401]

In July 2022, the California DMV filed two complaints with the state Office of Administrative Hearings that alleged Tesla "made or disseminated statements that are untrue or misleading, and not based on facts" relating to both "Autopilot and Full Self-Driving technologies".[402][403][404] In August 2022, Tesla requested a hearing to present its defense.[405]

In September 2022, California governor Gavin Newsom signed state bill SB 1398,[406] which took effect January 1, 2023 and prohibits any manufacturer or dealer of cars with partial driver automation features from using misleading language to advertise their vehicles as autonomous, such as by naming the system "Full Self-Driving".[407][408]

NHTSA investigations

According to a document released in June 2021, the NHTSA has initiated at least 30 investigations into Tesla crashes that were believed to involve the use of Autopilot, with some involving fatalities.[409][410]

In August 2021, the NHTSA Office of Defects Investigation (ODI) opened a preliminary evaluation (PE) designated PE 21-020 and released a list of eleven crashes in which Tesla vehicles struck stationary emergency vehicles; in each instance, NHTSA confirmed that Autopilot or Traffic Aware Cruise Control was active during the approach to the crash. Of the eleven crashes, seven resulted in seventeen total injuries, and one resulted in one fatality. The scope of the planned evaluation of the Autopilot system specifically addressed the systems used to monitor and enforce driver engagement.[411] In September 2021, NHTSA added a twelfth accident, which occurred in Orlando in August 2021, to the investigation list.[412]

NHTSA sent a request for information relating to PE 21-020 to Tesla's director of field quality in August 2021. The response was due by October 22, 2021.[413][414] In September 2021, NHTSA sent a request for information to Tesla and other automobile manufacturers for comparative ADAS data.[415][416][417] After Tesla deployed its Emergency Light Detection Update in September 2021, NHTSA sent a follow-up letter to Tesla in October 2021 asking for "a chronology of events, internal investigations, and studies" that led to the deployment of the update,[418] as it potentially addressed a safety defect, which requires a formal recall.[419]

List of crashes in NHTSA ODI Preliminary Evaluation (PE) 21-020[411]
Date City/County State Notes/Refs
2018 Culver City California Tesla struck a stationary fire truck on southbound I-405.[420]
2018 Laguna Beach California Tesla struck a stationary patrol vehicle on Laguna Canyon Road at 11:07 a.m.[421] Later removed from investigation as patrol vehicle was parked out of right-of-way and emergency lights were not active.[422]
2019 Norwalk Connecticut Tesla struck a stationary police cruiser with its emergency lights flashing on I-95 near exit 15. Driver stated he had been checking on his dog in the back seat.[423]
2019 Cloverdale Indiana Tesla struck a stationary fire truck on I-70 near mile marker 38; passenger in Tesla was killed.[424]
2020 West Bridgewater Massachusetts Tesla struck a stationary patrol vehicle at 10 p.m. on Route 24. Driver stated that Autopilot was engaged.[425]
2020 Cochise County Arizona Tesla struck a stationary patrol vehicle at 3 a.m. on I-10 near Benson, Arizona.[426]
2020 Charlotte North Carolina Tesla struck a stationary patrol vehicle on US-64W near the border of Nash and Franklin counties. Driver was watching a movie.[427]
2021 Montgomery County Texas Tesla struck a stationary police cruiser at 1:15 a.m. on the Eastex Freeway near East River Road.[428]
2021 Lansing Michigan Tesla struck a stationary patrol car at 1:10 a.m. on I-96 in Eaton County.[429]
2021 Miami Florida Tesla struck a stationary Florida Department of Transportation road ranger truck at 5:30 a.m. on I-95 near 103rd St.[430]
2021 San Diego California Tesla struck a stationary patrol car at 1:45 a.m. on State Route 56.[431]

In February 2022, NHTSA ODI opened a second preliminary evaluation (PE 22-002) for "phantom braking" in 2021–2022 Tesla Model 3 and Model Y vehicles.[432] PE 22-002 was correlated to the removal of radar hardware from those vehicles in May 2021; at the time PE 22-002 was opened, the NHTSA was not aware of any crashes or injuries resulting from the complaints.[433] According to some complaints, while using Autopilot, "rapid deceleration can occur without warning, at random, and often repeatedly in a single drive cycle."[432] The Washington Post also published an article detailing the surge in complaints to NHTSA over false activations of Tesla's automatic emergency braking system.[434] By May 2022, NHTSA had received 758 reports of unexpected braking when Autopilot was in use and requested that Tesla respond to questions by June 20, 2022.[435][436]

Also in June 2022, NHTSA ODI upgraded PE 21-020 to an engineering analysis (EA) and designated it as EA 22-002, covering an estimated 830,000 Tesla vehicles sold between 2014 and 2022.[422] Data for PE 21-020 had been supplemented by prior information requests to Tesla (April 19, 2021) and Standing General Order (SGO) 2021-01,[437] issued June 29, 2021[438] and amended on August 5, 2021,[422] which required manufacturers of advanced driving assistance systems to promptly report crashes to NHTSA.[439]

List of crashes added in NHTSA ODI Engineering Analysis (EA) 22-002[422]
Date City/County State Notes/Refs
2020 Houston Texas [440]:Report ID 13781-2451
2021 Mount Pleasant South Carolina Involved crash attenuator truck[422]
2021 Belmont California Involved first responder[422]
2021 Orlando Florida Tesla struck a stationary patrol car at 5 a.m. on I-4.[441][440]:Report ID 13781-1140
2021 Petaluma California [440]:Report ID 13781-1357
2022 Desert Center California [440]:Report ID 13781-2201

The investigation was expanded to an engineering analysis after NHTSA reviewed data from 191 crashes involving the use of Autopilot or related ADAS Level 2 technologies (Traffic-Aware Cruise Control, Autosteer, Navigate on Autopilot, or Auto Lane Change).[442] 85 were removed because other drivers were involved or there was insufficient data.[442] It was found that in approximately half of the remaining 106 crashes, the driver was not sufficiently responsive to the driving task, and approximately one-quarter of the 106 resulted from operating Autopilot outside of limited-access highways, or when traction and weather conditions could interfere.[442] Detailed telemetry existed for 43 of the 106 crashes; of these, data from 37 indicated the driver's hands were on the steering wheel in the last second prior to collision.[422]

The Laguna Beach incident identified initially in PE 21-020 was removed from EA 22-002 as it was found "the struck vehicle was parked out of traffic with no lights illuminated."[422] Six incidents were added, making a total of sixteen accidents in which a Tesla struck stationary emergency vehicle(s), including the August 2021 incident in Orlando.[422] In these 16 incidents, NHTSA found that a majority resulted in forward collision warnings and approximately half resulted in automatic emergency braking.[422] On average, when video was available, drivers would have been able to see a potential impact eight seconds prior to collision, yet Autopilot would abort control "less than one second prior to the first impact",[443] which may not have been enough time for the driver to assume full control.[444] In addition, the data suggest that Tesla's requirement for Autopilot drivers to keep their hands on the wheel at all times may not be sufficient to ensure the driver is paying attention to the driving task.[445][442]

NHTSA sent a second letter for EA 22-002 to Tesla in August 2022, which included requests for a description of the role of the driver-facing camera, identification of all lawsuits or arbitration resulting from Autopilot use, including complete transcripts of depositions, and "the engineering and safety explanation and evidence for design decisions regarding enforcement of driver engagement / attentiveness".[446] Tesla submitted a response in September. A follow-up letter was submitted in July 2023, asking for current data and updates to the prior response.[447] Starting in October 2023, NHTSA conveyed its preliminary conclusions to Tesla during several meetings, followed by Tesla conducting a voluntary recall on December 5, 2023, to provide an over-the-air software update to "incorporate additional controls and alerts ... to further encourage the driver to adhere to their continuous driving responsibility whenever Autosteer is engaged ... [and providing] additional checks upon engaging Autosteer and while using the feature outside controlled access highways", while not concurring with NHTSA's analysis.[448]

Recalls

Tesla issued an "Emergency Light Detection Update" for Autopilot in September 2021 which was intended to detect "flashing emergency vehicle lights in low light conditions and then [respond] to said detection with driver alerts and changes to the vehicle speed while Autopilot is engaged", after NHTSA had opened PE 21-020 the previous month. After the update was issued, NHTSA sent a letter to Tesla asking why the update had not been performed under the recall process, as "any manufacturer issuing an over-the-air update that mitigates a defect that poses an unreasonable risk to motor vehicle safety is required to timely file an accompanying recall notice to NHTSA."[449][450]

Tesla issued a recall of 11,728 vehicles in October 2021 due to a communication error that could lead to false forward-collision warnings or unexpected activations of the automatic emergency braking system. The error had been introduced by the Full Self-Driving beta software version 10.3 over-the-air firmware update, and was reversed by another over-the-air update the same month.[451] The recalled vehicles were reverted to 10.2, then updated to 10.3.1.[351]

FSD 10.3 also was released with different driving profiles to control vehicle behavior, branded 'Chill', 'Average', and 'Assertive'; the 'Assertive' profile attracted negative coverage in January 2022 for advertising that it "may perform rolling stops" (passing through stop signs at up to 5.6 mph), change lanes frequently, and decrease the following distance.[452][453] On February 1, after the NHTSA advised Tesla that failing to stop for a stop sign can increase the risk of a crash and threatened "immediate action" for "intentional design choices that are unsafe",[454] Tesla recalled nearly 54,000 vehicles to disable the rolling stop behavior,[455] removing the feature with an over-the-air software update.[456]

On February 16, 2023, Tesla issued a recall notice for all vehicles equipped with the Full Self-Driving beta software, including 2016–23 Model S and X; 2017–23 Model 3; and 2020–23 Model Y, covering 362,758 vehicles in total.[78] NHTSA identified four specific traffic situations in a letter sent to Tesla on January 25,[457][458] and Tesla voluntarily chose to pursue a recall to address those situations,[78] which include vehicles operating under FSD Beta performing the following inappropriate actions:[457]

  1. Traveling or turning through an intersection on a "stale yellow traffic light"
  2. Not stopping perceptibly at stop signs
  3. Not adjusting speed appropriately in response to posted speed limit signs
  4. Continuing straight while traveling in turn-only lanes

The recall, which covers all versions of FSD Beta,[457] was performed by pushing out a software update in March 2023.[79][78]

On December 12, 2023, following a 2-year-long investigation by the NHTSA,[459] Tesla issued a wider recall on all vehicles equipped with any version of Autosteer, including 2012–2023 Model S; 2016–2023 Model X; 2017–2023 Model 3; and 2020–2023 Model Y, covering 2,031,220 vehicles in total.[448] The NHTSA concluded that Autosteer's controls were not sufficient to prevent misuse and did not ensure that the drivers maintained "continuous and sustained responsibility for vehicle operation".[448] An over-the-air software update was deployed to a subset of vehicles upon the release of the recall notice, with plans to roll it out to the remaining fleet at a later date, which made visual alerts more prominent within the user interface, simplified the process to turn Autosteer on and off, added additional checks when Autosteer is used near intersections or on streets that are not limited-access roads, and disabled Autosteer if the driver repeatedly failed to demonstrate driving responsibility while using the feature.[448]

Notable crashes

Fatal crashes

(As of November 2023), there have been forty-two verified fatalities involving Tesla's Autopilot function, though additional fatal incidents in which Autopilot use is suspected remain unverified.[460] Many of these incidents have received varying degrees of attention from news publications.

Handan, Hebei, China (January 20, 2016)

Tesla accident in Handan, Hebei, China

On January 20, 2016, Gao Yaning, the driver of a Tesla Model S in Handan, Hebei, China, was killed when his car crashed into a stationary truck.[461] The Tesla was following a car in the far left lane of a multi-lane highway; the car in front moved to the right lane to avoid a truck stopped on the left shoulder, and the Tesla, which the driver's father believes was in Autopilot mode, did not slow before colliding with the stopped truck.[462] According to footage captured by a dashboard camera, the stationary street sweeper on the left side of the expressway partially extended into the far left lane, and the driver did not appear to respond to the unexpected obstacle.[463]

Initially, Yaning was held responsible for the collision by local traffic police, and in July 2016 his family filed a lawsuit against the Tesla dealer who sold the car.[464][465] The family's lawyer stated the suit was intended "to let the public know that self-driving technology has some defects. We are hoping Tesla when marketing its products, will be more cautious. Do not just use self-driving as a selling point for young people."[462] Tesla released a statement which said they "have no way of knowing whether or not Autopilot was engaged at the time of the crash" since the car's telemetry could not be retrieved remotely due to damage caused by the crash.[462] By 2018, the lawsuit had stalled: the telemetry had been recorded locally to an SD card that had not been turned over to Tesla, which provided a decoding key to a third party for independent review. Tesla stated that "while the third-party appraisal is not yet complete, we have no reason to believe that Autopilot on this vehicle ever functioned other than as designed."[466] Chinese media later reported that the family sent the data from the card to Tesla, which admitted that Autopilot was engaged two minutes before the crash.[467] Tesla has since removed the term "Autopilot" from its Chinese website.[468]

Williston, Florida, USA (May 7, 2016)

Tesla accident in Williston, Florida
The Model S after it was recovered from the crash scene in Williston, Florida

On May 7, 2016, a Tesla driver was killed in a crash with an 18-wheel tractor-trailer in Williston, Florida. By late June 2016, the NHTSA had opened a formal investigation into the fatal crash, working with the Florida Highway Patrol. According to the NHTSA, preliminary reports indicate the crash occurred when the tractor-trailer made a left turn in front of the 2015 Tesla Model S at an intersection on a non-controlled-access highway and the car failed to apply the brakes. The car continued to travel after passing under the truck's trailer.[469][470][471] The Tesla was eastbound in the rightmost lane of US 27, and the westbound tractor-trailer was turning left at the intersection with NE 140th Court, approximately 1 mi (1.6 km) west of Williston; the posted speed limit is 65 mph (105 km/h).[472]

The diagnostic log of the Tesla indicated it was traveling at a speed of 74 mi/h (119 km/h) when it collided with and traveled under the trailer, which was not equipped with a side underrun protection system.[473]:12 A reconstruction of the accident estimated the driver would have had approximately 10.4 seconds to detect the truck and take evasive action.[474] The underride collision sheared off the Tesla's glasshouse, destroying everything above the beltline, and caused fatal injuries to the driver.[473]:6–7; 13 In the approximately nine seconds after colliding with the trailer, the Tesla traveled another 886.5 feet (270.2 m) and came to rest after colliding with two chain-link fences and a utility pole.[473]:7; 12
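The figures in this reconstruction can be cross-checked with simple unit conversions; the roughly nine-second post-impact interval is approximate in the source, so this is only a rough consistency check:

```python
# Rough consistency check of the reported Williston figures.
MPH_TO_FTS = 5280 / 3600            # mph -> feet per second

impact_speed = 74 * MPH_TO_FTS      # ~108.5 ft/s at the moment of impact
avg_post_impact = 886.5 / 9         # ~98.5 ft/s average over the final 886.5 ft

# The average post-impact speed being somewhat below the impact speed is
# consistent with the car coasting and decelerating through the fences
# before striking the utility pole.
print(f"{impact_speed:.1f} ft/s at impact, {avg_post_impact:.1f} ft/s average after")
```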

Dr. Deb Bruce, head of the NTSB investigation team, announces results to the NTSB on September 12, 2017.

The NHTSA's preliminary evaluation was opened to examine the design and performance of any automated driving systems in use at the time of the crash, which involves a population of an estimated 25,000 Model S cars.[475] On July 8, 2016, the NHTSA requested Tesla Inc. to hand over to the agency detailed information about the design, operation and testing of its Autopilot technology. The agency also requested details of all design changes and updates to Autopilot since its introduction, and Tesla's planned updates scheduled for the next four months.[476]

According to Tesla, "neither autopilot nor the driver noticed the white side of the tractor-trailer against a brightly lit sky, so the brake was not applied." The car attempted to drive full speed under the trailer, "with the bottom of the trailer impacting the windshield of the Model S". Tesla also stated that this was Tesla's first known Autopilot-related death in over 130 million miles (208 million km) driven by its customers while Autopilot was activated. According to Tesla, there is a fatality every 94 million miles (150 million km) among all types of vehicles in the U.S.[469][470][477] It is estimated that billions of miles would need to be traveled before Tesla Autopilot could claim to be safer than humans with statistical significance. Researchers say that Tesla and others need to release more data on the limitations and performance of automated driving systems if self-driving cars are to become safe and understood enough for mass-market use.[478][479]

The truck's driver told the Associated Press that he could hear a Harry Potter movie playing in the crashed car, and said the car was driving so quickly that "he went so fast through my trailer I didn't see him. [The film] was still playing when he died and snapped a telephone pole a quarter-mile down the road." According to the Florida Highway Patrol, they found in the wreckage an aftermarket portable DVD player. (It is not possible to watch videos on the Model S touchscreen display while the car is moving.[471][480]) A laptop computer was recovered during the post-crash examination of the wreck, along with an adjustable vehicle laptop mount attached to the front passenger's seat frame. The NHTSA concluded the laptop was probably mounted and the driver may have been distracted at the time of the crash.[473]:17–19; 21

Tesla's manufacture of cars equipped with Autopilot preceded NHTSA's issuance of its [Federal Automated Vehicles] Policy [dated September 2016], and that policy applies to SAE Levels 3–5 rather than Level 2 automated vehicles, but Tesla clearly understands the [operational design domain] concept and advised drivers to use the Autopilot systems only on limited-access roadways. Following the crash, Tesla modified its Autopilot firmware to add a preferred road usage constraint, which affects the timing of the hands-off driving alert. But despite these modifications, a Tesla driver can still operate Autopilot on any roads with adequate lane markings.

Collision Between a Car Operating With Automated Vehicle Control Systems and a Tractor-Semitrailer Truck Near Williston, Florida | May 7, 2016 | Accident Report NTSB/HAR-17/02 PB2017-102600[481]:33

In January 2017, the NHTSA Office of Defects Investigation (ODI) released a preliminary evaluation, finding that the driver in the crash had seven seconds to see the truck and identifying no defects in the Autopilot system; the ODI also found that the Tesla car crash rate dropped by 40 percent after Autosteer installation,[303][304] but later clarified that it did not assess the effectiveness of this technology, or whether it was engaged, in its crash rate comparison.[482] The NHTSA Special Crash Investigation team published its report in January 2018.[473] According to the report, for the drive leading up to the crash, the driver engaged Autopilot for 37 minutes and 26 seconds, and the system provided 13 "hands not detected" alerts, to which the driver responded after an average delay of 16 seconds.[473]:24 The report concluded "Regardless of the operational status of the Tesla's ADAS technologies, the driver was still responsible for maintaining ultimate control of the vehicle. All evidence and data gathered concluded that the driver neglected to maintain complete control of the Tesla leading up to the crash."[473]:25

In July 2016, the NTSB announced it had opened a formal investigation into the fatal accident while Autopilot was engaged. The NTSB is an investigative body that only has the power to make policy recommendations. An agency spokesman said, "It's worth taking a look and seeing what we can learn from that event, so that as that automation is more widely introduced we can do it in the safest way possible." The NTSB opens annually about 25 to 30 highway investigations.[483] In September 2017, the NTSB released its report, determining that "the probable cause of the Williston, Florida, crash was the truck driver's failure to yield the right of way to the car, combined with the car driver's inattention due to overreliance on vehicle automation, which resulted in the car driver's lack of reaction to the presence of the truck. Contributing to the car driver's overreliance on the vehicle automation was its operational design, which permitted his prolonged disengagement from the driving task and his use of the automation in ways inconsistent with guidance and warnings from the manufacturer."[484]

Mountain View, California, USA (March 23, 2018)

Tesla Model X accident in Mountain View, California

On March 23, 2018, a second U.S. Autopilot fatality occurred in Mountain View, California.[485] The crash occurred just before 9:30 a.m. Pacific Daylight Time on southbound US 101 at the carpool lane exit for southbound Highway 85, at a concrete barrier where the left-hand carpool lane offramp separates from 101. After the Model X crashed into the narrow concrete barrier, it was struck by two following vehicles, and then it caught fire.[486]

Both the NHTSA and NTSB began investigations into the March 2018 crash.[487] In April 2018, another Model S driver demonstrated that Autopilot appeared to be confused by the road surface markings at the same location. The gore ahead of the barrier is marked by diverging solid white lines (a vee-shape), and the Autosteer feature of the Model S appeared to mistakenly track the left-side white line instead of the right-side white line as the lane marking for the far left lane, which would have led the Model S into the same concrete barrier had the driver not taken control.[488] Ars Technica concluded that "as Autopilot gets better, drivers could become increasingly complacent and pay less and less attention to the road."[489]

Post-crash scene on US 101 in Mountain View, March 23, 2018

In a corporate blog post, Tesla noted the impact attenuator separating the offramp from US 101 had been previously crushed and not replaced prior to the Model X crash on March 23.[485][490] The post also stated that Autopilot was engaged at the time of the crash, and the driver's hands had not been detected manipulating the steering wheel for six seconds before the crash. Vehicle data showed the driver had five seconds and 150 metres (490 ft) of "unobstructed view of the concrete divider, ... but the vehicle logs show that no action was taken."[485] The NTSB investigation had been focused on the damaged impact attenuator and the vehicle fire after the collision, but after it was reported the driver had complained about the Autopilot functionality,[491] the NTSB announced it would also investigate "all aspects of this crash including the driver's previous concerns about the autopilot".[492] An NTSB spokesman stated the organization "is unhappy with the release of investigative information by Tesla".[493] Elon Musk dismissed the criticism, tweeting that NTSB was "an advisory body" and that "Tesla releases critical crash data affecting public safety immediately & always will. To do otherwise would be unsafe."[494] In response, NTSB removed Tesla as a party to the investigation on April 11.[495]

NTSB released a preliminary report on June 7, 2018, which provided the recorded telemetry of the Model X and other factual details. Autopilot was engaged continuously for almost nineteen minutes prior to the crash. In the minute before the crash, the driver's hands were detected on the steering wheel for 34 seconds in total, but his hands were not detected for the six seconds immediately preceding the crash. Seven seconds before the crash, the Tesla began to steer to the left and was following a lead vehicle; four seconds before the crash, the Tesla was no longer following a lead vehicle; and during the three seconds before the crash, the Tesla's speed increased to 70.8 mph (113.9 km/h). The driver was wearing a seatbelt and was pulled from the vehicle before it was engulfed in flames.[496]

The crash attenuator had been previously damaged on March 12 and had not been replaced at the time of the Tesla crash.[496] The driver involved in the accident on March 12 collided with the crash attenuator at more than 75 mph (121 km/h) and was treated for minor injuries; in comparison, the driver of the Tesla collided with the collapsed attenuator at a slower speed and died from blunt force trauma. After the accident on March 12, the California Highway Patrol failed to report the collapsed attenuator to Caltrans as required. Caltrans was not aware of the damage until March 20, and the attenuator was not replaced until March 26 because a spare was not immediately available.[497]:1–4 This specific attenuator had required repair more often than any other crash attenuator in the Bay Area, and maintenance records indicated that repair of this attenuator was delayed by up to three months after being damaged.[497]:4–5 As a result, the NTSB released a Safety Recommendation Report on September 9, 2019, asking Caltrans to develop and implement a plan to guarantee timely repair of traffic safety hardware.[498]

At an NTSB meeting held on February 25, 2020, the board concluded the crash was caused by a combination of the limitations of the Tesla Autopilot system, the driver's over-reliance on Autopilot, and driver distraction likely from playing a video game on his phone. The vehicle's ineffective monitoring of driver engagement was cited as a contributing factor, and the inoperability of the crash attenuator contributed to the driver's injuries.[499] As an advisory agency, NTSB does not have regulatory power; however, NTSB made several recommendations to two regulatory agencies. The NTSB recommendations to the NHTSA included: expanding the scope of the New Car Assessment Program to include testing of forward collision avoidance systems; determining if "the ability to operate [Tesla Autopilot-equipped vehicles] outside the intended operational design domain pose[s] an unreasonable risk to safety"; and developing driver monitoring system performance standards. The NTSB submitted recommendations to OSHA relating to distracted driving awareness and regulation. In addition, NTSB issued recommendations to manufacturers of portable electronic devices (to develop lock-out mechanisms to prevent driver-distracting functions) and to Apple (to ban the nonemergency use of portable electronic devices while driving).[500]

Several NTSB recommendations previously issued to NHTSA, DOT, and Tesla were reclassified to "Open—Unacceptable Response". These included H-17-41[501] (recommendation to Tesla to incorporate system safeguards that limit the use of automated vehicle control systems to design conditions) and H-17-42[502] (recommendation to Tesla to more effectively sense the driver's level of engagement).[500]

Kanagawa, Japan (April 29, 2018)

Tesla Model X accident in Kanagawa, Japan

On April 29, 2018, a Tesla Model X operating on Autopilot struck and killed a pedestrian in Kanagawa, Japan, after the driver had fallen asleep.[503] According to a lawsuit filed against Tesla in federal court (N.D. Cal.) in April 2020, the Tesla Model X accelerated from 24 to 38 km/h (15 to 24 mph) after the vehicle in front of it changed lanes; it then crashed into a van, motorcycles, and pedestrians in the far right lane of the expressway, killing a 44-year-old man on the road directing traffic.[504][505]:2;9 The original complaint claims the accident occurred due to flaws in Tesla's Autopilot system, such as inadequate monitoring to detect inattentive drivers and an inability to handle traffic situations "that drivers will almost always certainly encounter".[505]:3–4[506] In addition, the original complaint claimed this is the first pedestrian fatality to result from the use of Autopilot.[505]:1

According to vehicle data logs, the driver of the Tesla had engaged autopilot at 2:11 p.m. Japan Standard Time, shortly after entering the Tōmei Expressway.[505]:11 The driver's hands were detected on the wheel at 2:22 p.m.[505]:11 At some point before 2:49 p.m., the driver began to doze off, and at approximately 2:49 p.m., the vehicle ahead of the Tesla signaled and moved one lane to the left to avoid the vehicles stopped in the far right lane of the expressway.[505]:11 While the Tesla was accelerating to resume its preset speed, it struck the man, killing him.[505]:11 He belonged to a motorcycle riding club which had stopped to render aid to a friend that had been involved in an earlier accident; he specifically had been standing apart from the main group while trying to redirect traffic away from that earlier accident.[505]:9–10

The driver of the Tesla was convicted in a Japanese court of criminal negligence and sentenced to three years in prison (suspended for five years).[507] The suit against Tesla in California was dismissed for forum non conveniens by Judge Susan van Keulen in September 2020 after Tesla said it would accept a case brought in Japan.[508] The plaintiffs appealed the dismissal to the Ninth Circuit Court of Appeals in February 2021,[504] which upheld the lower court's dismissal.[509]

Delray Beach, Florida, USA (March 1, 2019)

Tesla Model 3 accident in Delray Beach, Florida

At approximately 6:17 a.m. Eastern Standard Time on the morning of March 1, 2019, a Tesla Model 3 driving southbound on US 441/SR 7 in Delray Beach, Florida, struck a semi-trailer truck that was making a left-hand turn onto northbound SR 7 out of a private driveway at Pero Family Farms; the Tesla underrode the trailer, and the force of the impact sheared off the greenhouse of the Model 3, resulting in the death of the Tesla driver.[510] The driver of the Tesla had engaged Autopilot approximately 10 seconds before the collision, and preliminary telemetry showed the vehicle did not detect the driver's hands on the wheel for the eight seconds immediately preceding the collision.[511] The driver of the semi-trailer truck was not cited.[512] Both the NHTSA and NTSB dispatched investigators to the scene.[513][514]

According to telemetry recorded by the Tesla's restraint control module, the Tesla's cruise control was set to 69 mph (111 km/h) 12.3 seconds prior to the collision and Autopilot was engaged 9.9 seconds prior to the collision; at the moment of impact, the vehicle speed was 68.3 mph (109.9 km/h).[515] After the crash and underride, the Tesla continued southbound on SR 7 for approximately 1,680 ft (510 m) before coming to rest in the median between the northbound and southbound lanes.[516] The car sustained extensive damage to the roof, windshield, and other surfaces above 3 ft 6 in (1.07 m), the clearance under the trailer. Although the airbags did not deploy following the collision, the Tesla's driver remained restrained by his seatbelt; emergency response personnel were able to determine the driver's injuries were incompatible with life upon arriving at the scene.[517]

In May 2019, the NTSB issued a preliminary report that determined that neither the driver of the Tesla nor the Autopilot system executed evasive maneuvers.[518] The circumstances of this crash were similar to the fatal underride crash of a Tesla Model S in 2016 near Williston, Florida; in its 2017 report detailing the investigation of that earlier crash, NTSB recommended that Autopilot be used only on limited-access roads (i.e., freeways),[481]:33 which Tesla did not implement.[519]

The NTSB issued its final report in March 2020.[520] The probable cause of the collision was the truck driver's failure to yield the right of way to the Tesla; however, the report also concluded that "[a]t no time before the crash did the car driver brake or initiate an evasive steering action. In addition, no driver-applied steering wheel torque was detected for 7.7 seconds before impact, indicating driver disengagement, likely due to overreliance on the Autopilot system." In addition, the NTSB concluded the operational design of the Tesla Autopilot system "permitted disengagement by the driver" and Tesla failed to "limit the use of the system to the conditions for which it was designed"; the NHTSA also failed to develop a method of verifying that manufacturers had safeguards in place to limit the use of ADAS to design conditions.[516]:14–15

Key Largo, Florida, USA (April 25, 2019)

While driving on Card Sound Road, a 2019 Model S ran through a stop sign and flashing red stop light at the T-intersection with County Road 905, then struck a parked Chevrolet Tahoe, which spun and hit two pedestrians, killing one. An article in The New York Times later confirmed Autopilot was engaged at the time of the accident.[521] The driver of the Tesla, who was commuting to his home in Key Largo from his office in Boca Raton, dropped his phone while on a call to make flight reservations and bent down to pick it up, failing to stop at the intersection: "I looked down, and I ran the stop sign and hit the guy's car ... When I popped up and I looked and saw a black truck — it happened so fast", later telling the responding police officers that Autopilot was "stupid cruise control".[521]

The driver of the Tesla called authorities to respond; he had spotted only one injured man, who was unconscious and bleeding from the mouth. He told police at the scene that he was driving in "cruise" and was allowed to leave without receiving a citation.[521][522][523] Emergency medical personnel saw a woman's shoe under the Tahoe, prompting a search for a second victim, who was found approximately 25 yd (23 m) away from the scene, where she had been thrown by the impact.[521]

The decedent's family filed separate lawsuits against Tesla and the driver; the suit against the driver was settled out of court.[521] The lawsuit against Tesla alleges the company marketed a vehicle with "defective and unsafe characteristics, such as the failure to adequately determine stationary objects in front of the vehicle, which resulted in the death of [the victim]".[524]

Fremont, California, USA (August 24, 2019)

On I-880 in Fremont, California, north of Stevenson Boulevard, a Ford Explorer pickup was rear-ended by a Tesla Model 3 using Autopilot, causing the pickup's driver to lose control. The pickup overturned, and a 15-year-old passenger in the Ford, who was not wearing a seatbelt, was ejected from the vehicle and killed.[525][526][527] The deceased's parents sued Tesla and claimed in their filing that "Autopilot contains defects and failed to react to traffic conditions."[528] In response, a lawyer for Tesla noted the police had cited the driver of the Tesla for inattention and operating the car at an unsafe speed.[529] The incident has not been investigated by the NHTSA.[525]

Cloverdale, Indiana, USA (December 29, 2019)

Tesla crash in Cloverdale, Indiana

An eastbound Tesla Model 3 rear-ended a fire truck parked along I-70 near mile marker 38 in Putnam County, Indiana at approximately 8 a.m.;[530][531] both the driver and passenger in the Tesla, a married couple, were injured and taken to Terre Haute Regional Hospital, where the passenger later died from her injuries. The driver stated he regularly uses Autopilot mode, but could not recall if it was engaged when the Tesla hit the fire truck.[532]

The NHTSA announced it was investigating the crash on January 9[533] and later confirmed the use of Autopilot at the time of the crash.[411] The driver filed a civil lawsuit against Tesla in November 2021;[534] it was moved to federal court in February 2022.[535]

Gardena, California, USA (December 29, 2019)

Tesla crash in Gardena, California

Shortly before 12:39 a.m. on December 29, 2019, a westbound Tesla Model S exited the freeway section of SR 91, failed to stop for a red light, and crashed into the driver's side of a Honda Civic in Gardena, California, killing the driver and passenger in the Civic and injuring the driver and passenger in the Tesla.[536] The freeway section of SR 91 ends just east of the intersection with Vermont Ave and continues as Artesia Blvd. The Tesla was proceeding west on Artesia against the red light when it struck the Civic, which was turning left from Vermont onto Artesia.[537] The occupants of the Tesla were taken to the hospital with non-life-threatening injuries.[538]

The NHTSA initiated an investigation of the crash,[539] which was considered unusual for a two-vehicle collision,[538] and later confirmed in January 2022 that Autopilot was engaged during the crash. The driver of the Tesla was charged in October 2021 with vehicular manslaughter in Los Angeles County Superior Court.[540][541] The families of the two killed also have filed separate civil lawsuits against the driver of the Tesla, for his negligence, and Tesla, for selling defective vehicles.[542]

In May 2022, a preliminary court hearing was held to determine if there was probable cause to proceed with a trial; a Tesla engineer testified the driver of the Tesla had engaged the Autopilot system approximately 20 minutes prior to the crash, setting the speed at 78 mph (126 km/h). The Tesla was traveling at 74 mph (119 km/h) when it collided with the Honda. The judge ordered the driver of the Tesla to stand trial on two counts of vehicular manslaughter.[543] Telemetry data indicated the driver had a hand on the steering wheel, but no brake inputs were detected in the six minutes preceding the crash, despite multiple signs at the end of the freeway warning drivers to slow down.[544] The driver of the Tesla pleaded not guilty in June.[545] The trial, scheduled for November 15, was postponed to late February 2023.[546] The driver changed his plea to no contest and was sentenced to two years of probation in June 2023.[547]

Arendal, Norway (May 29, 2020)

Fatal Tesla crash in Arendal, Norway

On May 29, 2020, at approximately 11:00 a.m., after being notified that some straps on his trailer had come loose, a solo truck driver parked a tractor-trailer on the hard shoulder of northbound E18, 181 m (594 ft) northeast of the Torsbuås tunnel exit, just outside Arendal. Because of the restricted shoulder width, part of the truck was protruding into the right lane of E18.[548] While fixing the loose strap that was securing the load, he was struck and killed by a northbound Tesla Model S.[549] The Tesla driver had engaged Autopilot approximately 4 km (2.5 mi) south of the accident site; as he exited the tunnel and approached the parked truck, he observed there were no warning lights on the truck or a warning triangle on the ground and he assumed the truck was abandoned.[548] He then "heard a loud bang, and the car's windscreen cracked"; after pulling over to the shoulder, he walked back towards the parked truck and saw the truck driver's body.[548]

The Tesla's driver was charged with negligent homicide. Early in the trial, an expert witness testified that the car's computer indicated Autopilot was engaged at the time of the incident.[549] A forensic scientist said the victim was less visible because he was in the shadow of the trailer.[550] The driver said he had both hands on the wheel,[550] and that he was vigilant.[549] He was sentenced to three months' imprisonment in December 2020.[551]

The Accident Investigation Board Norway investigated the crash[549][552][553] and published its report in June 2022.[548] According to the investigation report, the truck driver had failed to report his stop to the Traffic Control Centre, and no passing motorists reported the parked truck; consequently, the driver of the Tesla was not notified there was a truck parked outside the tunnel. The Tesla's driver believed there was sufficient room to pass the parked truck while remaining in the right lane. Because the truck driver was next to the trailer in the shadow cast by the truck, the Tesla driver's view of the truck driver may have been compromised.[551]

In addition, the company responsible for planning and constructing the road, Nye Veier AS, was faulted by the investigators. During the planning phase, Nye Veier proposed a narrower shoulder of 2.0 m (6 ft 7 in) rather than 3.0 m (9 ft 10 in) as originally designed; this variance was approved by the Norwegian Public Roads Administration contingent on Nye Veier implementing mitigations. Nye Veier did not implement the proposed mitigations.[551]

Marietta, Georgia, USA (September 17, 2020)

On September 17, 2020, at approximately 5:24 a.m. EDT, the driver of a 2020 Tesla Model 3 crashed into an occupied CobbLinc bus shelter, demolishing it and killing the man waiting inside. The Tesla was driving north on South Cobb Drive near the intersection with Leader Road.[554][555] Because the car's event data recorder showed it had reached a speed of 77 mph (124 km/h) prior to the crash and that area has a posted speed limit of 45 mph (72 km/h), police charged the driver with first-degree vehicular homicide and reckless driving.[556]

At the time of the crash, it was not determined if Autopilot was engaged.[554] In September 2022, data provided by Tesla to the NHTSA demonstrated that Autopilot was active at the time of the crash.[440]:Report ID 13781–3847

The Woodlands, Texas, USA (April 17, 2021)

Tesla crash in The Woodlands, Texas

A Tesla Model S P100D[557] crashed and caught fire after departing Hammock Dunes Place in The Woodlands, Texas, a suburb of Houston, at 9:07 p.m. CDT on April 17, 2021, killing the driver and passenger.[558] According to a police spokesperson, the vehicle was traveling at a high speed and after failing to negotiate a curve, departed the roadway, crashed into a tree, and burst into flames.[559] Autopsies showed both the driver and passenger died from blunt-force trauma and smoke inhalation.[558] Initially, law enforcement authorities suspected that Autopilot was involved, based on the position of the bodies as found after the crash,[559] but subsequent investigations determined that Autopilot was not used.[558]

Security footage from the point of departure at the owner's residence showed that when the car left, the two men were occupying the driver's seat and the front passenger seat.[560] Witnesses stated the two men wanted to test drive the vehicle without a driver.[561] One man was found in the front passenger seat, and the other was in the back seat.[559][561] Because the responding personnel found neither man behind the wheel of the Tesla, authorities initially were "100 percent certain that no one was in the driver seat driving that vehicle at the time of impact",[559] which the subsequent investigation found to be false, as the NTSB determined the driver, who was wearing a seatbelt, was "moved into the rear seat" by the deployment of the airbag.[562]

The resulting fire took four hours and more than 30,000 US gal (113,600 L) of water to extinguish.[559] The chief of The Woodlands fire department later clarified the fire had been knocked down within a few minutes of arriving on the scene, but because of the presence of the bodies, the ongoing investigation, and the possibility that it was a crime scene, the vehicle could not be moved, and a steady stream of water was required to keep the battery cool.[563][564]

Investigators from both NHTSA and NTSB were dispatched.[565] Although the post-crash fire destroyed the car's onboard telemetry storage, the restraint control module/event data recorder (EDR), while damaged, was evaluated at the NTSB's recorder laboratory.[560] Based on data recovered from the EDR, the highest recorded speed in the five seconds leading up to the crash was 67 mph (108 km/h).[566] According to interviews, that afternoon the driver and his wife had hosted two friends at their house, going out for dinner and returning to the house on the cul-de-sac of Hammock Dunes Place around 8:30 p.m. Alcohol had been consumed. The driver was showing the car to his friend; after they entered the street, the driver failed to negotiate the left-hand turn along Hammock Dunes Place, departed the road, struck a storm sewer inlet and an elevated manhole cover, sideswiped a tree, and came to rest after colliding with a second tree at 57 mph (92 km/h).[558] The Woodlands Fire Department was notified of a "small outside fire" via 9-1-1 at 9:24 p.m.[558]

In response to the incorrect early assertions that Autopilot was involved, Elon Musk stated on Twitter that data logs indicated that Autopilot was not enabled, and the FSD package had not been purchased for that car.[567][568] During an earnings call in April 2021, Tesla's vice president of vehicle engineering pushed back on the news coverage of the incident and added that Tesla representatives had studied the crash and reported the steering wheel was "deformed", which could indicate "someone was in the driver's seat at the time of the crash".[569][570] The same Tesla executive noted a test car's adaptive cruise control had accelerated the car to only 30 mph (48 km/h) at the crash site.[571][572] On a closed course, Consumer Reports demonstrated that Autopilot would stay engaged after a person climbed out of the driver's seat by using a weight to apply torque to the steering wheel and leaving the driver's seatbelt buckled.[573] The NTSB tested an exemplar car at the site and found that Autosteer was not available on that part of Hammock Dunes.[560]

In an update published in October 2021, the NTSB concluded that Autopilot was not engaged and both the driver and front passenger seats were occupied at the time of the crash, based on the deformation of the steering wheel and data recovered from the car's EDR.[566] The frontal collision disabled the car's low-voltage (12 V) electrical system, which includes the power supply for normal rear seat door handles. In the event of power loss, rear seat passengers are expected to use a mechanical release tab in the carpet beneath the rear seat cushions; investigators were unable to determine if the mechanical release had been used due to damage from the postcrash fire.[558] The final investigation report, published in February 2023, determined the driver was operating the vehicle with a blood alcohol concentration of 0.151 g/dL, almost twice the legal limit in Texas (0.08 g/dL), concluding "the probable cause of the Spring, Texas, electric vehicle crash was the driver's excessive speed and failure to control his car, due to impairment from alcohol intoxication in combination with the effects of two sedating antihistamines".[558]

Fontana, California, USA (May 5, 2021)

At 2:35 a.m. PDT on May 5, 2021, a Tesla Model 3 crashed into an overturned tractor-trailer on the westbound Foothill Freeway (I-210) in Fontana, California. The driver of the Tesla was killed, and a man who had stopped to assist the driver of the truck was struck and injured by the Tesla.[574] California Highway Patrol (CHP) officials announced on May 13 that Autopilot "was engaged" prior to the crash, but added a day later that "a final determination [has not been] made as to what driving mode the Tesla was in or if it was a contributing factor to the crash".[575] The CHP and NHTSA are investigating the crash.[576][577] Telemetry data indicate that an automated driving system was in use at the time of the crash.[440]:Report ID 13781–5609

Queens, New York, USA (July 26, 2021)

On July 26, 2021, just after midnight, a man was hit and killed by a driver in a Tesla Model Y SUV. The victim had parked his vehicle on the left shoulder of the westbound Long Island Expressway (I-495), just east of the College Point Boulevard exit in Flushing, Queens, New York, to change a flat tire.[578][579] The NHTSA later determined Autopilot was active during the collision and sent a team to further investigate.[580][440]:Report ID 13781-21

Evergreen, Colorado, USA (May 16, 2022)

In the evening of May 16, 2022, the driver of a Tesla Model 3 left Upper Bear Creek Road in Evergreen, Colorado and collided with a tree. After the car caught on fire, a passenger was able to exit, but the driver was unable to leave the car and died at the scene.[581] Law enforcement suspect that the Tesla was operating in Autopilot.[440]:Report ID 13781-3074

Mission Viejo, California, USA (May 17, 2022)

At 10:51 p.m. PDT on May 17, 2022, a pedestrian walking on southbound I-5 near Crown Valley Parkway in Mission Viejo, California was struck and killed by a driver operating a Tesla Model 3. After the pedestrian was hit, the driver of the Tesla parked the car and exited it to stand on the right shoulder of the freeway; an impaired driver then crashed their car into the Tesla, and a third driver crashed into the two-car wreck, which was in a construction zone.[582] Field report data confirmed the Tesla was operating in Autopilot when the pedestrian was killed.[440]:Report ID 13781-3279

Gainesville, Florida, USA (July 6, 2022)

At approximately 2:00 p.m. EDT on July 6, 2022, the driver of a Tesla Model S traveling southbound on I-75 exited at a rest area just south of Gainesville, Florida, near Paynes Prairie Preserve State Park, and crashed into the rear of a parked Walmart tractor-trailer. Both the driver and passenger of the Tesla, a married couple from Lompoc, California, were killed.[583] A spokesperson for the Florida Highway Patrol noted "[The vehicle] came off the exit ramp to the rest area, continued south for a short period, and turned into an easterly direction and that's at what time we had the collision where the Tesla struck the rear of the tractor-trailer."[584] The NHTSA confirmed it had sent an investigation team to the site.[585] Data reported by Tesla under NHTSA SGO-2021-01 confirm that Autopilot was engaged during the crash.[440]:Report ID 13781-3327

Riverside, California, USA (July 7, 2022)

It was initially (and incorrectly) reported that at 4:47 a.m. PDT on July 7, 2022, a driver in a Tesla Model Y approached from behind, and then struck a motorcyclist on a Yamaha V-Star. Both vehicles were traveling eastbound in the high-occupancy vehicle lane of SR 91, west of Magnolia Avenue in Riverside, California. The motorcyclist was ejected from his vehicle and died at the scene, while the driver of the Tesla was uninjured after the Model Y went off the road.[586] The driver of the Tesla was not arrested.[587]

Subsequent CHP investigation showed the motorcyclist struck the dividing wall and fell off his motorcycle; the Tesla Model Y following behind struck the motorcycle (which was already lying on its side) but not the motorcyclist. Telemetry data from Tesla later confirmed the Model Y driver was using Autopilot.[588] Data reported by Tesla under NHTSA SGO-2021-01 also confirmed that Autopilot was engaged during the crash.[440]:Report ID 13781-3332

Draper, Utah, USA (July 24, 2022)

A motorcycle rider was struck from behind by a driver using Autopilot in a Tesla Model 3 on southbound Interstate 15 near 15000 S in Draper, Utah, at 1:09 a.m. MDT on July 24, 2022. The collision threw the motorcycle rider from his Harley-Davidson to the ground, killing him.[589][590] The driver told police he did not see the motorcyclist and he was using Autopilot at the time of the crash. Telemetric data submitted to NHTSA later confirmed his statements.[588][440]:Report ID 13781-3488

Michael Brooks, the acting executive director of the Center for Auto Safety commented "It's pretty clear to me, and it should be to a lot of Tesla owners by now, this stuff isn't working properly and it's not going to live up to the expectations, and it is putting innocent people in danger on the roads ... Drivers are being lured into thinking this protects them and others on the roads, and it's just not working."[591]

Boca Raton, Florida, USA (August 26, 2022)

On August 26, 2022 at 2:11 a.m. EDT, a motorcycle rider on a Kawasaki Vulcan was struck from behind by a driver in a Tesla Model 3 while both vehicles were traveling westbound on SW 18th Street approaching Boca Rio Road in Sandalfoot Cove, a census-designated place in unincorporated Palm Beach County, just outside the city of Boca Raton, Florida. The motorcycle rider was thrown from her motorcycle into the windshield of the Tesla; the rider was transported to a hospital, where she later died from the injuries she sustained in the collision. The driver of the Tesla was suspected of driving under the influence of alcohol and/or prescription drugs.[592]

The Palm Beach County Sheriff's Office later confirmed the driver of the Tesla was using Autopilot.[588] Data reported by Tesla under NHTSA SGO-2021-01 also confirm that Autopilot was engaged during the crash.[440]:Report ID 13781-3713

There have been multiple fatal collisions in the United States during 2022 in which a Tesla operating with Autopilot struck a motorcycle from the rear; in each instance, the motorcyclist was killed.[593][594] One theory holds that because Tesla has shifted to exclusively visual sensors, the Autopilot logic that sets the gap to a leading vehicle estimates distance from the apparent spacing of that vehicle's taillights, which shrinks in inverse proportion to distance. Because a motorcycle's taillights are close-set, Autopilot may incorrectly judge the motorcycle to be a distant car or truck.[595]
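The pinhole-camera geometry behind this theory can be sketched in a few lines. This is purely illustrative and is not Tesla's implementation; the focal length and taillight spans below are assumed round numbers.

```python
# Illustrative sketch of the geometry behind the taillight-spacing theory.
# None of this is Tesla's actual code; all numbers are assumed round values.

FOCAL_LENGTH_PX = 1000          # assumed camera focal length, in pixels

def estimated_distance_m(assumed_taillight_span_m, observed_span_px):
    """Monocular distance estimate: apparent size falls off inversely with distance."""
    return assumed_taillight_span_m * FOCAL_LENGTH_PX / observed_span_px

# A motorcycle 20 m ahead with taillights 0.3 m apart subtends 15 px.
observed_px = 0.3 * FOCAL_LENGTH_PX / 20     # = 15 px

# If the system assumes the close-set lights belong to a car (~1.5 m span),
# it infers the vehicle is five times farther away than it actually is.
print(estimated_distance_m(1.5, observed_px))   # 100.0 m, not 20 m
```

Under this (hypothetical) mistaken prior, the following gap the system maintains would be sized for a vehicle 100 m away rather than 20 m, consistent with the rear-end collision pattern described above.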

Walnut Creek, California, USA (February 18, 2023)

The driver of a 2014 Tesla Model S was killed after the vehicle he was driving crashed into a Contra Costa County fire truck parked across several lanes of northbound I-680 south of the Treat Boulevard offramp in Walnut Creek, California, at 4 a.m. on February 18, 2023.[596][597] The truck was parked with its lights on to protect the scene of an earlier accident that did not result in any injuries.[598] The Tesla had to be cut open to extricate the passenger, who was taken to the hospital for treatment of their injuries;[599] four firefighters in the fire truck were also injured and taken to the hospital.[600]

Initially, the California Highway Patrol stated it was not clear if the driver was intoxicated or operating the car with assistance features,[599] but NHTSA confirmed in March they suspected that an "automated driving system" was being used when the Tesla crashed into the fire truck, and had sent a special crash investigation team as part of a larger probe (EA 22-002) involving multiple incidents in which Teslas operating with Autopilot have crashed into stationary emergency response vehicles.[601] Tesla confirmed in April the car was operating under Autopilot at the time of the crash.[602] Telemetry data indicate that an automated driving system was in use at the time of the crash.[440]:Report ID 13781–4963

Corona, California, USA (March 28, 2023)

On March 28, 2023, at approximately 10:15 p.m., the driver of a Tesla Model Y died after the Tesla was struck by the driver of a Ford F-150 pickup truck, who had entered the intersection of Foothill Parkway and Rimpau Avenue in Corona, California, against a red light. The Tesla was proceeding through the intersection on a green light.[603][604] Telemetry data indicate that an automated driving system was in use at the time of the crash.[440]:Report ID 13781–5165

Central Point, Oregon, USA (June 5, 2023)

The Oregon State Police responded to a single-vehicle accident reported at 3:30 a.m. (PDT) on June 5, 2023, in Jackson County, Oregon; a Tesla Model S traveling northbound on I-5 near milepost 33 departed from the roadway, striking a fence and then a tree before catching fire. The driver was pronounced dead at the scene.[605][606] Telemetry data indicate that an automated driving system was in use at the time of the crash.[440]:Report ID 13781–5673

Brooklyn, New York, USA (June 7, 2023)

On June 7, 2023, at approximately 9 p.m., the driver of a Tesla Model S traveling along Ocean Parkway in Midwood, Brooklyn left the roadway, striking and killing a pedestrian waiting on the sidewalk to cross the street at the intersection with Avenue M.[607] The driver then struck a light pole and collided with a park bench on the median, injuring a man who had been seated on it.[608] The driver was arrested for leaving the scene of the crash.[609] Telemetry data indicate that an automated driving system was in use at the time of the crash.[440]:Report ID 13781–5685

Turlock, California, USA (June 20, 2023)

At approximately 3:15 a.m. on June 20, 2023, a driver operating a white sedan the wrong way (south in the northbound lanes) on SR 99 near Lander Avenue in Turlock, California, collided with a northbound Tesla Model Y traveling at approximately 70 mph (110 km/h). The driver of the wrong-way vehicle was killed, and the driver and passenger in the Tesla were injured.[610] Alcohol appears to have been a factor.[611] Telemetry data indicate that an automated driving system was in use at the time of the crash.[440]:Report ID 13781–5785

South Lake Tahoe, California, USA (July 5, 2023)

On July 5, 2023, at approximately 5:30 p.m. (PDT), the driver of a Subaru Impreza traveling north on Pioneer Trail at 55 mph (89 km/h) collided head-on with a Tesla Model 3 traveling south at 45 mph (72 km/h). The collision happened just south of the intersection with Fair Meadow Trail. The driver of the Subaru was taken to Barton Memorial Hospital, where he died from his injuries.[612][613] The five occupants of the Tesla were taken to UC Davis Medical Center, and one died, a three-month-old infant.[614] NHTSA dispatched an investigation team to the scene of the crash.[615] Telemetry data indicate that an automated driving system was in use at the time of the crash.[440]:Report ID 13781–5835

Opal, Virginia, USA (July 19, 2023)

While traveling north on the concurrent US 15/17/29 (James Madison Highway) at approximately 6:31 p.m. (EDT) on July 19, 2023, the driver of a Tesla Model Y collided with and continued under the side of the trailer of a combination truck pulling out of the Quarles Truck Stop fuel station near Opal, Virginia, south of Warrenton. The Tesla driver was killed and the truck driver was cited for reckless driving.[616] Two days later, the Fauquier County Sheriff's Office executed a search warrant for data from the Tesla, based on witness reports that said the Tesla driver did not attempt to brake before the collision.[617] That August, NHTSA sent a team to investigate the collision; the Tesla is suspected of being operated under Autopilot.[618] Telemetry data indicate that an automated driving system was in use at the time of the crash.[440]:Report ID 13781–5996

Non-fatal crashes

Culver City, California, USA (January 22, 2018)

On January 22, 2018, a 2014 Tesla Model S traveling at a speed exceeding 50 mph (80 km/h) crashed into a fire truck parked on the side of the I-405 freeway in Culver City, California; the driver survived without injury.[619] The driver told the Culver City Fire Department that he was using Autopilot. The fire truck and a California Highway Patrol vehicle were parked diagonally across the left emergency lane and high-occupancy vehicle lane of the southbound I-405, blocking off the scene of an earlier accident, with emergency lights flashing.[620]

According to a post-accident interview, the driver stated he was drinking coffee, eating a bagel, and maintaining contact with the steering wheel while resting his hand on his knee.[621]:3 During the 30-mile (48 km) trip, which lasted 66 minutes, the Autopilot system was engaged for slightly more than 29 minutes; of the 29 minutes, hands were detected on the steering wheel for only 78 seconds in total. Hands were detected applying torque to the steering wheel for only 51 seconds over the nearly 14 minutes immediately preceding the crash.[621]:9 The Tesla had been following a lead vehicle in the high-occupancy vehicle lane at approximately 21 mph (34 km/h); when the lead vehicle moved to the right to avoid the fire truck, approximately three or four seconds prior to impact, the Tesla's traffic-aware cruise control system began to accelerate the Tesla to its preset speed of 80 mph (130 km/h). When the impact occurred, the Tesla had accelerated to 31 mph (50 km/h).[621]:10 The Autopilot system issued a forward collision warning half a second before the impact, but did not engage the automatic emergency braking (AEB) system, and the driver did not manually intervene by braking or steering. Because Autopilot requires agreement between the radar and visual cameras to initiate AEB, the system was challenged due to the specific scenario (where a lead vehicle detours around a stationary object) and the limited time available after the forward collision warning.[621]:11
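The "lead-vehicle cut-out" sequence described in the NTSB brief (following at 21 mph, the lead car swerves away, traffic-aware cruise control accelerates toward its 80 mph set speed, impact at 31 mph) can be sketched numerically. The speeds are the report's figures; the acceleration rate, initial gap, and time step below are illustrative assumptions, not values from the report.

```python
# Minimal numerical sketch of the cut-out failure mode described by the NTSB.
# The 21/31/80 mph figures come from the report; accel, gap, and dt are assumed.

MPH_TO_MS = 0.44704
set_speed = 80 * MPH_TO_MS        # driver's preset cruise speed (m/s)
speed = 21 * MPH_TO_MS            # speed while following the lead car (m/s)
accel = 1.3                       # assumed cruise acceleration, m/s^2
gap = 30.0                        # assumed distance to the parked fire truck, m
dt = 0.1                          # simulation time step, s

# Lead car swerves away; cruise control sees open road and speeds up
# until the remaining gap to the stationary truck is consumed.
t = 0.0
while gap > 0:
    speed = min(speed + accel * dt, set_speed)
    gap -= speed * dt
    t += dt

print(f"impact at {speed / MPH_TO_MS:.0f} mph after {t:.1f} s")
```

Even with these rough assumptions, the sketch reproduces the report's shape of events: a few seconds of acceleration from 21 mph leaves far too little time for a half-second forward collision warning to matter.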

Several news outlets reported that Autopilot may not detect stationary vehicles at highway speeds and cannot detect some objects.[622] Raj Rajkumar, who studies autonomous driving systems at Carnegie Mellon University, believes the radars used for Autopilot are designed to detect moving objects, but are "not very good in detecting stationary objects".[623] Both NTSB and NHTSA dispatched teams to investigate the crash.[624] Hod Lipson, director of Columbia University's Creative Machines Lab, faulted the diffusion of responsibility concept: "If you give the same responsibility to two people, they each will feel safe to drop the ball. Nobody has to be 100%, and that's a dangerous thing."[625]

In August 2019, the NTSB released its accident brief, HAB-19-07, which concluded the driver of the Tesla was at fault due to "inattention and overreliance on the vehicle's advanced driver assistance system", but added the design of the Tesla Autopilot system "permitted the driver to disengage from the driving task".[621]:13–14 After the earlier crash in Williston, the NTSB issued a safety recommendation to "[d]evelop applications to more effectively sense the driver's level of engagement and alert the driver when engagement is lacking while automated vehicle control systems are in use." Among the manufacturers to which the recommendation was issued, only Tesla has failed to respond.[621]:12–13

South Jordan, Utah, USA (May 11, 2018)

On the evening of May 11, 2018, a 2016 Tesla Model S with Autopilot engaged crashed into the rear of a fire truck that was stopped in the southbound lane at a red light in South Jordan, Utah, at the intersection of SR-154 and SR-151.[626][627] The Tesla was moving at an estimated 60 mph (97 km/h) and did not appear to brake or attempt to avoid the impact, according to witnesses.[628][629] The driver of the Tesla, who survived the impact with a broken foot, admitted she was looking at her phone before the crash.[626][630] The NHTSA dispatched investigators to South Jordan.[631] According to telemetry data recovered after the crash, the driver repeatedly did not touch the wheel, including during the 80 seconds immediately preceding the crash, and only touched the brake pedal "fractions of a second" before the crash. The driver was cited by police for "failure to keep proper lookout".[626][632] The Tesla had slowed to 55 mph (89 km/h) to match a vehicle ahead of it, and after that vehicle changed lanes, accelerated to 60 mph (97 km/h) in the 3.5 seconds preceding the crash.[633]

Tesla CEO Elon Musk criticized news coverage of the South Jordan crash, tweeting that "a Tesla crash resulting in a broken ankle is front page news and the ~40,000 people who died in US auto accidents alone in [the] past year get almost no coverage", additionally pointing out that "[a]n impact at that speed usually results in severe injury or death", but later conceding that Autopilot "certainly needs to be better & we work to improve it every day".[631] In September 2018, the driver of the Tesla sued the manufacturer, alleging the safety features designed to "ensure the vehicle would stop on its own in the event of an obstacle being present in the path ... failed to engage as advertised."[634] According to the driver, the Tesla failed to provide an audible or visual warning before the crash.[633]

Moscow, Russia (August 10, 2019)

On the night of August 10, 2019, a Tesla Model 3 driving in the left-hand lane on the Moscow Ring Road in Moscow, Russia, crashed into a parked tow truck with a corner protruding into the lane and subsequently burst into flames.[635] According to the driver, the vehicle was traveling at the speed limit of 100 km/h (62 mph) with Autopilot activated; he also claimed his hands were on the wheel but that he was not paying attention at the time of the crash. All occupants were able to exit the vehicle before it caught fire; they were transported to the hospital. Injuries included a broken leg (driver) and bruises (his children).[636][637]

The force of the collision was enough to push the tow truck forward into the central dividing wall, as recorded by a surveillance camera. Passersby also captured several videos of the fire and explosions after the accident; these videos show that the tow truck the Tesla crashed into had been moved, suggesting the explosions of the Model 3 happened later.[638][639]

Chiayi, Taiwan (June 1, 2020)

Traffic cameras captured the moment when a Tesla Model 3 slammed into an overturned cargo truck in Taiwan on June 1, 2020.[640] The crash occurred at 6:40 a.m. National Standard Time on the southbound National Freeway 1 in Chiayi, Taiwan, at approximately the south 268.4 km marker.[641] The truck had been involved in a traffic accident at 6:35 a.m. and overturned with its roof facing oncoming traffic; the driver of the truck got out to warn other cars away.[642]

The driver of the Tesla was uninjured and told emergency responders that the car was in Autopilot mode,[640] traveling at 110 km/h (68 mph).[642] The driver told authorities that he saw the truck and thought the Tesla would brake automatically upon encountering an obstacle; when he realized it would not, he manually applied the brakes,[642] although it was too late to avoid the crash. The braking is apparently visible on the video as a puff of white smoke from the tires.[640][643]

Arlington Heights, Washington, USA (May 15, 2021)

A Tesla Model S crashed into a stopped Snohomish County, Washington, sheriff's patrol car at 6:40 p.m. PDT on May 15, 2021, shortly after the deputy parked it while responding to an earlier crash which had broken a utility pole near the intersection of SR 530 and 103rd Ave NE in Arlington Heights, Washington. The patrol car was parked to partially block the roadway and protect the collision scene, and its overhead emergency lights were activated.[644] Neither the deputy nor the driver of the Tesla was injured. The driver of the Tesla assumed his car would slow and move over on its own because it was in "Auto-Pilot mode".[645]

Brea, California, USA (November 3, 2021)

The driver of a Tesla Model Y reported a crash to the NHTSA that occurred on November 3, 2021 while operating in FSD Beta.[646] The incident was described as a "severe" crash after "the car by itself took control and forced itself into the incorrect lane" during a left turn.[647] It is likely this is the first complaint filed with NHTSA that alleges FSD caused a crash; NHTSA requested further information from Tesla, but other details of the crash, such as the driver's identity and location of the crash, were not released.[648]

Armadale, Victoria, Australia (March 22, 2022)

On March 22, 2022, at approximately 6:30 a.m., the driver of a Tesla Model 3 struck a woman boarding a city-bound tram on Wattletree Road in Armadale, an inner suburb of Melbourne in the Australian state of Victoria. After being struck, the victim was dragged for approximately 15–20 m (49–66 ft). She was taken to the hospital with life-threatening injuries.[649] The driver of the Tesla initially fled the scene, then turned herself in to police two hours later.[650] According to the official report, the driver stated her Tesla Model 3 was on Autopilot when she struck the pedestrian.[651]

The driver pleaded not guilty to four charges in April 2023, including dangerous driving causing serious injury, and was ordered to stand trial after the magistrate heard testimony from five witnesses. The tram operator testified he saw a woman rise from a seat at the tram stop and start walking toward the tram before she was struck: "I hear a thud, a whoosh, a car went passed [sic]". The chief safety officer of Yarra Trams testified that "once the tram has stopped... there are big flashing lights (at the rear of the vehicle), we call them school lights", adding the tram could not have opened its doors before the crash.[652]

Maumee, Ohio, USA (November 18, 2022)

On November 18, 2022 at 8:21 a.m., a Tesla Model 3 collided with the rear end of a stationary Ohio State Highway Patrol cruiser in the left lane of eastbound U.S. 24 near milepost 64, where it passes over Waterville–Monclova Road near Maumee, Ohio, a suburb of Toledo. The cruiser was parked with its emergency lights flashing to protect the vehicle involved in an earlier single-car accident at the scene.[653] The OSHP officer and the driver from the earlier accident were sitting in the cruiser; both sustained minor injuries from the impact.[654]

In December, the NHTSA confirmed they were investigating the crash, which may have involved Autopilot.[655] Telemetry data indicate that Autopilot was active.[440]:Report ID 13781–4293

San Francisco–Oakland Bay Bridge, California, USA (November 24, 2022)

The driver of a 2021 Tesla Model S told the California Highway Patrol that while driving eastbound in "Full Self-Driving" mode in the Yerba Buena Tunnel portion of the San Francisco–Oakland Bay Bridge near Treasure Island, at approximately noon on November 24, 2022,[656] the vehicle cut across several lanes of traffic to the far left lane and abruptly slowed from 55 to 20 mph (89 to 32 km/h), causing a chain-reaction collision involving eight vehicles. Nine people were treated for injuries, and two lanes of traffic were closed for 90 minutes.[657] Surveillance footage acquired by The Intercept corroborated the vehicle's sudden movements.[658][659]

NHTSA confirmed they would send a team to investigate the crash.[655] Telemetry data indicate that an automated driving system was in use at the time of the crash.[440]:Report ID 13781–4338

Halifax County, North Carolina, USA (March 15, 2023)

On Wednesday, March 15, 2023, in Halifax County, North Carolina, a 17-year-old high school student attending the Haliwa-Saponi Tribal School was struck by a driver in a 2022 Tesla Model Y. The student had just exited a school bus and was crossing the road to his house when he was struck by the Tesla. The bus was stopped with flashing lights and its stop arm deployed; the North Carolina State Highway Patrol initially attributed the cause of the injury to "distracted driving".[660] The student's father rendered first aid after witnessing the collision, which left the teenager with a broken neck and internal bleeding. He was flown to WakeMed and placed on a ventilator.[661]

It is unclear whether the car was operating on Autopilot during the accident; the State Highway Patrol is investigating, and NHTSA has also dispatched a team.[662] Telemetry data indicate that an automated driving system was in use at the time of the crash.[440]:Report ID 13781–5100

References

Template:Reflist
