Self-driving car liability
Increases in the use of autonomous car technologies (e.g., advanced driver-assistance systems) are causing incremental shifts in the responsibility of driving, with the primary motivation of reducing the frequency of traffic collisions.[1] Liability for incidents involving self-driving cars is a developing area of law and policy that will determine who is liable when a car causes physical damage to persons or property.[2] As autonomous cars shift the responsibility of driving from humans to autonomous car technology, there is a need for existing liability laws to evolve to reasonably identify the appropriate remedies for damage and injury.[3] As higher levels of autonomy are commercially introduced (SAE automation levels 3 and 4), the insurance industry stands to see higher proportions of commercial and product liability lines, while personal automobile insurance shrinks.[4]
Self-driving car liability may also be affected by regulation of self-driving vehicles being developed in some countries.
Overview
Self-driving car liability is a developing area of law and policy that will determine who is liable when an automated car causes physical damage to persons or breaks road rules.[5][2]
Similar considerations arise with other automated vehicles and with damage other than damage to persons.
When automated cars shift the control of driving from humans to automated car technology, the driver will need to consent to share operational responsibility,[6] which will require a legal framework.
Driving liability
There may be a need for existing liability laws to evolve in order to fairly identify the parties responsible for damage and injury, and to address potential conflicts of interest between human occupants, system operators, insurers, and the public purse.[3] Increases in the use of automated car technologies (e.g., advanced driver-assistance systems) may prompt incremental shifts in this responsibility for driving. Proponents claim the technology has the potential to reduce the frequency of road accidents, although this claim is difficult to assess in the absence of data from substantial real-world use.[7] If safety improved dramatically, operators might seek to pass their liability for the remaining accidents onto others as part of their reward for the improvement. There is, however, no obvious reason why they should escape liability if any such effects proved modest or nonexistent, since part of the purpose of liability is to give the party in control an incentive to do whatever is necessary to avoid causing harm. Potential users may also be reluctant to trust an operator that seeks to pass its normal liability on to others.
Transition and driver liability
In any case, a well-advised person who is not controlling a car at all (Level 5) would understandably be reluctant to accept liability for something outside their control. Where some degree of shared control is possible (Level 3 or 4), a well-advised person would be concerned that the vehicle might try to hand back control in the last seconds before an accident, passing responsibility and liability back as well, in circumstances where the potential driver has no better prospect of avoiding the crash than the vehicle: they have not necessarily been paying close attention, and if the situation is too hard for a very capable automated system it may be too hard for a human. Since operators, especially those familiar with sidestepping existing legal obligations (under a motto like 'seek forgiveness, not permission'), such as Waymo or Uber, could normally be expected to try to avoid responsibility to the maximum degree possible, there is potential for attempts by operators to evade being held liable for accidents that occur while they are in control.
As higher levels of automation are commercially introduced (Level 3 and 4), the insurance industry may see a greater proportion of commercial and product liability lines while personal automobile insurance shrinks.[4]
Fully autonomous driving liability
Tort law cannot be ignored when considering liability for fully autonomous cars. In any car accident, the issue of negligence usually arises. With autonomous cars, negligence would most likely fall on the manufacturer, because it would be hard to pin a breach of duty of care on a user who is not in control of the vehicle. The only time negligence has been raised in an autonomous car lawsuit, the case was settled between the person struck by the autonomous vehicle and the manufacturer (General Motors). Second, product liability would most likely place liability on the manufacturer: for an accident to fall under product liability, there needs to be a defect, a failure to provide adequate warnings, or foreseeability by the manufacturer.[8] Third is strict liability, which in this case is similar to product liability based on a design defect. Based on a Nevada Supreme Court ruling (Ford v. Trejo), the plaintiff needs to prove that the manufacturer failed the consumer-expectation test.[9] That is potentially how the three major torts could function with respect to autonomous car liability.
Current liability frameworks
Existing tort liability for drivers and insurers, and product liability for manufacturers, provide the current basis for governing crashes.
Tort liability
There are three basic theories of tort liability: traditional negligence, no-fault liability and strict liability.[3]
| Theory | Description |
|---|---|
| Traditional negligence | Driver is held liable for harms caused when reasonable care was not taken while in operation of the vehicle |
| No-fault | Crash victims are not permitted to sue the driver of the vehicle, unless the injuries resulting from the crash are of a certain severity. Victims are compensated through their own insurance |
| Strict liability | Applies for abnormally dangerous or "ultrahazardous" activities. The actors involved consequently bear the associated costs regardless of whether they are legally at fault |
According to the National Motor Vehicle Crash Causation Survey, the driver was identified as the critical reason in over 90% of crashes (an estimated 2 million crashes in the United States).[10] Meanwhile, research from the Insurance Institute for Highway Safety (IIHS) shows that advanced driver-assistance systems, seen as stepping stones toward Level 3 and 4 autonomy, have helped reduce collisions by employing forward-collision warnings and automatic braking.[11] Given these trends, the increased use of autonomous vehicle technology could reduce the number of collisions and prevent crash-related deaths.[12] Consequently, cases of traditional negligence would likely fall, which would, in turn, reduce automobile-insurance costs.[3]
With the onset of fully autonomous cars, it is possible that the need for specialized automobile insurance will disappear and that health insurance and homeowner's liability insurance will instead cover automobile crashes, much as they cover bicycle collisions.[3] Moreover, as cases of traditional negligence decrease, no-fault insurance systems appear attractive given their benefits.[3] They would provide compensation to victims relatively quickly, and the compensation would not depend on identifying a party at fault. Under such systems, individual drivers would be well protected, which would encourage the adoption of autonomous cars for their safety and cost-related benefits.
Negligence was the basis for the lawsuit Nilsson v. General Motors. Nilsson was knocked off his motorcycle when a Chevrolet Bolt moved into his lane while in self-driving mode. Nilsson sued for negligence on the grounds that the self-driving car (1) had a duty to follow traffic laws and regulations, (2) breached that duty by changing lanes while he was passing, and (3) injured his neck.[13] The case was settled before it went to court, so the question of whether the error of the self-driving car was foreseeable remains unanswered.
Product liability
Product liability governs the liability of manufacturers in terms of negligence and strict liability.[3]
| Theory | Description |
|---|---|
| Negligence | Manufacturers must exercise reasonable care in designing their products to be safe under potential use cases |
| Strict liability | Manufacturer is held strictly liable for damages even when the manufacturer has exercised all possible care to remove defects |
Autonomous car manufacturers are incentivized by possible product liability lawsuits to reduce the danger of their products as much as they can within a reasonable cost structure. Strict liability covers an expansive range of potential harms that manufacturers may find challenging to protect against; rather than reducing risks that are not cost-effective to address, manufacturers may, to some degree, pass the potential costs of liability on to consumers through higher prices.
Furthermore, product liability cases distinguish among various types of defects.
| Defect type | Description |
|---|---|
| Manufacturing defects | When the product does not meet the manufacturer's specifications and standards |
| Design defects | When foreseeable risks of harm could have been reduced by use of an alternative design |
| Failure to warn | When the manufacturer fails in its duty to provide instruction about how the product can be safely used and does not provide adequate warning of its risks |
Under manufacturing defects, a plaintiff needs to show that the autonomous car failed to work as specified by the manufacturer. In the case of autonomous cars, however, this presents a significant hurdle, because no court has applied manufacturing-defect doctrine to software, which is not something tangible that is manufactured.[14] Incorrect behavior of the technology is termed a "malfunction", meaning a coding error in the system caused the collision; when there is a coding error, the controlling software may not have functioned as its authors originally intended.[15] If a crash stems from a software error, the traditional product liability law on manufacturing defects may not suffice. How software will be treated under this liability law, particularly when a software error causes physical parts to malfunction, still needs to be explored.
Historically, courts have used two tests for the defectiveness of design: consumer-expectations and cost-benefit.
Consumer-expectations: "A product is defective in design or formulation when it is more dangerous than an ordinary consumer would expect when used in an intended or reasonably foreseeable manner. Moreover, the question of what an ordinary consumer expects in terms of the risks posed by the product is generally one for the trier of fact."[16]
On the other hand, the cost-benefit test weighs the benefits against the costs of a product in determining whether a design is defective. With autonomous cars, the plaintiff could make the argument that a different design, whether in the physical features of the vehicle or in the software that controls the movements of the vehicle, could have made the vehicle safer.[17] For plaintiffs, this creates a high burden of proof and also makes it challenging to find qualified experts.[14]
Imposing liability
In asking "who do I sue," a plaintiff in a traditional car crash would assign blame to the driver or the car manufacturer, depending on the cause of the crash. In a crash involving an autonomous car, a plaintiff may have four options to pursue.[14]
- Operator of the vehicle: in Florida and Nevada, an operator is defined as a person who causes the autonomous technology to engage, regardless of whether the person is physically in the vehicle.[18] California, on the other hand, defines an operator as “the person who is seated in the driver’s seat, or, if there is no person in the driver’s seat, causes the autonomous technology to engage.”[14] The viability of a claim against the operator will be determined by the level of autonomy. For instance, if the autonomous technology allows the passenger to cede full control of the vehicle, then the passenger will likely not be found at fault for a crash caused by the technology.[14]
- Car manufacturer: with this option, a plaintiff will need to determine whether the manufacturer had a part in installing autonomous technology into the vehicle. States such as Florida, however, are providing protection by limiting product liability for manufacturers.[18]
- Company that created the finished autonomous car: Volvo is an example of a manufacturer who has pledged to take full responsibility for collisions caused by its self-driving technology.[19]
- Company that created the autonomous car technology: Companies under this option could include those developing the software behind the autonomous car and those manufacturing the sensor systems that allow a vehicle to detect its surroundings.
Possible defenses
In defense against such liabilities, autonomous vehicle manufacturers could raise the arguments of comparative negligence, product misuse, and state of the art.[3] With comparative negligence, driver or passenger interference is seen as part of the cause of harm and injury. With product misuse, the driver or passenger may be at fault for disregarding directions or altering the vehicle in a way that affects its proper performance. With state of the art, manufacturers could argue that no safe alternative designs existed at the time of manufacture.[3]
Cyber liability
As cars become more interconnected and autonomous, the potential for hacking a car system to acquire data and cause harm poses a serious risk. For manufacturers and developers of autonomous technology, liability exposures arise from the collection and storage of data and personal information in the vehicle and the cloud.[20] Currently, manufacturers require indemnification from vendors and subcontractors (dealerships, repair/installation facilities, etc.), and this practice will likely be extended to autonomous technology developers.
Transportation systems are vital for autonomous vehicles because they act as the commander, and with multiple autonomous vehicle systems used together to increase efficiency, the risk of exposure to malicious attacks will dramatically increase. To protect these systems, cyber-physical security must be implemented across the autonomous dynamical subsystems that handle decision-making, interaction, and control.[21]
British law
In 2018, the British Automated and Electric Vehicles Act of Parliament defined rules for:
- Listing of automated vehicles by the Secretary of State
- Liability of insurers etc. where accident caused by automated vehicle
- Contributory negligence etc.
- Accident resulting from unauthorized software alterations or failure to update software
- Right of insurer etc. to claim against person responsible for accident
- Basic liability
The law defines some cases of automated vehicle liability.
(1) Where—
- (a) an accident is caused by an automated vehicle when driving itself on a road or other public place in Great Britain,
- (b) the vehicle is insured at the time of the accident, and
- (c) an insured person or any other person suffers damage as a result of the accident,
- the insurer is liable for that damage.
(2) Where—
- (a) an accident is caused by an automated vehicle when driving itself on a road or other public place in Great Britain,
- (b) the vehicle is not insured at the time of the accident,
- (c) section 143 of the Road Traffic Act 1988 (users of motor vehicles to be insured or secured against third-party risks) does not apply to the vehicle at that time—
- (i) because of section 144(2) of that Act (exemption for public bodies etc), or
- (ii) because the vehicle is in the public service of the Crown, and
- (d) a person suffers damage as a result of the accident,
- the owner of the vehicle is liable for that damage.
—Automated and Electric Vehicles Act 2018[22]
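The branching in section 2 can be read as a simple decision rule: for accidents caused by an automated vehicle driving itself on a road or other public place in Great Britain, liability falls on the insurer if the vehicle is insured, and on the owner if it is uninsured but exempt from compulsory insurance. The sketch below is purely illustrative, not legal advice; the field and function names are assumptions made for the example.

```python
# Illustrative sketch of the section 2 liability rule in the Automated and
# Electric Vehicles Act 2018. Field names are assumptions for illustration only.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Accident:
    caused_by_av_driving_itself: bool       # s.2(1)(a)/(2)(a): AV driving itself on a road/public place in GB
    vehicle_insured: bool                    # s.2(1)(b)/(2)(b)
    exempt_from_compulsory_insurance: bool   # s.2(2)(c): s.143 RTA 1988 disapplied (public body / Crown)
    damage_suffered: bool                    # s.2(1)(c)/(2)(d)

def liable_party(a: Accident) -> Optional[str]:
    """Return who bears initial liability under section 2, or None if the section does not apply."""
    if not (a.caused_by_av_driving_itself and a.damage_suffered):
        return None
    if a.vehicle_insured:
        return "insurer"   # s.2(1)
    if a.exempt_from_compulsory_insurance:
        return "owner"     # s.2(2)
    return None            # outside section 2 (e.g. an unlawfully uninsured vehicle)

print(liable_party(Accident(True, True, False, True)))   # insurer
print(liable_party(Accident(True, False, True, True)))   # owner
```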
- Liability limited by software
For instance:
An insurance policy in respect of an automated vehicle may exclude or limit the insurer's liability under section 2(1) for damage suffered by an insured person arising from an accident occurring as a direct result of—
- (a) software alterations made by the insured person, or with the insured person's knowledge, that are prohibited under the policy, or
- (b) a failure to install safety-critical software updates that the insured person knows, or ought reasonably to know, are safety-critical.
—Automated and Electric Vehicles Act 2018[22]
- Regulator
The UK could have a regulator. When there is no user in charge (NUIC), the police contact the regulator, which can sanction the automated driving system entity, up to and including withdrawal of authorisation.[23]
French law
On 14 April 2021, a French ordinance defined rules for level 3 (véhicule à délégation de conduite, a vehicle with delegated driving) and level 5 (road freight transport carried out by means of an automated road transport system). The text is titled Ordonnance n° 2021-443 du 14 avril 2021 relative au régime de responsabilité pénale applicable en cas de circulation d'un véhicule à délégation de conduite et à ses conditions d'utilisation.[24]
On 1 July 2021, France became the first European country to update its code de la route (highway code) for automated cars.[25] The update clarifies the roles and responsibilities of the driver and the car, with application planned after the Vienna Convention update and before September 2022.[26]
German law
When Mercedes launches its Drive Pilot in Germany in mid-2021, Daimler is expected to have to assume insurance liability, depending on the jurisdiction.[27]
Level 3 means the driver can legally take their eyes off the wheel and the company, Daimler in this case, would have to assume insurance liability, depending on the jurisdiction.—The Hindu[27]
In 2021, a proposed German law provided that a "self-driving vehicle" would require an operating permit to be used as an automated vehicle.[28]
Policy considerations (US)
Manufacturers bearing excessive costs
As Gary Marchant and Rachel Lindor argue in “The Coming Collision Between Autonomous Vehicles and the Liability System”, a manufacturer cannot anticipate every scenario that an autonomous car will encounter.[29] While the manufacturer will design the system to minimize the risks it does anticipate, the most damaging and costly collisions will be those the manufacturer fails to anticipate. This leaves the manufacturer highly vulnerable to design-defect claims, in particular under the cost-benefit test.
In light of this, Marchant and Lindor argue that “the technology is potentially doomed...because the liability burden on the manufacturer may be prohibitive of further development. Thus, even though an autonomous vehicle may be safer overall than a conventional vehicle, it will shift the responsibility for collisions, and hence liability, from drivers to manufacturers. The shift will push the manufacturer away from the socially-optimal outcome—to develop the autonomous vehicle.”[29]
Consequently, policymakers need to be mindful of manufacturers bearing excessive liability costs and the potential consequences, such as higher consumer costs and delays in introducing autonomous car technology. In the report “Autonomous Vehicle Technology” by the RAND Corporation, the authors recommend that policymakers consider approaches such as tort preemption, a federal insurance backstop, and long-term cost-benefit analysis of the legal standard for reasonableness.[3] These approaches attempt to align the private and public costs of autonomous car technology so that adoption is not unnecessarily delayed and no single party bears an excessive share of the costs.
NHTSA guidelines
In September 2016, the National Highway Traffic Safety Administration released a policy report to accelerate the adoption of autonomous car technology (or HAVs, highly automated vehicles) and provide guidelines for an initial regulatory framework. The key points are:[30]
- States are responsible for determining liability rules for HAVs. States should consider how to allocate liability among HAV owners, operators, passengers, manufacturers, and others when a crash occurs.
- Determination of who or what is the “driver” of an HAV in a given circumstance does not necessarily determine liability for crashes involving that HAV.
- Rules and laws allocating tort liability could have a significant effect on both consumer acceptance of HAVs and their rate of deployment. Such rules also could have a substantial effect on the level and incidence of automobile liability insurance costs in jurisdictions in which HAVs operate.
- In the future, the States may identify additional liability issues and seek to develop consistent solutions. It may be desirable to create a commission to study liability and insurance issues and make recommendations to the States.
H.R. 3388, the SELF DRIVE Act of 2017
On September 6, 2017, the House of Representatives unanimously passed H.R. 3388, the SELF DRIVE Act of 2017, which would:[31][32]
- Advance safety by prioritizing the protection of consumers.
- Reaffirm the role and responsibilities of federal and state governments.
- Update the Federal Motor Vehicle Safety Standards to account for advances in technology and the evolution of highly automated vehicles.
The Federal Government, with the passing of the SELF DRIVE Act, is limiting the role of States, and this could signal a change in the future of liability laws. With the Federal Government also asserting that consumers will be protected, manufacturers may be at a liability disadvantage and stand to lose surplus. Updating the Federal Motor Vehicle Safety Standards will affect liability law. These laws will continue to protect the consumer while placing stricter standards on producers. The Federal Government has yet to announce any specific autonomous vehicular manslaughter liability laws.[33][34]
Artificial intelligence and liability
More broadly, any software with access to the real world, including autonomous vehicles and robots, can cause property damage, injury, and death. This raises questions about civil liability or criminal responsibility.
In 2018, University of Brighton researcher John Kingston analyzed three legal theories of criminal liability that could apply to an entity controlled by artificial intelligence.[35][36]
- Perpetrator via another - the programmer (software designer) or the user could be held liable for directly instructing the AI entity to commit the crime. This is used in common law when a person instructs or directly causes an animal or person incapable of criminal responsibility (such as a young child or a person with a severe mental disability) to commit a crime.
- Natural and probable consequence - the programmer or the user could be held liable for causing the AI entity to commit a crime as a consequence of its natural operation. For example, if a human obstructs the work of a factory robot and the AI decides to squash the human as the easiest way to clear the obstruction to continue working, if this outcome was likely and the programmer knew or should have known that, the programmer could be held criminally liable.
- Direct liability - the AI system has demonstrated the criminal elements of the recognized theory of liability in criminal law. Strict liability offenses (like speeding) require an action (actus reus), but "conventional" offenses (like murder) require an intention (a type of mens rea). Criminal negligence involves non-performance of duty in the face of evidence of possible harm. Legally, courts may be capable under existing laws of assigning criminal liability to the AI system of an existing self-driving car for speeding; however, it is not clear that this would be a useful thing for a court to do.
Possible defenses include unexpected malfunction or infection with malware, which has been successfully used in the United Kingdom in the case of a denial-of-service attack.[35]
Kingston identifies two areas of law, depending on the type of entity:[35]
- For products, product liability laws apply, including the enforcement of warranties.
- For services, the tort of negligence may apply if the system failed to perform up to its duty of care.
The NHTSA investigation of a fatal 2016 crash involving Tesla Autopilot proceeded as an automobile product safety inquiry and determined that despite the crash, there were no defects that required a recall (though Tesla is working to improve the software to avoid similar crashes). Autopilot only gives cars limited autonomy, and human drivers are expected to maintain situational awareness and take over as needed.[37]
With fully autonomous vehicles, the software and vehicle manufacturers are expected to be liable for any at-fault collisions (under existing automobile product liability laws), rather than the human occupants, the owner, or the owner's insurance company.[38] Volvo has already announced that it will pay for any injuries or damage caused by its fully autonomous car, which it expected to start selling in 2020.[38] Starting in 2012, some U.S. states have passed laws or regulations specifically regarding autonomous car testing, certification, and sales, with some issuing special driver's licenses; this remains an active area of lawmaking.[39] Human occupants would still be liable for actions they directed, such as choosing where to park (and thus for parking tickets).
University of South Carolina law professor Bryant Walker Smith points out that with automated systems, considerably more data will typically be available than with human-driver crashes, allowing more reliable and detailed assessment of liability. He also predicted that comparisons between how an automated system responds and how a human would have or should have responded would be used to help determine fault.[40]
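As a rough illustration of how richer logged data might feed such a fault assessment, the sketch below compares a recorded automated-system reaction time against an assumed human perception-reaction baseline. Every field name and the 1.5-second baseline are assumptions made for illustration, not an established legal or engineering standard.

```python
# Hypothetical sketch: comparing a logged automated-system reaction against an
# assumed human perception-reaction baseline as one input to a fault assessment.
# All names and thresholds below are illustrative assumptions.
from dataclasses import dataclass

HUMAN_REACTION_BASELINE_S = 1.5  # assumed typical human perception-reaction time

@dataclass
class CrashLog:
    hazard_visible_s: float   # time the hazard first appeared in the sensor record
    braking_started_s: float  # time the system began braking

def reacted_slower_than_human_baseline(log: CrashLog) -> bool:
    """True if the system's logged reaction time exceeds the assumed human baseline."""
    return (log.braking_started_s - log.hazard_visible_s) > HUMAN_REACTION_BASELINE_S

# Example: hazard visible at t=10.0 s, braking began at t=12.0 s -> slower than the baseline
print(reacted_slower_than_human_baseline(CrashLog(10.0, 12.0)))  # True
```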
State level legislation in the US
According to the NHTSA, states retain their responsibility for motor vehicle insurance and liability regimes, among other traditional responsibilities such as vehicle licensing and registration and traffic laws and enforcement.[18] Several jurisdictions, such as Michigan, Nevada, and Washington, D.C., have explicitly written provisions for how liability will be treated.
Enacted autonomous vehicle legislation
| State | Bill Number | Relevant Provisions | Effective Date |
|---|---|---|---|
| Michigan | SB 663 (2013) | Limits liability of vehicle manufacturer or upfitter for damages in a product liability suit resulting from modifications made by a third party to an automated vehicle or automated vehicle technology under certain circumstances; relates to automated mode conversions | Enacted and chaptered on Dec. 26, 2013 |
| Nevada | SB 313 (2013) | Provides that the manufacturer of a vehicle that has been converted to be an autonomous vehicle by a third party is immune from liability for certain injuries | Enacted and chaptered on June 2, 2013 |
| Washington, D.C. | 2012 DC B 19-0931 | Restricts conversion to recent vehicles, and addresses liability of the original manufacturer of a converted vehicle | Enacted and effective from April 23, 2013 |
Arizona's Republican Gov. Doug Ducey's new rules, implemented March 1, lay out a specific list of licensing and registration requirements for autonomous car operators; the order specifies that a “person” subject to the laws includes any corporation incorporated in Arizona.[41]
Shift in auto insurance marketplace
In a white paper titled “Marketplace of Change: Automobile Insurance in the Era of Autonomous Vehicles”, KPMG estimated that personal auto accounted for 87% of insured losses, while commercial auto accounted for 13% in 2013.[4] By 2040, personal auto is projected to fall to 58%, while commercial auto rises to 28% and product liability gains 14%. This reflects the view that personal liability will fall as the responsibility for driving shifts to the vehicle and that mobility on demand will take greater hold. In addition, the overall pool of losses covered by liability policies is expected to shrink as autonomous cars cause fewer collisions.[4]
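For a sense of the arithmetic behind these projections, the short sketch below restates the cited KPMG mix figures as percentage-point shifts between 2013 and 2040. The numbers come from the paragraph above (with product liability treated as negligible in 2013, an assumption); the code is only a restatement, not additional KPMG data.

```python
# Restating the KPMG loss-mix figures cited above as 2013 -> 2040 percentage-point shifts.
mix_2013 = {"personal auto": 87, "commercial auto": 13, "product liability": 0}   # 2013 share of losses (%)
mix_2040 = {"personal auto": 58, "commercial auto": 28, "product liability": 14}  # projected 2040 share (%)

for line_of_business, share_2013 in mix_2013.items():
    shift = mix_2040[line_of_business] - share_2013
    print(f"{line_of_business}: {share_2013}% -> {mix_2040[line_of_business]}% ({shift:+d} pp)")
```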
Although KPMG cautions that this shift will bring about significant changes to the insurance industry, 32% of insurance firm leaders expect that driverless vehicles will have no material effect on the insurance industry over the next 10 years.[4] Inaction by the large players has opened up opportunities for new entrants. For example, Metromile, an insurance provider start-up founded in 2011, has started to offer usage-based insurance for low-mileage drivers and designed a policy to complement the commercial coverage of Uber drivers.[42]
Public statements from car manufacturers
In 2015, Volvo issued a press release stating that Volvo would accept full liability whenever its cars are in autonomous mode.[19] President and Chief Executive of Volvo Cars Håkan Samuelsson went further, urging "regulators to work closely with car makers to solve controversial outstanding issues such as questions over legal liability in the event that a self-driving car is involved in a crash or hacked by a criminal third party."[19]
In an IEEE article, the senior technical leader for safety and driver support technologies at Volvo echoed a similar sentiment saying, “if we made a mistake in designing the brakes or writing the software, it is not reasonable to put the liability on the customer...we say to the customer, you can spend time on something else, we take responsibility.”[43]
In 2023, Mercedes-Benz stated it was willing to bear liability for its Level 3 Drive Pilot system in order to ease approval from American regulators.[44]
Specific cases
In 2023, a self-driving car drove into wet concrete in a road construction zone, and the company paid to repave the road.[45]
See also
- History of self-driving cars
- Self-driving car
- Self-driving truck
- Death of Elaine Herzberg
- Regulation of self-driving cars
References
- ↑ Bertoncello, Michele; Wee, Dominik. "Ten ways autonomous driving could redefine the automotive world". http://www.mckinsey.com/industries/automotive-and-assembly/our-insights/ten-ways-autonomous-driving-could-redefine-the-automotive-world.
- ↑ 2.0 2.1 Slone, Sean. "State Laws on Autonomous Vehicles". http://knowledgecenter.csg.org/kc/content/state-laws-autonomous-vehicles.
- ↑ 3.0 3.1 3.2 3.3 3.4 3.5 3.6 3.7 3.8 3.9 Anderson, James M.; Kalra, Nidhi; Stanley, Karlyn D.; Sorensen, Paul; Samaras, Constantine; Oluwatola, Oluwatobi A. (2016). "Autonomous Vehicle Technology: A Guide for Policymakers". RAND Corporation. http://www.rand.org/pubs/research_reports/RR443-2.html.
- ↑ 4.0 4.1 4.2 4.3 4.4 "Marketplace of change: Automobile insurance in the era of autonomous vehicles". https://home.kpmg.com/us/en/home/insights/2016/05/era-of-autonomous-vehicles-survey.html.
- ↑ Taeihagh, Araz; Lim, Hazel Si Min (2 January 2019). "Governing autonomous vehicles: emerging responses for safety, liability, privacy, cybersecurity, and industry risks". Transport Reviews 39 (1): 103–128. doi:10.1080/01441647.2018.1494640. ISSN 0144-1647.
- ↑ Pattinson, Jo-Ann; Chen, Haibo; Basu, Subhajit (2020-11-18). "Legal issues in automated vehicles: critically considering the potential role of consent and interactive digital interfaces". Humanities and Social Sciences Communications 7 (1): 1–10. doi:10.1057/s41599-020-00644-2. ISSN 2662-9992.
- ↑ "Ten ways autonomous driving could redefine the automotive world". http://www.mckinsey.com/industries/automotive-and-assembly/our-insights/ten-ways-autonomous-driving-could-redefine-the-automotive-world.
- ↑ "Types of Product Liability Claims". 6 August 2007. https://www.law.cornell.edu/wex/Products_liability.
- ↑ Boba, Antonio (December 1982). "Responsibility for Equipment Failure: Consumer vs. Manufacturer". Anesthesiology 57 (6): 547. doi:10.1097/00000542-198212000-00027. ISSN 0003-3022.
- ↑ "Critical Reasons for Crashes Investigated in the National Motor Vehicle Crash Causation Survey". https://crashstats.nhtsa.dot.gov/Api/Public/ViewPublication/812115.
- ↑ "ADAS technology is reducing crashes". http://www.traffictechnologytoday.com/news.php?NewsID=62827.
- ↑ "Nearly 10,000 Deaths Could Be Prevented and More Than $250 Billion Saved with Greater Use of Driver Assistance Technologies". https://www.bcg.com/publications/2015/automotive-roadmap-to-safer-driving-through-advanced-driver-assistance-systems.
- ↑ "Nillson vs. General Motors". https://www.courtlistener.com/recap/gov.uscourts.cand.321643/gov.uscourts.cand.321643.1.0.pdf.
- ↑ 14.0 14.1 14.2 14.3 14.4 Jansma, Steven D. (August 11, 2016). "Autonomous vehicles: The legal landscape in the US". Norton Rose Fulbright. https://www.nortonrosefulbright.com/en/knowledge/publications/2951f5ce/autonomous-vehicles-the-legal-landscape-in-the-us.
- ↑ A., Geistfeld, Mark (2017). "A Roadmap for Autonomous Vehicles: State Tort Liability, Automobile Insurance, and Federal Safety Regulation" (in en). California Law Review 105 (6). doi:10.15779/z38416sz9r.
- ↑ "Donegal Mutual Insurance v. White Consolidated Industries Inc". Findlaw. March 31, 2006. https://caselaw.findlaw.com/oh-court-of-appeals/1451338.html.
- ↑ Sudzus, David B. (April 2015). "Autonomous Vehicles - Liability and Policy Issues". Drake Management Review (Drake University) 4 (1/2). ISSN 2166-0859. http://faculty.cbpa.drake.edu/dmr/0412/DMR041204S.pdf.
- ↑ 18.0 18.1 18.2 "Autonomous Vehicles | Self-Driving Vehicles Enacted Legislation". National Conference of State Legislatures. November 11, 2016. https://www.ncsl.org/research/transportation/autonomous-vehicles-self-driving-vehicles-enacted-legislation.aspx.
- ↑ 19.0 19.1 19.2 "US urged to establish nationwide Federal guidelines for autonomous driving" (Press release). Volvo. October 7, 2015.
- ↑ "Considerations for Personal and Commercial Lines Insurers". Munich Reinsurance America. https://www.munichre.com/site/mram-mobile/get/documents_E1725865033/mram/assetpool.mr_america/PDFs/3_Publications/Autonomous_Vehicles.pdf.[yes|permanent dead link|dead link}}]
- ↑ Mikulski, Dariusz (Fall 2015). "Special Issue: Modeling & Simulation for Cyber Security of Autonomous Vehicle Systems". The Journal of Defense Modeling and Simulation: Applications, Methodology, Technology 12 (4): 359–361. doi:10.1177/1548512915604584.
- ↑ 22.0 22.1 "Automated and Electric Vehicles Act 2018". The National Archives. https://www.legislation.gov.uk/ukpga/2018/18/section/4.
- ↑ "FN Mar 21 mag". https://cdn.fleetnews.co.uk/web/1/digital-issue-categories/march-2021/FN_Mar_2021_mag/index.html#page=17.
- ↑ "Ordonnance n° 2021-443 du 14 avril 2021 relative au régime de responsabilité pénale applicable en cas de circulation d'un véhicule à délégation de conduite et à ses conditions d'utilisation" (in fr). Government of France. https://www.legifrance.gouv.fr/jorf/id/JORFTEXT000043370894.
- ↑ "Comment le Code de la route s'adapte à la voiture autonome". https://www.rtl.fr/actu/sciences-tech/comment-le-code-de-la-route-s-adapte-a-la-voiture-autonome-7900052176.
- ↑ "Décret n° 2021-873 du 29 juin 2021 portant application de l'ordonnance n° 2021-443 du 14 avril 2021 relative au régime de responsabilité pénale applicable en cas de circulation d'un véhicule à délégation de conduite et à ses conditions d'utilisation" (in fr). Government of France. https://www.legifrance.gouv.fr/jorf/id/JORFTEXT000043729532.
- ↑ 27.0 27.1 "Mercedes bets on evolution as Tesla touts revolution in automated driving". The Hindu. 28 October 2020. https://www.thehindu.com/sci-tech/technology/mercedes-bets-on-evolution-as-tesla-touts-revolution-in-automated-driving/article32960760.ece.
- ↑ "Germany says 'JA!' to fully autonomous vehicles hitting public roads in 2022". 27 May 2021. https://thenextweb.com/news/germany-fully-autonomous-vehicles-hitting-public-roads-2022.
- ↑ 29.0 29.1 "The Coming Collision Between Autonomous Vehicles and the Liability System". https://web.law.asu.edu/Portals/31/Marchant_autonomous_vehicles.pdf.
- ↑ "Federal Automated Vehicles Policy". 2016-09-19. https://www.transportation.gov/AV/federal-automated-vehicles-policy-september-2016.
- ↑ "House Passes Bipartisan Legislation Paving the Way for Self-Driving Cars on America's Roads - Energy and Commerce Committee" (in en-US). Energy and Commerce Committee. 2017-09-06. https://energycommerce.house.gov/news/house-passes-bipartisan-legislation-paving-way-self-driving-cars-americas-roads/.
- ↑ "Safely Ensuring Lives Future Deployment and Research In Vehicle Evolution Act". Act of July 26, 2017. United States House of Representatives. http://docs.house.gov/meetings/IF/IF00/20170727/106347/BILLS-115-HR3388-L000566-Amdt-9.pdf.
- ↑ Kang, Cecilia (2017-09-06). "Self-Driving Cars' Prospects Rise with Vote by House". The New York Times. https://www.nytimes.com/2017/09/06/technology/self-driving-cars-prospects-rise-with-vote-by-congress.html.
- ↑ "House panel approves legislation to speed deployment of". Reuters. 2017-07-27. https://www.reuters.com/article/us-usa-selfdriving-vehicles/house-panel-approves-legislation-to-speed-deployment-of-self-driving-cars-idUSKBN1AC2K0.
- ↑ 35.0 35.1 35.2 "When an AI finally kills someone, who will be responsible?". March 12, 2018. https://www.technologyreview.com/s/610459/when-an-ai-finally-kills-someone-who-will-be-responsible/.
- ↑ Kingston, J. K. C. (December 2016). "Artificial Intelligence and Legal Liability". International Conference on Innovative Techniques and Applications of Artificial Intelligence. Springer, Cham. pp. 269–279.
- ↑ Boudette, Neal E. (January 20, 2017). "Tesla's Self-Driving System Cleared in Deadly Crash". The New York Times. OCLC 6927433730. https://www.nytimes.com/2017/01/19/business/tesla-model-s-autopilot-fatal-crash.html.
- ↑ 38.0 38.1 Iozzio, Corinne (May 1, 2016). "Who's Responsible When a Self-Driving Car Crashes?". Scientific American 314 (5): 12–13. doi:10.1038/scientificamerican0516-12. OCLC 6032769361. PMID 27100237. https://www.scientificamerican.com/article/who-s-responsible-when-a-self-driving-car-crashes/.
- ↑ "Automated Driving: Legislative and Regulatory Action". http://cyberlaw.stanford.edu/wiki/index.php/Automated_Driving:_Legislative_and_Regulatory_Action.
- ↑ After crash, injured motorcyclist accuses robot-driven vehicle of 'negligent driving'
- ↑ Felton, Ryan (19 March 2018). "Why Uber Could Be Held Criminally Liable In Fatal Crash Involving Autonomous Car (Updated)". https://jalopnik.com/why-uber-could-be-held-criminally-liable-in-fatal-crash-1823901514.
- ↑ "Metromile". 24 November 2021. https://www.metromile.com/insurance/.
- ↑ "Why You Shouldn't Worry About Liability for Self-Driving Car Accidents". 12 October 2015. https://spectrum.ieee.org/cars-that-think/transportation/self-driving/why-you-shouldnt-worry-about-liability-for-selfdriving-car-accidents.
- ↑ Mercedes accepts legal liability for Level 3 Drive Pilot system, Jack Quick, Journalist, 23 March 2022
- ↑ https://www.japantimes.co.jp/commentary/2023/09/17/world/self-driving-cars-ai-ethics/
Original source: https://en.wikipedia.org/wiki/Self-driving_car_liability