Vehicular automation
Vehicular automation is the use of technology to assist or replace the operator of a vehicle such as a car, truck, aircraft, rocket, military vehicle, or boat.[2][3][4][5][6] Assisted vehicles are semi-autonomous, whereas vehicles that can travel without a human operator are autonomous.[3] The degree of autonomy may be subject to various constraints, such as environmental conditions. Autonomy is enabled by advanced driver-assistance systems (ADAS) of varying capability.
Related technology includes advanced software, maps, vehicle modifications, and external infrastructure support.
Autonomy presents varying issues for road, air, and marine travel. Roads present the most significant complexity given the unpredictability of the driving environment, including diverse road designs, driving conditions, traffic, obstacles, and geographical/cultural differences.[7]
Autonomy implies that the vehicle is responsible for all perception, monitoring, and control functions.[8]
SAE autonomy levels
The Society of Automotive Engineers (SAE) classifies road vehicle autonomy in six levels:[9][10]
- 0: No automation.
- 1: Driver assistance, the vehicle controls steering or speed autonomously in specific circumstances.
- 2: Partial automation, the vehicle controls both steering and speed autonomously in specific circumstances.
- 3: Conditional automation, the vehicle controls both steering and speed under normal environmental conditions, but requires the driver to be ready to take control in other circumstances.
- 4: High automation, the vehicle travels autonomously under normal environmental conditions, not requiring driver oversight.
- 5: Full autonomy, where the vehicle can complete travel autonomously in any environmental conditions.
Level 0 refers, for instance, to vehicles without adaptive cruise control. Levels 1 and 2 refer to vehicles where one part of the driving task is performed by the ADAS under the responsibility/liability of the driver.
From Level 3, the driver can transfer the driving task to the vehicle, but must resume control when the ADAS reaches its limits. For instance, an automated traffic-jam pilot can drive in a traffic jam but otherwise passes control to the driver. Level 5 refers to a vehicle that can handle any situation.[11]
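As an illustration of how this taxonomy can be encoded in software, a minimal sketch follows; the enum names and the `driver_attention_required` helper are illustrative assumptions, not part of the SAE J3016 text.

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """Illustrative encoding of the six SAE J3016 driving-automation levels."""
    NO_AUTOMATION = 0           # driver performs the entire driving task
    DRIVER_ASSISTANCE = 1       # steering OR speed assisted in specific circumstances
    PARTIAL_AUTOMATION = 2      # steering AND speed assisted; driver supervises
    CONDITIONAL_AUTOMATION = 3  # system drives; driver must take over on request
    HIGH_AUTOMATION = 4         # no driver oversight under normal conditions
    FULL_AUTOMATION = 5         # system handles travel in any conditions

def driver_attention_required(level: SAELevel) -> bool:
    # Assumption for illustration: through Level 2 the driver supervises continuously,
    # at Level 3 the driver must stay ready to intervene, and at Levels 4-5 they need not.
    return level <= SAELevel.CONDITIONAL_AUTOMATION

if __name__ == "__main__":
    for level in SAELevel:
        print(f"Level {level.value} ({level.name}): attention required = {driver_attention_required(level)}")
```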
Technology
Software
Autonomous vehicle software generally contains several different modules that work together to enable self-driving capabilities.[12][13][14] The perception module ingests and processes data from various sensors, such as cameras, LIDAR, RADAR, and ultrasonic SONAR, to create a comprehensive understanding of the vehicle's surroundings.[15] The localization module uses 3D point cloud data, GPS, IMU, and mapping information to determine the vehicle's precise position, including its orientation, velocity, and angular rate.[16][17] The planning module takes inputs from both perception and localization to compute actions to take, such as velocity and steering angle outputs.[18] These modules are typically supported by machine learning algorithms, particularly deep neural networks,[19] which enable the vehicle to detect objects, interpret traffic patterns,[20] and make real-time decisions.[21] Furthermore, modern autonomous driving systems increasingly employ sensor fusion techniques that combine data from multiple sensors to improve accuracy and reliability in different environmental conditions.[22][23]
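A highly simplified sketch of how such modules might be composed is shown below. The class names, data structures, and the trivial reactive planning rule are illustrative assumptions for exposition, not any production architecture.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Obstacle:
    x: float      # metres ahead of the vehicle
    y: float      # metres left (+) / right (-) of the lane centre
    speed: float  # metres per second along the road

@dataclass
class Pose:
    x: float
    y: float
    heading: float  # radians
    speed: float    # metres per second

class PerceptionModule:
    def detect(self, camera_frame, lidar_points) -> List[Obstacle]:
        # A real system would run neural-network detectors over the sensor data;
        # this is only a placeholder interface.
        raise NotImplementedError

class LocalizationModule:
    def estimate_pose(self, gnss_fix, imu_sample, map_data) -> Pose:
        # Would typically fuse GNSS, IMU and map matching (e.g. with a Kalman filter).
        raise NotImplementedError

class PlanningModule:
    def plan(self, pose: Pose, obstacles: List[Obstacle]) -> Tuple[float, float]:
        """Return (target_speed, steering_angle) from a trivial reactive rule."""
        nearest_ahead = min((o.x for o in obstacles if abs(o.y) < 1.5), default=float("inf"))
        target_speed = 0.0 if nearest_ahead < 10.0 else min(pose.speed + 1.0, 15.0)
        steering_angle = 0.0  # lane keeping omitted in this sketch
        return target_speed, steering_angle
```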
Perception
The perception system is responsible for observing the environment. It must identify everything that could affect the trip, including other vehicles, pedestrians, cyclists, their movements, road conditions, obstacles, and other issues.[24] Various makers use cameras, radar, lidar, sonar, and microphones, whose outputs can be combined to minimize errors.[24][25]
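One simple way redundant sensors can collaboratively minimize errors is inverse-variance weighting of independent range estimates, sketched below; the sensor values and variances are invented for illustration.

```python
def fuse_range_estimates(measurements):
    """Fuse independent (value, variance) range estimates by inverse-variance
    weighting: more certain sensors get more weight, and the fused variance is
    smaller than any single sensor's. Values used here are purely illustrative."""
    weights = [1.0 / var for _, var in measurements]
    fused = sum(w * value for (value, _), w in zip(measurements, weights)) / sum(weights)
    fused_variance = 1.0 / sum(weights)
    return fused, fused_variance

# e.g. radar reports 42.0 m (variance 1.0 m^2), lidar reports 41.2 m (variance 0.04 m^2)
print(fuse_range_estimates([(42.0, 1.0), (41.2, 0.04)]))  # about (41.23 m, 0.038 m^2)
```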
Navigation
Navigation systems are a necessary element in autonomous vehicles. The Global Positioning System (GPS) is used for navigation by air, water, and land vehicles, particularly for off-road navigation.
For road vehicles, two approaches are prominent. One is to use maps that hold data about lanes and intersections, relying on the vehicle's perception system to fill in the details. The other is to use highly detailed maps that reduce the scope of real-time decision-making but require significant maintenance as the environment evolves.[19] Some systems crowdsource map updates, using the vehicles themselves to report changes such as construction or traffic, which are then shared with the entire fleet.[26]
Another potential source of information is the environment itself. Traffic data may be supplied by roadside monitoring systems and used to route vehicles so as to make the best use of a limited road system.[27] Additionally, modern GNSS enhancement technologies, such as real-time kinematic (RTK) positioning and precise point positioning (PPP), improve the accuracy of vehicle positioning to sub-meter precision, which is crucial for autonomous navigation and decision-making.[28]
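The differential idea behind such corrections can be sketched as follows. Real RTK operates on carrier-phase observables rather than finished position fixes, so this is only a conceptual illustration with invented coordinates.

```python
def differential_correction(rover_measured, base_measured, base_known):
    """Subtract the error observed at a base station with precisely known
    coordinates from a nearby rover's measurement. This is the conceptual core
    of DGNSS/RTK-style correction, heavily simplified for illustration."""
    error = tuple(m - k for m, k in zip(base_measured, base_known))
    return tuple(r - e for r, e in zip(rover_measured, error))

# Base station known to sit at (1000.0, 2000.0) m but currently measured at (1001.2, 1999.1) m;
# the same bias is assumed to affect the nearby vehicle's fix.
print(differential_correction((350.7, 620.4), (1001.2, 1999.1), (1000.0, 2000.0)))  # (349.5, 621.3)
```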
History
Automated vehicles in European Union legislation refer specifically to road vehicles (car, truck, or bus).[29] For those vehicles, a specific legal distinction is drawn between advanced driver-assistance systems and autonomous/automated vehicles, based on differences in liability.
The AAA Foundation for Traffic Safety tested two types of automatic emergency braking systems: those designed to prevent crashes and those that aim to make a crash less severe. The tests covered popular models such as the 2016 Volvo XC90, Subaru Legacy, Lincoln MKX, Honda Civic, and Volkswagen Passat, and measured how well each system stopped when approaching moving and stationary targets. Systems capable of preventing crashes reduced vehicle speeds by twice as much as systems designed only to mitigate crash severity. When the two test vehicles traveled within 30 mph of each other, even the systems designed to lessen crash severity avoided crashes 60 percent of the time.[30]
Sartre
The SAfe Road TRains for the Environment (Sartre) project's goal was to enable platooning, in which a line of cars and trucks (a "train") follows a lead vehicle driven by a human. Trains were predicted to provide comfort and allow the following vehicles to travel safely to a destination. Drivers encountering a train could join it and delegate driving to the train's lead driver.[31]
Tests
Self-driving Uber vehicles were tested in Pittsburgh, Pennsylvania. The tests were paused after an autonomous car killed a woman in Arizona.[32][33] Automated buses have been tested in California.[34] In San Diego, California, an automated bus test used magnetic markers. Longitudinal control of automated truck platoons has used millimeter-wave radio and radar. Waymo and Tesla have also conducted tests; Tesla's FSD allows drivers to enter a destination and let the car take over.
Risks and liabilities
Ford offers Blue Cruise, technology that allows geofenced cars to drive autonomously.[35]
Drivers are directed to stay attentive, and safety warnings alert the driver when corrective action is needed.[36] Tesla, Inc. has one recorded fatality involving the automated driving system in the Tesla Model S.[37] The accident report attributes the crash to the driver being inattentive and the Autopilot system failing to recognize the obstruction ahead.[37] Tesla vehicles have also crashed into garage doors on multiple occasions. According to the book "The Driver in the Driverless Car: How Your Technology Choices Create the Future", Tesla automatically performs software updates overnight; the morning after one such update, a driver used his app to "summon" his car, and it crashed into his garage door.
Another flaw of automated driving systems is that unpredictable events, such as weather or the driving behavior of others, may cause fatal accidents because the sensors that monitor the vehicle's surroundings cannot always trigger corrective action in time.[36]
To overcome some of these challenges, methodologies based on virtual testing, traffic-flow simulation, and digital prototypes have been proposed,[38] especially for algorithms based on artificial intelligence, which require extensive training and validation data sets.
Implementing automated driving systems could also change built environments in urban areas, for example by expanding suburban regions as mobility becomes easier.[39]
Challenges
Around 2015, several companies, including Nissan and Toyota, promised self-driving cars by 2020. The predictions turned out to be far too optimistic.[40]
Many obstacles remain in developing fully autonomous Level 5 vehicles, which would be able to operate in any conditions. Currently, companies are focused on Level 4 automation, which operates only under certain environmental circumstances.[40]
There is still debate about what an autonomous vehicle should look like. For example, whether to incorporate lidar into autonomous driving systems is still being argued. Some researchers have developed algorithms using camera-only data that achieve performance rivaling that of lidar. On the other hand, camera-only systems sometimes draw inaccurate bounding boxes and thus make poor predictions, because stereo cameras provide only superficial depth information, whereas lidar gives the vehicle precise distance measurements to each detected point.[40]
Technical challenges
- Software Integration: Because of the large number of sensors and safety processes required by autonomous vehicles, software integration remains a challenging task. A robust autonomous vehicle should ensure that the integration of hardware and software can recover from component failures.[41]
- Prediction and trust among autonomous vehicles: Fully autonomous cars should be able to anticipate the actions of other cars as humans do. Human drivers are adept at predicting other drivers' behavior, even from small cues such as eye contact or hand gestures. At a minimum, vehicles must agree on traffic rules, such as whose turn it is to proceed at an intersection. The problem grows when human-operated and self-driving cars share the road, because uncertainty increases. A robust autonomous vehicle is expected to understand its environment better to address this issue.[41]
- Scaling up: Test coverage for autonomous vehicles may not capture every situation. Heavy traffic and obstructions demand faster response times or better tracking algorithms. When unseen objects are encountered, it is important that the algorithms can track them and avoid collisions.[41]
These features require numerous sensors, many of which rely on micro-electro-mechanical systems (MEMS) to maintain small size, high efficiency, and low cost. Foremost among MEMS sensors in vehicles are accelerometers and gyroscopes, which measure linear acceleration and angular rate about multiple orthogonal axes, both critical to detecting and controlling the vehicle's motion.
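A toy dead-reckoning step shows how accelerometer and gyroscope readings feed motion estimation; the planar model and constants are illustrative, and a real localization stack would fuse this with GNSS and map data to bound the drift that pure integration accumulates.

```python
import math

def dead_reckon(pose, accel_long, yaw_rate, dt):
    """One planar dead-reckoning step from MEMS IMU readings:
    longitudinal acceleration (m/s^2) and yaw rate (rad/s)."""
    x, y, heading, speed = pose
    speed += accel_long * dt
    heading += yaw_rate * dt
    x += speed * math.cos(heading) * dt
    y += speed * math.sin(heading) * dt
    return (x, y, heading, speed)

pose = (0.0, 0.0, 0.0, 10.0)  # start at the origin, heading along x, 10 m/s
for _ in range(100):          # integrate 1 second of data at 100 Hz
    pose = dead_reckon(pose, accel_long=0.5, yaw_rate=0.1, dt=0.01)
print(pose)
```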
Societal challenges
One critical step toward implementing autonomous vehicles is acceptance by the general public; acceptance research provides guidelines for the automobile industry to improve its designs and technology. Studies have shown that many people believe autonomous vehicles are safer, which underlines the need for automobile companies to ensure that autonomous vehicles deliver safety benefits. The TAM (technology acceptance model) research framework breaks down the important factors affecting consumer acceptance into usefulness, ease of use, trust, and social influence.[42]
- The usefulness factor examines whether autonomous vehicles provide benefits that save consumers' time and simplify their lives. How useful consumers believe autonomous vehicles will be compared with other transportation options is a determining factor.[42]
- The ease-of-use factor examines the user-friendliness of autonomous vehicles. While the notion that consumers care more about ease of use than safety has been challenged, it remains an important factor with indirect effects on the public's intention to use autonomous vehicles.[42]
- The trust factor studies the safety, data privacy and security protection of autonomous vehicles. A more trusted system has a positive impact on the consumer's decision to use autonomous vehicles.[42]
- The social influence factor examines whether the influence of others affects a consumer's likelihood of adopting autonomous vehicles. Studies have shown that social influence is positively related to behavioral intention, perhaps because cars traditionally serve as status symbols, so a person's intent to use them reflects their social environment.[42]
Regulatory challenges
Real-time testing of autonomous vehicles is an inevitable part of the development process. At the same time, regulators face the challenge of protecting public safety while still allowing autonomous vehicle companies to test their products. Groups representing autonomous vehicle companies resist most regulation, whereas groups representing vulnerable road users and traffic safety push for regulatory barriers. To improve traffic safety, regulators are encouraged to find a middle ground that protects the public from immature technology while allowing companies to test the implementation of their systems.[43] There have also been proposals to apply regulatory knowledge from aviation automation safety to the safe implementation of autonomous vehicles, given the experience the aviation sector has gained on safety over the decades.[44]
Ground vehicles
In some countries, specific laws and regulations apply to road traffic motor vehicles (such as cars, buses, and trucks), while other laws and regulations apply to other ground vehicles such as trams, trains, or automated guided vehicles, which operate in different environments and conditions.
Road traffic vehicles
An automated driving system is defined in a proposed amendment to Article 1 of the Vienna Convention on Road Traffic:
(ab) "Automated driving system" refers to a vehicle system that uses both hardware and software to exercise dynamic control of a vehicle on a sustained basis.
(ac) "Dynamic control" refers to carrying out all the real-time operational and tactical functions required to move the vehicle. This includes controlling the vehicle's lateral and longitudinal motion, monitoring the road environment, responding to events in the road traffic environment, and planning and signalling for manoeuvres.[45]
The amendment was to enter into force on 14 July 2022 unless rejected before 13 January 2022.[46]
An automated driving feature must be described sufficiently clearly so that it is distinguished from an assisted driving feature.
— SMMT[47]
There are two clear states – a vehicle is either assisted with a driver being supported by technology or automated where the technology is effectively and safely replacing the driver.
— SMMT[47]
Ground vehicles employing automation and teleoperation include shipyard gantries, mining trucks, bomb-disposal robots, robotic insects, and driverless tractors.
There are many autonomous and semi-autonomous ground vehicles being made for the purpose of transporting passengers. One example is the free-ranging on grid (FROG) technology, which consists of autonomous vehicles, a magnetic track, and a supervisory system. The FROG system is deployed for industrial purposes in factory sites and has been in use since 1999 on the ParkShuttle, a PRT-style public transport system in the city of Capelle aan den IJssel that connects the Rivium business park with the neighboring city of Rotterdam (where the route terminates at the Kralingse Zoom metro station). The system experienced a crash in 2005[48] that was found to be caused by human error.[49]
Applications for automation in ground vehicles include the following:
- Vehicle tracking systems such as ESITrack and LoJack.
- Rear-view alarm, to detect obstacles behind.
- Anti-lock braking system (ABS) (also Emergency Braking Assistance (EBA)), often coupled with Electronic brake force distribution (EBD), which prevents the brakes from locking and losing traction while braking. This shortens stopping distances in most cases and, more importantly, allows the driver to steer the vehicle while braking.
- Traction control system (TCS) actuates brakes or reduces throttle to restore traction if driven wheels begin to spin.
- All-wheel drive (AWD) with a centre differential. Distributing power to all four wheels lessens the chances of wheel spin and reduces oversteer and understeer.
- Electronic Stability Control (ESC), also known by proprietary names such as Mercedes-Benz's Electronic Stability Program (ESP), Acceleration Slip Regulation (ASR), and Electronic Differential Lock (EDL). It uses various sensors to intervene when the car senses a possible loss of control: the car's control unit can reduce power from the engine and even apply the brakes on individual wheels to prevent understeer or oversteer (a simplified control sketch follows this list).
- Dynamic steering response (DSR) adjusts the rate of the power steering system to adapt it to the vehicle's speed and road conditions.
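As referenced in the stability-control item above, the sketch below gives a toy flavour of the feedback logic involved; the single-wheel model, slip threshold, and gains are invented for illustration and bear no relation to any production ESC/TCS calibration.

```python
def traction_control_step(wheel_speed, vehicle_speed, throttle, brake):
    """Toy single-wheel traction-control step: if the driven wheel spins
    noticeably faster than the vehicle is travelling, cut throttle and apply
    a little brake to that wheel until grip is restored."""
    slip = (wheel_speed - vehicle_speed) / max(vehicle_speed, 1.0)
    if slip > 0.15:  # excessive wheel spin detected (illustrative threshold)
        throttle = max(0.0, throttle - 0.2)
        brake = min(1.0, brake + 0.1)
    return throttle, brake

# Wheel at 14 m/s while the vehicle moves at 10 m/s: throttle is cut, brake applied.
print(traction_control_step(wheel_speed=14.0, vehicle_speed=10.0, throttle=0.8, brake=0.0))
```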
Research is ongoing and prototypes of autonomous ground vehicles exist.
Cars
Extensive automation for cars focuses on either introducing robotic cars or modifying modern car designs to be semi-autonomous.
Semi-autonomous designs could be implemented sooner as they rely less on technology that is still at the forefront of research. An example is the dual-mode monorail. Groups such as RUF (Denmark) and TriTrack (USA) are working on projects consisting of specialized private cars that are driven manually on normal roads but can also dock onto a monorail or guideway along which they are driven autonomously.
Automated highway systems (AHS) aim to automate cars without modifying them as extensively as robotic cars, by constructing highway lanes equipped with, for example, magnets to guide the vehicles. Automated vehicles in such systems would have automatic brakes, referred to as an Auto Vehicles Braking System (AVBS). Highway computers would manage the traffic and direct the cars to avoid crashes.
In 2006, the European Commission established a smart-car development program called the Intelligent Car Flagship Initiative.[50] The goals of that program include:
- Adaptive cruise control
- Lane departure warning system
- Project AWAKE for drowsy drivers
There are further uses for automation in relation to cars. These include:
- Assured Clear Distance Ahead
- Adaptive headlamps
- Advanced Automatic Collision Notification, such as OnStar
- Intelligent Parking Assist System
- Automatic Parking
- Automotive night vision with pedestrian detection
- Blind spot monitoring
- Driver Monitoring System
- Robotic car or self-driving car which may result in less-stressed "drivers", higher efficiency (the driver can do something else), increased safety and less pollution (e.g. via completely automated fuel control)
- Precrash system
- Safe speed governing
- Traffic sign recognition
- Following another car on a motorway – "enhanced" or "adaptive" cruise control, as used by Ford Motor Company and Vauxhall[51]
- Distance control assist – as developed by Nissan[52]
- Dead man's switch – there is a move to introduce dead man's braking into automotive applications, primarily heavy vehicles, and there may also be a need to add penalty switches to cruise controls.
Singapore also announced a set of provisional national standards on January 31, 2019, to guide the autonomous vehicle industry. The standards, known as Technical Reference 68 (TR68), will promote the safe deployment of fully driverless vehicles in Singapore, according to a joint press release by Enterprise Singapore (ESG), Land Transport Authority (LTA), Standards Development Organisation and Singapore Standards Council (SSC).[53]
Shuttle
Since 1999, the 12-seat/10-standing ParkShuttle has been operating on a 1.8-kilometre (1.1 mi) exclusive right of way in the city of Capelle aan den IJssel in the Netherlands. The system uses small magnets in the road surface to allow the vehicle to determine its position. The use of shared autonomous vehicles was trialed around 2012 in a hospital car park in Portugal.[54] From 2012 to 2016, the European Union-funded CityMobil2 project examined the use of shared autonomous vehicles and passenger experience, including short-term trials in seven cities. This project led to the development of the EasyMile EZ10.[55]
In the 2010s, self-driving shuttles became able to run in mixed traffic without the need for embedded guidance markers.[56] So far the focus has been on low speeds, around 20 miles per hour (32 km/h), with short, fixed routes for the "last mile" of journeys. This means issues of collision avoidance and safety are significantly less challenging than for automated cars, which seek to match the performance of conventional vehicles. Many trials have been undertaken, mainly on quiet roads with little traffic or on public pathways, private roadways, and specialised test sites.[citation needed] The capacity of different models varies significantly, between 6 and 20 seats. (Above this size, conventional buses fitted with driverless technology are used.)
In December 2016, the Jacksonville Transportation Authority announced its intention to replace the Jacksonville Skyway monorail with driverless vehicles that would run on the existing elevated superstructure as well as continue onto ordinary roads.[57] The project has since been named the "Ultimate Urban Circulator" or "U2C", and testing has been carried out on shuttles from six different manufacturers. The cost of the project is estimated at $379 million.[58]
In January 2017, it was announced that the ParkShuttle system in the Netherlands would be renewed and expanded, including extending the route network beyond the exclusive right of way so that vehicles run in mixed traffic on ordinary roads.[59] The plans were delayed, and the extension into mixed traffic was expected in 2021.[60]
In July 2018, Baidu stated it had built 100 of its 8-seat Apolong model, with plans for commercial sales.[61] As of July 2021, they had not gone into volume production.
In August 2020, it was reported there were 25 autonomous shuttle manufacturers,[62] including 2getthere, Local Motors, Navya, Baidu, EasyMile, Toyota and Ohmio.
In December 2020, Toyota showcased its 20-passenger "e-Palette" vehicle, which was used at the 2021 Tokyo Olympic Games.[63] Toyota announced it intends to have the vehicle available for commercial applications before 2025.[64]
In January 2021, Navya released an investor report which predicted global autonomous shuttle sales will reach 12,600 units by 2025, with a market value of EUR 1.7 billion.[65]
In June 2021, Chinese maker Yutong claimed to have delivered 100 models of its 10-seat Xiaoyu 2.0 autonomous bus for use in Zhengzhou. Testing has been carried out in a number of cities since 2019 with trials open to the public planned for July 2021.[66]
Self-driving shuttles are already in use on some private roads, such as at the Yutong factory in Zhengzhou where they are used to transport workers between buildings of the world's largest bus factory.[67]
In Hong Kong, the police and other workers use driverless vehicles.[68]
Trials
A large number of trials have been conducted since 2016, most involving only one vehicle on a short route for a short period of time and with an onboard conductor. The purpose of the trials has been both to provide technical data and to familiarize the public with driverless technology. A 2021 survey of over 100 shuttle experiments across Europe concluded that low speed – 15–20 kilometres per hour (9.3–12.4 mph) – was the major barrier to implementation of autonomous shuttle buses. The cost of the vehicles, around €280,000, and the need for onboard attendants were also cited as issues.[69]
| Company/Location | Details |
|---|---|
| Navya "Arma" in Neuhausen am Rheinfall | In October 2016, BestMile started trials in Neuhausen am Rheinfall, claiming to be the world's first solution for managing hybrid fleets with both autonomous and non-autonomous vehicles.[70] The test ended in October 2021.[71] |
| Local Motors "Olli" | At the end of 2016, the Olli was tested in Washington D.C.[72] In 2020, a four-month trial was undertaken at the United Nations ITCILO campus in Turin, Italy to provide transport shuttle to employees and guests within the campus.[73] |
| Navya "Autonom" | Navya claimed in May 2017 to have carried almost 150,000 passengers across Europe[74] with trials in Sion, Cologne, Doha, Bordeaux and the nuclear power plant at Civaux as well as Las Vegas[75] and Perth.[76] Ongoing public trials are underway in Lyon, Val Thorens and Masdar City. Other trials on private sites are underway at University of Michigan since 2016,[77] at Salford University and the Fukushima Daini Nuclear Power Plant since 2018.[78] |
| Texas A&M | In August 2017, a driverless four seat shuttle was trialed at Texas A&M university as part of its "Transportation Technology Initiative" in a project run by academics and students on the campus.[79][80] Another trial, this time using Navya vehicles, was run in 2019 from September to November.[81] |
| RDM Group "LUTZ Pathfinder" | In October 2017, RDM Group began a trial service with two seat vehicles between Trumpington Park and Ride and Cambridge railway station along the guided busway, for possible use as an after hours service once the regular bus service has stopped each day.[82] |
| EasyMile "EZ10" | EasyMile has had longer term trials at Wageningen University and Lausanne as well as short trials in Darwin,[83] Dubai, Helsinki, San Sebastian, Sophia Antipolis, Bordeaux[84] and Tapei[85] In December 2017, a trial began in Denver running at 5 miles per hour (8.0 km/h) on a dedicated stretch of road.[86] EasyMile was operating in ten U.S. states, including California, Florida, Texas, Ohio, Utah, and Virginia before U.S. service was suspended after a February 2020 injury.[87] In August 2020 EasyMile was operating shuttles in 16 cities across the United States, including Salt Lake City, Columbus, Ohio, and Corpus Christi, Texas.[88] In October 2020 a new trial was launched in Fairfax, Virginia.[89] In August 2021 a one-year trial was launched at the Colorado School of Mines in Golden, Colorado. The trial uses nine vehicles (with seven active at any time) and provides a 5–10 minute service along three routes at a maximum speed of 12 mph (19 km/h). At the time of launch this was the largest such trial in the United States.[90][91] In November 2021, EasyMile became the first driverless solutions provider in Europe authorized to operate at Level 4 in mixed traffic, on a public road. "EZ10" has been making test runs on a medical campus in the southwestern city of Toulouse since March.[92][93] |
| Westfield Autonomous Vehicles "POD" | In 2017 and 2018, using a modified version of the UltraPRT called "POD", four vehicles were used as part of the GATEway project trial conducted in Greenwich in south London on a 3.4 kilometres (2.1 mi) route.[94] A number of other trials have been conducted in Birmingham, Manchester, Lake District National Park, University of the West of England and Filton Airfield.[95] |
| Next Future Transportation "pods" in Dubai | In February 2018, the ten-passenger (six seated), 12 miles per hour (19 km/h) autonomous pods, which are capable of joining to form a bus, were demonstrated at the World Government Summit in Dubai. The demonstration was a collaboration between Next Future Transportation and Dubai's Roads and Transport Authority, and the vehicles are under consideration for deployment there.[96] |
| "Apolong/Apollo" | In July 2018, a driverless eight seater shuttle bus was trialed at the 2018 Shanghai expo after tests in Xiamen and Chongqing cities as part of Project Apollo, a mass-produced autonomous vehicle project launched by a consortium including Baidu.[97][98][99] |
| Jacksonville Transportation Authority | Since December 2018, the Jacksonville Transportation Authority has been using a 'test and learn' site at the Florida State College at Jacksonville[100] to evaluate vehicles from different vendors as part of its plan for the Ultimate Urban Circulator (U2C). Among the six vehicles tested[101] are the Local Motors "Olli 2.0",[102] Navya "Autonom"[103] and EasyMile "EZ10".[104] |
| 2getthere "ParkShuttle" in Brussels | In 2019, trials were held at Brussels Airport[105] and at Nanyang Technological University in Singapore.[106] |
| Ohmio "Lift" in Christchurch | In 2019, Trials with their 15-person shuttle were conducted in New Zealand at Christchurch Airport[107] and at the Christchurch Botanic Gardens[108] in 2020. |
| Yutong "Xioayu" | Testing with the first generation vehicle in 2019 at the Boao Forum for Asia and in Zhengzhou.[109] The 10-seat second generation vehicle has been delivered to Guangzhou, Nanjing, Zhengzhou, Sansha, Changsha with public trials due to commence in July 2021 in Zhengzhou.[66][110] |
| ARTC "WinBus" in Changhua city | In July 2020, a trial service began in Changhua city in Taiwan, connecting four tourism factories in Changhua Coastal Industrial Park along a 7.5 km (4.7 mi), with plans to extend the route to 12.6 km (7.8 mi) to serve tourist destinations.[111] In January 2021, Level 4 "WinBus" got a license for one-year experimental sandbox operation.[112] |
| Yamaha Motor "Land Car" based "ZEN drive Pilot" in Eiheiji town, Fukui prefecture, Japan | In December 2020, Eiheiji town started test operation of driverless autonomous driving mobility services by making use of a remotely-operated autonomous driving system.[113] AIST Human-Centered Mobility Research Center modified Yamaha Motor's electric "Land Car" and the tracing road of an abandoned Eiheiji railway line. This system was legally approved as Level 3.[114]
In March 2023, "ZEN drive Pilot" became the first legally approved Level 4 Automatic operation device under the amended "Road Traffic Act" of 2023.[115] |
| WeRide "Mini Robobus" | In January 2021, WeRide began testing its Mini Robobus on Guangzhou International Bio Island.[116] In June 2021, the company also launched trials at Nanjing. |
| Toyota "e-Palette" in Chūō, Tokyo | During the 2021 Tokyo Summer Olympics, a fleet of 20 vehicles was used to ferry athletes and others around the Athletes' Village. Each vehicle could carry 20 people or 4 wheelchairs and had a top speed of 20 mph (32 km/h).[117] (The event also used 200 driver operated variants called the "Accessible People Movers (APM)", to take athletes to their events.) On August 27, 2021, Toyota suspended all "e-Pallete" services at the Paralympics after a vehicle collided with and injured a visually impaired pedestrian,[118] and restarted on August 31 with improved safety measures.[119] |
| Hino "Poncho Long" tuned by Nippon Mobility in Shinjuku, Tokyo | In November 2021, Tokyo Metropolitan Government started three trials. As one of the three, a lead contractor Keio Dentetsu Bus was planned to operate in the central area of the megalopolis.[120] |
Vehicle names are in quotes
Buses
Autonomous buses are proposed, as well as self-driving cars and trucks. Level 2 automated minibuses were trialed for a few weeks in Stockholm.[121][122] China has a small fleet of self-driving public buses in the tech district of Shenzhen, Guangdong.[123]
The first autonomous bus trial in the United Kingdom commenced in mid-2019, with an Alexander Dennis Enviro200 MMC single-decker bus modified with autonomous software from Fusion Processing able to operate in driverless mode within Stagecoach Manchester's Sharston bus depot, performing tasks such as driving to the washing station and refueling point and then parking at a dedicated space in the depot.[124] Passenger-carrying driverless bus trials in Scotland commenced in January 2023, with a fleet of five vehicles identical to the Manchester bus used on a 14-mile (23 km) Stagecoach Fife park-and-ride route across the Forth Road Bridge, from the north bank of the Forth to Edinburgh Park station.[125][126]
Another autonomous trial in Oxfordshire, England, which uses a battery-electric Fiat Ducato minibus on a circular service to Milton Park, operated by FirstBus with support from Fusion Processing, Oxfordshire County Council and the University of the West of England, also entered full passenger service in January 2023. The trial route is planned to be extended to Didcot Parkway railway station once a larger single-decker is acquired by the end of 2023.[127][128]
In July 2020 in Japan, the AIST Human-Centered Mobility Research Center, with Nippon Koei and Isuzu, started a series of demonstration tests of mid-sized Isuzu "Erga Mio" buses fitted with autonomous driving systems in five areas in sequence: Ōtsu city in Shiga Prefecture, Sanda city in Hyōgo Prefecture, and three other areas.[129][130][131]
In October 2023, Imagry, an Israeli AI startup, introduced its mapless autonomous driving solution at Busworld Europe, leveraging a real-time image recognition system and a spatial deep convolutional neural network (DCNN) to mimic human driving behavior.[132]
Modular autonomous transit
Modular autonomous transit is a research concept for public transit using self-driving vehicles with connectable units, or "pods", that can adjust capacity based on passenger demand.[133] Studies suggest these systems could improve efficiency through dynamic routing, with simulations showing reduced travel times in urban networks, though no operational systems existed as of 2025.[134]
Trucks
The concept of autonomous vehicles has been applied to commercial uses, such as autonomous or nearly autonomous trucks.
Companies such as Suncor Energy, a Canadian energy company, and Rio Tinto Group were among the first to replace human-operated trucks with driverless commercial trucks run by computers.[135] In April 2016, trucks from major manufacturers including Volvo and Daimler completed a week of autonomous driving across Europe, organized by the Dutch government, in an effort to get self-driving trucks on the road. U.S. self-driving truck sales are expected to reach 60,000 by 2035, according to a report released by IHS Inc. in June 2016.[136]
As reported in June 1995 in Popular Science magazine, self-driving trucks were being developed for combat convoys, whereby only the lead truck would be driven by a human and the following trucks would rely on satellite, an inertial guidance system and ground-speed sensors.[137] Caterpillar Incorporated made early developments in 2013 with the Robotics Institute at Carnegie Mellon University to improve efficiency and reduce cost at various mining and construction sites.[138]
In Europe, the Safe Road Trains for the Environment (Sartre) project is one such approach.
According to PwC's Strategy& report,[139] self-driving trucks raise concerns about how the technology will affect around 3 million truck drivers in the US, as well as 4 million employees who support the trucking economy in gas stations, restaurants, bars and hotels. At the same time, some companies, such as Starsky, are aiming for Level 3 autonomy, in which the driver retains a supervisory role over the truck's environment. The company's remote truck driving project would give truck drivers a better work-life balance by enabling them to avoid long periods away from home. It could, however, create a mismatch between drivers' existing skills and the technological redefinition of the job.
Companies that buy driverless trucks could massively cut costs: human drivers would no longer be required, liabilities from truck accidents would diminish, and productivity would increase (as a driverless truck does not need to rest). The use of self-driving trucks will go hand in hand with real-time data to optimize the efficiency and productivity of the service delivered, for example as a way to tackle traffic congestion. Driverless trucks could also enable new business models that shift deliveries from daytime to nighttime or to time slots in which traffic is less dense.
Suppliers
| Company | Details |
|---|---|
| Waymo Semi | In March 2018, Waymo, the automated vehicle company spun off from Google parent company Alphabet Incorporated, announced it was applying its technology to semi trucks. In the announcement, Waymo noted it would be using automated trucks to move freight related to Google's data centers in the Atlanta, Georgia area. The trucks will be manned and operated on public roads.[140] |
| Uber Semi | In October 2016, Uber completed the first driverless operation of an automated truck on public roads, delivering a trailer of Budweiser beer from Fort Collins, Colorado to Colorado Springs.[141] The run was completed at night on Interstate 25 after extensive testing and system improvements in cooperation with the Colorado State Police. The truck had a human in the cab but not sitting in the driver's seat, while the Colorado State Police provided a rolling closure of the highway.[142] At the time, Uber's automated truck was based primarily on technology developed by Otto, which Uber acquired in August 2016.[143] In March 2018, Uber announced it was using its automated trucks to deliver freight in Arizona, while also leveraging the UberFreight app to find and dispatch loads.[144] |
| Embark Semi | In February 2018, Embark Trucks announced it had completed the first cross-country trip of an automated semi, driving 2,400 miles from Los Angeles, California to Jacksonville, Florida on Interstate 10.[145] This followed a November 2017 announcement that it had partnered with Electrolux and Ryder to test its automated truck by moving Frigidaire refrigerators from El Paso, Texas to Palm Springs, California.[146] |
| Tesla Semi | In November 2017, Tesla, Inc., led by Elon Musk, revealed a prototype of the Tesla Semi and announced that it would go into production. This long-haul, electric semi-truck can drive itself and move in "platoons" that automatically follow a lead vehicle. It was disclosed in August 2017 that Tesla had sought permission to test the vehicles in Nevada.[147] |
| Starsky Robotics | In 2017, Starsky Robotics unveiled technology that makes trucks autonomous. Unlike its bigger competitors, which aim to tackle Level 4 and 5 autonomy, Starsky Robotics aims to produce Level 3 trucks, in which human drivers must be prepared to respond to a "request to intervene" if anything goes wrong. |
| Pronto AI | In December 2018, Anthony Levandowski unveiled his new autonomous driving company, Pronto, which is building L2 ADAS technology for the commercial trucking industry. The company is based in San Francisco, California.[148] |
Motorcycles
Several self-balancing autonomous motorcycles were demonstrated in 2017 and 2018 by BMW, Honda and Yamaha.[149][150][151]
| Company/Location | Details |
|---|---|
| Honda motorcycle | Inspired by the Uni-cub, Honda implemented self-balancing technology in its motorcycles. Because of the weight of motorcycles, it is often a challenge for owners to keep their vehicles balanced at low speeds or at a stop. Honda's motorcycle concept has a self-balancing feature that keeps the vehicle upright: it automatically lowers the center of balance by extending the wheelbase and then takes control of the steering to keep the vehicle balanced. This allows users to maneuver the vehicle more easily when walking alongside it or riding in stop-and-go traffic. The system is not intended for high-speed riding.[149][152] |
| BMW Motorrad Vision concept motorcycle | BMW Motorrad developed the ConnectedRide self-driving motorcycle to push the boundaries of motorcycle safety. Its autonomous features include emergency braking, negotiating intersections, assistance during tight turns, and front impact avoidance, similar to technologies being developed for autonomous cars. The motorcycle can also drive fully on its own at normal speeds, making turns and returning to a designated location. It lacks the self-standing feature that Honda has implemented.[153] |
| Yamaha's riderless motorcycle | The "Motoroid" can hold its balance, drive autonomously, recognize its rider, and come to a designated location in response to a hand gesture. Yamaha applied the research philosophy that "human beings react a hell of a lot quicker" to the Motoroid: the autonomous vehicle is not intended to replace the human being, but to augment the rider's abilities with advanced technology. It provides tactile feedback, such as a gentle squeeze to the rider's lower back as a reassuring caress at dangerous speeds, as if the vehicle were responding and communicating with the rider. Yamaha's goal is to "meld" machine and human into one experience.[154] |
| Harley-Davidson | While its motorcycles are popular, one of the largest challenges of owning a Harley-Davidson is handling the weight of the vehicle: it is difficult to manage at low speeds, and picking the bike up from the ground can be difficult even with correct technique. To attract more customers, the company filed a patent for a gyroscope at the back of the vehicle that keeps the motorcycle balanced for the rider at low speeds. Above 3 miles per hour the system disengages; below that, the gyroscope can handle the balance of the vehicle, which means it can balance even at a stop. The system can be removed if the rider feels ready to go without it (meaning it is modular).[152] |
Trains
The concept of autonomous vehicles has also been applied to commercial uses, such as autonomous trains. The world's first driverless urban transit system is the Port Island Line in Kobe, Japan, opened in 1981.[155] The first self-driving train in the UK was launched in London on the Thameslink route.[156]
An example of an automated train network is the Docklands Light Railway in London.
See also the List of automated train systems.
Trams
In 2018, the first autonomous trams were trialed in Potsdam.[157]
Automated guided vehicle
An automated guided vehicle or automatic guided vehicle (AGV) is a mobile robot that follows markers or wires in the floor, or uses vision, magnets, or lasers for navigation. They are most often used in industrial applications to move materials around a manufacturing facility or warehouse. Application of the automatic guided vehicle had broadened during the late 20th century.
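A toy differential-drive guide-path follower conveys the basic idea of wire- or line-guided AGV navigation; sensor scaling, gains, and speeds are illustrative assumptions rather than any vendor's controller.

```python
def line_follow_step(sensor_left, sensor_right, base_speed=0.3, gain=0.5):
    """Two floor sensors read the strength of a painted line or embedded wire;
    their difference steers the vehicle back over the path by speeding up one
    wheel and slowing the other (differential drive)."""
    error = sensor_left - sensor_right            # > 0 means the path lies to the left
    left_wheel = base_speed - gain * error
    right_wheel = base_speed + gain * error
    return left_wheel, right_wheel

# Path slightly to the right of centre: the right wheel slows so the AGV turns right.
print(line_follow_step(sensor_left=0.2, sensor_right=0.6))  # (0.5, 0.1)
```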
Aircraft
Aircraft have received much attention for automation, especially for navigation. A system capable of autonomously navigating a vehicle (especially an aircraft) is known as an autopilot.
Delivery drones
Various industries, such as package and food delivery, have experimented with delivery drones. Traditional and new transportation companies are competing in the market: for example, UPS Flight Forward, Alphabet's Wing, and Amazon Prime Air are all developing delivery drones.[158] Zipline, an American medical drone delivery company, has the largest active drone delivery operation in the world, and its drones are capable of Level 4 autonomy.[159]
However, even if the technology appears to work correctly, as various company tests show, the main obstacle to the market launch and use of such drones is the legislation in place; regulatory agencies must decide on the framework within which to draft regulation. This process is at different stages across the world, as each country tackles the topic independently. For example, Iceland's government and its transport, aviation, and police departments have already started issuing licenses for drone operations. It has a permissive approach and, together with Costa Rica, Italy, the UAE, Sweden and Norway, has fairly unrestricted legislation on commercial drone use. Those countries are characterized by a body of regulation that may give operational guidelines or require licensing, registration and insurance.[160]
On the other hand, some countries have decided to ban the use of commercial drones, either directly (an outright ban) or indirectly (an effective ban). The RAND Corporation notes the difference between countries that forbid drones and those that have a formal process for commercial drone licensing but whose requirements are either impossible to meet or whose licenses do not appear to have been approved. In the US, United Parcel Service is the only delivery service with the Part 135 Standard certification that is required to use drones for deliveries to real customers.[158]
However, most countries seem to be struggling to integrate commercial drones into their aviation regulatory frameworks. Constraints are therefore placed on drone use, such as a requirement to operate within the visual line of sight (VLOS) of the pilot, which limits their potential range; this is the case in the Netherlands and Belgium. Most countries let pilots operate beyond VLOS, subject to restrictions and pilot ratings, as is the case in the US.
The general trend is that legislation is moving fast and laws are constantly being reevaluated. Countries are moving towards a more permissive approach but the industry still lacks infrastructures to ensure the success of such a transition. To provide safety and efficiency, specialized training courses, pilot exams (type of UAV and flying conditions) as well as liability management measures regarding insurances may need to be developed.
There is a sense of urgency around this innovation, as competition is high and companies lobby to integrate drones rapidly into their products and services. In June 2017, US Senate legislation reauthorizing the Federal Aviation Administration and the Department of Transportation provided for a carrier certificate allowing package deliveries by drones.[161]
Watercraft
Autonomous boats can provide security, perform research, or conduct hazardous or repetitive tasks (such as guiding a large ship into a harbor or transporting cargo).
DARPA
Sea Hunter is an autonomous unmanned surface vehicle (USV) launched in 2016 as part of the DARPA Anti-Submarine Warfare Continuous Trail Unmanned Vessel (ACTUV) program.
Submersibles
Underwater vehicles have been a focus for automation for tasks such as pipeline inspection and underwater mapping.
Assistance robots
[edit]Spot
This four-legged robot was created to navigate many different types of terrain, outdoors and indoors. It can walk on its own without colliding with anything, using many different sensors, including 360-degree vision cameras and gyroscopes, and it can keep its balance even when pushed. The vehicle, while not intended to be ridden, can carry heavy loads for construction workers or military personnel through rough terrain.[162]
Regulation
The British Highway Code states that:
By self-driving vehicles, we mean those listed as automated vehicles by the Secretary of State for Transport under the Automated and Electric Vehicles Act 2018.
— The Highway Code – 27/07/2022, p. 4
The UK has considered how to update the Highway Code for automated vehicles:
Automated vehicles can perform all the tasks involved in driving, in at least some situations. They differ from vehicles fitted with assisted driving features (like cruise control and lane-keeping assistance), which carry out some tasks, but where the driver is still responsible for driving. If you are driving a vehicle with assisted driving features, you MUST stay in control of the vehicle.
— proposed changes to The Highway Code[163]
If the vehicle is designed to require you to resume driving after being prompted to, while the vehicle is driving itself, you MUST remain in a position to be able to take control. For example, you should not move out of the driving seat. You should not be so distracted that you cannot take back control when prompted by the vehicle.
— proposed changes to The Highway Code[163]
Concerns
[edit]Lack of control
The higher the level of autonomy, the less control humans have over their vehicles (the highest level requires no human intervention). One concern regarding the development of vehicular automation is end-users' trust in the technology that controls automated vehicles.[164] A national survey conducted by Kelley Blue Book (KBB) in 2016 showed that the majority of people would rather retain a certain level of control over their vehicle than have it operate with Level 5 (complete) autonomy.[165] According to half of the respondents, the perceived safety of an autonomous vehicle diminishes as the level of autonomy increases.[165] This distrust of autonomous driving systems remained unchanged over the years: a nationwide survey conducted by the AAA Foundation for Traffic Safety (AAAFTS) in 2019 showed the same outcome as the 2016 KBB survey. The AAAFTS survey showed that even though people have a certain level of trust in automated vehicles, most also have doubts and distrust toward the technology used in autonomous vehicles, with the greatest distrust of Level 5 autonomous vehicles.[166] The survey also showed that people's trust in autonomous driving systems increased as their understanding of them increased.[166]
Malfunctions
The possibility that autonomous vehicle technology will malfunction is another cause of users' distrust in autonomous driving systems.[164] It was the concern most respondents cited in the AAAFTS survey.[166] Even though autonomous vehicles are intended to improve traffic safety by minimizing crashes and their severity,[166] they have still caused fatalities. At least 113 autonomous-vehicle-related accidents had occurred by 2018.[167] In 2015, Google disclosed that its automated vehicles had experienced at least 272 failures and that drivers had to intervene around 13 times to prevent fatalities.[168] Other manufacturers have also reported automated vehicle failures, including the Uber car incident.[168] The 2018 self-driving Uber car accident is an example that is also listed among self-driving car fatalities. A report by the National Transportation Safety Board (NTSB) showed that the self-driving Uber car was unable to identify the victim in time to slow down and avoid the crash.[169]
Ethical
Another concern related to vehicle automation is ethics. Autonomous vehicles can encounter unavoidable traffic accidents, and in such situations many risks and calculations must be weighed to minimize the damage the accident could cause.[170] When a human driver faces an unavoidable accident, the driver reacts spontaneously based on ethical and moral judgment; when the driver has no control over the vehicle (Level 5 autonomy), the autonomous vehicle's system must make that quick decision instead.[170] Unlike humans, autonomous vehicles can only make decisions based on what they are programmed to do.[170] However, the circumstances of accidents differ, and any one decision might not be the best for a given accident. Based on two research studies in 2019,[171][172] introducing fully automated vehicles into traffic where semi-automated and non-automated vehicles are still present might lead to complications.[171] Issues that still need consideration include the structure of liability, the distribution of responsibilities,[172] efficiency in decision-making, and the performance of autonomous vehicles in diverse surroundings.[171] Researchers Steven Umbrello and Roman V. Yampolskiy propose that the value sensitive design approach is one method that can be used to design autonomous vehicles to avoid some of these ethical issues and design for human values.[173]
See also
References
[edit]- ^ "Self-steering Mars Rover tested at ESO's Paranal Observatory". ESO Announcement. Retrieved 21 June 2012.
- ^ Hu, Junyan; Bhowmick, Parijat; Lanzon, Alexander (August 2021). "Group Coordinated Control of Networked Mobile Robots With Applications to Object Transportation". IEEE Transactions on Vehicular Technology. 70 (8): 8269–8274. doi:10.1109/TVT.2021.3093157. ISSN 0018-9545.
- ^ a b Hu, Junyan; Bhowmick, Parijat; Jang, Inmo; Arvin, Farshad; Lanzon, Alexander (December 2021). "A Decentralized Cluster Formation Containment Framework for Multirobot Systems". IEEE Transactions on Robotics. 37 (6): 1936–1955. doi:10.1109/TRO.2021.3071615. ISSN 1552-3098.
- ^ Chan, Ching-Yao (2017). "Advancements, prospects, and impacts of automated driving systems". International Journal of Transportation Science and Technology. 6 (3): 208–216. doi:10.1016/j.ijtst.2017.07.008.
- ^ Zhao, Jingyuan; Zhao, Wenyi; Deng, Bo; Wang, Zhenghong; Zhang, Feng; Zheng, Wenxiang; Cao, Wanke; Nan, Jinrui; Lian, Yubo; Burke, Andrew F. (2024). "Autonomous driving system: A comprehensive survey". Expert Systems with Applications. 242 122836. doi:10.1016/j.eswa.2023.122836.
- ^ Martínez-Díaz, Margarita; Soriguera, Francesc (2018). "Autonomous vehicles: theoretical and practical challenges". Transportation Research Procedia. 33: 275–282. doi:10.1016/j.trpro.2018.10.103. hdl:2183/21640.
- ^ Hu, Junyan; Turgut, Ali Emre; Lennox, Barry; Arvin, Farshad (January 2022). "Robust Formation Coordination of Robot Swarms With Nonlinear Dynamics and Unknown Disturbances: Design and Experiments". IEEE Transactions on Circuits and Systems II: Express Briefs. 69 (1): 114–118. doi:10.1109/TCSII.2021.3074705. ISSN 1549-7747.
- ^ "Automated Vehicles for Safety | NHTSA". www.nhtsa.gov. Archived from the original on 7 October 2021. Retrieved 21 November 2021.
- ^ Path to Autonomy: Self-Driving Car Levels 0 to 5 Explained. Car and Driver, October 2017.
- ^ "Taxonomy and Definitions for Terms Related to Driving Automation Systems for On-Road Motor Vehicles". SAE International. 15 June 2018. Retrieved 30 July 2019.
- ^ "J3016_202104: Taxonomy and Definitions for Terms Related to Driving Automation Systems for On-Road Motor Vehicles - SAE International". www.sae.org.
- ^ (Badue et al. 2021, p. 1)
- ^ (Azam et al. 2020, p. 6)
- ^ (Serban et al. 2020, p. 23)
- ^ (Badue et al. 2021, p. 2)
- ^ (Azam et al. 2020, p. 6)
- ^ Charroud, Anas; El Moutaouakil, Karim; Palade, Vasile; Yahyaouy, Ali; Onyekpe, Uche; Eyo, Eyo U. (7 February 2024). "Localization and Mapping for Self-Driving Vehicles: A Survey". Machines. 12 (2): 6. doi:10.3390/machines12020118.
- ^ (Serban et al. 2020, p. 26)
- ^ a b Adnan, Nadia; Md Nordin, Shahrina; bin Bahruddin, Mohamad Ariff; Ali, Murad (December 2018). "How trust can drive forward the user acceptance to the technology? In-vehicle technology for autonomous vehicle". Transportation Research Part A: Policy and Practice. 118: 819–836. Bibcode:2018TRPA..118..819A. doi:10.1016/j.tra.2018.10.019. S2CID 158645252.
- ^ Gupta, Abhishek; Anpalagan, Alagan; Guan, Ling; Khwaja, Ahmed Shaharyar (July 2021). "Deep learning for object detection and scene perception in self-driving cars: Survey, challenges, and open issues". Array. 10 100057. Elsevier. doi:10.1016/j.array.2021.100057.
- ^ Kuutti, Sampo; Bowden, Richard; Jin, Yaochu; Barber, Phil; Fallah, Saber (February 2021). "A Survey of Deep Learning Applications to Autonomous Vehicle Control". IEEE Transactions on Intelligent Transportation Systems. 22 (2). Institute of Electrical and Electronics Engineers: 712–733. arXiv:1912.10773. doi:10.1109/TITS.2019.2962338.
- ^ Yeong, De Jong; Velasco-Hernandez, Gustavo; Barry, John; Walsh, Joseph (18 March 2021). "Sensor and Sensor Fusion Technology in Autonomous Vehicles: A Review". Sensors. 21 (6): 2140. Bibcode:2021Senso..21.2140Y. doi:10.3390/s21062140. PMC 8003231. PMID 33803889.
- ^ (Serban et al. 2020, p. 25)
- ^ a b Van Brummelen, Jessica; O'Brien, Marie; Gruyer, Dominique; Najjaran, Homayoun (April 2018). "Autonomous vehicle perception: The technology of today and tomorrow". Transportation Research Part C: Emerging Technologies. 89: 384–406. Bibcode:2018TRPC...89..384V. doi:10.1016/j.trc.2018.02.012.
- ^ Song, Haina; Zhou, Shengpei; Chang, Zhenting; Su, Yuejiang; Liu, Xiaosong; Yang, Jingfeng (1 January 2021). "Collaborative processing and data optimization of environmental perception technologies for autonomous vehicles". Assembly Automation. 41 (3): 283–291. doi:10.1108/AA-01-2021-0007. ISSN 0144-5154.
- ^ "HD Maps vs AV Maps – The Crucial Differences". Mobileye. 28 February 2021.
- ^ Wigley, Edward; Rose, Gillian (2 April 2020). "Who's behind the wheel? Visioning the future users and urban contexts of connected and autonomous vehicle technologies" (PDF). Geografiska Annaler: Series B, Human Geography. 102 (2): 155–171. doi:10.1080/04353684.2020.1747943. S2CID 219087578.
- ^ "Developments in Modern GNSS and Its Impact on Autonomous Vehicle Architectures". www.swiftnav.com. December 2020. Retrieved 11 June 2025.
- ^ "Automated vehicles in the EU" (PDF). European Parliamentary Research Service (EPRS), Members' Research Service, p. 2, Glossary. https://www.europarl.europa.eu/RegData/etudes/BRIE/2016/573902/EPRS_BRI(2016)573902_EN.pdf
- ^ "AAA Studies Technology Behind Self-Driving Cars". Your AAA Network. 18 February 2019. Archived from the original on 20 June 2021. Retrieved 21 February 2020.
- ^ "The SARTRE project". Archived from the original on 27 November 2010.
- ^ Marshall, Aarian. "After a Deadly Crash, Uber Returns Robocars to the Road". Wired. ISSN 1059-1028. Retrieved 5 May 2023.
- ^ "Uber Self-Driving Cars Hit The Streets Of Pittsburgh". www.cbsnews.com. 14 September 2016. Retrieved 5 May 2023.
- ^ "California's first driverless bus hits the road in San Ramon". ABC7 San Francisco. Retrieved 5 May 2023.
- ^ Mearian, Lucas (19 August 2016). "Ford remains wary of Tesla-like autonomous driving features". Computer World. Retrieved 9 December 2016.
- ^ a b "Automated Vehicle Technology." King Coal Highway 292 (2014): 23-29.
- ^ a b "A Tragic Loss". Tesla. 30 June 2016. Retrieved 10 December 2016.
- ^ Hallerbach, Sven; Xia, Yiqun; Eberle, Ulrich; Koester, Frank (3 April 2018). "Simulation-Based Identification of Critical Scenarios for Cooperative and Automated Vehicles". SAE International Journal of Connected and Automated Vehicles. 1 (2): 93–106. doi:10.4271/2018-01-1066.
- ^ Yigitcanlar; Wilson; Kamruzzaman (24 April 2019). "Disruptive Impacts of Automated Driving Systems on the Built Environment and Land Use: An Urban Planner's Perspective". Journal of Open Innovation: Technology, Market, and Complexity. 5 (2): 24. doi:10.3390/joitmc5020024. hdl:10419/241315.
- ^ a b c Anderson, Mark (May 2020). "The road ahead for self-driving cars: The AV industry has had to reset expectations, as it shifts its focus to level 4 autonomy – [News]". IEEE Spectrum. 57 (5): 8–9. doi:10.1109/MSPEC.2020.9078402. S2CID 219070930.
- ^ a b c Campbell, Mark; Egerstedt, Magnus; How, Jonathan P.; Murray, Richard M. (13 October 2010). "Autonomous driving in urban environments: approaches, lessons and challenges". Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences. 368 (1928): 4649–4672. Bibcode:2010RSPTA.368.4649C. doi:10.1098/rsta.2010.0110. PMID 20819826. S2CID 17558587.
- ^ a b c d e Panagiotopoulos, Ilias; Dimitrakopoulos, George (October 2018). "An empirical investigation on consumers' intentions towards autonomous driving". Transportation Research Part C: Emerging Technologies. 95: 773–784. Bibcode:2018TRPC...95..773P. doi:10.1016/j.trc.2018.08.013. S2CID 117555199.
- ^ Shladover, Steven E.; Nowakowski, Christopher (April 2019). "Regulatory challenges for road vehicle automation: Lessons from the California experience". Transportation Research Part A: Policy and Practice. 122: 125–133. Bibcode:2019TRPA..122..125S. doi:10.1016/j.tra.2017.10.006. S2CID 113811906.
- ^ Umar Zakir Abdul, Hamid; et al. (2021). "Adopting Aviation Safety Knowledge into the Discussions of Safe Implementation of Connected and Autonomous Road Vehicles". SAE Technical Papers (SAE WCX Digital Summit) (2021–01–0074). Retrieved 12 April 2021.
- ^ "Amendment proposal to the 1968 Convention on Road Traffic" (PDF). Economic Commission for Europe. March 2020. Retrieved 13 November 2021.
- ^ "Explanatory memorandum: Proposal of Amendment to Article 1 and new Article 34 BIS of the 1968 Convention on Road Traffic".
- ^ a b SMMT publishes guiding principles for marketing automated vehicles, SMMT, 22 November 2021
- ^ "Driverless robot buses crash". Wolfstad.com. 6 December 2005. Retrieved 20 November 2011.
- ^ "Driverless robot buses crash, Part 2". Wolfstad.com. 17 December 2005. Retrieved 20 November 2011.
- ^ "S&P Global Homepage | S&P Global".
- ^ "Vauxhall Vectra | Auto Express News | News". Auto Express. 29 November 2005. Retrieved 20 November 2011.
- ^ "Nissan | News Press Release". Nissan-global.com. 15 March 2006. Archived from the original on 27 October 2011. Retrieved 20 November 2011.
- ^ "Singapore's driverless vehicle ambitions reach next milestone with new national standards". Channel NewsAsia. Archived from the original on 2 February 2019. Retrieved 2 February 2019.
- ^ "EU SUSTAINABLE ENERGY WEEK 18-22 JUNE 2012" (PDF). p. 14. Retrieved 21 June 2021.
- ^ "Final Report Summary – CITYMOBIL2 (Cities demonstrating cybernetic mobility)". 11 November 2016. Retrieved 17 August 2021.
- ^ "Experiments on autonomous and automated driving: an overview 2015" (PDF). Archived from the original (PDF) on 4 June 2022. Retrieved 28 June 2021.
- ^ Kitchen, Sebastian (8 December 2016). "JTA recommends replacing Skyway with driverless vehicles, creating corridor from Riverside to EverBank Field". Florida Times-Union. Retrieved 25 January 2017.
- ^ "JTA Board Chair Embraces Autonomous Vehicles To Replace Skyway". 15 April 2021. Retrieved 10 June 2021.
- ^ "Introducing the world's first completely unattended public autonomous vehicle". Euro Transport Magazine. 20 February 2017. Retrieved 1 September 2017.
- ^ "Rivium 3rd generation". 12 August 2020. Retrieved 10 June 2021.
- ^ "Baidu just made its 100th autonomous bus ahead of commercial launch in China". Tech Crunch. 4 July 2018. Retrieved 14 July 2021.
- ^ "Top 25 autonomous shuttle manufacturers". 15 October 2020.
- ^ "Toyota Unveils Their e-Palette Self-Driving Shuttles". Retrieved 28 June 2021.
- ^ "Toyota e-Palette autonomous vehicles to be rolled out "within the next few years"". caradvice. 11 February 2021. Retrieved 28 June 2021.
- ^ "China Autonomous Shuttle Market Report 2021 Featuring 10 Chinese Companies & 5 International Companies". 21 April 2021. Retrieved 28 June 2021.
- ^ a b "Yutong has already delivered 100 autonomous micro-buses Xiaoyu 2.0 models to Zhengzhou". YouTube. 4 July 2021. Archived from the original on 14 December 2021. Retrieved 14 July 2021.
- ^ "Walkabout The World's Largest Bus Factory (Yutong Industrial Park)". YouTube. 7 July 2021. Archived from the original on 14 December 2021. Retrieved 14 July 2021.
- ^ "Hong Kong airport now has 4 unmanned patrol cars". Passenger Self Service. March 2022. https://www.passengerselfservice.com/2022/03/hong-kong-airport-now-has-4-unmanned-patrol-cars/
- ^ "Autonomous Shuttle Pilots in Europe, AMD Aspirations in Austin". 3 June 2021. Retrieved 10 June 2021.
- ^ "BESTMILE AND TRAPEZE JOIN FORCES TO BRING AUTONOMOUS MOBILITY TO CITIES ALL AROUND THE GLOBE". Motion Digest Network. 10 October 2016. Retrieved 1 December 2021.
- ^ "Swiss move on with self-driving buses". swissinfo.ch. 5 October 2021. Retrieved 1 December 2021.
- ^ "Washington D.C. residents can ride in this adorable driverless shuttle starting this summer". Business Insider. 16 June 2016. Retrieved 5 October 2017.
- ^ Autonomous shuttle Olli deployed in Turin, Italy JAN 17, 2020
- ^ Scott, Mark (28 May 2017). "The Future of European Transit: Driverless and Utilitarian". The New York Times. Retrieved 8 September 2017.
- ^ "Las Vegas launches driverless shuttle bus test run on public roads". 12 January 2017. Retrieved 1 September 2017.
- ^ "Driverless bus takes to the road in Perth". The Australian. 1 September 2016. Retrieved 1 September 2017.
- ^ "Navya driverless shuttles to begin ferrying University of Michigan students this fall". Tech Crunch. 21 June 2017. Retrieved 1 September 2017.
- ^ "Experimentations with Navya's Autonomous Shuttles". Retrieved 28 June 2021.
- ^ "HITCHING A RIDE Project looks to bring autonomous shuttles to Texas A&M campus". The Eagle. 24 August 2017. Retrieved 5 September 2017.
- ^ "Autonomous Shuttle". Unmanned Systems Lab Texas A&M University. Retrieved 5 September 2017.
- ^ "Catch A Ride On TTI's Autonomous Shuttle". 17 September 2019. Retrieved 28 June 2021.
- ^ "Driverless robot 'pods' take to the Cambridge guided busway". 18 October 2017. Retrieved 24 October 2017.
- ^ "Driverless Bus TESTING ROUTE NOTICE". Northern Territory Government. 5 September 2017. Archived from the original on 12 September 2017. Retrieved 12 September 2017.
- ^ "Easymile | Driverless shuttle for the last mile". Archived from the original on 1 September 2017. Retrieved 1 September 2017.
- ^ "Society Driverless bus test in Taipei gets positive feedback". Focus Taiwan News Channel. 5 August 2017. Retrieved 1 September 2017.
- ^ "Denver's first driverless shuttle hits the test track, avoids tumbleweed before possible 2018 launch". 4 December 2017. Retrieved 7 December 2017.
- ^ "U.S. agency suspends self-driving shuttle EasyMile use in 10 U.S. states". Reuters. 25 February 2020 – via www.reuters.com.
- ^ "'This is our future': Fairfax tests region's first self-driving shuttle for public transit". 17 August 2020. Retrieved 28 June 2021.
- ^ "Transdev partners with Fairfax County to launch connected AV pilot project". 6 November 2020. Retrieved 28 June 2021.
- ^ "Nation's largest fleet of autonomous, electric shuttles launches in Colorado". Mass Transit. 13 August 2021. Retrieved 13 August 2021.
- ^ "The Mines Rover". YouTube. 10 August 2021. Archived from the original on 14 December 2021. Retrieved 13 August 2021.
- ^ "France approves fully autonomous bus for driving on public roads in a European first". Euronews. 2 December 2021. Retrieved 3 December 2021.
- ^ "EasyMile First Authorized at Level 4 of Autonomous Driving on Public Roads". EasyMile (Press release). 22 November 2021. Retrieved 3 December 2021.
- ^ "GATEway Project". Retrieved 28 June 2021.
- ^ "Westfield Technology Group autonomous POD confirmed for Fleet Live 2019". 1 August 2019. Archived from the original on 28 June 2021. Retrieved 28 June 2021.
- ^ "Dubai tests the world's first autonomous mobility pods". 15 February 2018. Retrieved 25 February 2018.
- ^ "Baidu apollo self driving cars". Business Insider. 2 July 2017. Retrieved 12 November 2018.
- ^ "Baidu starts mass production of autonomous buses". Deutche World. 5 July 2018. Retrieved 12 November 2018.
- ^ "Baidu's Robin Li Reveals Unmanned Bus, AI Chip, Digital Assistant Upgrade at Tech Summit". Yicai Global. 4 July 2018. Archived from the original on 12 November 2018. Retrieved 12 November 2018.
- ^ "JTA, FSCJ execute agreement for autonomous vehicle testing and educational initiatives". Mass Transit. 5 June 2020. Retrieved 28 June 2021.
- ^ "JTA Receives 6th Autonomous U2C Program Test Vehicle; FSCJ Part Of Latest Test". WJCT. 15 September 2020. Retrieved 28 June 2021.
- ^ "Olli 2.0 joins JTA's U2C testing program". Jacksonville Daily Record. 16 September 2020. Retrieved 28 June 2021.
- ^ "JTA testing ADA accessible autonomous vehicle". Mass Transit. 5 November 2019. Retrieved 28 June 2021.
- ^ "JTA Unveils U²C Gen-2 Test Vehicle". Archived from the original on 28 June 2021. Retrieved 28 June 2021.
- ^ "Self-driving people mover makes its maiden trip at brussels airport". 13 May 2019. Retrieved 28 June 2021.
- ^ "NTU Singapore to test autonomous vehicles on the NTU Smart Campus". Archived from the original on 28 June 2021. Retrieved 28 June 2021.
- ^ "NEWS New Zealand's first autonomous shuttle debuts at Christchurch Airport". Retrieved 28 June 2021.
- ^ "Autonomous shuttle rides for Christchurch public to park the fear factor". 16 February 2020. Retrieved 28 June 2021.
- ^ "Yutong 5G-Enabled Intelligent Mobility Solution Live Show". Retrieved 14 July 2021.
- ^ "Riding an Autonomous Bus ON THE CITY STREETS in China (Xiaoyu 2.0)". YouTube. 2 July 2021. Archived from the original on 14 December 2021. Retrieved 14 July 2021.
- ^ "Taiwan's first autonomous minibus begins operations in Changhua". Taiwan Today. 16 July 2020. Retrieved 27 November 2021.
- ^ "First Homegrown Autonomous Shuttle Wins Operating License in Taiwan". Auto Future. 11 January 2021. Archived from the original on 27 November 2021. Retrieved 27 November 2021.
- ^ "Test Operation of Remote Controlled Driverless Autonomous Driving Mobility Services to Start". METI, Japan. 11 December 2020. Retrieved 23 November 2021.
- ^ Shin Kato (November 2021). "AIST's efforts toward social implementation of automated driving mobility services" (PDF). SIP-adus. pp. 5–13. Retrieved 23 November 2021.
- ^ "国内初!自動運転車に対するレベル4の認可を取得しました" [Domestically the first! Approved as Level 4 self-driving car]. METI, Japan. 31 March 2023. Retrieved 3 April 2023.
- ^ "Chinese autonomous driving startup WeRide launches Mini Robobus in Guangzhou". 29 January 2021. Retrieved 28 June 2021.
- ^ "Covid Won't Stop The Olympics Nor Toyota's Autonomous EV Transportation For Athletes". Forbes. 30 June 2021. Retrieved 13 August 2021.
- ^ "Toyota halts all self-driving e-Palette vehicles after Olympic village accident". Reuters. 28 August 2021. Retrieved 29 August 2021.
- ^ "Toyota self-driving buses in Paralympic village to restart on Aug. 31". Kyodo News. 30 August 2021. Retrieved 17 November 2021.
- ^ "【東京の西新宿と臨海副都心で自動運転移動サービスへ 都が実証実験の実施を決定" [Tokyo Metropolitan Government decided to conduct trials on autonomous mobility services in west-Shinjyuku area and bay-front area]. Response (in Japanese). 19 July 2021. Retrieved 2 December 2021.
- ^ "Self-driving shuttle buses hit the streets of Stockholm". New Atlas. 25 January 2018.
- ^ "Smart Mobility is here" – via www.youtube.com.
- ^ "Self-driving buses are being tested in China and they're the largest of their kind yet". Mashable. 4 December 2017.
- ^ "UK's first driverless bus trialled in Manchester". Independent.co.uk. 19 March 2019. Archived from the original on 11 August 2022.
- ^ "First driverless Edinburgh to Fife bus trial announced". BBC News. 22 November 2018.
- ^ Peat, Chris (23 January 2023). "First passengers board Stagecoach autonomous bus". Bus & Coach Buyer. Retrieved 24 January 2023.
- ^ Peat, Chris (23 January 2023). "Autonomous bus starts trials in Oxfordshire". Bus & Coach Buyer. Retrieved 24 January 2023.
- ^ "UK's first self-driving electric bus unveiled". Oxford Mail. 23 January 2023. Retrieved 23 January 2023.
- ^ "Public Road Demonstration Tests of Mid-Sized Buses with Autonomous Driving Systems to be Launched". METI, Japan. 10 July 2020. Retrieved 20 November 2021.
- ^ "The Isuzu Group Value Creation Story: Growth Strategies" (PDF). Isuzu. 10 July 2020. p. 26. Retrieved 20 November 2021.
- ^ Shin Kato 2021, pp. 3–4.
- ^ Hübner, Irina. "Neuronale Netze und selbstlernende KI: Mapless-Autonomous-Fahrlösung für Busse". Elektroniknet (in German). Retrieved 31 January 2024.
- ^ Tian, Qingyun; Lin, Yun H.; Wang, David Z.; Liu, Yang (2022). "Planning for modular-vehicle transit service system: Model formulation and solution methods". Transportation Research Part C: Emerging Technologies. 138 103627. Bibcode:2022TRPC..13803627T. doi:10.1016/j.trc.2022.103627.
- ^ Wu, Jiaming; Kulcsár, Balázs; Qu, Xiaobo (2021). "A modular, adaptive, and autonomous transit system (MAATS): An in-motion transfer strategy and performance evaluation in urban grid transit networks". Transportation Research Part A: Policy and Practice. 151: 81–98. Bibcode:2021TRPA..151...81W. doi:10.1016/j.tra.2021.07.005.
- ^ "Suncor Seeks Cost Cutting With Robot Trucks in Oil-Sands Mine". Bloomberg-.com. 13 October 2013. Retrieved 14 June 2016.
- ^ "HS Clarifies Autonomous Vehicle Sales Forecast – Expects 21 Million Sales Globally in the Year 2035 and Nearly 76 Million Sold Globally Through 2035". ihs-.com. 9 June 2016. Retrieved 14 June 2016.
- ^ Nelson, Ray (June 1995). "Leave The Driving To Us". Popular Science. p. 26.
- ^ Gingrich, Newt (7 October 2014). Breakout: Pioneers of the Future, Prison Guards of the Past, and the Epic Battle That Will Decide America's Fate. Regnery Publishing. p. 114. ISBN 978-1-62157-281-7.
- ^ "Transportation invests for a new future: Automation is rapidly accelerating and disrupting the industry" (PDF).
- ^ "Waymo's self-driving trucks will start delivering freight in Atlanta". The Verge. Retrieved 13 March 2018.
- ^ "Uber's Self-Driving Truck Makes Its First Delivery: 50,000 Budweisers". WIRED. Retrieved 13 March 2018.
- ^ "Colorado officer recounts how Otto's autonomous beer delivery became a reality". Fleet Owner. 9 March 2018. Retrieved 13 March 2018.
- ^ Dillet, Romain. "Uber acquires Otto to lead Uber's self-driving car effort". TechCrunch. Retrieved 13 March 2018.
- ^ McFarland, Matt (26 March 2018). "First self-drive train launched on mainline track". Telegraph.
- ^ Kolodny, Lora (6 February 2018). "A self-driving truck just drove from Los Angeles to Jacksonville". CNBC. Retrieved 13 March 2018.
- ^ "A Self-Driving Truck Might Deliver Your Next Refrigerator". WIRED. Retrieved 13 March 2018.
- ^ "Exclusive: Tesla developing self-driving tech for semi-truck, wants to test in Nevada". Reuters. 10 August 2017. Retrieved 8 September 2017.
- ^ "Silicon Valley's Levandowski returns with self-driving truck start-up". Financial Times. 18 December 2018. Archived from the original on 11 December 2022. Retrieved 10 May 2019.
- ^ a b Eric Adams (6 January 2017), "Honda's self-balancing motorcycle is perfect for noobs", Wired
- ^ Self-balancing Yamaha motorcycle comes on command, Agence France-Presse, 12 January 2018 – via IOL
- ^ Bob Sorokanich (11 September 2018), "Robots replace humans the one place we least expected: motorcycles", Road and Track
- ^ a b "Harley-Davidson Wants To Make Self-Balancing Motorcycles By Putting A Gyroscope In Your Top Case". Jalopnik. 9 June 2020. Retrieved 4 August 2020.
- ^ Sorokanich, Bob (11 September 2018). "Robots Replace Humans the One Place We Least Expected: Motorcycles". Road & Track. Retrieved 4 August 2020.
- ^ "Self-balancing Yamaha motorcycle comes on command". www.iol.co.za. Retrieved 4 August 2020.
- ^ 枝久保達也 (25 January 2021). "世界初の完全自動無人運転、「ポートライナー」が40年前に開業した理由" [Why the Port Liner, the world's first fully automated driverless system, opened 40 years ago]. diamond.jp (in Japanese). Diamond. Retrieved 23 January 2022.
- ^ Topham, Gwyn (26 March 2018). "First self-driving train launches on London Thameslink route". The Guardian.
- ^ "Germany launches world's first autonomous tram in Potsdam". TheGuardian.com. 23 September 2018.
- ^ a b Lee, Jason (23 December 2019). "3 Companies Looking to Dominate Drone Delivery". The Motley Fool. Retrieved 4 August 2020.
- ^ "Toyota Tsusho Launches Drone Delivery of Medical and Pharmaceutical Supplies Business in Nagasaki Prefecture's Goto Islands – Network Powered by Zipline". Toyota Tsusho. Retrieved 21 May 2022.
- ^ "International Commercial Drone Regulation and Drone Delivery Services" (PDF). RAND.
- ^ "Bill S. 1405" (PDF).
- ^ "Home | Boston Dynamics". www.bostondynamics.com. Retrieved 4 August 2020.
- ^ a b "Rules on safe use of automated vehicles on GB roads". GOV.UK.
- ^ a b Liljamo, Timo; Liimatainen, Heikki; Pöllänen, Markus (November 2018). "Attitudes and concerns on automated vehicles". Transportation Research Part F: Traffic Psychology and Behaviour. 59: 24–44. Bibcode:2018TRPF...59...24L. doi:10.1016/j.trf.2018.08.010. S2CID 150232489.
- ^ a b "Despite Autonomous Vehicle Intrigue, Americans Still Crave Control Behind The Wheel, According To New Kelley Blue Book Study" (Press release). Kelley Blue Book. 28 September 2016. ProQuest 1825236192.
- ^ a b c d "Users' Understanding of Automated Vehicles and Perception to Improve Traffic Safety –Results from a National Survey". AAA Foundation. 17 December 2019. Retrieved 4 August 2020.
- ^ Wang, Song; Li, Zhixia (28 March 2019). "Exploring the mechanism of crashes with automated vehicles using statistical modeling approaches". PLOS ONE. 14 (3) e0214550. Bibcode:2019PLoSO..1414550W. doi:10.1371/journal.pone.0214550. PMC 6438496. PMID 30921396.
- ^ a b Ainsalu, Jaagup; Arffman, Ville; Bellone, Mauro; Ellner, Maximilian; Haapamäki, Taina; Haavisto, Noora; Josefson, Ebba; Ismailogullari, Azat; Lee, Bob; Madland, Olav; Madžulis, Raitis; Müür, Jaanus; Mäkinen, Sami; Nousiainen, Ville; Pilli-Sihvola, Eetu; Rutanen, Eetu; Sahala, Sami; Schønfeldt, Boris; Smolnicki, Piotr Marek; Soe, Ralf-Martin; Sääski, Juha; Szymańska, Magdalena; Vaskinn, Ingar; Åman, Milla (2018). "State of the Art of Automated Buses". Sustainability. 10 (9): 3118. Bibcode:2018Sust...10.3118A. doi:10.3390/su10093118.
- ^ "Collision Between Vehicle Controlled by Developmental Automated Driving System and Pedestrian". National Transportation Safety Board. 28 March 2018. Retrieved 19 February 2023.
- ^ a b c Dogan, E; Chatila, R (2016). "Ethics in the design of automated vehicles: the AVEthics project" (PDF). CEUR Workshop Proceedings.
- ^ a b c "How Should Autonomous Vehicles Make Moral Decisions? Machine Ethics, Artificial Driving Intelligence, and Crash Algorithms". Contemporary Readings in Law and Social Justice. 11: 9. 2019. doi:10.22381/CRLSJ11120191 (inactive 1 July 2025). S2CID 213759514. ProQuest 2269349615.
- ^ a b "The Safety and Reliability of Networked Autonomous Vehicles: Ethical Dilemmas, Liability Litigation Concerns, and Regulatory Issues". Contemporary Readings in Law and Social Justice. 11 (2): 9. 2019. doi:10.22381/CRLSJ11220191. ProQuest 2322893910.
- ^ Umbrello, Steven; Yampolskiy, Roman V. (15 May 2021). "Designing AI for Explainability and Verifiability: A Value Sensitive Design Approach to Avoid Artificial Stupidity in Autonomous Vehicles". International Journal of Social Robotics. 14 (2): 313–322. doi:10.1007/s12369-021-00790-w. hdl:2318/1788856. ISSN 1875-4805. S2CID 236584241.
Works cited
[edit]- "Uber Self-Driving Cars Hit The Streets Of Pittsburgh". www.cbsnews.com. 14 September 2016. Retrieved 5 May 2023.
- Badue, Claudine; Guidolini, Rânik; Carneiro, Raphael Vivacqua; Azevedo, Pedro; Cardoso, Vinicius B.; Forechi, Avelino; Jesus, Luan; Berriel, Rodrigo; Paixão, Thiago M.; Mutz, Filipe; de Paula Veronese, Lucas; Oliveira-Santos, Thiago; De Souza, Alberto F. (1 March 2021). "Self-driving cars: A survey". Expert Systems with Applications. 165 113816. Elsevier. arXiv:1901.04407. doi:10.1016/j.eswa.2020.113816.
- Azam, Shoaib; Munir, Farzeen; Sheri, Ahmad Muqeem; Kim, Joonmo; Jeon, Moongu (22 October 2020). "System, Design and Experimental Validation of Autonomous Vehicle in an Unconstrained Environment". Sensors. 20 (21): 5999. Bibcode:2020Senso..20.5999A. doi:10.3390/s20215999. PMC 7660187. PMID 33105897.
- Serban, Alex; Poll, Erik; Visser, Joost (2020). "A Standard Driven Software Architecture for Fully Autonomous Vehicles". Journal of Automotive Software Engineering. 1 (1). Atlantis Press: 20. doi:10.2991/jase.d.200212.001.
External links
[edit]- European Commission Intelligent Car website
- U.S. Department of Transportation – Intelligent Transportation Systems Joint Program Office website
- Sheth, Aadit (3 January 2024). "Indian AI And Robotics Startup Claims Level 5 Autonomy". Prompt Engineering Daily. Retrieved 27 January 2024.
Definitions and Autonomy Levels
SAE J3016 Framework
The SAE J3016 standard, issued by SAE International, defines a taxonomy for driving automation systems in on-road motor vehicles, categorizing them into six discrete levels (0 through 5) based on the system's capability to perform the dynamic driving task (DDT) on a sustained basis. The DDT encompasses all lateral and longitudinal vehicle motion control, as well as object and event detection, response, and monitoring of the driving environment. First published in 2014 and revised in 2016, 2018, and most recently in April 2021 as J3016_202104, the framework emphasizes that the levels are mutually exclusive and represent increasing degrees of automation in which a human driver or fallback-ready user may or may not be present.[7]

It distinguishes between driver support features (Levels 0–2) and automated driving systems (Levels 3–5), with the latter capable of performing the complete DDT on a sustained basis without immediate human intervention in certain conditions. Key distinctions across levels hinge on human engagement: at lower levels, the human performs the entire DDT or parts of it while remaining fully responsible; at higher levels, the automated system assumes DDT responsibility, potentially eliminating the need for a human driver altogether. The framework also introduces terms such as operational design domain (ODD), which specifies the conditions under which a system functions (e.g., geographic limits, speed ranges, or weather), and fallback mechanisms for handling DDT failures. At Level 3, the fallback is a human user who must be ready to take over; at Levels 4 and 5, the system itself must achieve a minimal risk condition rather than relying on human intervention.

| Level | Designation | Description |
|---|---|---|
| 0 | No Driving Automation | The human driver performs all aspects of the DDT, even if warned of potential issues; automation limited to warning or momentary interventions (e.g., emergency braking). |
| 1 | Driver Assistance | The system assists with either steering or acceleration/braking, but the human must monitor and perform the remaining parts of the DDT (e.g., adaptive cruise control alone; combining it with lane centering constitutes Level 2). |
| 2 | Partial Driving Automation | The system handles both steering and acceleration/braking simultaneously, but the human must remain fully engaged, monitoring the environment and ready to intervene at any time (e.g., Tesla Autopilot or GM Super Cruise in certain modes). |
| 3 | Conditional Driving Automation | The system performs the full DDT within its ODD, but requires a human fallback-ready user to take over upon request; the user need not monitor continuously (e.g., Mercedes Drive Pilot approved for limited use in 2023). |
| 4 | High Driving Automation | The system executes the full DDT and its own fallback within a specified ODD, with no human intervention required; no human driver need be present, but operation is limited to the ODD (e.g., Waymo robotaxis in geofenced areas). |
| 5 | Full Driving Automation | The system performs the full DDT under all roadway and environmental conditions matching a human driver's capabilities; no ODD restrictions or human presence needed (no commercial deployments as of 2025). |
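The table can also be read as a small decision procedure: who performs the DDT, whether continuous human monitoring is required, and whether operation is bounded by an ODD. The following minimal sketch (in Python; the class and method names are illustrative and not part of the SAE standard) encodes those three distinctions:

```python
from dataclasses import dataclass
from enum import IntEnum
from typing import Optional, Set


class SAELevel(IntEnum):
    """SAE J3016 driving automation levels."""
    NO_AUTOMATION = 0
    DRIVER_ASSISTANCE = 1
    PARTIAL_AUTOMATION = 2
    CONDITIONAL_AUTOMATION = 3
    HIGH_AUTOMATION = 4
    FULL_AUTOMATION = 5


@dataclass
class AutomationFeature:
    level: SAELevel
    odd: Optional[Set[str]]  # operational design domain; None means unrestricted (Level 5)

    def system_performs_full_ddt(self) -> bool:
        # From Level 3 upward, the system performs the complete dynamic driving task.
        return self.level >= SAELevel.CONDITIONAL_AUTOMATION

    def human_must_monitor(self) -> bool:
        # At Levels 1-2, the human driver must supervise continuously.
        return SAELevel.DRIVER_ASSISTANCE <= self.level <= SAELevel.PARTIAL_AUTOMATION

    def may_engage(self, condition: str) -> bool:
        # Levels 3-4 may engage only inside their ODD; Level 5 has no ODD restriction.
        return self.odd is None or condition in self.odd


# Hypothetical Level 3 feature restricted to divided highways in traffic jams.
traffic_jam_pilot = AutomationFeature(SAELevel.CONDITIONAL_AUTOMATION, {"divided_highway", "traffic_jam"})
print(traffic_jam_pilot.may_engage("urban_intersection"))  # False: outside the ODD
print(traffic_jam_pilot.human_must_monitor())              # False: a Level 3 user need not monitor continuously
```

A Level 5 feature would carry no ODD, so `may_engage` would return true for any condition, mirroring the "no ODD restrictions" entry in the table above.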
Alternative Classification Systems
One notable alternative to the SAE J3016 framework is the driver-involvement-based taxonomy proposed by Mobileye in May 2023, which shifts focus from the vehicle's technical automation capabilities to the practical requirements on the human driver or occupant.[9] This system categorizes automated driving into four primary modes defined by the presence and attention demands on the driver: hands-on/eyes-on (no system intervention, driver fully responsible for all dynamic driving tasks); hands-off/eyes-on (system handles steering and acceleration/braking, but the driver must continuously monitor and be ready to intervene, akin to enhanced SAE Level 2 systems); hands-off/eyes-off (system manages driving without requiring driver attention or input, limited to predefined operational design domains such as specific roads or conditions); and no-driver (fully autonomous operation with no human occupant needed, potentially incorporating remote tele-operation for edge cases).[9] Mobileye's taxonomy, articulated by CEO Amnon Shashua, addresses criticisms of SAE J3016's numerical levels (0–5) by prioritizing consumer-facing clarity on responsibilities (such as whether hands can be removed from the wheel or eyes from the road) over abstract engineering thresholds.[9] Examples include Mobileye SuperVision for hands-off/eyes-on operation and Mobileye Chauffeur for hands-off/eyes-off operation in geofenced areas.[9] Unlike SAE, which defines levels by the system's ability to perform the dynamic driving task (with fallback assigned to the human or the system depending on the level), this framework explicitly ties categories to operational design domains (ODDs) and avoids implying universal applicability, arguing that rigid levels foster public confusion about real-world deployment limitations.[9]

Other proposed systems, such as a 2024 academic taxonomy outlined in a preprint challenging SAE's structure, integrate automation levels with application types (e.g., highway vs. urban) and ODDs to better accommodate diverse vehicle use cases, but these remain non-standardized and less adopted in industry. Regulatory bodies such as the U.S. National Highway Traffic Safety Administration (NHTSA) continue to reference SAE J3016 for consistency in safety guidance and testing frameworks, with no distinct alternative classification formally endorsed as of 2025.[8] The German Association of the Automotive Industry (VDA) employs a parallel five-level scale (from driver-only to fully automated) that closely mirrors SAE, emphasizing driverless operation in Levels 4 and 5 within defined environments since its 2015 proposal, but it functions more as a regional variant than a divergent system.[10] These alternatives highlight ongoing debates over whether classifications should prioritize verifiable system performance, user expectations, or deployment constraints, though SAE remains the de facto global benchmark for standardization.
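The four driver-involvement modes reduce to two yes/no questions about the driver (hands on the wheel? eyes on the road?) plus whether a human must be present at all. A compact illustrative sketch (Python; the names are hypothetical and not part of Mobileye's published materials):

```python
from typing import NamedTuple


class DriverDemands(NamedTuple):
    hands_on_wheel: bool    # must the driver keep hands on the wheel?
    eyes_on_road: bool      # must the driver continuously monitor the road?
    driver_required: bool   # must a human driver be present in the vehicle?


# The four driver-involvement categories described in Mobileye's 2023 proposal.
MOBILEYE_MODES = {
    "hands-on/eyes-on":   DriverDemands(True,  True,  True),
    "hands-off/eyes-on":  DriverDemands(False, True,  True),
    "hands-off/eyes-off": DriverDemands(False, False, True),
    "no-driver":          DriverDemands(False, False, False),
}

for name, demands in MOBILEYE_MODES.items():
    print(f"{name:>18}: monitor road = {demands.eyes_on_road}, "
          f"driver required = {demands.driver_required}")
```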
Foundational Technologies
Perception and Sensing
Perception in vehicular automation encompasses the acquisition and interpretation of environmental data to enable safe navigation, object detection, and scene understanding. Sensors provide raw inputs such as visual imagery, depth measurements, and velocity estimates, which are processed through computer vision and signal analysis algorithms to identify lanes, pedestrians, vehicles, and traffic signals. This layer is critical, as inaccuracies in perception can propagate errors to planning and control modules, potentially leading to collisions.[11]

Cameras serve as primary visual sensors, capturing high-resolution RGB images for semantic tasks like traffic sign recognition and lane marking detection. Their strengths include cost-effectiveness, dense pixel data for texture and color analysis, and compatibility with deep learning models for classification. However, cameras perform poorly in low light, glare, and adverse weather such as rain or fog, where visibility drops significantly, necessitating computational corrections like image enhancement. Empirical evaluations show stereo camera systems achieving depth estimation at 0.5–3 m ranges at 25 frames per second, though intersection over union (IoU) metrics for segmentation hover around 40–70% in controlled tests.[12]

LiDAR (Light Detection and Ranging) sensors emit laser pulses to generate 3D point clouds, offering precise distance measurements up to 245 m with centimeter-level accuracy for obstacle mapping and free-space estimation. They perform reliably across day-night cycles and provide direct depth data independent of lighting. Limitations include high cost, sparse point density at longer ranges, degradation in precipitation (such as a 25% performance drop in fog or snow due to scattering), and a lack of inherent color or semantic information. Devices like the Velodyne VLP-32C deliver 32 vertical channels with a field of view from -25° to 15°, enabling detailed environmental reconstruction.[12]

Radar sensors use radio waves for long-range detection of position and relative velocity via the Doppler effect, excelling in all-weather scenarios including heavy rain or dust where optical sensors fail. They provide robust measurements for dynamic object tracking, with ranges extending to 250 m in automotive-grade units. Drawbacks include low angular resolution leading to poor shape discrimination, susceptibility to multipath reflections causing false positives (e.g., high rates at 5–7 m from metallic clutter), and limited object classification without fusion. Millimeter-wave radars operating at 79 GHz demonstrate consistent performance in velocity estimation but require integration with other modalities for refined localization.[12]

Auxiliary sensors like ultrasonic transducers complement the primary modalities for short-range tasks such as parking assistance, detecting obstacles within 5–10 m via acoustic echoes, though they lack directional precision. Inertial measurement units (IMUs) and GPS aid in estimating vehicle ego-motion to stabilize perception data against sensor noise.[12]

Sensor fusion integrates complementary data streams to mitigate individual limitations, yielding a unified environmental model with enhanced robustness. Early fusion merges raw signals, late fusion combines high-level detections, and deep fusion embeds modalities in neural networks for joint feature learning.
LiDAR-camera fusion, for instance, pairs geometric precision with visual semantics, boosting average precision (AP) in benchmarks like KITTI (e.g., 92.45% AP for bird's-eye-view car detection under easy conditions using methods like Painted PointRCNN). Such approaches reduce false negatives in occluded scenes and improve overall detection by 10–20% over monocular systems in empirical studies. For SAE Level 3 autonomy, multi-sensor suites commonly integrate more than 30 sensors, including LiDAR, millimeter-wave radars, high-definition cameras, and ultrasonic radars, to provide comprehensive environmental coverage.[13][14]

Perception systems face challenges from environmental variability and adversarial threats. Adverse conditions degrade camera and LiDAR efficacy, with radar providing a fallback at reduced resolution; fusion strategies adapt by weighting inputs dynamically. Security analyses reveal vulnerabilities, including camera overexposure from lasers causing misidentification of signs, LiDAR spoofing via infrared injections triggering phantom braking, and radar jamming that falsifies distances, as demonstrated in real-world tests on production vehicles. Ongoing advances emphasize multi-modal transformers and emergent sensors such as event cameras for high-dynamic-range capture, aiming for perception latencies under 100 ms in safety-critical applications.[15]
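As a concrete illustration of the late-fusion and dynamic-weighting ideas described above, independent range estimates from different sensors can be combined by inverse-variance weighting: a sensor degraded by current conditions (for example, a camera in fog) simply contributes a larger variance and is automatically down-weighted. A minimal sketch (Python; the numbers and function name are illustrative, not drawn from any production stack):

```python
def fuse_range_estimates(measurements):
    """Late fusion of independent range estimates by inverse-variance weighting.

    measurements: list of (range_m, variance_m2) pairs, one per sensor.
    A degraded sensor is passed with an inflated variance, so it is
    automatically down-weighted in the fused result.
    """
    weights = [1.0 / var for _, var in measurements]
    fused = sum(w * r for (r, _), w in zip(measurements, weights)) / sum(weights)
    fused_variance = 1.0 / sum(weights)
    return fused, fused_variance


# Illustrative numbers: LiDAR (precise), radar (coarser), fog-degraded camera.
lidar = (42.1, 0.05 ** 2)
radar = (41.6, 0.50 ** 2)
camera = (44.0, 3.00 ** 2)
print(fuse_range_estimates([lidar, radar, camera]))  # dominated by the LiDAR estimate
```

Real systems fuse full object tracks rather than single range readings, but the principle of weighting each modality by its current reliability is the same one described above.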
Localization and Mapping
Localization and mapping enable autonomous vehicles to estimate their pose relative to the environment and to construct or reference representations of the surroundings for safe navigation. Localization determines the vehicle's position, orientation, and velocity, often fusing data from global navigation satellite systems (GNSS), inertial measurement units (IMUs), and exteroceptive sensors like LiDAR and cameras to achieve the centimeter-level accuracy required for Level 3+ autonomy. High-precision maps and positioning systems are core to Level 3 conditional automation.[16][17] Mapping involves generating spatial models, either online from real-time sensor data or offline using high-definition (HD) maps, which encode lane geometries, traffic signs, and other static features to constrain localization errors.[17]

Simultaneous localization and mapping (SLAM) is a foundational probabilistic framework that iteratively refines the vehicle pose and the environmental map by minimizing discrepancies between sensor observations and predictions, addressing the "chicken-and-egg" problem of needing a map to localize and localization to build a map.[18] LiDAR-based SLAM dominates due to its dense 3D point clouds, enabling feature extraction such as pole matching for loop closure, though computational demands limit real-time use without GPU acceleration; variants like LiDAR-IMU fusion mitigate drift by leveraging the IMU's high-frequency data for motion compensation during LiDAR scan gaps.[19] Visual SLAM, relying on camera-derived keypoints, offers cost advantages but suffers in textureless scenes and under lighting variations, prompting hybrid approaches that integrate radar for adverse-weather resilience.[20]

GNSS-IMU integration provides baseline global localization, with differential GNSS achieving sub-meter precision in open areas, but urban multipath errors and signal outages necessitate map-matching techniques that align vehicle sensor data to precomputed HD maps via particle filters or iterative closest point algorithms.[21] Recent advancements include AI-enhanced SLAM with graph neural networks for dynamic object rejection, and lifelong mapping that updates maps incrementally across sessions, reducing storage needs while handling environmental changes like construction.[22]

Challenges persist in GPS-denied environments, where IMU drift accumulates at rates of 0.1–1 m/s without corrections, and sensor fusion demands rigorous calibration to avoid pose inconsistencies exceeding 10 cm, which can cause planning failures. Solutions such as multi-sensor fusion frameworks, validated in highway scenarios, fuse LiDAR point-to-map distances with GNSS pseudoranges, yielding errors under 5 cm in 95% of cases even during brief outages.[23] Offline HD maps built from fleet crowdsourcing enhance reliability but require secure updates to counter adversarial attacks, underscoring the need for verifiable, low-latency localization that does not depend on cloud connectivity.[24]
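The GNSS-IMU integration described above is, at its simplest, a predict-correct loop: dead reckoning from inertial data grows the position uncertainty between satellite fixes, and each GNSS measurement pulls the estimate back. A minimal one-dimensional Kalman-filter sketch (Python; the numbers are illustrative, and real systems estimate full 3D pose with far richer models):

```python
def predict(x, P, velocity, dt, q):
    """Dead-reckoning step: propagate position using IMU/odometry-derived velocity."""
    x = x + velocity * dt   # constant-velocity motion model
    P = P + q               # process noise: uncertainty (drift) grows between fixes
    return x, P


def update(x, P, gnss_position, r):
    """Correction step: fuse a GNSS position fix with measurement variance r."""
    K = P / (P + r)                  # Kalman gain
    x = x + K * (gnss_position - x)  # pull the estimate toward the fix
    P = (1.0 - K) * P
    return x, P


# Dead-reckon for one second at 10 Hz, then apply a single GNSS fix.
x, P = 0.0, 0.1
for _ in range(10):
    x, P = predict(x, P, velocity=15.0, dt=0.1, q=0.02)  # uncertainty accumulates
x, P = update(x, P, gnss_position=15.3, r=0.25)          # the fix shrinks it again
print(round(x, 2), round(P, 3))
```

During a GNSS outage the update step is simply skipped, so the uncertainty keeps growing, which mirrors the drift accumulation noted above for GPS-denied environments.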
Planning, Decision-Making, and Control
In vehicular automation, planning, decision-making, and control constitute the deliberative layer that translates environmental perception and localization into executable vehicle maneuvers. High-compute chips and algorithms support the real-time decision-making, path planning, and behavior prediction essential for SAE Level 3 autonomy, while redundant designs support functional safety through duplicated perception, computation, and execution subsystems.[25] This subsystem operates hierarchically: high-level decision-making selects behaviors such as lane changing or overtaking, trajectory planning then generates feasible paths, and low-level control actuates steering, acceleration, and braking. Such architectures keep computation tractable by decomposing complex tasks, enabling real-time operation in dynamic environments.[26][27]

Decision-making involves assessing scenarios to choose optimal actions, often using rule-based systems, optimization methods, or machine learning. For instance, dynamic programming (DP) and quadratic programming (QP) frameworks evaluate global paths and local behaviors, prioritizing safety and efficiency in unstructured settings.[28] Reinforcement learning and neural networks approximate nonlinear decision functions, particularly for handling uncertainties such as pedestrian intent.[29] Ethical considerations, such as risk distribution among road users, are integrated via constrained optimization in some algorithms, though deployment remains limited by validation challenges.[30]

Trajectory planning generates collision-free paths consistent with the chosen behavior, divided into global planning for route optimization and local planning for immediate adjustments. Algorithms like A* and Dijkstra's compute shortest paths in known maps, while rapidly-exploring random trees (RRT) handle dynamic obstacles by sampling feasible configurations.[31] Lattice planners and model predictive formulations optimize trajectories over horizons of 5–10 seconds, incorporating vehicle kinematics and constraints such as curvature limits.[32] Hybrid approaches combine sampling with optimization to balance exploration and smoothness, as demonstrated in real-time planners that reduce computation to under 100 ms.[33]

Control executes the planned trajectory by modulating actuators, using feedback to track references amid disturbances. Proportional-integral-derivative (PID) controllers provide simple longitudinal speed regulation, but model predictive control (MPC) dominates for integrated lateral-longitudinal tasks because it can enforce constraints on states such as velocity bounds (e.g., 0–30 m/s) and steering angles (±30 degrees).[34] MPC solves optimization problems online, predicting future states via linearized bicycle models, with horizons of 10–20 steps at 10 Hz update rates, achieving lateral errors below 0.2 m in simulations.[35] Hybrid MPC-PID schemes further enhance robustness, as in cascaded architectures where MPC handles steering and PID manages throttle.[36] Real-world implementations on structured roads report tracking accuracies that improve safety metrics by minimizing deviations in high-speed merges.[37]
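The longitudinal PID regulation mentioned above can be shown in a few lines: the controller converts the speed error into an acceleration command, which is then clipped to actuator limits. A minimal sketch (Python; the gains, limits, and the crude point-mass vehicle model are illustrative assumptions, not tuned values from any production controller):

```python
class SpeedPID:
    """Minimal PID controller for longitudinal speed tracking."""

    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, target_speed, current_speed, dt):
        error = target_speed - current_speed
        self.integral += error * dt
        self.integral = max(min(self.integral, 5.0), -5.0)  # simple anti-windup clamp
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        # Output is an acceleration command (m/s^2).
        return self.kp * error + self.ki * self.integral + self.kd * derivative


# Track a 20 m/s reference from standstill with a crude point-mass vehicle model.
pid = SpeedPID(kp=0.8, ki=0.1, kd=0.05)
speed, dt = 0.0, 0.1
for _ in range(200):
    command = pid.step(target_speed=20.0, current_speed=speed, dt=dt)
    accel = max(min(command, 3.0), -6.0)  # clip to illustrative actuator limits
    speed += accel * dt
print(round(speed, 2))  # converges toward the 20 m/s reference
```

An MPC controller would instead solve a constrained optimization over a short prediction horizon at each step, which is what allows it to respect state constraints such as the velocity and steering bounds explicitly.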
Historical Development
Early Concepts and Prototypes (Pre-2000)
The earliest demonstrations of "driverless" vehicles in the 1920s relied on radio control rather than onboard autonomy, as exemplified by Francis P. Houdina's 1925 "American Wonder" automobile, which navigated 15 miles through New York City streets under remote guidance from a trailing vehicle, though it collided with a truck during the test.[38] Similar radio-controlled setups appeared in later demonstrations, but these systems lacked independent environmental perception or decision-making, depending instead on external human operators for real-time control.[39]

Mid-century concepts shifted toward infrastructure-guided automation to enable highway travel without constant human input. In 1939, General Motors' Futurama exhibit at the New York World's Fair showcased semi-autonomous vehicles using radio signals and embedded road magnets for steering and spacing, envisioning electrified highways with lead cars setting speeds up to 100 mph.[38] By the 1950s, RCA Laboratories developed embedded roadway detectors at test sites such as Lincoln, Nebraska, compatible with vehicles such as GM's 1956 Firebird II concept, which followed inductive guidance from buried wires or magnets at speeds up to 60 mph; these prototypes succeeded in controlled loops but required dedicated infrastructure, limiting scalability.[40]

Pioneering onboard sensing emerged in academic prototypes during the 1960s and 1970s. Stanford University's 1961 cart employed rudimentary computer vision to navigate lunar-like terrain, marking an early use of cameras for obstacle avoidance without external guidance.[38] In 1977, Japan's Tsukuba Mechanical Engineering Laboratory built a passenger vehicle that autonomously followed white lane markers at up to 20 mph using optical sensors, demonstrating basic vision-based tracking in structured environments but struggling with unstructured roads and poor visibility.[39]

The 1980s and 1990s saw more advanced prototypes integrating multiple sensors and onboard computing for partial autonomy in real-world conditions. Carnegie Mellon University's Navlab project began in 1986 with Navlab 1, a retrofitted van using cameras and early neural networks for road following at low speeds; by 1995, Navlab 5 achieved 98% autonomous steering over 2,850 miles from Pittsburgh to San Diego via vision and map-matching, though it required occasional human intervention for complex maneuvers.[38] Ernst Dickmanns at Germany's Bundeswehr University developed vision-based systems, culminating in the 1995 VaMP vehicle under the EUREKA PROMETHEUS project, which drove nearly 2,000 km autonomously at up to 80 mph on highways, including lane changes and traffic merging using real-time image processing, though performance degraded in adverse weather without redundant sensors.[40][39] DARPA's 1980s Autonomous Land Vehicle (ALV) initiative tested off-road prototypes with early lidar and stereo vision for obstacle detection at 2–5 mph in rough terrain, highlighting the computational limits of the era's hardware.[38] These efforts, often funded by government programs like PROMETHEUS (1987–1995), proved feasibility for highway and structured autonomy but revealed persistent challenges in generalization, sensor fusion, and safety under varied conditions, with no prototype achieving fully disengagement-free operation before 2000.[39]

Government-Led Initiatives (2000s)
In 2004, the U.S. Defense Advanced Research Projects Agency (DARPA) launched the Grand Challenge, a competition requiring autonomous vehicles to navigate a 132-mile off-road course in the Mojave Desert within 10 hours, aimed at accelerating technologies for unmanned ground vehicles with potential military applications.[41] No entrant completed the route, as vehicles struggled with obstacles like rocks and tunnels, highlighting the limitations of perception and decision-making algorithms at the time.[42] The event drew 107 teams, primarily from universities and research institutions, and offered a $1 million prize, fostering early advances in GPS-based navigation and sensor fusion despite the lack of a winner.[43]

DARPA followed with a second Grand Challenge in 2005 on a similar 132-mile desert course from Barstow to Primm, Nevada, where five vehicles finished; Stanford University's "Stanley", equipped with LIDAR, cameras, and radar, completed it in 6 hours and 53 minutes.[44] This success demonstrated feasible real-time obstacle avoidance and path planning in unstructured environments, with progress attributed to improved computing power and machine learning for terrain mapping.[45] The competition expanded participation to 195 teams, emphasizing government investment in dual-use technologies that later influenced civilian automation efforts.[41]

By 2007, DARPA's Urban Challenge shifted the focus to urban settings, tasking vehicles with navigating a 60-mile mock city course with moving traffic, traffic laws, and parking maneuvers; Carnegie Mellon University's "Boss" won the $2 million prize among 11 finalists from 89 entrants.[43] This initiative addressed complexities like intersection negotiation and vehicle-to-vehicle coordination, revealing gaps in handling dynamic human-driven traffic but validating integrated systems for rule-compliant autonomy.[45] Overall, these DARPA programs, funded under U.S. Department of Defense auspices, catalyzed over $100 million in related research by the decade's end, primarily through prizes rather than direct grants, prioritizing innovation over prescriptive development.[42] While European and Japanese governments pursued intelligent transport systems in the 2000s, such as Japan's ITS policy framework for vehicle-road integration, no comparably large-scale autonomous vehicle challenges emerged until the 2010s.[46]

Commercial Scaling and Milestones (2010s–Present)
In the early 2010s, regulatory frameworks emerged to enable public road testing of autonomous vehicles, marking the shift from controlled environments to commercial viability. Nevada enacted the first U.S. state law permitting autonomous vehicle operations in June 2011, issuing licenses to companies like Google for testing on public roads. Google's self-driving car project, later Waymo, logged over 140,000 autonomous miles by 2010 and expanded to multi-state testing by 2012, accumulating 4 million miles by 2015 through iterative software improvements and sensor refinements.[47] These efforts prioritized safety data over immediate revenue, with human safety drivers present, but laid the groundwork for scaling by validating perception and control systems in diverse conditions.

Tesla accelerated consumer-facing deployment with its Autopilot hardware suite, introduced in October 2014 on Model S vehicles to enable highway assist features via cameras and radar, followed by the Full Self-Driving (FSD) capability option in October 2016 promising urban autonomy upgrades via over-the-air updates. By March 2025, Tesla vehicles had driven 3.6 billion cumulative miles on FSD (supervised), leveraging fleet data for neural network training, though regulatory scrutiny persists due to incidents linking Autopilot to crashes, prompting NHTSA investigations.[48] Unlike geo-fenced services, Tesla's approach scales through widespread hardware distribution (over 6 million equipped vehicles by 2025), but the system remains Level 2 under SAE standards, requiring human supervision and facing delays in unsupervised rollout despite claims of impending robotaxi viability. Meanwhile, in December 2025, China's Ministry of Industry and Information Technology granted the first conditional Level 3 autonomous driving permits for passenger vehicles, allowing operations in designated areas such as Chongqing and Beijing under specific conditions including speed limits of 50 km/h.[49][50]

Waymo achieved the first sustained commercial robotaxi service with Waymo One, which began paid rides in Phoenix suburbs in December 2018 and opened fully driverless rides to the public in 2020, following an early rider program begun in 2017 that tested public acceptance.[51] By 2025, Waymo operated over 250,000 weekly paid trips across Phoenix, San Francisco, Los Angeles, Austin, and Atlanta, with fully driverless deployment in three cities by 2024, amassing billions of miles and demonstrating Level 4 autonomy in operational design domains (ODDs) such as urban arterials.[52] Expansion faced hurdles, including a 2024 NHTSA probe into 22 incidents, but empirical safety data shows Waymo vehicles with 85% fewer injury-causing crashes per million miles than human drivers.[53]

GM's Cruise pursued aggressive urban scaling, obtaining California's driverless permit in 2019 and launching fully driverless operations in San Francisco by 2022, but an October 2023 incident in which a Cruise vehicle struck and dragged a pedestrian 20 feet led to an operational suspension, a software recall of all 950 units, and layoffs.[54] GM restarted supervised testing in 2025, shifting focus to Super Cruise, a Level 2+ hands-free system used by over 500,000 drivers with zero reported crashes, highlighting the risks of rushed Level 4 deployment without robust remote oversight.[55][56]

Autonomous shuttles scaled via low-speed, geo-fenced pilots in controlled settings like campuses and airports.
French firms Navya and EasyMile led deployments, with EasyMile's EZ10 operating in over 30 countries by 2025 for last-mile transit, including a 2017 Singapore trial carrying 100,000 passengers.[57] Navya's Arma shuttles ran 23-month public services in Sweden complementing fixed-route buses, though the company filed for bankruptcy protection in January 2025 amid market saturation.[58][59] These trials validated multi-vehicle coordination but revealed limitations in adverse weather and pedestrian-heavy areas, with speeds capped at 20 km/h and safety attendants often required.

In trucking, Level 4 pilots advanced freight efficiency on highways. TuSimple completed an 80-mile fully autonomous run in Arizona in 2021, but ceased U.S. operations in 2023 due to governance issues.[60] Aurora achieved driverless hauls on Texas corridors in 2024-2025, targeting commercial launch with 20 unmanned trucks by late 2024 via partnerships with PACCAR and Volvo, focusing on hub-to-hub routes to reduce labor costs.[61][62] Embark Trucks tested cross-country autonomy but pivoted to software licensing after 2023 funding constraints, underscoring that while sensor fusion enables safe long-haul operation (Aurora's systems, for example, have logged millions of simulated miles), regulatory approval for driverless interstate trucking lags and remains confined to permitted corridors under FMCSA oversight.[63]

Overall, by 2025, commercial scaling remains niche: robotaxis like Waymo's serve millions of rides annually but comprise under 0.1% of U.S. miles driven, constrained by high costs ($100,000+ per vehicle) and ODD limitations.[5]

Applications in Ground Vehicles
Passenger Cars and Robotaxis
In passenger cars, automation primarily operates at SAE Level 2, requiring constant driver supervision for systems like adaptive cruise control combined with lane centering, which were present in approximately 68.6% of globally sold vehicles meeting at least Level 1 criteria in 2024, with Level 2 dominating adoption by 2025.[64][65] Higher levels, such as SAE Level 3, where the vehicle handles all dynamic driving tasks under specific conditions, remain limited to select models like Mercedes-Benz's Drive Pilot, approved for use in a few U.S. states by 2023 but not widely deployed by October 2025.[2] Tesla's Full Self-Driving (FSD) software, marketed as a supervised Level 2 system, saw iterative improvements from version 13 to 14 by late 2025, improving its handling of complex urban scenarios, though it still requires driver attention and has not achieved regulatory approval for unsupervised operation nationwide.[66]

Robotaxis, operating at SAE Level 4 for driverless service in geofenced areas, are led by Waymo, which by October 2025 maintained a fleet exceeding 1,500 vehicles providing fully autonomous rides in Phoenix, San Francisco, Los Angeles, and Austin, with over 71 million rider-only miles logged by March 2025.[67][68] Waymo's expansion includes plans for London in 2026 pending regulatory approval and testing in snowy conditions for East Coast U.S. cities, though it faces scrutiny from the NHTSA over incidents involving failure to yield to school buses.[69][70][67] Tesla aims to deploy unsupervised robotaxis in Arizona and other states by year-end 2025, starting with Model Y vehicles before introducing the Cybercab, but full autonomy remains unverified at scale.[71][72]

Safety data indicate that robotaxis outperform human drivers in controlled metrics; Waymo reports 96% fewer intersection crashes, 88% fewer injury crashes, and 92% fewer property damage claims per million miles compared to human benchmarks as of 2025.[68][73] Independent analyses confirm that top robotaxi operators, excluding paused services, achieved around 86,000 miles between disengagements in 2023 data, with California's DMV recording 880 autonomous vehicle collisions by October 17, 2025, though severity is often lower than in human-driven incidents.[74][75] GM's Cruise, once a competitor, suspended robotaxi operations in October 2023 following a pedestrian-dragging incident and ceased ride-hailing entirely by February 2025, highlighting regulatory and technical risks.[75][76][77]

Challenges persist in scaling beyond geofenced zones, with robotaxi market leaders like Waymo and emerging Chinese operators driving projected growth to $174 billion by 2045, but incidents underscore the need for robust handling of edge cases like emergency vehicles and adverse weather.[78][79] Overall, while empirical evidence supports superior safety in routine operations, full unsupervised deployment in passenger cars lags behind robotaxi pilots due to liability, mapping precision, and regulatory hurdles.[80][81]

Commercial Trucks and Freight
Autonomous trucks operating at SAE Level 4 for freight hauling primarily target long-haul and hub-to-hub routes on controlled highways to mitigate driver shortages, enable continuous operations, and improve fuel efficiency through optimized routing and reduced human error.[82][83] In 2025, the sector saw initial commercial deployments in the United States, with companies focusing on retrofitting existing Class 8 trucks with modular hardware and AI-driven software stacks for perception, planning, and control.[84] The global autonomous truck market, valued at approximately $41.4 billion in 2024, is projected to reach $139.5 billion by 2033, growing at a compound annual rate of 13-16%, driven by demand for scalable freight solutions amid labor constraints.[82]

Aurora Innovation launched the first commercial driverless freight service in the US on May 1, 2025, operating between Dallas and Houston, Texas, without human drivers and accumulating over 1,200 miles of unsupervised operation by that date.[85] The company expanded pilots with partners like Werner Enterprises, extending routes to a 1,000-mile corridor from Fort Worth to Phoenix by mid-2025, including night-time autonomous hauling for customers such as Hirschbach.[86][87] Similarly, Kodiak Robotics delivered its first factory-built autonomous truck in September 2025 to Atlas Energy Solutions and completed the initial customer-owned driverless deliveries in January 2025, transporting 100 loads of proppant across Texas routes.[88][89] These deployments use hub-to-hub models, in which trucks operate autonomously between freight terminals on interstate highways, minimizing exposure to urban complexities.[90]

Safety analyses indicate autonomous trucks exhibit lower crash risks than human-driven vehicles in comparable scenarios, with peer-reviewed studies reporting 0.457 times the risk for rear-end collisions and 0.171 times for broadside impacts relative to human-operated trucks.[80] This stems from consistent adherence to speed limits, reduced fatigue-related errors, and advanced sensor fusion enabling proactive hazard avoidance, though real-world data remain limited to pilot programs whose cumulative mileage exceeds millions of miles across fleets.[91] Operational efficiencies include potential 24/7 freight movement and 10-15% fuel savings from platooning and aerodynamic optimization in convoy formations.[92]

Regulatory progress supports scaling, with the US Federal Motor Carrier Safety Administration granting Aurora a renewable three-month waiver in 2025 for cabless operations, alongside proposed federal legislation like the America Drives Act to standardize interstate rules by 2027.[93][94] In Europe, the EU's type-approval framework enabled Einride's first Level 4 heavy-duty truck operation on public roads in Belgium in September 2025, focusing on cross-border testbeds starting in 2026.[95][96] Challenges persist in state-level variances in the US and in validation for adverse weather, but empirical pilot data underscore viability for freight corridors, with broader adoption projected by 2030 contingent on accrued safety miles and policy harmonization.[97]

Public Transit Systems (Buses, Shuttles, Trains)
Autonomous shuttles and buses in public transit primarily operate at SAE Level 4 automation within geofenced areas, enabling driverless operation on predefined routes such as campuses, airports, and urban loops, though widespread commercial deployment remains limited to pilots as of 2025.[59] These systems leverage LiDAR, cameras, and GPS for perception and navigation, often at low speeds under 25 km/h to mitigate risks in mixed traffic.[98] In July 2025, Jacksonville, Florida, launched the first fully autonomous public transit shuttle service in the US, deploying 14 electric vehicles along a 3.5-mile downtown route with 12 stations, operating weekdays without onboard operators.[99] Similarly, Hamburg, Germany, initiated an on-demand shuttle pilot in 2024 using 120 electric vehicles to connect underserved areas, demonstrating potential for flexible first- and last-mile connectivity.[100]
For larger buses, automation focuses on retrofitting existing fleets for tasks like yard operations or supervised highway segments, with full driverless public routes still experimental. The US Federal Transit Administration's demonstration projects, ongoing since 2024, test automated docking and maneuvering in controlled environments to reduce labor costs and improve efficiency.[101] In September 2025, ADASTEC partnered with Beep to deploy autonomous buses in US cities, aiming for scalable operations beyond pilots by integrating advanced sensor fusion for urban navigation.[102] The global autonomous bus market is projected to grow by USD 2,877 million from 2025 to 2029 at a 22.4% CAGR, driven by demand in Europe and Asia for emission-free, 24/7 service, though high upfront costs for sensors and mapping limit adoption.[103]
Driverless trains, operating under Grades of Automation (GoA) 3 or 4, have achieved higher maturity in metro systems, with communications-based train control (CBTC) enabling fully unattended operation since the late 1990s. Examples include Paris Metro Line 14, automated since 1998, and expansions in cities like Singapore and Copenhagen, where such systems report incident rates below 0.1 per million train-km due to redundant fail-safes and real-time monitoring.[104] These rail applications prioritize safety through fixed infrastructure and signaling, contrasting with road vehicles' dynamic environments, and have facilitated capacity increases of up to 30% via shorter headways.[105] However, transitioning mainline railways to unattended operations faces hurdles like legacy signaling integration and remote oversight needs.[106]
Challenges across these systems include regulatory fragmentation, with approvals confined to low-risk zones, and public trust issues stemming from rare but publicized incidents in early trials.[107] Infrastructure demands, such as precise geofencing and V2I communication, add costs estimated at 20-50% above conventional vehicles, while algorithmic brittleness in adverse weather persists despite advancements.[108] Empirical data from pilots indicate operational uptime exceeding 95% for shuttles but highlight vulnerability to cyber threats and the need for hybrid human oversight in dense urban settings.[109] Despite these, automation promises labor savings (potentially reducing crew needs by 50% in buses) and enhanced reliability, contingent on standardized safety validations like ANSI/UL 4600 for system-level assurance.[110]
Applications in Aerial Vehicles
Delivery and Surveillance Drones
Autonomous delivery drones represent a subset of unmanned aerial vehicles (UAVs) designed for last-mile logistics, leveraging sensors such as LiDAR, GPS, and computer vision for navigation and obstacle avoidance. Companies like Zipline have achieved significant scale, surpassing 100 million commercial autonomous miles flown by March 10, 2025, primarily for medical supply deliveries in regions with limited infrastructure.[111] Wing, a subsidiary of Alphabet, completed over 450,000 deliveries by 2023, focusing on retail packages in suburban areas using beyond visual line of sight (BVLOS) operations approved under FAA frameworks.[112] The global delivery drones market, valued at USD 1.08 billion in 2025, is projected to reach USD 4.40 billion by 2030, driven by advancements in battery life and payload capacities up to 5-10 kg for most commercial models.[113]
Regulatory progress has enabled wider deployment, with the FAA's August 2025 rulemaking allowing drones up to 1,320 pounds to operate BVLOS at altitudes below 400 feet, provided they incorporate detect-and-avoid systems and remote identification.[114] This builds on Part 135 certifications for package delivery, which require operators to obtain waivers for autonomous flights beyond visual range, addressing risks like mid-air collisions through redundant fail-safes.[115] However, challenges persist in urban environments, including signal interference from buildings, limited battery endurance restricting ranges to 10-20 km per flight, and vulnerability to weather conditions like high winds exceeding 20 mph.[116] Path planning algorithms, often based on deep reinforcement learning, struggle with dynamic obstacles such as birds or other aircraft, necessitating hybrid human oversight in current Level 4 autonomy implementations.[117]
Surveillance drones, employed in military and civilian contexts, utilize similar automation for persistent monitoring, with AI enabling real-time data analysis and target tracking without continuous human input. In military applications, systems like those integrated into U.S. Army operations feature autonomous swarming capabilities, where drones adapt to contested environments by processing sensor feeds on-board to evade jamming or identify threats.[118] Civilian uses include law enforcement for crowd monitoring and search-and-rescue, with UAVs equipped for thermal imaging and facial recognition, though autonomy levels typically remain at 3-4, requiring ground station confirmation for actions like loitering patterns.[119] Developments emphasize cost reductions, as autonomous operations minimize personnel needs compared to manned flights, but raise concerns over privacy intrusions and erroneous identifications from algorithmic biases in low-light or cluttered scenes.[120] Technical limitations in surveillance include detection vulnerabilities, as small drones evade traditional radar, prompting investments in counter-UAV technologies like RF spectrum analysis.[121]
For both delivery and surveillance, adversarial threats such as GPS spoofing or cyber intrusions compromise navigation integrity, with studies highlighting the need for encrypted communications and multi-sensor fusion to maintain reliability in non-cooperative airspace.[122] Empirical data from deployments indicate safety improvements, with incident rates below 1 per 100,000 flights in controlled tests, yet scalability hinges on resolving payload constraints and extending operational autonomy amid evolving regulations.[123]
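The operating limits described above translate naturally into automated pre-flight checks. The following minimal sketch applies the thresholds cited in this section (a 1,320-pound weight ceiling, flight below 400 feet, detect-and-avoid, remote identification, and a 20 mph wind limit); the class, field, and function names are hypothetical illustrations rather than any FAA or vendor API.

```python
from dataclasses import dataclass

# Illustrative thresholds drawn from the limits described above; this is not
# an implementation of FAA rule text or of any operator's actual software.
MAX_WEIGHT_LB = 1320
MAX_ALTITUDE_FT = 400

@dataclass
class MissionPlan:
    drone_weight_lb: float
    cruise_altitude_ft: float
    has_detect_and_avoid: bool
    has_remote_id: bool
    max_wind_mph: float  # forecast gusts along the planned route

def bvlos_preflight_issues(plan: MissionPlan, wind_limit_mph: float = 20.0) -> list[str]:
    """Return the reasons a BVLOS mission should not launch (empty list = go)."""
    issues = []
    if plan.drone_weight_lb > MAX_WEIGHT_LB:
        issues.append("aircraft exceeds weight ceiling")
    if plan.cruise_altitude_ft >= MAX_ALTITUDE_FT:
        issues.append("cruise altitude not below 400 ft")
    if not plan.has_detect_and_avoid:
        issues.append("detect-and-avoid system missing")
    if not plan.has_remote_id:
        issues.append("remote identification missing")
    if plan.max_wind_mph > wind_limit_mph:
        issues.append("forecast winds exceed operating limit")
    return issues

plan = MissionPlan(55.0, 350.0, True, True, max_wind_mph=25.0)
print(bvlos_preflight_issues(plan))  # ['forecast winds exceed operating limit']
```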
Larger Autonomous Aircraft
Autonomous operations in larger fixed-wing aircraft, typically those capable of carrying significant cargo payloads or passengers beyond small unmanned aerial vehicles, have advanced primarily in military and cargo domains due to regulatory and safety constraints on civilian passenger flights. Companies like Reliable Robotics have demonstrated fully autonomous flight in Cessna 208 Cargomaster aircraft, including takeoff, navigation, and landing without human intervention, culminating in a U.S. Air Force contract awarded in September 2025 for a year-long operational test deploying the system for cargo resupply missions.[124] Similarly, Xwing has conducted over 700 autonomous flights in Cessna Caravan aircraft since 2020, focusing on cargo routes with remote pilots transitioning to full autonomy, emphasizing redundancy in sensors like LIDAR, radar, and cameras to handle diverse weather conditions.[125]
Cargo applications leverage existing airframes retrofitted with autonomy kits, reducing pilot requirements and enabling operations in contested or remote areas. For instance, Aurora Flight Sciences, a Boeing subsidiary, has tested autonomous systems on larger platforms like the MQ-28 Ghost Bat for collaborative combat and cargo roles, integrating AI for real-time decision-making in swarming configurations.[126] These systems prioritize fault-tolerant architectures, where multiple flight computers cross-verify data from electro-optical/infrared sensors and GPS-denied navigation, achieving mean time between failures exceeding 10⁹ hours in simulations validated against empirical flight data. Progress in electric or hybrid propulsion for larger cargo variants, such as Skyways' unmanned systems under a $37 million U.S. Air Force contract in June 2025, aims to scale payload capacities to several tons while minimizing operational costs by 50-70% compared to piloted equivalents.[127]
Efforts toward autonomous passenger airliners remain in experimental phases, constrained by certification standards from bodies like the FAA and EASA, which demand probabilistic safety levels orders of magnitude higher than cargo operations. Airbus demonstrated fully autonomous taxiing, takeoff, and landing on an A350-1000 in October 2024 using the Iron Bird simulator and flight tests, incorporating vision-based systems for runway alignment with sub-meter precision.[128] Boeing has pursued similar autonomy via its HorizonX initiative, achieving initial unmanned flights in subscale models by 2025, but full-scale commercial deployment faces hurdles in human-machine teaming, such as single-pilot operations, which had been projected for the 2030s under relaxed EASA guidelines before the regulator pulled back amid public safety concerns.[129] These advancements rely on machine learning models trained on billions of flight hours, yet real-world integration requires overcoming edge cases like bird strikes or sensor occlusion, with no operational passenger services certified as of 2025.[130]
Applications in Water and Submersible Vehicles
Surface Autonomous Watercraft
Surface autonomous watercraft, also known as unmanned surface vehicles (USVs) or autonomous surface vessels (ASVs), are crewless boats or ships capable of operating on water surfaces using onboard sensors, navigation systems, and algorithms for tasks ranging from surveillance to data collection.[131] These systems typically employ levels of autonomy that include remote control, semi-autonomous waypoint following, and limited obstacle avoidance, but no USV achieves full autonomy without human oversight for mission planning, intervention, or contingency handling.[132] Developments in USVs date back over 25 years, with early milestones including ocean crossings and mine detection surveys by the mid-1990s, evolving into multi-week endurance missions by the 2020s.[133]
In military applications, USVs support reconnaissance, mine countermeasures, and tactical operations, reducing risks to personnel in contested waters. The U.S. Navy's Large Unmanned Surface Vessel (LUSV) program achieved a 720-hour continuous diesel generator power demonstration in December 2023, validating engine resilience for extended unmanned deployments, followed by a similar engine system test in December 2024.[134][135] DARPA christened the USX-1 Defiant, a prototype autonomous warship, in August 2025, designed to integrate with manned fleets for distributed lethality.[136] The U.S. Marine Corps adopted the Quickfish Interceptor USV, a high-speed vessel for multi-week missions including surveillance and interdiction, after Pacific exercises in October 2025.[137]
Commercial and scientific uses leverage USVs for oceanographic monitoring, fisheries assessment, and infrastructure inspection, where smaller vessels under 8 meters enable cost-effective, persistent data gathering in hazardous areas.[138] Examples include deployments for environmental sampling and cargo logistics trials, minimizing manned exposure to rough seas.[139] In intelligent waterborne transport, prototypes explore autonomous ferries and patrol boats, though scalability remains limited by regulatory and integration hurdles.[140]
Operational challenges include ensuring collision avoidance in mixed human-USV traffic, where algorithmic limitations in dynamic environments pose safety risks, as evidenced by studies highlighting procedural gaps and the need for standardized protocols.[141] Cybersecurity vulnerabilities and reliance on GPS for navigation further complicate reliability, prompting ongoing U.S. Coast Guard assessments of regulatory frameworks as of August 2024.[142] Despite these, empirical tests demonstrate USVs' potential for efficiency gains in data-intensive tasks over traditional crewed vessels.[143]
Underwater Autonomous Vehicles
Autonomous underwater vehicles (AUVs), also known as unmanned underwater vehicles (UUVs) in some contexts, are self-propelled, untethered submersibles that operate independently without real-time human control, using onboard batteries, sensors, propulsion systems, and pre-programmed or adaptive algorithms to navigate, sense environments, and perform missions.[144] Unlike remotely operated vehicles (ROVs), AUVs function beyond surface communication range, relying on acoustic modems or inertial/dead-reckoning navigation for positioning, which limits data transmission to low-bandwidth bursts.[145] Initial development occurred in the 1960s, with the SPURV (Special Purpose Underwater Research Vehicle) prototype tested by the University of Washington, marking the first documented AUV capable of programmed dives to 1,000 meters.[133] By the 1980s, specialized deep-sea AUVs emerged for military and scientific use, evolving into systems operable at depths up to 6,000 meters with mission durations from hours to months.[146]
AUVs find primary applications in oceanographic research, where they map seafloors, monitor marine ecosystems, and collect water column data without surface vessel dependency; for instance, NOAA deploys AUVs for deep-sea exploration to image hydrothermal vents and biological communities at resolutions unattainable by manned submersibles.[144] In the oil and gas sector, survey-class AUVs conduct pipeline inspections and seabed mapping, with companies like C&C Technologies operating commercial units since the early 2000s for high-resolution multibeam sonar surveys covering thousands of square kilometers.[147] Military applications include mine countermeasures, anti-submarine warfare, and intelligence gathering; Boeing's Echo Voyager, an extra-large AUV exceeding 15 meters in length developed for U.S. Navy applications, demonstrated transoceanic autonomy in 2019, carrying payloads for persistent surveillance over weeks.[148] Fisheries management benefits from AUVs tracking fish stocks via acoustic sensors, as tested in programs enabling untethered operations for biomass estimation without disturbing habitats.[149]
Recent advancements have extended AUV capabilities through AI-driven autonomy, enhanced battery endurance (e.g., lithium-polymer cells supporting 24+ hour missions), and miniaturized sensors for simultaneous bathymetry, chemical sampling, and imaging.[150] The Chinese Wukong AUV, developed by Harbin Engineering University, achieved a full-ocean-depth dive record of over 10,000 meters in 2023, integrating hybrid propulsion for energy-efficient gliding.[151] Navigation improvements, such as Doppler velocity logs fused with terrain-aided inertial systems, mitigate GPS unavailability underwater, enabling drift-corrected positioning errors below 1% of distance traveled in trials.[152] However, persistent challenges include acoustic communication latency (up to seconds per kilobyte), biofouling on hulls reducing hydrodynamic efficiency by 20-30% over multi-week deployments, and power constraints limiting payload capacity to 10-20% of vehicle mass.[145] High development costs, exceeding $1 million per unit for advanced models, restrict scaling, though modular designs are reducing this barrier for commercial adoption.[153]
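The dead-reckoning navigation described above can be sketched in a few lines: body-frame velocity samples (as a Doppler velocity log would report) are integrated into a track, and the positioning error budget is expressed as a fraction of distance traveled, using the roughly 1% figure cited above. This is a simplified illustration, not a model of any particular vehicle's navigation stack.

```python
import numpy as np

def dead_reckon(velocities_mps, dt_s, drift_fraction=0.01):
    """Integrate 2D velocity samples (east/north, m/s) into a track by dead reckoning.

    drift_fraction is the navigation-error budget as a fraction of distance
    traveled; the ~1% figure cited above is used as the default. Returns the
    final position estimate and the corresponding error radius in metres.
    """
    velocities = np.asarray(velocities_mps, dtype=float)
    positions = np.cumsum(velocities * dt_s, axis=0)  # simple Euler integration
    distance = np.sum(np.linalg.norm(velocities, axis=1)) * dt_s
    return positions[-1], drift_fraction * distance

# Hypothetical two-hour transect at roughly 1.5 m/s, heading mostly east.
rng = np.random.default_rng(0)
velocities = np.column_stack([1.5 + 0.1 * rng.standard_normal(7200),
                              0.1 * rng.standard_normal(7200)])
final_position, error_radius = dead_reckon(velocities, dt_s=1.0)
print(f"position estimate {final_position.round(1)} m, error radius ≈ {error_radius:.0f} m")
```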
Empirical Achievements and Benefits
Safety Improvements from Real-World Data
Real-world deployments of autonomous driving systems (ADS) have demonstrated crash rates per mile that are often lower than those of human drivers in comparable scenarios, based on police-reported and self-logged data. For instance, Waymo's rider-only operations across multiple cities recorded a police-reported crash rate of 2.1 incidents per million miles (IPMM), compared to a human benchmark of 4.68 IPMM derived from similar urban rideshare environments, representing a 55% reduction.[154][155] This benchmark accounts for factors like location, time of day, and vehicle type to ensure comparability. Similarly, Waymo data indicate an 85% lower rate of injury-involved crashes (0.41 IPMM) relative to human drivers (2.8 IPMM).[156]
Tesla's vehicle safety reports, drawing from billions of accumulated miles, show Autopilot-engaged driving achieving one crash per 6.36 million miles in Q3 2025, versus a U.S. national average of approximately one crash per 670,000 miles for human-driven vehicles without advanced driver assistance systems (ADAS).[157][158] This equates to roughly nine times fewer crashes per mile when Autopilot is active, though the data aggregates supervised use and relies on self-reported metrics without independent verification of causation in all cases. Full Self-Driving (Supervised) features contributed to over 1.3 billion miles driven in the same quarter, with crash frequency aligning with or exceeding Autopilot's safety margins.[159]
Independent analyses reinforce these trends for select systems. A Swiss Re study of Waymo's operations found up to 92% fewer liability claims per mile than human-driven vehicles equipped with advanced safety features, attributing gains to consistent adherence to traffic rules and reduced human error factors like distraction or impairment.[160] NHTSA-mandated reporting under its Standing General Order captures over 3,900 ADS-involved incidents from 2019 to mid-2024, but per-mile rates remain favorable in scaled operations; for example, Waymo's 56.7 million autonomous miles in 2024 yielded crash reductions across nearly all severity categories relative to human baselines.[161][73]
| Autonomous System | Crash Rate (per Million Miles) | Human Benchmark (per Million Miles) | Reduction |
|---|---|---|---|
| Waymo ADS (all crashes) | 2.1 | 4.68 | 55% |
| Waymo ADS (injury crashes) | 0.41 | 2.8 | 85% |
| Tesla Autopilot | 0.157 (1 per 6.36M) | ~1.49 (national avg.) | ~89% |
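The reduction figures in the table follow directly from the per-million-mile rates quoted above; the short arithmetic check below uses only the rates from the cited sources.

```python
def percent_reduction(av_rate_per_million: float, human_rate_per_million: float) -> float:
    """Percent reduction in crash rate relative to the human benchmark."""
    return 100.0 * (1.0 - av_rate_per_million / human_rate_per_million)

print(f"Waymo, all crashes:    {percent_reduction(2.1, 4.68):.0f}%")           # ~55%
print(f"Waymo, injury crashes: {percent_reduction(0.41, 2.8):.0f}%")           # ~85%
print(f"Tesla Autopilot:       {percent_reduction(1 / 6.36, 1 / 0.67):.0f}%")  # ~89%
```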
Economic and Operational Efficiencies
Autonomous vehicles reduce operating costs by eliminating driver labor, which comprises about 43% of per-mile expenses in trucking operations.[164] Early deployments and models project total cost of ownership savings, with one analysis estimating monthly reductions of USD 2,399 for a 1-ton truck, USD 2,891 for a 5-ton truck, and USD 3,438 for a 12-ton truck through optimized routing and reduced idle time.[165] In fleet contexts, strategic management of automated electric vehicles has demonstrated up to 40% smaller fleet sizes and 70% less unnecessary cruising mileage, directly lowering capital and fuel expenditures.[166]
Fuel and energy efficiencies arise from algorithmic control enabling smoother acceleration, consistent speeds, and minimized idling, yielding 6-10% consumption reductions in tested scenarios.[167] For autonomous trucks, platooning and load-optimized driving deliver 13-32% net energy savings per loaded mile relative to human-operated equivalents, based on simulations validated against real highway data.[168] These gains compound in high-utilization environments, where vehicles operate 24/7 without fatigue limits, increasing effective capacity by extending operational hours beyond human constraints.[169]
Broader economic models forecast network-wide savings, such as 29-40% lower transportation costs from reduced accidents, congestion, and inefficiencies, potentially equating to USD 936 billion annually in the United States.[170][171] In public transit applications like autonomous shuttles, per-passenger-mile costs drop due to higher load factors and precise scheduling, though full-scale empirical validation remains limited to pilot programs as of 2025.[96] Such efficiencies hinge on scalable sensor fusion and mapping, with real-world trucking trials confirming viability in hub-to-hub corridors but highlighting upfront hardware costs as a counterbalance to long-term gains.[172]
Environmental and Sustainability Gains
Autonomous truck platooning, enabled by vehicular automation, has demonstrated fuel savings of 4% to 10% in empirical field tests and simulations, primarily through reduced aerodynamic drag when vehicles maintain tight formations.[173][174] For instance, experiments with heavy-duty trucks at close gaps of 6 meters showed lead vehicles achieving 4.3% savings and trailing vehicles up to 10%, validated via computational fluid dynamics models against real-world data.[175] These operational efficiencies translate to lower greenhouse gas emissions during freight transport, a sector responsible for significant diesel consumption, though gains diminish with larger platoons beyond five vehicles due to control complexities.[176]
In passenger vehicles, automation facilitates eco-driving behaviors such as optimized acceleration, deceleration, and routing, yielding emission reductions in controlled studies. Peer-reviewed models indicate potential decreases in CO2 emissions by up to 21% from improved fuel economy in internal combustion engine fleets, though real-world deployments remain limited to pilots showing modest 5-7% efficiency improvements via smoother traffic flow and reduced idling.[177][178] Integration with electric powertrains amplifies these benefits; simulations of widespread autonomous electric vehicle adoption project annual CO2 savings exceeding 5 megatons in urban settings, equivalent to 30% of light-duty vehicle emissions, by minimizing energy waste in propulsion and enabling shared fleet utilization that curtails total vehicle production needs.[179]
System-level sustainability gains arise from congestion mitigation and vehicle sharing, where automation reduces empty miles and promotes higher occupancy rates. Empirical analyses of automated shuttles and buses report 10-15% lower per-passenger emissions compared to conventional counterparts, attributed to precise scheduling and platooning in transit corridors.[180] However, these benefits hinge on high automation penetration rates above 50% to overcome mixed-traffic inefficiencies, with low-penetration scenarios sometimes increasing local pollutants like NOx due to erratic merging behaviors in cautious algorithms.[181] Overall, while manufacturing demands for sensors elevate upfront embodied emissions, lifecycle assessments confirm net environmental positives from operational optimizations when rebound effects, such as induced vehicle miles traveled, are constrained by policy.[182]
Technical Challenges and Limitations
Sensor and Algorithmic Constraints
Sensors in autonomous vehicles, including LIDAR, radar, and cameras, exhibit degraded performance under adverse weather conditions such as rain, fog, and snow, which scatter or attenuate signals and reduce detection reliability. LIDAR, reliant on laser pulses for 3D mapping, suffers from backscattering in fog and heavy precipitation, limiting range to under 10-20 meters in dense conditions compared to over 100 meters in clear weather.[183] Cameras experience image blur, glare, and diminished contrast in rain or low light, impairing object classification and semantic segmentation accuracy by up to 50% in simulated severe weather tests.[184] Radar maintains better penetration through precipitation but generates noisy data with angular resolution limited to around 1 degree, leading to frequent false positives from multipath reflections off wet surfaces or vehicles.[185]
Efforts to fuse sensor data for redundancy encounter algorithmic hurdles, as synchronization errors and mismatched modalities can propagate uncertainties, particularly when one sensor dominates degraded inputs, resulting in incomplete environmental models. For example, in heavy rain, radar-LIDAR fusion may overlook small obstacles due to LIDAR dropouts, while camera inputs fail to calibrate precisely against radar velocity estimates.[186] These limitations persist despite multi-sensor suites, as real-time fusion requires resolving temporal offsets and noise models that scale poorly with increasing data volume, constraining deployment in regions with frequent inclement weather.[183]
Perception algorithms, often based on convolutional neural networks, struggle with generalization to edge cases like atypical occlusions or adversarial perturbations, where models trained on clear-weather datasets exhibit error rates exceeding 20% in unseen scenarios.[187] Prediction modules falter in modeling human driver behaviors under uncertainty, relying on probabilistic models like Kalman filters or recurrent networks that underestimate rare multi-agent interactions, such as sudden merges in dense traffic.[188] Path planning faces NP-hard optimization challenges in dynamic environments, with sampling-based methods like RRT* or lattice planners requiring approximations that sacrifice optimality for real-time feasibility under 100 ms latency, often leading to conservative trajectories vulnerable to rear-end risks.[32]
Real-world deployments highlight these constraints: the March 2018 Uber autonomous vehicle crash in Tempe, Arizona, involved sensors detecting a pedestrian but an algorithmic failure in the emergency braking classifier, which did not classify the object as an imminent threat despite 1.3 seconds of visibility.[189] Similarly, Tesla Autopilot incidents, including a 2016 fatality, stemmed from camera-based perception misinterpreting a tractor-trailer as overhead signage due to lighting contrasts, underscoring algorithmic brittleness to contextual cues absent in training data.[190] Cruise vehicle interventions in San Francisco fog events in 2022 revealed planning errors where fused sensor data prompted hesitant maneuvers, increasing collision probabilities in low-visibility merges.[191] These cases demonstrate that while simulations aid testing, they inadequately capture causal chains from sensor noise to decision lapses in uncontrolled settings.[192]
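The down-weighting of a degraded sensor described above can be illustrated with inverse-variance fusion, the simplest probabilistic form of the multi-sensor fusion used in perception stacks. The ranges and noise figures below are invented for illustration and are not drawn from any deployed system.

```python
import numpy as np

def fuse_measurements(means, variances):
    """Inverse-variance weighted fusion of independent range estimates.

    Each sensor contributes in proportion to its confidence (1 / variance),
    so a degraded sensor (large variance) is automatically down-weighted.
    """
    means = np.asarray(means, dtype=float)
    weights = 1.0 / np.asarray(variances, dtype=float)
    fused_mean = np.sum(weights * means) / np.sum(weights)
    fused_variance = 1.0 / np.sum(weights)
    return fused_mean, fused_variance

# Hypothetical range-to-obstacle estimates (metres) from LIDAR and radar.
clear = fuse_measurements(means=[42.0, 43.5], variances=[0.04, 1.0])
foggy = fuse_measurements(means=[42.0, 43.5], variances=[25.0, 1.0])  # LIDAR noise inflated by fog

print(f"clear weather: {clear[0]:.2f} m (variance {clear[1]:.3f})")  # dominated by LIDAR
print(f"dense fog:     {foggy[0]:.2f} m (variance {foggy[1]:.3f})")  # leans on radar instead
```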
Cybersecurity and System Reliability
Autonomous vehicles rely on interconnected electronic control units, sensors, and over-the-air update mechanisms, introducing cybersecurity vulnerabilities such as remote code execution and sensor data manipulation. Demonstrations at Pwn2Own Automotive 2024 revealed multiple zero-day exploits in Tesla vehicle systems, including modem compromises that could enable unauthorized control, with researchers earning over $700,000 in bounties for 24 vulnerabilities across automotive targets.[193] These exploits highlight risks in vehicle-to-everything (V2X) communications and infotainment systems, where attackers could spoof signals to induce erroneous decisions, though real-world deployments often incorporate isolation layers to mitigate such threats.[194]
Standards like SAE J3061 establish a lifecycle framework for cybersecurity engineering in cyber-physical vehicle systems, emphasizing threat modeling, risk assessment, and secure design from concept through production and maintenance.[195] The National Institute of Standards and Technology (NIST) supports this through testbeds for evaluating adversarial machine learning attacks on perception systems, aiming to quantify robustness against perturbed inputs like falsified lidar or camera data.[196] In the automotive sector, ransomware incidents targeting supply chains exceeded 100 in 2024, alongside over 200 data breaches, underscoring the need for segmented networks and encryption to prevent cascading failures in fleet operations.[197]
System reliability in autonomous vehicles centers on achieving fault-tolerant architectures to handle software bugs, hardware degradations, and environmental interferences, with software failures comprising the majority of potential incidents due to their prevalence over mechanical faults.[198] Empirical disengagement data from California testing reports indicate variability: Waymo systems averaged approximately 13,000 miles per intervention in recent years, reflecting higher reliability in mapped urban environments, while Cruise and Tesla Full Self-Driving exhibited more frequent human takeovers, often due to edge-case perception errors.[199][200] Early analyses of autonomous prototypes showed accident rates 15 to 4,000 times higher per mile than human drivers, attributable to immature handling of rare scenarios, though scaled deployments like Waymo's have logged tens of millions of autonomous miles with incident rates below human benchmarks in controlled conditions.[201]
Redundancy strategies, including diverse sensor fusion and failover computing clusters, target failure probabilities below 10⁻⁹ per hour for safety-critical functions, akin to aviation standards, to ensure graceful degradation rather than catastrophic loss.[202] Sensor failures, such as lidar occlusion or camera fogging, can propagate errors through perception algorithms, necessitating probabilistic validation and runtime monitoring to predict and avert disengagements.[203] Reliability growth models applied to software updates demonstrate potential for exponential safety improvements, as iterative testing reduces defect density, though empirical validation remains limited by the low volume of unsupervised miles driven to date.[198]
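The reliability targets and disengagement figures quoted above can be related through a back-of-the-envelope calculation; the constant-failure-rate (exponential) model below is an illustrative assumption rather than a certified safety analysis.

```python
import math

# Aviation-style target cited above: fewer than 1e-9 safety-critical failures per hour.
failure_rate_per_hour = 1e-9
trip_hours = 2.0
# With a constant failure rate, P(no failure in t hours) = exp(-rate * t).
p_failure_on_trip = 1.0 - math.exp(-failure_rate_per_hour * trip_hours)
print(f"P(safety-critical failure on a {trip_hours:.0f} h trip) ≈ {p_failure_on_trip:.1e}")

# Disengagement-style metric: miles per human intervention -> interventions per million miles.
miles_per_intervention = 13_000  # approximate figure reported for Waymo testing
print(f"≈ {1_000_000 / miles_per_intervention:.0f} interventions per million miles")
```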
Societal Impacts and Ethical Considerations
Labor Market Disruptions and Adaptation
The advent of vehicular automation poses significant risks to employment in transportation sectors, particularly affecting professional drivers. In the United States, the trucking industry employed 3.58 million professional drivers in 2024, representing a substantial portion of the workforce vulnerable to displacement as autonomous trucks gain traction.[204] Similarly, ride-hailing and taxi services, along with delivery roles, face automation pressures, with estimates suggesting up to 5 million jobs nationwide could be lost, including nearly all 3.5 million truck driving positions.[205] Projections indicate a rapid transition could eliminate over four million driving-related jobs, encompassing heavy truck, delivery, and related occupations, due to the scalability of autonomous systems in freight and passenger transport.[206] Empirical assessments highlight trucking as particularly susceptible, with studies modeling labor displacement under varying adoption scenarios and noting that current driver shortages (around 80,000 in 2025) may paradoxically accelerate automation incentives for carriers seeking efficiency.[207] While full-scale deployment remains limited as of 2025, pilot programs in controlled environments underscore the causal pathway from technological maturity to job reduction in routine, long-haul operations.[208]
Adaptation may mitigate some impacts through job creation in ancillary fields, with autonomous vehicle technologies projected to generate over 110,000 U.S. jobs by supporting roles in maintenance, software oversight, and fleet management.[209] Broader economic modeling forecasts up to 2.4 million net new jobs and a $214 billion GDP boost from enhanced productivity, though these gains hinge on skill mismatches being addressed.[171] Net employment effects appear modest long-term, with modeled unemployment rises of only 0.06 to 0.13 percent between 2045 and 2055, reflecting historical patterns where automation displaces specific roles but spurs demand in others.[210] Nonetheless, transitional disruptions could exacerbate inequality, as driving jobs often provide above-average wages for non-college-educated workers compared to alternative manual occupations.[211]
Retraining initiatives focus on upskilling drivers for hybrid roles, such as monitoring semi-autonomous systems or transitioning to AV operations, with programs like certificates in autonomous vehicle specialist training emphasizing sensor integration and safety protocols.[212][213] Surveys of truck drivers reveal perceived retraining feasibility, yet empirical analyses of alternative careers (e.g., logistics coordination or light assembly) indicate wage penalties and limited transferability for older workers.[214][215] Policy responses, including targeted vocational programs, remain nascent, underscoring the need for causal interventions to bridge skill gaps amid uneven adoption timelines.[216]
Liability Frameworks and Moral Algorithms
As autonomous vehicle (AV) technology advances toward higher levels of automation (SAE Levels 4 and 5), liability frameworks are evolving from driver negligence-based systems to manufacturer-centered product liability models. In conventional vehicles, responsibility typically rests with the human driver under negligence principles, where fault is assessed based on reasonable care. For fully autonomous systems, however, defects in software, sensors, or algorithms shift accountability to original equipment manufacturers (OEMs) or suppliers, treating AVs as complex products akin to defective machinery. This transition, including for SAE Level 3 conditional automation where drivers must remain ready to intervene, influences insurance models by potentially reducing premiums through fewer human-error-related accidents, shifting liability from drivers to manufacturers for system failures, and evolving coverage to address risks within the operational design domain.[217][218] It also aims to incentivize robust design and testing, as evidenced by U.S. product liability precedents applying strict standards to foreseeable risks in automated systems.[219][220]
Legal developments vary by jurisdiction, creating a patchwork of regulations. In the United States, no comprehensive federal framework exists as of 2025, with states like California and Arizona permitting AV testing under specific reporting requirements, while liability often defaults to existing tort law unless overridden by legislation. The United Kingdom's Automated Vehicles Act, which received royal assent in May 2024, explicitly assigns liability to the vehicle's insurer or authorized self-driving entity during autonomous operation, shielding users from civil claims if the system was functioning as intended. In the European Union, directives emphasize type approval and cybersecurity, with product liability directives (updated via the 2022 proposal) extending to AI-driven harms, though enforcement remains fragmented across member states. These frameworks prioritize causation tracing, but opaque machine learning "black boxes" complicate proving defects, prompting calls for enhanced explainability standards.[221][222][223]
Debates center on whether to adopt strict liability (holding manufacturers accountable regardless of fault) or retain negligence-based tests calibrated to AV capabilities versus human benchmarks. Proponents of strict liability argue it accelerates innovation by internalizing rare but high-impact risks, given AVs' projected safety superiority (e.g., Waymo's 2023 data showing 85% fewer injury-causing crashes per million miles compared to human drivers). Critics, including some legal scholars, warn it could stifle deployment by imposing undue burdens, advocating hybrid models that compare AV performance to a "reasonable algorithm" standard rather than human drivers. Empirical studies indicate consumers intuitively assign higher blame to AVs even in no-fault scenarios, potentially influencing jury outcomes and insurance models.[224][220][225]
Moral algorithms address ethical dilemmas in AV decision-making, particularly in unavoidable collisions reminiscent of the trolley problem, where systems must prioritize outcomes like minimizing fatalities.
Unlike human drivers, AVs employ rule-based or machine learning systems programmed to adhere to traffic laws, predict trajectories via sensors (e.g., LiDAR, radar), and optimize for harm reduction, such as braking or swerving to protect occupants and vulnerable road users. Real-world implementations, as in Tesla's Full Self-Driving or Waymo's systems, prioritize de-escalation through redundancy and probabilistic modeling, rendering explicit "moral choices" rare; a 2023 North Carolina State University study found that realistic traffic scenarios emphasize avoidance over binary trade-offs, critiquing trolley hypotheticals as unrepresentative of AV engineering.[226][227]
The MIT Moral Machine experiment, launched in 2016 and analyzing over 40 million decisions from 233 countries and territories by 2018, revealed cultural variances in preferences: participants globally favored saving more lives (utilitarian bias) and pedestrians over passengers, but Western respondents prioritized youth and pets less than Eastern ones, highlighting risks of ethnocentric algorithms if trained on biased data. No universal ethical framework has emerged; utilitarian approaches (maximizing net welfare) conflict with deontological rules (e.g., never harm innocents), and legal guidance such as the 2017 report of Germany's Ethics Commission on Automated and Connected Driving mandates equal value for all lives without demographic discrimination. Peer-reviewed analyses underscore that embedding such preferences could expose manufacturers to liability if decisions deviate from statutes, as courts may deem non-law-compliant algorithms defective.[228][229][230]
Integration of moral considerations into liability remains nascent, with algorithms tested via simulations but real deployments relying on verifiable safety metrics over philosophical ethics. For instance, the EU's 2024 AI Act classifies high-risk AV systems under transparency mandates, requiring documentation of decision processes to apportion fault. Ongoing research, including a 2025 Nature study on dilemma resolution, proposes hybrid models blending data-driven learning with causal oversight to ensure decisions align with empirical risk minimization rather than subjective morals, mitigating biases from training datasets. Ultimately, frameworks must balance innovation with accountability, as unresolved ethical tensions could prolong regulatory delays despite AVs' demonstrated empirical safety gains over human error, which causes 94% of U.S. crashes per NHTSA data.[231][232]
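To make the contrast between utilitarian scoring and rule-based constraints concrete, the following toy sketch scores hypothetical maneuvers by expected harm while treating traffic-law compliance as a hard constraint. The numbers and maneuver names are invented, and the sketch does not represent any manufacturer's decision logic.

```python
from dataclasses import dataclass

@dataclass
class Maneuver:
    name: str
    law_compliant: bool              # hard, rule-based (deontological-style) constraint
    p_collision: float               # estimated probability of a collision
    expected_injury_severity: float  # illustrative scale: 0 (none) to 1 (fatal)

def expected_harm(m: Maneuver) -> float:
    """Utilitarian-style score: collision probability times expected severity."""
    return m.p_collision * m.expected_injury_severity

def choose(maneuvers: list[Maneuver]) -> Maneuver:
    """Prefer law-compliant maneuvers; among those, minimize expected harm."""
    legal = [m for m in maneuvers if m.law_compliant] or maneuvers
    return min(legal, key=expected_harm)

options = [
    Maneuver("brake in lane", law_compliant=True, p_collision=0.10, expected_injury_severity=0.2),
    Maneuver("swerve onto shoulder", law_compliant=True, p_collision=0.06, expected_injury_severity=0.4),
    Maneuver("cross double yellow line", law_compliant=False, p_collision=0.01, expected_injury_severity=0.3),
]
print(choose(options).name)  # "brake in lane": lowest expected harm among legal options
```

Treating legality as a hard constraint in this sketch mirrors the point above that courts may deem non-law-compliant algorithms defective, even when an illegal maneuver scores lower on raw expected harm.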
Equity, Access, and Urban Planning Effects
Autonomous vehicles (AVs) hold potential to enhance transportation equity by expanding mobility options for low-income and mobility-impaired individuals through shared ride-hailing services, which simulations indicate could increase access to jobs and amenities compared to privately owned human-driven vehicles, particularly in underserved urban areas.[233] However, early deployments and modeling also reveal risks of exacerbating socioeconomic disparities, as initial AV services have prioritized affluent neighborhoods, potentially creating a "mobility divide" where low-income communities face reduced public transit funding and increased congestion without proportional benefits.[234][235] Policy analyses emphasize that without targeted regulations, such as subsidies for low-income access or mandates for equitable service coverage, AV adoption could widen gaps, with wealthier users capturing efficiency gains while others encounter higher costs or exclusion due to data privacy concerns and algorithmic biases in routing.[236][237]
Access improvements from AVs are most pronounced for vulnerable populations, including the elderly and disabled, where pilot programs demonstrate reduced reliance on caregivers and expanded reach to medical and recreational facilities; for instance, AV shuttles have enabled independent travel in controlled environments, addressing barriers faced by the 13% of U.S. adults with mobility disabilities who report transportation as a primary obstacle.[238][239] Yet, recent design studies highlight persistent accessibility shortcomings, such as inadequate interior adaptations for wheelchair users or sensory impairments, with surveys showing divergent needs between disabled and non-disabled users that current prototypes often overlook, potentially limiting broad adoption without iterative engineering focused on universal design.[240] Empirical data from limited real-world tests, like those in rural or aging communities, suggest AVs could mitigate isolation by operating 24/7 without human drivers, but trust remains low (fewer than one-third of Americans believe AVs outperform human drivers in safety for such groups), necessitating education on social benefits to boost acceptance.[241][242]
In urban planning, AVs are projected to transform land use by slashing parking demands, potentially reclaiming up to 20-30% of city space currently devoted to vehicle storage, and enabling repurposed areas for housing, parks, or commerce, as evidenced by simulations showing reduced vehicle ownership rates from 90% to as low as 10% in shared fleets.[243][244] This shift could foster denser, walkable cities with streets reallocated for pedestrians and cycling, lowering congestion by 9-20% in modeled high-density scenarios through optimized routing and platooning.[245][246] Counterarguments from longitudinal studies warn of induced sprawl, where cheaper, efficient AV travel extends commutes to suburbs, increasing vehicle miles traveled by 10-60% and straining peripheral infrastructure unless countered by zoning reforms prioritizing mixed-use development.[247] Overall, these effects hinge on integrated planning, with Brookings analyses indicating AVs could enhance urban sustainability if paired with policies curbing empty-mile trips and promoting multimodal integration over car-centric expansion.[246]
Regulatory Environments
National and International Standards
The United Nations Economic Commission for Europe (UNECE) World Forum for Harmonization of Vehicle Regulations (WP.29) serves as the primary international body developing harmonized standards for automated vehicles, with its Working Party on Automated/Autonomous and Connected Vehicles (GRVA) drafting regulations on aspects such as cybersecurity (UN Regulation No. 155, effective from 2022) and over-the-air software updates (UN Regulation No. 156, adopted in 2021).[248][249][250] WP.29's Intelligent Transport Systems (ITS) program, initiated in 2015, facilitates global alignment on automated driving requirements, including performance validation and data recording for incident analysis.[251]
Influential classification frameworks include SAE International's J3016 standard, which defines six levels of driving automation from Level 0 (no automation) to Level 5 (full automation without human intervention), providing a taxonomy adopted by regulators worldwide for specifying system capabilities and driver responsibilities.[7][252] Complementing this, ISO 26262 outlines functional safety requirements for electrical/electronic systems in road vehicles, mandating hazard analysis, risk assessment, and mitigation to address malfunctions in automated components, with applicability extended to higher automation levels via related standards like ISO 21448 for safety of the intended functionality.[253][254][255]
Nationally, the United States' National Highway Traffic Safety Administration (NHTSA) maintains Federal Motor Vehicle Safety Standards (FMVSS) adaptable to automation, issuing exemptions under 49 CFR Part 555 for non-compliant AV prototypes and, in April 2025, launching an AV Framework to modernize rules, streamline approvals, and impose reporting on crashes involving Level 2+ systems via its Standing General Order.[256][257] In the European Union, Regulation (EU) 2022/1426 enables type-approval of Automated Driving Systems (ADS) for fully automated vehicles in limited series, building on the 2019 Vehicle General Safety Regulation to set performance criteria for systems up to Level 4, with harmonized requirements under Framework Regulation (EU) 2018/858.[258][259] China's Ministry of Industry and Information Technology (MIIT) advanced standards in 2025 by issuing mandatory safety requirements for intelligent vehicles and ethical guidelines prohibiting deceptive data in AV development, alongside national taxonomy GB/T 40429-2021 aligning with SAE levels and Beijing's local rules effective April 2025 for testing and deployment of Level 3+ systems.[260][261][262] These standards reflect a trend toward convergence on safety validation but diverge in emphasis, with WP.29 promoting global reciprocity while national bodies prioritize domestic testing protocols and liability thresholds.
Evolving Policies and Deployment Hurdles
Policies on vehicular automation have evolved from restrictive testing frameworks to more permissive deployment models, driven by technological advancements and economic incentives, though regulatory fragmentation persists across jurisdictions. Issuing licenses for SAE Level 3 systems enables conditional automation within defined operational design domains, reducing driver workload by allowing hands-off and eyes-off operation during system engagement while requiring driver readiness to intervene, thereby accelerating real-world safety data collection and facilitating gradual innovation toward higher autonomy levels.[7][263]
In the United States, the National Highway Traffic Safety Administration (NHTSA) has streamlined Federal Motor Vehicle Safety Standards (FMVSS) exemptions, allowing manufacturers to produce up to 2,500 non-compliant vehicles annually for automated driving systems (ADS) as of September 2025, a policy extended to facilitate commercial scaling of designs that omit conventional features such as steering wheels or mirrors.[256] This builds on prior exemptions, with NHTSA granting its first such exemption for American-built vehicles in August 2025 and adding rulemakings to clarify ADS standards, reflecting a federal push under Transportation Secretary Sean P. Duffy to modernize outdated rules originally designed for human-driven cars.[264][265] State-level policies, such as California's Department of Motor Vehicles (DMV) program, exemplify layered regulation: as of September 12, 2025, 28 entities hold testing permits requiring a human driver, with over 4 million autonomous miles logged from December 2023 to November 2024, prompting proposed updates for driverless operations and heavy-duty vehicles.[266][267] These evolutions address dual federal-state oversight, where states handle registration and operation while NHTSA sets safety baselines, but inconsistencies (such as varying disengagement reporting) complicate national deployment.[268]
Internationally, China's policies emphasize rapid commercialization, targeting 50% of new vehicle sales with automation features by 2025 through infrastructure investments like 5G networks and subsidies, enabling leaders like Baidu to expand robotaxis ahead of Western competitors.[269] In the European Union, frameworks encourage testing via harmonized type-approval under UNECE regulations, yet national variations and a focus on risk assessment slow full deployment compared to China's state-backed approach.[223][270]
Deployment hurdles stem primarily from regulatory uncertainty and liability gaps, where absent unified standards force companies to navigate patchwork rules, delaying scalability; for instance, post-2023 Cruise incidents in California led to temporary permit suspensions, heightening scrutiny on safety data reporting.[163] Cybersecurity mandates and data privacy concerns add layers, as vehicles generate vast telemetry requiring secure handling without clear federal precedents, while insurance frameworks lag in assigning fault to algorithms over drivers.[271] Uneven global harmonization exacerbates cross-border challenges, with calls for international alignment to avoid innovation silos, though precautionary stances in some regions prioritize edge-case risks over empirical safety gains from millions of test miles.[223][272] Public trust erosion from high-profile failures further impedes policy flexibility, necessitating evidence-based benchmarks comparing AV disengagement rates against the human error that contributes to more than 90% of crashes.[163]
Controversies and Debates
High-Profile Failures and Risk Assessments
On March 18, 2018, an Uber autonomous test vehicle struck and killed pedestrian Elaine Herzberg in Tempe, Arizona, marking the first known pedestrian fatality involving a self-driving car. The National Transportation Safety Board (NTSB) investigation found that the vehicle's sensors detected Herzberg six seconds before impact but failed to classify her as a pedestrian or initiate braking, while the human safety operator was distracted by streaming video on a phone. Uber suspended its self-driving program nationwide following the incident.[273]
Tesla's Autopilot system, a partial automation feature requiring driver supervision, has been linked to multiple fatal crashes investigated by the National Highway Traffic Safety Administration (NHTSA). As of April 2024, NHTSA documented 211 crashes involving Autopilot, including at least 13 fatal crashes that resulted in 14 deaths, often due to drivers misusing the system by failing to keep hands on the wheel or eyes on the road. Patterns included collisions with stationary emergency vehicles and motorcycles, where the system did not adequately detect or respond. Tesla reports lower crash rates per mile with Autopilot engaged compared to without (one crash per 7.63 million miles versus one per 1.71 million miles in Q3 2024), but NHTSA probes continue due to recurring misuse and system limitations in low-visibility or complex scenarios.[274][158]
In October 2023, a Cruise robotaxi in San Francisco struck a pedestrian who had been thrown into its path by a hit-and-run driver, then dragged her approximately 20 feet while attempting to pull over, causing serious injuries. The vehicle failed to detect the pedestrian underneath it post-impact, leading NHTSA to allege Cruise submitted a misleading injury report to downplay severity and influence the investigation; Cruise agreed to a $1.5 million penalty and admitted fault in federal proceedings. The incident prompted California to suspend Cruise's driverless permits, halting operations nationwide and highlighting sensor detection flaws in dynamic collision aftermaths.[275]
Risk assessments of vehicular automation reveal mixed outcomes relative to human drivers, with autonomous vehicles (AVs) demonstrating lower involvement in certain crash types but challenges in others due to immature technology and limited real-world mileage. A 2024 matched case-control study analyzing over 35,000 human-driven versus AV accidents found AVs had 54% lower odds of rear-end crashes and 83% lower odds of broadside impacts, attributing advantages to consistent sensor-based reactions absent human errors like distraction, which NHTSA estimates cause 94% of overall traffic fatalities. However, AVs showed elevated risks in lane-change and turning maneuvers, often from algorithmic hesitancy or occlusion handling. Waymo's self-reported data through 25 million autonomous miles indicated 92% fewer bodily injury claims and 88% fewer property damage claims than comparable human benchmarks, per a Swiss Re analysis, though company data warrants independent verification.[80][276] Aggregate AV incident data from 2019-2024 logs 3,979 crashes with 496 injuries or fatalities, but per-mile rates remain inconclusive given AVs' fractional exposure to the 3.2 trillion annual U.S. miles driven by humans, where fatality rates hover at 1.33 per 100 million miles.
Critics note that high-profile failures amplify perceived risks, yet empirical trends suggest AVs could reduce systemic errors if scaled, though edge-case vulnerabilities persist without billions more test miles.[163]
Overstated Fears vs. Human Driver Benchmarks
Public apprehension toward vehicular automation often amplifies rare incidents involving autonomous systems while underemphasizing the pervasive risks posed by human drivers, who are responsible for approximately 94% of road crashes according to analyses of U.S. National Highway Traffic Safety Administration (NHTSA) data. A January 2026 Financial Times opinion piece titled "Europe doesn't need driverless cars" argued that Europe lacks a pressing need for autonomous vehicles, as human drivers already suffice for safe roadways.[277] This perspective drew criticism on platforms like X, with transportation commentator Sawyer Merritt countering that an average of about 60 people die daily in Europe from vehicle-related accidents, an argument for AV adoption broadly consistent with European Commission data reporting 19,940 road fatalities in the EU for 2024 (approximately 55 per day).[278]
In 2023, the U.S. recorded a traffic fatality rate of 1.26 deaths per 100 million vehicle miles traveled (VMT), with preliminary 2024 estimates at 1.20, reflecting persistent human-error-driven vulnerabilities such as distraction, impairment, and poor judgment.[279] These benchmarks underscore that human-operated vehicles experience crash rates around 4.1 per million miles, a figure dwarfed by the performance of leading autonomous systems in empirical testing.[280]
Autonomous vehicles, by contrast, demonstrate substantially lower involvement in accidents when benchmarked against human drivers. Waymo's fleet, operating over 56.7 million rider-only miles as of mid-2025, reported an 85% reduction in overall crash rates (equating to 6.8 times fewer incidents) compared to human benchmarks in similar urban environments.[281] Independent analyses, including a 2024 peer-reviewed study using California DMV data, found autonomous driving systems (ADS) had lower odds of accidents in most scenarios, with 82% of ADS-involved crashes resulting in minor injuries versus 67% for human-driven vehicles striking ADS-equipped ones.[282] A Swiss Re Institute evaluation of Waymo operations further quantified an 88% drop in property damage claims and 92% in bodily injury claims relative to human drivers, attributing gains to the elimination of fatigue, distraction, and aggression.[283]
This disparity highlights overstated fears, as public and regulatory scrutiny fixates on autonomous anomalies (such as the 464 Waymo incidents reported to NHTSA through August 2025, many initiated by human drivers) while normalizing the annual toll of over 40,000 U.S. fatalities from human error.[284] Psychological studies reveal a bias wherein autonomous vehicles attract disproportionate blame for equivalent faults; in hypothetical scenarios, participants assigned blame to the non-at-fault AV 43% of the time versus 14% for a non-at-fault human-driven vehicle.[285] Such perceptions persist despite evidence that AVs reduce severe crashes by up to 96% in intersections, a common human failure point per NHTSA.[286] Projections suggest widespread AV adoption could avert 34,000 U.S. road deaths annually by surpassing human safety thresholds.[287]
| Metric | Human Drivers | Waymo AV (Rider-Only Miles) |
|---|---|---|
| Crash Rate Reduction | Baseline | 85% lower (6.8x fewer crashes)[156] |
| Property Damage Claims | Baseline | 88% reduction[283] |
| Bodily Injury Claims | Baseline | 92% reduction[288] |
| Injury-Involving Intersection Crashes | Leading cause (NHTSA) | 96% fewer[286] |
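The human-driver benchmarks in this section can be cross-checked with simple arithmetic; the sketch below reproduces the implied annual U.S. fatality total and converts a percentage reduction into the "times fewer" form used above.

```python
# Back-of-the-envelope checks on the human-driver benchmarks quoted above.
fatalities_per_100m_vmt = 1.33
annual_vmt_miles = 3.2e12  # roughly 3.2 trillion miles driven per year in the U.S.
annual_deaths = fatalities_per_100m_vmt * annual_vmt_miles / 100e6
print(f"implied U.S. road deaths per year: ~{annual_deaths:,.0f}")  # about 42,560, consistent with "over 40,000"

# Converting a percentage reduction into an "x times fewer" multiple.
reduction = 0.85  # an 85% lower crash rate
print(f"≈ {1 / (1 - reduction):.1f} times fewer crashes")  # ~6.7x, in line with the ~6.8x figure above
```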