Book chapter
Can Drivers Construct Accurate Understanding of Tesla’s Autopilot
Advances in Human Factors of Transportation, pp.159-170
Applied Human Factors and Ergonomics International, v. 186, AHFE
2025
DOI: 10.54941/ahfe1006505
Abstract
More car companies are integrating partially automated driving capabilities into their vehicles. In partially automated driving mode, drivers can hand vehicle control to the automated system on designated highways. The system combines technologies such as adaptive cruise control, lane centering, and driver monitoring to help ensure that drivers stay attentive to the road while supervising the automation. The promise is that partially automated driving will improve road safety. Although the system can take over some driving tasks, it still requires human supervision and intervention, and it is restricted to designated highways. Drivers must actively monitor the system and be prepared to intervene when necessary. The level of trust drivers build in the system affects both safe monitoring and timely takeover of control. Calibrating trust appropriately requires an accurate mental model of the system, which in turn requires understanding how the system responds in various scenarios. Our research analyzed drivers' mental models of Tesla's Autopilot in situations that are likely to be confusing or that have contributed to past crashes involving Teslas. The drivers were unfamiliar with automated driving and had no previous experience with adaptive cruise control.
Method: We conducted an experiment in which 10 individuals drove a Tesla Model S equipped with the partial automation system known as Autopilot for a week of daily commuting. At the end of the week, participants took part in an open-ended interview designed to elicit their mental models. The interviews were recorded and transcribed for thematic analysis.
Results: Although most participants' initial mental models were in line with the system's capabilities, our findings raise concerns about mental models of Autopilot's limitations, particularly in city driving. Two safety-critical themes were identified in the interviews: misunderstandings of Autopilot's limitations and misunderstandings of the system's purpose. Misunderstandings of limitations included all statements and sub-themes indicating participants' confusion about situations in which Tesla's Autopilot cannot be used. Misunderstandings of the system's purpose included statements revealing that participants misunderstood their supervisory monitoring role. Our findings suggest that new drivers should be trained on automated driving systems to increase safety.
Details
- Title
- Can Drivers Construct Accurate Understanding of Tesla’s Autopilot
- Creators
- Hugh P. Salehi - Driving Safety Research Institute, The University of Iowa, Iowa City, IA, United States
- John Gaspar - University of Iowa
- Cher Carney - University of Iowa
- Daniel McGehee - University of Iowa
- Resource Type
- Book chapter
- Publication Details
- Advances in Human Factors of Transportation, pp.159-170
- Series
- Applied Human Factors and Ergonomics International; v. 186
- DOI
- 10.54941/ahfe1006505
- eISSN
- 2771-0718
- ISSN
- 2771-0718
- Publisher
- AHFE
- Grant note
- Toyota Collaborative Safety Research Center (http://data.elsevier.com/vocabulary/SciValFunders/100019690)
- Language
- English
- Date published
- 2025
- Academic Unit
- The National Advanced Driving Simulator; Occupational and Environmental Health; Emergency Medicine; Driving Safety Research Institute; Industrial and Systems Engineering; Center for Social Science Innovation; Injury Prevention Research Center
- Record Identifier
- 9985141987702771