Quantifying the Impacts of Situational Visual Clutter on Driving Performance Using Video Analysis and Eye Tracking
The impact of visual clutter on driving performance has been widely acknowledged. Visual clutter has been taxonomically categorized into three types: 1) "situational clutter," which arises from the interaction among the driver, the vehicle, other road users, and the road infrastructure; 2) "designed clutter," which arises from existing traffic control devices (e.g., signage, signals, and work zones); and 3) "built clutter," which arises from other roadside and roadway objects (e.g., billboards and roadside landscapes). The impacts of both designed clutter and built clutter have been investigated using naturalistic driving measures (e.g., driving clips and vehicle status) and driving simulator measures (e.g., driving clips, vehicle status, and eye tracking measures). Situational clutter, however, remains an open question, even though it is considered to play a more lasting and profound role in shaping driver performance. The challenges in investigating situational clutter stem from its complicated constitution of different contributors (e.g., the vehicle, other road users, and the road infrastructure) and its dynamically changing nature (e.g., dashboard displays, traffic conditions, the appearance of surrounding vehicles, and dynamic road and roadside landscapes). Although the psychology and cognitive science communities have investigated situational visual clutter, little effort has been devoted to studying it in the driving context. The proposed study aims to address this gap.
The objective of this proposed study is threefold: 1) to develop a new situational visual clutter model that objectively quantifies the complex and dynamic driving scene based on eye tracking and video analysis; 2) to employ the developed model to quantify the impact of situational visual clutter on driving performance under an information searching scenario and a driving distraction scenario using driving simulation; and 3) to investigate the potential of employing the driving scene quantification to support other retrospective studies and data mining using existing driving simulation data.
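The record does not specify how the proposed clutter model scores a driving scene. As a minimal illustration of the general idea of quantifying clutter from video frames, the sketch below computes an edge-density score per grayscale frame — a common proxy for visual clutter in the image-analysis literature, not the authors' model. The function name, the threshold choice, and the use of edge density itself are all assumptions for illustration.

```python
import numpy as np

def edge_density_clutter(frame: np.ndarray) -> float:
    """Illustrative clutter proxy: the fraction of pixels whose gradient
    magnitude exceeds 10% of the frame's maximum gradient.

    `frame` is a 2-D grayscale array (e.g., values in [0, 255]); busier
    scenes (more edges) score closer to 1, uniform scenes score 0.
    NOTE: this is a hypothetical sketch, not the model proposed in the study.
    """
    gy, gx = np.gradient(frame.astype(float))
    magnitude = np.hypot(gx, gy)
    peak = magnitude.max()
    if peak == 0.0:
        return 0.0  # perfectly uniform frame: no edges, no clutter
    return float(np.mean(magnitude > 0.1 * peak))

# A uniform frame scores 0.0; a blocky high-contrast pattern scores higher.
flat = np.zeros((64, 64))
blocks = (np.indices((64, 64)) // 8).sum(axis=0) % 2 * 255.0
print(edge_density_clutter(flat))    # 0.0
print(edge_density_clutter(blocks))  # noticeably higher than the flat frame
```

In a full pipeline, a score like this would be computed per video frame and combined with eye-tracking measures (e.g., fixation locations weighting image regions), which is the kind of integration the proposed model targets.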
- Record URL:
Supplemental Notes:
- Received a 6-month no-cost extension due to COVID-19 and an additional 3-month no-cost extension. Additional analysis planned.
Language
- English
Project
- Status: Active
- Funding: $52,500
Contract Numbers:
69A3551747131
Sponsor Organizations:
Office of the Assistant Secretary for Research and Technology
University Transportation Centers Program
Department of Transportation
Washington, DC United States 20590
Managing Organizations:
University of Iowa, Iowa City
National Advanced Driving Simulator, 2401 Oakdale Blvd
Iowa City, IA United States 52242-5003
Performing Organizations:
University of Massachusetts, Amherst
Department of Civil and Environmental Engineering
130 Natural Resources Road
Amherst, MA United States 01003
Principal Investigators:
Ai, Chengbo
Knodler, Michael
- Start Date: 20190901
- Expected Completion Date: 20240630
- Actual Completion Date: 0
- USDOT Program: University Transportation Centers
Subject/Index Terms
- TRT Terms: Data mining; Distraction; Driver performance; Driving simulators; Eye movements; Video
- Subject Areas: Highways; Safety and Human Factors
Filing Info
- Accession Number: 01699004
- Record Type: Research project
- Source Agency: Safety Research Using Simulation University Transportation Center (SaferSim)
- Contract Numbers: 69A3551747131
- Files: UTC, RIP
- Created Date: Mar 20 2019 9:29AM