Abstract

Modern autonomous driving is evolving rapidly. Image processing, computer vision, and remote sensing, among other fields, have played crucial roles in the development of safe and reliable robotic automobiles. However, some autonomous vehicles still struggle with aspects of the driving task even though current mainstream approaches appear effective at routine driving duties. Several high-profile incidents in the early years of testing have prompted questions about whether all relevant domains have been explored in pursuit of human-like driving. Current machine-learning-based sensing approaches require training miles logged during actual driving, yet on-road testing puts public safety at risk. Furthermore, sensor data alone is not as effective as a vigilant human driver, whose intelligent gaze behavior supports safe driving over long distances. Human-inspired algorithms that decide when and where to sample the visual world may allow autonomous cars to process data more efficiently. I therefore believe that a driving simulation platform strikes a good middle ground between the need to collect data on human drivers and the ethical concerns inherent in collecting that data while people are actually driving. With an eye tracker integrated into the platform, the interplay between human perception and autonomous driving can be studied. The goal of this dissertation is to build a VR driving simulation platform that can be used to investigate a wide range of such questions. The platform is built in the Unity 3D engine for the HTC Vive head-mounted display (HMD) and provides a realistic driving experience. To evaluate the platform's fidelity relative to real-world conditions, I created a simulated driving environment modeled on a tunnel and compared the platform's data, such as driving speed and lateral lane position, with previously published on-road studies. To demonstrate the platform's value as a research tool, I compare the results from this driving simulator with those of real-world road testing. In addition, a Tobii eye tracker was integrated into the HMD to record drivers' eye movements, including their 3D gaze point of regard (POR), during the evaluation. The resulting data set was assessed for correlations with driving behaviors. The contributions of this study are: 1) an affordable, modular, and realistic simulation platform that can be used and extended in multiple research fields, such as perception, psychology, animation, game design, and computer science; 2) a connection established between driving behaviors and the gaze database; and 3) component models that benefit researchers in future work.
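
As a minimal sketch of the kind of analysis the abstract describes, the following Python snippet shows one way to assess correlations between per-trial gaze metrics and driving behaviors such as mean speed and lateral lane position. This is not the dissertation's actual code; the column names, trial structure, and placeholder data are illustrative assumptions standing in for logs exported from the simulator.

```python
# Hypothetical sketch: correlating gaze metrics with driving behaviors across trials.
import numpy as np
import pandas as pd
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
n_trials = 40

# Placeholder per-trial values; in practice these would be computed from the
# HMD driving log and the Tobii eye-tracker stream (3D point of regard).
trials = pd.DataFrame({
    "mean_fixation_duration_s": rng.uniform(0.2, 0.6, n_trials),  # gaze metric
    "mean_speed_kph":           rng.uniform(60, 100, n_trials),   # driving metric
    "lateral_lane_position_m":  rng.normal(0.0, 0.3, n_trials),   # driving metric
})

# Pearson correlation between each gaze metric and each driving behavior.
for gaze_col in ["mean_fixation_duration_s"]:
    for drive_col in ["mean_speed_kph", "lateral_lane_position_m"]:
        r, p = pearsonr(trials[gaze_col], trials[drive_col])
        print(f"{gaze_col} vs {drive_col}: r = {r:+.2f}, p = {p:.3f}")
```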

Library of Congress Subject Headings

Automobile driving simulators--Computer programs; Virtual reality; Eye tracking

Publication Date

5-2023

Document Type

Dissertation

Student Type

Graduate

Degree Name

Imaging Science (Ph.D.)

Department, Program, or Center

Chester F. Carlson Center for Imaging Science (COS)

Advisor

Susan P. Farnand

Advisor/Committee Member

Rebecca Houston

Advisor/Committee Member

Jeff B. Pelz

Campus

RIT – Main Campus

Plan Codes

IMGS-PHD
