Abstract

Eye tracking has proven to be a valuable tool for a number of disciplines, including imaging science, vision science, psychology, and neurology. Researchers in these fields use eye tracking to study the human visual system and to understand how humans attend to their environment during different tasks. Analyzing the gaze data obtained from eye trackers begins with visualizing the gaze point. By monitoring gaze we can answer questions such as ‘where was the person looking?’ and ‘what image region or object was the person looking at?’ Traditional methods for answering such questions required frame-by-frame analysis of video, with gaze annotated manually. In this dissertation, I describe a new automated tool that aids and simplifies the analysis of gaze data from one or more observers performing complex tasks in natural environments. I implemented a number of analysis pipelines that process raw gaze video, build a model of the environment, calculate 3D gaze from 2D gaze video, summarize each observer’s 3D gaze interaction with the environment, and compute the observer’s 3D location in the environment. The final pipeline includes an advanced visualizer that allows one to play and pause multi-observer interaction data like a video.

Library of Congress Subject Headings

Eye tracking--Data processing; Computer vision; Neural networks (Computer science); Information visualization

Publication Date

3-28-2023

Document Type

Dissertation

Student Type

Graduate

Degree Name

Imaging Science (Ph.D.)

Department, Program, or Center

Chester F. Carlson Center for Imaging Science (COS)

Advisor

Jeff Pelz

Advisor/Committee Member

Gabriel Diaz

Advisor/Committee Member

Carl Salvaggio

Campus

RIT – Main Campus

Plan Codes

IMGS-PHD
