Abstract
With high-definition, high-resolution imaging technology becoming ever more popular, the vast amount of input available to modern object recognition systems can become overwhelming. In an image taken from a high-resolution digital camera, a target object may be very small relative to the entire image. Additionally, any non-target objects in the input constitute unnecessary data, or clutter. While many modern object recognition systems achieve over 90% accuracy on the recognition task, adding large amounts of clutter to an input quickly degrades both the speed and accuracy of many models. To reduce both the size and the amount of clutter in an input, a biologically inspired focus of attention model is developed. Using biologically inspired feature extraction techniques, a feature-based saliency model is built and used to simulate the psychological concept of a "mental spotlight". The simulated "mental spotlight" searches through each frame of a video, focusing on small sub-regions of the larger input that are likely to contain important objects requiring further processing. Each of these interesting sub-regions can then be used as input by a modern object recognition system instead of raw camera data, improving both the speed and accuracy of the recognition model.
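The spotlight idea described in the abstract can be illustrated with a minimal sketch: compute a per-pixel saliency map from a simple feature (here, deviation from mean intensity, a crude stand-in for the thesis's biologically inspired feature channels), then slide a fixed-size window over the frame and report the most salient sub-region. The function names, the window size, and the toy saliency measure are all illustrative assumptions, not the model actually developed in the thesis.

```python
import numpy as np

def saliency_map(frame):
    # Toy saliency: absolute deviation from the frame's mean intensity.
    # (Illustrative only; the thesis uses biologically inspired features.)
    return np.abs(frame - frame.mean())

def most_salient_window(frame, size=8):
    # Slide a square "spotlight" over the frame and return the top-left
    # (row, col) of the window with the highest total saliency.
    sal = saliency_map(frame)
    h, w = sal.shape
    best_score, best_rc = -1.0, (0, 0)
    for r in range(h - size + 1):
        for c in range(w - size + 1):
            score = sal[r:r + size, c:c + size].sum()
            if score > best_score:
                best_score, best_rc = score, (r, c)
    return best_rc

# Usage: a dark frame with one bright 8x8 patch; the spotlight lands on it.
frame = np.zeros((32, 32))
frame[10:18, 20:28] = 1.0
print(most_salient_window(frame))  # → (10, 20)
```

The sub-region `frame[r:r+size, c:c+size]` found this way would then be handed to a recognition model in place of the full frame, which is the data-reduction step the abstract motivates.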
Library of Congress Subject Headings
Computer vision; Optical pattern recognition; Attention--Computer simulation; Image processing--Digital techniques
Publication Date
2008
Document Type
Thesis
Department, Program, or Center
Computer Science (GCCIS)
Advisor
Gaborski, Roger
Advisor/Committee Member
Tymann, Paul
Advisor/Committee Member
Borrelli, Thomas
Recommended Citation
Harris, Daniel I., "A biologically inspired focus of attention model" (2008). Thesis. Rochester Institute of Technology. Accessed from
https://repository.rit.edu/theses/261
Campus
RIT – Main Campus
Comments
Note: imported from RIT’s Digital Media Library running on DSpace to RIT Scholar Works. Physical copy available through RIT's Wallace Library at: TA1634 .H37 2008