

Eye Tracking Methodology

Theory and Practice

Edition: 3rd
ISBN: 9783319578811
Publication Date: June 2017
Formats: Paperback, Ebook
Publisher: Springer

This book focuses on video-based, corneal-reflection eye trackers, the most widely available and affordable type of system, and examines a number of interesting and challenging applications in human factors, collaborative systems, virtual reality, marketing, and advertising.

The third edition has been extensively revised and extended, and includes new chapters on calibration accuracy, precision, and correction; advanced eye movement analysis; binocular eye movement analysis; practical gaze analytics; design; and GIS.

Opening with useful background information, including an introduction to the human visual system and key issues in visual perception and eye movement, the author then surveys eye-tracking devices and provides a detailed introduction to the technical requirements necessary for installing a system and developing an application program.

Dr. Duchowski is a professor of Computer Science at Clemson University. His research and teaching interests include visual attention and perception, eye tracking, computer vision, and computer graphics. He has produced a corpus of publications related to eye tracking research and has delivered courses and seminars on the subject at international conferences. He maintains Clemson's eye tracking laboratory, and teaches a regular course on eye tracking methodology, attracting students from a number of disciplines across campus.

Part I Introduction to the Human Visual System (HVS)
1 Visual Attention
1.1 Visual Attention: A Historical Review
1.1.1 Von Helmholtz’s “Where”
1.1.2 James’ “What”
1.1.3 Gibson’s “How”
1.1.4 Broadbent’s “Selective Filter”
1.1.5 Deutsch and Deutsch’s “Importance Weightings”
1.1.6 Yarbus and Noton and Stark’s “Scanpaths”
1.1.7 Posner’s “Spotlight”
1.1.8 Treisman’s “Glue”
1.1.9 Kosslyn’s “Window”
1.2 Visual Attention and Eye Movements
1.3 Summary and Further Reading
2 Neurological Substrate of the HVS
2.1 The Eye
2.2 The Retina
2.2.1 The Outer Layer
2.2.2 The Inner Nuclear Layer
2.2.3 The Ganglion Layer
2.3 The Optic Tract and M/P Visual Channels
2.4 The Occipital Cortex and Beyond
2.4.1 Motion-Sensitive Single-Cell Physiology
2.5 Summary and Further Reading
3 Visual Psychophysics
3.1 Spatial Vision
3.2 Temporal Vision
3.2.1 Perception of Motion in the Visual Periphery
3.2.2 Sensitivity to Direction of Motion in the Visual Periphery
3.3 Color Vision
3.4 Implications for Attentional Design of Visual Displays
3.5 Summary and Further Reading
4 Taxonomy and Models of Eye Movements
4.1 The Extraocular Muscles and the Oculomotor Plant
4.2 Saccades
4.3 Smooth Pursuits
4.4 Fixations (Microsaccades, Drift, and Tremor)
4.5 Nystagmus
4.6 Implications for Eye Movement Analysis
4.7 Summary and Further Reading

Part II Eye Tracking Systems
5 Eye Tracking Techniques
5.1 Electro-OculoGraphy (EOG)
5.2 Scleral Contact Lens/Search Coil
5.3 Photo-OculoGraphy (POG) or Video-OculoGraphy (VOG)
5.4 Video-Based Combined Pupil/Corneal Reflection
5.5 Classifying Eye Trackers in “Mocap” Terminology
5.6 Summary and Further Reading
6 Head-Mounted System Hardware Installation
6.1 Integration Issues and Requirements
6.2 System Installation
6.3 Lessons Learned from the Installation at Clemson
6.4 Summary and Further Reading
7 Head-Mounted System Software Development
7.1 Mapping Eye Tracker Screen Coordinates
7.1.1 Mapping Screen Coordinates to the 3D Viewing Frustum
7.1.2 Mapping Screen Coordinates to the 2D Image
7.1.3 Measuring Eye Tracker Screen Coordinate Extents
7.2 Mapping Flock Of Birds Tracker Coordinates
7.2.1 Obtaining the Transformed View Vector
7.2.2 Obtaining the Transformed Up Vector
7.2.3 Transforming an Arbitrary Vector
7.3 3D Gaze Point Calculation
7.3.1 Parametric Ray Representation of Gaze Direction
7.4 Virtual Gaze Intersection Point Coordinates
7.4.1 Ray/Plane Intersection
7.4.2 Point-In-Polygon Problem
7.5 Data Representation and Storage
7.6 Summary and Further Reading
8 Head-Mounted System Calibration
8.1 Software Implementation
8.2 Ancillary Calibration Procedures
8.2.1 Internal 2D Calibration
8.2.2 Internal 3D Calibration
8.3 Summary and Further Reading
9 Table-Mounted System Hardware Installation
9.1 Integration Issues and Requirements
9.2 System Installation
9.3 Lessons Learned from the Installation at Clemson
9.4 Summary and Further Reading
10 Table-Mounted System Software Development
10.1 Linux Tobii Client Application Program Interface
10.1.1 Tet Init
10.1.2 Tet Connect, Tet Disconnect
10.1.3 Tet Start, Tet Stop
10.1.4 Tet CalibClear, Tet CalibLoadFromFile, Tet CalibSaveToFile, Tet CalibAddPoint, Tet CalibRemovePoints, Tet CalibGetResult, Tet CalibCalculateAndSet
10.1.5 Tet SynchronizeTime, Tet PerformSystemCheck
10.1.6 Tet GetSerialNumber, Tet GetLastError, Tet GetLastErrorAsText
10.1.7 Tet CallbackFunction
10.2 A Simple OpenGL/GLUT GUI Example
10.3 Caveats
10.4 Summary and Further Reading
11 Table-Mounted System Calibration
11.1 Software Implementation
11.2 Summary and Further Reading
12 Using an Open Source Application Program Interface
12.1 API Implementation and XML Format
12.2 Client/Server Communication
12.3 Server Configuration
12.4 API Extensions
12.5 Interactive Client Example using Python
12.5.1 Using Gazepoint’s Built-in Calibration
12.5.2 Using Gazepoint’s Custom Calibration Capabilities
12.6 Summary and Further Reading
13 Eye Movement Analysis
13.1 Signal Denoising
13.2 Dwell-Time Fixation Detection
13.3 Velocity-Based Saccade Detection
13.4 Eye Movement Analysis in Three Dimensions
13.4.1 Parameter Estimation
13.4.2 Fixation Grouping
13.4.3 Eye Movement Data Mirroring
13.5 Summary and Further Reading
14 Advanced Eye Movement Analysis
14.1 Signal Denoising
14.2 Velocity-Based Saccade Detection
14.3 Microsaccade Detection
14.4 Validation: Computing Accuracy, Precision, and Refitting
14.5 Binocular Eye Movement Analysis: Vergence
14.6 Ambient/Focal Eye Movement Analysis
14.7 Transition Entropy Analysis
14.8 Spatial Distribution Analysis
14.9 Summary and Further Reading
15 The Gaze Analytics Pipeline
15.1 Gaze Analytics in Five Easy Steps
15.1.1 Step 0: Data Collection
15.1.2 Step 1 (dirs): Directory Creation
15.1.3 Step 2 (raw): Extract Raw Gaze Data
15.1.4 Step 3 (graph or process): Graph or Process Raw Data
15.1.5 Step 4 (collate): Collate Data Prior to Statistical Analysis
15.1.6 Step 5 (stats): Perform Statistical Analyses
15.2 Gaze Analytics: A Worked Example
15.2.1 Scanpath Visualization
15.2.2 Traditional Eye Movement Metrics
15.2.3 Advanced Eye Movement Analysis
15.3 Summary and Further Reading
16 Eye Movement Synthesis
16.1 Procedural Simulation of Eye Movements
16.1.1 Modeling Saccades
16.1.2 Modeling Fixations
16.2 Adding Synthetic Eye Tracking Noise
16.3 Summary and Further Reading

Part III Eye Tracking Methodology
17 Experimental Design
17.1 Formulating a Hypothesis
17.2 Forms of Inquiry
17.2.1 Experiments Versus Observational Studies
17.2.2 Laboratory Versus Field Research
17.2.3 Idiographic Versus Nomothetic Research
17.2.4 Sample Population Versus Single-Case Experiment Versus Case Study
17.2.5 Within-Subjects Versus Between-Subjects
17.2.6 Example Designs
17.3 Measurement and Analysis
17.4 Summary and Further Reading
18 Suggested Empirical Guidelines
18.1 Evaluation Plan
18.1.1 Data Collection
18.1.2 System Identification
18.1.3 Constraints
18.1.4 User Selection
18.1.5 Evaluation Locale
18.1.6 Task Selection
18.2 Practical Advice
18.3 Considering Dynamic Stimulus
18.4 Summary and Further Reading
19 Case Studies
19.1 Head-Mounted VR Diagnostics: Visual Inspection
19.1.1 Case Study Notes
19.2 Head-Mounted VR Diagnostics: 3D Maze Navigation
19.2.1 Case Study Notes
19.3 Desktop VR Diagnostics: Driving Simulator
19.3.1 Case Study Notes
19.4 Desktop Diagnostics: Usability
19.4.1 Case Study Notes
19.5 Desktop Interaction: Gaze-Contingent Fisheye Lens
19.5.1 Case Study Notes
19.6 Summary and Further Reading

Part IV Eye Tracking Applications
20 Diversity and Types of Eye Tracking Applications
20.1 Summary and Further Reading
21 Neuroscience and Psychology
21.1 Neurophysiological Investigation of Illusory Contours
21.2 Attentional Neuroscience
21.3 Eye Movements and Brain Imaging
21.4 Reading
21.5 Scene Perception
21.5.1 Perception of Art
21.5.2 Perception of Film
21.6 Visual Search
21.6.1 Computational Models of Visual Search
21.7 Natural Tasks
21.8 Eye Movements in Other Information Processing Tasks
21.9 Summary and Further Reading
22 Industrial Engineering and Human Factors
22.1 Aviation
22.2 Driving
22.3 Visual Inspection
22.4 Summary and Further Reading
23 Marketing/Advertising
23.1 Copy Testing
23.2 Print Advertising
23.3 Ad Placement
23.4 Television Enhancements
23.5 Web Pages
23.6 Product Label Design
23.7 Summary and Further Reading
24 Computer Science
24.1 Human–Computer Interaction and Collaborative Systems
24.1.1 Classic Eye-Based Interaction
24.1.2 Cognitive Modeling
24.1.3 Universal Accessibility
24.1.4 Indirect Eye-Based Interaction
24.1.5 Attentive User Interfaces (AUIs)
24.1.6 Usability
24.1.7 Collaborative Systems
24.2 Gaze-Contingent Displays
24.2.1 Screen-Based Displays
24.2.2 Model-Based Graphical Displays
24.3 Summary and Further Reading
25 Conclusion

References
Index

