University of Patras
Modern cockpit environments present pilots with an overwhelming volume of visual information, increasing cognitive workload and the risk of critical cues being overlooked. This study explores an adaptive human-machine interface designed to enhance pilot situational awareness using real-time eye-tracking data. By dynamically adjusting visual elements based on gaze behavior, the system ensures that essential information remains within the pilot’s attentional focus.
The developed system integrates a Tobii eye tracker with a Python-based interface, featuring a dynamic map display with NATO-standard symbology. Initially, pilots interacted with a standalone eye-tracking interface, but trials revealed that the task was too simple, leading to a revised experimental setup. A dual-screen configuration was implemented, with Microsoft Flight Simulator providing realistic flight scenarios while the adaptive system tracked and responded to pilots' gaze patterns. If a pilot failed to fixate on a critical object for a predefined duration, the interface introduced visual alerts to redirect attention.
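The core adaptation rule described above, firing a visual alert when a critical object has not been fixated within a predefined duration, can be sketched as follows. This is a minimal illustration, not the study's actual implementation: the class and parameter names (`GazeMonitor`, `timeout_s`, `overdue_objects`) are hypothetical, and a real system would consume fixation events from the Tobii SDK rather than explicit calls.

```python
import time


class GazeMonitor:
    """Tracks elapsed time since each critical object was last fixated.

    Illustrative sketch: objects returned by overdue_objects() would
    receive a visual alert on the adaptive interface.
    """

    def __init__(self, critical_objects, timeout_s=5.0, clock=time.monotonic):
        # clock is injectable so the logic can be tested without real delays
        self.timeout_s = timeout_s
        self.clock = clock
        now = self.clock()
        self.last_fixation = {obj: now for obj in critical_objects}

    def record_fixation(self, obj):
        # Called whenever the eye tracker reports a fixation on a
        # critical object; resets that object's dwell timer.
        if obj in self.last_fixation:
            self.last_fixation[obj] = self.clock()

    def overdue_objects(self):
        # Objects not fixated within the timeout window; in the study's
        # setup these trigger visual alerts to redirect attention.
        now = self.clock()
        return [obj for obj, t in self.last_fixation.items()
                if now - t >= self.timeout_s]
```

In practice the monitor would be polled on each interface refresh, with `record_fixation` driven by the eye tracker's fixation stream; the injectable clock simply makes the timeout logic deterministic for testing.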
A certified pilot trainer evaluated the system and highlighted key improvement areas, including modifying alert mechanisms and optimizing screen arrangement for ecological validity. Based on this feedback, a refined system design was proposed, incorporating auditory and visual enhancements and a vertically stacked screen layout to better align with cockpit ergonomics.
Findings support the integration of adaptive eye-tracking interfaces in aviation, demonstrating their potential to improve situational awareness, reduce workload, and enhance decision-making. Future research will focus on refining adaptation strategies and evaluating long-term usability in operational environments.
© 2026