Tower controllers embrace mixed reality
Virtual and augmented reality (V/AR) can ease controller tasks and enable more seamless operations. This candidate solution continues work already performed during SESAR 2020 wave 1 and aims to support tower controllers with tracking labels, air gestures and attention guidance, delivered through advanced human-machine interface (HMI) interactions.
V/AR blends real-world imagery with computer-generated data (augmented reality) in real time, enhancing the visual information available for identifying and tracking aircraft and vehicles on and around the airport. In low-visibility conditions, synthetic vision can display digital georeferenced data that substitutes for the missing real view (virtual reality).
When using V/AR, auxiliary information is merged with the out-of-the-window (OTW) view and presented as an overlay on top of the real-world scene. The controller's attention no longer has to be divided between the primary visual field (OTW) and auxiliary tools (such as paper or electronic flight strips, surface movement radar, gap-filler camera streams and alert indications), reducing 'head-down' time and increasing situational awareness.
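To illustrate how an overlay can be anchored to the OTW view, the sketch below projects a georeferenced position into camera pixel coordinates using a simple pinhole model. This is a minimal sketch, not the solution's actual implementation: the function names, camera parameters and local east-north-up (ENU) frame are assumptions introduced purely for illustration.

```python
# Minimal sketch: anchor a tracking label in the OTW view by projecting a
# georeferenced point (local ENU coordinates) into camera pixels.
# All names and parameters here are illustrative assumptions.
import numpy as np

def enu_to_pixel(p_enu, cam_pos_enu, R_cam, K):
    """Project a point in local ENU coordinates onto the camera image plane."""
    p_cam = R_cam @ (p_enu - cam_pos_enu)   # world (ENU) -> camera frame
    if p_cam[2] <= 0:                       # behind the camera: not drawable
        return None
    uv = K @ (p_cam / p_cam[2])             # pinhole projection to pixels
    return float(uv[0]), float(uv[1])

# Hypothetical setup: tower camera at the ENU origin, looking north,
# image x pointing east and image y pointing down.
K = np.array([[1500.0, 0.0, 960.0],         # focal lengths / principal point
              [0.0, 1500.0, 540.0],
              [0.0,    0.0,   1.0]])
R_cam = np.array([[1.0, 0.0,  0.0],
                  [0.0, 0.0, -1.0],
                  [0.0, 1.0,  0.0]])
aircraft_enu = np.array([120.0, 800.0, 5.0])  # east, north, up in metres
print(enu_to_pixel(aircraft_enu, np.zeros(3), R_cam, K))
# -> roughly (1185.0, 530.6): the screen position for the tracking label
```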
The virtual reality application allows controllers to interact with tracking labels through a set of air gestures and to issue clearances for non-time-critical tasks (start-up, push-back), as sketched below.
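The sketch below shows one way such an interaction could be routed: a recognised air gesture on a currently selected label is mapped to a clearance action, with the mapping deliberately restricted to non-time-critical tasks. The gesture names and the Clearance enumeration are hypothetical, chosen only to illustrate the interaction described above.

```python
# Minimal sketch: route recognised air gestures on a selected tracking label
# to clearance actions. Gesture names and enum values are assumptions.
from enum import Enum, auto

class Clearance(Enum):
    START_UP = auto()
    PUSH_BACK = auto()

# Only non-time-critical tasks are exposed through air gestures.
GESTURE_TO_CLEARANCE = {
    "swipe_up": Clearance.START_UP,
    "swipe_right": Clearance.PUSH_BACK,
}

def on_gesture(gesture: str, selected_callsign: str | None) -> None:
    """Issue a clearance when a known gesture is made on a selected label."""
    if selected_callsign is None:
        return  # no tracking label selected: ignore the gesture
    clearance = GESTURE_TO_CLEARANCE.get(gesture)
    if clearance is None:
        return  # unknown or time-critical action: handled by other means
    print(f"{selected_callsign}: {clearance.name} clearance issued")

on_gesture("swipe_up", "DLH123")  # -> DLH123: START_UP clearance issued
```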
Furthermore, the Attention Guidance function can be triggered by input from external sources, such as safety nets or airport sensors, to display perceptual cues that direct the attention of air traffic controllers towards a specific event.
Computer-generated overlays can also be displayed adaptively by means of synthetic vision, for example in low-visibility conditions.
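One way to picture these two functions together is sketched below: an external event (for example from a safety net or a surface sensor) triggers a perceptual cue at the event's position in the OTW view, and the cue is supplemented by a synthetic-vision outline when visibility is low. The event fields, cue styles and visibility threshold are assumptions made for the sketch, not values from the solution.

```python
# Minimal sketch: an external event triggers an attention-guidance cue,
# adaptively enriched with a synthetic-vision outline in low visibility.
# Event fields, cue styles and the threshold are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class ExternalEvent:
    source: str                              # e.g. "safety_net", "surface_sensor"
    position_enu: tuple[float, float, float] # where the event occurred
    severity: int                            # 1 = advisory ... 3 = alert

def attention_cue(event: ExternalEvent, visibility_m: float) -> dict:
    """Choose a perceptual cue; add a synthetic overlay in low visibility."""
    style = "pulsing_halo" if event.severity >= 3 else "soft_highlight"
    # In low visibility the real view may not show the target at all, so
    # supplement the cue with a synthetic-vision outline of the object.
    synthetic = visibility_m < 550.0  # hypothetical low-visibility threshold
    return {"style": style, "at": event.position_enu, "synthetic_outline": synthetic}

cue = attention_cue(ExternalEvent("safety_net", (120.0, 800.0, 5.0), 3), 400.0)
print(cue)  # pulsing halo at the event position, with a synthetic outline
```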
BENEFITS
Enhanced safety
Increased situational awareness
Improved efficiency
Improved resilience