Can a user's body language and personal choices, when working with an ATM (Air Traffic Management) system, provide valuable cognitive information that uncovers hidden patterns, allowing us to distinguish between “good” and “bad” (erroneous) inputs and interactions and ultimately to avoid mistakes and poor decisions?
Safety-critical systems in general, and ATM systems in particular, are designed to the highest safety standards. The 6th Sense project will concentrate on improving the fault tolerance of the human-machine interface by accepting the user's overall body language as input. We want to combine multiple sensors and actuators (e.g. mouse, pen, eye tracking, gesture recognition) with overall system information and fuse them into the single most likely interaction, supported where necessary by a decision workflow system that judges the different possible meanings of an interaction. The 6th Sense module will support the operator by offering the first steps towards a more holistic solution for multimodal user interaction.
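The fusion step described above can be sketched in a minimal form. Assuming each sensor channel reports a confidence distribution over candidate interactions, a simple late-fusion rule multiplies the per-sensor distributions (a naive Bayes combination under a conditional-independence assumption) and selects the most likely interaction. The sensor names, candidate interactions, and probabilities below are illustrative, not part of the project specification.

```python
# Hypothetical late-fusion sketch: each sensor yields a probability
# distribution over candidate interactions; assuming the sensors are
# conditionally independent, multiply the distributions and normalize.

def fuse_interactions(sensor_readings):
    """sensor_readings: list of dicts mapping interaction -> probability."""
    combined = {}
    for reading in sensor_readings:
        for interaction, p in reading.items():
            combined[interaction] = combined.get(interaction, 1.0) * p
    total = sum(combined.values())
    # Normalize so the fused scores again form a probability distribution.
    fused = {k: v / total for k, v in combined.items()}
    best = max(fused, key=fused.get)
    return best, fused

# Illustrative readings from three modalities for two candidate interactions.
readings = [
    {"select_flight": 0.7, "pan_map": 0.3},  # mouse trajectory
    {"select_flight": 0.6, "pan_map": 0.4},  # eye tracking
    {"select_flight": 0.5, "pan_map": 0.5},  # gesture recognition
]
best, fused = fuse_interactions(readings)  # best == "select_flight"
```

A real system would additionally weight the sensors by reliability and defer ambiguous cases (near-uniform fused distributions) to the decision workflow system mentioned above.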
The theory will be proven with the software module prototype “6th Sense”, which could be integrated into any CWP (Controller Working Position) prototype to improve the fault tolerance of the human-machine interaction.
The objective of the 6th Sense project is an integrated solution for multimodal user interaction. The 6th Sense module will be a central Human Machine Interface (HMI) abstraction layer. With a global view of the heterogeneous input data, it will be possible to analyze the typical workflow of an ATM user. Using methods from signal processing, signal fusion, and machine learning, among others, the new HMI module will aim to record, analyze, and classify human interactions.
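The classification step can be illustrated with a deliberately minimal sketch: recorded interactions are reduced to feature vectors and compared against labelled examples with a 1-nearest-neighbour rule. The feature names (dwell time, gaze stability, cursor jitter), the training examples, and the labels are assumptions for illustration only; the project would use richer features and learned models.

```python
import math

# Hypothetical sketch: classify an interaction as "good" or "erroneous"
# by finding the closest labelled example in feature space (1-NN rule).

def classify(sample, labelled):
    """labelled: list of (feature_vector, label) pairs."""
    def dist(a, b):
        # Euclidean distance between two equal-length feature vectors.
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    _, label = min(labelled, key=lambda pair: dist(sample, pair[0]))
    return label

# Illustrative features: (dwell time, gaze stability, cursor jitter),
# each normalized to [0, 1].
training = [
    ((0.2, 0.9, 0.1), "good"),       # short dwell, steady gaze, low jitter
    ((0.8, 0.3, 0.7), "erroneous"),  # long dwell, wandering gaze, jitter
]
label = classify((0.25, 0.85, 0.15), training)  # "good"
```

The point of the sketch is the pipeline shape (record features, compare against labelled interactions, emit a class), not the particular classifier.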