Institute of Flight Guidance
Area of research:
Diploma & Master Thesis
The Controller Assistance department develops prototypical support systems for air traffic controllers. This also encompasses human-machine interfaces and functionalities for visualising information to the air traffic operator. Among other things, systems for assistant-based speech recognition and for detecting the visual attention of the controller have already been developed.

An air traffic controller continuously scans his or her radar display, on which various aircraft labels and waypoints as well as airspace regions are drawn. An existing assistant-based speech recognition system generates hypotheses for the most likely next controller commands. For example, only commands for aircraft callsigns that are actually present in the airspace are predicted. In addition, descent and speed-reduction commands are expected for approaching aircraft, rather than climb and acceleration commands.

This command prediction is to be supplemented by another sensor. The controller is more likely to give a command to the aircraft that he or she is currently looking at on the radar display, or has recently looked at. This assumption can also be adopted, in a weaker form, for airspace regions containing several aircraft. Therefore, an existing software toolbox is to be extended so that the data of an eye tracker is used as input to determine the regions viewed on the display and then to generate probability-weighted command predictions.

Expansion to a more valuable thesis: As an extension of this idea, the controller's scan pattern could be used to make predictions that look further into the future. In addition, it must be checked whether certain command types become more likely depending on the geometric relation of the aircraft looked at: for two aircraft flying behind each other, a speed-reduction command for the following aircraft; for two aircraft with intersecting flight paths, heading / DIRECT_TO instructions; and for two aircraft at the same altitude, DESCEND / CLIMB commands.
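One way the gaze sensor could supplement the speech recognizer is to re-weight the recognizer's command hypotheses by how recently the controller fixated the corresponding aircraft. The following is only an illustrative sketch, not the existing toolbox's interface: the names `Hypothesis`, `reweight`, the decay half-life, and the blending weights are all assumptions.

```python
import math
from dataclasses import dataclass

@dataclass
class Hypothesis:
    callsign: str   # e.g. "DLH123"
    command: str    # e.g. "DESCEND", "REDUCE"
    score: float    # prior score from the speech recognizer (assumed)

def gaze_weight(seconds_since_fixation, half_life_s=5.0):
    """Exponential decay: aircraft fixated recently get weight near 1,
    aircraft never fixated get weight 0 (half-life is an assumption)."""
    return math.exp(-math.log(2) * seconds_since_fixation / half_life_s)

def reweight(hypotheses, last_fixation):
    """Blend each recognizer score with a gaze-based weight and renormalize.

    last_fixation maps callsign -> seconds since the controller last
    fixated that aircraft's label. The 0.2 floor keeps commands for
    unseen aircraft possible; gaze biases, it does not veto."""
    scored = []
    for h in hypotheses:
        ago = last_fixation.get(h.callsign, float("inf"))
        w = 0.2 + 0.8 * gaze_weight(ago)
        scored.append((h, h.score * w))
    total = sum(s for _, s in scored)
    return sorted(((h, s / total) for h, s in scored), key=lambda p: -p[1])
```

With two equally scored hypotheses, the one whose aircraft was just looked at would rank first after re-weighting.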
The mouse pointer could also serve as a further sensor if the controller "carries" it along while scanning the radar screen. The existing software toolbox should therefore also be extended so that the data of an eye tracker is used as input to verify these hypotheses. This also means recording test data with controllers in advance and validating the prototype at the end.
In the course of your master's / bachelor's thesis, the following tasks need to be handled:
- familiarisation with the concepts of assistant-based speech recognition and eye tracking in the air traffic control environment
- development of a concept for the prediction of controller commands based on eye-tracking data
- implementation of the developed concept and integration into the existing source code of a software toolbox for controller support systems, including integration of gaze and speech recognition
- continuous documentation of working steps and results
- if applicable, support for the evaluation of the developed functionalities with air traffic controllers in simulation studies
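For the concept-development task, the extended hypothesis (speed reduction for aircraft in trail, heading / DIRECT_TO for crossing paths, DESCEND / CLIMB at equal altitude) could start from a rough geometric classification of the two aircraft the controller fixated in sequence. This is a speculative sketch: the class names, thresholds, and decision order are assumptions to be refined against real traffic data.

```python
from dataclasses import dataclass

@dataclass
class Aircraft:
    x: float         # horizontal position, e.g. NM east (assumed frame)
    y: float         # horizontal position, e.g. NM north
    heading: float   # degrees
    altitude: float  # feet

def pair_relation(a, b, hdg_tol=15.0, alt_tol=300.0):
    """Classify the relation of two sequentially fixated aircraft to bias
    the expected command type (thresholds are illustrative).

    FOLLOWING      -> expect speed reduction for the trailing aircraft
    SAME_ALTITUDE  -> expect DESCEND / CLIMB
    CROSSING       -> expect heading / DIRECT_TO"""
    # smallest angle between the two headings, in [0, 180]
    dh = abs((a.heading - b.heading + 180.0) % 360.0 - 180.0)
    if dh <= hdg_tol:
        return "FOLLOWING"
    if abs(a.altitude - b.altitude) <= alt_tol:
        return "SAME_ALTITUDE"
    return "CROSSING"
```

The relation label could then shift probability mass between command types before the gaze-weighted hypotheses are ranked.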