Human-machine interface of the future highly automated vehicle
This work package will develop a human-machine interface (HMI) supporting the operator of the future highly automated vehicle. The interface shall guide the operator intuitively during platooning and during transient manoeuvres such as joining or leaving a platoon, lane changes, and merging.
Many current in-vehicle human-machine interfaces present drivers with salient, instantaneous information (symbols, warning sounds, and spoken or written messages). This approach contributes to false alarms, distraction, information overload, and disuse, while leaving much of the available information unused.
Alexander and Veronika will develop advanced visual cues in the interior via a head-up display. Bastiaan will implement tactile cues via active control elements such as joysticks, the driver’s seat, the steering wheel, and/or the pedals, while Pavlo will provide directional auditory feedback via speakers in the vehicle cabin. Using driving simulator experiments, we will validate each HMI modality in isolation as well as the combined multimodal HMI. Experimental scenarios will be developed in close cooperation with the researchers of WP1, taking into account their observed behavioural effects.
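As an illustration only (the project's actual cue mappings will emerge from the simulator experiments; every manoeuvre name and cue description below is hypothetical), the idea of coordinating visual, tactile, and auditory feedback per manoeuvre can be sketched as a simple dispatch table:

```python
from enum import Enum


class Manoeuvre(Enum):
    """Transient platooning manoeuvres mentioned in this work package."""
    JOIN_PLATOON = "join platoon"
    LEAVE_PLATOON = "leave platoon"
    LANE_CHANGE = "lane change"
    MERGE = "merge"


# Hypothetical mapping: each manoeuvre triggers one cue per modality
# (head-up display, active control elements, directional speakers).
CUE_PLAN = {
    Manoeuvre.JOIN_PLATOON: {
        "visual": "HUD: highlight the target gap in the platoon",
        "haptic": "steering-wheel pulse toward the gap side",
        "auditory": "directional chime from the gap side",
    },
    Manoeuvre.LEAVE_PLATOON: {
        "visual": "HUD: show exit trajectory",
        "haptic": "pedal resistance released gradually",
        "auditory": "spoken confirmation of platoon exit",
    },
    Manoeuvre.LANE_CHANGE: {
        "visual": "HUD: overlay target lane",
        "haptic": "seat vibration on the target-lane side",
        "auditory": "directional tone from the target lane",
    },
    Manoeuvre.MERGE: {
        "visual": "HUD: mark merge point and gap",
        "haptic": "steering-wheel torque nudge toward the gap",
        "auditory": "rising tone as the merge point approaches",
    },
}


def cues_for(manoeuvre: Manoeuvre) -> dict:
    """Return the visual/haptic/auditory cue set for a manoeuvre."""
    return CUE_PLAN[manoeuvre]
```

A table-driven design like this would keep the three modalities synchronised from a single source of truth, which is one plausible way to realise the combined multimodal HMI evaluated in the simulator studies.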
Alexander, Veronika, Bastiaan and Pavlo will study the effects of the HMI concepts on behaviour, mental workload, trust, disuse, situation awareness, locus of control, and stress. These effects will be assessed, in cooperation with WP3, using physiological measures such as heart-rate variability and pupil dilation, the latter recorded with nonintrusive eye-tracking. Special attention will be paid to the effect of the HMI on driver immersion and distraction. A demonstrator combining visual, haptic, and auditory cues, and including the driver state monitor of WP3, will be prepared for public demonstration in the project’s outreach activities.
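To make the heart-rate-variability measure concrete: one widely used time-domain index is RMSSD, the root mean square of successive differences between RR intervals. A minimal sketch (the sample values are illustrative; the project's actual analysis pipeline is defined with WP3):

```python
import math


def rmssd(rr_intervals_ms):
    """Root mean square of successive differences (RMSSD) between
    consecutive RR intervals, a common time-domain HRV index (ms)."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))


# Illustrative RR intervals in milliseconds; lower RMSSD is often
# associated with higher mental workload, though interpretation
# requires per-participant baselines.
print(round(rmssd([800, 810, 790, 805]), 1))  # → 15.5
```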