Alexander Eriksson
Alexander joined the Transportation Research Group in June 2014 as a Marie Curie Fellow in the HF Auto ITN project. His research focuses on human-to-vehicle instruction in the visual modality. Alexander completed an MSc in Cognitive Science, focusing on Human Factors and Driving, at Linköping University, Sweden, in 2014; in his thesis he evaluated decision-making in a highly automated driving context at the Swedish National Road and Transport Research Institute. He completed his BSc in Cognitive Science at Linköping University in 2013, specialising in Human Factors in Aviation; for his BSc thesis he evaluated touchscreens for use in turbulent conditions in collaboration with SAAB AB.
Interests
- Human Factors
- Automated Driving – Autonomous Vehicles
- Automation Safety
- Transportation Safety
- Aviation Safety
- Computer Science
- Naturalistic Decision-making
Introduction to the WP “Human Machine Interface” (from Annex 1)
This work package will develop a human-machine interface (HMI) supporting the operator of the future highly automated vehicle. The interface shall intuitively guide the operator during platooning and transient manoeuvres such as joining or leaving a platoon, lane changes and merging. The new HMI shall support human-to-vehicle instruction (setting and changing of automation modes and driver preferences) as well as multimodal (e.g., visual, haptic, and auditory) vehicle-to-human semantic information and status feedback (e.g., about automation status, change of automation mode, and environmental information like road infrastructure and surrounding vehicles) during highly automated driving. The idea is to keep the operator informed with multimodal immersive information in the vehicle’s interior. We will develop advanced visual cues in the interior via a head-up display (ESR3 & ESR4). Tactile cues will be implemented via active inceptors such as joysticks, driver’s seat, steering wheel, and/or pedals (ESR5), while directional auditory feedback will be provided via speakers in the vehicle cabin (ESR6).
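As a rough illustration of the interplay between human-to-vehicle instruction and multimodal vehicle-to-human feedback described above, the sketch below models a driver-initiated automation mode request and the visual, haptic and auditory status feedback it might trigger. This is a minimal, hypothetical example: the mode names, transition table, feedback wording and the `request_mode_change` function are assumptions made for illustration and do not represent the project's actual HMI design.

```python
# Hypothetical sketch: human-to-vehicle mode instruction with multimodal
# vehicle-to-human feedback. All names and transitions are illustrative only.
from dataclasses import dataclass
from enum import Enum, auto


class AutomationMode(Enum):
    MANUAL = auto()
    HIGHLY_AUTOMATED = auto()
    PLATOONING = auto()


# Illustrative transition table: which modes the driver may request next.
ALLOWED_TRANSITIONS = {
    AutomationMode.MANUAL: {AutomationMode.HIGHLY_AUTOMATED},
    AutomationMode.HIGHLY_AUTOMATED: {AutomationMode.MANUAL, AutomationMode.PLATOONING},
    AutomationMode.PLATOONING: {AutomationMode.HIGHLY_AUTOMATED},
}


@dataclass
class Feedback:
    channel: str   # "visual" (head-up display), "haptic" (active inceptor) or "auditory" (cabin speakers)
    message: str


def request_mode_change(current: AutomationMode, requested: AutomationMode) -> list:
    """Return multimodal status feedback for a driver-initiated mode request."""
    if requested in ALLOWED_TRANSITIONS[current]:
        return [
            Feedback("visual", f"Mode changed: {requested.name}"),
            Feedback("auditory", f"Now in {requested.name.lower().replace('_', ' ')} mode"),
            Feedback("haptic", "confirmation pulse on steering wheel"),
        ]
    return [
        Feedback("visual", f"Request rejected: cannot enter {requested.name} from {current.name}"),
        Feedback("auditory", "Mode change not available"),
    ]


if __name__ == "__main__":
    for fb in request_mode_change(AutomationMode.HIGHLY_AUTOMATED, AutomationMode.PLATOONING):
        print(fb.channel, "->", fb.message)
```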
Research activities for ESR3, Human-to-vehicle instruction (from Annex 1)
- Develop a human-machine interface (HMI) supporting the operator of the future highly automated vehicle. The interface shall intuitively guide the operator during platooning and transient manoeuvres such as joining or leaving a platoon, lane changes and merging.
- Validate the developed HMI concepts by means of behavioural observation.
- Methodology: prototyping and empirical evaluation with volunteer participants in a driving-simulation/VR context.
Key Publications
- Eriksson, A. (2014). Driver Behaviour in Highly Automated Driving: An evaluation of the effects of traffic, time pressure, cognitive performance and driver attitudes on decision-making time using a web-based testing platform (MSc thesis).
- Eriksson, A., Lindström, A., Seward, A., & Kircher, K. (2014). Can User-Paced, Menu-free Spoken Language Interfaces Improve Dual Task Handling While Driving? In Human-Computer Interaction: Advanced Interaction Modalities and Techniques (pp. 394–405). Springer International Publishing.
- Eriksson, A. (2013). Touchscreens in turbulent conditions: To what extent is precision possible? (BSc thesis).
Connect
- Email: alexander.eriksson@soton.ac.uk