Simulator study in our Munich usability lab.

STADT:up – Intuitive and understandable HMI for autonomous vehicles

Dr. Lukas Flohr Senior UX Designer • Specialist Mobility

Esther Barra Lead Communication Manager

Philipp Shardt Lead UX Software Engineer • Technology Expert Web

Tim Demesmaeker UX Designer

12/06/2025 • 5 minutes reading time

Autonomous vehicles are on the rise and transforming our cities. But how can we ensure that this technology not only works, but also inspires? Understandable Human-Machine Interfaces (HMI) are the key. We have been participating in the collaborative research project STADT:up to develop solutions for autonomous urban mobility since January 2023. Our goal is to create technologies and concepts that enable automated and autonomous driving in urban environments. At Ergosign, we focus on designing human-machine interaction to sustainably foster acceptance, trust, and safety for vehicle occupants. This update offers insights into our current progress and findings.

Human-centered and explainable AI in focus

How can we make autonomous driving understandable and accessible for everyone? That is the central question for us in the STADT:up project. In Subproject 2 – “Human Factors”, Ergosign focuses on developing information and interaction concepts that explain the behavior of autonomous vehicles. Only when we understand what’s happening around us and how the vehicle is making decisions can we feel truly safe. Human-centered user experience design is crucial here: solutions must be aligned with people’s needs to unlock their full potential.

Adaptive and modular interaction concepts

To build trust in AI-based systems like autonomous vehicles, we need to demonstrate how they make decisions. We are experimenting with different modalities for communicating AI decisions: displays, audio signals, and ambient lighting. It’s important that the information is helpful without being overwhelming. We’re aiming for the right information balance and exploring situation- and context-dependent interaction modules.
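One way to think about such situation-dependent modules is as a mapping from driving context to active output channels. The sketch below is purely illustrative and not project code; the `DrivingContext` fields, the criticality scale, and the thresholds are our own assumptions for the example.

```typescript
// Illustrative sketch (not project code): selecting HMI output modalities
// based on the driving situation, as one way to model situation- and
// context-dependent interaction modules.

type Modality = "display" | "audio" | "ambientLight";

interface DrivingContext {
  criticality: number;       // 0 (routine) .. 1 (critical); hypothetical scale
  passengerLooking: boolean; // is the occupant attending to a screen?
}

// Return the set of modalities to activate for a given context.
// Thresholds are placeholders, not values from the STADT:up studies.
function selectModalities(ctx: DrivingContext): Modality[] {
  const active: Modality[] = ["ambientLight"]; // always-on, low-attention channel
  if (ctx.criticality >= 0.3 || ctx.passengerLooking) {
    active.push("display"); // visualize surroundings and planned maneuver
  }
  if (ctx.criticality >= 0.7) {
    active.push("audio"); // reserve sound cues for genuinely critical moments
  }
  return active;
}
```

The point of such a model is that audio, as the most intrusive channel, stays reserved for critical moments, while ambient light provides a constant low-attention signal.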

Simulator studies provide key insights

We rigorously test our ideas and concepts through user studies and simulation environments. Some of our key findings so far include:

  • Information needs: The occupants’ need for information strongly depends on how safe or dangerous the driving situation feels.

  • Critical moments: In critical situations, many occupants find it important to know exactly what the vehicle sees and plans. A clear and understandable visualization of the surroundings is very valuable here.

  • Multimodal signals: Multimodal signals can optimize the user experience but require careful design to avoid overwhelming or irritating the users. Light and sound cues can enhance UX and build trust in autonomous vehicles if used purposefully.

Adaptive and modular interaction concept with Augmented Reality in a critical, urban driving situation.

Augmented Reality: Looking ahead

To improve information delivery, we’re currently exploring the potential of Augmented Reality (AR). With AR, we can project cues and information directly into the user’s field of vision, such as onto the windshield. In summer 2025, we plan to conduct further user studies in Munich and Hamburg to explore this in more detail. Want to take part and help shape the future of mobility? Sign up for our study here.

EngineeringUI: The cockpit for developers

In addition to the UX for passengers, we are also developing an EngineeringUI for developers of AI software in Subproject 5 – “Autonomous Driving.” As mentioned in our first article on STADT:up, we initially analyzed the usage context with our partners, derived requirements, and translated them into personas. Based on this foundation, we identified opportunity areas where the EngineeringUI can support developers in their work.

STADT:up developer persona “Eggart Engineer”.

The EngineeringUI visualizes vehicle sensor data in a web application and is optimized for mobile use in and around test vehicles. It helps developers record, annotate, and manage data and evaluate and optimize algorithms.

For development, we use the Robot Operating System (ROS) as a foundation — a flexible, open-source framework that handles core robot communication and control. To visualize and analyze the ROS-generated data streams in real time, we use the open-source application Foxglove, which we adopted at the start of the project. We have specifically optimized Foxglove for mobile use on tablets and other portable devices and expanded its visualization capabilities with integrated recording and annotation functions that let us mark and log critical events during operation.
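Conceptually, such an annotation function boils down to attaching labeled, timestamped markers to recorded data streams. The following sketch is a simplified illustration, not the actual Foxglove integration; the `Annotation` fields and the `markEvent` helper are assumptions made for the example.

```typescript
// Illustrative sketch (not the actual Foxglove integration): a minimal
// annotation log for marking critical events during a test drive.

interface Annotation {
  timestampNs: bigint; // ROS-style nanosecond timestamp
  topic: string;       // e.g. "/perception/objects" (hypothetical topic name)
  label: string;       // e.g. "unexpected braking"
  note?: string;       // optional free-text comment from the engineer
}

class AnnotationLog {
  private entries: Annotation[] = [];

  // Record a critical event against a data stream at a given time.
  markEvent(timestampNs: bigint, topic: string, label: string, note?: string): void {
    this.entries.push({ timestampNs, topic, label, note });
  }

  // All annotations for one topic, in recording order, e.g. for review
  // and algorithm evaluation after the drive.
  forTopic(topic: string): Annotation[] {
    return this.entries.filter((a) => a.topic === topic);
  }
}
```

Because each marker carries the stream name and timestamp, annotated moments can later be jumped to directly in the recorded data when evaluating algorithms.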

In the long term, the UI aims to enable high-performance sensor data visualization and real-time adjustment of visualization or HMI settings. A major challenge is the sheer volume of data transmitted and processed by the test vehicles. To tackle this, we’re working closely with our project partners.
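A common first step for taming such data volumes is to decimate high-frequency streams before they reach the visualization layer. The sketch below shows the general idea under our own assumptions; it is not the project's actual pipeline.

```typescript
// Illustrative sketch (not project code): decimating a high-frequency
// sensor stream before visualization, keeping every n-th message so the
// UI renders a manageable subset of the raw data rate.

function decimate<T>(messages: T[], keepEveryNth: number): T[] {
  if (keepEveryNth <= 1) return messages; // no reduction requested
  return messages.filter((_, i) => i % keepEveryNth === 0);
}
```

For example, keeping every fifth message reduces a 100 Hz stream to an effective 20 Hz for display, while the full-rate data can still be recorded untouched for later analysis.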

EngineeringUI design with visualization of the nuScenes dataset (left) and MVP integration on the partner test vehicle (right).

On the home stretch

The STADT:up project is heading into its final phase and will conclude in spring 2026. Thanks to collaboration with OEMs, suppliers, and research institutions, we’re developing practical, scientifically grounded solutions for understandable autonomous systems. We’re also supporting the next generation of technical and scientific talent and creating valuable synergies between human factors research and technology development. The empirical findings will flow directly into the development of human-centered, understandable, and trustworthy autonomous vehicles.

Through this and many other projects, Ergosign is applying its expertise to shape the mobility of tomorrow. We look forward to sharing more insights and results from STADT:up’s final project year soon!

Want to shape the future with us?

Together, we design user-friendly and trustworthy automated driving functions and smart mobility concepts. We support you with our UX expertise, human-centered HMI development, and proven methods from research and practice.

Lukas is a Senior UX Designer and leads the STADT:up project at Ergosign. He’s been with the team since 2016 and wrote his PhD thesis on the interaction with autonomous vehicles. With a collaborative and user-centered mindset, Lukas works on shaping future mobility by developing interaction concepts across a range of industry and research projects.

Dr. Lukas Flohr • Specialist Mobility

Esther is Lead Communication Manager at Ergosign and has been part of the team since 2022. She combines her passion for social media with strategic communication – and with in-depth expert content: she regularly publishes insight articles and case studies on topics such as AI, DXP, and mobility.

Esther Barra • Lead Communication Manager

This work is a result of the joint research project STADT:up (funding code 19A22006K). The project is supported by the German Federal Ministry for Economic Affairs and Climate Action (BMWK), based on a decision of the German Bundestag.

In recent times, artificial intelligence (AI) has made a significant impact across various sectors. We are confident that the incorporation of AI into both our own and our clients’ offerings will progress further. Leveraging our extensive experience and diverse UX portfolio, we eagerly anticipate shaping our future alongside these technological advancements.