HMI, car.HMI conference, future trends

Revolutionizing Vehicle Interactions: Exploring the Future of HMIs at the 2023 car.HMI Conference

Discover the future of vehicle interaction and the power of intuitive HMIs in this recap of our experience at the car.HMI conference. Notable topics included the central role of user experience, innovative concepts for autonomous driving, UI showcases based on Android Automotive OS, and the potential of AI in the advancement of HMIs. From driver emotion recognition to advanced gesture interactions, this event made us aware of exciting possibilities for the future.

05 MIN

In mid-June, our team attended the 2023 car.HMI Conference in Berlin, an event that offered many interesting insights into intuitive vehicle interactions and the future of human-machine interfaces (HMIs). We’re recapping what we found to be the most impactful takeaways here.

In many talks, speakers highlighted the pivotal role of user experience (UX) for HMI. As cars become more connected and customizable, the HMI and its UX gain importance for both customer satisfaction and brand perception.

The car.HMI conference was combined with the InCabin Sensing conference, which our team found especially thought-provoking. Through many thematic touch points, this combination fostered innovative ideas and new perspectives, highlighting how sensor technology can be used inside the car in impactful ways.

Autonomous Driving

Unsurprisingly, autonomous driving was a big topic. Several exhibits and talks outlined concepts for keeping the driver in the loop across the different states of automated driving, recommending various implementations of multimodal feedback and warnings that communicate the current situation not only visually but also through haptic and acoustic cues.

Additionally, many presentations introduced head-up display (HUD) and augmented reality (AR) concepts for automated driving. We especially liked the presentation by Basemark, which showcased their technology’s ability to integrate information seamlessly into the live view of the driving situation without distracting the driver or obscuring essential parts of the real-world surroundings.

Android Automotive

Furthermore, there were again many user interface (UI) concept showcases based on the Android Automotive OS. LOTUS’s new Eletre, a battery-electric sport utility vehicle (SUV), was on display, and we were able to test the UI experience directly in the cockpit. According to the designers we talked to, the design goal was to make every important function reachable within three clicks. In our opinion, this approach, while appealing, led to a heavily stacked information architecture that left a cluttered impression. The display also included many visual elements without any interaction purpose, which conveyed an apparent focus by LOTUS on East Asian markets. From our perspective, this may lead to slower adoption in regions where a more minimal approach to interface design is common.

AI Applications in Future Vehicles

Another hot topic was the wide range of possibilities artificial intelligence (AI) offers for improving HMIs. Several exhibitors showed examples of how AI could enhance personal assistants or personalize the HMI according to a driver’s needs. One concept that caught our attention was an AI-based feature developed by Luxoft that allowed drivers to create an individual Central Information Display (CID) background through voice prompts and then automatically adjusted the layout, colours, and fonts of the interface accordingly.

Detecting Drivers’ Emotions

There were also discussions around detecting the driver’s emotional state while driving. Several setups and applications were shown (e.g., by Fraunhofer) in which cameras captured the driver’s facial expressions and combined this information with input from other sensors for pulse, movement, or steering behaviour.

Luxoft showcased an innovative approach to training such systems: using computer-generated faces as training data for the AI. Potentially, this could allow for much faster and more accurate training than relying on human test subjects.

Another interesting concept was the time-of-flight sensors shown by Infineon, which can be used inside the cabin to enable smarter airbag deployment or next-level gesture interaction.

See you next year 🚀📅

Car.HMI was a fantastic event, and the uintent team was thrilled to have had the opportunity to exchange ideas with other industry experts. Many thanks to the organizers of the car.HMI conference for hosting such a remarkable and thought-provoking event!

Want to read more blog articles on automotive UX?

Author

Lisa Umbach

Lisa is passionate about understanding the needs and motivations of users while keeping all stakeholders and business goals in mind. She has been responsible for conducting qualitative and quantitative user research projects since 2012, from study design, moderation, analysis, and reporting to presentations and workshops. Furthermore, she has extensive knowledge of user-centered design methods and the design thinking approach, including many creativity techniques and various prototyping tools (e.g., Figma, Adobe XD, Axure RP).
