What’s more, the virtual controls are intelligent and move towards the user. As long as the user merely glances around, nothing happens; the UI elements that enable various operations remain inactive. But once the user focuses their eyes on one of these elements, signalling interest, the system displays more detailed information and moves the element towards the user, where it can be controlled via gestures. Here’s an example: the so-called Audi dimensions anchor point located in the door of the Audi activesphere concept is a physical element. When you rest your eyes on it, the system registers your interest and displays, for example, the current temperature of the interior. When you focus on it for longer, it becomes interactive: the UI element moves towards the user and can be controlled virtually. When you’re done, it disappears after a few seconds; alternatively, you can swipe the content away with your hand, returning the element to its sleep mode. The result is not just physical elements overlaid with virtual 3D content, but an intelligent, context-based user interface that is displayed flexibly and can be controlled seamlessly and conveniently. Audi dimensions focuses on the users and understands their needs.
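The behaviour described above amounts to a small gaze-driven state machine: sleep until gazed at, show information after brief interest, become interactive after sustained focus, and return to sleep on a swipe or after a few seconds of inactivity. Audi has not published implementation details, so the following is a purely illustrative sketch; the class names, state names and dwell thresholds are all assumptions, not Audi's design.

```python
from enum import Enum, auto

# Hypothetical thresholds -- Audi has not published actual dwell times.
INFO_DWELL_S = 0.5      # gaze dwell before contextual info appears
ACTIVATE_DWELL_S = 1.5  # longer dwell before the element becomes interactive
IDLE_TIMEOUT_S = 3.0    # inactivity before the element returns to sleep


class ElementState(Enum):
    SLEEPING = auto()     # physical anchor only, no virtual overlay
    INFORMING = auto()    # shows contextual info (e.g. cabin temperature)
    INTERACTIVE = auto()  # moved towards the user, gesture-controllable


class AnchorPoint:
    """Sketch of the gaze-driven behaviour described for an
    'Audi dimensions' anchor point (names are illustrative)."""

    def __init__(self):
        self.state = ElementState.SLEEPING
        self.gaze_time = 0.0
        self.idle_time = 0.0

    def update(self, dt, gazed_at, swipe_away=False):
        """Advance the state machine by dt seconds of eye-tracking input."""
        if swipe_away:
            # A hand swipe returns the element to sleep immediately.
            self.state = ElementState.SLEEPING
            self.gaze_time = self.idle_time = 0.0
            return self.state
        if gazed_at:
            self.gaze_time += dt
            self.idle_time = 0.0
            if self.gaze_time >= ACTIVATE_DWELL_S:
                self.state = ElementState.INTERACTIVE
            elif self.gaze_time >= INFO_DWELL_S:
                self.state = ElementState.INFORMING
        else:
            # Looking away resets the dwell timer and starts the idle timer.
            self.gaze_time = 0.0
            self.idle_time += dt
            if self.idle_time >= IDLE_TIMEOUT_S:
                self.state = ElementState.SLEEPING
        return self.state
```

Feeding the object simulated eye-tracking frames walks it through the described flow: a short gaze surfaces the info overlay, a longer one activates the element, and looking away (or swiping) puts it back to sleep.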
When you are in the front left seat of the Audi activesphere concept, the steering wheel is retracted out of sight. If the driver wants to take over, the steering wheel swivels out of its concealed position, and the user interface adjusts its content to manual driving mode. Christina: “Even with navigation, you don’t just click an icon to access a use case; you can interact three-dimensionally with the navigation map. There are different layers for different levels of information, which build up gradually depending on what interests you.”
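The layered map Christina describes can be thought of as detail levels that are revealed progressively as the user shows interest. As a minimal sketch, assuming gaze dwell is the interest signal, the layer names and thresholds below are invented for illustration; Audi has not published how its map layers are structured or triggered.

```python
# Illustrative only: layer names and dwell thresholds are assumptions,
# not Audi's published design.
MAP_LAYERS = [
    ("route", 0.0),               # base layer: the planned route
    ("traffic", 0.5),             # revealed after brief interest
    ("points_of_interest", 1.5),  # more detail with sustained interest
    ("terrain_3d", 3.0),          # full three-dimensional detail
]


def visible_layers(dwell_seconds):
    """Return the map layers that have built up after the given gaze dwell."""
    return [name for name, threshold in MAP_LAYERS
            if dwell_seconds >= threshold]
```

A glance shows only the route; lingering on the map gradually stacks traffic, points of interest and finally the full 3D terrain on top.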
When you leave the vehicle to explore nature, for example on your mountain bike or your skis, you can keep wearing the glasses so that relevant information is displayed, such as trail networks, slope gradients or warnings of limited visibility.
Jan: “Not only are we making sure that now, for the first time, you can use such a technology to interact with your vehicle and control its functions, but we are also going far beyond that. Ultimately, the vehicle is a digital mixed reality platform that is ready for what we hope to see in the future: I get into the car, and it comes naturally to me to control it via the MR glasses. I get out of the car and go about my activities such as skiing or mountain biking, where the glasses also provide support. The vehicle is my extended companion, enabling this seamless experience.” The car of the future is embedded in an ecosystem, enriching the real world with context-based virtual content.