When your Car Becomes your Travel Guide
The car of the future will understand us: it will respond to our language and gestures. If there are problems, we’ll get the right tip at the right time. The experts at Volkswagen Electronics Development are working hard to make this goal a reality. Part 3 of our series on user experience trends.
In the future, automated driving and connectivity will fundamentally change our entertainment options on weekend excursions. But that’s not all: in Volkswagen’s Electronics Development department, experts are working on enabling our vehicles to become personal assistants that, with the help of artificial intelligence, understand our needs. Stephanie, a fictitious persona who is visiting her friend in Berlin, was created to help the developers with their work. In just a few years, though, the new functions will be available to real-life customers as well.
Astrid Kassner is one of the development team’s experts for voice and gesture control. The team’s goal is to ensure that future vehicles understand – and carry out – wishes expressed with just a small gesture of the finger. This will be particularly important, says Kassner, when the person in a self-driving car is no longer responsible for steering. At that point, “we will lean back and no longer be able to reach the cockpit with our arms. So we are developing voice commands and gestures to complement touch operation via the display.”
An infrared camera captures the gestures
Even today, the developer can effortlessly control the interior lighting of her cockpit model, for example, with her right hand and a few words. With swiping motions, she navigates between streaming portals and messaging options without touching the display. Trials with test subjects have produced encouraging results. “Many need just a few minutes to get used to the operating concept,” says Astrid Kassner. “That certainly has something to do with the fact that we are accustomed to using similar gestures with our smartphones.”
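A swipe like the one described can be reduced to the horizontal displacement of the hand over the course of the motion. The sketch below is a deliberately minimal illustration of that idea; the function name, the 10 cm threshold, and the idea of feeding it tracked x-coordinates are assumptions for illustration, not Volkswagen’s actual implementation.

```python
def classify_swipe(x_positions: list[float], threshold_m: float = 0.10) -> str:
    """Classify a horizontal swipe from tracked hand x-coordinates (in metres).

    A net rightward displacement beyond the threshold counts as a right
    swipe, a net leftward one as a left swipe; anything smaller is ignored
    so that small hand tremors do not trigger navigation.
    """
    delta = x_positions[-1] - x_positions[0]
    if delta > threshold_m:
        return "swipe_right"
    if delta < -threshold_m:
        return "swipe_left"
    return "none"

# A hand moving 15 cm to the right across several camera frames:
print(classify_swipe([0.00, 0.04, 0.09, 0.15]))
```

A production system would of course smooth the tracked positions and consider velocity and hand pose, but the net-displacement rule captures why such gestures feel familiar from smartphones.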
In technical terms, gesture control utilizes an infrared camera that captures the passenger’s hand motions. “The camera is continuously measuring how long the invisible infrared rays take to travel to the person’s hand and back again. With that information, it is possible to determine the position and the motion of the hand,” explains Kassner.
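The measurement Kassner describes is the classic time-of-flight principle: light travels at a known speed, so half the round-trip time gives the distance to the hand. A minimal sketch of that arithmetic, with an illustrative function name not taken from any real sensor API:

```python
SPEED_OF_LIGHT_M_PER_S = 299_792_458

def distance_from_round_trip(round_trip_time_s: float) -> float:
    """Distance to the reflecting hand from the round-trip time of an
    infrared pulse: the light covers the distance twice, hence the halving."""
    return round_trip_time_s * SPEED_OF_LIGHT_M_PER_S / 2

# A pulse returning after about 4 nanoseconds implies a hand roughly 0.6 m away.
print(round(distance_from_round_trip(4e-9), 2))
```

Repeating this measurement for each pixel of the infrared camera yields a depth map, from which the position and motion of the hand can be tracked frame by frame.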
The cars of tomorrow will need to not only understand our speech and gestures, but also give us the right tips at the right moment. That’s what developer Stefan Henze is working on. When the car encounters heavy traffic, for example, the vehicle could recommend that the driver use Adaptive Cruise Control (ACC), which regulates the distance between the vehicle and the one in front of it. “What makes it special is that the vehicle offers ACC precisely at the moment when the driver could actually make use of the function,” says Stefan Henze. If needed, the car can explain to the person exactly what ACC means. If the customer is interested, they can then activate the function with a voice command.
The driver is in the driver’s seat
The biggest challenge with such recommendations is finding the right dose, says Stefan Henze. “We don’t want to pester the driver with constant tips.” In order to provide the right tip at the right moment, artificial intelligence in future vehicles will evaluate vehicle data like speed and GPS position, explains the developer. “The recommendation to use ACC will only be issued when an extended standstill is imminent.” Another benefit is flexibility: in the vehicle of tomorrow, the driver will have the option of whether to buy the function or merely use it for a limited amount of time. Henze: “Let’s assume I only drive a long distance once a year while on vacation. Then I would subscribe to the traffic jam assistant just for that specific time period.”
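Henze’s “right tip at the right moment” logic can be pictured as a set of rules over vehicle signals. The sketch below is a hedged illustration under assumed inputs (speed, distance to reported congestion, and whether the tip has already been shown); all names and thresholds are hypothetical, not the actual Volkswagen implementation.

```python
from dataclasses import dataclass

@dataclass
class VehicleState:
    speed_kmh: float          # current speed
    jam_ahead_km: float       # distance to reported congestion (from GPS and traffic data)
    acc_active: bool          # Adaptive Cruise Control already switched on?
    tip_already_shown: bool   # suppress repeats so the driver is not pestered

def should_recommend_acc(state: VehicleState) -> bool:
    """Offer ACC only when an extended standstill is imminent, ACC is off,
    and the driver has not already seen the tip on this trip."""
    approaching_jam = 0 < state.jam_ahead_km < 2.0 and state.speed_kmh > 30
    return approaching_jam and not state.acc_active and not state.tip_already_shown

# Cruising at 80 km/h with a jam reported 1.5 km ahead: offer the tip once.
print(should_recommend_acc(VehicleState(80, 1.5, False, False)))
```

The `tip_already_shown` flag is the simplest possible dosing mechanism; a learned model could replace the fixed thresholds, but the gating idea — recommend only when the function is immediately useful — stays the same.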