Envisioning the Future of Autonomous Driving
In our recent work with Hyundai, we explored the near-term, human-centered design challenges associated with autonomous vehicles (AVs). From building trust in intelligent systems to sharing control with machines, we expect these kinds of pragmatic issues to dominate automotive UX for some time to come. But what will happen to the driving experience itself? What about the simple pleasure of driving? Amidst the optimism surrounding AVs, human mobility is on the brink of a dramatic transformation that is close enough to perceive, yet impossible to predict.
If automakers have their way, future vehicles might give us the best of both worlds: a balance between autonomy and agency. A recent McKinsey report projected that by 2030 only 15% of vehicles will be fully autonomous, complemented by ongoing (though slow, roughly 2% per year) growth in conventional cars. This suggests that semi-autonomous vehicles (SAVs) with varying levels of autonomy will remain a huge market for a long time, and that true AVs will co-exist—and compete—with the dominant paradigm.
In human terms: if we assume a future where technology augments rather than replaces human drivers, what kind of future should we design?
We reject the assumption that SAVs are inherently binary in nature—awkwardly transforming between modes for manual and autonomous driving, with passive infotainment being the presumed hero experience of the latter. These vehicles will understand context, adapt fluidly to changing conditions both inside and out, and ultimately connect us to driving on our own terms.
Environment as interface
Today, automotive HMI is characterized by a complex array of inputs and outputs, redundant controls, multiple screens, competing platforms, and differing mental models—all of which make learning a new car so complicated that many people never use its most advanced features. In the future, we believe that advances in sensing, materials, displays, and AI will afford an opportunity to greatly simplify the experience—going beyond huge touchscreens and gesture controls, and reconsidering the interior environment itself as a holistic interface between driver and vehicle. Voice may become a primary input, complemented by a deep understanding of environmental conditions, contextual awareness, and human intent. The car will be a small-scale computing environment—and the interface will progressively adapt to meet our needs and preferences, resulting in a highly personalized experience.
Sensors embedded throughout the vehicle (in the steering wheel, seating, and cockpit), combined with highly malleable materials, will allow the car to adapt fluidly and naturally in response to the driver’s preferences, behaviors, and intent. The seats themselves will be articulated to allow a wide range of automated positions—from upright to relaxed, with differing levels of fit and character in all dimensions (front-to-back and side-to-side). An inflatable support layer will enable multiple degrees of firmness, allowing the personality of the car to shift with the driver.
Reinventing the wheel
Like today, the wheel will remain the heart of the interface. Unlike today, it will respond and adapt to a range of signals that include driver behaviors (how I grip the wheel, posture, attentiveness, etc.), physiological measures (gaze, position, heart rate, breathing, etc.) and context (route, previous patterns, external conditions, etc.). Collectively, these signals will influence the vehicle’s level of autonomy, which will operate along a gradual and smooth continuum from performance driving to passive mobility.
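To make the idea of a smooth continuum concrete, here is a minimal sketch of how driver-engagement signals might be fused into a continuous autonomy level. All signal names, weights, and the smoothing constant are illustrative assumptions, not a production design or anything from the Hyundai work.

```python
# Hypothetical sketch: fusing normalized (0-1) driver signals into a
# continuous autonomy level, where 0 = full manual and 1 = full autonomy.

def engagement_score(grip_pressure, gaze_on_road, heart_rate_norm):
    """Blend normalized driver signals into a single engagement score.
    The weights are illustrative: grip and gaze dominate, physiology
    contributes less."""
    return 0.4 * grip_pressure + 0.4 * gaze_on_road + 0.2 * heart_rate_norm

def update_autonomy(current_level, grip_pressure, gaze_on_road,
                    heart_rate_norm, smoothing=0.1):
    """Move the autonomy level gradually toward the level implied by
    driver engagement: a highly engaged driver pulls the vehicle
    toward manual control, a disengaged one toward autonomy."""
    target = 1.0 - engagement_score(grip_pressure, gaze_on_road,
                                    heart_rate_norm)
    # Exponential smoothing keeps the handover on a continuum
    # rather than snapping between discrete modes.
    return current_level + smoothing * (target - current_level)

# Example: an attentive driver (firm grip, eyes on the road) steadily
# pulls a mostly-autonomous vehicle back toward manual control.
level = 0.8
for _ in range(20):
    level = update_autonomy(level, grip_pressure=0.9, gaze_on_road=1.0,
                            heart_rate_norm=0.5)
print(round(level, 2))
```

The point of the smoothing term is the design principle from the paragraph above: the transfer of control is a gradient, not a toggle.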
When the driver takes control, the interface adapts to support a focused driving mode designed to improve the quality of the experience and the performance of the driver. The wheel extends to its furthest position while its surface expands, forming a grip that enhances the connection of the hands to the wheel. Seat posture is more upright and firm, while haptic feedback enhances the sensation and “feel” of the road. Multiple adjacent surfaces—near the steering wheel, the armrest, and the dash—are touch-sensitive and facilitate contextual interactions. Physical knobs emerge from the dashboard, putting tactile controls for things like media and climate control within easy reach without compromising the driver’s focus on the road ahead.
As the driver signals a change in engagement, the interface and controls adapt accordingly—supporting a range of features enabled by varying degrees of automation, from relaxed cruising to fully autonomous operation. At this end of the spectrum, the wheel disappears completely, seat position changes into a more relaxed posture, and the multiple display surfaces assume different roles depending on context and preference.
The evolution of driving
More than merely getting from A to B, cars are a significant part of our cultural identity, and driving represents a unique form of individual agency not easily replaced by services. Like today, we believe multiple mobility paradigms will ultimately exist side-by-side—a trend that GM, Ford and others are betting on with significant investments. Meanwhile, there are reasons to believe the same technologies that will render driving unnecessary could also make it efficient, safe and compelling. OEMs have a unique opportunity to double down on driving—leveraging the engineering know-how that made them great to begin with, protecting decades of brand differentiation, and preserving aspects of the current business model even as new challenges and forms of mobility emerge.