Advanced driver-assist systems (ADAS), such as adaptive cruise control and lane-keeping assist, will grow from a $2.4 billion business today to $102 billion in 2030, according to a new report.
The 26% annual growth rate that ADAS is expected to achieve over the next 15 years will also change the way drivers interact with their vehicles, according to the report from Lux Research titled, "The $102 Billion Opportunity for Partial Automation for Cars."
As ADAS technology advances, drivers and passengers will move away from knobs and touch screens toward voice activation, facial recognition, biosensors and gesture recognition to control vehicle functions. For example, BMW's next-generation iDrive human-machine interface includes gesture control functions to supplement the traditional iDrive knob in the central console and the voice recognition offered in the current system.
Consumers' perceptions of their vehicles will also change, from simple transportation to a mobility platform that resembles a moving office or even a gaming space while in transit.
"Our interface or connection with the car will change. You'll no longer have to spend the duration of the drive focused on the road," said Lux Research analyst and lead study author Maryanna Saenko. "The driver will inherently be more distracted by what's going on around them in the car, so how do you interface with the car?"
As vehicles take over more of the driving responsibility, they'll also have to monitor the actions of drivers and passengers more closely, to ensure they're able to react if road conditions change.
For example, inward-facing cameras and biosensors located in seats or the steering wheel will be able to determine whether occupants are paying attention to the road around them, reading email or participating in a Skype meeting.
"Are they holding the steering wheel? And if not, the car will try to alert them. If it can't, it may pull over to the side of the road," Saenko said.
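The escalation Saenko describes can be sketched as a simple decision rule. This is a hypothetical illustration, not any automaker's implementation; the function name, inputs and alert threshold are all invented for the example.

```python
def next_action(hands_on_wheel: bool, alerts_ignored: int, max_alerts: int = 3) -> str:
    """Sketch of a driver-monitoring escalation policy.

    hands_on_wheel: whether wheel sensors detect the driver's grip
    alerts_ignored: how many warnings the driver has not responded to
    """
    if hands_on_wheel:
        return "continue"        # driver is engaged; no intervention
    if alerts_ignored < max_alerts:
        return "alert_driver"    # e.g. chime, display flash, wheel vibration
    return "pull_over"           # alerts exhausted; stop safely at the roadside
```

In practice a production system would blend many sensor inputs (camera gaze tracking, seat biosensors, torque on the wheel) rather than a single boolean, but the escalation shape is the same.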
Haptic technology, which creates the sensation of touch by applying vibrations, resistance or other forces, will allow vehicles to communicate with drivers much as some video game controllers do today. For example, if traffic becomes congested or roadwork appears ahead, the vehicle's steering wheel may vibrate.
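One way to picture this is a lookup from detected road events to vibration patterns, analogous to a game controller's rumble settings. The event names and pattern values below are purely illustrative assumptions, not drawn from any real vehicle interface.

```python
# Hypothetical mapping of road events to steering-wheel haptic cues:
# (intensity on a 0-1 scale, number of pulses).
HAPTIC_PATTERNS = {
    "congestion_ahead": (0.3, 2),  # gentle double pulse
    "roadwork_ahead":   (0.6, 3),  # firmer triple pulse
    "lane_departure":   (0.9, 1),  # single strong buzz
}

def haptic_cue(event: str):
    """Return the vibration pattern for a detected event, or None if no cue applies."""
    return HAPTIC_PATTERNS.get(event)
```

A real system would tune intensity and timing to avoid startling the driver, but the core idea is simply translating environmental sensing into touch feedback.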
"A lot of driver safety features will be focused around [the vehicle] paying attention to the environment around it and then notifying the driver if something is happening," Saenko said.
Sensors placed in seat fabric may also be able to alert first responders to some vital signs before they reach the scene of an accident.
Fully autonomous vehicles, however, won't arrive by 2030, hampered mostly by regulations and a current lack of prototypes, Saenko said.