
Is it dumb to trust smart technology?

Mike Elgan | Aug. 1, 2016
Automation is nice, but it doesn't mean you should hand over responsibility to the machines.

Climate control, and by extension smoke and carbon monoxide detection, can't be left entirely to the machines. Nor can the elderly, or others who can't take care of themselves, be entrusted to algorithms, because sometimes algorithms go haywire.

The really hard problem is what to think, and what to do, about self-driving cars.

The Tesla crash got pundits and social media commenters questioning the wisdom of self-driving cars. The problem with that impulse is that automated emergency braking on many different makes and models of cars will save lives, according to the Insurance Institute for Highway Safety.

It seems likely that the crash was caused by human error. It's reasonable to expect the driver of a car to apply the brake when speeding toward a truck in the road. When we consider that human drivers are at fault for thousands of fatal car accidents each year, we should be pushing for truly self-driving cars because people are more dangerous behind the wheel.

Yet at the same time, I think that for the foreseeable future, the right way to "use" a fully automated, fully self-driving car is to let the car drive, but always have someone behind the wheel, paying full attention to the road and ready to take over control at any time.

The Googles and the Teslas of the world will tell us that we don't need steering wheels, brakes or even windshields in self-driving cars, and that the car will drive more safely than we can. They'll easily back up those statements with statistics that show far fewer accidents compared with human-driven cars. But even if automated driving brings down the car-related fatality rate to, say, 10 percent of what it was with human drivers, we're still talking about thousands of people being killed each year.

If you were to leave your baby in a public park and go shopping, statistically speaking it's unlikely that anyone would harm the baby. But you'd never do that because, as a parent, you're not going to take any chances. Similarly, parents will not -- and should not -- trust the lives of their children to automated driving systems. The best approach is to let the self-driving features do the driving while paying full attention to everything that's happening.

What we need is a set of cultural norms that make it clear to people that automating important things doesn't and can't replace a human being paying attention.

Because, in the words of HAL 9000 from the movie 2001: A Space Odyssey, any harm that results from turning our loved ones' safety over to the machines "can only be attributable to human error."

Words of wisdom, HAL. Words of wisdom. Now open the pod bay door, HAL.


