
Should hackers be tolerated to test public systems?

Tim Greene | May 20, 2015
FBI probe of whether Chris Roberts hacked an airliner in flight points to a gap between the good that hackers can do and a false sense of safety.


The purported veering of a jetliner caused by an onboard hacker points up a larger problem, experts say: airlines and other service providers may be blind to the value such security researchers can offer in the name of public safety.

While it's far from clear that security researcher Chris Roberts actually did commandeer the avionics system of an airplane and force it to steer to one side, the story is prompting other security experts to call for better cooperation between white-hat hackers and the industries whose infrastructures they probe.

Airlines have to get used to the idea that this type of hacking can be useful, and they ought to make test environments that simulate aircraft systems available for researchers to hack against, says Jeremiah Grossman, the founder of WhiteHat Security.

Corporate executives have an emotional reluctance to admit their systems have weaknesses, he says, but as their views mature, they become more cooperative with the research community. "Their first reaction is only authorized people can test their network. Google got over it. Facebook got over it," he says.

He points to his own experience in 2000 with Yahoo, when he hacked his own Yahoo Mail account, an exploit he could have carried out against anyone else's account. Instead, he told the company, which not only worked with him to resolve the issue without penalizing him, it gave him a job, he says.

But that's not always the case, and hackers with the best of intentions are warned against probing for flaws without permission. The first thing Grossman tells researchers is, "Never touch or test anything when you don't have express written consent or own it. Otherwise you're at the whim of the target."

If that whim is to call in law enforcement, the consequences can be dire for the hacker personally, but also for general public safety, says Josh Corman, CTO of Sonatype and founder of I Am The Cavalry, a grassroots group "focused on issues where computer security intersects public safety and human life."

The group has been working closely with the auto industry for more than a year to build enough trust between hackers and car manufacturers that they can work toward safer vehicles, he says. It hopes to do the same with airlines and medical-device manufacturers.

A big hurdle is that leaders in industries that need protection are not well versed in the nature of cyber security. Their worlds are governed by static sets of facts and principles of science like those used by the engineers who make their products. "Cyber lacks physics," Corman says, which makes security an elusive target. "Assumptions move or are incomplete or are no longer sound." It's a slow process to get industries to first understand the difference and then build enough trust with security researchers so they can work together toward safer products and services, he says.
