
Should hackers be tolerated to test public systems?

Tim Greene | May 20, 2015
The FBI probe of whether Chris Roberts hacked an airliner in flight points to the gap between the good that hackers do and the false sense of safety that industries prefer to project.

The problem is compounded by these industries wanting to project an image of safety and security to the public, says Paul Kocher. For example, services that store data online want customers to trust that their data is safe. "If I'm operating a service, my financial interest is usually in trying to make my customers feel comfortable, which is not necessarily to disclose accurately what the risks are," he says.

So they are understandably reluctant to allow hackers to test their systems, particularly if flaws are revealed publicly, even though finding and fixing vulnerabilities is desirable. "If you want to have researchers providing this level of authentic, unfiltered information, at least when people are doing things badly, it's going to be challenging," he says.

He thinks there are black and white areas about what is and is not appropriate for hackers to do, and then there is an enormous gray area where things are not so clear.

His company tries to stay "very, very far into that white zone, but I recognize the challenges. The question of messing with a flying plane's avionics is pretty clearly for me in the black area. I see a lot of places where there's a big debate, but I don't see this as one where the shades of gray are as nuanced as they will be in future situations that will come along."

One such area is medical devices, which must go through a lengthy FDA approval process; when changes are made, devices must be reapproved. "But that doesn't fit very well into your zero-day response cycles," Kocher says. "How do you patch the Linux install running inside your implantable medical device? We haven't worked these things out yet, and perhaps we never will. It's just one of these horribly messy problems that we'll be struggling with for a long time."

But that doesn't mean hackers and industry shouldn't come to an agreement, Corman says. Constant testing of existing systems is necessary, even if they have previously been judged secure. For example, the Shellshock bugs lurked in the Unix Bash shell from 1989 until they were exposed in 2014. Similarly, the Heartbleed bug existed in the OpenSSL library from 2011 to 2014.
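Shellshock is a useful illustration of why such retesting matters: the flaw could be triggered by nothing more than a crafted environment variable, yet it went unnoticed for a quarter century. A minimal sketch of a local check for the original CVE-2014-6271 behavior, assuming a Unix system with bash at /bin/bash (TESTVAR is an arbitrary, hypothetical variable name), might look like this in Python:

    import subprocess

    # A bash vulnerable to the original Shellshock bug executes the command
    # smuggled in after the function definition while importing the
    # environment variable; a patched bash ignores it and runs only the
    # intended command.
    crafted_env = {"TESTVAR": "() { :; }; echo VULNERABLE"}
    result = subprocess.run(
        ["/bin/bash", "-c", "echo safe"],  # assumed path; adjust per system
        env=crafted_env,
        capture_output=True,
        text=True,
    )
    # On an unpatched bash, "VULNERABLE" prints before "safe".
    print(result.stdout)

The point is not the one-liner but the history: a bug this simple to demonstrate sat in a core system component from 1989 to 2014.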

Hostile reactions to responsible bug disclosures by altruistic hackers could hurt the security of vital systems, he says: "We have a role to play and are willing and able to help."

If white-hat hackers face legal consequences for finding flaws in avionics networks, for example, they may abandon studying them. "They can easily research something less risky and not run afoul of the law," he says. "If it's too risky or politically loaded, who in their right mind would do it?"

One obvious answer: the black-hat hackers.
