
To tap Big Data, federal IT must partner with tech industry

Kenneth Corbin | May 16, 2013
Experts within and outside government IT stress the role the private sector must play in helping cash-strapped federal agencies find order in their growing stockpiles of data.

For IT managers in the federal government to wring more value out of the enormous stores of data they oversee, they must develop deeper partnerships with service providers in the private sector, according to a panel of experts speaking here at the annual FOSE government IT conference.

"The reason why government is hesitant towards a lot of the private sector is the private sector would push solutions looking for problems."

Federal IT workers from the CIO on down are dealing with the challenges of big data, but they're doing so amid the various pressures of contracting budgets, exponential growth in data volumes and a mounting expectation for higher-level, technology-enabled citizen services.

That unique set of circumstances creates a very real opportunity for IT providers in the private sector to step in with repeatable technology solutions that can help mitigate the data challenges that agencies across the government are facing.

"Why is everything a one-off, custom system for the government? The government doesn't have the budgets anymore, and it won't. The mantra is clear. Do more with less," says Thomas Cellucci, the chairman and CEO of the consultancy Cellucci Associates, who previously served in the George W. Bush and Obama administrations. "Technology's the answer and it's going to be with a speed of execution that government isn't used to."

Obama Big on Data
As it happens, President Obama recently addressed the government's data situation with an executive order requiring that newly produced data come in machine-readable formats. The order also commissions an Open Data Policy "to advance the management of government information as an asset," while taking steps to ensure that sensitive data still receives sufficient protection.

"When implementing the Open Data Policy, agencies shall incorporate a full analysis of privacy, confidentiality and security risks into each stage of the information lifecycle to identify information that should not be released," Obama writes in the executive order.

Machine readability could go a long way toward managing the information that is accumulated within an agency's system, but that's only part of the big data challenge.
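To illustrate the distinction, consider the same hypothetical agency record expressed as free prose versus machine-readable JSON and CSV. The record and its field names are invented for illustration; the point is that structured formats let programs parse government data without guesswork.

```python
import csv
import io
import json

# The same facts as free prose vs. a machine-readable structure.
prose = "Permit 4471 was issued to Acme Corp on 2013-05-01 in Region 9."

record = {
    "permit_id": 4471,
    "holder": "Acme Corp",
    "issued": "2013-05-01",
    "region": 9,
}

# JSON: self-describing keys, typed values.
as_json = json.dumps(record)

# CSV: tabular and bulk-friendly; the header row carries field names.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=record.keys())
writer.writeheader()
writer.writerow(record)
as_csv = buf.getvalue()

print(as_json)
print(as_csv)
```

A program consuming the prose version must guess where the permit number ends and the holder begins; the JSON and CSV versions make every field explicit.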

A holistic solution for exploding volumes of data is all the more important at a time when agencies are dealing with what David Mitchem, the data fusion lead for the U.S. Army, calls the "wild west of unstructured data."

"As more and more of this data is nonnative ... we have to have a framework to make sense of it," says Mitchem. "What we found is it's all about context, and it's about a disciplined approach at the enterprise level that can bring meaning."

Data Lingers at Homeland Security
The view is similar at the Department of Homeland Security, where too much data lingers in disparate formats for too long before the information is marshaled toward the department's objectives of border security, counter-terrorism and other mission areas.


