Aiming for SEC's big data project, Sungard and Google bet on the cloud

Joab Jackson | June 16, 2015
To bid for a massive U.S. government contract, Sungard has teamed up with Google to build a prototype cloud system that could store six years' worth of U.S. stock and financial trading data and let regulators and stock traders scrutinize this mountain of information.

The work is being done to compete for a U.S. Securities and Exchange Commission contract, called the Consolidated Audit Trail (CAT). The SEC's goal is to build a system that provides more transparency into financial markets, a response, in part, to the computer-driven 2010 "flash crash" that briefly cratered U.S. stock prices.

"The CAT is a huge undertaking," said Neil Palmer, Sungard's chief technology officer for its consulting services practice. "It is the biggest big data problem in the financial industry today."

Palmer described the prototype Friday at the Google Next user conference in New York. Sungard, a provider of financial software and services, is one of six finalists for the work, and has partnered with Google for technology infrastructure.

The flexibility of cloud computing gives Sungard the ability to pursue such an ambitious job, Palmer told a group of reporters after the keynote.

With an in-house build, "there are just too many unknowns," he said, referring to the intense hardware and operational demands that would come with running this work on-premises.

The system will cost anywhere from $350 million to $1 billion to build, the SEC has estimated.

Once operational, CAT will generate a tremendous amount of data, Palmer said. The system must record every quote and every trade from every financial company participating in the public U.S. markets. The companies must submit their data on a daily basis, and the system must keep this data for six years.

Each day, the system will ingest about 50 terabytes of data, comprising about 100 billion events. The six-year window of actively retained records will amount to about 30 petabytes of data, Sungard estimated.

All this data must be validated, indexed, and posted within four hours.
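
The article does not say how Sungard implements that validation and indexing step, but as a rough illustration, a Dataflow-style Apache Beam job could split each day's submissions into valid and invalid records before indexing. The schema, field names, and file paths below are hypothetical.

```python
# Hypothetical sketch of a daily validation step using Apache Beam; a
# Dataflow-style job is an assumption, not something the article confirms.
import json
import apache_beam as beam

# Illustrative schema: fields every submitted event is assumed to carry.
REQUIRED_FIELDS = {"event_id", "symbol", "timestamp", "price", "quantity"}

def validate(line):
    """Parse one submitted event and tag it as valid or invalid."""
    try:
        event = json.loads(line)
        if isinstance(event, dict) and REQUIRED_FIELDS.issubset(event):
            yield beam.pvalue.TaggedOutput("valid", event)
            return
    except ValueError:
        pass
    yield beam.pvalue.TaggedOutput("invalid", line)

def run(input_path, valid_path, invalid_path):
    """Read one day's submissions, split them, and write both streams out."""
    with beam.Pipeline() as pipeline:
        results = (
            pipeline
            | "ReadSubmissions" >> beam.io.ReadFromText(input_path)
            | "Validate" >> beam.FlatMap(validate).with_outputs("valid", "invalid")
        )
        (results.valid
            | "FormatValid" >> beam.Map(json.dumps)
            | "WriteValid" >> beam.io.WriteToText(valid_path))
        results.invalid | "WriteInvalid" >> beam.io.WriteToText(invalid_path)
```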

Tools must also be available to query all this data. "There is no point storing that much data and not being able to generate any actionable information from it," Palmer said.
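
To give a sense of what such a query tool might look like, here is a minimal sketch assuming the consolidated data lands in Google BigQuery, one of the Google Cloud Platform services; the dataset, table, and column names are hypothetical.

```python
# Hypothetical query against consolidated trade data, assuming storage in
# Google BigQuery; the table and columns are illustrative only.
from google.cloud import bigquery

def daily_trade_volume(symbol, trade_date):
    """Return the total traded quantity for one symbol on one day."""
    client = bigquery.Client()
    query = """
        SELECT SUM(quantity) AS total_quantity
        FROM `cat_prototype.trades`          -- hypothetical table
        WHERE symbol = @symbol
          AND DATE(timestamp) = @trade_date
    """
    job = client.query(
        query,
        job_config=bigquery.QueryJobConfig(
            query_parameters=[
                bigquery.ScalarQueryParameter("symbol", "STRING", symbol),
                bigquery.ScalarQueryParameter("trade_date", "DATE", trade_date),
            ]
        ),
    )
    return next(iter(job.result())).total_quantity
```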

In addition to the SEC, cloud-accessible financial data could also be of immense value to the financial firms themselves, Palmer noted. A comprehensive, centralized copy of all financial trading information would reduce the need for such firms to store that data in house. They could test algorithms against the historical market data to see how well they can predict upcoming changes.
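
As a toy illustration of that kind of backtesting, the sketch below runs a simple moving-average strategy over historical closing prices with pandas; the data source and strategy are purely hypothetical, not anything Sungard has described.

```python
# Toy backtest of a moving-average crossover strategy over historical prices;
# the CSV source and the strategy itself are hypothetical examples.
import pandas as pd

def backtest(csv_path, short_window=20, long_window=100):
    """Compute the cumulative return of a moving-average crossover strategy."""
    prices = pd.read_csv(csv_path, parse_dates=["date"], index_col="date")["close"]
    short_ma = prices.rolling(short_window).mean()
    long_ma = prices.rolling(long_window).mean()
    # Hold the stock whenever yesterday's short-term average was above the long-term one.
    position = (short_ma > long_ma).shift(1).fillna(False).astype(bool)
    daily_returns = prices.pct_change().fillna(0.0)
    strategy_returns = daily_returns.where(position, 0.0)
    return (1.0 + strategy_returns).prod() - 1.0
```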

Sungard assembled the infrastructure for the prototype using a variety of Google Cloud Platform components.

 
