"In a Big Data environment, they were able to write and execute it in under 100 lines of code," Barth says. "They executed it in less than 24 hours, processing hundreds of terabytes of session data. The analysis was on data they already had; they had it inside the bank. They just wanted to know what their own customers were doing on their own channels. This really unlocked that visibility into the way their bank was running."
But the key, Barth says, is to take it one more step. Once you understand what you're seeing, you can develop a model that explains it and metrics to measure your execution in improving your business against that model. That's where traditional BI comes in.
"The "new" and the "known" are not islands; they must be symbiotic systems connected to and feeding each other," NewVantage says. ""New" analyses require rapid access to all the "known" data representing the reality of today's business. Conversely, there must be a disciplined approach to promoting new insights, data and models to evolve the "known." Without this linkage, the systems diverge into incoherence that does not reconcile or scale."
"Emerging technologies and methodologies-including Hadoop, Cloudera, database appliances, accelerators and self-learning and genetic algorithms-can dramatically reduce TTA," NewVantage adds. "The key to streamlining the time from a corporate question to game-changing business insight is to right-size the approach to analysis: rapid iterations during discovery and rigorous engineering into production. Strong governance and oversight of known data capabilities must coexist with agile data analysis that paves the way for new data discoveries. Enterprises with the capability to create an ebb and flow between dynamic discovery and scalable execution position themselves for sustainable success and dominance over their competitors."