BLOG: The evolution of Analytics

Andrew Jennings | July 24, 2013
A brief look at five stages in the evolution of predictive analytics over the past 80 years.

Predictive analytics - the use of mathematics to analyse data and forecast the future - is one of today's hottest technologies. Between 2011 and 2012, job postings for data scientists jumped by 15,000 percent, as businesses race to adopt new analytic tools and unlock the insights about their customers and their operations that are hidden in their data.

But analytics didn't spring up overnight, or even in the last 20 years. Here is a brief look at five stages in the evolution of predictive analytics over the past 80 years, starting at a time when only governments had access to its power.

Dawn of the Computer Age: In the 1930s and 1940s, government agencies were the first analytic opportunists, and military applications dominated. From decoding German messages to targeting anti-aircraft weapons to simulating the behaviour of nuclear chain reactions in the Manhattan Project, this era developed many of the basic techniques that would set the stage for commercial use.

Commercialisation: The 1950s saw large corporations and research institutions begin using analytics. The first computerised weather forecasts were produced on the ENIAC (Electronic Numerical Integrator And Computer). One of the early wins for analytics was solving the "shortest path problem," which improved air travel logistics. The emerging discipline of Operations Research began to be taught at universities. Experts in this field - such as Bill Fair and Earl Isaac, who met at the Stanford Research Institute and founded analytics company Fair, Isaac (now known as FICO) - put OR's principles and computer expertise together to solve new problems, such as credit risk assessment.
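
For readers curious what that early problem looks like in code, here is a minimal sketch of Dijkstra's algorithm - one classic way to solve the shortest path problem - applied to a small, made-up network of air routes. The airport codes and distances below are illustrative assumptions, not figures from this article.

```python
import heapq

def shortest_path(graph, start, goal):
    """Dijkstra's algorithm: return the minimum total distance from start to goal.

    graph maps each node to a list of (neighbour, distance) pairs.
    """
    # Priority queue of (distance so far, node); smallest distance is popped first.
    queue = [(0, start)]
    visited = set()
    while queue:
        dist, node = heapq.heappop(queue)
        if node == goal:
            return dist
        if node in visited:
            continue
        visited.add(node)
        for neighbour, weight in graph.get(node, []):
            if neighbour not in visited:
                heapq.heappush(queue, (dist + weight, neighbour))
    return float("inf")  # goal unreachable

# Hypothetical air routes with illustrative distances in kilometres.
routes = {
    "SIN": [("HKG", 2600), ("SYD", 6300)],
    "HKG": [("NRT", 2900), ("SYD", 7400)],
    "NRT": [("SFO", 8300)],
    "SYD": [("SFO", 12000)],
}
print(shortest_path(routes, "SIN", "SFO"))  # 13800, via HKG and NRT
```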

Mainstream: In 1973, analytics took a jump forward with the Black-Scholes model, which estimates a fair price for stock options. Still, the real analytic momentum began in the 1990s, when personal computers were widely adopted and the Internet came online. eBay and Amazon developed analytics that personalised the web retail experience. In 1998, Google's search algorithms began ranking web pages by relevance, and the modern information age truly began.
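
For the mathematically inclined, here is a minimal sketch of the Black-Scholes formula for a European call option, using only Python's standard library. The sample price, strike, rate and volatility are illustrative assumptions, not market data.

```python
from math import log, sqrt, exp, erf

def norm_cdf(x):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def black_scholes_call(S, K, T, r, sigma):
    """Black-Scholes price of a European call option.

    S: current stock price, K: strike price, T: time to expiry in years,
    r: risk-free interest rate, sigma: annualised volatility.
    """
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

# Illustrative inputs only: a $100 stock, $105 strike, six months to expiry,
# a 5% risk-free rate and 20% volatility. Prints roughly 4.6.
print(round(black_scholes_call(100, 105, 0.5, 0.05, 0.20), 2))
```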

Deep Impact: Since 2000, analytics has reached widespread use. From shopping and movie recommendations to traffic management and beyond, analytics shapes the way we live and the way organisations develop. The rise of so-called Big Data, generated by people and devices, has spurred the need to decipher data at a faster rate, and to use advances in natural language processing to turn text into data that software can readily analyse. An estimated 2.5 quintillion bytes of data are created every day - only faster, smarter analytics can turn this data into insights and value, for businesses and people alike.
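
As one deliberately simple illustration of how text becomes data, here is a minimal bag-of-words sketch - a classic counting technique rather than a modern natural language processing pipeline. The sample review text is invented for illustration.

```python
from collections import Counter
import re

def bag_of_words(text):
    """Turn free text into word counts, a basic way to make text analysable."""
    words = re.findall(r"[a-z']+", text.lower())
    return Counter(words)

review = "Great phone, great battery. The camera could be better."
print(bag_of_words(review))
# e.g. Counter({'great': 2, 'phone': 1, 'battery': 1, ...})
```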

Ubiquitous Analytics: What's next? Tomorrow will see analytics used in startling new ways. Already, there are advances in predictive policing and disease identification that could have dramatic impacts on the world. Each action by individuals, from trivial purchases to online communications, will have the potential to feed and reshape the models that drive predictive analysis.
