
BLOG: Harnessing the power of big data solutions – what should be considered?

Damien Wong | Jan. 28, 2014
Organisations must pay close attention to the environment in which they adopt such technologies, and plan to harness these potentially immense capabilities through the right analytics tools, platform infrastructure and manpower training.

Big data proliferation is evident today as organisations continue to amass information at exponential rates, and then look for ever more ways to derive business value from that data. These range from optimising operations and identifying inefficiencies to applying predictive analytics to anticipate events such as customer churn, product failure or quality degradation, and financial fraud.

Companies see big data technology as a unique opportunity to innovate and deliver critical real-time results, resiliency and security. However, in the rush to jump on the big data bandwagon, it has gradually become apparent that traditional data processing and analytics tools have failed to keep up with these changing business demands.

The challenges that emerge as a result include deciding on the infrastructure needed to house these large volumes of data while ensuring performance, scalability and efficiency in heterogeneous environments. Organisations are responding by deploying big data solutions and making changes to their legacy IT infrastructure and strategies.

Mirroring these challenges, IDC predicts that the worldwide big data technology and services market will grow at about seven times the rate of the overall information and communication technology market, with revenues forecast to reach US$23.8 billion in 2016. [1] These figures suggest that organisations should start looking at how to implement big data solutions accurately, seamlessly and cost-effectively.

So what should organisations look for when formulating a suitable strategy, and what can they expect to gain from such solutions?

Companies must realise that exploiting big data opportunities requires an approach that goes beyond single technology solutions, and a suite of big data tools may be needed to shape flexible and effective solutions in line with business and application requirements.

One such tool that has become integral to the big data movement is Apache Hadoop, an open source software framework for running applications on large clusters of commodity hardware. Its rate of adoption can be attributed to its enormous processing power - the ability to handle virtually limitless concurrent tasks and jobs - making it a remarkably low-cost complement to a traditional enterprise data infrastructure.
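To make the scale-out model concrete, below is a minimal sketch of a Hadoop MapReduce job, modelled on the canonical word-count example from the Apache Hadoop tutorial: the mapper runs in parallel across the cluster's data blocks, and the reducer aggregates the results. The input and output paths passed in on the command line are placeholders.

```java
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {
  // Mapper: runs on each node against its local slice of the input,
  // emitting (word, 1) for every token it sees.
  public static class TokenizerMapper
      extends Mapper<Object, Text, Text, IntWritable> {
    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, ONE);
      }
    }
  }

  // Reducer: receives all counts for a given word and sums them.
  public static class IntSumReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {
    private final IntWritable result = new IntWritable();

    @Override
    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = Job.getInstance(conf, "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class); // local pre-aggregation saves network I/O
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));   // input path (placeholder)
    FileOutputFormat.setOutputPath(job, new Path(args[1])); // output path (placeholder)
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
```

The framework, not the developer, handles splitting the input, scheduling tasks across nodes and retrying failures, which is what lets the same small program scale from one machine to thousands of commodity servers.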

Just as important is the infrastructure on which these tools run. It must be able to handle big data and scale effectively to support the memory, processing, networking and input/output (I/O) requirements of these new distributed workloads and data.

In today's typical enterprise, there are three types of big data to manage: business-, machine- and human-generated data. To handle this data, enterprises need to process, filter and route it to the right application for analysis. Turning it all into actionable information requires a combination of middleware, storage and processing capabilities that apply across data and analytics platforms. It is essential for IT to have a single, scalable platform to manage all types of big data. Interoperability is also crucial, so that comprehensive enterprise big data solutions can be integrated and installed without a complete overhaul of the organisation's current IT network, as the routing sketch below illustrates.
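As a purely hypothetical illustration of the "filter and route" step, the sketch below dispatches records to a different downstream pipeline depending on which of the three origins produced them. The class and pipeline names are invented for this example; a real deployment would hand records off to a warehouse, a stream processor or a Hadoop job rather than print them.

```java
import java.util.EnumMap;
import java.util.Map;
import java.util.function.Consumer;

// Hypothetical sketch: filter incoming records, then route each one
// to a pipeline chosen by the record's origin.
public class DataRouter {
  // The three broad data origins named in the text.
  enum Origin { BUSINESS, MACHINE, HUMAN }

  private final Map<Origin, Consumer<String>> pipelines = new EnumMap<>(Origin.class);

  public DataRouter() {
    // Placeholder sinks standing in for real analytics back-ends.
    pipelines.put(Origin.BUSINESS, rec -> System.out.println("to warehouse: " + rec));
    pipelines.put(Origin.MACHINE,  rec -> System.out.println("to log analytics: " + rec));
    pipelines.put(Origin.HUMAN,    rec -> System.out.println("to text analysis: " + rec));
  }

  // Filter, then dispatch: drop empty records, send the rest onward.
  public void route(Origin origin, String rec) {
    if (rec == null || rec.isEmpty()) return; // trivial filter step
    pipelines.get(origin).accept(rec);
  }

  public static void main(String[] args) {
    DataRouter router = new DataRouter();
    router.route(Origin.BUSINESS, "order#1234,SGD 59.90");
    router.route(Origin.MACHINE,  "2014-01-28T10:02:11Z sensor=7 temp=81.4");
    router.route(Origin.HUMAN,    "\"Love the new app!\" via social media");
  }
}
```

The point of the single dispatch table is the same as the article's point about a single, scalable platform: new data sources plug in by registering another pipeline, not by rebuilding the surrounding network.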
