
Interview: Veeam on virtualisation woes

T.C. Seow | Feb. 10, 2014
In an exclusive email interview, Charles Clarke, technical director, APAC, Veeam Software, talks about how virtualisation is helping companies reduce costs and gain benefits not seen before.

Photo: Charles Clarke

In many organisations, keeping legacy tools running is unavoidable, as some applications and file formats are accessible only through those tools, which, for all intents and purposes, can be a drag on overall productivity. How should organisations tackle this problem when they move into virtualisation?

Charles Clarke: Bizarrely enough, virtualisation can prolong the life of legacy applications. Some applications run on operating systems for which supported hardware no longer exists, so running them on a virtual platform, where hardware is not an issue, is viable until the application can be upgraded or replaced. Even so, IT decision-makers should take a strategic approach and move away from precarious legacy systems towards modern infrastructure. For example, the availability of cloud-based applications presents an opportunity for organisations to break away from legacy capital investments and move to a more flexible billing model for their critical applications.

How are your customers expecting their data, and the demands upon it, to change over the next year?

This varies across organisations; however, one persistent trend we see taking shape is the growth of unmanaged data. Our customers predict anywhere from 20 percent to 200 percent data growth over the coming year. This can include file data and orphaned or "zombie" virtual workloads (running virtual machines serving no function). Over time, the space and resources required for accumulated data eat into total capacity. However, the bigger challenge lies in ensuring that organisations get the best from their investment in storing and maintaining data. Are they using data to their strategic advantage, or are they hanging on to it 'just in case'? I reckon advancements in analytics and better tools and processes around data archiving will appear as a trend later in 2014.

What is the best approach to planning to overcome the "3C's" — capability, complexity and cost?

The '3C' problem essentially applies to tool selection. Organisations must understand that there is a clear cost to complexity and missing capabilities that goes beyond capital expenditure. For example, there could be lost revenue while waiting for a server to recover, or a loss of service if a website cannot respond to user requests fast enough because of a performance problem. The best approach to overcoming the 3C issue is to ensure that the right tool is selected for the job at hand. Select tools that are designed for the platform they are intended to manage. In virtualised environments, for example, choose tools built for virtualisation; attempting to stretch a tool designed for the physical landscape will only lead to frustration, and to the 3C's.

