
The real cost of lying about IT infrastructure costs

Matt Prigge | April 21, 2010
Are you hiding the truth from management to justify the money you need to operate your infrastructure? That's a dangerous game.

It's hard enough to build a solid primary storage strategy for a complex server environment, but that's nothing compared to the effort required to get your proposal through your organization's budgeting process. A refreshed storage architecture or a new backup infrastructure requires loads of capital, which generally means you have to get creative when you articulate requirements to management.

Yes, some sort of song and dance is almost always required. On the other hand, part of the problem is that we seldom do a good job of educating management about how our systems live and breathe and what they actually cost to run. Obfuscation isn't in anyone's interest in the long run, especially as the cloud matures and becomes a viable alternative to some segments of in-house infrastructure.

One reason this disconnect occurs is that management fails to grasp the relationship between the cost and criticality of a given application and its actual technical requirements, storage or otherwise. Time and time again I have seen huge, massively expensive, mission-critical applications sit on equally huge and massively expensive storage infrastructures that are essentially idling under low load. Meanwhile, elsewhere in the data center, a collection of cheap tertiary applications are burning their storage resources to a crisp with loads many times that of the critical systems.

The reason this happens can usually be traced back to the budget process. When management is told that a mission-critical application will cost millions of dollars to acquire and implement, there's rarely much in the way of pushback over a beefy infrastructure to match. In fact, it's expected.

Conversely, companies often purchase US$5,000 or US$10,000 applications without giving much thought to the effect they'll have on infrastructure. Yet if you do your homework and thoroughly test these applications up front, you'll often find that these "little" applications use far more resources than the "big" ones do. While management may not blink at a request for US$100,000 in storage capital for a US$750,000 software project, you can expect a ton of resistance if you're looking for US$50,000 of storage capital for a US$10,000 software project.
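To make that mismatch concrete, here is a minimal sketch using the figures from the paragraph above; the project names are hypothetical and the numbers are illustrative, not drawn from any real budget.

```python
# Illustrative sketch: storage capital as a fraction of each project's
# software cost, using the hypothetical figures cited above.

projects = [
    # (name, software_cost_usd, storage_capital_usd)
    ("mission-critical application", 750_000, 100_000),
    ("'little' departmental application", 10_000, 50_000),
]

for name, software, storage in projects:
    ratio = storage / software
    print(f"{name}: storage is {ratio:.0%} of the software spend "
          f"(US${storage:,} on a US${software:,} project)")
```

Run it and the imbalance is obvious: roughly 13 percent of the project cost in one case, 500 percent in the other, even though the smaller application may be the one actually hammering the storage.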

That's how the budgetary two-step was invented. IT departments everywhere use big-ticket projects to justify the purchase of infrastructure those projects don't need in order to feed the needs of little projects that nobody would otherwise spend enough capital on.

Is that really such a bad thing, you ask? Management feels it has spent its money wisely, while IT gets the resources it needs to deliver solid performance and reliability back to the business. It's almost as if IT is doing management a favor by quietly doing the right thing; it seems like a win-win.

Unfortunately, this kind of budgetary shell game has serious drawbacks. By obscuring the "true" cost of running your infrastructure, you perpetuate a decision-making process that consistently yields poor, uninformed choices. It becomes all but impossible to adopt a chargeback system that reflects the actual cost of what IT does. And it feeds into the constant struggle between IT and management over what's truly worth spending money on, especially when it comes to allocating manpower to manage infrastructure and applications. When you get right down to it, if you constantly feel your department is overworked and is never given enough time to finish anything, look in the mirror. Being less than brutally honest about what things actually cost probably caused that situation.
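As a rough illustration of what an honest chargeback (or showback) model might look like, the sketch below spreads the real monthly cost of a storage tier across the applications that consume it, in proportion to the capacity they use. The cost figure, consumers, and capacities are all hypothetical; a real model would also fold in performance tiers, support contracts, and staff time.

```python
# Minimal showback sketch (all rates and consumers are hypothetical):
# allocate the monthly cost of a storage tier by capacity consumed.

monthly_storage_cost = 40_000.0  # hardware, support, power, admin time (assumed)
usage_gb = {
    "mission-critical application": 2_000,
    "departmental application A":   6_000,
    "departmental application B":   4_000,
}

total_gb = sum(usage_gb.values())
for app, gb in usage_gb.items():
    share = gb / total_gb
    print(f"{app}: {gb:,} GB -> US${monthly_storage_cost * share:,.0f}/month")
```

Even a crude allocation like this surfaces the point of the column: the "cheap" applications may be the ones carrying most of the infrastructure bill, and hiding that fact only delays the argument you eventually have to have.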

 
