
Network Performance Monitoring is dead

Jesse Rothstein, CEO, ExtraHop | May 2, 2016
Key questions to ask to ensure your next monitoring tool can meet future needs.

This vendor-written tech primer has been edited by Executive Networks Media to eliminate product promotion, but readers should note it will likely favor the submitter’s approach.

Step back and imagine the world of technology 10 years ago. YouTube was in its infancy, the iPhone was more than a year away from release, BlackBerry was the smartest phone on the market and Twitter was barely making a peep.

While the masses are now glued to their iPhones watching cat videos and pontificating 140 characters at a time, the backend infrastructure that supports all of that watching and tweeting—not to mention electronic health records, industrial sensors, e-commerce, and a myriad of other serious activities—has also undergone a massive evolution. Unfortunately, the tools tasked with monitoring and managing the performance, availability, and security of those infrastructures have not kept up with the scale of data or with the speed at which insight is required today.

There is no nice way to say this: What worked 10 years ago isn’t working now. Today, exponentially more data is moving exponentially faster. IT organizations that cling to the old models of monitoring and managing will be at a significant disadvantage to their counterparts who adapt by embracing new technologies.

A whole new network

Take Ethernet, for example. It’s been less than 20 years since the standard for 1Gbps was established, and less than 10 years since 10Gbps started to gain a meaningful foothold. Now, we’re looking at 40Gbps and 100Gbps speeds. It is a different world, and it’s not slowing down. According to the Global Cloud Index, “global IP traffic has increased fivefold over the past five years, and will increase threefold over the next five years.” Between 2014 and 2019, IP traffic is expected to grow 23% annually.
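As a quick sanity check on those figures, the back-of-the-envelope sketch below (illustrative Python using only the 23% annual rate cited above) shows that 23% compound growth over five years works out to roughly the "threefold" increase the report describes:

```python
# Rough check of the cited growth figures (assumed inputs, not taken from
# the Global Cloud Index itself): does ~23% annual growth over five years
# line up with a "threefold" increase?

annual_growth = 0.23   # cited compound annual growth rate
years = 5

multiplier = (1 + annual_growth) ** years
print(f"{annual_growth:.0%} CAGR over {years} years is about {multiplier:.1f}x traffic")
# Prints: 23% CAGR over 5 years is about 2.8x traffic, i.e. roughly threefold.
```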

Speeds and feeds are not the only forces at work. Server and application virtualization, software-defined networking and cloud computing are also catalysts for IT change, reshaping how infrastructures are architected and resources are delivered.

Increasingly complex, dynamic and distributed, the network is a different place today than it was 10 years ago. Some view that as a problem to be solved. On the contrary, it’s an opportunity to be seized by forward-thinking network professionals.

In the new era, architectures matter

The reality is that traditional network performance monitoring (NPM) technologies such as packet sniffers, flow analyzers, and network probes for deep packet inspection can’t scale or evolve to meet this new demand. Capturing, storing and sniffing packets was relatively straightforward for “fast” Ethernet supporting 100Mbps of throughput. At 100Gbps, capturing and storing terabytes’ worth of packets would require massive investments in time and infrastructure, not to mention hours of a person’s life just to sniff a small subset of those packets.
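To make that scale problem concrete, here is a rough sketch (illustrative Python, not figures from any vendor or report) of how much packet data a fully utilized link accumulates per hour at different Ethernet speeds:

```python
# Back-of-the-envelope illustration of why full packet capture stops scaling:
# bytes accumulated per hour on a fully utilized link at various speeds.

SPEEDS_GBPS = [0.1, 1, 10, 40, 100]   # Fast Ethernet through 100GbE

for gbps in SPEEDS_GBPS:
    bytes_per_sec = gbps * 1e9 / 8             # bits/s -> bytes/s
    tb_per_hour = bytes_per_sec * 3600 / 1e12  # bytes/hour -> terabytes/hour
    print(f"{gbps:>5} Gbps: about {tb_per_hour:8.2f} TB of packets per hour")

# 100 Gbps works out to roughly 45 TB per hour (over a petabyte a day),
# which is why capture-and-sniff workflows built for 100 Mbps break down.
```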

 
