Breaking Down Big Data

November 21, 2012 by Ixia Blog Team

The big data trend continues to expand as social media, mobility, and cloud computing drive up the volume of data. As this abundance of structured and unstructured data grows, companies are grappling with how to manage it and make sense of it.

Specifically, companies are applying big data analytics to uncover business trends and insights in the reams of data they create. Analyzing big data can drive business growth, cost savings, revenue increases, and better marketing. But it comes with challenges.

Big Data: What is it?

Big Data is one of the hottest buzzwords in technology today. According to the industry analyst firm Gartner, “Big Data are high-volume, high-velocity, and/or high-variety information assets that require new forms of processing to enable enhanced decision making, insight discovery and process optimization.” Big data is a direct and natural consequence of advances in technology: the world’s technological per-capita capacity to store information has roughly doubled every 40 months since the 1980s, and as of 2012, some 2.5 quintillion (2.5×10¹⁸) bytes of data were created every day.

Big Data exceeds the demands of a typical network because of the sheer size and speed of the data traveling over it. Almost everything about this new paradigm is different from what traditional networks were designed to handle. But some elements of networks running Big Data are exactly like their more typical counterparts: you still need to monitor the performance of the network, make sure it is secure, and check the behavior of the applications running on it. Meeting these needs requires a way to monitor big data that is as revolutionary as big data itself: the network monitoring switch.

The Big Data Dilemma

Big Data presents different challenges than typical network installations do, and therefore requires a whole new way of thinking about data monitoring. To understand those challenges, it helps to consider where big data comes from.

One example is the push among internet-reliant companies to adopt distributed databases, in which very large amounts of data are spread across many commodity servers to provide highly available, easily scaled infrastructure with no single point of failure. This sort of distributed processing is essential in commercial applications like streaming media and communications, as well as in fields such as meteorology, biological and environmental research, genomics, and more.
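To make the idea concrete, here is a minimal Python sketch of how such a database might spread records across commodity servers. The node names, hash scheme, and replication factor are illustrative assumptions, not a description of any particular product.

```python
# Sketch: spreading records across commodity servers with replication,
# so that no single node is a point of failure. Purely illustrative.
import hashlib

SERVERS = ["node-a", "node-b", "node-c", "node-d"]  # commodity servers (assumed names)
REPLICAS = 2  # each record is stored on two nodes

def placement(key: str) -> list[str]:
    """Pick the servers responsible for a given record key."""
    digest = int(hashlib.md5(key.encode()).hexdigest(), 16)
    start = digest % len(SERVERS)
    # Store the record on REPLICAS consecutive nodes in the ring.
    return [SERVERS[(start + i) % len(SERVERS)] for i in range(REPLICAS)]

if __name__ == "__main__":
    for record in ["user:1001", "video:42", "sensor:temp-9"]:
        print(record, "->", placement(record))
```

Because the data and the work are spread out this way, capacity scales by adding servers, but no single machine ever holds a complete picture of the system.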

Unfortunately, the more widely distributed and dynamic the network, the more difficult it is to monitor. Virtually every application built on the distributed computing model demands the ability to observe the entire network as a single logical unit. This is necessary to ensure that the network is performing as it should, that there are no security threats, and that the applications running on it are behaving as expected. To complicate matters further, the monitoring tasks themselves are often split between different groups using different tools.

On smaller networks, traditional data monitoring is done with a combination of analytic tools that examine collected data. That data is gathered via mirror ports on network equipment (also known as switched port analyzer, or SPAN, ports) and via network test access ports (TAPs).
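As a rough illustration of how an analysis tool consumes traffic copied to a SPAN port or TAP, the following Python sketch uses the scapy library to sample packets arriving on a mirrored interface and tally the top talkers. The interface name and packet count are assumptions for the example, and capturing packets requires appropriate privileges.

```python
# Sketch: an analysis tool reading traffic copied to it by a SPAN port or TAP.
from collections import Counter
from scapy.all import sniff, IP  # requires scapy and capture privileges

talkers = Counter()

def tally(pkt):
    """Count bytes per source address seen on the mirrored link."""
    if IP in pkt:
        talkers[pkt[IP].src] += len(pkt)

# Capture a small sample from the monitored interface (assumed to be eth1), then report.
sniff(iface="eth1", prn=tally, count=1000, store=False)
for src, nbytes in talkers.most_common(5):
    print(src, nbytes)
```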

Even if there were enough SPAN ports and TAPs, however, the problem would remain in the Big Data world because of the network architectures fundamental to distributed computing. Modern architectures provide multiple paths through the network, which increases reliability but also means that no single capture point sees all of the traffic belonging to a given conversation.

Monitoring Big Data

What is needed is a way to elevate the perspective of analysis and to spread analysis out so the network can be observed from many different points. Crucially, this tool must function without disturbing the network traffic itself. These requirements gave rise to the network monitoring switch.

Big Data requires large investments in monitoring tools. As a result, it is essential that IT teams get the most out of those tools by taking full advantage of their core capabilities.

Advanced network monitoring switches provide a number of features that off-load compute-intensive processing from the monitoring tools themselves. Such features include load balancing, packet filtering, packet de-duplication, and packet trimming.
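The sketch below illustrates, in plain Python, what two of these features accomplish conceptually: de-duplication drops the extra copies of a packet that arrive when it is captured at more than one point, and trimming forwards only the headers a tool needs. Real monitoring switches do this work in hardware at line rate; the data structures and sizes here are purely illustrative assumptions.

```python
# Sketch: packet de-duplication and packet trimming, in software, for illustration only.
import hashlib

SNAP_LEN = 128  # trim packets to the first 128 bytes (roughly, headers only)
seen = set()    # fingerprints of packets already forwarded to the tools

def deduplicate_and_trim(packet: bytes):
    """Drop duplicate packet copies and trim the rest before they reach a tool."""
    fingerprint = hashlib.sha1(packet).digest()
    if fingerprint in seen:
        return None               # duplicate copy (e.g., captured at two points)
    seen.add(fingerprint)
    return packet[:SNAP_LEN]      # payload stripped, headers preserved

if __name__ == "__main__":
    pkt = b"\x45\x00" + b"\x00" * 200          # a fake 202-byte packet
    print(len(deduplicate_and_trim(pkt)))      # 128: trimmed copy forwarded
    print(deduplicate_and_trim(pkt))           # None: duplicate dropped
```

Off-loading this kind of work to the monitoring switch means the analysis tools see less redundant data and spend their cycles on analysis rather than preprocessing.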

Additional Resources:

Optimize Big Data Performance

Network Visibility Solutions