Check Your Virtualized Blind Spot
"Check your blind spot" is a message most of us remember from our driving instructor. The hazard you can see is one problem; the hazard you did not even know was there is potentially a much bigger one. The same principle applies to enterprise data security, where dealing with network blind spots is an emerging challenge for many organizations.
The volume and variety of data carried by a typical enterprise network are growing all the time, creating a complex, changing and noisy environment that makes analyzing security and performance ever more difficult -- and yet more critical than ever.
Looking into blind spots
To counter these blind spots, many organizations use Network Packet Brokers (NPBs) as a core element of their network visibility environments, receiving packet-level data from their virtualized and physical networks. The NPB sits between the network taps and the organization's security and performance monitoring tools, aggregating and filtering all data packets before feeding them to those tools, which can then analyze the information and detect potential security or performance issues.
Intelligent NPBs perform a range of packet operations to pre-process the data they pass on to monitoring tools, such as deduplication and packet trimming, which are intended to reduce total solution cost by improving tool efficiency. An effective NPB intelligently processes all data packets without losing any. At least, that's the theory. It turns out that one of the most hazardous blind spots facing IT and infosecurity teams today can actually be caused by an intelligent NPB that was meant to improve visibility!
The first issue is that some NPBs can drop data packets while aggregating and deduplicating them. Consequently, the security and monitoring tools do not just receive filtered, streamlined data -- they receive incomplete data. Packet losses of up to 30% are not unusual with some NPB solutions under typical operating conditions. Any packet loss in the NPB directly and dramatically reduces the effectiveness of security tools. For example, if a hacker uses packet fragmentation to split an exploit across multiple packets, an Intrusion Detection System (IDS) will likely fail to detect the attack if any of those packets are lost.
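The fragmentation scenario above can be sketched in a few lines. This is a toy illustration, not a real IDS: the signature string, offsets and function names are all illustrative assumptions. The point is only that a signature split across fragments never matches if one fragment goes missing between the tap and the tool.

```python
SIGNATURE = b"/bin/sh"  # hypothetical exploit pattern the IDS scans for

def reassemble_and_scan(fragments, signature=SIGNATURE):
    """Reassemble payload fragments in offset order and scan for a signature.

    fragments: list of (offset, payload_bytes) tuples.
    Returns True if the signature appears in the reassembled stream.
    """
    stream = b"".join(payload for _, payload in sorted(fragments))
    return signature in stream

# An exploit payload split across three fragments:
frags = [(0, b"GET /cgi?x="), (11, b"/bin"), (15, b"/sh HTTP/1.0")]
assert reassemble_and_scan(frags)        # complete stream: attack detected

# If the NPB silently drops the middle fragment, the signature
# never reassembles, and the attack slips past the IDS:
dropped = [f for f in frags if f[0] != 11]
assert not reassemble_and_scan(dropped)
```

Real IDS engines reassemble at the IP and TCP layers rather than matching raw bytes, but the failure mode is the same: analysis quality is bounded by the completeness of the packet feed.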
The second issue is detecting this data loss. Because the function of an intelligent NPB is to reduce the load on security and monitoring tools, it discards extraneous packets as part of normal operation. That makes it practically impossible to tell, just by examining the counters on the NPB, when it is also dropping critical packets. Live networks change constantly, so there is no way to know what the packet counts should be at any point in time.
The only way to determine whether an NPB is prone to dropping critical traffic in your deployment is to evaluate it with a controlled load before making a purchasing decision and placing the unit in service on your live network. Some NPBs, then, cause not only data blind spots -- but blind spots you do not even know about. The implications of these blind spots for your company's security are business-critical.
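One common way to run such a controlled-load evaluation is to stamp an explicit sequence number into the payload of every test packet injected into the NPB, capture what arrives at the tool port, and compare the two sets. The sketch below shows only the comparison step; the function name, numbers and the simulated capture are illustrative assumptions, not a real test harness.

```python
def loss_report(sent_seqs, received_seqs):
    """Compare sequence numbers injected into the NPB with those that
    reached the tool port, and report the loss rate.

    Both arguments are iterables of integer sequence numbers stamped
    into the test traffic's payload, so legitimate deduplication of
    repeated packets does not count as loss.
    """
    sent = set(sent_seqs)
    missing = sent - set(received_seqs)
    rate = len(missing) / len(sent) if sent else 0.0
    return {"sent": len(sent), "missing": sorted(missing), "loss_rate": rate}

# Example: inject 10,000 numbered packets; suppose the capture on the
# tool port shows every 40th packet missing (a 2.5% loss).
sent = range(10_000)
received = [s for s in sent if s % 40 != 0]
report = loss_report(sent, received)
print(f"loss rate: {report['loss_rate']:.1%}")  # → loss rate: 2.5%
```

Because the comparison is against a known, controlled load, even a small loss rate is unambiguous -- exactly the signal the NPB's own counters cannot provide on a live network.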
You cannot secure what you cannot see
If network visibility information is missing, then it does not matter how effective your organization's security tools are; they will always miss security events they cannot see. The blind spot could be concealing a network intrusion attempt, signs of abnormal bot traffic, or data exfiltration following a successful exploit. The longer it takes to identify abnormal network activity, such as a hacker infiltrating your network, the more data that hacker can steal.
It also creates a compliance issue. Organizations subject to data security standards like PCI DSS or government regulations like HIPAA could face compliance violations by failing to monitor 100% of their data traffic. Such violations can be costly in terms of reputational damage as well as government fines. Even if sensitive data is never actually compromised, the inability to demonstrate comprehensive data monitoring is enough to fall short of many regulations.
Incomplete data monitoring also means inadequate understanding of traffic volumes, and therefore an inability to predict when systems may be about to fail -- none of which helps IT and security teams to gain control of their networks.
Ensuring total visibility
So what is the solution? Not all NPBs are created equal. NPB solutions remain the most intelligent and effective way of gaining visibility across network environments, ensuring that security and performance monitoring tools have efficient access to 100% of enterprise information -- provided that they do not drop data while aggregating packets.
As such, when looking to implement a new NPB, it is critical to ask serious questions of your chosen vendor, such as:
- How does your solution trim data packets and eliminate duplicate ones?
- How does it carry out these functions without introducing any additional packet loss?
- How does it perform under varying network loads?
Savvy decision makers will not rely on vendor claims, but will be sure to test the solutions using known loads prior to making a purchase decision.
Clarity on these issues will help to guarantee the selection of a truly efficient NPB solution: one that ensures network blind spots are eliminated, and potential threats can be seen -- and secured.
— Glenn Chagnot, Senior Director, Product Management, Ixia (Nasdaq: XXIA)
[Editor's note: This article first appeared in Light Reading.]