Are large buffer memories the right solution for oversubscription in aggregation devices?
The answer is nearly always no, because most network tools can't capture at 100% of full line rate. The vast majority of tools rely on the onboard NICs of the appliance (i.e., the server running the capture/monitoring software) to receive packets and write the retained ones to disk. The front-side bus speed and disk-write capability of even the best and most robust servers simply can't keep up with the data rate of today's Gigabit networks when utilization levels are high. If the capture device performs software-based filtering to decide which packets to keep and which to discard, its actual sustained throughput may be as low as 150 to 200 Mbps on a Gigabit capture tool. If software filtering is not used and the server does full packet capture (e.g., the open-source sniffer Wireshark or commercial products based on the Wireshark engine), throughput may rise to as much as 300 to 400 Mbps. Only when a specialized hardware-based capture card offloads this work from the host can a tool approach full line rate.
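To make the oversubscription concern concrete, here is a minimal sketch (using the illustrative throughput figures above; the function name and numbers are ours, not from any vendor) of how much traffic such a tool would drop when the offered load exceeds its sustained capture rate:

```python
# Illustrative only: estimate the fraction of traffic a capture tool drops
# when its sustained throughput is below the offered load on a link.
def drop_fraction(line_rate_mbps: float, utilization: float,
                  tool_capacity_mbps: float) -> float:
    offered = line_rate_mbps * utilization  # traffic actually on the wire
    if offered <= tool_capacity_mbps:
        return 0.0                          # tool keeps up; nothing dropped
    return (offered - tool_capacity_mbps) / offered

# A software-filtering tool capped at ~200 Mbps on a 1 Gbps link at 70% load:
print(round(drop_fraction(1000, 0.70, 200), 2))  # → 0.71 (71% of packets lost)
```

Even the stronger 300 to 400 Mbps full-capture case still drops roughly half the traffic under the same load, which is why buffering alone cannot solve sustained oversubscription: a buffer only absorbs short bursts, not a steady-state deficit.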