
Are large buffer memories the right solution for oversubscription in aggregation devices?


The answer is nearly always no, because most network tools can't capture at 100% of full line rate. The vast majority of tools rely on the onboard NICs of the appliance (i.e., the server running the capture/monitoring software) to receive packets and write to disk those that will be retained. The front-side bus speed and write-to-disk throughput of even the best and most robust servers simply can't keep up with the data rate of today's Gigabit networks when utilization levels are high.

If the capture device performs software-based filtering to decide which packets to keep and which to discard, the actual sustained throughput may be as low as 150 to 200 Mbps on a Gigabit capture tool. If software filtering is not used and the server does full packet capture (e.g., the open-source sniffer Wireshark or commercial products based on the Wireshark engine), throughput may increase to as much as 300 to 400 Mbps. Only when a specialized hardware-based capture card is used can a tool sustain capture at full line rate.
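To make the numbers concrete: a fully utilized Gigabit link delivers roughly 125 MB/s, which must be sustained all the way to disk, and that helps explain why real-world software capture lands in the 150 to 400 Mbps range. Below is a minimal sketch of the kind of libpcap capture loop such tools run. It assumes a Linux host with libpcap installed; the interface name eth0, the filter expression "tcp port 80", and the output file capture.pcap are placeholders, not anything specific from the answer above.

/* Minimal libpcap capture loop with a kernel-level BPF filter.
 * A sketch, not a line-rate solution: a user-space loop like this
 * is exactly the software path whose limits are described above.
 * Build: gcc -o cap cap.c -lpcap  (run with root/CAP_NET_RAW).
 */
#include <pcap.h>
#include <stdio.h>

static void on_packet(u_char *user, const struct pcap_pkthdr *hdr,
                      const u_char *bytes) {
    /* Write the retained packet to the savefile opened in main(). */
    pcap_dump(user, hdr, bytes);
}

int main(void) {
    char errbuf[PCAP_ERRBUF_SIZE];

    /* "eth0" is a placeholder interface; snaplen 65535 keeps whole packets. */
    pcap_t *handle = pcap_open_live("eth0", 65535, 1, 1000, errbuf);
    if (handle == NULL) {
        fprintf(stderr, "pcap_open_live: %s\n", errbuf);
        return 1;
    }

    /* Compile and install a BPF filter. The kernel discards non-matching
     * packets before they reach user space, which is cheaper than
     * inspecting every packet in application code. */
    struct bpf_program fp;
    if (pcap_compile(handle, &fp, "tcp port 80", 1, PCAP_NETMASK_UNKNOWN) == -1 ||
        pcap_setfilter(handle, &fp) == -1) {
        fprintf(stderr, "filter: %s\n", pcap_geterr(handle));
        return 1;
    }

    /* Open the savefile; this disk-write path is the bottleneck at
     * sustained Gigabit rates. */
    pcap_dumper_t *dumper = pcap_dump_open(handle, "capture.pcap");
    if (dumper == NULL) {
        fprintf(stderr, "pcap_dump_open: %s\n", pcap_geterr(handle));
        return 1;
    }

    pcap_loop(handle, 0, on_packet, (u_char *)dumper);  /* 0 = run until error/break */

    pcap_dump_close(dumper);
    pcap_freecode(&fp);
    pcap_close(handle);
    return 0;
}

The design point worth noting is pcap_setfilter: it pushes the filter into the kernel, so discarded packets never cross into user space. Filtering after capture in application code, as some tools do, is costlier still, which matches the lower 150 to 200 Mbps figure cited for software-based filtering.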
