> Sorry, Scott, but the version you set forth isn't quite accurate. To
> set the record straight:
> Raptor Eagle had a bug which gave it unfavorable performance results
> in the test published in Data Comm's November 21, 1995 issue (now
> available at www.data.com). To Raptor's credit, the company quickly
> found and fixed the bug. Raptor then came back to the test lab for a
> vendor-sponsored retest, in which Eagle outperformed all other products.
> This was done on the same test bed, using the same test application.
> The only thing that changed was the Eagle code. There wasn't any "flaw"
> in the test that caused Eagle to fare better the second time around.
> David Newman                       dnewman @
> Director, Data Comm Test Program   voice 212-512-6182
> Data Communications magazine       fax 212-512-6833
I apologize for the implication that the Data Comm test is flawed. A more
accurate message should have been:
In my opinion, the Data Comm test *method* was (and perhaps still is)
slightly flawed, not the test itself. The reason I make this claim is
that I have seen hundreds of simultaneous connections pass through a
Raptor firewall with no problem at many sites. Even though the Data Comm
test did expose a bug in the Raptor software (causing a connection to be
dropped and the test not to complete at that level), I would conclude
that something in the test method was not indicative of a "real world"
load, since this result had never been seen before by me or, they claim,
by Raptor. Unfortunately, I have no further evidence to back up my claim
since, as far as I know, the test code is not publicly available for
inspection.
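For what it's worth, the kind of load I have in mind is easy to sketch.
Below is a minimal, purely illustrative concurrency check in Python; it
is NOT the Data Comm test code (which, as noted, is not publicly
available), and the host, port, and connection count are all hypothetical
placeholders:

  #!/usr/bin/env python3
  # Illustrative sketch only -- not the Data Comm test code.
  # TARGET_HOST, TARGET_PORT, and CONNECTIONS are hypothetical.
  import socket
  import concurrent.futures

  TARGET_HOST = "host.behind.firewall.example"  # hypothetical target
  TARGET_PORT = 80                              # hypothetical service
  CONNECTIONS = 200   # "hundreds of simultaneous connections"

  def one_connection(i: int) -> bool:
      """Open a TCP connection through the firewall, exchange one message."""
      try:
          with socket.create_connection((TARGET_HOST, TARGET_PORT),
                                        timeout=30) as s:
              s.sendall(b"HEAD / HTTP/1.0\r\n\r\n")
              return bool(s.recv(1024))  # any reply counts as success
      except OSError:
          return False  # a dropped connection is the failure mode at issue

  def main() -> None:
      # Run all connection attempts at once to approximate simultaneity.
      with concurrent.futures.ThreadPoolExecutor(
              max_workers=CONNECTIONS) as pool:
          results = list(pool.map(one_connection, range(CONNECTIONS)))
      print(f"{sum(results)}/{CONNECTIONS} connections completed")

  if __name__ == "__main__":
      main()

If a run like this drops even one connection under an otherwise normal
load, that would mirror the symptom the Data Comm test exposed.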
The bottom line is that Data Comm found a bug, Raptor fixed it and
performed very well in the retest, and Data Comm should be commended for
shedding some light on the murky world of firewall "performance".
Scott Bartram          internet information services, inc.
email: scottb @ net    1680 East Gude Drive
voice: 301-340-1761    Rockville, MD 20850