> This week's Infoworld reports on page 16 that 15 firewall vendors have
> had their products certified after testing by the National Computer
> Security Association (NCSA). Sponsoring vendors (I'm presuming tests
> were paid for by vendors) get to cite the NCSA certification in
> hawking their wares.
NCSA is a for-profit venture. This is not a criticism
(some of my best friends work for a living!) but it's worth
noting, as always, when trying to understand an organization's motives.
> NCSA's Web site summarizes the tests performed. The URL is:
> I'm very curious to hear members of this list comment on these
> tests--are they sufficiently rigorous and complete?
This is a tricky question. *NO* test can be sufficiently
rigorous and complete. A better question to ask, perhaps, is whether
the testing has some value.
I looked at the list of tests that are performed and all
of them are, basically, trivial. No firewall should fall prey
to any of those tests, under any circumstances. On the other hand,
for example, you might get "false positives." I remember that
pingware used to choke on smap (from the toolkit) output when
it attempted a sendmail debug. Smap sent a reply reading:
220 Debug set - NOT!
and pingware would raise the roof.
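	That failure mode is easy to sketch. The following is a
hypothetical reconstruction, not pingware's actual logic: a naive
scanner sends DEBUG to the mailer and flags any 2xx SMTP reply code
as "debug enabled," without reading the text of the reply.

```python
# Hypothetical sketch of a naive scanner's check (NOT pingware's real code).
# It treats any 2xx SMTP reply to a DEBUG probe as proof the sendmail
# debug hole is open -- which is exactly how smap's joke reply fools it.

def naive_debug_check(reply: str) -> bool:
    """Flag the host as vulnerable if the DEBUG probe drew a 2xx reply."""
    return reply.startswith("2")

# smap answers with a 2xx code but is actually refusing the command:
smap_reply = "220 Debug set - NOT!"
print(naive_debug_check(smap_reply))  # True -- a false positive
```

A scanner that parsed the reply text, or followed up with an actual
debug command, would not be fooled; that's the "intelligence" part.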
Programmed testing is not a substitute for intelligence.
That being said, it's better than nothing. Implicit in
NCSA's action is the assumption that there are firewall vendors
so irresponsible or clueless that they'd sell a firewall with
rexd running on it, reachable from the outside. I suppose it is
possible that someone might do that, and I suppose that the
test would detect it. So I suppose that the test will act as
a very low bar to keep out the complete lamers.
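	The sort of probe needed to catch that is trivial. Here's a
sketch of the first-order check (my own illustration, not NCSA's test
code): since rexd is an RPC service, the question starts with whether
the portmapper even answers from outside. The hostname is a
placeholder.

```python
# Illustrative probe, not NCSA's actual test suite: can we open a TCP
# connection from the "outside" to a service on the firewall?
import socket

def tcp_reachable(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# rexd is reached through the RPC portmapper, so the crudest check is
# whether TCP 111 answers at all from the outside:
#   tcp_reachable("firewall.example.com", 111)   # hostname is a placeholder
```

Any vendor whose product fails a check this simple has bigger
problems than certification.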
From looking at the NCSA test checklist, I believe that
a router could be configured to pass it, with a minimum of
effort. The test also specifies (correctly) that:
-> The internal machines will not be configured in a `secure'
-> manner - it is the job of the vendor's firewall to maintain
-> the security of the inside network.
	Here is where things get interesting. There's no
mention of the O/S (or mix of O/Ses) on the inside network,
yet that is significant. What about firewalls that are transparent
packet-screening type firewalls? They permit certain forms
of traffic to certain services on the inside. THE SECURITY
OF THE SOFTWARE ON SELECTED SYSTEMS BEHIND THE FIREWALL MAY
BE IMPORTANT. Unless NCSA's intent is to argue that all
packet-screening type firewalls, no matter how powerful
they may be, are invalid technologies.
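	To make the point concrete, here's a toy model of a
packet-screening ruleset (purely illustrative; the addresses and
rules are made up). The screen forwards permitted traffic untouched,
so for the permitted service it is the software on the inside host,
not the firewall, that stands between the attacker and the data.

```python
# Toy packet screen (illustrative only). Permitted traffic passes
# straight through to the inside host -- the screen does nothing to
# protect the sendmail (or whatever) listening there.

RULES = [
    # (dst_host, dst_port, action); "*"/0 are wildcards
    ("10.0.0.25", 25, "permit"),   # SMTP direct to the internal mail host
    ("*",          0, "deny"),     # default deny
]

def screen(dst_host: str, dst_port: int) -> str:
    """Return the action for the first matching rule."""
    for host, port, action in RULES:
        if (host in ("*", dst_host)) and (port in (0, dst_port)):
            return action
    return "deny"

print(screen("10.0.0.25", 25))  # permit -- reaches the inside mailer
print(screen("10.0.0.25", 23))  # deny
```

A cold-lab test of the screen alone tells you nothing about the
mailer on 10.0.0.25, which is exactly the problem.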
I have a lot of reservations about any form of
firewall testing programme, because of the configurability
of the systems in question. Here's where I have to admit
that the orange book guys had it right:
1) Ensure the system's design has some basic properties
2) Then test it in its deployed configuration
When I was consulting I went to a number of sites to
check out firewalls, and in several cases I saw firewalls that
had been configured with big gaps in them for some service or
other to some internal system or other, and no attention had
been paid to the security of the software on that system. A
cold-lab test of a firewall isn't going to be able to help
with that kind of thing -- it shouldn't, since it's not the
firewall's problem. The firewall vendor can't take that sort
of thing into account.
So - where does that leave us? Back to the orange book's
idea of making sure the basic design makes sense, and then
making sure the deployed configuration is correct. Doing a
design review is something NCSA can't touch because a design
review will require a LOT of effort, cost a lot, and design
reviews inherently will introduce bias - for some purposes,
some designs simply are better. For other purposes, the same
design may not be so good. That's why there are both Ferraris
and Humvees. A design review has to start from "it depends"
and work from there, whereas NCSA's effort has to start with
what their participating vendors already have, and make some
sense of it. Orange book style design review only works if
you already have a notion of the One True Correct Design.
Which doesn't work in an evolving commercial marketplace.
The other approach remaining to us is to do onsite
testing of every firewall in its deployed configuration.
That works, but it's extremely expensive and it requires a
lot of expertise. It also doesn't work because, frequently,
it runs counter to the business reason for installing the
firewall in the first place: if the tests *say* that the
hole in the firewall is a bad idea, and the hole in the
firewall is "necessary" then what do you do? Ignore the
test. Do you re-test it whenever the firewall changes or
the systems behind it change? You should. There are a lot
of consultants making a lot of money testing and re-testing
firewalls right now. Is it money well spent? I am not sure
it really is. I used to be in that business and discovered
that at almost every site I visited, the firewall was about
the only part of the network that was "right"; everything
else was broken.
Lastly, I am concerned that product testing is
going to lull people into making the wrong assumption: that
a firewall, once "tested" is OK from then on. Security is
a *PROCESS* not a simple thing you implement once, test,
and then forget about. The idea that you can buy a pre-tested
firewall, install it, and never worry, is dangerously naive.
Marketing that as a "feature" is irresponsible. Using a test
as a barrier to market entry or sales leverage is sleazy.
Unless the test is something rigidly quantifiable, which I
believe firewalls are not, by virtue of their extreme configurability.
Six months or so ago, I wrote a lengthy polemic on
this topic, which I had on my (then) web page at iwi.com. It
is now available in the V-ONE publications area as:
It does not represent V-ONE corporation's official views on
the topic, but I hope it can help provoke some thoughts
and discussion on the topic.
I've known the NCSA guys for a while, and I've even
contributed work (pro bono) to the goal of improving the state
of the firewall market. They have, bless their hearts, been
pushing forward with the firewall product functional summaries
effort which I started; that's good. Clearly, NCSA is trying
to provide value in this area. I'm not sure I agree 100% (or
even 30%) with the firewall testing concept, and I guess it's
really more about marketing and perception than it is about
security. The problem is that, in an environment where some
folks are trying to push Lotus Notes as a "firewall," it's hard
to criticize any organization that tries to certify firewalls
as "apparently OK." I just don't think that a certification
sticker means a whole lot more than that someone paid some
$$ and had some basic tests run, and passed them. That doesn't
impress me a whole lot but I guess it's a start.
Chief Scientist, V-ONE Corporation -- "Security for a connected world"