We developed a test methodology and evaluation criteria in six main areas: results reporting, product controls and manageability, scan results, vulnerability workflow features, interoperability, and updates and protocol support.
We started by installing each product in our test lab in Tucson, Arizona. For products that offered both on-site and off-site scanners (QualysGuard VM and FusionVM), we elected to install at least one on-site scanner for inside-the-firewall scans and used their outside scanner for outside-the-firewall scans.
We worked with three production networks at three companies: a manufacturing organization with several hundred mostly Windows servers; a mid-sized business network with a mix of about 100 Windows and Unix servers and embedded devices; and the entirely Unix-based, Internet-facing DMZ of a software development firm.
We also put together a special test network of non-production systems where we could control the deployment of patches and updates. It held four systems chosen as typical examples of enterprise servers and clients: two Windows systems (Windows 2003 and 2008), a Mac, and a Linux system.
Over a three-month period from January to March 2011, we launched the scanners repeatedly against the different networks, trying to use them as a normal enterprise security manager might. As results came in, we saved them and took notes on the different test criteria.
Towards the end of the test, we allowed our test systems to patch themselves using built-in operating system tools (such as Microsoft Update) and then re-ran the scans on those systems. Our goal was to evaluate both "delta" scanning (comparing the results of two scans) and each scanner's false positive and false negative rates, by very closely inspecting the results.
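At its core, the "delta" comparison we were evaluating is a set difference between two scans' findings. A minimal sketch of the idea, with illustrative field names rather than any product's actual report schema:

```python
# Hypothetical sketch of "delta" scanning: diff two scans' findings,
# keyed on (host, vulnerability ID). Real products use their own
# formats and identifiers; this only illustrates the comparison.

def scan_delta(previous, current):
    """Classify findings as new, resolved, or unchanged between scans."""
    prev_keys = {(f["host"], f["vuln_id"]) for f in previous}
    curr_keys = {(f["host"], f["vuln_id"]) for f in current}
    return {
        "new": sorted(curr_keys - prev_keys),        # appeared since last scan
        "resolved": sorted(prev_keys - curr_keys),   # gone, e.g. after patching
        "unchanged": sorted(prev_keys & curr_keys),  # still outstanding
    }

# Example: one finding disappears after the host patches itself.
before = [
    {"host": "10.0.0.5", "vuln_id": "MS08-067"},
    {"host": "10.0.0.5", "vuln_id": "CVE-2010-3333"},
]
after_patching = [
    {"host": "10.0.0.5", "vuln_id": "CVE-2010-3333"},
]

delta = scan_delta(before, after_patching)
print(delta["resolved"])  # the patched finding shows up here
```

A real delta report also has to contend with the noise we were testing for: a false positive that vanishes between scans looks exactly like a patched vulnerability, which is why we verified results against systems whose patch state we controlled.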
An important part of the testing process is to create the "persona" of the product user, which helps in deciding what features are important and how the product will be used. Many of the vulnerability analyzer products we tested have their roots in a closely related but very different product space: vulnerability scanners. These scanners are run only occasionally, and the goal is usually penetration testing or a quarterly compliance audit rather than ongoing vulnerability management. (See: Web scanning as an option.)
We were careful to evaluate the products in this test as vulnerability analyzers and not just vulnerability scanners. For example, SAINT Corporation, one of our participants, has a separate product, SAINTexploit, which not only detects vulnerabilities but also actively exploits them to prove their existence. That's an incredibly cool product, but not part of this test.