Monday, May 14, 2007

Here come the "rolling" scanner reviews!

It’s been too long since the web application security industry had a good in-depth review of the various vulnerability assessment solutions available. And none in the past have included software-as-a-service models like ours from WhiteHat. Network Computing's Strategic Security: Web Applications Scanners review plans to test products from Acunetix, Cenzic, N-Stalker, SPI Dynamics, Syhunt Technology, Watchfire and WhiteHat Security. Thankfully they have Jordan Wiens conducting the reviews rather than someone with extremely limited domain knowledge. For those who recall, Jordan is not your average journalist. I personally got to see him win Security Innovation's Interactive Testing Challenge web hacking competition. This should be really interesting to watch unfold!

THE TEST BED:
We chose three applications from volunteer organizations to test our Web app scanners. All are relatively simple Web apps in use for real-world functions, and were built using a variety of development tools and platforms.

Our first application was written in C# using Microsoft's ASP.NET with Ajax (also known as Atlas) and deployed on IIS 6.0. The second was developed using the LAMP stack (the combination of Linux, Apache, MySQL and PHP), and the third was written in Java and deployed with JBoss and Tomcat on Linux.

None of the applications has received a security audit, either at the source-code level or using external scanners. Throughout the process, all scanners will be run against the same applications--any changes made to fix security vulnerabilities found in the production systems will be left off the test instances used for future scanning, to ensure that each product and service has the same potential vulnerabilities to find.

Note that no vulnerabilities were intentionally added or seeded into the applications. The applications will be scanned exactly as they existed in the wild at the start of the review.

2 comments:

Jordan said...

Uh oh -- hope this isn't a classic Greek tragedy where the big hype comes before the big fall! I hope I can live up to that. Already there are some less-than-optimal aspects to the testing (having 9 different services/products means that the best in-depth evaluation would be a full-time job for a month or two).

Still, I've seen a lot of useful and some surprising things so far, so hopefully others can get some value from it.

Jeremiah Grossman said...

Nah, not at all.

For myself, I'm tired of reading paid-for "reviews" that are nothing more than product advertisements that ignore the important issues. I'd imagine most of the readers here are of the same opinion, and all we're looking for is a little bit more meat in the discussion. Something honest and direct.

Who was good at what, bad at what, what could use improvement, what customers should be aware of, what the most striking aspects of the testing were, etc. All good topics. Perhaps you could also describe false-positive rates, learning curve, reporting, scalability, and time/skill requirements. Just be fair and info-packed and there is no way you'll disappoint.