In network scanning, the list of “well-known” vulnerabilities is large, but also finite. Databases such as OSVDB, SecurityFocus, and MITRE (CVE) catalog the known universe of issues, and vulnerability coverage by network scanners is likely close to 100%. In “custom” web applications, the luxury of well-known vulnerabilities and database repositories vanishes. Each new vulnerability identified is more or less a one-off / zero-day issue. Just as with bugs in application code, we truly never know how many vulnerabilities exist in a web bank, e-commerce store, payroll system, or any other custom web application. The upper bound is unknown, so we can never know for sure whether any scan/assessment found them all. Vulnerability coverage could be as low as 10-20% or as high as 80-90% or more. The point is we don’t know, it’s difficult to measure, and it changes with each website.
This is a big reason why I’ve been talking a lot about measuring security recently. I’m a big believer in it. Who isn’t? I even took a shot at a Methodology for Comparing Web Application Vulnerability Assessment Solutions, figuring we could use time-it-takes-to-hack-a-website as something we could reliably measure. For some reason I haven’t gotten much feedback on the idea. Likely because there hasn’t been customer demand: customers aren’t REALLY aware that everything isn’t being found. Whether my methodology works or not, we’re going to need to figure this out. Once customers of ANY webappsec VA solution get hacked due to missed vulnerabilities, there’s going to be hell to pay.