Thursday, September 23, 2010

Website Security Statistics Report (2010) - Industry Benchmarks

Full Report and Slides are available on my slideshare account. Enjoy!


"How are we doing?" That's the question on the mind of many executives and security practitioners whether they have recently implemented an application security program, or already have a well-established plan in place. The executives within those organizations want to know if the resources they have invested in source code reviews, threat modeling, developer training, security tools, etc. are making a measurable difference in reducing the risk of website compromise, which is by no means guaranteed. They want to know if their online business is truly more secure or less secure than industry peers. If above average, they may praise their team’s efforts and promote their continued success. On the other hand, if the organization is a security laggard, this is cause for concern and action.


Every organization needs to know where it stands, especially against its adversaries. Verizon Business' 2010 Data Breach Investigations Report (DBIR), a study conducted in cooperation with the United States Secret Service, provides insight. The report analyzes 141 confirmed data breaches from 2009, which resulted in the compromise of 143 million records. To be clear, this data set is restricted to incidents of a "data" breach, which is different from incidents resulting only in financial loss. Either way, the data is overwhelming. The majority of breaches, and almost all of the data stolen in 2009 (95%), were perpetrated by remote organized criminal groups hacking "servers and applications." That is, hacking Web servers and Web applications — "websites" for short. The attack vector of choice was SQL Injection, typically a vulnerability that can’t readily be "patched," and one used to install customized malware.
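To make the "can't readily be patched" point concrete: SQL Injection is a flaw in how an application builds its own database queries, not a defect a vendor can ship a fix for. Below is a minimal, hypothetical sketch in Python (the table, column names, and payload are illustrative only, not drawn from the report) contrasting string-concatenated SQL with a parameterized query.

```python
import sqlite3

def find_user_vulnerable(conn, username):
    # UNSAFE: user input is concatenated directly into the SQL text, so a
    # payload such as "' OR '1'='1" rewrites the query's logic.
    query = "SELECT id, username FROM users WHERE username = '" + username + "'"
    return conn.execute(query).fetchall()

def find_user_safe(conn, username):
    # SAFE: a parameterized query keeps the input as data, never as SQL code.
    return conn.execute(
        "SELECT id, username FROM users WHERE username = ?", (username,)
    ).fetchall()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, username TEXT)")
    conn.executemany("INSERT INTO users (username) VALUES (?)",
                     [("alice",), ("bob",)])

    payload = "' OR '1'='1"
    print(find_user_vulnerable(conn, payload))  # returns every row in the table
    print(find_user_safe(conn, payload))        # returns an empty list
```

With the concatenated version, a crafted username alters the query's meaning and returns every row; the parameterized version treats the same input strictly as data. Fixing this class of flaw means changing application code, which is why it rarely reduces to applying a patch.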

As the Verizon DBIR describes, the majority of breach victims are targets of opportunity, as opposed to targets of choice. Directly in the crosshairs are the Financial Services, Hospitality, and Retail industries. Victimized organizations are selected because their security posture is weaker than others' and the data they possess can be converted into cash, namely payment card data and intellectual property. As such, organizations are strongly encouraged to determine whether they are similar potential targets of opportunity: whether they operate in these industries, have a relatively weak or unknown security posture, and hold similarly attractive data. This is a key point, because perfect security may not be necessary to avoid becoming another Verizon DBIR statistical data point.

There are, of course, many published examples in Web security where the victim was a target of choice. Currently, Clickjacking attacks targeting social networks, more specifically Facebook, are rampant. In these attacks, visitors are tricked into posting unwanted messages to friends and installing malware. There has also been a rise in targeted Cross-Site Scripting attacks, including a notable incident involving Apache.org in which passwords were compromised. Content Spoofing attacks were aimed at Wired to spoof a Steve Jobs health scare. Sears suffered a similar embarrassment when a fake product listing appeared on the company’s website. In an Insufficient Authorization incident involving Anthem Blue Cross Blue Shield, customers' personally identifiable information was exposed.
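As an aside on the Clickjacking attacks mentioned above, the usual countermeasure is to instruct browsers not to render the page inside a frame at all. Here is a minimal, hypothetical sketch of that idea using a bare-bones Python WSGI application; the app, port, and response body are illustrative only.

```python
from wsgiref.simple_server import make_server

def app(environ, start_response):
    """Hypothetical page hosting a sensitive action (e.g., posting a message)."""
    headers = [
        ("Content-Type", "text/html; charset=utf-8"),
        # Tell browsers to refuse framing this page. Clickjacking works by
        # overlaying an invisible iframe of the target site on top of
        # attacker-controlled content, so denying framing blocks the basic
        # technique.
        ("X-Frame-Options", "DENY"),
    ]
    start_response("200 OK", headers)
    return [b"<html><body>Share this with your friends?</body></html>"]

if __name__ == "__main__":
    # Serve a single local request for demonstration purposes.
    with make_server("127.0.0.1", 8000, app) as httpd:
        httpd.handle_request()
```

Frame-busting JavaScript was the older approach, but a response header like this is harder for an attacker to bypass.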

The bottom line is that no matter how mature the software development lifecycle, there are always ways to break into and defraud computer systems. Reducing the number of vulnerabilities to zero is an unrealistic, perhaps impossible, pursuit, and as we’ve learned, likely unnecessary. Instead, organizations should place greater emphasis on improving responsiveness when vulnerabilities are eventually identified. The risk management question then becomes, "How secure is secure enough?" If the organization is a target of opportunity, a goal of being at or above average among its peers is probably good enough. If it is a target of choice, the bar must be set considerably higher.


Until now, no metrics have been published that organizations can use to benchmark themselves against their industry peers. Such benchmarks may help answer the questions, "How are we doing?" and "Are we secure enough?" WhiteHat Security’s 10th Website Security Statistics Report presents a statistical picture of the vulnerability assessment results from over 2,000 websites across 350 organizations under WhiteHat Sentinel management. For the first time, we’ve broken down the numbers by industry and size of organization. The data provides a unique perspective on the state of website security that may begin answering some of these pressing questions.


Key Findings
• The average website had nearly 13 serious vulnerabilities.
• Banking, Insurance, and Healthcare industries performed best overall with respect to the average number of serious vulnerabilities, having 5, 6, and 8 respectively. The worst were the IT, Retail, and Education sectors, averaging 24, 17, and 17.
• Large organizations (over 2,500 employees) had the highest average number of serious vulnerabilities at 13, followed by medium (150 - 2,500 employees) at 12 and small (up to 150 employees) at 11.
• Cross-Site Request Forgery moved up to 4th place as one of the most prevalent vulnerability classes. Also new on the list, in 10th place, is Brute Force affecting 10% of websites.
• The Banking industry removed SQL Injection from the list of the most prevalent issues it faces, while all the other industries are still grappling with it. Similarly, Cross-Site Scripting within the Insurance industry has about half the likelihood of being exploited versus the other industries, at 36%.
• Industries with a greater average number of serious vulnerabilities tend to have worse remediation rates.
• Small organizations fix the greatest percentage of their serious vulnerabilities (62%), followed by medium (58%) and large (54%).
• 64% of Banking and Telecommunications websites, the industries leading in remediation, have fixed more than 60% of their reported serious vulnerabilities. The laggards are Insurance and IT where only 26% and 33% respectively have fixed more than 60% of their outstanding serious issues.
• Organization size does not appear to significantly impact an industry’s average number of serious vulnerabilities, the type or degree of specific vulnerability classes, or its time-to-fix metrics. However, remediation rate does seem to correlate: typically, the larger the organization, the smaller the percentage of vulnerabilities it resolves.
• With respect to the average number of serious vulnerabilities within large organizations, Social Networking, Banking, and Healthcare had the best marks with 4.38, 5.18, and 3.68 respectively. The three worst were IT, Retail, and Financial Services with 29.55, 18.44, and 10.34 respectively.
• Among large organizations, the Banking, Financial Services, Healthcare and Education industries turned in the best time-to-fix metrics with 2 weeks, 3 weeks, 4 weeks, and 4 weeks respectively. The worst were Telecommunications, Insurance, Retail and Social Networking with 26 weeks, 10 weeks, 8 weeks, and 8 weeks.
• Telecommunications, Retail, and Healthcare industries had the three best remediation rates of large organizations with 67%, 60% and 58% respectively. The three worst were IT, Banking and Insurance with 32%, 35%, and 35%.

6 comments:

Dan said...

The "scores" for the IT industry are a bit ironic if not surprising...

Retail is interesting too - looking at your report from last year - it's not totally clear but it doesn't look like PCI 6.6 is doing much to help on this front (even if you tighten your scope for what is PCI - I would have expected some improvement if only from education and cross-pollination).

Thanks for publishing this.

Jeremiah Grossman said...

@Dan I'd have to double check, but PCI may be doing some good in certain areas, like remediation or time-to-fix. But one thing is for sure, it is not night & day.

Jhon smith said...

really ironic score for the IT industry. thanks for sharing.

Bhima shankar said...

good post. thanks for sharing. really helping in my work. keep it up.

Aaron Bryson said...

These statistics are based on the data that WhiteHat has collected from its customers, correct? Very neat information. I like how you are very real about the fact that it is not feasible for a company to pursue or believe it can achieve "zero vulnerabilities". It really is a matter of "are we secure enough" and how much risk they are willing to accept. I just had this talk with a client the other day. Their goal was to refine their SDLC (software development life cycle) such that they will have no vulnerabilities, and possibly never need 3rd-party consulting (if they get really good at security).

Jeremiah Grossman said...

@Aaron Correct, the data is exclusively ours. Although we do contribute to the WASC Statistics project, which is separate from this report.

Yes, zero vulnerabilities MIGHT be possible, but it is going to be seriously expensive and time consuming to pull off. Secondly, it is probably not necessary anyway. Once good-enough software security is achieved, it is better to invest resources in gaining attack visibility and improving one's ability to respond quickly.