I’m sure other people have noticed this (at least I hope so), but no one seems to have mentioned it publicly. If you read PCI-DSS 1.1 section 6.5, the requirement to “Cover prevention of common coding vulnerabilities in software development processes”, you’ll notice the list is identical to the OWASP Top Ten 2004, even though the latest version is the 2007 release:
6.5.1 Unvalidated input
6.5.2 Broken access control (for example, malicious use of user IDs)
6.5.3 Broken authentication and session management (use of account credentials and session cookies)
6.5.4 Cross-site scripting (XSS) attacks
6.5.5 Buffer overflows
6.5.6 Injection flaws (for example, structured query language (SQL) injection)
6.5.7 Improper error handling
6.5.8 Insecure storage
6.5.9 Denial of service
6.5.10 Insecure configuration management
I guess, technically speaking, you don’t have to worry about anything that’s in the 2007 version but not the 2004 one. That means you still have to code against Buffer Overflows and Application DoS, but not Malicious File Execution, Insecure Direct Object Reference, or Cross-Site Request Forgery (CSRF). Ahh, fun fun. Gotta love compliance. :)
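For two of the listed items, the standard mitigations are well established: parameterized queries for injection flaws (6.5.6) and output escaping for XSS (6.5.4). A minimal sketch in Python, using an in-memory SQLite table with a hypothetical `users` schema purely for illustration:

```python
import sqlite3
import html

# In-memory database standing in for an application data store
# (hypothetical schema, for illustration only).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO users (name) VALUES ('alice')")

user_input = "alice' OR '1'='1"  # classic injection payload

# 6.5.6 Injection flaws: pass untrusted input as a bound parameter,
# never by string concatenation, so the payload is treated as data.
rows = conn.execute(
    "SELECT id, name FROM users WHERE name = ?", (user_input,)
).fetchall()
print(rows)  # [] -- the payload matches no row instead of dumping the table

# 6.5.4 Cross-site scripting: escape untrusted input before echoing
# it back into an HTML response.
print(html.escape("<script>alert(1)</script>"))
# &lt;script&gt;alert(1)&lt;/script&gt;
```

The same two ideas (bind parameters, escape on output) carry over to whatever database driver and templating layer an application actually uses.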
15 comments:
You do have to love compliance. They try, but I have found that many are not exactly up to date on the latest in any specific category of security: application, network, or otherwise. To me, this is understandable, since they generally have to manage a variety of security concerns at a very high level and delegate any clarification to their respective SMEs.
Luckily, we do have them to (hopefully) manage our vulnerability remediation efforts, which lets us focus on discovering the vulnerabilities in the first place.
This is probably due to the fact that PCI DSS v1.1 was released in September 2006. Have you emailed the Standards Council to see if version 1.2, due out in October, will be appropriately updated?
In addition, this is also where the role of the PCI QSA comes in. QSAs are the security experts who need to decide whether or not an organization is compliant. The PCI-DSS tries to be specific enough to give proper guidance, but also allows for interpretation by the QSA. This is both good and bad, but we hardly live in a perfect world.
I would expect for the next update to reference the 2007 guide. For supporting evidence of this, the Payment Application Data Security Standard (PA-DSS) "PABP to PA-DSS Transition guide" (https://www.pcisecuritystandards.org/pdfs/pci_summary_of_pabp_to_pa-dss_changes.pdf) references the OWASP 2007 list.
I had pointed this out at the recent OWASP Irish chapter meeting, and said pretty much the same thing damon has said.
Version 1.1 was released before the 2007 top ten had been announced. I'm sure the upcoming version of the standard will address this.
I agree the timing of the document releases would account for this, but I wonder why they didn't address it with the recent supplemental release on requirement 6.6. I'm sure they'll update the document in October with v1.2.
Hey - at least the PCI reg addresses the specifics... most regulations suck. What it really comes down to is the auditors... Regulations are inherently behind the times - much like the textbooks in schools... by the time it's proposed, reviewed, approved and distributed it's out of date. Sadly that's the way it goes... follow the spirit, not the letter of the law and ye shall be OK.
That's why I prefer liability-based models to standards. No way a single group of people is going to be smart enough for everyone else.
Well, on the other hand, if they'd referenced the 2007 list, would that mean I don't have to mind buffer overflows?
The PA-DSS has a reference to the latest OWASP Top 10. But I think it would be better if they just made a blanket statement along the lines of "Ensure that the latest OWASP Top 10 requirements are followed for application development"...Now, that wouldn't be too hard, would it?
It makes sense. I take it that the PCI council has to go through a lot of red tape (and assorted VP golf games) so staying a bit behind the times is to be expected :)
Since it is a standard, it has to be a self-contained document, so referring to something not under their control, and of such a rapidly changing nature as the field of web application vulnerabilities, is a bit off.
Haven't RTFA'd yet, but I totally agree with you about code review being ultimately impossible...quite apart from the shortage of security-aware programmers, the amount of code on the web is rapidly increasing, and the rate of obsolescence in competitive markets where site appearance matters is increasing too. Who knows what languages, formats, mashups, and munges will be required for companies to be competitive 5 years from now? One thing is for sure, though: programmers will be more concerned about those attributes than securing old code, because, on a broad scale and absent breaches creating crises, planning and funding follow profitability and innovation a lot more closely than security. --Eponymous
@Alexander, I guess so. But then again, the vast majority of web apps don't have buffer overflow issues anyway.
@Thanasis, what's also interesting is that PCI-DSS is updated more often than the Top Ten, which seems to be revised every three years. So maybe in 2010 we'll have another version. :)
@Rafal: Unfortunately, if it is a standard that requires compliance (with third parties checking items on the questionnaire), following the spirit might be technically sound (in fact I agree with you), but the auditor will "judge" you based on the standard itself. :)
Didn't even think of that. Made me worried about the QSA recert exam I just sat, where I posted responses based on the latest Top 10. (Hopefully I won't be marked wrong on a technicality, so to speak.) :-)
Mike Dahn, PCI trainer covers a response here:
http://pcianswers.com/
You've got to think, "what's the risk?" OK, so the acquirers want the data in plain text. Most places I've seen with this requirement install a dedicated link to their acquirer.
If it's over a VPN then it's going to be encrypted in transit, so it's not an issue. People get really hung up on this, and it's not a big deal. Logic doesn't dictate that the standard require the data to be encrypted everywhere. For the data to be of any use, it will have to be decrypted at some point. Storage in decrypted form is the bigger issue, as this is where it is most likely to be compromised. Regular dealings with Visa have shown that they are more interested in the data "at rest" post-transaction, as long as sensible protection is applied while it is in transit, of course.
People who say that sending unencrypted data over an encrypted channel is not sending encrypted data are just missing the point.
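The point about the channel doing the encrypting can be made concrete: whether it's a VPN or TLS, the application hands over plaintext and the transport protects it on the wire. A minimal sketch, assuming nothing beyond the Python standard library, of configuring a TLS client context the way such a link would be set up:

```python
import ssl

# The application-level payload is "plain text" at both endpoints;
# the TLS record layer encrypts it in transit, much like a VPN would.
ctx = ssl.create_default_context()

# Refuse legacy protocol versions; certificate validation is already
# required by default with create_default_context().
ctx.minimum_version = ssl.TLSVersion.TLSv1_2

print(ctx.verify_mode == ssl.CERT_REQUIRED)  # True
```

A socket wrapped with this context (`ctx.wrap_socket(...)`) carries the acquirer's "plain text" data encrypted on the wire, which is exactly the distinction the comment above is drawing.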