Thursday, November 4, 2010

Slides from CSI 2010 Posted

The slides from my “Stories From the Front Lines: Deploying an Enterprise Code Scanning Program” talk, presented last week at CSI 2010, are now available for download. For those of you who didn't attend, I spoke about lessons learned and the hints and tips we applied while deploying an enterprise code scanning program at a large financial services institution. If you are deploying, or planning to deploy, a code scanner within your organization, please read these slides. There's lots of good information in there that could save you from unexpected SNAFUs.

Reader Comments (3)

I am extremely unhappy with this slide deck. It covers setting up secure static analysis, but it does not appear to integrate with an appsec program, nor does it bring appsec risk management into focus. There's not much of a story to tell, and the over-focus on deployment reads like solution shillery.

Have you worked with the data from Fortify 360 or IBM Rational Appscan Reporting Console? Is it informative over time? Is the standardization of process and using consistent build servers correlated to the long-term success of an appsec program?

Is the data from these panels accessible for other people involved in the appsec program such as penetration-testers, managers, app owners, etc? Is it useful to them over time? How do they leverage this data? Does the appsec consulting company leverage this data and how?

What upsets me most isn't just these unanswered questions, but the dominance of InformIT, Cigital, and Forrester "evidence" as you portray it.

In my honest opinion, Jaquith and Soo Hoo were right, but the way that their data is presented by you (and secure static analysis vendors) is wrong. Forrester obviously has done something horrible by indicating that integration test and beta test are 2-3 times the cost of finding security bugs in the implementation phase. I could cite numerous sources, but only feel the need to cite Steve McConnell 470 at this time, which lists beta testing and prototyping as significantly stronger at finding bugs than code review and static analysis. I think he even listed integration testing as superior to static analysis.

There are questions about the validity of these "security gurus" as you suggest. Who are these people and how did they get qualified to remove false positives? If you say vendor or GDS supplied training, then I have to immediately question the scalability and consistency of this model.

I have seen "forced" secure static analysis in banks and major ISVs. They are total failures, not because of a lack of upper management support (as you say), but rather because the appdevs have figured out a way to subdue upper management, app owners, and appsec/infosec/risk-management. By making sure that certain patterns are implemented in a way that produces a large number of false negatives, they have subverted the secure static analysis tool and bought themselves more time to work on their other tasks -- all the while looking squeaky clean from an appsec perspective.
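The evasion pattern described here can be illustrated with a hypothetical sketch (not drawn from the slides, and not targeting any specific scanner): a tainted value is "laundered" through an identity transformation that simplistic taint-tracking analyses fail to follow, so an injectable sink is no longer reported.

```python
# Illustrative sketch of a false-negative-inducing pattern. All function
# names are hypothetical; the point is that both queries are equally
# injectable, but only the first has an obvious taint path.

def build_query_direct(user_input: str) -> str:
    # A taint-tracking scanner that follows input into SQL sinks
    # would typically flag this concatenation as SQL injection.
    return "SELECT * FROM users WHERE name = '" + user_input + "'"

def launder(value: str) -> str:
    # Rebuilding the string character-by-character via ord/chr can
    # break naive dataflow analysis, yet the output is identical.
    return "".join(chr(ord(c)) for c in value)

def build_query_evasive(user_input: str) -> str:
    # Same injectable query; the taint path now runs through launder(),
    # which some analyzers lose track of, producing a false negative.
    return "SELECT * FROM users WHERE name = '" + launder(user_input) + "'"
```

The two builders produce byte-identical output for any input, which is exactly why a clean scan report says nothing about whether the code is actually safe.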

There are other organizational/human issues that you don't address, which I find odd considering this is supposed to be a slide deck about deploying this in a real world scenario. The "lessons learned" are fleeting and trivial compared to the actual problem of trying to integrate any appsec piece into an overall software lifecycle or infosec management / risk management program.

Perhaps you did get a chance to discuss all of these points in your actual presentation. I would be interested to hear how the feedback surveys went, and what value you got from them. Personally, I don't see how "forced" secure static analysis is sustainable long-term from an efficiency perspective. I am going to need a lot more proof.

November 4, 2010 | Unregistered CommenterAndre Gironda

Hey Dre, you do raise a lot of good questions, but the intent of the presentation was to point out potential pitfalls we found while helping large enterprise customers deploy security code scanning programs.

November 5, 2010 | Unregistered CommenterBix

The inability to be forward-thinking and failure to see the big picture are most certainly deployment issues.

I find this path of deployment closer to reality than what you portrayed:
http://silvexis.com/2010/09/23/source-code-tragedy/
http://silvexis.com/2010/10/11/source-code-tragedy-act2/
http://silvexis.com/2010/11/05/source-code-tragedy-act3/

@ Bix: I do encourage you to keep going forward with what you are doing here, and it's great that you are overly optimistic. It's important that somebody goes down this path. I'm just glad it's not me!

November 7, 2010 | Unregistered CommenterAndre Gironda
