We’re always working on updating our software security / application security coverage, and the time has come to spend a few months gathering new information for the application security program guidance document. To make it more than “here’s another general maturity model – do everything it says,” I’m looking for what makes or breaks these programs in practice. In particular, I’m looking for anecdotes and data on developer training, which remains a somewhat opaque area for me. To wit, consider if and how the following relates to developer training:
“teach a man how to fish, and he may still end up starving the whole family.”
In other words, what exactly should developers be trained on?
I’ve asked quite a few people for data – data that shows how training improves software security quality – and I’ve come up empty-handed. I realize it’s hard to measure. Ideally we’d have a controlled study to gather some data, but such studies can be hard to pull off.
I know some of the more mature software security teams / programs do measure this in various ways. If you have some data to share, please do let me know in comments or via email! (and I’ll keep it in strictest confidence when requested, of course). You can reach me at firstname.lastname@example.org
Related: if you’re going to be at the 2012 U.S. Security Summit, stop by my session “The Art of Saying Yes - Selling Application Security To Developers and Architects” on Tuesday (in the Business of IT Security track). We’re also featuring many other Technical Insights sessions by my GTP colleagues in the other tracks.