Once a week or so, someone calls and asks for OWASP Top 10 testing. I have to make the call on the spot whether to explain that isn't what they want, say "Sure!" and then give them what they actually need, or set up a larger meeting to see where their appsec maturity is and base training on that. Usually it is the third.
The problem is, app security is hard to teach, and frankly many shops need secure coding training, which is even harder. Let's break down why that is the case.
OWASP training, yes; OWASP Top 10 training, no
OWASP is a great organization. For those unfamiliar, it is a global nonprofit with the mission of evangelizing application security to developers. It has its political problems, sure, but in general it solves a very hard problem with grace and clarity.
One of the most famous products to come out of OWASP is the Top 10. The list ranks the riskiest vulnerability categories, based on data from member organizations. It is useful. Useful for printing out, rolling up, and smacking your CIO with until you get a security budget.
The OWASP Top 10 is not an application security plan. It is also not a training curriculum. It is a marketing vehicle, and a remarkably effective one. Use it for that and you are golden. Try and do an OWASP Top 10 training, and you are performing a disservice.
This discussion doesn't go over well with most. Everyone wants a magic bullet for application security, but there just isn't one.
Sorry.
The plan is simply to do three things:
1) Teach the developers to recognize security flaws.
2) Teach the developers to repair the security flaws.
3) Give the developers tools to prevent security flaws from ever making it in the code.
Let's count 'em down.
When you need to learn how to test apps
Let's be straight here. The only way to make applications more secure is to code them securely. Okey dokey? Good, that's settled.
Now. There are a few things that need to happen first, and therein lies the rub. CIOs and Dev Leads want to drop a process in place that will secure their code. Then I stop by, put '; in their website search field, blow the whole thing up, and get the O face from the team. First, we need to show developers what the attacks are, and how to check for them.
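If that search-field trick sounds abstract, here is a minimal sketch in Python with SQLite (the `products` table and the function names are made up for illustration): the string-built query is the one a stray `';` blows up, and the parameterized version is the fix we want devs to recognize.

```python
import sqlite3

def search_products_unsafe(conn: sqlite3.Connection, term: str):
    # Vulnerable: user input is pasted into the SQL text, so a stray "';"
    # breaks the statement outright, and crafted input can rewrite it.
    query = f"SELECT name, price FROM products WHERE name LIKE '%{term}%'"
    return conn.execute(query).fetchall()

def search_products_safe(conn: sqlite3.Connection, term: str):
    # Parameterized: the driver treats the input strictly as data.
    query = "SELECT name, price FROM products WHERE name LIKE ?"
    return conn.execute(query, (f"%{term}%",)).fetchall()
```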
The issue with high-level application security instructors is that they are so far along in their personal skill set that they want to talk about in-depth output encoding for style sheets, without realizing that many developers are still wondering what the other site in Cross-Site Scripting is, anyway. I get it. I do. But we gotta judge the audience, and it's rough. In your average 40-person dev team, you are gonna have 7 or 8 people who already know the basics, but not well enough to teach the other thirty-odd. We need to start there.
Security champions - I love you all very much. Take a look at your dev teams. Close your eyes. Take a deep breath. Open your eyes. Does everyone in there understand JWT risks? Does your organization remove dev names from comments? If not, you need to run an application security testing class. No, not everyone has to become an uber-hacker. But it is fun, and it does give everyone a starting point.
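For the JWT question specifically, here is a minimal sketch of the kind of check I mean, using the PyJWT library (the secret and claims are placeholders, not a recommendation for how to manage keys):

```python
from datetime import datetime, timedelta, timezone

import jwt  # PyJWT

SECRET = "change-me"  # placeholder; real keys live in a secrets manager, not source

def issue_token(user_id: str) -> str:
    # Short-lived, signed token; PyJWT enforces the "exp" claim on decode.
    claims = {"sub": user_id, "exp": datetime.now(timezone.utc) + timedelta(minutes=15)}
    return jwt.encode(claims, SECRET, algorithm="HS256")

def verify_token(token: str) -> dict:
    # Pin the accepted algorithms. Trusting whatever the token header claims
    # (especially "none") is the classic JWT verification mistake.
    return jwt.decode(token, SECRET, algorithms=["HS256"])
```

If your champions can explain why that `algorithms` list matters, you are in decent shape.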
When you look at code in code review, ask what input validation is being done. Ask how that viewstate is encoded. If you get a glassy-eyed stare, then consider a class on testing.
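When I ask about input validation, the answer I am hoping for is an allow-list check at the boundary, something like this sketch (the field and its rules are hypothetical):

```python
import re

# Allow-list: letters, digits, and underscore, 3 to 32 characters.
USERNAME_RE = re.compile(r"^[A-Za-z0-9_]{3,32}$")

def validate_username(raw: str) -> str:
    # Reject anything outside the allow-list rather than trying to
    # strip out "bad" characters after the fact.
    if not USERNAME_RE.fullmatch(raw):
        raise ValueError("invalid username")
    return raw
```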
When you need to learn how to write secure code
Once folks can recognize insecure code, it is time to start fixing things. That sounds far, far easier than it actually is. This is when we need to start getting the development staff to build security into their everyday process.
My experience is that you need to do a few things. First, static analysis. It isn't perfect, but it is a start. Static analysis is the process of analyzing the source code itself to find potential security flaws. Dynamic analysis is the act of looking for flaws in a running application. Either can be automated - meaning a script does the work - or manual - meaning a human does the work. Automated static analysis, say with a tool like SonarQube, is very likely to generate a ton of false positives at the start, but the rules can be honed over time. It is an imperfect but fairly effective tool.
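To make that concrete, here is the sort of thing an automated static analyzer typically flags, sketched in Python (rule names and severities vary by tool; this is illustrative, not actual SonarQube output):

```python
import subprocess

# Finding 1: hard-coded credential. Flagged because secrets belong in a
# vault or environment variable, not in source control.
DB_PASSWORD = "hunter2"

def archive_logs_unsafe(path: str) -> None:
    # Finding 2: command built by string concatenation and run through a shell,
    # so a malicious path can inject extra commands.
    subprocess.run("tar czf backup.tgz " + path, shell=True)

def archive_logs_safe(path: str) -> None:
    # The remediation the rule usually points to: pass arguments as a list, no shell.
    subprocess.run(["tar", "czf", "backup.tgz", path], check=True)
```

Expect a pile of findings like these the first time the tool runs; the tuning work is deciding which rules actually matter for your codebase.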
Another important tool is a secure coding standard. This is a custom-built document, not unlike a style guide, that you can hand to new devs and say "this is how we do things." This leads well into the next section, about language-agnostic testing and training, because the secure coding document should be tailored to the platform used by your organization.
Testing is language agnostic, but secure coding isn't
The issue, as one discovers while writing a secure coding standard, is that testing is largely platform agnostic, but writing more secure code is not. From a tester's perspective, I can say "you need to encode your outputs," but from the developer's perspective, there is a different way to do it for every language and platform. Html.Encode()? Sanitize()? Different everywhere, and a few frameworks do the work for you.
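In Python, for example, the HTML-body version of "encode your outputs" is a one-liner from the standard library. This sketch shows the difference; it is the idea, not a complete XSS defense:

```python
import html

def render_comment_unsafe(comment: str) -> str:
    # Raw interpolation: a comment containing <script> runs in the reader's browser.
    return f"<p>{comment}</p>"

def render_comment_safe(comment: str) -> str:
    # html.escape() encodes &, <, >, and quotes for an HTML-body context.
    return f"<p>{html.escape(comment)}</p>"
```

Django templates auto-escape by default, and Flask turns on Jinja2's auto-escaping for HTML templates, which is what "a few frameworks do the work for you" means. The catch is that the right encoding still depends on the output context: HTML body, attribute, JavaScript, or CSS.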
When the test report is written, it should include remediation advice with detailed guidance. That, however, means the tester needs detailed knowledge of the language, platform, and framework used to build the tested application. This is extremely unlikely.
Teaching secure coding in general requires expertise in the language, platform, and framework. Now, some folks know several languages, platforms, and frameworks if they have been around a while. I, for instance, know C# and ASP.NET on Windows, Java and JSP on Apache, and Python with various frameworks quite well. Others, less so. But I have been doing this a long, long time. Teaching secure coding in Ruby on Rails requires a specialty in appsec AND in Ruby. That's not an everyday set of skills.
So what are we gonna do?
Whatcha gonna do? It's not the easiest problem to solve. I have a system that I would like to share, though.
First, have someone give a security talk at your company. Usually I do a lunch-and-learn or something, obviously online these days. Go over the vaunted OWASP Top 10, or give a demo of Burp or ZAP. Heck, hit F12 and see what you see. I usually invite developers, business analysts, and testers (quality assurance, whatever your term is). Some people will nap through it, some will stay after to ask questions. The people who stay after might very well be your security champions.
OK, so now we know who is interested. Second, we do training on testing. Have the security champions help gather the folks they think need to understand the vulnerabilities, and hold a real training on application security testing - one or two days, with labs. This gives the core group the information they need about which vulnerabilities to look for. In the labs, have them look for them. In their own code. Encourage folks to test their own dev instances. Dig in.
Third, retrospective. Get the champions back together. What did we learn? How can we do better? Most important, what are the secure coding principles that must be taught? This is where we solve the language-agnostic issue. You can't just call someone in to teach secure coding; you have to learn what it means to you, your team, and your company.
Fourth, write a secure coding standard. It should be based on the lessons from the retrospective. Base it on the categories of vulnerabilities, but couch them in developer terms. I use:
- Security Principles
- Access Control
- Content Management
- Browser Interaction
- Exception Management
- Cryptography
- System Configuration
But your mileage may vary. The goal is to build a guide you can give someone on their first day. Say "We write secure code here. This is how it is expected to be done." Think that through. Usually my documents are 12 pages or less.
Finally, train on the secure coding standard. Now you know what needs to be taught. Yes, you have to write the materials, but they can be reused. The training can be as long or as short as you like, but you get everyone back together and teach. Then, as new people join the team, you have the culture in place to hand them the document.
Next, if you want to, you start to enforce the standard with a static analysis process. That, however, is for another post.