
Interactive Application Security Testing

by Neil MacDonald  |  January 30, 2012  |  8 Comments

Dynamic Application Security Testing (DAST) solutions test applications from the “outside in” to detect security vulnerabilities. In contrast, Static Application Security Testing (SAST) solutions test applications from the “inside out” by looking at source code, byte code or binaries.

Both approaches have their pros and cons and, until recently, the market for these tools has evolved separately with different vendors and solutions. Even when a single vendor offers both DAST and SAST solutions, they have not historically been integrated.

In the latest research for clients – Gartner Magic Quadrant for Dynamic Application Security Testing – one of the criteria we looked at was whether or not the vendor’s solution provided Interactive Application Security Testing (IAST). Specifically, we looked for ways that application security testing solutions combine dynamic and static techniques to improve the overall quality of the testing results. In most of these hybrid solutions, an instrumentation agent gathers information that gives the solution an inside-out view to complement the outside-in view of a purely DAST solution: for example, identifying the specific line of code where a security vulnerability occurred, or providing detailed visibility into code coverage. There are several ways that dynamic and static testing techniques can be integrated and made interactive (a sketch of the instrumentation idea follows the list):

1) The web application platform (IIS, Apache, or other) can be instrumented to observe the application as it is being tested dynamically.

2) The web application can be instrumented via injected code (.NET, Java, or other) so that it can be observed during dynamic testing.

3) The output of a static code/binary analysis could be used to create and “tune” the dynamic test that is subsequently performed.

4) The results of observing an application under dynamic test or in use could be used to modify the dynamic test that is being performed in real time. In this way, the dynamic test can be made much more “intelligent” in how it tests an application. This is exactly the approach used by Quotium – a vendor we wrote up in 2011 as a Gartner Cool Vendor.
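
To make the instrumentation idea concrete, here is a minimal sketch in Java of approaches 1 and 2 – an assumed design for illustration, not any vendor’s actual agent. The X-Dast-Probe-Id header and the reporting format are invented for the example:

    import java.io.IOException;
    import javax.servlet.Filter;
    import javax.servlet.FilterChain;
    import javax.servlet.FilterConfig;
    import javax.servlet.ServletException;
    import javax.servlet.ServletRequest;
    import javax.servlet.ServletResponse;
    import javax.servlet.http.HttpServletRequest;

    public class IastObserverFilter implements Filter {

        public void init(FilterConfig config) {}

        public void doFilter(ServletRequest req, ServletResponse res, FilterChain chain)
                throws IOException, ServletException {
            HttpServletRequest http = (HttpServletRequest) req;
            // Invented convention: the DAST scanner tags each probe with a header
            // so inside-out observations can be correlated with a specific test.
            String probeId = http.getHeader("X-Dast-Probe-Id");
            long start = System.nanoTime();
            try {
                chain.doFilter(req, res);
            } catch (RuntimeException e) {
                if (probeId != null) {
                    // Report the exact line of code that failed, rather than
                    // leaving the scanner to infer it from the HTTP response.
                    System.err.printf("probe %s triggered %s at %s%n",
                            probeId, e, e.getStackTrace()[0]);
                }
                throw e;
            } finally {
                if (probeId != null) {
                    // Timing and coverage data give the DAST tool the
                    // inside-out visibility described above.
                    System.err.printf("probe %s: %s took %d us%n", probeId,
                            http.getRequestURI(), (System.nanoTime() - start) / 1000);
                }
            }
        }

        public void destroy() {}
    }

Registered like any other servlet filter, an observer of this kind can give a dynamic scan line-level detail without changing the application’s code.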

Multiple DAST solutions now provide IAST capabilities. Vendors evolving their offerings in this direction include Acunetix, HP, IBM, NTO, Parasoft and Quotium. However, most IAST solutions also require that an agent be deployed on the application platform, which relegates the technique largely to QA and also requires that the vendor explicitly support the platform or language being instrumented (such as PHP, Java or .NET/ASP).
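
As an aside on the platform-support point: on the JVM, an agent would typically hook class loading via java.lang.instrument, a mechanism with no direct equivalent for .NET or PHP, which is why each platform must be supported explicitly. A skeletal agent – illustrative only, not any product’s code – looks like this:

    import java.lang.instrument.ClassFileTransformer;
    import java.lang.instrument.Instrumentation;
    import java.security.ProtectionDomain;

    public class IastAgent {
        // Invoked by the JVM when started with -javaagent:iast-agent.jar
        public static void premain(String args, Instrumentation inst) {
            inst.addTransformer(new ClassFileTransformer() {
                public byte[] transform(ClassLoader loader, String className,
                        Class<?> classBeingRedefined, ProtectionDomain domain,
                        byte[] classfileBuffer) {
                    // A real agent would rewrite bytecode here to trace taint
                    // flow; returning null leaves the class unchanged.
                    if (className != null && className.startsWith("com/example/")) {
                        System.err.println("would instrument " + className);
                    }
                    return null;
                }
            });
        }
    }

A .NET profiler or a PHP extension offering the same visibility would have to be built separately, against entirely different hooks.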

Look for IAST capabilities in your next evaluation of Dynamic Application Security Testing solutions.


Category: Application Security, Security Intelligence

8 responses so far ↓

  • 1 Jeremiah Grossman   January 30, 2012 at 2:54 pm

    Very unlikely, or at least no time soon, for at least a couple reasons.

    Convincing system administrators to place an instrumentation agent on production is going to be extremely challenging. There will be questions of performance, stability, ongoing management, and so on. These concerns will persist no matter how much better the security testing gets.

    Speaking of performance, if an instrumentation agent slows down the system at all, which every report I’ve read says they do, then this is a non-starter from the very beginning. Not to mention that on large-scale and sweeping website infrastructures, just getting the installation in place would be a huge undertaking for the corps.

    This is why IAST rollouts are very few and far between with little momentum.

    Where IAST is going to be most useful is on QA / Staging systems where these issues are less of a concern. And if this is the case, IAST will have its place in the enterprise, but is unlikely to replace strict DAST on production as the dominant way to test software.

  • 2 Arian Evans   January 30, 2012 at 6:13 pm

    This makes a lot of sense. We’ve been combining elements of SAST on client-side JS (source) and RIA (bytecode) like Flash for years now and found it’s the best way to provide thorough (and safe) DAST.

    You can run a headless browser/attack browser to execute the client-side code – but it doesn’t always go deep enough. Plus it’s safer to analyze off-domain JS content via SAST. Not to mention Flash often has vulnerable variables and functions buried in the SWF bytecode, but not exposed dynamically. You rip them out of the SWF via SAST, and re-inject them into the DAST scan for much more comprehensive coverage and vulnerability detection.

    Bottom line, we find enough issues in client-side code using IAST that I would expect the server-side attack surface to be even larger, and server-side IAST much needed. (We saw this in several cases over the years working with FOD, verifying their server-side SAST vulns.)

    Also – for comprehensive web 2.0 scanning in the future, I bet IAST as you define it will be a requirement – due to the heavy reliance modern web/mobile apps have on client-side code; 3rd party libraries that you don’t have authorization to DAST; and client-side data storage. Exciting time for useful innovation here.
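
    A rough sketch of the SAST-to-DAST hand-off described in the comment above – hypothetical code, with the parameter names, URL and payload invented for illustration. Inputs dug out of SWF bytecode or off-domain JS by static analysis become extra requests in the dynamic scan:

        import java.net.URI;
        import java.net.URLEncoder;
        import java.util.Arrays;
        import java.util.List;

        public class SastFindingsToDast {
            public static void main(String[] args) throws Exception {
                // Pretend static analysis of app.swf surfaced these hidden
                // inputs, which never appear in any form the crawler could see.
                List<String> hiddenParams =
                        Arrays.asList("debugMode", "adminToken", "cfgUrl");
                String payload = "'\"><script>alert(1)</script>";
                for (String param : hiddenParams) {
                    // Each statically discovered parameter becomes an
                    // additional dynamically tested request in the DAST scan.
                    URI probe = URI.create("https://qa.example.com/player?" + param
                            + "=" + URLEncoder.encode(payload, "UTF-8"));
                    System.out.println("DAST probe: " + probe);
                }
            }
        }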

  • 3 Neil MacDonald   January 30, 2012 at 7:12 pm

    Jeremiah,

    Agree that the instrumentation approach in its current incarnation is best suited for QA or further back in dev.

    It’s possible that SAST analysis in the traditional source/byte/binary fashion could be used to craft a dynamic test, but no vendor has shipped this (yet).

    We talk about the need for security testing to be pushed back into dev; perhaps IAST is one way this happens?

    BTW – Quotium’s positioning is exactly this – for use by developers in development, not really for production…

    Neil

  • 4 Neil MacDonald   January 30, 2012 at 7:16 pm

    Arian,

    Yes – we’ve already been heading down the DAST/SAST path…

    see this
    http://blogs.gartner.com/neil_macdonald/2011/01/19/static-or-dynamic-application-security-testing-both/

    RIA challenges DAST and needs SAST.
    Dynamic languages challenge SAST and need DAST.
    Two great techniques that taste great together.

    IAST is the next phase of this evolution.

    Neil

  • 5 Jeremiah Grossman   January 30, 2012 at 7:32 pm

    Neil,

    Arian is spot on with his comments on the client-side of IAST. My thoughts were exclusively for server-side. Didn’t know which your original blog post intended, so I just assumed.

    Anyway, it’s hard to say what the stimulus is going to be to get any kind of security testing deployed earlier in the SDL for the mainstream. Seems every company has a different motivation and different approach to testing. We have a pretty good understanding of what they don’t like… false positives. They want whatever testing to be fast and accurate, otherwise it’s just shelf-ware. If IAST helps that, then it has a shot.

  • 6 Oliver Lavery   January 31, 2012 at 12:58 am

    It seems self-evident that both dynamic analysis and static analysis benefit from each other’s aid. Debuggers have provided those views on a system since the beginning, and nobody would argue that adding breakpoints to IDA made it worse. They’re simply complementary approaches.

    What’s preventing these tools from moving earlier in the SDL is simply organizational inertia. Security conscious companies are doing it — certainly were doing it when the economy was good. The barriers are cultural, not systemic. Writing secure software costs fractionally more, just like writing software free of other sorts of defects. But a fraction is still an expenditure.

    As much as I dislike the acronym, ‘SDL’ and dependent approaches like ‘IAST’ are going to catch on. Give it a couple years and people will cotton to the fact that LulzSec and the like exploit virtually nothing but application vulnerabilities. I’d even wager there’s a split between production-oriented tools and SDL-oriented tools a-comin’.

  • 7 Neil MacDonald   January 31, 2012 at 9:25 am

    Oliver – agreed complementary. However, years ago when I first started covering application security, the providers really pushed back on this. In 2003-2004, the vendors were small and needed to focus on their area of expertise.

    Since then, the market has matured, acquisitions have taken place and even the smaller vendors are providing both types of approaches.

    Likely a split between production/dev – e.g. production: easier to deliver as a service, harder to instrument, integration with WAF desirable;

    development: easier to instrument, harder to deliver as a service, bias to reduce false positives even if false negatives are increased.

    I believe DevOpsSec – agile development with security – is the future
    http://blogs.gartner.com/neil_macdonald/2012/01/17/devops-needs-to-become-devopssec/

    Neil

  • 8 Oliver Lavery   January 31, 2012 at 2:05 pm

    Neil,

    Yes, in 2003-2004 even the cutting edge researchers were arguing static vs dynamic analysis until they were blue in the face (for reverse engineering purposes). It’s just a false dichotomy.

    RE the split, agree, except F+ are much easier to deal with earlier in the SDLC. Same is true with intrusive scanning techniques / disruption. Production scanners have to have minimal impact on the host, and a very low F+ rate. If I’m an ops guy, I can’t look at a SQL injection vulnerability, for instance, and quickly determine it’s an F+. I have to escalate to engineering, who will probably push back; it has to be scheduled, etc. If I’m an engineer doing end-of-sprint scans of my product, an F+ is very easy to deal with.

    DevOpsSec is a good idea. I’d say it has to go a little further, though. There has to be stakeholder involvement. The biggest problems arise from poor security-related requirements. An SDL must therefore have product management involvement to articulate security as a requirement for the system in question, and ultimately to justify the cost of that requirement.