Neil MacDonald

A member of the Gartner Blog Network

VP & Gartner Fellow
15 years at Gartner
25 years IT industry

Neil MacDonald is a vice president, distinguished analyst and Gartner Fellow in Gartner Research. Mr. MacDonald is a member of Gartner's information security and privacy research team, focusing on operating system and application-level security strategies. Specific research areas include Windows security…


Is Microsoft’s Secure Development Lifecycle Losing Its Effectiveness?

by Neil MacDonald  |  March 7, 2011  |  4 Comments

I was doing some background research on the number and severity of vulnerabilities in software from Apple, Microsoft and other vendors when I ran across something quite interesting. (BTW, I was researching the issue addressed in this research note for clients: whether antimalware software is recommended for enterprise Apple Macintosh endpoints.)

Microsoft, like any other software vendor, has vulnerabilities in its operating system and applications. In sheer quantity, Apple has had more vulnerabilities than Microsoft recently, as shown in data from Secunia, IBM X-Force Labs and others. For example, this chart comes from Secunia's Half Year Report 2010:

[Figure: vulnerability counts by vendor, from Secunia's Half Year Report 2010]

However, in addition to the number of vulnerabilities, the severity of the vulnerabilities must also be considered. Here's where the lab data shows an interesting trend. In 2010, Microsoft had a far larger percentage of vulnerabilities rated "critical" or "high" in its operating system software than any of the other vendors. This chart comes from IBM's X-Force 2010 Mid-Year Trend and Risk Report:

[Figure 39 from IBM's X-Force 2010 Mid-Year Trend and Risk Report: percentage of critical and high vulnerabilities by operating system vendor]

With Microsoft's Secure Development Lifecycle (SDL) in place and continually refined over the past seven years, why does the OS software Microsoft produces contain a significantly larger percentage of security vulnerabilities rated critical or high, while that percentage is decreasing for other OSs?

Here are some possibilities:

  • The bad guys are getting better at finding more serious vulnerabilities on Windows. It's possible, but wouldn't they be getting better equally across all OS platforms? With its dominant market share, Windows is clearly a favorite target; perhaps the bad guys are getting the upper hand.
  • The SDL is losing its effectiveness in finding the really difficult bugs. As the bad guys continue to evolve their abilities, the tools that enterprises use to detect vulnerabilities in code must also continually evolve. Vendors of commercial solutions such as HP Fortify, IBM, Veracode, Cenzic and others invest a significant amount of money evolving their tools. Many of the tools that Microsoft uses internally to detect vulnerable code are ‘home grown’. (A sketch of the classic bug class these tools hunt for appears after this list.)
  • Diminishing returns from developers. Microsoft was an early SDL adopter. Even augmented with tools, it is possible that there is only so much that can be caught by developers before diminishing returns set in.
  • Less emphasis on the SDL. I haven’t seen any evidence of this, but it is possible that Microsoft’s need to innovate quickly against Apple, Google and others has taken priority.
  • Microsoft shipped a lot of new products in the late 2009/2010 timeframe, so more critical vulnerabilities are to be expected: Windows 7, Windows Server 2008 R2, Office 2010, Exchange 2010, SharePoint 2010 and so on. Note that the data in the second figure is for the OS only. Windows 7 wasn't entirely new – it was a facelift on Windows Vista with minimal kernel-level changes. Why would such a large percentage of critical and high vulnerabilities appear on an existing code base?
  • IE 8 was introduced and is a "part of the OS". Since IE 8 is treated as a part of the Windows OS, and since IE 8 was new and included with Windows 7, this could skew the results as compared to other OSs where the browser is not counted as a part of the OS. Still, the percentages should help to compensate for volume: for example, if a new browser added 20 vulnerabilities, 12 of them critical or high, to an OS base of 100 vulnerabilities with 40 critical or high, the critical/high share would move only from 40% to about 43%.
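
To make the tooling point in the second bullet concrete, here is a minimal, hypothetical C sketch (the function names are invented for illustration) of the classic bug class that static analyzers, commercial or home grown, are built to flag: an unbounded copy into a fixed-size stack buffer, alongside the bounded rewrite an analyzer would accept.

    #include <stdio.h>
    #include <string.h>

    /* Classic static-analysis finding: strcpy into a fixed-size stack
       buffer with input of unknown length -- a potential overflow. */
    void handle_request(const char *user_input)
    {
        char buf[64];
        strcpy(buf, user_input);   /* BUG: no bounds check */
        printf("handling: %s\n", buf);
    }

    /* The bounded rewrite: snprintf never writes past sizeof(buf)
       and always NUL-terminates, so the overflow is gone. */
    void handle_request_fixed(const char *user_input)
    {
        char buf[64];
        snprintf(buf, sizeof(buf), "%s", user_input);
        printf("handling: %s\n", buf);
    }

    int main(void)
    {
        handle_request_fixed("example input");
        return 0;
    }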

I’m sure there are other possibilities. I’d be interested in what others believe might be the cause of this.


Category: Application Security, Information Security, Microsoft Security, Windows 7

4 responses so far

  • 1 steve smith   March 10, 2011 at 1:24 pm

    Is there any compensation for new code volumetrics? New software is presumed to be more vulnerable than established, patched code (not sure why, though). The larger volume of new code (e.g., IE) could reflect a larger and more complex testing ‘surface area’ with a greater opportunity for vulnerabilities between interfaces (OLD research addresses this relationship). Has a correlation been shown between market share and attack exposure (# of black hatters attacking, total # of attacks [how would you measure THIS?], or some other measure)? You hint at this in your first reason.

    Is there any research done on the tools used to build the OS? If the tools are intrinsically flawed, then the results have a higher risk of flaws, no?

  • 2 Neil MacDonald   March 11, 2011 at 5:42 pm

    @Steve -

    There isn’t any adjustment for new code volumetrics that I am aware of. Although Windows 7 was a new release, it used largely the same kernel as Windows Vista. Windows Vista had a lot of changes as compared to XP, including the new release of IE7 as compared to IE6, but we didn’t see a similar spike in the data in 2006.

    On the tools, Microsoft uses internally developed tools like PREfix, PREfast and FxCop to scan their code – some of which they make available (at least partially) to their customers in Visual Studio. Microsoft doesn’t compete in the application security testing tools space against Fortify, IBM, Cenzic, Veracode (as a service) and others, so it is hard to know whether Microsoft’s internally developed tools are in need of an update. I can tell you that the commercial vendors have to update their engines at least twice a year to keep up with the changing threat environment.
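
    As a purely illustrative sketch (the function and its contract are invented; it assumes the Windows SDK's sal.h), this is the style of SAL-annotated C that PREfast-class analyzers check at build time: the annotations state how large dst is, so a caller that passes a shorter buffer, or a body that writes past len, can be flagged during analysis.

        #include <sal.h>
        #include <string.h>

        /* The SAL annotations document the contract: dst receives a
           NUL-terminated string of at most len bytes; src is a
           NUL-terminated input. Analyzers use this to prove or flag
           buffer overruns at the call site and in the body. */
        void copy_name(_Out_writes_z_(len) char *dst, size_t len,
                       _In_z_ const char *src)
        {
            if (len == 0)
                return;                 /* nothing safe to write */
            strncpy(dst, src, len - 1); /* bounded copy */
            dst[len - 1] = '\0';        /* guarantee termination */
        }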

    Neil

  • 3 Andrew Wolfe   March 17, 2011 at 7:53 am

    Why would anyone be surprised that operating systems with substantially different data and control structures would tend to eventually demonstrate substantially different security profiles?

    The Unix/Linux model Apple adopted in OS X has been proven highly secure and well tested. Protected memory, virtual memory, multitasking – all of these were mature in Unix-model operating systems before Microsoft even got to Windows 3.1.

    Old folks might remember that Unix was derived from the highly secure (for its time) Multics operating system. The model was good, it worked, it is still strong, and the Internet runs primarily on this model. But Unix was not adopted by Microsoft when it created DOS or any version of Windows.

    I am constantly amazed that the assumption is that all operating systems are as bad as Microsoft Windows. This is ignorance compounded by laziness. Why are Apple, IBM, HP, Red Hat, Oracle, Ubuntu and all assumed guilty of Microsoft’s failures?

    I am not speaking in my capacity as an Oracle employee.

  • 4 Ryan   March 18, 2011 at 10:25 am

    @Andrew – please, let’s put the “UNIX is [inherently] secure/better” argument to rest. You mention DOS and accuse others of ignorance, yet it seems you share in what is a nearly universal lack of understanding of Windows NT internals. To put it most concisely, “NT is VMS re-implemented” (nothing to do with DOS), and VMS is also a very well-regarded OS. In many ways, it’s a much more modern OS than UNIX, though both have evolved significantly over the years.

    Long story short, trotting out the tired “UNIX is better” warhorse adds nothing substantive to this debate, and only throws mud on the topic rather than bringing clarity to any of the issues.

    For a quick and modern take on security for a UNIX-based OS (OS X) vs. Windows, see this recent interview with Dr. Charlie Miller: https://www.infosecisland.com/blogview/12526-Pwn2Own-Winner-Charlie-Miller-Discusses-OS-Security.html