Neil MacDonald

A member of the Gartner Blog Network

VP & Gartner Fellow
15 years at Gartner
25 years IT industry

Neil MacDonald is a vice president, distinguished analyst and Gartner Fellow in Gartner Research. Mr. MacDonald is a member of Gartner's information security and privacy research team, focusing on operating system and application-level security strategies. Specific research areas include Windows security…


Security Thought for Thursday: With DLP, Don’t Just Treat the Symptoms, Address the Cause

by Neil MacDonald  |  September 24, 2009  |  4 Comments

I’ve talked to several organizations (commercial and federal government) that have banned the use of all USB flash drives as part of a data loss prevention (DLP) strategy. This may indeed be necessary and does provide immediate protection against data loss. However, it’s a blunt, coarse control that doesn’t solve the underlying problem. Such drastic policies get in the way of legitimate users trying to do their jobs. Worse, such policies are merely “security theatre” if other ways that information may escape (email, instant messaging, fax, FTP, VoIP, printing and so on) aren’t also addressed.

So what is the root of the problem? Consider what we’ve learned with application security. There is broad industry consensus that shielding and patching after the fact are symptomatic of a faulty development process. For example, we can put up a web application firewall to shield a vulnerable application but we really haven’t solved the problem. To properly address application security issues we must change the way we produce (and procure) applications. We have to get back further into the development process when new applications are created.

Let’s apply this insight to information security. Banning USB flash drives is symptomatic of a faulty information security lifecycle process. Rather than treat the symptoms, we must get back further into the information lifecycle to understand how, when and where sensitive information will be created or acquired. It’s at this point in the information lifecycle that we need to define (and enforce) policies on the information as it moves on to be consumed by systems and users.

Instead of a policy like “nobody is allowed to use a USB flash drive”, a control that enforces a policy like “anyone can use a USB flash drive, but don’t allow sensitive data to be copied to a USB drive” makes more sense. Better still, how about a control that enforces a policy like “don’t allow sensitive data to be copied to a USB drive unless the data (or the drive itself) is encrypted”?
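To make the graduated policy concrete, here is a minimal sketch of the decision logic, purely illustrative and not any vendor’s actual API. The names `is_sensitive` and `may_copy_to_usb` are hypothetical stand-ins for a real content classifier and a real endpoint agent:

```python
# Hypothetical sketch of a content-aware removable-media policy.
# is_sensitive() is a placeholder for a real content classifier.

def is_sensitive(data: bytes) -> bool:
    """Placeholder classifier: flags data carrying a marker string."""
    return b"CONFIDENTIAL" in data

def may_copy_to_usb(data: bytes, drive_encrypted: bool) -> bool:
    """Allow the copy unless the data is sensitive and the drive is unencrypted."""
    if not is_sensitive(data):
        return True          # non-sensitive data: always allowed
    return drive_encrypted   # sensitive data: only onto an encrypted drive

print(may_copy_to_usb(b"meeting notes", drive_encrypted=False))        # True
print(may_copy_to_usb(b"CONFIDENTIAL roadmap", drive_encrypted=False)) # False
print(may_copy_to_usb(b"CONFIDENTIAL roadmap", drive_encrypted=True))  # True
```

The point of the sketch is the shape of the decision: the control reasons about the data and the destination, not about the port itself.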

The problem is, we don’t really have a good handle on what data is sensitive, how it is used, how it moves around, what systems and users rely on it, and how and where it is stored. That’s the real problem DLP projects need to tackle.
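Getting that handle starts with discovery and classification. As a minimal, purely illustrative sketch of one common DLP discovery technique, the snippet below pairs a digit-run pattern with a Luhn checksum to flag likely payment-card numbers while filtering out ordinary numeric identifiers; the function name is hypothetical:

```python
import re

def find_card_numbers(text: str) -> list:
    """Flag 13-16 digit runs that pass the Luhn checksum (likely card numbers)."""
    def luhn_ok(num: str) -> bool:
        total = 0
        for i, ch in enumerate(reversed(num)):
            d = int(ch)
            if i % 2 == 1:       # double every second digit from the right
                d *= 2
                if d > 9:
                    d -= 9
            total += d
        return total % 10 == 0

    return [m for m in re.findall(r"\b\d{13,16}\b", text) if luhn_ok(m)]

# A 13-digit order reference fails the checksum; a test card number passes.
print(find_card_numbers("order ref 1234567890123, card 4111111111111111"))
# → ['4111111111111111']
```

Real DLP suites layer many such detectors (keywords, fingerprints, exact data matching) on top of this idea, but the discovery step has to come before any enforcement policy can be meaningful.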

Instead, we treat the symptoms… like banning USB flash drives.


Category: Information Security, Next-generation Security Infrastructure

4 responses so far

  • 1 Adam Hils   September 24, 2009 at 9:19 am

    Neil, the proper term for preventing all USB drive data copying is “Glue In The USB Port” (GITUP).

    Even better than what you suggest is “tie rights to copy certain sensitive types of information onto encrypted USB drives to user identity”. Intelligent blanket policies are good, but customized-to-user policies are better.

  • 2 Neil MacDonald   September 24, 2009 at 2:16 pm

    @Adam,

    I’ve had clients use the glue trick as well as software-based controls to shut down the USB ports. A couple of things, though – they also block the USB port from doing good things, like attaching printers, booting failed machines, and meeting legitimate needs for moving files around. Also, what about CD/DVD, wireless, Bluetooth, infrared, GPRS, 3G modems, legacy parallel and serial ports … and so on. The USB port is just one of many, many ways data can leave a machine if a malicious person is determined.

    Of course, then you’ve got the network port – and email, web-based email, IM, ftp, …

    And the truly determined will simply print stuff out, take pictures of the screen, and so on.

    Security is always a balance. In the name of protecting ourselves from the 1 in 100 people who are out to get us, we can’t unnecessarily get in the way of the other 99 getting their jobs done.

    On user policies – absolutely – good point. Tied to user/group/role using identity is where this should go. “Engineering can copy CAD documents onto USB drives if the data is protected with DRM”– see this post
    http://blogs.gartner.com/neil_macdonald/2009/08/18/security-thought-for-tuesday-drm-and-dlp-are-not-separate-problems/

  • 3 Amichai Shulman   September 24, 2009 at 3:28 pm

    I know that I’ll be flagged as biased since I’m working for a WAF vendor, yet…

    While a Web Application Firewall does not obliterate the root cause of a vulnerability in an application, it does prevent its exploitation. Most times it does that before the organization is even aware of the vulnerability, and it does that in an effective (effort-wise) manner.

    Yes, we should always improve our software production processes. But would that make software bugs in production environments disappear? No!

    A web application firewall is the kind of tool that allows you to compensate for security-related software bugs, regardless of the quality of your software production process. That’s why it should be considered a strategic tool for organizations that want to maintain an efficient and effective software production process and at the same time maintain the security of their applications and servers.

    – Amichai

  • 4 Adam Hils   September 24, 2009 at 6:19 pm

    @Neil, of course I was having a little fun with glue-in-the-port. You’re right, even that is not foolproof.

    @Amichai – you may well be biased, but you’re also correct. WAFs are extremely useful tools, so long as developers aren’t perfect and bugs aren’t perfectly predictable. The WAF is not a substitute for a better production process, as you rightly point out.