Neil MacDonald

A member of the Gartner Blog Network

Neil MacDonald
VP & Gartner Fellow
15 years at Gartner
25 years IT industry

Neil MacDonald is a vice president, distinguished analyst and Gartner Fellow in Gartner Research. Mr. MacDonald is a member of Gartner's information security and privacy research team, focusing on operating system and application-level security strategies. Specific research areas include Windows security…

It’s Time to Redefine DLP as Data Lifecycle Protection

by Neil MacDonald  |  February 24, 2010  |  5 Comments

The acronym DLP, Data Loss Prevention, is really just a subset of a broader issue better described as “Data Lifecycle Protection”.

The latter is the real issue. The former is the symptom.

Perhaps we should have two acronyms — “dlp” and “DLP” respectively?

The importance of the broader meaning of DLP hit me again today while discussing a vendor’s offerings and strategy for data obfuscation/masking of databases. Their offering is used in non-production environments with static obfuscation, and in production databases with dynamic, real-time obfuscation at the database and application level. As part of their solution, they also offer the ability to crawl and identify sensitive data.
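To make the distinction between the two masking modes concrete, here is a minimal sketch of static obfuscation of a non-production record. This is my own illustration, not the vendor’s product or API; the field names and the `pseudonymize` token format are invented for the example:

```python
import hashlib

def mask_ssn(ssn: str) -> str:
    """Replace all but the last four digits of an SSN-style value,
    preserving the original formatting characters."""
    digits = [c for c in ssn if c.isdigit()]
    masked = ["X"] * (len(digits) - 4) + digits[-4:]
    it = iter(masked)
    return "".join(next(it) if c.isdigit() else c for c in ssn)

def pseudonymize(value: str, salt: str = "static-mask") -> str:
    """Deterministic substitution: the same input always yields the same
    token, so joins across masked tables still line up."""
    return "CUST-" + hashlib.sha256((salt + value).encode()).hexdigest()[:8]

record = {"name": "Alice Jones", "ssn": "123-45-6789"}
masked = {"name": pseudonymize(record["name"]), "ssn": mask_ssn(record["ssn"])}
print(masked["ssn"])  # XXX-XX-6789
```

Dynamic masking would apply the same kind of transformation at query time in production, rather than rewriting a copied dataset once.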

Sound familiar? It’s all about protecting data. But they don’t do what Symantec/Vontu, McAfee DLP, Trend Micro LeakProof, or any of the other dozen or so vendors in the traditional DLP space do. What they do is complementary.

I won’t fight the acronym soup, so I’ll just call the broader issue “data protection” and reiterate the conclusion I have reached:

Data protection is the process of identifying and understanding where and how sensitive information is created, consumed, processed, moved, shared, stored and retired and protecting it throughout this lifecycle.

There are myriad security controls and policy enforcement points that map to this process: full-drive encryption, file/folder encryption, content monitoring and filtering at email and web security gateways, application-level encryption, end-user activity monitoring, sensitive data discovery tools, digital rights management… and, yes, sure (why not?), even an IPS or AV scanner that is programmed to look for sensitive data.
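As one illustration of the “sensitive data discovery” control in that list, a crawler can be as simple as a regex scan over text. The patterns below are deliberately simplified stand-ins, not production-grade detectors:

```python
import re

# Simplified patterns; real discovery tools add validation (e.g., Luhn
# checks for card numbers) and surrounding context to cut false positives.
PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def scan(text: str) -> dict:
    """Return the count of each sensitive-data pattern found in text."""
    return {name: len(rx.findall(text)) for name, rx in PATTERNS.items()}

sample = "Contact: 123-45-6789, backup card 4111 1111 1111 1111"
print(scan(sample))  # {'ssn': 1, 'credit_card': 1}
```

The same scan logic can sit behind very different enforcement points, from an endpoint agent to a mail gateway, which is exactly why so many product categories end up in the data protection conversation.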

And now you can add data obfuscation/masking tools for consideration in your data protection process as well.


Category: Information Security

5 responses so far ↓

  • 1 Sam Van Ryder   February 25, 2010 at 9:58 am

    Neil, this is a fantastic assessment. It IS the real issue, and encryption is part of the solution – but it doesn’t really give you full control of the data itself. It seems to me that most people/solutions are focused on where the data should reside, rather than on the data itself. Since nobody has full control of the data, this naturally drives the need for Data Loss Prevention…

  • 2 uberVU - social comments   March 1, 2010 at 1:20 am

    Social comments and analytics for this post…

    This post was mentioned on Twitter by Exobox_Security: Data Loss Prevention or Data Lifecycle Protection? Gartner blog: http://digs.by/1VKb

  • 3 Lotfi Ben Othmane   March 29, 2010 at 8:40 am

    Agree that this is a good point. Data is currently seen as a static entity. In the old days, to move a cart you needed a horse to pull it. Then humankind invented the self-moving machine (if I can use the expression). We are experiencing something similar with data. Data is currently static and does not protect itself; it requires external entities to enforce its security. To protect data, we need to couple protection mechanisms with the data itself (similar to having an engine in the cart itself).

  • 4 Neil MacDonald   March 29, 2010 at 9:52 am

    Lotfi, thanks for the comment.

    Two comments:
    1) Rather than couple the protection mechanisms with the data, don’t we really just need to couple the policy with the data? The actual policy decision-making can be performed elsewhere; for example, encrypt the data and tag it with a policy governing decryption. However, even this is easier said than done. eDRM is still not mainstream despite the technology having been around for more than a decade.

    2) In your metaphor, consider the human occupants of the self-moving machine (the car). I use the protection mechanisms of the container (the engine-powered car) to protect me, the occupant (the sensitive information being carried). I am not self-protecting. I depend on the container to protect me, whether it’s an airplane, a car, an elevator, etc. Rarely do I need to be self-defending (e.g., wear a bullet-proof vest, carry a weapon and so on). Why isn’t the same true for data? Why can’t we depend on the containers (if we own them and trust them enough) to protect the data? Perhaps the notion of always defending all data all of the time is overkill… just like wearing a bullet-proof vest all of the time is overkill.

    food for thought.
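    As an aside, the “encrypt the data and tag it with a policy” idea in the comment above can be sketched in a few lines. This is purely illustrative and mine alone; the XOR stands in for a real cipher only to keep the example self-contained, and the envelope format is invented:

```python
def xor_crypt(data: bytes, key: bytes) -> bytes:
    """Toy cipher (XOR) standing in for real encryption; applying it
    twice with the same key recovers the plaintext."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def seal(plaintext: bytes, key: bytes, policy: dict) -> dict:
    """Couple the policy with the data: the envelope carries both."""
    return {"ciphertext": xor_crypt(plaintext, key).hex(), "policy": policy}

def open_envelope(envelope: dict, key: bytes, requester: dict) -> bytes:
    """The policy decision is made at access time, wherever the data
    has travelled, before any decryption happens."""
    policy = envelope["policy"]
    if requester.get("role") not in policy["allowed_roles"]:
        raise PermissionError("policy denies decryption")
    return xor_crypt(bytes.fromhex(envelope["ciphertext"]), key)

key = b"demo-key"
env = seal(b"salary data", key, {"allowed_roles": ["hr"]})
print(open_envelope(env, key, {"role": "hr"}))  # b'salary data'
```

    The point of the sketch is that the policy travels with the data while the policy decision point lives elsewhere, which is exactly the coupling (and the difficulty) eDRM has been wrestling with.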

  • 5 Lotfi Ben Othmane   March 31, 2010 at 11:31 am

    Thanks for the ideas and answers Neil.

    1) I agree that policies are required to protect data (that the owner wishes to protect). The use of policies also creates other problems, such as two policies in a set that partially contradict each other. I believe that coupling data with only policies is not enough. The question is: how can you make sure that the policies are applied? One solution is to use a trusted third party, which is what some DRM systems do. Is that user-friendly in an open environment such as the Internet? The answer is no.
    2) I think you have an important point here, which is that you need to “trust the container”. If you do not trust the container (so you expect bullets) and want to be safe, you need to wear the bullet-proof vest. It is overkill, but required in a hostile environment.