Data Loss Prevention Needs to Evolve

by Neil MacDonald  |  October 11, 2011  |  1 Comment

Traditional data loss prevention has focused on looking for signatures and patterns of sensitive data at rest within the organization and as it moves throughout the organization, including to destinations outside the enterprise (the latter is where most organizations have started).

<digress> You noticed I didn’t use the term “DLP”. That’s because I believe data loss prevention is just one of many controls that need to be mapped to a broader data lifecycle protection process, and that process is the real “DLP”. I digress – that’s another discussion… </digress>
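For concreteness, here is a minimal sketch of the pattern-matching core this kind of tool is built around. The two regular expressions and the function name are illustrative assumptions, not any vendor’s actual detection logic; real engines add fingerprinting, exact data matching and checksum validation on top of simple patterns.

```python
import re

# Illustrative patterns only (assumed for this sketch); real DLP engines use
# much richer detection than bare regular expressions.
SENSITIVE_PATTERNS = {
    "us_ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "payment_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def find_sensitive(text):
    """Scan a chunk of text (at rest or in motion) and return pattern hits."""
    hits = []
    for name, pattern in SENSITIVE_PATTERNS.items():
        for match in pattern.finditer(text):
            hits.append((name, match.group()))
    return hits

print(find_sensitive("payroll note: SSN 123-45-6789, card 4111 1111 1111 1111"))
```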

I had an interesting request from a client a while ago. They wanted to look through all of their file shares for inappropriate data. In their case, an employee had been discovered storing dozens of gigabytes of pirated music on the company’s servers, which represented a potential legal liability for the organization. The client wanted to search all of their repositories for potentially inappropriate data – such as music files, video files, sexually explicit images and so on. We already have data loss prevention tools that rummage through our systems looking for sensitive data; why not expand this capability to inappropriate data? Taking this further, how about inspecting source code files and scanning these for potentially unlicensed or insecure open source libraries? (Palamida, Black Duck and others provide this today as a point solution.)
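As a rough sketch of the “inappropriate data” half of that request: walk a file share and flag likely music and video files. The root path and the extension map are assumptions for illustration; a real scanner would also inspect file content (magic bytes), since extensions are trivially renamed.

```python
import os

# Assumed extension-to-category map for the sketch; content inspection would
# be needed in practice, since a renamed .mp3 slips straight past this check.
FLAGGED_EXTENSIONS = {
    ".mp3": "music", ".flac": "music",
    ".avi": "video", ".mkv": "video", ".mp4": "video",
}

def crawl_share(root):
    """Walk a repository and yield (path, category, bytes) for flagged files."""
    for dirpath, _dirnames, filenames in os.walk(root):
        for filename in filenames:
            ext = os.path.splitext(filename)[1].lower()
            if ext in FLAGGED_EXTENSIONS:
                path = os.path.join(dirpath, filename)
                yield path, FLAGGED_EXTENSIONS[ext], os.path.getsize(path)

# "/srv/fileshare" is a placeholder path for the example.
for path, category, size in crawl_share("/srv/fileshare"):
    print(f"{category:>5}  {size:>12,}  {path}")
```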

At the time, none of the data loss prevention tool vendors provided this capability, and I directed the client to the single enterprise third-party tool I was aware of that specialized in detecting inappropriate content.

I don’t see how these use cases are so different that they require different tools. Learn a data pattern or signature and look for it by crawling through data repositories. Could be sensitive, could be unlicensed, could be inappropriate – same problem. It seems like a security no-brainer for data loss prevention tools to evolve to support identifying potentially inappropriate data usage in addition to sensitive data usage.
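To make the “same problem” point concrete, here is a hedged sketch in which one shared crawl-and-classify pass applies interchangeable rule sets. Every rule and marker below is invented for illustration; the unlicensed-library check in particular is only a stand-in for what Palamida or Black Duck actually do.

```python
import re

# One invented rule per policy; the crawler and reporting stay identical,
# only the detection rules change -- which is the argument above.
RULES = {
    "sensitive": lambda name, data: bool(re.search(rb"\b\d{3}-\d{2}-\d{4}\b", data)),
    "inappropriate": lambda name, data: name.lower().endswith((".mp3", ".avi", ".mkv")),
    "unlicensed": lambda name, data: b"gpl_only_library" in data,  # made-up marker
}

def classify(name, data):
    """Return every policy category a single file trips in one shared pass."""
    return [category for category, rule in RULES.items() if rule(name, data)]

print(classify("payroll.txt", b"SSN: 123-45-6789"))            # ['sensitive']
print(classify("mixtape.mp3", b""))                            # ['inappropriate']
print(classify("vendor.c", b'#include "gpl_only_library.h"'))  # ['unlicensed']
```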


Category: Information Security, Next-generation Security Infrastructure, Security Intelligence

1 response so far ↓

  • 1 Matt   October 12, 2011 at 6:16 am

    A broad approach to data classification should yield information that not only feeds into a DLP strategy, but also encryption, data ownership and other lifecycle issues such as purging (as above – inappropriate) and tiering based on its age and use.