
DLP Without DLP!?

By Anton Chuvakin | December 03, 2014 | 11 Comments

security | philosophy | DLP

“Titanic” was a big ship (it also was compliant) and it was probably prestigious to be seen on its deck. However, if somebody were to tell you that it would sink soon, you would rapidly develop a need to part ways with it… Now, I am NOT saying that the data loss prevention (DLP) market is the “Titanic” – far from it. Our market measurements, done elsewhere at Gartner, still show projected growth (although it states that “DLP segment growth rates have been reduced by 9.7% and 9.9% for 2014 and 2015, respectively” due to issues like “complexity in deploying companywide DLP initiatives, value proposition realization failures and high costs”).

As we are updating GTP DLP research, I think I noticed a disturbing trend – organizations planning what is essentially a data loss prevention project without utilizing DLP technology.

Think about it! In some cases, the sequence of events is truly ridiculous and goes like this:

  1. DLP technology is purchased and deployed.
  2. The organization is breached and data is stolen.
  3. An anti-data-breach project is initiated.

Say what?! Has Anton lost his mind while vacationing in Siberia?

I assure you that this seemingly idiotic sequence of events is real at some organizations. At others, I observed that a project to “detect exfiltration”, “gain network visibility” or even directly “stop data losses” is initiated, and DLP technology is not considered central to it or even involved. In essence, they do DLP without DLP! This has seemingly caught some vendors between the desire to be present in the DLP market and the readiness to jump off (such as towards an adjacent market or even into the blue ocean of new market creation) upon seeing the first signs of an iceberg…

What does a DLP-less data loss project look like? As mentioned above, it may focus on exfiltration detection, network forensics/visibility (with a focus on outbound data transfers) or other network-centric security analysis. Indeed, if Sony really did lose 11TB of valuable data, the challenge is not with fancy content inspection, but with basic network awareness. Even a good SIEM consuming outbound firewall logs and an analyst watching the console will be very useful for this and will allow one to detect massive data losses – and sometimes in time to stop the damage…
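
To make that concrete, here is a deliberately minimal sketch (in Python) of the “SIEM consuming outbound firewall logs” idea: sum outbound bytes per internal host and flag the big senders. The log file name, the CSV column names and the 5 GB threshold are all assumptions made up for illustration, not any particular product’s schema; a real deployment would pull the data from the SIEM and baseline per host.

    # Toy illustration of “DLP without DLP”: flag hosts sending unusually large
    # volumes of data outbound, using nothing but an exported firewall log.
    # The CSV layout (src_ip, dst_ip, bytes_out) and the 5 GB threshold are
    # assumptions for this sketch, not any real product’s format.
    import csv
    import ipaddress
    from collections import defaultdict

    LOG_FILE = "outbound_fw_log.csv"        # hypothetical export from the firewall/SIEM
    ALERT_THRESHOLD_BYTES = 5 * 1024 ** 3   # 5 GB per source per log window; pick your own

    def is_internal(ip: str) -> bool:
        # Treat RFC 1918 / private addresses as "internal" for this sketch.
        return ipaddress.ip_address(ip).is_private

    bytes_by_source = defaultdict(int)

    with open(LOG_FILE, newline="") as f:
        for row in csv.DictReader(f):
            src, dst = row["src_ip"], row["dst_ip"]
            # Only count traffic leaving the organization: internal -> external.
            if is_internal(src) and not is_internal(dst):
                bytes_by_source[src] += int(row["bytes_out"])

    # Largest senders first; anything over the threshold deserves an analyst's attention.
    for src, total in sorted(bytes_by_source.items(), key=lambda kv: kv[1], reverse=True):
        if total >= ALERT_THRESHOLD_BYTES:
            print(f"ALERT: {src} sent {total / 1024 ** 3:.1f} GB outbound - investigate")

Crude as it is, a report like this would make an 11TB outbound transfer very hard to miss.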

Thoughts? Have you seen any “DLP without DLP” lately?




11 Comments

  • Andy says:

    We are in the process of building ‘data exfiltration monitoring’ using non-DLP tools. DLP is expensive, so we’ve been tasked with figuring out how to do this using our existing tools.

    Examples:

    Reports from proxy / mail gateway logs
    Reports from flow monitoring tools
    Reports from file server logs
    IDS rules to monitor for keywords in plaintext

    It is a fun exercise, but it definitely isn’t DLP.
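
    As a rough illustration of the last item on that list (keyword matching over plaintext), a minimal sketch might look like the following. The capture directory and the patterns are purely hypothetical, and real content inspection is far more involved than a couple of regexes:

    # Rough sketch of a "keywords in plaintext" report: grep captured outbound
    # message bodies (e.g., a mail-gateway dump) for markers such as "CONFIDENTIAL"
    # or SSN-shaped strings. Directory name and patterns are made up for illustration.
    import re
    from pathlib import Path

    CAPTURE_DIR = Path("outbound_messages")   # hypothetical plaintext capture directory
    PATTERNS = {
        "classification marking": re.compile(r"\b(CONFIDENTIAL|INTERNAL ONLY)\b", re.I),
        "SSN-like number": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    }

    for path in CAPTURE_DIR.glob("*.txt"):
        text = path.read_text(errors="ignore")
        for label, pattern in PATTERNS.items():
            hits = pattern.findall(text)
            if hits:
                print(f"{path.name}: {len(hits)} possible {label} match(es)")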

  • Seth Hall says:

    Hey Anton! Have you had a chance to play with Bro yet? It’s funny because almost every single time you talk about monitoring you seem to move closer and closer to what we’ve been building for years. 🙂

  • Sergio Nesti says:

    DLP has its purpose. As much as you think DLP doesn’t help, it really does as a tool to educate the well-meaning insider. DLP is not meant to catch the thief! It’s meant to catch the user who needs to know that what they are doing is wrong, and to provide a better way to do it.

    There have been numerous instances, working with the Secret Service or even with CISOs, of pinpointing what people do on a regular basis. If anything, DLP can still assist accurately (via fingerprinting), with the ability to deter bad usage of company assets, protect some crown jewels of a company and educate the users in the process. I always focus companies on baselining, refinement of policies, fingerprinting, notification, and then refining the policies again.

    This is a continuous process, not a one-time tool; that is the problem most of the time. In addition, DLP gets a bad rap because companies turn on 20-plus policies and then, when they have 300k incidents, complain that it is unmanageable.


  • @Andy Well, it is not DLP, but the question is: is it worse than DLP — or is it better? For example, the orgs that capture *ALL* outbound traffic and analyze it may in fact catch something that DLP misses….

    Log-based “poor-man’s DLP” has long been a model I’ve seen in use (since my SIEM vendor days), but now we have tools to do “rich man’s DLP without DLP”…

    @Seth I sure have, albeit not with a recent version. I suspect our visions are just naturally aligned 🙂

    @Sergio I think DLP does help, exactly as a real-time education tool. We have stated this very fact in our research, and thus upset folks who hoped that it can be a “breach-stopper”….

  • David says:

    @Andy – not all DLP solutions are expensive… if all you have seen pricing for are the familiar security names like RSA, Symantec, McAfee, and Verdasys, I can understand the perspective. We’ve been called “DLP-Lite” due to our lack of expensive perimeter appliances, gateways, and, until recently, a data-at-rest discovery option, but DeviceLock’s endpoint-based DLP is priced reasonably according to your endpoint volume, and you can add different modules as you need to or when budget allows. Our fully licensed agent filters peripheral port/device traffic, printing, and network traffic at the endpoint’s IP stack, so we don’t need the appliances/gateways and performance is not bottlenecked there either. Sure, there are some coverage gaps for non-Windows systems, WiFi-only and unmanaged device participants, but it is managed via AD Group Policy (GPOs, through an MMC snap-in console), so the implementation costs are almost nothing. Give the free trial a try…

  • Mark says:

    To expand on some earlier comments, the reality is you need fully developed and inspected internal processes to effectively fight all forms of Data Loss (malicious and misuse).

    DLP solutions are very effective at identifying and stopping unwanted Data Loss transactions if they are properly implemented and maintained. Remember, a majority of Data Loss events are not induced by the malicious hacker, but are everyday business situations where a trusted resource either misuses information or makes an error (usually 80-90% of an organization’s risk profile). All the logs and log analysis in the world will not stop this from occurring. So you need a proactive tool.

    Asking analysts to read through even more reports (which most companies fail to do now, because 1) it’s tedious and 2) they don’t have enough qualified human intelligence, i.e. people) is a failing proposition. We have been talking about better security intelligence from the correlation of log data as an effective security process for over a decade. Let’s get real – few if any companies can effectively and sustainably do it. Automated risk reduction through a properly implemented DLP is proven to lower an organization’s risk.

    So invest in logging technology and processes that will fail (as proven time and time again). Or invest in DLP technology that will provide a meaningful result… Suddenly, a DLP investment doesn’t sound as expensive anymore!

    P.S. Yes, read and analyze your logs. It’s good fundamental security.

  • @Mark Indeed, processes/practices are central to DLP, but sadly a lot of DLP tech deployments completely lack them. Even to catch accidental/non-mal losses, process is still needed.

    However, “box-only/no-process DLP” is very, very likely to fail for most if not all attempts at usage.

  • Cheri Keith says:

    @Anton—nice post. You’ve certainly hit a nerve to get a good conversation going!

    Let’s all keep the focus on what’s important here. The key is that these products prevent data from being lost. Whether they are called “data loss prevention” products or something else isn’t where we’ll focus.

    Our focus is keeping the data out of all of the wrong hands, even if that expands the traditional definition of DLP. I think the industry is at the point that we’re seeing the absolute necessity of data protection.

    The attackers are getting increasingly savvy and no one technology can do it all. “Traditional DLP” plus expanded DLP plus information from the security ecosystem is an equation in the good guys’ favor.
    I look forward to continuing to follow this discussion.

  • @Cheri Thanks for the comment! I agree that stopping/detecting losses is what matters, and that products and practices must enable that. Whether it is called DLP, DLP+, something else or – more likely – all of the above is indeed secondary to the problem at hand…

  • Michael Gabriel says:

    One of the main benefits of DLP is to restrict an organization’s most critical data to just those places where it is required for the proper execution of defined business processes. By creating sensitive data boundaries, an organization can reduce its sensitive data footprint, thus making it a smaller target for malicious activity while restricting the scope of controls in a manner similar to creating a Payment Card Environment in a PCI engagement.

  • @Michael Thanks for the comment. Indeed, internal data control (vs exfil) is where DLP makes perfect sense.