Gartner Blog Network


Comparing UEBA Solutions

by Augusto Barros  |  November 28, 2016  |  5 Comments

As Anton anticipated, we’ve started working on our next research cycle, now with the intent of producing a comparison of UEBA (User and Entity Behavior Analytics) solutions. We produced a paper comparing EDR solutions a few months ago, but so far the discussion on how to compare UEBA solutions has been far more complex (and interesting!).

First, while for EDR we focused on comparing how the tools fare against five key use cases, for UEBA the use cases are basically all the same: detecting threats. The difference is not only in which threats should be detected, but also in how the same threats are detected. Many of these tools have some focus on internal threats (if you count “pseudo-internal” threats too, ALL of them do), and there are many ways you could detect those. A common example across these tools: detecting an abnormal pattern of resource access by a user. That could indicate that the user is accessing data he/she is not supposed to access, or even that credentials were compromised and are being used by an attacker to access data.

But things are even more complicated.

Did you notice that “abnormal pattern of resource access” there?

What does it mean? That’s where tools can do things in very different ways, arriving at the same (or at vastly different) results. You can build a dynamic profile of the things a user usually accesses and alert when something outside that list is touched. You can also do that considering additional variables for context, like time, source (e.g. desktop or mobile), application and others. And why stop at profiling only the individual user? Should an access still be considered anomalous if the user’s peers usually access that resource? OK, but who are the user’s peers? How do you build a peer list? Point to an OU in AD? Or learn it dynamically by grouping people with similar behaviors?
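To make the idea concrete, here is a minimal sketch of that profiling approach on a toy access log. The users, resources, Jaccard similarity measure, and the 0.5 peer threshold are all illustrative assumptions, not how any particular vendor does it:

```python
from collections import defaultdict

# Hypothetical access log of (user, resource) pairs -- illustrative data only.
ACCESS_LOG = [
    ("alice", "hr-db"), ("alice", "payroll"), ("alice", "wiki"),
    ("bob", "hr-db"), ("bob", "payroll"), ("bob", "wiki"),
    ("carol", "source-repo"), ("carol", "build-server"), ("carol", "wiki"),
]

def build_profiles(log):
    """Per-user baseline: the set of resources each user normally touches."""
    profiles = defaultdict(set)
    for user, resource in log:
        profiles[user].add(resource)
    return profiles

def jaccard(a, b):
    """Overlap between two access sets (0 = disjoint, 1 = identical)."""
    return len(a & b) / len(a | b) if a | b else 0.0

def peers_of(user, profiles, threshold=0.5):
    """Learn peers dynamically: users whose access sets overlap enough."""
    mine = profiles[user]
    return {other for other, theirs in profiles.items()
            if other != user and jaccard(mine, theirs) >= threshold}

def is_anomalous(user, resource, profiles):
    """Flag an access outside both the user's and the peer group's baselines."""
    if resource in profiles[user]:
        return False
    peer_sets = [profiles[p] for p in peers_of(user, profiles)]
    return resource not in set().union(*peer_sets)

profiles = build_profiles(ACCESS_LOG)
print(is_anomalous("alice", "payroll", profiles))    # in her own baseline: False
print(is_anomalous("bob", "source-repo", profiles))  # outside bob's and his peers': True
```

Note how the peer group falls out of the data itself (bob and alice cluster together, carol does not) rather than from an org chart, which is exactly the design choice the questions above are about.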

(while dreaming about how we can achieve our goal with this cool “Machine Learning” stuff, let’s not forget you could do some of this with SIEM rules alone…)
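For contrast with the learned baselines above, a SIEM-rule-style check is just a fixed condition with no profiling at all. A hypothetical example in Python, where the sensitive-resource list and business-hours window are made-up assumptions:

```python
from datetime import datetime

# Static, correlation-rule-style logic: no learned baseline, just fixed
# conditions. The resource list and hours are illustrative assumptions.
SENSITIVE_RESOURCES = {"payroll", "hr-db"}
BUSINESS_HOURS = range(8, 19)  # 08:00-18:59

def rule_fires(user, resource, timestamp):
    """Alert on sensitive-resource access outside business hours."""
    return (resource in SENSITIVE_RESOURCES
            and timestamp.hour not in BUSINESS_HOURS)

print(rule_fires("alice", "payroll", datetime(2016, 11, 28, 3, 12)))  # True
print(rule_fires("alice", "payroll", datetime(2016, 11, 28, 10, 0)))  # False
```

The trade-off is clear: the rule is cheap and explainable, but it never adapts, while the profiling approach adapts but is harder to tune and to explain.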

So, we can see how one single use case can be implemented differently by the different solutions. How do we define what is “better”? This is pretty hard, especially because there’s nothing like AV-TEST available to test these different methods (models, algorithms, rules… the taxonomy alone is crazy enough).

So what can we do about it? We need to talk to users of all these solutions and get data from the field about how they are performing in real environments. That’s fine. But after that we need to figure out, for both good and bad feedback, how those results map to each solution’s feature set. If clients of solution X are happy about how great it is at detecting meaningful anomalies (by the way, that’s another topic for a future blog post: which anomalies are just anomalies, and which are meaningful from a threat detection perspective), we need to figure out what in X makes it good for that use case, so we can find which features and capabilities matter (and which are just noise and unnecessary fluff). Do I need to say we’ll be extremely busy over the next couple of months?

Of course, we could also use some help here: if you’ve been through a bake-off or a comparison between UEBA tools, let us know how you did it; we’d love to hear about it!

Category: behavior-analytics  threat-detection  

Tags: new-research  ueba  

Augusto Barros
Research Director
1 year at Gartner
19 years IT Industry

Augusto Barros is Research Director in the Gartner for Technical Professionals (GTP) Security and Risk Management group. Read Full Bio


Thoughts on Comparing UEBA Solutions


  1. Andre Gironda says:

    There is a taxonomy to describe the unintentional insider and/or trusted insider from the perspective of a Windows endpoint, such as MITRE Caret — https://car.mitre.org/caret/ — which is really just a simplification of initial-infection vector (e.g., https://github.com/infoassure/sysmon-logger-client ) from SysMon or similar. Especially combined with EMET and Autoruns (or future-forward EMET-like capabilities from Lotan — http://www.leviathansecurity.com/lotan), SysMon events will allow responders to understand anomalies in user and entity behaviors. Sure, you could work a flash-cut migration of that at the OU layer, but ideally with maximum coverage.

    Of course, none of the above requires a UBA, UEBA, or SIEM commercial product — and understanding user-entity relationships can be done using additional FOSS tools such as adaptivethreat/BloodHound. Orgs need to spend money on people to learn and operate these FOSS tools, not commercial-product platforms.

  2. […] Comparing UEBA Solutions (by Augusto) […]




