Monday, November 28, 2016

From my Gartner Blog – Comparing UEBA Solutions

As Anton anticipated, we’ve started working on our next research cycle, now with the intent of producing a comparison of UEBA (User and Entity Behavior Analytics) solutions. We produced a paper comparing EDR solutions a few months ago, but so far the discussion on how to compare UEBA solutions has been far more complex (and interesting!).

First, while for EDR we focused on comparing how the tools would fare against five key use cases, for UEBA the use cases all boil down to the same thing: detecting threats. The difference is not only in which threats should be detected, but also in how the same threats are detected. Many of these tools have some focus on internal threats (if you count "pseudo-internal" too, ALL of them focus on internal threats), and there are many ways you could detect those. A common example across these tools: detecting an abnormal pattern of resource access by a user. That could indicate the user is accessing data he/she is not supposed to access, or even that credentials were compromised and are being used by an attacker to access data.

But things are even more complicated.

Have you noticed that "abnormal pattern of resource access" there?

What does it mean? That's where tools can do things in very different ways, arriving at the same (or at vastly different) results. You can build a dynamic profile of the resources a user usually accesses and alert when something outside that list is touched. You can also do that considering additional variables for context, like time, source (e.g. from desktop or from mobile), application and others. And why should we stop at profiling only the individual user? Should the access still be considered anomalous if the user's peers usually access that resource? OK, but who are the user's peers? How do you build a peer list? Point to an OU in AD? Or learn it dynamically by grouping together people with similar behaviors?
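To make that concrete, here is a minimal sketch of those two ideas: a per-user baseline with contextual variables, plus a peer group learned from behavior rather than pulled from AD. This is an illustration of the approach, not any vendor's implementation; the class, thresholds, and context fields are all assumptions made up for the example.

```python
from collections import Counter, defaultdict
import math

class AccessProfiler:
    """Per-user baseline of (resource, context) access counts."""

    def __init__(self, min_events=50, rarity_threshold=0.01):
        self.user_hist = defaultdict(Counter)  # user -> Counter of (resource, context)
        self.min_events = min_events           # don't judge an immature baseline
        self.rarity_threshold = rarity_threshold

    def observe(self, user, resource, context):
        # context could be (hour_bucket, source_type, application), as discussed above
        self.user_hist[user][(resource, context)] += 1

    def is_anomalous(self, user, resource, context, peers=()):
        hist = self.user_hist[user]
        total = sum(hist.values())
        if total < self.min_events:
            return False  # not enough history to call anything abnormal yet
        if hist[(resource, context)] / total >= self.rarity_threshold:
            return False  # the user touches this resource often enough
        # Rare for this user, but is it routine for the peer group?
        for peer in peers:
            peer_hist = self.user_hist[peer]
            peer_total = sum(peer_hist.values())
            if peer_total and peer_hist[(resource, context)] / peer_total >= self.rarity_threshold:
                return False  # normal among peers, so probably benign
        return True

def dynamic_peers(profiler, user, k=5):
    """Learn peers from behavior instead of pointing at an AD OU:
    rank other users by cosine similarity of their access histograms."""
    def cosine(a, b):
        num = sum(a[x] * b[x] for x in set(a) & set(b))
        den = (math.sqrt(sum(v * v for v in a.values()))
               * math.sqrt(sum(v * v for v in b.values())))
        return num / den if den else 0.0

    me = profiler.user_hist[user]
    scored = [(cosine(me, hist), other)
              for other, hist in profiler.user_hist.items() if other != user]
    return [other for _, other in sorted(scored, reverse=True)[:k]]
```

Even this toy version surfaces the design questions above: what counts as "rare", how much history is enough, and whether a peer's normal behavior should suppress an alert.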

(while dreaming about how we can achieve our goal with this cool "Machine Learning" stuff, let's not forget you could do some of this with SIEM rules alone…)
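By contrast with the sketch above, the "SIEM rules alone" version of the same detection is essentially a hand-maintained list check. Again purely illustrative, with made-up resource names and no real product's rule syntax:

```python
# Illustrative only: a static, rule-style version of the same detection.
# Resource names and authorized sets are invented for the example.
SENSITIVE = {"hr-payroll-share", "crm-export-api"}
AUTHORIZED = {"hr-payroll-share": {"alice", "hr_svc"},
              "crm-export-api": {"crm_admin"}}

def rule_alert(user, resource):
    # Fires when someone outside the authorized set touches a sensitive resource.
    return resource in SENSITIVE and user not in AUTHORIZED.get(resource, set())
```

The rule is cheap and predictable, but the lists must be maintained by hand, which is exactly the gap the profiling approaches are trying to close.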

So, we can see how one single use case can be implemented in very different ways by different solutions. How do we define what is "better"? This is pretty hard, especially because there's nothing like AV-TEST available to test these different methods (models, algorithms, rules… the taxonomy alone is crazy enough).

So what can we do about it? We need to talk to users of all these solutions and get data from the field about how they perform in real environments. That's fine. But after that we need to figure out, for both good and bad feedback, how it maps to each solution's feature set. If clients of solution X are happy about how well it detects meaningful anomalies (by the way, this is another thing we'll discuss in a future blog post: which anomalies are just that, and which ones are meaningful from a threat detection perspective), we need to figure out what in X makes it good at that use case, so we can find which features and capabilities matter (and which are just noise and unnecessary fluff). Do I need to say we'll be extremely busy over the next couple of months?

Of course, we could also use some help here; if you've been through a bake-off or a comparison between UEBA tools, let us know how you did it; we'd love to hear about it!
