My security analytics paper has finally been published [BTW, one more is coming soon, focused on a DIY approach!], but I wanted to share one more post on the topic. If you need to read up on this, either get the paper [Gartner GTP access required] or read the blog series linked below (the paper is about 31 pages and has more useful structures for types of tools and approach selection than the blogs, as well as more vendor details).
So, in any case, the point is: there is not enough data on the operational effectiveness (does it work for scenario X?) and efficiency (how well will it work for my problems?) of any of those new analytics tools! Specifically, one of the paper’s key findings is: “at this time, there is not enough data on the comparative effectiveness of various analytic approaches and algorithms (implemented in vendor tools) versus current, real-world threats and problems.”
This is actually very important to realize if you are planning to buy one of those tools. There is neither effectiveness nor comparative efficiency data! What we have is vendor anecdotes like “retailer X deployed our MagicUBA 2.0 tool and – abracadabra!! neural networks!! suffix trees!! – detected a compromised user account on their VPN box!” or “a client had a SIEM [that nobody ever looked at] and it totally failed to detect this, but our new GigaWombat 1.2 Analytics Appliance did – with gusto.” Even if we assume – in a bout of idealistic malaise – that all such stories are true, we still have no data, since “the plural of anecdote is not data.”
Those accumulating anecdotes also remind us that you are likely buying “an indicator machine,” not “an answer machine.” This means the tool may be effective at raising a flag on something potentially important, rather than telling you what it really means and what to do about it. For example, one client expressed excitement over his shiny “analytics appliance” and made it sound like it really saved his bacon. As the conversation progressed, he mentioned that the machine produces a 90% “false positive” rate. Wait…what? How can he be happy with that? Easy, he said: the machine produces 10 alerts, 9 are bogus, irrelevant or immaterial – and 1 saves his bacon and cannot be obtained from any other tool he has … So WIN!
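The client's reasoning above is really just base-rate arithmetic: a 90% false positive rate can still be a net win if triage is cheap and the one true detection is valuable enough. Here is a minimal sketch of that trade-off; all the dollar figures and the helper function are illustrative assumptions, not numbers from the client or any vendor.

```python
# Toy illustration of the "90% false positives, still a WIN" reasoning.
# All cost/value numbers below are hypothetical assumptions for illustration.

def alert_batch_value(total_alerts, true_positives,
                      cost_per_triage, value_per_true_positive):
    """Net value of triaging a batch of alerts:
    value of real detections minus the cost of triaging everything."""
    triage_cost = total_alerts * cost_per_triage
    detection_value = true_positives * value_per_true_positive
    return detection_value - triage_cost

# The anecdote: 10 alerts, 9 bogus, 1 real.
# Assume (hypothetically) triage costs ~$100 of analyst time per alert,
# and the one real detection averts ~$50,000 in incident costs.
net = alert_batch_value(total_alerts=10, true_positives=1,
                        cost_per_triage=100, value_per_true_positive=50_000)
print(net)  # 49000 -> a 90% FP rate is still a clear win here
```

Of course, the same arithmetic flips against you when alert volume grows or the detections stop being unique to this tool, which is exactly why the "indicator machine vs answer machine" distinction matters.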
Furthermore, this problem is exacerbated by the lack of a clear testing methodology during the POC stage. Sure, it may happen that when you deploy the technology on your production network for a quick test run, all the NSA / FSB / PLA / MI6 implants will be revealed … but – honestly! – what are the odds of that? In reality, what will come to light are minor infractions (like shared account usage, ex-employee access, etc.) that may – or may not! – prove that the tool will instantly detect an APT when it shows its sneaky nose on your network. As a result, you need to test longer and definitely on production IT assets. Do inquire with your “analytics vendor” about it; some may in fact have interesting answers!
What it all means is that you are buying on faith. There is nothing wrong with that, by the way, but it is useful to be honest about it…
An alternative? The BUILD route, of course, if you can pull it off. As you iteratively build the analytics capabilities, you also learn – over time – what works and what doesn’t in your particular environment and under your changing circumstances …
P.S. Now, some of you may opine that this problem exists for other security tools as well (heh, we don’t even know whether anti-virus works).
Blog posts on the security analytics topic:
- My “Demystifying Security Analytics: Sources, Methods and Use Cases” Paper Publishes
- The Future Is Here … And It Is … Network? Endpoint?
- Now That We Have All That Data What Do We Do, Revisited
- Who Validates Alerts Validated by Your Alert Validator Software?
- Killed by AI Much? A Rise of Non-deterministic Security!
- Those Pesky Users: How To Catch Bad Usage of Good Accounts
- Security Analytics Lessons Learned — and Ignored!
- Security Analytics: Projects vs Boxes (Build vs Buy)?
- Do You Want “Security Analytics” Or Do You Just Hate Your SIEM?
- Security Analytics – Finally Emerging For Real?
- Why No Security Analytics Market? <- important read for VCs and investors!
- SIEM Real-time and Historical Analytics Collide?
- SIEM Analytics Histories and Lessons
- Big Data for Security Realities – Case 4: Big But Narrowly Used Data
- Big Data Analytics Mindset – What Is It?
- Big Data Analytics for Security: Having a Goal + Exploring
- More On Big Data Security Analytics Readiness
- Broadening Big Data Definition Leads to Security Idiotics!
- 9 Reasons Why Building A Big Data Security Analytics Tool Is Like Building a Flying Car
- “Big Analytics” for Security: A Harbinger or An Outlier?