Gartner Blog Network

On SIEM Deployment Evolution

by Anton Chuvakin  |  August 24, 2012

Is your SIEM stuck in the past? Is it “mature”? Is it evolving? Is it solving one problem or many? Is it collecting logs or collecting dust? This post continues our journey into SIEM deployment architecture and SIEM operational processes.

First, if your SIEM architecture was built in, say, 2003, and it has been solving the original problem (e.g. monitoring user access to servers or reducing IDS/IPS “false positives”) successfully, there is absolutely nothing wrong with that. It may not be “evolving,” but so what? Hopefully, you are not paying 20% of $1m every year for FP reduction, but other than that a “static” SIEM deployment is not inherently worse than a “dynamic” one. It may be “static” or “in maintenance mode,” but if it makes you happy, who am I to argue with that?

However, there seems to be a non-trivial number of SIEM deployments that are stuck in various non-productive stages. Let’s review them.

  • Stuck in collection: it often happens when people plan a SIEM project in a “horizontal” manner – all collection first, all analysis later (rather than on a use case by use case basis). The end result of this is a nice log collection system at 10x the price of a nice log collection system.
  • Stuck in compliance: it happens when people buy a SIEM “to check the box” and never start using it for anything beyond scaring the auditors. The end result is one REALLY expensive checkbox.
  • Stuck at “problem 1 solved, N to go”: it may occur when an organization builds a SIEM deployment, solves the initial problem (as it should), but then something breaks – maybe staff turns over, the security team gets downsized, the consulting budget runs out – and the deployment focus shifts to maintaining the “status quo.” The end result is a missed opportunity – and the eventual realization: “We paid WHAAAAT to get this taken care of?!!!”
  • Stuck in investigations: nowhere near as harmful as the previous ones, this happens when a SIEM is mostly used to investigate – rather than detect – incidents, since the organization never matures to the security monitoring stage. A common end result of such deployments is the SIEM being replaced by a well-known log search tool.

On the other hand, the best path for a SIEM deployment is from success to success, with constant refinement and expansion. There is nothing more motivating than a sequence of “quick wins,” solved problems, value realized, etc. It helps retain security personnel, unlock budget, refine processes, and improve collaboration and integration – and ultimately creates a self-fulfilling prophecy of a successful security monitoring program. In essence, it works like a bicycle – you are happy only if you pedal and thus move forward.

What is the way to get there? A SIEM program self-health check of some sort. Just as we check “are the logs flowing in?”, we should also check “is the value being delivered?” AND “what else can we solve next?” If no value is seen, what can we change, add, subtract, refine, or improve so that it does show up? (Hint: it is rarely the product itself.) If no obvious next step comes to mind, ask around the organization. This process (oversimplified here to keep it blog-sized) will definitely help you “run your SIEM well.”
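The self-health check described above can be sketched as a simple recurring checklist. This is purely an illustrative sketch – the check questions and remediation strings below are hypothetical examples, not part of any SIEM product or official methodology:

```python
# Illustrative sketch of a periodic SIEM "self-health check."
# All check names and remediation text are hypothetical examples.

from dataclasses import dataclass

@dataclass
class HealthCheck:
    question: str       # what the team asks during the review
    passed: bool        # result of the latest review
    remediation: str    # what to change/add/refine if the check fails

def run_health_check(checks):
    """Return the remediation items for every failed check."""
    return [c.remediation for c in checks if not c.passed]

# Example review: collection works, but value and a next step are missing.
checks = [
    HealthCheck("Are the logs flowing in?", True,
                "Fix collection: agents, parsers, source coverage."),
    HealthCheck("Is the value being delivered?", False,
                "Refine rules, triage, and processes - rarely the product itself."),
    HealthCheck("What else can we solve next?", False,
                "Ask around the organization for the next problem to solve."),
]

todo = run_health_check(checks)
for item in todo:
    print(item)
```

Running the review on a schedule (and acting on the `todo` list) is what keeps the deployment pedaling forward rather than stuck in one of the stages above.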


Category: logging  monitoring  security  siem  

Tags: security  security-monitoring  siem  

Anton Chuvakin
Research VP and Distinguished Analyst
8 years with Gartner
19 years IT industry

Anton Chuvakin is a Research VP and Distinguished Analyst at Gartner's GTP Security and Risk Management group. Before Mr. Chuvakin joined Gartner, his job responsibilities included security product management, evangelist…


Comments or opinions expressed on this blog are those of the individual contributors only, and do not necessarily represent the views of Gartner, Inc. or its management. Readers may copy and redistribute blog postings on other blogs, or otherwise for private, non-commercial or journalistic purposes, with attribution to Gartner. This content may not be used for any other purposes in any other formats or media. The content on this blog is provided on an "as-is" basis. Gartner shall not be liable for any damages whatsoever arising out of the content or use of this blog.