
More on SIEM Maturity – And Request for Feedback!

By Anton Chuvakin | July 14, 2014 | 5 Comments


During my original SIEM architecture and operational practices research (see the paper here and a presentation here), I looked at the topic of SIEM operational maturity. Organizations that purchase and deploy SIEM technologies are at different stages of their IT and information security maturity (such as when measured by Gartner ITScore for Security). Certain security monitoring goals are extremely hard to achieve at lower maturity stages (such as “hunting” when you can barely collect data); they also frequently remain unachievable unless the organization climbs every step of the maturity ladder leading up to them [so, no jumping stages].

The key purpose of this maturity scale is to evolve a SIEM deployment toward getting more value out of it at higher stages of the scale. SIEM team members can also use it to make sure that specific operational processes are in place as the deployment evolves from stage to stage. For example, enabling alerts without an alert triage process and an incident response process in place is usually counterproductive and ends in frustration. Note that all the processes from lower stages must remain in place as SIEM deployment maturity grows.

Here is the current version of the table:

Table 7. SIEM Maturity Scale

| Stage No. | Maturity Stage | Key Processes That Must Be in Place (inclusive of previous stages) |
|---|---|---|
| 1 | SIEM deployed and collecting some log data | SIEM infrastructure monitoring process; log collection monitoring process |
| 2 | Periodic SIEM usage, dashboard/report review | Incident response process; report review process |
| 3 | SIEM alerts and correlation rules enabled | Alert triage process |
| 4 | SIEM tuned with customized filters, rules, alerts and reports | Real-time alert triage process; content tuning process |
| 5 | Advanced monitoring use cases, custom SIEM content, niche use cases (such as fraud or threat discovery) | Threat intelligence process; content research and development |

Source: Gartner (January 2013)

SIEM team members may also choose to add a Stage 0 (“tool deployed, no process”) and possibly higher stages, which are sometimes seen at security-mature, “Type A” organizations (with such exciting activities as data modeling process, visual data exploration process, use-case discovery process and so on).
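
To make the “inclusive of previous stages” idea concrete, here is a minimal sketch (Python, purely illustrative and not part of any Gartner deliverable) that encodes the scale above, including the optional Stage 0, and reports the highest stage for which all cumulative processes are in place. The shortened process names and the assess_stage helper are my own hypothetical shorthand, not an official checklist.

```python
# Illustrative only: encode the SIEM maturity scale and check which stage's
# cumulative process requirements an organization currently satisfies.

# Processes newly required at each stage (Stage 0 = tool deployed, no process).
STAGE_PROCESSES = {
    0: [],
    1: ["SIEM infrastructure monitoring", "log collection monitoring"],
    2: ["incident response", "report review"],
    3: ["alert triage"],
    4: ["real-time alert triage", "content tuning"],
    5: ["threat intelligence", "content research and development"],
}


def assess_stage(processes_in_place):
    """Return the highest stage whose cumulative requirements are all met.

    Requirements are cumulative: to be at stage N, every process listed for
    stages 1..N must be in place (no jumping stages).
    """
    in_place = {p.lower() for p in processes_in_place}
    achieved = 0
    required_so_far = set()
    for stage in sorted(STAGE_PROCESSES):
        required_so_far.update(p.lower() for p in STAGE_PROCESSES[stage])
        if required_so_far <= in_place:
            achieved = stage
        else:
            break
    return achieved


if __name__ == "__main__":
    # Example: alerting may be switched on, but no alert triage process
    # exists yet, so the deployment is still assessed at Stage 2.
    current = [
        "SIEM infrastructure monitoring",
        "log collection monitoring",
        "incident response",
        "report review",
    ]
    print(assess_stage(current))  # -> 2
```

Obviously, a real assessment would weigh process quality rather than mere presence, but the cumulative check mirrors the “no jumping stages” point above.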

At this point, I’d like to ask for your feedback and improvement suggestions.

Should I add dimensions to the maturity table, such as essential personnel skills, typical tool components deployed and utilized, and use cases common at each stage?

In any case, feel free to share your suggestions below in the comments, via email or via whatever social media venue you happen to frequent.




5 Comments

  • Matthew Gardiner says:

    For starters, I would make the maturity levels less tied to specific data sources (such as logs) and less tied to specific analytic techniques (such as correlation), and frame them in terms like “gain improved visibility” (various logs being a potential source of visibility) and “improve detection” (through analytics and the use of threat intelligence). Furthermore, I would make the three-legged stool of maturity more explicit: people, processes, and technology improving together at the same time. If you recall, you reviewed my paper on this, which is directly relevant to what you raise in this blog post.

    http://www.emc.com/collateral/white-papers/h12651-wp-critical-incident-response-maturity-journey.pdf

  • @Matt

    Thanks for the comment. I am absolutely changing that to tie maturity to commonly integrated log sources and use cases. 100% of SIEM deployments [I’ve seen] that started from SAP logs died after suffering greatly 🙂

    I have used the 3D maturity approach in other papers, and it may indeed be a good idea to apply it here as well. Thanks for this idea too.

  • Prabhat says:

    Is it OK to say that data collection is generally reduced once maturity is reached? Collection is not big at the start of a SIEM deployment and will certainly grow, but it might become less useful if it gets too big. What I mean is: should we assume the SIEM consumes all of the available data? Just inferring from what you have mentioned about SAP logs.

  • Prabhat says:

    Also, at the same time, I think SIEM maturity is an ongoing process; every step is continuously corrected, refined and updated.

  • @Prabhat

    >Is it OK to say that data collection is generally reduced
    >once maturity is reached?

    No, the opposite. Overall people tend to collect more data over time.
    Both breadth (more source types) and depth (more messages from each source) tend to go up.