What Cybersecurity Metrics Should I Report to My Board?

By Paul Proctor | April 18, 2022 | 5 Comments

This is one of The Three Unanswered Board Questions That Drive Cybersecurity Investment.

I’ve reviewed hundreds of cybersecurity metrics programs over the last 15 years. I’ve stated repeatedly, and confidently, two things:

  • No one can give you your list of metrics.
  • You should not use operational metrics with executive decision makers.

I was wrong.

It turns out I can tell you exactly what your metrics should be… and ironically, they are operational metrics. They are not particularly complex or sophisticated; they just measure the right thing: value.

See Value is Missing in Executive Communication on Cybersecurity

Gartner’s construct for outcome-driven metrics (ODMs) is ideal to measure cybersecurity value. ODMs measure a direct line-of-sight to protection levels (value) expressed as an operational outcome.

For example, "number of days to patch critical systems" is an ODM for threat and vulnerability management. It is an operational outcome in which we can directly invest, and it has a direct line of sight to the value proposition of patching, which is to reduce the amount of time that vulnerabilities are available for exploitation.
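As a rough illustration, that ODM could be computed from patch deployment records. The record layout and function below are hypothetical, not part of Gartner's benchmark definition:

```python
from datetime import date

def avg_days_to_patch(patch_records):
    """Average days from vulnerability disclosure to patch deployment
    across critical systems. Record structure is illustrative only."""
    gaps = [(patched - disclosed).days for disclosed, patched in patch_records]
    return sum(gaps) / len(gaps)

# Hypothetical (disclosed, patched) date pairs, one per critical system.
records = [
    (date(2022, 3, 1), date(2022, 3, 8)),    # 7 days
    (date(2022, 3, 5), date(2022, 3, 19)),   # 14 days
    (date(2022, 3, 10), date(2022, 3, 13)),  # 3 days
]
print(avg_days_to_patch(records))  # 8.0
```

The point is the direct line of sight: spend more on patching operations and this number goes down, which maps one-to-one to shrinking the window of exploitability.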

Gartner has more than 100 outcome-driven metric examples across 20 control classes that all share the same characteristics for measuring value delivery. They represent operational outcomes with a direct line of sight to the protection levels (value) created by the controls they measure.

We are benchmarking 20 of these.

We are doing a lot of metrics reviews with our clients. We can identify metrics that are OK as-is, ones that can be improved with the right characteristics, and ones you should just throw away because they’re worthless. Many of the ones we would identify as good are hidden because nobody understands their value.

You're wasting your time on metrics that don't guide priorities or investments in security, or don't put them in a business context for your board. That's one acid test for the value of a metric.

A second acid test is: are these metrics influencing any decision making? Because if they’re not, again, you’re wasting your time.

Enough Already, Just Give Me the Metrics

Here are 5 examples of cybersecurity value delivery metrics you should give to your board. Gartner clients have access to 20 of these that are being benchmarked globally, plus a catalog of more than 100 across 20 cybersecurity control classes.

Time to Remediate Incidents: What is your average time (in hours) between incident ticket generation and ticket close for “critical & high priority” security incidents?

OS Patching Cadence (Standard): What is your average time (in days) to apply critical operating system patches within your standard patch process?

Risky 3rd Parties Engaged: What percentage of known third parties with poor security assessment results have been engaged by the organization?

Phishing Reporting Rates: What is your percentage of people who report suspicious emails for your standard organization-wide phishing campaigns?

Recovery Testing – Core Systems: What is your percentage of core systems supporting critical business/mission functions that have successfully completed full recovery testing in the last 12 months?
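To make two of these definitions concrete, here is a minimal sketch of how they might be computed from raw data. The ticket structure and field names are my own assumptions for illustration, not the benchmark's exact definitions:

```python
from datetime import datetime

def avg_hours_to_remediate(tickets):
    """Average hours between ticket generation and ticket close for
    critical/high-priority security incidents. Field names are
    illustrative, not the benchmark's exact scoping."""
    relevant = [t for t in tickets if t["priority"] in ("critical", "high")]
    hours = [(t["closed"] - t["opened"]).total_seconds() / 3600
             for t in relevant]
    return sum(hours) / len(hours)

def phishing_reporting_rate(reported, recipients):
    """Percentage of campaign recipients who reported the suspicious email."""
    return 100.0 * reported / recipients

# Hypothetical incident tickets; the low-priority one is excluded.
tickets = [
    {"priority": "critical",
     "opened": datetime(2022, 4, 1, 9, 0),
     "closed": datetime(2022, 4, 1, 21, 0)},   # 12 hours
    {"priority": "high",
     "opened": datetime(2022, 4, 2, 8, 0),
     "closed": datetime(2022, 4, 3, 8, 0)},    # 24 hours
    {"priority": "low",
     "opened": datetime(2022, 4, 2, 8, 0),
     "closed": datetime(2022, 4, 9, 8, 0)},    # excluded from the metric
]
print(avg_hours_to_remediate(tickets))     # 18.0
print(phishing_reporting_rate(120, 1000))  # 12.0
```

In practice, as the comments below note, the hard part is agreeing on exactly when the clock starts and stops so that organizations answer the same question.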

Webinar and Benchmark Release

April 20, 2022, 11:00 AM ET: The webinar "Make Cybersecurity a Priority Business Investment" addresses this topic and more. It is open to everyone and will be available for replay.

June 10, 2022, 9:00 AM ET: At the Friday morning keynote of our Security and Risk Management Summit in National Harbor, MD, I will release the benchmark definitions for our first-generation cybersecurity value delivery benchmark, along with early data gathered from a small number of organizations.

We will also release the survey to all conference participants to fill out their own data and get the earliest version of our official benchmark.

For my blog readers who are not Gartner clients, I will write a blog post with whatever version of the definitions and data we are releasing publicly. I expect a version of the survey to be generally available to gather data and give the respondents a foundation benchmark in return, but the details of that are still being worked out.

Follow me on Twitter
@peproctor

5 Comments

  • Great article Paul, of course the metrics matter, but I find most organizations struggle with the simple task of a clean and easy to use reporting medium to get the most value from the metrics.

    Most GRC technologies are severely lacking in the area of reporting, and while some solve this via a Power BI integration or canned reports, it's often a kludgy solution.

    I’d love to walk you through the Reporting & Analytics Suite of 6clicks if you’d be willing to carve out a few minutes for us.

    Otherwise, keep up the great writing, I enjoy your posts!

    Warm regards,
    Michelle

  • Alex Lawrence says:

    Is it not the case that time-to-remediate is a mess of an issue, driving bad and sloppy remediations when you get into the meat of operations around this reportable? Phishing reporting rates can also be less than wonderful: they are overly represented in the metrics industry, since psychologically, beyond a certain response rate, people run a mile when they spot a campaign and just ignore it. There are a few other issues with some of these metrics… are they representative of a mature state, or of someone just setting out in measuring quantitative risk?

  • Paul Proctor says:

    The devil is in the details. We’ve been working with our global benchmark team to create the necessary context so everyone answers the same question with sufficient detail for comparison. For example, we identified 5 different points to start and stop the clock for incident remediation time. We are very clear in the benchmark survey to eliminate the “messiness”.

    With respect to other challenges: we know metrics like "click-through rates" don't have as much value as other phishing metrics, and there are other phishing metrics in the benchmark and the catalog to offer choices. But on some level you have to use what is best known and understood by the audience, even if it isn't the highest-value metric. Everyone knows what a click-through rate means… but when I get into reporting rates, I have to explain it… and then you lose them.

  • This looks very interesting and needed by our Executive Education Team.

  • Arjun says:

    Thanks for the interesting blog post, Paul. I would appreciate if you could elaborate on the “Risky 3rd Parties Engaged” metric. Are you referring to external products (such as SaaS) with poor assessment results?