During our work to refresh existing Vulnerability Management and Vulnerability Assessment research papers (here and here – GTP access required), we (Anton Chuvakin and I) talked with vendors in the VA space and also with many organizations at very different levels of VM maturity. An interesting thing we noticed is how much what is considered “best practice” in VA and VM has NOT been changing. That’s right. In a world as dynamic as infosec, we have a group of practices done the same way as 1, 5 or even 10 years ago. Sure, there are a few changes here and there, but most processes, recommendations and even tools are not very different from what we used to consider best practices in the past. What does that mean? Have we reached a plateau (should I say “optimized level”? Commoditization?) in those practices?
Actually, should we be seeing any change?
I was expecting deeper changes caused by the adoption of cloud computing, mobile technologies and DevOps practices, but it seems those things have just not been enough to disrupt the VA/VM world. I believe that is because those technologies and trends, although well advanced in their Hype Cycles, are still not widely used by organizations of all types and sizes. Yes, many organizations have stuff running in the cloud, but for a vast number it’s just an exploratory thing, with no production or critical systems involved. For those, dealing with the challenges of doing VA in the cloud is not a problem big enough to make them review their entire VM program.
If the external drivers are not strong enough yet, my next question is about the VA/VM market itself: why is it not innovating enough to change the best practices? Well, there is some innovation. There are vendors trying to bring modern data analytics and integration with Threat Intelligence to vulnerability prioritization. Is this enough innovation? It certainly helps organizations trying to find out what to fix next, but it is not market disruption material. Just another evolutionary step.
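To make that kind of threat-intelligence-weighted prioritization concrete, here is a minimal Python sketch. The identifiers, field names and weights are hypothetical, not any vendor’s actual model; the point is simply that a severity score gets adjusted by evidence that a vulnerability is actually being exploited.

```python
# Hypothetical sketch of threat-intel-weighted prioritization: start from the
# CVSS base score and boost findings that have a public exploit or have been
# seen exploited in the wild. IDs, field names and weights are illustrative only.

findings = [
    {"cve": "CVE-XXXX-0001", "cvss": 9.8, "exploit_public": False, "seen_in_wild": False},
    {"cve": "CVE-XXXX-0002", "cvss": 7.5, "exploit_public": True,  "seen_in_wild": True},
    {"cve": "CVE-XXXX-0003", "cvss": 6.1, "exploit_public": True,  "seen_in_wild": False},
]

def priority(finding):
    score = finding["cvss"]
    if finding["exploit_public"]:
        score += 2.0  # weights are arbitrary, for illustration only
    if finding["seen_in_wild"]:
        score += 3.0
    return score

# The actively exploited CVE-XXXX-0002 now outranks the higher-CVSS CVE-XXXX-0001.
for f in sorted(findings, key=priority, reverse=True):
    print(f"{f['cve']}: priority {priority(f):.1f}")
```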
Eventually we should also ask: Should we bother? Apart from a few radical points of view, most of us understand VM/VA as a critical component of Security. You must do it, and doing it well generally means less risk to the organization. But once it is being done decently, is it worth trying to do it better or does it make more sense to invest the money and resources in other security practices?
I believe this is the key answer to the lack of innovation in Vulnerability Management. Most organizations are doing it, but if you ask them where they should be investing the next security dollar, they will tell you it is not there.
There is no external factor breaking everything and forcing us to redesign how we do it. There is no money out there to be spent on the coolest new VM/VA product. And finally, let’s confess, it’s not the sexiest thing to work with. With all that together, I’m really not surprised that VM/VA is still the same after all these years.
3 Comments
VM does not integrate well with cyber exercises or intelligence-led cyber risk programs. The scoring systems and databases do not map well to the patterns and practices used by power analysts in the cyber intelligence space.
One of the key missing areas is threat quantification, or even any mapping of risk towards threats. What we are learning is that it really should be threats towards risk, i.e., one can’t calculate or score risk without quantification of threats. This leaves CVSS, including v3, in the dark: a solution looking for a problem.
Another is data aggregation of what cyber exercise professionals need to know. VM doesn’t give it to them. It doesn’t map well using existing systems, even through a GRC portal that has been mapped to an intelligence-led program.
Credential-to-vuln mapping is one of these areas, but there are many more. With the popularization of credential stuffing by threat actors, we have not yet seen peripheral areas, such as common account takeover (ATO), solved in VM solutions (e.g., brute-force automation, et al.). Some others I can think of are infrastructure-to-web (or vice versa), web-to-db, data classifiers, etc. And what of cloud? What of mobile devices? Mobile apps? Web services, microservices, SOA, micro apps, DevOps, infrastructure as code, and containers? How about social media security?
No; VM, rather, is an outdated solution looking for problems it will never solve. I think Nessus and OpenVAS add unique characteristics to cyber exercises in terms of the vuln-to-exploit cycle. Most other tools in the VM space do not. NeXpose’s only value-add is the forward-looking work in fingerprinting (N.B., long overdue). Threat intel analysts will need these tools, but many can get by without them by using web or FOSS tools such as SHODAN, Nmap, MSF, and Arachni. A variety of FOSS front-ends to these tools are way ahead of the commercial equivalents (e.g., aschmitz/nepenthes, RAWR, OWTF, sixdub/Minions, infobyte/csan) – and the FOSS aggregation tools (e.g., LAIR-framework, Dradis, KvasirSecurity, doreneanu/appvulnms, ThreadFix, discover.sh) are even better. Who would consider even Kenna, a SaaS-based one-day answer to VM, when all of these FOSS tools make a mockery of what is otherwise thought to be commercially dominated?
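As a rough illustration of the FOSS-centric workflow this comment describes, the sketch below wraps a single Nmap run (using its “vuln” NSE script category) so the raw report can be handed off to an aggregation tool. It assumes Nmap is installed and that the target is one you are authorized to scan; the hostname is a placeholder.

```python
# Minimal sketch: run Nmap's "vuln" NSE script category against an authorized
# target and capture the report for later aggregation (e.g., in Dradis or
# ThreadFix, as the comment suggests). The target below is a placeholder.
import subprocess

target = "scanme.nmap.org"  # placeholder; scan only hosts you are authorized to test

result = subprocess.run(
    ["nmap", "-sV", "--script", "vuln", target],
    capture_output=True,
    text=True,
    check=False,
)

# The plain-text report can then be pushed into whatever aggregation layer
# the team uses instead of a commercial VM portal.
print(result.stdout)
```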
The ad-hoc intelligence analysts and cyber exercise professionals won’t buy or recommend VM any longer — and they now run the show with support from NIST CSF, OWASP OpenSAMM, and various other innovations. The new VM is just regular-old cyber intelligence: data aggregation using platforms such as Lumify.io, Splunk Enterprise, and perhaps Hadoop and other evolutions. There is no place for a 2004 VM portal solution, even if it connects to your GRC. It’s over. That’s why no investments have been made and why nothing has changed.
Probably your best blog post so far 🙂
Good to see the area being questioned – you’re so right in saying it’s not sexy, therefore not spoken about at conferences, and flies under the radar, which is the main reason toolsets are as bad as they are.
There are no best practices, and certainly no Best Practices.
There is room for innovation but not in the areas mentioned (fair play to the author, there is no assertion as such in the article).
The unauthenticated space cannot improve apart from maybe throwing in better delta analysis and removing “heuristic” and other areas of radical guesswork from toolsets. So the unauthenticated space can improve by removing things, not adding them.
The authenticated space is where there is huge room for improvement, mainly around false negatives. And let’s stop providing managed services in this area without big red warnings in upper case saying “we have your configs and root passwords”.
This VM/VA thing is only about basics, and it has been over-complicated. With cloud and ESX and app-centricities, we’ve drifted away from operating systems and configurations and tin. These are not 1998 concepts; they are still very real. Clouds are still made of tin; they are not ethers out there somewhere.
The common approach to authenticated toolsets is basically “let’s just code a CIS benchmark”. Has anyone ever read one of those things? They are free for a reason. No; but generally the extent of tests has to be improved: Windows is well covered, others are not.
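For readers unfamiliar with what “code a CIS benchmark” looks like in practice, here is a minimal sketch of that style of authenticated check: read a configuration file and compare a handful of settings against an expected baseline. The file path and expected values are illustrative examples only, not actual CIS benchmark items.

```python
# Illustrative sketch of a CIS-benchmark-style authenticated check:
# parse a config file and flag settings that differ from an expected baseline.
# Path and expected values are examples only, not a real benchmark item.

EXPECTED = {
    "permitrootlogin": "no",
    "passwordauthentication": "no",
}

def check_sshd_config(path="/etc/ssh/sshd_config"):
    settings = {}
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#"):
                continue
            parts = line.split(None, 1)
            if len(parts) == 2:
                settings[parts[0].lower()] = parts[1].strip().lower()
    issues = []
    for key, expected in EXPECTED.items():
        actual = settings.get(key)
        if actual != expected:
            issues.append(f"{key}: expected '{expected}', found '{actual}'")
    return issues

if __name__ == "__main__":
    for issue in check_sshd_config():
        print("MISCONFIGURATION:", issue)
```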
So the motivation should be there to improve, because Oracle databases (that’s critical infrastructure and 40% of the global market) are not covered by VA/VM. Windows coverage is about 70 to 85% of where it should be, so boxes like DCs could have some critical misconfigurations that are currently invisible.
Thanks for asking these questions, I loved the article.