Most managers rely on qualitative guidance from “heat maps” that describe their vulnerability as “low” or “high” based on vague estimates that lump together frequent small losses and rare large losses. But this approach doesn’t help managers understand whether they have a $10 million problem or a $100 million one, let alone whether they should invest in malware defenses or email protection. As a result, companies continue to misjudge which cybersecurity capabilities they should prioritize and often buy insufficient cybersecurity insurance protection.
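The distinction between frequent small losses and rare large losses can be made concrete with a simple frequency-and-severity simulation. The sketch below is purely illustrative, not a method from the article: it assumes hypothetical parameters (about 12 phishing incidents a year with a median loss of $50,000, versus roughly one major breach per decade with a median loss of $20 million) and draws event counts from a Poisson distribution and per-event losses from a lognormal one to produce a dollar-denominated annual loss profile rather than a “low/high” label.

```python
import math
import random

def poisson(mean, rng):
    """Knuth's Poisson sampler; adequate for the small means used here."""
    limit, k, p = math.exp(-mean), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def annual_loss(freq_mean, sev_median, sev_sigma, rng):
    """One simulated year: event count ~ Poisson, loss per event ~ lognormal."""
    mu = math.log(sev_median)  # lognormal median = exp(mu)
    n = poisson(freq_mean, rng)
    return sum(rng.lognormvariate(mu, sev_sigma) for _ in range(n))

def loss_profile(freq_mean, sev_median, sev_sigma, trials=20_000, seed=1):
    """Mean annual loss and 95th-percentile ('bad year') loss over many trials."""
    rng = random.Random(seed)
    losses = sorted(annual_loss(freq_mean, sev_median, sev_sigma, rng)
                    for _ in range(trials))
    mean = sum(losses) / trials
    p95 = losses[int(0.95 * trials)]
    return mean, p95

# Hypothetical threat categories -- all parameters are illustrative assumptions:
small = loss_profile(12.0, 50_000, 1.0)        # frequent, small losses
large = loss_profile(0.1, 20_000_000, 1.5)     # rare, large losses
print(f"frequent-small: mean ${small[0]:,.0f}/yr, 95th pct ${small[1]:,.0f}")
print(f"rare-large:     mean ${large[0]:,.0f}/yr, 95th pct ${large[1]:,.0f}")
```

Even this toy model shows why the two loss types shouldn’t be lumped together: the frequent-small category produces a steady, budgetable expected loss, while the rare-large category is dominated by a small fraction of catastrophic years, which is the part that insurance and capital decisions hinge on.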
No institution has the resources to eliminate cyber risk entirely, which makes helping businesses choose which threats to mitigate all the more important. Yet right now these decisions are made with an incomplete understanding of what the various vulnerabilities could cost. Organizations often fail to account for the full range of repercussions, and they have only a weak grasp of how investments in controls reduce the likelihood of a threat: are they stopping it outright or merely lowering its probability, and if the latter, by how much?
It’s essential that companies develop the capability to quantify their cyber risk exposure in order to form strategies to mitigate that risk. The question is whether it’s really possible to put a dollar figure on fast-changing cyber risks using data that is difficult to find and often even harder to interpret.
Quantifying cyber risks is challenging – but feasible
Leslie Chacko is a San Francisco-based principal and Claus Herbolzheimer is a Berlin-based partner in Oliver Wyman’s Digital and Strategic IT practices. Evan Sekeris is a Washington, DC-based partner in Oliver Wyman’s Financial Services practice.
This article first appeared in Harvard Business Review.