A post from Anton Aylward, called “Why I don’t see the need for elaborate Risk Analysis,” came to me via Anton Chuvakin.
“Standards” like ISO-17799/27001 and ITIL aren’t trying to do anything more than lead people through a process to make them deal with the basic good practices. When they talk of things like Risk Analysis they are trying to get people to think about risk and their risk posture, and that is, all too often, sadly, something most firms don’t seem to have got around to.
And then Anton basically offers that until you can do the “baseline” of “good” practices, don’t bother with “esoteric” risk analysis.
Some things that jump out at me:
First, Can we stop pretending to be more intellectually honest by using “good” instead of “best” practices?
Despite the seeming rhetoric to the contrary, a list of universally accepted “good” or “best” practices doesn’t exist. At best, practitioners use them the way Justice Potter Stewart described pornography: “I can’t tell you what the best practices are, but I know one when I see one.” At worst, they are created on the fly to justify an ad-hoc risk analysis (playing cyber-cop): “best practices say you can’t do that, neener, neener, neener.”
As this blog has mentioned before, the entire concept of “good practice” is simply a lazy man’s risk analysis. Inasmuch as it would seem good and professional to do as Donn Parker suggests and hope that we could be standardized like accounting principles, the reality is that security is far too dynamic for that analogy to work (which is why I always find it odd that good practices are offered as a remedy by the same people who suggest that risk analysis doesn’t work because attackers are “asymmetric”. I’m not sure this asymmetry is relevant to the study of risk, but we’ll talk about that some other day).
So “good” practices aren’t really “good” at all. If you want to be really honest, call them “lazy” practices. I would argue that the concept of “X practices” is more esoteric, and more removed from our specific realities, than our ability to account for uncertainty in the metrics we have for the factors that make up risk.
Second, Risk Analysis (or Risk Assessment or Risk Management for that matter) Is Not Vulnerability Management
This is a nitpick of mine, but thinking that risk belongs only where ISO 27001 says it does is silly. Using risk only where the ISO tells you to is even sillier. It’s simply an immature view of risk’s relevance to the Security Program, one that this blog has suggested is a relic of our myopic focus on vulnerability assessment. Risk Analysis/Assessment/Management is not tacking some “ease of exploit combined with loosey-goosey BIA data” onto a vulnerability management cycle. In fact, those are three different things that should not be confused. If you’re doing traditional threat/vulnerability pairing, and/or not using frequency, you’re not doing risk analysis. If you’re not judging the maturity of processes and the capability of process actors, you’re not doing risk management. And if you are looking at discrete assets while ignoring the interrelated nature of networks, you need to find a better way to assess risk.
As the kids say these days, “You’re Doing it Wrong”. Ok, actually, we’re (mostly) doing it wrong.
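To make the frequency point concrete, here is a minimal sketch of what a frequency-based loss estimate looks like, as opposed to a threat/vulnerability checklist: loss event frequency and loss magnitude are each treated as a distribution and combined by simulation. This is only an illustration of the idea, not any particular framework’s method; the function names and every input number are hypothetical.

```python
import math
import random
import statistics

def sample_poisson(rng, lam):
    """Knuth's method: sample an event count from a Poisson(lam) distribution."""
    threshold = math.exp(-lam)
    count, product = 0, 1.0
    while product > threshold:
        count += 1
        product *= rng.random()
    return count - 1

def simulate_annual_loss(freq_lambda, loss_mu, loss_sigma, trials=10_000, seed=1):
    """Monte Carlo estimate of annual loss: frequency (events/year) times magnitude."""
    rng = random.Random(seed)
    totals = []
    for _ in range(trials):
        events = sample_poisson(rng, freq_lambda)           # how often, not just "if"
        loss = sum(rng.lognormvariate(loss_mu, loss_sigma)  # how bad, per event
                   for _ in range(events))
        totals.append(loss)
    return totals

# Hypothetical inputs: ~0.5 loss events/year, median single-event loss around $22k.
totals = simulate_annual_loss(freq_lambda=0.5, loss_mu=10.0, loss_sigma=1.0)
print(f"mean annual loss: ${statistics.mean(totals):,.0f}")
print(f"95th percentile:  ${sorted(totals)[int(0.95 * len(totals))]:,.0f}")
```

In practice the distributions would be fitted from incident data or calibrated estimates; the point is only that “risk” here is an estimate built from frequency and magnitude, not a severity label glued onto a vulnerability scan.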
Third, Anton Is Asking For Risk Management
That last bit above about understanding maturity and capability is really quite important. What Anton Aylward is saying is that the maturity of your organization matters. Yes, yes it does, and much more so than I think Anton realizes.
Too many times I see Risk Analysis confused with Risk Management. Too many times I see Risk Management confused with discrete risk issue analysis (ahem, ISO, I’m looking at you). The “What Is Risk Management” question is too large for this post (or any single blog post), but your capability to manage risk encompasses an understanding of all the interrelated factors (not FAIR factors, kids) of program management. Among those factors are the maturity of process (“Do we understand that which the business is supposed to be doing?”) and the maturity of our capability to perform that process (“Do we have any clue as to how to manage, in skills and resources, our part of that process?”).
Anton’s assertion is that if our gut tells us our capability to manage risk is really poor, a risk analysis is superfluous. I would back up and offer that if we have poor risk management, then risk analysis is necessary in order to find out where the uncertainties lie, and what should then be done to manage risk properly. Here’s the rub: if you’re doing “best” practices, you’re really just using someone else’s risk analysis, and you have no idea whether it is relevant to you.
Finally, I’m against Elaborate Risk Assessment
If it means the once-every-18-months super Risk Assessment, that is. Good analysis should be done several times a day. A good framework for analysis will change the way the professional approaches their job: it trades the cyber-cop and the law of best practice for a scientific approach. Risk analysis must be done well to be useful, and it should be done frequently. We’re doing it wrong.