Guest Post by Geary Sikich (first posted on CERM ® RISK INSIGHTS – reposted here with permission)
Awareness of risk can lead to unforeseen risk behaviors when knowledge seems convincing enough to produce false positives.
“The more you know, the more you know you don’t know.”
― attributed to Aristotle
Introduction
Knowledge is a door opening onto the understanding of risk; the risk of knowledge is knowing how much you do not know. Unfortunately, we have a very limited understanding of where risk is or where risk is going to materialize. Here is a small excerpt from “I, Pencil” by Leonard E. Read. The reason for this example is that we have all used pencils at some point in our lives. The pencil is a simple implement, right?
“I am a lead pencil – the ordinary wooden pencil familiar to all boys and girls and adults who can read and write. Writing is both my vocation and avocation; that’s all I do.
Simple? Yet, not a single person on the face of the earth knows how to make me. That sounds fantastic, doesn’t it? Especially when it is realized that there are about one and one-half billion of my kind produced in the U.S.A. each year.
Pick me up and look me over. What do you see? Not much meets the eye – there’s some wood, lacquer, the printed labeling, graphite lead, a bit of metal, and an eraser.”
Consider the complexity of the pencil and then think about the complexity of risk. A pencil looks rather simple, yet when you analyze it, its components become a maze of complexity. Risk is much the same. Risk may appear simple and straightforward. Yet, when you analyze risk, you begin to realize the complexity of what you are looking at. Few really comprehend this complexity and, as such, risk is often simplified and discounted.
Yare
Yare, a word of Old English origin, serves as the header for this section. It is described below:
MEANING: adjective: 1. Easily maneuverable; nimble. 2. Ready; prepared.
ETYMOLOGY: From Old English gearo/gearu (ready). Earliest documented use: 888.
USAGE: “I do desire to learn, sir; and, I hope, if you have occasion to use me for your own turn, you shall find me yare.” – William Shakespeare; Measure For Measure; 1604.
“She was a ‘bonnie lass’ in the words of her chief engineer; she was faithful, she was yare — an unlikely compliment for a vessel without sails.” – D.C. Riechel; German Departures; iUniverse; 2009.
How nimble is your risk management and/or business continuity program? Is it based solely on compliance with regulations? Do you really understand the risks you identified when you did your risk assessment and/or business impact analysis? Or have you created “false positives” and a transparent vulnerability?
Another example from “I, Pencil”:
“My “lead” itself – it contains no lead at all – is complex. The graphite is mined in Ceylon [Sri Lanka].
The graphite is mixed with clay from Mississippi in which ammonium hydroxide is used in the refining process. Then wetting agents are added such as sulfonated tallow – animal fats chemically reacted with sulfuric acid. After passing through numerous machines, the mixture finally appears as endless extrusions…”
In assessing just one component of the pencil – the “lead” – we see the complexity, and the fact that the “lead” is not really lead at all. A question should be stirring in your brain: what risks, business impacts, etc. have we mislabeled because we did not analyze them in sufficient depth? Do we assess the volatility of risk? How about the velocity of risk? More importantly, does our program – risk and/or business continuity – have the ability to provide us with early warning of risk realization?
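The “volatility” and “velocity” of risk can be given concrete, if simplified, form. Below is a minimal Python sketch – purely illustrative, with a hypothetical risk indicator and arbitrarily chosen alert thresholds – that treats volatility as the dispersion of period-to-period changes in an indicator and velocity as its most recent rate of change:

```python
# Illustrative sketch only: quantifying the volatility and velocity of
# a risk indicator tracked over time. The readings and the alert
# thresholds are hypothetical, not drawn from any real program.
from statistics import stdev

# Monthly readings of some risk indicator (hypothetical scores, 0-100).
readings = [12, 14, 13, 18, 25, 24, 31, 45]

# Period-to-period changes in the indicator.
changes = [b - a for a, b in zip(readings, readings[1:])]

volatility = stdev(changes)  # dispersion of the changes
velocity = changes[-1]       # most recent rate of change

print(f"volatility: {volatility:.2f}, velocity: {velocity:+d}")

# A crude early-warning rule: flag when the indicator is moving both
# fast and erratically (thresholds chosen arbitrarily here).
if velocity > 5 and volatility > 4:
    print("early warning: risk indicator accelerating and unstable")
```

The point is not these particular numbers; it is that a program which tracks such measures at least has a chance of early warning, while a program that files its risk register away does not.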
BYOD – Bring Your Own Disaster
The lessons of history are rather clear. Things seem to go along fine until the “Kerblamo!” moment. Right up until that event (kerblamo) there are innumerable voices who go right on saying all is well. And then everyone is greatly surprised when the event unfolds. How knowledgeable are you when it comes to disaster management? Do you actively coordinate with local authorities? Do you know where your organization fits into the critical infrastructure protection plans? How about “Executive Orders”? Did you know:
- The cedar in a pencil receives six coats of lacquer;
- The labeling is a film formed by applying heat to carbon black;
- The ferrule is brass;
- The eraser (termed “the plug”) is composed of an ingredient called “factice,” which actually does the erasing.
We must realize that the limitations of our knowledge will be exposed when we implement our plans in response to events. This is partly because we cannot predict how an event will unfold. It is also because the focus of our risk management and business continuity activities leads us to do less in-depth analysis.
Risk and Compliance
Risk management is not compliance; however, compliance can serve as a basis for the management of risks. A risk management program that overlooks compliance, or underplays the significance of being in compliance, puts the enterprise at risk. That said, risk and the managing of risk are not directly related to compliance; rather, risk management is about ensuring that the organization’s strategy, goals and objectives are achieved by buffering risks before they are realized.
Are you driving the car while looking in the rearview mirror?
Many risk management practitioners are not able to recognize a risk until it has been realized – hopefully by another organization. Much will be said about this statement, I am sure. But let’s face it: we are generally not able to recognize risks until we are responding to an event. Our ability to forecast the probability of occurrence is just as dismal. Hence the sightings of so many “Black Swans” based on wishful thinking and misinterpretation of information (see my article “Black Swans or just wishful thinking and misinterpretation?” at www.ContinuityCentral.com).
Unfortunately, the problem is not that risk managers are simply mediocre at what they do. The problem is that business leaders trust them to manage risks – those that are recognized and those that are yet to be recognized. It is one thing to be wrong; it is quite another to be consistently and confidently wrong.
Risk modeling, determining probabilities (mathematical algorithms), scenario gaming and compliance all have limitations that most fail to understand. Models depend on theories about how risks are supposed to manifest themselves. Probabilities are measurements or estimations of the likelihood that an event will occur. Scenario gaming is a simulation exercise used to play out a set of events, the evolution of a market, or some other situation in advance of the actual events occurring. A fundamental concept behind war gaming is that the dynamic aspects of business are often hard to describe. Compliance is an after-the-fact exercise in reaction to the last catastrophe.
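To make those limitations concrete, consider what a probability estimate is in practice. The following minimal Monte Carlo sketch – the loss distributions, parameters and threshold are all invented for illustration – estimates the likelihood of exceeding a loss threshold, and shows that a model can only report on the scenarios someone thought to encode:

```python
# Minimal Monte Carlo sketch of scenario gaming: estimate the chance
# that combined annual losses exceed a threshold. The distributions
# and parameters are invented; a real model inherits every blind spot
# of the theory that chose them.
import random

random.seed(42)
TRIALS = 100_000
THRESHOLD = 1_000_000  # hypothetical loss threshold, in dollars

exceedances = 0
for _ in range(TRIALS):
    # Frequent, small operational losses (assumed roughly lognormal).
    routine = random.lognormvariate(10, 1.0)
    # A rare, large event: an assumed 2% chance per year.
    rare = random.lognormvariate(13, 0.5) if random.random() < 0.02 else 0.0
    if routine + rare > THRESHOLD:
        exceedances += 1

print(f"P(loss > {THRESHOLD:,}) ~ {exceedances / TRIALS:.4f}")
# A risk omitted from the loop contributes exactly zero to the
# estimate - which is precisely the limitation described above.
```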
We tend to focus our attention on compliance, which is essentially like driving a car forward while looking in the rearview mirror. This causes us to focus on the wrong things – i.e., solving the wrong problem precisely. This is often due to a reliance on incomplete data that is generally subject to revision (sometimes massive revision). If we think about risk and risk data or information, we see that it falls into three categories (a small illustrative sketch of the distinction follows the list):
- Lagging Data: The road behind, where you have been, etc. Compliance is a classic example of lagging data applied to current situations.
- Concurrent or Coincident Data: Real time, the present situation, where you are right now; a constantly changing mosaic of information and noise that requires active analysis and constant monitoring to buffer risks.
- Leading Data or Indicators: The road ahead, future possibilities; recognizing the potential consequences of an event to the enterprise and its touchpoints. Think competitive intelligence here.
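As a small illustrative sketch of the distinction, the hypothetical Python fragment below tags a program’s risk inputs by category, making visible how much of the picture is rearview mirror versus road ahead (the input names are invented):

```python
# Hypothetical sketch: tagging risk inputs by data category so a
# program can see how much of its picture is lagging, coincident
# or leading. All entries are invented examples.
from dataclasses import dataclass
from enum import Enum

class Category(Enum):
    LAGGING = "lagging"        # the road behind (e.g. audit findings)
    COINCIDENT = "coincident"  # real time (e.g. operations monitoring)
    LEADING = "leading"        # the road ahead (e.g. competitive intelligence)

@dataclass
class RiskInput:
    name: str
    category: Category

inputs = [
    RiskInput("last year's audit findings", Category.LAGGING),
    RiskInput("regulatory compliance checklist", Category.LAGGING),
    RiskInput("live supply-chain status feed", Category.COINCIDENT),
    RiskInput("competitor capacity-expansion reports", Category.LEADING),
]

# How balanced is the program across the three categories?
for cat in Category:
    count = sum(1 for i in inputs if i.category is cat)
    print(f"{cat.value:>10}: {count} of {len(inputs)} inputs")
```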
Uncertainty
Fundamental uncertainties derive from our fragmentary understanding of risk and of complex system dynamics and interdependencies. Abundant stochastic variation in risk parameters further hampers our ability to assess uncertainties clearly.
Uncertainty is not a single dimension; it also surrounds the potential impacts of forces such as globalization and decentralization, the effects of movements in global markets and trade regimes, and the effectiveness and utility of risk identification and control measures such as buffering, the use of incentives, or strict regulatory approaches.
Such uncertainty underpins the arguments both of those exploiting risk, who demand evidence that exploitation causes harm before accepting limitations, and of those avoiding risk, who seek to limit risk realization even in the absence of clear indications of sustainability.
Events are nonlinear and therefore carry uncertain outcomes. Rare events and their evolution – their randomness, their shapeshifting fluctuations, and the way reactive responses underestimate their true consequences – add an opacity to risk management that compliance cannot overcome. Opacity is the quality of being difficult to understand or explain; risk is opaque, whereas compliance is fairly straightforward and prescriptive, as regulations are.
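A toy illustration of why nonlinearity matters, using an invented convex damage curve: if consequences grow faster than event magnitude, then extrapolating linearly from routine events – which is what reactive response amounts to – systematically understates the rare ones.

```python
# Illustrative sketch: when damage grows nonlinearly with event
# magnitude, linear extrapolation from routine events badly
# underestimates rare ones. The damage curve is invented.

def damage(magnitude: float) -> float:
    """Hypothetical convex damage curve: cost grows with the cube."""
    return 10.0 * magnitude ** 3

routine = 2.0  # magnitude of a familiar, frequent event
rare = 10.0    # magnitude of a rare event, five times the routine one

# Naive reactive estimate: scale routine damage linearly by magnitude.
linear_guess = damage(routine) * (rare / routine)
actual = damage(rare)

print(f"linear estimate: {linear_guess:,.0f}")
print(f"actual (convex): {actual:,.0f}")
print(f"underestimated by a factor of {actual / linear_guess:.0f}x")
```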
Risk is in the future, not the past
During the Cold War between the United States and the former Soviet Union, thousands of nuclear warheads were targeted at the antagonists and their allies. The result was the concept of mutually assured destruction, a term used to convey the idea that neither side could win an all-out war; both sides would destroy each other. The risks were high; there was a constant effort to ensure that “noise” was not mistaken for “signal,” triggering an escalation of fear that could lead to a reactive response and devastation. Those tense times have largely subsided; however, we now find ourselves in the midst of global competition and the need to ensure effective resilience in the face of uncertainty.
We are faced with a new risk paradigm: efficient or effective? Efficiency is making us rigid in our thinking; we mistake being efficient for being effective. Efficiency can lead to action for the sake of accomplishment, with no visible end in mind. We often respond very efficiently to the symptoms rather than to the overriding issues that produce our next crisis. Uncertainty in a certainty-seeking world offers surprises to many and, to a very select few, confirmation of the need for optionality.
It’s all about targeted flexibility – the art of being prepared rather than preparing for specific events. Being able to respond, rather than being able to forecast, facilitates early warning and proactive response to unknown unknowns.
I think that Jeffrey Cooper offers some perspective: “The problem of the Wrong Puzzle. You rarely find what you are not looking for, and you usually do find what you are looking for.” In many cases the result is irrelevant information.
Horst Rittel and Melvin Webber would define this as a Systemic Operational Design (SOD) problem: a “wicked problem,” a social problem that is difficult and confusing, as opposed to a “tame problem,” which is not trivial but is sufficiently understood that it lends itself to established methods and solutions. I think that we have a “wicked problem.”
As Milo Jones and Philippe Silberzahn write in their book Constructing Cassandra: Reframing Intelligence Failure at the CIA, 1947–2001, “Gresham’s Law of Advice comes to mind: ‘Bad advice drives out good advice precisely because it offers certainty where reality holds none’” (page 249).
The questions that must be asked should form a hypothesis that can direct efforts at analysis. We currently have a “threat,” but it is a very ill-defined “threat,” and that leads to potentially flawed threat assessments and to expending effort (manpower), money and equipment that might be better employed elsewhere. It is a complicated problem that requires a lot of knowledge to solve, and it also requires a social change regarding acceptability.
Experience, it is said, is a great teacher. However, experience may date you to the point of insignificance. Experience is static. You need to ask the question: “What is the relevance of this experience to my situation now?”
The world is full of risk: diversify
When it comes to building your risk and/or business continuity program, focusing on survivability is the right approach – provided you have thoroughly done your homework and understand what survivability means to the organization. The risks to your organization today are as numerous as they are acute. Overconcentration in any one area can result in complete devastation.
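Overconcentration can at least be measured. One common gauge is a Herfindahl-Hirschman-style index – the sum of squared shares – which the sketch below applies to a hypothetical allocation of critical supply across vendors (names and shares are invented):

```python
# Minimal sketch: measuring overconcentration with a
# Herfindahl-Hirschman-style index (sum of squared shares).
# Supplier names and shares are hypothetical.

def hhi(shares: list[float]) -> float:
    """Sum of squared shares of the total; 1.0 = total concentration."""
    total = sum(shares)
    return sum((s / total) ** 2 for s in shares)

concentrated = {"Supplier A": 90, "Supplier B": 5, "Supplier C": 5}
diversified = {"Supplier A": 40, "Supplier B": 35, "Supplier C": 25}

for label, book in (("concentrated", concentrated), ("diversified", diversified)):
    print(f"{label}: HHI = {hhi(list(book.values())):.2f}")
# concentrated ~ 0.82: one failure is devastating.
# diversified  ~ 0.35: losses are survivable.
```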
Just because it is the right thing to do, doesn’t make it the easy thing to do.
Geary Sikich – Entrepreneur, consultant, author and business lecturer
Contact Information: E-mail: G.Sikich@att.net or gsikich@logicalmanagement.com. Telephone: 1- 219-922-7718.
Geary Sikich is a seasoned risk management professional who advises private and public sector executives on developing risk-buffering strategies to protect their asset base. With a M.Ed. in Counseling and Guidance, Geary’s focus is human capital: what people think, who they are, what they need and how they communicate. With over 25 years in management consulting as a trusted advisor, crisis manager, senior executive and educator, Geary brings unprecedented value to clients worldwide.
Geary is well-versed in contingency planning, risk management, human resource development, “war gaming,” as well as competitive intelligence, issues analysis, global strategy and identification of transparent vulnerabilities. Geary began his career as an officer in the U.S. Army after completing his BS in Criminology. As a thought leader, Geary leverages his skills in client attraction and the tools of LinkedIn, social media and publishing to help executives in decision analysis, strategy development and risk buffering. A well-known author, his books and articles are readily available on Amazon, Barnes & Noble and the Internet.
REFERENCES
Apgar, David, Risk Intelligence: Learning to Manage What We Don’t Know, Harvard Business School Press, 2006.
Davis, Stanley M. and Meyer, Christopher, Blur: The Speed of Change in the Connected Economy, 1998.
Jones, Milo and Silberzahn, Philippe, Constructing Cassandra: Reframing Intelligence Failure at the CIA, 1947–2001, Stanford Security Studies, 2013, ISBN 978-0804785808.
Kami, Michael J., Trigger Points: How to Make Decisions Three Times Faster, McGraw-Hill, 1988, ISBN 0-07-033219-3.
Klein, Gary, Sources of Power: How People Make Decisions, MIT Press, 1998, ISBN 978-0-262-11227-7.
Mauldin, John and Tepper, Jonathan, Code Red, John Wiley & Sons, 2014, ISBN 978-1-118-78372-6.
Read, Leonard E., I, Pencil: My Family Tree as Told to Leonard E. Read, first published in the December 1958 issue of The Freeman; reprinted in The Freeman in May 1996 and as a pamphlet, “I… Pencil,” in May 1998 and 2008; © 2010 Foundation for Economic Education, ISBN 1-57246-209-4. Film adaptation by the Competitive Enterprise Institute, published November 14, 2012: http://youtu.be/IYO3tOqDISE; see also http://www.ipencilmovie.org.
Sikich, Geary W., Graceful Degradation and Agile Restoration Synopsis, Disaster Resource Guide, 2002.
Sikich, Geary W., Integrated Business Continuity: Maintaining Resilience in Times of Uncertainty, PennWell Publishing, 2003.
Sikich, Geary W., “Risk and Compliance: Are you driving the car while looking in the rearview mirror?”, 2013.
Tainter, Joseph, The Collapse of Complex Societies, Cambridge University Press, 1990, ISBN 978-0521386739.
Taleb, Nassim Nicholas, The Black Swan: The Impact of the Highly Improbable, Random House, 2007, ISBN 978-1-4000-6351-2.
Taleb, Nassim Nicholas, The Black Swan: The Impact of the Highly Improbable, 2nd edition, Random House, 2010, ISBN 978-0-8129-7381-5.
Taleb, Nassim Nicholas, Fooled by Randomness: The Hidden Role of Chance in Life and in the Markets, Random House, 2005; updated edition, 2008, ISBN 978-1400067930.
Taleb, Nassim Nicholas, “Common Errors in Interpreting the Ideas of The Black Swan and Associated Papers,” NYU Poly Institute, October 18, 2009.
Taleb, Nassim Nicholas, Antifragile: Things That Gain from Disorder, Random House, 2012, ISBN 978-1-4000-6782-4.