Guest Post by Geary Sikich (first posted on CERM ® RISK INSIGHTS – reposted here with permission)
“To do something very dangerous takes a certain lack of imagination”
– Anonymous
Introduction
Governments and companies worldwide are emerging from the recent financial crisis and subsequent recession. While governments craft new regulations, businesses around the world are walking on shifting sand: risk exposures are high, and new regulations will create compliance challenges. According to a recent survey by Korn/Ferry International, corporate leaders are focusing more attention on risk management after what many consider the excessive risk-taking during the boom times that factored into the global financial crisis.
One question that corporate executives and board members alike will have to wrestle with is whether, given that there will be more regulations, those regulations can be crafted better than the regulations of the past. Or will they simply be more reactive, ill-conceived regulations? The Dodd-Frank Wall Street Reform and Consumer Protection Act of 2010 (commonly referred to as the Dodd-Frank Act) alone comprises over 2,300 pages of regulatory directives. More regulations are in the offing, ensuring that compliance will be a top consideration. But who writes the regulations? Are they industry experts? No. In most cases, regulations are written by politicians and legal professionals who may have a basic understanding of an industry but who are in no way experts in that industry.
The boards of directors of all companies will be on the high wire without much of a net between them and the unyielding and, in many cases, rocky ground. Stakeholder expectations will be high, with stakeholders hoping that boards of directors do their best. Boards that thoroughly document their risk oversight activities in proxy disclosure documents, as the new regulatory scheme requires, may gain some protection, since such a record can serve as evidence of good “business judgment” (a Delaware court decision affirmed that the “business judgment rule” protects boards). The pertinent question may be: exactly how should a board oversee risk? Perhaps the answer lies in a combination of an emerging management process – Enterprise Risk Management (ERM) – and a much-discussed theory of rare, high-impact events – the concept of the “Black Swan” chronicled by Nassim Taleb, author of the book “The Black Swan.”
Risk – what is it and how does the board approach it?
One of the challenges corporate management and their boards face is how to explore new opportunities, effectively manage enterprise risks, maintain compliance with regulations and, hopefully, generate a profit. Business is driven by strategy, carried out in the form of plans by people who operate in existing and evolving markets. Every organization’s “strategic plan” (developed either formally or informally) identifies the company’s critical objectives. Consequently, ERM is being recognized as consisting of more than the traditional definition:
ERM is the process of planning, organizing, leading and controlling the activities of an organization in order to minimize the effects of risk; this includes, but is not limited to, financial, accidental, strategic, operational and other unrecognized risks.
Risk is the potential for loss or gain caused by an event (or series of events) that can adversely or positively affect the achievement of a company’s goals and objectives. Risk must be viewed as a potential vulnerability to the downside and as an opportunity driver for those prepared to capitalize on the upside. A broader and more applicable definition of enterprise risk can be summarized thus:
All initiatives taken to assure the survival, growth and resilience of the enterprise
Taken in this context, ERM has new relevance for corporations and their boards. As economies emerge from the downturn, many companies now regard enterprise risk management not just as a “necessary evil” for regulatory compliance but as an effective tool for gaining a competitive advantage in the global marketplace.
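To make the two-sided definition concrete, here is a minimal sketch (our own illustration, not an ERM standard or the author’s method) that scores hypothetical events by probability times consequence, treating losses as downside and opportunities as upside; every event name and number is invented:

```python
# Toy illustration (not an ERM standard): risk as probability x consequence,
# where consequence can be a loss (negative) or an opportunity (positive).
from dataclasses import dataclass

@dataclass
class RiskEvent:
    name: str
    probability: float   # chance of occurring in the planning period
    consequence: float   # impact in $M; negative = loss, positive = gain

    @property
    def exposure(self) -> float:
        """Expected impact: probability times consequence."""
        return self.probability * self.consequence

# Hypothetical register entries -- all numbers are invented.
register = [
    RiskEvent("supply chain disruption", 0.20, -15.0),
    RiskEvent("new market opening",      0.30, +25.0),
    RiskEvent("regulatory penalty",      0.05, -40.0),
]

# Rank by absolute exposure so upside and downside get equal attention.
for event in sorted(register, key=lambda e: abs(e.exposure), reverse=True):
    print(f"{event.name:26s} exposure = {event.exposure:+.1f} $M")
```

Ranking by absolute exposure keeps upside opportunities on the same register as downside threats, which is the point of the broader definition above.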
In this article, we explore several questions around this issue, including:
- Can ERM help engage the board, limit their liability and enhance governance?
- Can the board assure that potential Black Swan events’ effects on stakeholders are periodically and properly assessed, documented and offset?
- Can enterprise risk management offer organizations a competitive advantage?
- Is it possible to conduct “Black Swan” audits with board supervision?
Ability to Identify and Manage Risk
Risk: business leaders know it exists. However, companies often aren’t taking a holistic approach to assessing and managing their risk exposures. Disruption happens. Natural disasters, technology disasters and manmade disasters happen. As an example, oil companies entered the deep waters of the Gulf armed with technology that works, and generally works well. How did technology fail them? The failure is not in the technology; it is in the unanticipated difficulties encountered when drilling at depths relatively unfamiliar to the industry.
Could a technology breakthrough have changed what occurred to the Deepwater Horizon? Will there be a shift in consumer demand or a rise, or fall, in the price of oil that affects critical markets? Any of these can rewrite the future of a company – or a whole industry. If you haven’t faced this moment, you may soon. It’s time that oil company executives change the way they think about enterprise risk management, continuity of business operations and the way they run their businesses.
Because a splintered approach to enterprise risk management has been the norm, with silos of risk management within organizations, risk is poorly defined, and buffering the organization from risk realization is, pardon the pun, risky at best. A truly integrated, head-on approach to enterprise risk management is necessary.
Enterprise Risk Management (ERM) is defined by many different groups in a variety of ways. Each group has a vested interest in their view of what ERM constitutes. Risk and non-risk management professionals are so enmeshed in following risk management protocols promulgated by financial and non-financial regulatory and oversight entities that they cannot see risk for what it really is. They get caught in the “Activity Trap.”
There is usually a lack of data pointing to excessive risk in an enterprise. The board can expect senior management to make risk decisions based on a combination of intuition and facts. Otherwise there is a high potential for an “Activity Trap” situation to be created, resulting in less effective decisions being made over time as the problem compounds.
A New York Times DealBook article by Cyrus Sanati (10 August 2010), entitled “Crisis-Shaken Executives Sharpen Focus on Risk” and citing the Korn/Ferry International survey of senior executives, stated that:
“increased focus on risk has resulted in the “hiring of executives who understand the key part risk assessment plays in setting good strategy,” Steve Mader, vice chairman and managing director of Korn/Ferry Board Services told DealBook. As a result, about 58 percent of those surveyed said they believed that their companies had improved the quality and timeliness of internal oversight and reporting to their boards.”
“Boards and C.E.O.’s are reporting that the overriding lesson of effective risk management is that it must become an integrated element of strategy,” Mr. Mader said. “Corporate leaders increasingly see the levels of risk and the metrics of risk as inherent components of developing and executing strategies, and in evaluating the appropriate tolerance for risk.”
Sanati stated in the article that 57% of senior executives surveyed indicated that their companies are spending more time dealing with risk management. 26% said there had been no change at all in their risk management practices. Only 14% indicated that their companies were actually spending less time on risk management.
A notable quote comes to mind.
“Because management isn’t an exact science, and perhaps never will be, helps explain why it is so prone to fads. The tendency to leap from one fad to the next, to adopt the latest with the same zeal and enthusiasm as those which preceded, and to abandon each in succession and as quickly as the next appear is responsible for much of the cynicism and despair in today’s organizations. If fads fool anyone, it is only those at the top who push them, not those at the bottom and middle who are forced to implement and to suffer them.”
– Ian Mitroff, Smart Thinking for Crazy Times, 1998
In a separate article entitled “Are We Creating False Positives?” (Continuity e-Guide, March 2004), Geary Sikich stated that corporate and government leadership should realize that “crisis” management isn’t an exact science and never will be. This applies just as stringently to “risk management.” Ever since September 11, 2001, and most recently during the ongoing financial crisis that traces its roots to the mid-2000s, there has been a tendency toward what Sikich coined “decision paralysis.” It is the result of uncertainty, fear and instantaneous judgment of negative consequences. “Decision paralysis” leads to being constantly in a reactive mode. Some may claim that speed, connectivity and unforeseen events are the causes of “decision paralysis.” To say that is to offer an excuse, not an explanation.
Mitroff states, “All “real problems” have more than one way of being stated.” For example, “whoever controls the definition of a problem controls its solution,” “a problem well put is half solved,” “the first definition of an important problem is almost invariably wrong,” and “never trust a single definition of an important problem.”
Management is never put more strongly to the test than in a “crisis.” Objectives are immediate, and so are the results. What you do or don’t do will have long-lasting implications. Today, individuals responsible for managing businesses and public agencies must deal effectively with increasingly complex laws and issues or face the consequences. But, are we truly prepared to face the consequences? What’s needed is a change of mindset. Today we cannot merely think about the plannable or plan for the unthinkable, but must learn to think about the unplannable. The board must encourage senior managers to make such a mindset change. In making this mindset change we also need to rethink how we ask questions as we seek information or we will end up solving the wrong problem precisely.
Are We Creating Another Set Of False Positives?
If the board asks the wrong questions precisely, it will continue to get precisely the wrong answers and, as a result, create false positives. Getting a wrong answer does not mean there is intent to defraud; it means the answer creates a false positive, based on less-than-relevant factors being presented. For example:
Jon Surmacz’s “Disaster Preparedness,” CSO Magazine, August 14, 2003, states that 67% of Fortune 1000 executives say their companies are more prepared now than before 9/11 to access critical data in a disaster situation.
60% say they have a command team in place to maintain information continuity operations from a remote location if a disaster occurs; 71% discuss disaster policies and procedures at executive-level meetings; and 62% have increased their budgets for preventing loss of information availability (source: Harris Interactive).
The above is a classic example of creating a false positive. Read the previous paragraph carefully and you will clearly see that the executives are referring to the ability to access information and to maintain information availability. None are saying that their companies are prepared for the loss of personnel, facilities, access to normal business environments or any of the other potential problems typically encountered in a disruptive event. This is not to say that they have failed in any way, shape or form. We are merely pointing out that the above statement could make senior management, stakeholders, etc., think that they are prepared to handle a disruptive event, when in fact they are only partially prepared! The board has to get senior managers to question their own analysis.
Consider the following hypothetical exchange between a company’s management team and their investment bankers as part of a due diligence investigation prior to underwriting the company’s most recent offering.
INVESTMENT BANKERS (asking precisely the wrong question precisely):
“Are you testing your business continuity plan?”
MANAGEMENT TEAM (responding with the wrong answer precisely):
“Yes, we are testing our plan.”
The above exchange does not reflect a false statement. The Management Team has answered correctly, albeit with precisely the wrong answer. How should this hypothetical exchange have gone?
INVESTMENT BANKERS (asking the right question):
“What does your business continuity plan encompass?”
MANAGEMENT TEAM (responding with the right answer precisely):
“Our plan consists of the steps that we will take to recover our information systems.”
This exchange reflects an accurate statement regarding the scope of the company’s business continuity plan. The next question the Investment Bankers should ask the Management Team is, “What other plans does your company have to address non-systems business recovery?”
When we begin to assess how our organization approaches problem solving, we must address the issue of complexity in the modern organization. Outsourcing, just-in-time supply/production, getting back to core services, etc., work well in an ideal world – a world where nothing goes wrong. In today’s world, so many things can go wrong, from natural disasters to cyber-security failures. Business is so interconnected that we have had to evolve new solutions for these complexities. Not long ago, one rarely heard of Enterprise Risk Management (ERM), Customer Relationship Management (CRM), Enterprise Resource Planning (ERP), Supply Chain Management (SCM) and a host of other terms that today are commonplace.
If the board allows the organization to continue to ask the wrong questions precisely, decisions will be made based on precisely the wrong answers. We will continue to operate in a reactive mode instead of a proactive mode. Corporate management must learn to develop an intelligence mosaic by asking the right questions precisely.
Many view “mitigation” as a panacea, thinking that once a risk is mitigated it no longer needs to be worried about. This false premise creates a reaction-time gap. Practitioners can only hope that they are buying their enterprises sufficient reaction time, when in fact they should be asking, “How much reaction time are we losing because our ERM program is fragmented and fails to understand risk throughout the enterprise?”
The greatest failure of most enterprise risk management programs is that they cannot de-center. That is, they cannot see the risk from different perspectives, internally or externally. Poor or no situation awareness generates a lack of expectancies, resulting in inadequate preparation for the future. The board can require senior managers to see risk from other perspectives.
Corporate Governance, Enterprise Risk Management and Compliance
Corporate Governance has traditionally defined the ways that a firm safeguards the interests of its financiers (investors, lenders, and creditors). Governance provides a framework of rules and practices for a board of directors to ensure accountability, fairness and transparency in a firm’s relationship with all stakeholders (financiers, customers, management, employees, government and the community).
The governance framework generally does three things: first, it consists of explicit and implicit contracts between the firm and its stakeholders for the distribution of responsibilities, rights and rewards; second, it establishes procedures for reconciling the sometimes conflicting interests of stakeholders in accordance with their duties, privileges and roles; and third, it establishes procedures for proper supervision, control and information flows to serve as a system of checks and balances.
The failure to identify and manage the risks present in the energy industry will have a cascade effect, creating reputational damage (real and/or perceived). The industry is faced with several issues that are transparent to many. A heavy dependence on performing processes that become activity traps creates an inability to change, or even to recognize the need for change. In his book “Management and the Activity Trap,” George Odiorne concludes that activity traps are created when:
- Processes and procedures are developed to achieve an objective (usually in support of a strategic objective).
- Over time, goals and objectives change to reflect changes in the market and new opportunities; however, the processes and procedures continue on.
- Eventually, the procedures become a goal in themselves – doing an activity for the sake of the activity rather than for what it accomplishes.
W. Edwards Deming created 14 principles for management. Deming recognized the folly of working for the sake of procedure rather than finding the goal and making every effort to achieve it. We know what to measure, we know the current performance and we have discovered some problem areas. Now we have to understand why problems are generated and what their causes are. However, as Sikich stated in a 2003 speech, “because we are asking the wrong questions precisely, we are getting the wrong answers precisely; and as a result we are creating false positives.”
Nassim Taleb, author of the best seller, “The Black Swan,” has stated that “we lack knowledge when it comes to rare events with serious consequences. The effect of a single observation, event or element plays a disproportionate role in decision-making creating estimation errors when projecting the severity of the consequences of the event. The depth of consequence and the breadth of consequence are underestimated resulting in surprise at the impact of the event.”
Black Swans, Prudent People, Reasonable Analysis and Board-Mandated “How-Tos”
Are we flying behind the plane when it comes to enterprise risk management, or are there Black Swans everywhere? Can the board use Black Swans as a way to get senior management to think about risk from a different perspective? There seem to have been a lot of sightings of “Black Swans” lately. Should we be concerned? Are we engaged in wishful thinking, caught up in media hype, or misinterpreting what a “Black Swan” event really is? The term “Black Swan” has become a popular buzzword for many, including contingency planners, risk managers and consultants. But are there really that many occurrences that qualify as “Black Swans,” or are we just caught up in the popularity of the moment? The definition of a Black Swan according to Nassim Taleb, author of the book “The Black Swan: The Impact of the Highly Improbable,” is:
A black swan is a highly improbable event with three principal characteristics: it is unpredictable; it carries a massive impact; and, after the fact, we concoct an explanation that makes it appear less random, and more predictable, than it was.
In most businesses there is a general lack of knowledge when it comes to rare events with serious consequences, precisely because such events occur so rarely. The rare event must nonetheless be a focus of ERM. In his book, Taleb states that “the effect of a single observation, event or element plays a disproportionate role in decision-making creating estimation errors when projecting the severity of the consequences of the event. The depth of consequence and the breadth of consequence are underestimated resulting in surprise at the impact of the event.”
To quote again from Taleb, “The problem, simply stated (which I have had to repeat continuously) is about the degradation of knowledge when it comes to rare events (“tail events”), with serious consequences in some domains I call “Extremistan” (where these events play a large role, manifested by the disproportionate role of one single observation, event, or element, in the aggregate properties). I hold that this is a severe and consequential statistical and epistemological problem as we cannot assess the degree of knowledge that allows us to gauge the severity of the estimation errors. Alas, nobody has examined this problem in the history of thought, let alone try to start classifying decision-making and robustness under various types of ignorance and the setting of boundaries of statistical and empirical knowledge. Furthermore, to be more aggressive, while limits like those attributed to Gödel bear massive philosophical consequences, but we can’t do much about them, I believe that the limits to empirical and statistical knowledge I have shown have both practical (if not vital) importance and we can do a lot with them in terms of solutions, with the “fourth quadrant approach”, by ranking decisions based on the severity of the potential estimation error of the pair probability times consequence (Taleb, 2009; Makridakis and Taleb, 2009; Blyth, 2010, this issue).”
The above may seem well beyond the capability of most individual business managers to evaluate. However, the collective wisdom of the board can be very useful in helping to gauge the effect of single observations on decision-making, in assessing the degree of knowledge and ignorance in the enterprise, and in judging whether a potential catastrophe fits the Black Swan criteria.
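As a rough numerical sketch of the “fourth quadrant” ranking idea quoted above (our reading of it, not Taleb’s own procedure), the toy code below ranks hypothetical decisions by how far the probability-times-consequence pair moves if the rare-event probability has been badly underestimated; all names, probabilities and the tenfold error factor are invented:

```python
# Rough sketch of the quoted "fourth quadrant" idea as we read it (not Taleb's
# own code): rank decisions by how much the probability-times-consequence pair
# moves if the rare-event probability has been badly underestimated.
ERROR_FACTOR = 10.0  # assume the tail probability could be 10x the estimate

decisions = {
    # name: (estimated tail probability, consequence in $M) -- invented numbers
    "deepwater drilling program":  (0.001, -5000.0),
    "onshore pipeline upgrade":    (0.010,  -200.0),
    "routine refinery turnaround": (0.050,   -20.0),
}

def estimation_error_severity(p: float, c: float) -> float:
    """Swing in expected impact between the estimate and a 10x-wrong estimate."""
    stressed_p = min(p * ERROR_FACTOR, 1.0)
    return abs(stressed_p * c - p * c)

ranked = sorted(decisions.items(),
                key=lambda kv: estimation_error_severity(*kv[1]),
                reverse=True)
for name, (p, c) in ranked:
    print(f"{name:28s} exposure {p * c:+8.1f}  "
          f"severity of a 10x error {estimation_error_severity(p, c):8.1f}")
```

Note that the decision with the smallest apparent exposure (the rare catastrophic one) is the most sensitive to estimation error, which is the point of ranking by severity rather than by the point estimate.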
There was intense media focus (crisis of the moment) on the eruption of the Icelandic volcano Eyjafjallajokull and the recent Deepwater Horizon catastrophe. Note that far less media attention was paid to the subsequent sinking of the Aban Pearl, an offshore gas platform in Venezuela, on 13 May 2010.
Some have classified the recent eruption of Eyjafjallajokull and the Deepwater Horizon catastrophe as Black Swan events. If these are Black Swans, then shouldn’t we also classify the Aban Pearl as a Black Swan? Or is the Aban Pearl not a Black Swan because it did not get the media attention that the Deepwater Horizon has been receiving?
Please note also that Taleb’s definition of a Black Swan consists of three elements:
“it is unpredictable; it carries a massive impact; and, after the fact, we concoct an explanation that makes it appear less random”
While the events cited above meet one of the criteria for a “Black Swan” – unpredictability – the massive impact of each is yet to be determined, and we have yet to see explanations that make these events appear less random. Interestingly, Eyjafjallajokull may qualify as a “White Swan” according to Taleb in the recently published second edition of “The Black Swan.” On 20 April 2010 (the date of the Deepwater Horizon explosion), Eyjafjallajokull was emitting between 150,000 and 300,000 tons of CO2 a day. This contrasted with daily airline industry emissions of almost 345,000 tons, according to an article entitled “Planes or Volcano?,” originally published on 16 April 2010 and updated on 20 April 2010 (http://bit.ly/planevolcano).
While we can only estimate the potentially massive impact of the Deepwater Horizon event, the Aban Pearl, according to statements by Venezuelan President Hugo Chavez, appears to have resulted in no environmental release or loss of life.
Venezuela’s energy and oil minister, Rafael Ramirez, said there had been a problem with the flotation system of the semi-submersible platform, causing it to keel over and sink. Ramirez also said a tube connecting the rig to the gas field had been disconnected and safety valves activated, so there was no risk of any gas leak. The incident came less than a month after the explosion that destroyed the Deepwater Horizon rig in the Gulf of Mexico. At the time of this writing, oil prices are actually declining rather than rising, as would be the expected outcome of a Black Swan event (perhaps we should rethink Deepwater Horizon and Aban Pearl and classify them as “White Swan” events?). What the media-driven hype that dominates the general populace perceives or classifies as a Black Swan may, in fact, not be a Black Swan at all for a minority of key decision makers, executives and involved parties. This poses a significant challenge for planners, strategists and CEOs. However, board-mandated evaluation can go a long way toward identification and mitigation.
Recent events such as the eruption of Eyjafjallajokull, the Deepwater Horizon catastrophe and the Aban Pearl sinking cannot be classified as Black Swan events, as they do not meet all three criteria for a Black Swan. However, their impact (yet to be fully determined) may be far-reaching, and the three events do have Black Swan qualities when viewed in the context of today’s complex global environment. This, we believe, is the greatest challenge for strategists, planners, CEOs and boards of directors: to develop strategies that are flexible enough to adapt to unforeseen circumstances while meeting corporate goals and objectives. It requires a rethinking of how much involvement the board of directors has in contingency planning, competitive intelligence activities and cross-functional relationships, internal and external, as these become more important from a governance perspective.
Challenges for CEOs and Board Members
Figure 1, entitled “The ‘Big Bang’ – Complex Systems – Black Swan Events,” depicts the effect of an outlier event that triggers independent and reactionary events, resulting in a cumulative Black Swan event/effect.
Figure 1 recognizes four elements (a toy sketch of the cascade mechanism follows the list):
- Agents (Outlier Events) acting in parallel
- Continuously changing circumstances
- Reactionary response creates potential cascades resulting in cumulative effects
- Lack of pattern recognition leads to a failure to anticipate the future
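The cascade mechanism in the figure can be sketched numerically. The toy model below is our own construction, not the author’s model: an outlier event knocks out one agent, each failure shifts load onto its neighbours, and the neighbours can fail in turn, producing a cumulative effect far larger than the triggering event. All parameters are invented.

```python
# Toy cascade model (our own sketch of the mechanism in Figure 1): one outlier
# failure redistributes load and can trigger a chain of further failures.
import random

random.seed(7)
N_AGENTS = 50
capacity = [random.uniform(1.0, 2.0) for _ in range(N_AGENTS)]  # stress each agent absorbs
load = [1.0] * N_AGENTS                                         # baseline stress

failed = set()
frontier = [0]            # the outlier event takes out agent 0
load[0] = float("inf")

while frontier:
    agent = frontier.pop()
    if agent in failed or load[agent] <= capacity[agent]:
        continue
    failed.add(agent)
    # Redistribute part of the failed agent's load onto two neighbours.
    for neighbour in ((agent + 1) % N_AGENTS, (agent + 2) % N_AGENTS):
        if neighbour not in failed:
            load[neighbour] += 0.6
            frontier.append(neighbour)

print(f"One outlier event cascaded into {len(failed)} of {N_AGENTS} failures")
```

Depending on how much slack each agent has, the same triggering event either fizzles out or propagates widely, which is why a lack of pattern recognition about these couplings leads to a failure to anticipate the future.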
How does one overcome the cumulative effect of outlier events? We have to rethink business operations and begin to focus on what we will term “strategy at the edge of chaos.” This should not be considered a radically new concept in management thinking; rather, it recognizes that while strategic concepts sit at the threshold of management theory, appropriate strategic responses do not always happen fast enough. Markets are not in a static equilibrium; the recent crisis in Europe has cascaded from Greece to concerns over the banking systems in Spain and Portugal, and some even speculate that Germany may leave the European Union. Markets and organizations tend to be reactive, evolving and difficult to predict and control.
Complex Adaptive Systems
Unpredictability is the new normal. Rigid forecasts, cast in stone, cannot be changed without reputational damage; therefore strategists, planners and CEOs are better served by making assumptions – an assumption can be changed and adjusted; assumptions are flexible and less damaging to an enterprise’s (or person’s) reputation. Unpredictability can be positive or negative. Never underestimate the impact of change (we live in a rapidly changing, interconnected world), inflation (not just monetary inflation, but the inflated impact of improbable events), opportunity (recognize the “White Swan” effect) and the ultimate consumer (the effect of loss of customers is most often overlooked in contingency plans).
12 Steps to Get from Here to There and Temper the Impact of Black Swans
How does the board communicate a structured approach that enhances governance and limits the board’s liability?
Michael J. Kami, author of the book “Trigger Points: How to Make Decisions Three Times Faster,” wrote that an increased rate of knowledge creates increased unpredictability. Stanley Davis and Christopher Meyer, authors of the book “Blur: The Speed of Change in the Connected Economy,” cite speed, connectivity and intangibles as the key driving forces. If we take these points in the context of the Black Swan as defined by Taleb, we see that our increasingly complex systems (the globalized economy, etc.) are at risk. Kami outlines 12 steps in his book that provide some useful insight; how you apply them to your enterprise can lead to a greater ability to temper the impact of Black Swan events (a minimal record-keeping sketch follows the steps below). If the board has documentation of its oversight of such analysis, the courts are likely to hold that all reasonable effort was exerted to identify and mitigate the impact.
Step 1: Where Are We? Develop an External Environment Profile
Key focal point: What are the key factors in our external environment and how much can we control them?
Step 2: Where Are We? Develop an Internal Environment Profile
Key focal point: Build detailed snapshots of your business activities as they are at present.
Step 3: Where Are We Going? Develop Assumptions about the Future External Environment
Key focal point: Catalog future influences systematically; know your key challenges and threats.
Step 4: Where Can We Go? Develop a Capabilities Profile
Key focal point: What are our strengths and needs? How are we doing in our key results and activities areas?
Step 5: Where Might We Go? Develop Future Internal Environment Assumptions
Key focal point: Build assumptions, potentials, etc. Do not build predictions or forecasts! Assess what the future business situation might look like.
Step 6: Where Do We Want to Go? Develop Objectives
Key focal point: Create a pyramid of objectives; redefine your business; set functional objectives.
Step 7: What Do We Have to Do? Develop a Gap Analysis Profile
Key focal point: What will be the effect of new external forces? What assumptions can we make about future changes to our environment?
Step 8: What Could We Do? Opportunities and Problems
Key focal point: Act to fill the gaps. Conduct an opportunity-problem feasibility analysis; risk analysis assessment; resource-requirements assessment. Build action program proposals.
Step 9: What Should We Do? Select Strategy and Program Objectives
Key focal point: Classify strategy and program objectives; make explicit commitments; adjust objectives.
Step 10: How Can We Do It? Implementation
Key focal point: Evaluate the impact of new programs.
Step 11: How Are We Doing? Control
Key focal point: Monitor external environment. Analyze fiscal and physical variances. Conduct an overall assessment.
Step 12: Change What’s Not Working – Revise, Control, Remain Flexible
Key focal point: Revise strategy and program objectives as needed; revise explicit commitments as needed; adjust objectives as needed.
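For boards that want the documentation trail mentioned above, the sketch below (our own construction, not from Kami’s book) shows one minimal way to record each step’s focal point, review date and supporting evidence; the class name, dates and evidence strings are all hypothetical:

```python
# Minimal sketch (our own construction, not from Kami's book): a record of
# board oversight of the 12-step review, so documentation exists if it is
# ever needed as evidence of "business judgment". Names and dates invented.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class StepReview:
    step: int
    focal_point: str
    reviewed_on: Optional[str] = None                   # ISO date of last board review
    evidence: List[str] = field(default_factory=list)   # minutes, reports, etc.

plan = [
    StepReview(1, "Where are we? External environment profile"),
    StepReview(2, "Where are we? Internal environment profile"),
    # ... steps 3 through 11 would follow the same pattern ...
    StepReview(12, "Change what's not working: revise, control, remain flexible"),
]

plan[0].reviewed_on = "2010-09-15"   # hypothetical review date
plan[0].evidence.append("Q3 board minutes, item 4: external risk review")

undocumented = [s.step for s in plan if s.reviewed_on is None]
print("Steps lacking documented board review:", undocumented)
```

Even a register this simple makes gaps in oversight visible at a glance, which is the kind of record that supports a “business judgment” defense.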
We would add the following comments to Kami’s 12 points and to Davis and Meyer’s point on speed, connectivity and intangibles. Understanding the complexity of an event can facilitate the organization’s ability to adapt if it can broaden its strategic approach. Within the context of complexity, touchpoints that are not recognized create potential chaos for an enterprise and for complex systems. Positive and negative feedback needs to be observed and acted on promptly. The single biggest threat to an enterprise is staying with a previously successful business model too long and failing to adapt to the fluidity of situations (i.e., Black Swans). When weak cause-and-effect linkages go unrecognized, small and isolated changes can have huge effects. Ever-growing complexity will make the strategic challenge more urgent for strategists, planners and CEOs.
Taleb offers the following two definitions in his book “The Black Swan.” The first is “Mediocristan”: a domain dominated by the mediocre, with few extreme successes or failures, in which no single observation can meaningfully affect the aggregate. In Mediocristan the present is described, and the future forecasted, through heavy reliance on past historical information; there is a heavy dependence on independent probabilities.
The second is “Extremistan”: a domain where the total can conceivably be impacted by a single observation. In Extremistan it is recognized that the most important events by far cannot be predicted; therefore there is less dependence on theory and more focus on conditional probabilities. Rare events must always be unexpected; otherwise they would not occur, and they would not be rare.
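The two domains can be illustrated numerically. In the sketch below (our own illustration; the distributions and parameters are chosen purely for demonstration), a thin-tailed Gaussian sample stands in for Mediocristan and a heavy-tailed Pareto sample for Extremistan, and we ask what share of the total the single largest observation contributes:

```python
# Quick numerical sketch of Taleb's contrast (our own illustration): in a
# thin-tailed domain ("Mediocristan") no single observation moves the total;
# in a heavy-tailed domain ("Extremistan") one observation can dominate it.
import random

random.seed(42)
N = 100_000

mediocristan = [random.gauss(100, 15) for _ in range(N)]      # e.g. heights
extremistan  = [random.paretovariate(1.1) for _ in range(N)]  # e.g. wealth

for name, sample in [("Mediocristan", mediocristan), ("Extremistan", extremistan)]:
    share = max(sample) / sum(sample)
    print(f"{name}: largest single observation = {share:.2%} of the total")
```

In the Gaussian case the largest observation is a negligible fraction of the aggregate; in the Pareto case a single draw can account for a material share of the whole, which is exactly the property Taleb assigns to Extremistan.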
When faced with the unexpected presence of the unexpected, normality believers (Mediocristanians) will tremble and exacerbate the downfall. Common sense dictates that reliance on the record of the past (history) as a tool to forecast the future is not very useful: you will never be able to capture all the variables that affect decision making, and we forget that there is something new in the picture that distorts everything so much that it makes past references useless. Put simply, today we face asymmetric threats (Black Swans and White Swans) that can include the use of surprise in all its operational and strategic dimensions, and the introduction and use of products/services in ways unplanned by your organization and the markets you serve. Asymmetric threats (not fighting fair) also include the prospect of an opponent designing a strategy that fundamentally alters the market you operate in.
The Diagnostic Bias Trap
A diagnostic bias is created when four elements combine to create a barrier to effective decision making. Recognizing diagnostic bias before it debilitates effective decision making can make all the difference in day-to-day operations. It is essential in crisis situations to avert compounding initial errors. The four elements of diagnostic bias are:
- Labeling
- Loss Aversion
- Commitment
- Value Attribution
Labeling creates blinders; it prevents you from seeing what is clearly before your face – all you see is the label. Loss aversion is essentially how far you are willing to go (continuing on a course) to avoid loss. Closely linked to loss aversion, commitment is a powerful force that shapes our thinking and decision making; it takes the form of rigidity and inflexibility of focus. Once we are committed to a course of action, it is very difficult to recognize objective data, because we tend to see what we want to see, casting aside information that conflicts with our vision of reality. First encounters and initial impressions shape the value we attribute and therefore shape our future perception. Once we attribute a certain value, it dramatically alters our perception of subsequent information, even when the value attributed (assigned) is completely arbitrary.
Recognize that we are all swayed by factors that have nothing to do with logic or reason. There is a natural tendency not to see transparent vulnerabilities, due to diagnostic biases. We make diagnostic errors when we narrow down our field of possibilities and zero in on a single interpretation of a situation or person. While constructs help us to quickly assess a situation and form a temporary hypothesis about how to react (an initial opinion), they are restrictive in that they are based on limited exposure time and limited data, and they overlook transparent vulnerabilities.
The best strategy to deal with disoriented thinking is to be mindful (aware) and observe things for what they are (situational awareness) not for what they appear to be. Accept that your initial impressions could be wrong. Do not rely too heavily on preemptive judgments; they can short circuit more rational evaluations. Are we asking the right questions? When was the last time you asked, “What Variables (outliers, transparent vulnerabilities) have we Overlooked?”
Our colleague, John Stagl adds the following regarding value. Value = the perception of the receiver regarding the product or service that is being posited. Value is, therefore, never absolute. Value is set by the receiver.
Taleb, in the revised second edition of “The Black Swan,” posits the following: “How much more difficult is it to recreate an ice cube from a puddle than it is to forecast the shape of the puddle from the ice cube?” His point is that we confuse the two arrows: ice cube to puddle is not the same as puddle to ice cube, and ice cubes and puddles come in different sizes and shapes. Thinking that we can move interchangeably from theory to practice and practice to theory creates the potential for failure.
While the Icelandic volcano will have non-regulatory consequences that could yet be far-reaching, the regulatory deluge to be expected as a result of Deepwater Horizon could be a watershed event for the offshore drilling industry, much as the Oil Pollution Act of 1990 changed many oil companies’ shipping operations.
Conclusion
It takes 85 million barrels of oil per day globally, as well as millions of tons of coal and billions of cubic feet of natural gas, to enable modern society to operate as it does. In 2009, 214 vessels were attacked, resulting in 47 hijackings, and pirates realized some $120 million in ransoms for those ships. As gaps in the global network open up, guerrilla entrepreneurship is sure to follow. The rewards are substantial.
We stated earlier that, “because we are asking the wrong questions precisely, we are getting the wrong answers precisely; and as a result we are creating false positives.” Unless we change the paradigm of enterprise risk management, we will continue to get false positives and find ourselves reacting to events instead of managing events effectively.
Unpredictability, as we noted above, is the new normal. If there is one thing we have learned, it is that humans are not great at measuring and responding to risk when placed in situations too complicated to understand:
First, people have trouble imagining how small failings can combine to lead to catastrophic disasters. At the Three Mile Island nuclear facility, a series of small systems happened to fail at the same time. It was the interplay between these seemingly minor events that led to an unanticipated systemic crash.
Second, people have a tendency to become acclimated to risk. As the physicist Richard Feynman wrote in a report on the Challenger disaster, as years went by NASA officials got used to living with small failures. If faulty O-rings didn’t produce a catastrophe last time, they figured, they probably won’t this time. Feynman compared this to playing Russian roulette: success in the last round is not a good predictor of success this time. Nonetheless, when things seem to be going well, people unconsciously adjust their definition of acceptable risk.
Third, people have a tendency to place elaborate faith in backup systems and safety devices. More pedestrians die in crosswalks than when jaywalking. That’s because they have a false sense of security in crosswalks and are less likely to look both ways. On the Deepwater Horizon oil rig, a Transocean official apparently tried to close off a safety debate by reminding everybody the blowout preventer would save them if something went wrong. The illusion of the safety system encouraged the crew to behave in more reckless ways. As Malcolm Gladwell put it in a 1996 New Yorker essay, “Human beings have a seemingly fundamental tendency to compensate for lower risks in one area by taking greater risks in another.”
Fourth, people have a tendency to match complicated technical systems with complicated governing structures. The command structure on the Deepwater Horizon seems to have been completely muddled, with officials from BP, Transocean and Halliburton hopelessly tangled in confusing lines of authority and blurred definitions of who was ultimately responsible for what.
Fifth, people tend to spread good news and hide bad news. Everybody wants to be part of a project that comes in under budget and nobody wants to be responsible for the reverse. For decades, millions of barrels of oil seeped out of a drill off the Guadalupe Dunes in California. A culture of silence settled upon all concerned, from front-line workers who didn’t want to lose their jobs to executives who didn’t want to hurt profits.
Sixth, people in the same field begin to think alike, whether they are in oversight roles or not. The oil industry’s capture of the Minerals Management Service is actually misleading because the agency was so appalling and corrupt. Cognitive capture is more common and harder to detect.
Some final thoughts:
- If your organization is content with reacting to events it may not fare well
- Innovative, aggressive thinking is one key to surviving
- Recognition that theory is limited in usefulness is a key driving force
- Strategically nimble organizations will benefit
- Constantly question assumptions about what is “normal”
Ten blind spots:
#1: Not Stopping to Think
#2: What You Don’t Know Can Hurt You
#3: Not Noticing
#4: Not Seeing Yourself
#5: My Side Bias
#6: Trapped by Categories
#7: Jumping to Conclusions
#8: Fuzzy Evidence
#9: Missing Hidden Causes
#10: Missing the Big Picture
Lord John Browne, former Group Chief Executive of BP, sums it up well:
“Giving up the illusion that you can predict the future is a very liberating moment. All you can do is give yourself the capacity to respond to the only certainty in life – which is uncertainty. The creation of that capability is the purpose of strategy.”
The collective wisdom of the Board can help identify resources in an enterprise to prepare for uncertainty.
In a crisis you get one chance – your first and last. Being lucky does not mean that you are good. You may manage threats for a while; however, luck eventually runs out, and panic, chaos and confusion set in, eventually leading to collapse.
How you decide to respond is what separates the leaders from the left behind. Today’s smartest executives know that disruption is constant and inevitable. They’ve learned to absorb the shockwaves that change brings and can use that energy to transform their companies and their careers.
© Geary W. Sikich
Geary W. Sikich is the author of “It Can’t Happen Here: All Hazards Crisis Management Planning” (Tulsa, Oklahoma: PennWell Books, 1993). His second book, “Emergency Management Planning Handbook” (New York: McGraw-Hill, 1995) is available in English and Spanish-language versions. His third book, “Integrated Business Continuity: Maintaining Resilience in Uncertain Times,” (PennWell 2003) is available on www.Amazon.com. His latest book, entitled, “Protecting Your Business in a Pandemic: Plans, Tools, and Advice for Maintaining Business Continuity” is published by Praeger Publishing. Mr. Sikich is the founder and a principal with Logical Management Systems, Corp. (www.logicalmanagement.com), based near Chicago, IL. He has extensive experience in management consulting in a variety of fields. Sikich consults on a regular basis with companies worldwide on business-continuity and crisis management issues. He has a Bachelor of Science degree in criminology from Indiana State University and Master of Education in counseling and guidance from the University of Texas, El Paso. Geary can be reached at (219) 922-7718.
REFERENCES
Apgar, David, Risk Intelligence – Learning to Manage What We Don’t Know, Harvard Business School Press, 2006.
Brooks, David, “Our risk mismanagement,” New York Times Published on Sunday, May 30, 2010
Davis, Stanley M., Christopher Meyer, Blur: The Speed of Change in the Connected Economy, (1998).
Kami, Michael J., “Trigger Points: how to make decisions three times faster,” 1988, McGraw-Hill, ISBN 0-07-033219-3
Klein, Gary, “Sources of Power: How People Make Decisions,” 1998, MIT Press, ISBN 13 978-0-262-11227-7
Levene, Lord, “Changing Risk Environment for Global Business,” Union League Club of Chicago, April 8, 2003.
Odiorne, George, “Management and the Activity Trap,” Harper & Row Publishers; 1974, ISBN 0-06-013234-5
Orlov, Dmitry, “Reinventing Collapse,” New Society Publishers; First Printing edition (June 1, 2008), ISBN-10: 0865716064, ISBN-13: 978-0865716063
Rosenbaum, Eric, “BP, Big Oil: Meet Sarbanes-Oxley Section 302,” 22 June 2010
Sikich, Geary W., Managing Crisis at the Speed of Light, Disaster Recovery Journal Conference, 1999
Sikich, Geary W., Business Continuity & Crisis Management in the Internet/E-Business Era, Teltech, 2000
Sikich, Geary W., What is there to know about a crisis, John Liner Review, Volume 14, No. 4, 2001
Sikich, Geary W., The World We Live in: Are You Prepared for Disaster, Crisis Communication Series, Placeware and ConferZone web-based conference series Part I, January 24, 2002
Sikich, Geary W., September 11 Aftermath: Ten Things Your Organization Can Do Now, John Liner Review, Winter 2002, Volume 15, Number 4
Sikich, Geary W., Graceful Degradation and Agile Restoration Synopsis, Disaster Resource Guide, 2002
Sikich, Geary W., Are We Creating False Positives?, Continuity e-Guide, March 2004
Sikich, Geary W., “Aftermath September 11th, Can Your Organization Afford to Wait”, New York State Bar Association, Federal and Commercial Litigation, Spring Conference, May 2002
Sikich, Geary W., “Integrated Business Continuity: Maintaining Resilience in Times of Uncertainty,” PennWell Publishing, 2003
Sikich, Geary W., “It Can’t Happen Here: All Hazards Crisis Management Planning”, PennWell Publishing 1993.
Sikich Geary W., “The Emergency Management Planning Handbook”, McGraw Hill, 1995.
Sikich Geary W., Stagl, John M., “The Economic Consequences of a Pandemic”, Discover Financial Services Business Continuity Summit, 2005.
Sikich, Geary W., “Black Swans or just Wishful Thinking and Misinterpretation?” e-Continuity Guide, July 2010
Sikich, Geary W., “When is a Black Swan not a Black Swan?” ContinuityCentral.com, June 2010.
Sikich, Geary W, “Will BP’s catastrophe in the gulf be the defining moment for oil industry executives?” pending publication August 2010.
Sikich, Geary W., “BP Vortex 6” Created, June 2010.
Surmacz, Jon, “Disaster Preparedness,” CSO Magazine, August 14, 2003
Tainter, Joseph, “The Collapse of Complex Societies,” Cambridge University Press (March 30, 1990), ISBN-10: 052138673X, ISBN-13: 978-0521386739
Taleb, Nassim Nicholas, The Black Swan: The Impact of the Highly Improbable, 2007, Random House – ISBN 978-1-4000-6351-2
Taleb, Nassim Nicholas, The Black Swan: The Impact of the Highly Improbable, Second Edition 2010, Random House – ISBN 978-0-8129-7381-5
Taleb, Nassim Nicholas, Fooled by Randomness: The Hidden Role of Chance in Life and in the Markets, 2005, Updated edition (October 14, 2008), Random House – ISBN-13: 978-1400067930
Taleb, N.N., Common Errors in Interpreting the Ideas of the Black Swan and Associated Papers; NYU Poly Institute October 18, 2009
Vail, Jeff, The Logic of Collapse, www.karavans.com/collapse2.html, 2006
Van Vactor, Sam, Dr., “US Energy and the Dodd-Frank Act,” 10 August 2010