Intelligent Risk Management

Lessons for The Private Sector from the January 6 Intelligence Failure

11/23/2021

   Steve: Jay, have the investigations into the January 6 assault on the US Capitol revealed any lessons for private sector organizations?
   Jay: While we lack access to the most sensitive information, it’s clear from open sources that intelligence agencies and federal and state law enforcement tragically underestimated the scope, size, and intensity of the protestors gathered just a few blocks from the US Capitol. From those accounts and my own experience, Steve, I want to pass on to our private sector audience the single most important takeaway from what happened. It was, by definition, an intelligence failure, which occurs when analysts and their managers, for multiple reasons, fail to anticipate an event or development that affects clearly understood US interests, despite having information that could have mitigated the risk. How bad can this get? By all official accounts, the fallout from the January 6 assault has been catastrophic in terms of loss of life, property damage, and the undermining of our political institutions. The effects will be long lasting.
   Steve: This notion of intelligence failures clearly applies to our audience. Organizations often have all the data they need to make sound decisions, but emotions, biases, and time pressure can derail the process when the stakes are highest. I know from my own experience that such failures can happen to any organization at any time, regardless of its success rate in making good decisions. No organization is immune, and the more an organization is convinced that its track record inoculates it against such intelligence failures, the more vulnerable it is to bad decisions. I also strongly believe that there is no single point of failure. These are systemic failures that prevent organizations from objectively assessing risk, especially under the pressure of high-stakes decisions.
   Jay: Good point. In the case of the January 6 assault, multiple post-mortems made clear that even with vast analytic and data-collection resources that signaled the extent and intent of dozens of extremist groups, law enforcement agencies misread the evidence. In fact, a lone senior DHS analyst had gathered reporting from all over the country that should have been a flashing red light to decision-makers, but he was ignored and not even invited to planning meetings. The Capitol Police’s (CHP’s) last “Special Event Assessment,” issued on January 3, focused its bottom-line-up-front (their label) only on the expected efforts by senators and members of Congress to stop the certification process. CHP analysts raised the potential for violence only in the last paragraph, labeled their “Overall Analysis,” and even then assessed only that it “cannot be ruled out.” The FBI’s analysis was no more accurate, assuming that the language in the reporting was “aspirational” and, therefore, protected by the First Amendment.
   Steve: These were critical errors in judgment, and they make the crucial point of Intelligent Analysis: immunity from bad high-stakes decisions is an illusion. It would be easy for our private sector audiences to claim (incorrectly) that the intelligence services were simply incompetent, that their own organizations don’t operate like that, and that their decision-making processes are airtight. But I have seen these same failures in decisions across business areas and within organizations, failures that could have been avoided.
   Jay: Right. Decision making is a minefield, but the analytic tools and processes set out in our book can help break down this sense of invulnerability that gets in the way of objective, evidence-based assessments. Here are three areas where public and private sector challenges overlap.
  • First, the failure to warn. Private sector organizations need a system in place that can react quickly when an event or development threatens the organization. Warnings consist of three elements: “what, when, and how bad.” To make this work, risk analysts must see themselves as “trip wires,” and to play that role they need to know which issues matter most and why, so they can monitor for changes.
  • Second, the failure to test assumptions. This was an egregious shortfall in the pre-January 6 analysis. Our book goes into great depth on testing the assumptions that underpin assessments, so that their soundness is established before decisions are implemented. In this regard, the Key Assumptions Check is a crucial tool that requires evidence-based support for key judgments. In the case of the CHP, it would have pushed the warning up front in the memo and raised the warning probability to “highly likely” instead of merely “cannot be ruled out.”
  • Third, the failure to bring all stakeholders to the table. Intelligence failures, like what happened in the run-up to January 6, are failures of process; they are systemic. A good system of warning and assumption testing is necessary but not sufficient. In my experience, the “answer is in the room,” meaning that in open discussions with all the stakeholders around the table, someone, often a new or less experienced participant, has the insight that can change the entire conversation. In the case of the law enforcement community, however, the answer, the analyst with the data, was not in the room, and no matter how much he tried to call attention to the risks, he was dismissed. Seniority and experience do not trump expertise and data. What leaders needed to know, what they should have demanded to know, was there.
They all should have known better. 

    Authors

    Jay Grusin & Steve Lindo

