Crisis communications professionals often opt for sub-optimal crisis response strategies, preferring actions that yield immediate gains over approaches that secure long-term success. Two heuristic biases, triggered by the particular stressful conditions that come with crises, are at play: myopic loss aversion and hyperbolic discounting. Modifications to both people and their environments offer potential remedies.
There are two strategic crisis communications theories whose claims have been substantiated by a large body of empirical data over the last few decades. For that reason, these theories have been part of the training curriculum of large cohorts of senior communications professionals. Making crisis communications choices in the manner prescribed by both theories is what I will henceforth call a “rational” approach to crisis communications.
The first rational theory is Situational Crisis Communication Theory (SCCT), developed and kept current by W. Timothy Coombs (Texas A&M University). In short, SCCT posits that the ideal choice of strategic response depends to a great extent on the degree to which an organization is perceived to be responsible for the crisis at hand. The more the organization is perceived to be responsible, the more accommodating its strategy will need to be (potentially including apologies and compensation).
The second theory is Stealing Thunder Theory, initially developed by Laura Arpan (Florida State University) and Donnalyn Pompper (Temple University). It says that organizations should “steal the thunder” of their accusers by breaking the bad news of a crisis before anyone else does.
A great many communications professionals go into a crisis being well versed in both the crisis response and timing theories explained above, yet in the heat of the moment they will not apply them. What is going on?
In a crisis, intuition takes over
A 2018 experiment with a total of 80 Belgian and Dutch communication professionals has provided empirical backing for a long-held belief among anyone who has seen communicators at work in a crisis: because of specific challenges such as information overload and time pressure, communicators in a crisis will tend to fall back on sub-optimal, intuitive decision making.
On average, the participants in the experiment had about 10 years of experience in crisis communications. One half of the group received no more than four minutes to make a strategic recommendation on a scenario they had been briefed on. The remaining time was shown on screen, and they were told to keep an eye on it while writing down their recommendations for a fictitious organization in crisis. I was part of that first group. I do not remember what the scenario consisted of, only that the short time span was pretty demanding. The other half of the respondents could take all the time they wanted to formulate their advice (I would have liked that!).
The coding and analysis of the data collected for this pilot study is under way as I write this. One conclusion that An-Sofie Claeys of the University of Leuven, who oversaw the experiment, shared with me is that the group under time pressure was inclined to make crisis communications decisions in a more intuitive manner than the other group. These participants were, for example, more likely to agree that “the recommendation for the organization in crisis just came to me naturally” than those who were not under any time pressure. The results of the study will be presented in detail at Crisis6 (a Leeds Beckett University conference) later this year.
Is intuition always a bad thing to lean on? That depends on what kind of intuition you are talking about. Professionals will at times resort in their practice to what are called “expert schemas,” meaning that a good course of action comes to mind based on past experience. However, while intuition can inform sound judgement when it is based on experience, crisis situations, which by their very nature are often novel, connect poorly with prior experience. In such circumstances communicators will often fall back on another kind of intuition, one that behavioral economists call “heuristics” and that in common parlance is, well… nothing more than “gut feeling.” This is where things quickly go awry, because heuristics are often rife with biases.
Two heuristic biases at play in a crisis
Behavioral economics helps us understand through which biased heuristics people make decisions that rationally don’t make much sense. The aforementioned SCCT and Stealing Thunder approaches reflect a rational approach because they are evidence-based. With tried and tested options available, what keeps communicators from acting rationally in a crisis? Claeys and Coombs made a thorough analysis of the research on heuristic biases and, in a recent article in Communication Theory, propose two biases as likely culprits.
The first bias is myopic loss aversion. It makes people more sensitive to losses than to gains (this insight is a cornerstone of behavioral economics) and also gives them a tendency to evaluate outcomes frequently. The second bias is hyperbolic discounting, which makes present rewards weigh more favorably than future rewards.
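For readers who like to see the mechanics, hyperbolic discounting is commonly modeled with the simple formula V = A / (1 + kD), where A is the reward, D the delay, and k a discount rate. A minimal sketch, with a purely hypothetical value of k (nothing here comes from the studies discussed):

```python
def hyperbolic_value(amount: float, delay: float, k: float = 0.5) -> float:
    """Perceived present value of `amount` received after `delay` periods,
    using the hyperbolic discounting model V = A / (1 + k * D)."""
    return amount / (1 + k * delay)

# An immediate smaller reward can feel more attractive than a larger,
# delayed one -- the signature of hyperbolic discounting.
now = hyperbolic_value(amount=100, delay=0)    # 100.0
later = hyperbolic_value(amount=150, delay=4)  # 150 / 3 = 50.0
assert now > later
```

The exact value of k varies per person and situation; the point is only that the perceived value of a future outcome shrinks quickly with delay.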
Sounds complicated? It does not have to be. Let’s illustrate both biases with an example, taking the (sadly not unlikely) scenario of a healthcare company that has fallen victim to a data breach. The company is in large part to blame for the incident, because the party who stole the data got in through a security hole the company had left open. The rational, optimal approach would be to communicate proactively about the data breach to all concerned (Stealing Thunder Theory) and to communicate about the crisis in a very accommodating way: assuming responsibility, and possibly expressing apologies and even offering compensation (SCCT).
A team that falls prey to the aforementioned heuristic biases, however, will not follow the rational approach. An intuitive approach in which both biases are at play will consist of not acting proactively, in order to avoid an immediate hit to reputation. This is myopic loss aversion at work. That first potential “loss” is never really avoided, of course, since the reputational consequences will always prove worse when stakeholders find out at a later time that the company knew about the breach.
An initial approach might also, if and when the news comes out, contain a knee-jerk denial strategy, which the communicator(s) may consider an option that secures a short-term benefit. Here we see hyperbolic discounting at work. The immediate (very temporary) gain of warding off reputational damage will soon disappear, however, once it becomes clear that the denial strategy cannot be sustained because the company is indeed responsible for what went wrong.
So: we absolutely want to avoid an immediate loss of reputation, and in the heat of the crisis we think that the temporary gain of warding off reputational loss outweighs the potential long-term gains to be had by taking the other road.
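To see how the two biases reinforce each other, here is a minimal numeric sketch with made-up payoffs (none of the numbers come from the research cited; the loss-aversion coefficient of roughly 2.25 is a commonly quoted estimate from prospect theory):

```python
LAMBDA = 2.25  # loss-aversion coefficient: losses loom ~2.25x larger than gains
K = 0.5        # hypothetical hyperbolic discount rate

def perceived(outcome: float, delay: float) -> float:
    """Biased, in-the-moment value of an outcome: losses are over-weighted
    (myopic loss aversion) and delayed outcomes are hyperbolically discounted."""
    weight = LAMBDA if outcome < 0 else 1.0
    return weight * outcome / (1 + K * delay)

# Stealing thunder: an immediate reputational hit, a larger long-term recovery.
steal_thunder = perceived(-40, delay=0) + perceived(100, delay=6)  # -90 + 25 = -65.0
# Denial: a small immediate gain, a much larger reputational loss later.
deny = perceived(20, delay=0) + perceived(-120, delay=6)           # 20 - 67.5 = -47.5

# The biased evaluation favors denial...
assert deny > steal_thunder
# ...even though the plain, undiscounted totals favor stealing thunder.
assert (-40 + 100) > (20 - 120)
```

The payoffs are invented purely for illustration; what matters is the reversal: over-weighted immediate losses plus steeply discounted future outcomes make the irrational option look like the better one.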
Claeys and Coombs do not offer solutions for the biases that cause crisis communications professionals to resort to sub-optimal heuristics. So, what can be done?
Based on the existing literature on debiasing, as summarized by Jack B. Soll (Duke University), Katherine L. Milkman (University of Pennsylvania) and John W. Payne (Duke University) in their chapter on the topic in The Wiley Blackwell Handbook of Judgment and Decision Making, different options present themselves for preparing both people and their environments.
My article is not meant to offer an exhaustive overview of all the debiasing techniques in the literature (I will revisit this topic at greater length at a later date), but the following techniques present themselves as prime candidates for helping organizations avoid the pitfalls of the heuristics at hand.
Before I list the different techniques, an important remark needs to be made. It goes without saying that the tools and processes laid out in any professional crisis communications manual can and should help mitigate the cognitive overload and time pressure that trigger the heuristic biases in the first place. This is, among other things, a matter of designing efficient work processes in which no tasks are escalated to the decision-makers that can be handled at lower levels of execution.
Changes to the person
“Is there a reason you had to mention the catering services in your crisis communications manual?” a client once asked me. Well, there was and there is. Being hungry and fatigued lowers people’s decision readiness. This means that processes need to be put in place that guarantee the care of those who make decisions. Where possible, there needs to be room for working in shifts, small breaks need to be allowed and even enforced and, as I said, it is no small detail to make sure that somebody orders food in time.
Education is another road to take when preparing decision-makers. Communicators not only need to be trained in the theories they should apply when deciding on the right course of action in a crisis; in many cases that is not where things go wrong. They also need to be made aware in advance of the existence of heuristic biases and of the need to use tools that counter them. This brings us to the next point.
Changes to the environment
One debiasing tool that will come in handy in any crisis communications decision-making context is a checklist. I mentioned how long-term benefits are sacrificed for short-term gains when the combined heuristics of myopic loss aversion and hyperbolic discounting ravage the rational thought processes of communicators. The crisis communications manual can provide crisis communicators with a checklist prompting them to reflect on the expected benefits and costs, both short-term and long-term, of every strategic decision they consider.
Another way to counter biased heuristics is to embed the debiasing in an organization’s routines and culture. This can be as simple as senior management regularly repeating a proverb to make sure that a lesson is well heard, assimilated, and never forgotten. I once worked on a training program with a communications manager who repeated an anecdote about a past crisis, mishandled due to biased thinking, in every meeting where crisis preparedness came up. I am confident that he made sure they learned their lesson.
We still have a long road ahead in understanding all the biases that come into play in crisis communications decision-making and how they can best be countered. The groundwork laid by Claeys and Coombs in applying the existing corpus of behavioral economics knowledge to crisis communications is both exciting and promising. In-house professionals and consultants would do well to follow this research closely if they want to serve their clients well in preparing for any crisis.
Did you enjoy this piece on heuristic biases in crisis communications?
You might also like my interview with Paul Gibbons where he talks about the need for change managers to apply more scientifically grounded approaches.