An important problem with the way we are looking for answers is that we don’t know what questions we are looking to answer in the first place, says Rob Briner, expert in evidence-based management.
Earlier, I spoke with Paul Gibbons about the need for the change management practice to “science up”. Where Gibbons focuses on change management, Rob Briner works on developing a methodology that lets practitioners in all sectors apply evidence-based management.
Briner is a Professor in Organizational Psychology at Queen Mary University of London and is also the founder and scientific director of the Center for Evidence-Based Management (CEBMa).
I had a talk with Briner on why evidence-based management should be considered important by communications professionals, what evidence-based management really is (and what it is not), and how communications practitioners can get to work implementing the core principles of evidence-based management in their daily routine.
Methodology of evidence-based management
The evidence-based approach yields solutions for problems that matter.
Jo: First things first: Why do communications professionals need to get involved in evidence-based management in the first place?
Rob: If you want to focus on important problems and opportunities and you want to provide solutions that work, you will need to adopt an evidence-based practice approach. This is about doing not only what works but also what is important to the organization that you serve. Often managers and consultants will do things that might work, but in ways that are not important to their organization. The evidence-based approach yields solutions for problems that matter.
Jo: What exactly is the difference then between being data-driven and applying evidence-based management principles?
Rob: “Data-driven” does not mean anything. Are you using data? Good, well, everybody is. The evidence-based management approach is a more particular and specific thing. It uses multiple sources and does that in a critical way. Often people use poor quality data. That is why you need critical appraisal of the data.
In a note I wrote for the SHRM Executive Network I talk about six steps. In step one, you translate the perceived problem into answerable questions. In step two, you acquire evidence from four different sources. In step three, you appraise the obtained evidence and make a judgement about its trustworthiness. In step four, you pull together the best available evidence from each of the four sources to build an overall picture of the answer to the questions. In step five, you make better-informed decisions based on the answers you found, and finally, in step six, you evaluate the effects of any decisions that were made.
Evidence-based management in consultancy
Often research is limited to reading a single article from the Harvard Business Review.
Jo: It feels to me like communications consultants are playing catch-up to management consultants who are already making a turn towards the evidence-based approach.
Rob: It’s funny that you would say that because in my experience, none of them apply evidence-based management. I am not saying they do not use any evidence at all, but the process they apply has nothing to do with evidence-based management. They have a typical product or service they want to sell, and they will try to make it fit. Over the last few years, CEBMa has had conversations with different management consulting firms. Initially these conversations were enthusiastic, but at the end of the day the consultants walked away. Consultants want the work. The incentive is to get to work. A lot of management consulting clients will ask a consultant if he or she can do something for them. Many times, if a consultant then pushes back too much and says they cannot take on the assignment because they first need to know what the problem is, the potential client will go to someone else.
This is now just anecdotal, but from what I have seen with the management consultants I have worked with, it appears that most firms do not have people who have the skills to synthesize the scientific evidence. They don’t know how to scan the scientific publications, they don’t always understand what is in the articles. So often research is limited to reading a single article from the Harvard Business Review.
Jo: I am feeling better about the communications consultancy practice now, for which I thank you. You have mentioned the need to consult four different sources. Can you elaborate on this?
Rob: In an evidence-based management approach, people should consult their own professional knowledge, organizational data, scientific knowledge, and finally they should also consult their stakeholders.
Jo: Do they need to consult sources from all four buckets? I presume you want to have as many sources triangulate the findings from as many other sources as possible, correct?
Rob: Yes and no. Triangulation is part of it, but it is not all. The issue with triangulation is that it implies a certain level of precision. Triangulation is a concept from surveying and navigation. A ship can only be at one place. But in daily life you will often get different stories, where a conflict between different truths will be the norm.
Jo: So where does that leave us?
Rob: You have to understand why there are conflicts in your findings. Suppose that scientific evidence says that A causes B and organizational data says something else. You then need to ask yourself why that is the case. Maybe the scientific evidence is biased. Or maybe the organization is unique in a way that means the findings from the available scientific research do not apply. So next to triangulation you have contextualization, which is of great importance.
Jo: Do you always need four sources? Is simply relying on my professional expertise not enough at times?
Rob: What matters most is that you go through a well-structured approach. As for which sources are concerned: one source of evidence is problematic. Two is better. Three beats two for sure. The answer to your question depends on the context.
Evidence-based management in communications
Science does not change every day. If you would read the newspaper every day, you would think it does, but it does not.
Jo: I am still a bit unsure about what all of this means to the daily practice of the communications professional. How could a consultant, for example, know how many types of sources need to be consulted for any given problem? I know consultants who do workshops and have happy customers. They do not ask themselves a great many questions on the use of different types of sources.
Rob: It depends on the question. You said something of great importance. You said they had happy customers. So what does that mean? Did the customers like the offering? Did they pay the consultants again? If your criteria for success are that they did not complain, then you are good. If your criteria for success are more ambitious, it might not be enough.
So this really brings us back to figuring out what the problem is. Do you need help catching a bus? You have done it all your life. No, you don’t need another source beyond your own experience when you are trying to catch the bus. But what if you decided you want to reduce your wait? Maybe you need another source now and look at the timetable, for example. Or maybe you will be best served looking online at the bus tracker. It all depends on the question you are asking.
For professional knowledge to be worth something as a source of evidence, you need practice, instant feedback between the decision and the outcome you are looking for, and a relatively stable environment. You might have given some workshops at different times of day; maybe at times you felt great, at other times you felt flat. Then it becomes difficult to link what you have been doing with the outcome of your work.
Jo: Enter the importance of science.
Rob: Correct. Science gives us a repeatable design, a stable environment where many variables are kept under control. But in some instances scientific research will not be of help, even if you could really use it, because it is possible that the one important study was done with a group of people who have very little in common with your clients or their needs.
Jo: I have a question about the need to take stakeholders’ concerns into account. What do you do if they are biased?
Rob: That is fine. Everything is biased. The question is: In which way are they biased? Let’s say you are introducing a change program. All sources lead you to believe that weekly meetings between the managers and the team on Thursday at lunch time will be the measure you need to take. But maybe the stakeholders, could be the managers themselves actually, will tell you – for whatever reason – that they don’t like the idea. You could say: They are biased. But you would probably not want to pursue the idea because your stakeholders are not going to want to do it.
Here is another example. A doctor might recommend a treatment that comes with an injection. Stakeholders might hate needles. You can then frown on their attitude and say their bias really makes no sense. But that does not make their bias disappear. Another approach could consist of looking at another way of doing things and maybe there is a treatment available that does not need doctors working with needles.
Jo: “What is science anyway? It changes every day,” some people will say. I am a communications professional. There are many thousands of articles that contradict one another. How can I possibly find the scientific consensus in all of this?
Rob: Science does not change every day. If you would read the newspaper every day, you would think it does, but it does not. What is relevant is the body of evidence and that does not change that much.
Putting evidence-based management into practice
If you find all the answers you are looking for, you have done something wrong.
Jo: Should I read meta-studies to inform myself on the scientific body of evidence?
Rob: Of course you should. If you go to the CEBMa website you will find guidance on how to best proceed. In general, reading a single study is pointless and misleading. One study might be useful to help you understand a concept, or a method, but it does not help you get an overview of the body of evidence.
Jo: Practitioners could go to the website of the Institute for Public Relations (IPR), for example, or other places where knowledge is summarized and consolidated. Is that the recommended heuristic to follow?
Rob: It depends on what access IPR has to which databases. The key is to develop some skills. Go to the CEBMa website and you will find useful resources (Rob is referring to CEBMa’s Guideline for Critically Appraised Topics in Management and Organizations).
What comes first is always asking the right question. Take crisis communications, to stay close to your field of expertise. Let’s say there has been an incident with glass found in baby food. In communications, you will first and foremost try to identify the problem you will need to fix. Is the problem about harm? About impression management maybe? Do we need to make customers feel confident? Only once the question – the problem that matters – has been defined, can you start digging around and consulting academic research, or any other source.
Jo: What do you do if you do not find all the answers?
Rob: If you find all the answers you are looking for, you have done something wrong. [laughs] Unless it is a math problem. Most of the time there is not an answer. It will not be 4. Or 5.8. Or 3.993.
Jo: There is only an approximation of an answer.
Rob: Yes, or there might be several answers.
Jo: What would you say to people who hold against evidence-based management that it is costly and will slow down the decision-making process?
Rob: My answer to that remark is: Was whatever you did worth doing then, if you did not know what the problem was, and what was more likely to work than not? Slowing down the decision-making process is definitely good. The more you slow down the process, the more you will manage to think critically and prevent System 1 biases. If you are trying to make up your mind about which pizza to order you do not need to complicate things for yourself, but if you are considering spending half a million dollars on a training program, then you are best off not making an intuitive decision.
Jo: Would you agree with me that I can consider the evidence-based approach to be a solid safeguard against pseudoscience? It is not something I have seen you mention.
Rob: It is not helpful to label anything as pseudoscience. A lot of claims made in very reputable journals count, to me, as pseudoscience. A lot of science is terribly wrong, because it fails to replicate, for example. So to state that something is pseudoscience and not science is also a problem. You cannot simply distinguish between the two. The key thing is to gain the skills to weigh the claims being made. Where is the data coming from? Do the authors have a vested interest? Those are the questions that need to be asked.
Applying evidence-based management to the COVID-19 response
Jo: If you look at how the UK government managed the COVID-19 outbreak, did the evidence-based approach prevail?
Rob: Well, I have elaborated on this not so long ago in an article for People Management.
Initially, they were saying they were following the science. That doesn’t really make much sense because there is no such thing as “the science”, of course. I think they were overly influenced by a few enthusiasts, people who were specialized in nudging, for example. So they were overly dependent on a few experts, that is my impression. They were also not clear enough about how everything was coming together from different sources. Other countries were more transparent about this, about how they consulted the stakeholders (the industry, for example). Keep in mind that there was not a lot of experience present in the government. Many of them had not been in parliament that long, and some were ministers for the first time.
Jo: So was the problem that there was a lack of communicating well about the different sources that had been consulted, or was the problem more that simply not a lot of different sources had been consulted?
Rob: That is hard to know, isn’t it? There was public disagreement between the scientific advisors. At some briefings the advisors were present, at others they were not. The spokespeople were providing scientific evidence at certain times, but not at others. For days and days they would report on how many new tests were being done, and then it turned out they were not being quite honest about that, and then they suddenly stopped reporting on the tests for a while. So there were issues around consistency for sure. Have you watched Cuomo’s briefings on the COVID-19 outbreak in New York City?
Jo: I have. He is a great communicator, isn’t he?
Rob: What I like about how he communicates is that we see part of the evidence-based approach at work here. He talks about what he knows, what he is unsure about, and what he really knows nothing about. This manner of communicating is important in gaining trust and buy-in from your audience.