All of us can use the scouts' motto: be prepared. Now, in the time of Corona, how well prepared are we, or were we? Did we see this risk coming? Some call Covid-19 the black swan of 2020. Others think it was thoroughly predictable and that we had no reason not to be prepared.
We all work in a system. This system consists of people with their strengths and weaknesses, knowledge, habits, motivation and engagement. The system also has a culture where we work according to various formal and informal, more or less established patterns governed by purpose, values and norms. And the system has structures such as rules, routines, reporting lines and processes. Our system is relatively stable over time.
If we are well prepared, we have considered different scenarios for our system – what we do if something happens. We build barriers and create processes that come into force at a specific signal – a trigger. If so, we have done everything "right" according to ISO 9001, which says we should understand the context of the business and be clear about which risk factors exist, so that we stay at the forefront of both the threats and the opportunities that may come our way.
This system is our everyday life, and everything is under control. But we all know that exceptions happen. Then we move from a normal state to a state of emergency. Exceptions can occur in many circumstances. If we are proactive and gaze into the future, we may see risks that we have not thought about before. We can take precautions against these scenarios and strengthen our barriers, or be ready the day the opportunity becomes a reality. It may also be that a customer asks for a change late in the delivery, or a shift in the market situation requires us to pivot at speed. Another exception happens when we discover that we cannot keep what we promised the customer and must make that bitter phone call to ask for forgiveness and permission to deliver something else. All these exceptions have something in common: if we are at the forefront of the incident, we can, to some extent, control the resources and costs that follow. It is still predictable.
It is far drearier with the exceptions that take us by surprise. If we deliver our products or services to the customer in good faith and they detect errors and deficiencies, this can cost us a lot in terms of time and money. Or we may have an accident where people, the environment or assets are harmed. Very often, these exceptions were possible to predict; we just have not taken the necessary precautions or done a good enough job before delivering to the customer. And they often end up being very expensive for our business to handle: product recalls, clean-up costs, crisis meetings and everything that goes with them.
But then there are crises! These unpredictable black swans that pop up like a jack-in-the-box and that we DO NOT control. In short form, we define a black swan as "an event, positive or negative, which is considered unlikely but causes enormous consequences."
In risk theory, "low or very low probability or frequency, combined with large or substantial consequences" is treated as a separate risk category. Common risk, where we combine probability and consequence and end up with a number or colour in a risk matrix, applies in relatively stable systems.
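As a minimal sketch of that conventional approach – the 1–5 scales and the colour thresholds below are illustrative assumptions, not taken from ISO 9001 or any particular standard – scoring a risk could look like this:

```python
# Minimal sketch of conventional risk-matrix scoring.
# The 1-5 scales and the colour thresholds are illustrative assumptions only.

def risk_score(probability, consequence):
    """Combine probability and consequence (both 1-5) into a score and a colour."""
    if not (1 <= probability <= 5 and 1 <= consequence <= 5):
        raise ValueError("probability and consequence must be between 1 and 5")
    score = probability * consequence      # the "number" in the matrix
    if score >= 15:
        colour = "red"                     # act now
    elif score >= 8:
        colour = "yellow"                  # mitigate or monitor
    else:
        colour = "green"                   # accept
    return score, colour

# Example: a fairly likely event (4) with moderate consequence (3)
print(risk_score(4, 3))                    # -> (12, 'yellow')
```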
My good friend and fellow QRN ambassador, Dr Jan Emblemsvåg, has written several books and articles about risk. He clearly distinguishes between common risks, which act in stable systems with a high number of incidents, and HILF risks (HILF = High Impact, Low Frequency), which occur in highly unstable and complex systems that are unpredictable in terms of how the event unfolds and what the actual consequences will be.
"Åknesrevnen" at Sunnmøre in North-West Norway is such an unstable system. The movie "The Wave" shows a possible scenario if this mountain section avalanches into the fjord. The Norwegian Geological Survey, NGU, says that such landslides occur about four times per 10 000 years. That is a frequency of 1 landslide per 2 500 years. The consequence of such a landslide will involve approximately 3 000 dead. With conventional risk thinking, this means that we risk about one death per year as a result of such a significant landslide. Acceptable?
You might exclaim that we cannot think like that! We must do something, right? We know that "Åknesrevnen" will fall – just as we knew the similar mountain section "Mannen" would fall. In the latter case we had taken precautions, prepared warning routines and evacuated those at risk, and we could calmly stand at a distance and admire the forces of nature.
Yes, and this is precisely the point. Pandemics, terrorist attacks, major natural disasters, and single events such as Deepwater Horizon, Chernobyl and Fukushima can all be called black swans or HILF risks. When we face HILF incidents, we need to think quite differently from the usual risk mindset of corrective and preventive measures.
The major safety-critical enterprises are good at this. They have thought through many different scenarios for events and made contingency plans. The best of them test their contingency plans regularly through extensive exercises.
In the autumn of 2002, I experienced being part of an organisation that had to put its contingency plan into action. The car carrier HUAL Europe got into great difficulty in a typhoon off the coast of Japan and ran aground on the island of Izu Oshima. Everyone on board was rescued; thankfully, the only injury was a broken leg. The ship had about 4,000 cars in its cargo hold. A few days after the grounding, a fire broke out, and the vessel collapsed like a milk carton.
The press interest in the incident was enormous, and we eventually understood that our carefully prepared contingency plan did not work as intended. We literally had NRK, the Norwegian Broadcasting Corporation, inside the heart of our emergency response organisation – inside the emergency room. We had not thought through how we were going to organise ourselves, and we spent a lot of time afterwards making more thoughtful contingency plans. We trained all personnel to use the new plans, and we improved them based on learning from both the actual event and the training situations.
It may be too late to make contingency plans now – the Covid-19 pandemic is, after all, a fact. Lots of people find themselves chasing and reacting to news bulletins, rumours, real and fake news, government guidelines and everything else that comes our way.
Most workplaces and schools are closed, events cancelled, holiday trips postponed, aeroplanes grounded – yes, the whole world is grounded. The only traffic that is increasing is in the broadband and fibre networks. We wash our hands incessantly and find new ways to interact. But how well prepared were we?
It should come as no surprise that a pandemic would occur. But it is still a HILF risk, because it sits squarely within the definition: an event that occurs with low frequency, usually irregularly and unpredictably, and that causes a significant degree of disturbance when it happens.
What we can do now is learn. We must learn from all exceptions, but especially from crises. What can we do better next time? How can we be more conscious, more forward-looking, more proactive and better prepared? It may feel like we are closing the stable door after the horse has bolted, but it is now – as we all sit in our own home offices – that the experience is strongest.
If you have not already done so, this is a perfect time to write your pandemic plan. Then you are prepared the next time this happens.
Sidsel W. Storaas is in the "Exceptional Quality Leader" category. She has experience from substantial professional and leadership roles in quality and improvement management as part of business management. She has an extensive engagement in the association Quality and Risk Norway, among other things as President, lecturer at conferences and the EOQ Congress, Ambassador, educator, podcast host and representative of member companies. Sidsel holds an MSc in marine technology from NTNU and has worked in the maritime industry and in oil & gas. She has an intense curiosity and passion for quality, is a Lean Six Sigma Master Black Belt and always shares her knowledge enthusiastically. Sidsel was nominated for Interim Leader of the Year in Norway in 2019.