
The Ostrich Paradox: Why We Underprepare for Disasters

May 14, 2025
"When individuals are unsure how best to prepare for a disaster, they often choose the option that requires the least amount of mental effort.

In 2017, Robert Meyer and Howard Kunreuther wrote a book that people working in disaster preparedness consider a must-read, The Ostrich Paradox: Why We Underprepare for Disasters.

Risk identification and consistent, meaningful evaluation can prove challenging, with security practitioners split between qualitative and quantitative risk analysis as two different approaches to assessing and managing risk.
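
To make that distinction concrete, here is a minimal Python sketch contrasting the two approaches: a qualitative likelihood-times-impact matrix versus the common quantitative Annualised Loss Expectancy formula (ALE = SLE x ARO). The ratings, thresholds and figures are illustrative assumptions only, not drawn from any real assessment.

```python
# Minimal sketch contrasting qualitative and quantitative risk analysis.
# All ratings, thresholds and figures are illustrative assumptions.

# Qualitative: combine ordinal likelihood and impact ratings via a simple matrix.
RATINGS = {"low": 1, "medium": 2, "high": 3}

def qualitative_risk(likelihood: str, impact: str) -> str:
    """Combine two ordinal ratings into a coarse risk level."""
    score = RATINGS[likelihood] * RATINGS[impact]
    if score >= 6:
        return "high"
    if score >= 3:
        return "medium"
    return "low"

# Quantitative: Annualised Loss Expectancy (ALE) = Single Loss Expectancy (SLE)
# multiplied by the Annualised Rate of Occurrence (ARO), giving a currency
# figure that lets competing controls be compared directly.
def annualised_loss_expectancy(sle: float, aro: float) -> float:
    return sle * aro

if __name__ == "__main__":
    print(qualitative_risk("medium", "high"))         # -> high
    # Hypothetical: a 200,000 loss event expected roughly once every four years.
    print(annualised_loss_expectancy(200_000, 0.25))  # -> 50000.0
```

The qualitative matrix is quick and intuitive but ordinal; the quantitative formula produces a number that can be weighed against the cost of a control, which is exactly where the biases below tend to creep in.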

The authors identify six decision-making biases that cause individuals, communities and organisations to underinvest in protection against low-probability, high-consequence events, along with strategies to improve risk-based decision making.

Those six human vulnerabilities are:

  • Myopia - a tendency to focus on overly short future time horizons when appraising immediate costs and the potential benefits of protective investments.
  • Amnesia - a tendency to forget too quickly the lessons of past disasters.
  • Optimism - a tendency to underestimate the likelihood that losses will occur from future hazards.
  • Inertia - a tendency to maintain the status quo or adopt a default option when there is uncertainty about the potential benefits of investing in alternative protective measures.
  • Simplification - a tendency to selectively attend to only a subset of the relevant facts to consider when making choices involving risk.
  • Herding - a tendency to base choices on the observed actions of others.

How does human bias lead to poor risk decision making?

Not sure how those behaviours play out? The authors offer some examples:

"We fail to evacuate when advised to do so. We rebuild in hazard-prone areas after experiencing a disaster. We don’t wear helmets when riding motorcycles. We often purchase insurance only after experiencing a disaster and then cancel our policy a few years later if we haven’t made a claim."

The legendary Bruce Schneier has previously written about the difference between perceived and actual risk and how it explains many security trade-offs made in haste. He states: "People have trouble estimating risks for anything not exactly like their normal situation... people overestimate risks that are being talked about and remain an object of public scrutiny."

A great example of this is the FBI's annual Internet Crime Report, produced by its Internet Crime Complaint Center (IC3), which shows that whilst ransomware attacks make the headlines, the less technical email threat vector of Business Email Compromise has for each of the past five years led to reported financial losses 60 to 80 times greater, depending on the year in question.

Robert Meyer and Howard Kunreuther propose a 'behavioural risk audit' that anticipates how these decision-making biases will play out and designs protective measures around them. By using framing techniques that make long-term risks feel immediate, and by making baseline protections the default, inertia and lethargy start working in our favour: few people will choose to remove recommended security controls they already have.
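
To see why the framing technique works, consider a small sketch: a modest annual probability, compounded over a realistic planning horizon, becomes much harder for myopia and optimism to wave away. The 1% annual likelihood and 30-year horizon below are illustrative assumptions, not figures from the book.

```python
# Illustrative sketch of the time-horizon framing technique: a small annual
# probability looks very different when expressed over a longer horizon.
# The 1% annual figure and 30-year horizon are hypothetical.

def cumulative_probability(annual_probability: float, years: int) -> float:
    """Probability of at least one occurrence over `years` independent years."""
    return 1 - (1 - annual_probability) ** years

if __name__ == "__main__":
    annual = 0.01   # hypothetical 1% chance of a damaging incident per year
    horizon = 30    # hypothetical planning horizon in years
    p = cumulative_probability(annual, horizon)
    print(f"{annual:.0%} per year ~= {p:.0%} over {horizon} years")  # roughly 26%
```

A 1-in-100 annual chance reads as ignorable; roughly a 1-in-4 chance over 30 years does not, even though the underlying risk is identical.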

Lastly, they propose 'seals of approval' to recognise excellence and the effort invested in exceeding the bare minimum standard. Think assurance frameworks and certification of systems. Perhaps the security baselines set out in regulatory compliance have an upside after all?
