Risk Savvy: How to Make Good Decisions – Book Notes and Summary

Risk Savvy: How to Make Good Decisions by Gerd Gigerenzer

One-sentence summary: Risk Savvy reminds us that intuition is an indispensable part of good decision-making, not rationality’s annoying little brother.

Rating: 7/10

Author: Gerd Gigerenzer

Date Completed: December 2020

Tags: Decision-Making, Risk, Uncertainty, Mental Models, Experimentation, Optionality, Money, First Principles

Hot take: I think Gigerenzer wants heuristics and intuition to take up more space in Risk Savvy than they end up occupying, but that’s OK – the primer on Bayesian reasoning is worth the price of admission by itself.

Big Ideas

Risk and uncertainty are not the same thing.

  • Risk is “a world where all alternatives, consequences, and probabilities are known.”
  • Uncertainty is the land of unknown risks, a vast territory next to the relatively narrow world of risk. Here, as Keynes writes, we can’t calculate the probabilities. We simply do not know!
  • “Whom to marry? Whom to trust? What to do with the rest of one’s life? In an uncertain world, it is impossible to determine the optimal course of action by calculating the exact risks. We have to deal with ‘unknown unknowns.’ Surprises happen. Even when calculation does not provide a clear answer, however, we have to make decisions.”
  • The two situations require different tool kits:
    • Risk: If we know our risks, we need to use logic and statistics to come up with the optimal solution
    • Uncertainty: If we don’t know all our risks, we need to come up with quick, easy-to-use heuristics to make good decisions

Because most decisions are made under uncertainty, not risk, heuristics are an underrated decision-making tool.

  • “Calculated intelligence may do the job for known risks, but in the face of uncertainty, intuition is indispensable. Our society, however, often resists acknowledging intuition as a form of intelligence while taking logical calculations at face value as intelligent.”
  • “To assume that intelligence is necessarily conscious and deliberate is a big error.”
  • However, you can’t just use one rule of thumb for every situation – you must have a toolbox of them, like Charlie Munger’s famous “latticework of models.”
  • There are three characteristics of every good heuristic:
    • It appears quickly in consciousness
    • Its underlying reasons aren’t apparent yet
    • It’s strong enough to act upon
  • Experts often make decisions using heuristics – focusing on a couple of critical pieces of information and ignoring the rest. The expert approach can yield “better, faster, and safer decisions.”
  • Put another way: “In an uncertain world, complex decision-making methods involving more information and calculation are often worse and can cause damage by invoking unwarranted certainty.”

Why we underrate intuition

  • If intuition is so powerful, why don’t we use it more often in decision-making? There are a few reasons:
    • Defensive decision-making – we want to make calls that are explainable rather than the best possible choices
    • Group decision-making makes it hard to go with your gut
    • We don’t like feeling as though we don’t have, or haven’t considered, all the facts
  • We also make more decisions based on intuition than we think we do. We’ll often cover it with reasons we produced after the fact, or engage in defensive decision-making and go with a choice we can explain (to a partner, parent, boss, etc.) rather than the one we believe is best.
  • Common misconceptions about intuition:
    • It’s the opposite of rationality. Instead, intuition is “unconscious intelligence based on personal experience and smart rules of thumb. You need both intuition and reasoning to be rational.”
    • Intuition is female. It’s not, but men have a harder time trusting their guts (or admitting they do) than women.
    • Intuition is inferior to deliberate thinking. “Deliberate thinking and logic is not generally better than intuition or vice versa. Logic (or statistics) is best for dealing with known risks, while good intuitions and rules of thumb are indispensable in an uncertain world.”
    • Intuition is based on a complex, unconscious weighing of the evidence. This implies unconscious bookkeeping, but that’s not how intuition seems to work. Bookkeeping only helps when there are risks rather than uncertainty. “There is strong evidence that intuitions are based on simple, smart rules that take into account only some of the available information.”

Risk aversion vs. defensive decision-making

  • Risk aversion is closely tied to the anxiety of making errors. “If you work in the middle management of a company, your life probably revolves around the fear of doing something wrong and being blamed for it.”
  • An example of defensive decision-making: “A person or group ranks option A as the best for the situation, but chooses an inferior option B to protect itself in case something goes wrong.”
  • Though the two concepts seem similar, they generate different results. “The emotional fabric of defensive decision-making differs from that of risk aversion. It can lead to excessive risk-taking. If your intuition says that an investment is overvalued, but you join because everyone else invests in it, you may take undue risks.”
  • This is why error cultures in organizations matter. If you’re allowed to make educated guesses and be wrong without being punished or fired, that’s a positive error culture. If you’re blamed or even let go for making a decision “without having all the facts,” you probably have a negative error culture. Error cultures influence decision quality across the organization.
  • There’s a ton on negative error cultures and defensive decision-making – which Gigerenzer posits is almost cancerous to organizations – in this (long) summary of Moral Mazes, particularly in the Short-Term Thinking and Avoiding Decision-Making sections.
  • “Fear of blame, criticism, and litigation is the motivation for hiring the second best, making the second-best management decisions, and practicing defensive medicine. To avoid blame, people hide behind ‘safe’ procedures. Rely on big names, do what everyone else does, and don’t listen to your intuition. Put your faith in tests and fancy technology, even if these are useless or harmful.”

We tend to fear things we shouldn’t and fail to fear things we should.

  • Terrorism is scary. We’ve spent exorbitant sums (Gigerenzer cites $500B) fighting it in the wake of 9/11, but terrorism is nowhere near as deadly as car crashes or heart disease – neither of which scares us at all. What gives?
  • A similar phenomenon is that we’re not scared of going to the airport, but we’re usually pretty edgy about getting on a flight. But, as Gigerenzer notes, “If your car makes it safely to the airport, the most dangerous part of your trip is already behind you.”
  • When deaths are spread out over time, even the most common causes aren’t scary. Diabetes and heart disease kill millions of people without provoking much public fear, yet any flight that crashes – killing a tiny fraction of that number – is headline news for days. Because the crash kills many people at once, we fear it more than the things that are orders of magnitude more likely to kill us. The same is true for terrorism.

Bayesian reasoning leads to better medical decisions.

  • Most people advocate for increased screening for things like breast cancer in women and prostate cancer in men. However, these tests may do more harm than good – and even routine screenings for HIV may do the same.
  • The reason isn’t immediately apparent, but effectively: if a screening has some false positives, and those cause harm (e.g., biopsies, unnecessary treatment, and surgeries), the cost of those unintended side effects for people with nothing wrong outweighs the benefits of the screening for those who do have the disease. Prostate cancer, in particular, is pretty slow-moving – far more men die with prostate cancer than from it.
  • It’s easy to misunderstand the efficacy of screenings because studies and doctors often share unintentionally misleading statistics. Preventative screenings (in healthy people without a family history of cancer) are unlikely to decrease your risk of death and can introduce all kinds of unforeseen harms.
  • We don’t think in Bayesian terms, which makes this counterintuitive. Bayesian reasoning incorporates conditional probabilities and updates them when we receive new evidence. For example, you might want to revise your thesis that aliens don’t exist if you saw a green figure land a flying saucer on 3rd Avenue. (The worked example after this list shows why a positive screening result is less alarming than it sounds.)
  • This guide isn’t from the book but was extremely helpful in explaining Bayesian reasoning better than I can.
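
To make the screening math concrete, here’s a minimal sketch in Python using Gigerenzer-style natural frequencies – counting people rather than juggling probabilities. The prevalence, sensitivity, and false-positive rates below are illustrative assumptions, not figures from the book:

```python
# Worked screening example in natural frequencies (counts of people).
# All three rates are illustrative assumptions, not the book's numbers.

population = 1000
prevalence = 0.01           # assume 1% of those screened have the disease
sensitivity = 0.90          # assume 90% of true cases test positive
false_positive_rate = 0.09  # assume 9% of healthy people also test positive

sick = population * prevalence                   # 10 people
true_positives = sick * sensitivity              # 9 people
healthy = population - sick                      # 990 people
false_positives = healthy * false_positive_rate  # ~89 people

# Bayes' rule, reduced to a simple ratio of counts:
p_sick_given_positive = true_positives / (true_positives + false_positives)
print(f"P(disease | positive test) = {p_sick_given_positive:.0%}")  # ~9%
```

Under these made-up numbers, a test that sounds “90% accurate” still means only about 9 percent of positive results reflect actual disease – the other nine-in-ten positives are the false alarms that drive unnecessary biopsies and treatment.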

No bullshit advice to reduce your cancer risk

  • About half of all cancers are behavioral – i.e., they come from stuff we do to ourselves. If changes in lifestyle and habits could save you from developing cancer, that’s a big win. The biggest offenders:
    • Cigarette smoking causes 20%-30% of all cancers. If you don’t smoke, great! Don’t start now. If you do smoke, the best thing to do is to stop. Easy to say, hard to do.
    • Obesity, diet, and physical inactivity cause 10%-20% of all cancers. There are a few things you can do to avoid these cancers:
      • Maintain a normal body weight
      • Be physically active every day
      • Avoid fast food and sugary drinks
      • Eat a plant-based diet (if not going entirely vegetarian)
      • Avoid salted food (which Gigerenzer links to stomach cancer)
      • Get your nutrients from your diet, not from dietary supplements, which are of questionable effectiveness and can cause unintended consequences of their own
      • Breastfeed infants, if possible
    • Alcohol abuse causes about 10% of all cancers in men and 3% in women. To prevent cancer, Gigerenzer suggests going off the sauce entirely. Apparently, there are protective benefits of low alcohol use against coronary heart disease, but these are in dispute elsewhere.
    • CT scans cause about 2% of all cancers. These carry 100x the radiation of a chest X-ray and are often over-prescribed. “The risk of dying from radiation-induced cancer from a single full-body CT is higher than the risk of dying in a traffic accident.” This means we should limit exposure to these tests, particularly for children. They’re often done as an exploratory measure, rendering them unnecessary in many cases. The next time your doctor wants to do a CT to “see what’s going on,” you might reconsider.

Bits and Pieces

  • Risk aversion is more about culture than personality. “Social learning is the reason why people aren’t generally risk-seeking or risk-averse. They tend to fear whatever their peers fear, resulting in a patchwork of risks taken and avoided.” Many of the fears we easily acquire are “biologically prepared associations.” They include the fear of animals, physical objects or events, and social rejection.
  • External goals cause (some) millennial angst. “Annual polls of college freshmen showed that recent generations judged ‘being well off financially’ as more important than ‘developing a meaningful philosophy of life’ – which was the reverse of what we saw in the 1960s and 1970s.” As people’s goals have shifted more toward external things – income, social acceptance, looks, etc. – people have gotten more anxious.

    People who focus on external goals (and thus feel less in control of their lives) report higher anxiety levels. As we know, humans are passionate about control. People who believe they have more internal control – those focused on building skills rather than wealth, for example – have lower anxiety.
  • Financial predictions are often worthless because they fall into the same trap as many other predictions – believing next year will be a lot like this year. “If you look at the ten years of predictions, they consistently predicted the upward or downward trend of the previous year. In over 90 percent of the individual forecasts, the experts followed that rule…Predictions that always work except when this year is not like last year provide the false certainty of an airbag that works all the time except when you have a car accident.”
  • Any decision should be as simple as it can be, but no simpler. A few rules:
    • The more uncertain the domain, the simpler the decision-making process should be. The inverse is also true – the more risk is involved rather than uncertainty, the more complex we should make the decision and the more data it should involve.
    • The more alternatives, the simpler the process should be. Fewer options allow for more complex calculations. Complex decision-making needs to calculate risk factors, which you can’t do if there are 100 alternatives.
    • The more past data there are, the better complex decision-making works. You can model off daily price data for the past 100 years, but you probably can’t model from three days’ worth.
  • Some rules of thumb for choosing a meal at a restaurant:
    • Ask the waiter: What would you order here this evening? Note: don’t ask what they would recommend, but rather what they would eat.
    • Satisficing: choose the first option that is satisfactory or “good enough” – scan the menu until you find something that looks good, then order it.
    • Imitate your peers: Don’t open the menu – find out who at your table has visited this restaurant most often and order the dish they do. If you’re in a foreign country or in a restaurant you don’t know, this is probably a good option.
    • If you are a maximizer: Relax. Maximizing can leave you with the dispiriting feeling that you still don’t know whether you made the best choice. Getting comfortable with this is the maximizer’s job. 
  • When shopping:
    • Go for a product that’s good enough but not the best. Not “this is the best” but rather “this should get the job done.” “After all, in an uncertain world, there is no way to find the best. Even if you got it by accident, you would not know and might still look for something better.”
    • Set parameters for yourself: “look for a pair of trousers that is black, fits you and does not cost more than $100. When you find a pair that meets this aspiration, just buy it and go for a cup of coffee. Don’t search for any more alternatives…Studies indicate that people who rely on aspiration rules tend to be more optimistic and have higher self-esteem than maximizers.”

This process holds up well even for bigger choices, like a romantic partner or a place to live. “Unless the aspiration level is too high, it will lead to fast decisions. If it proves to be too high, it can be lowered, step by step.” The sketch below shows what this looks like in code.
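
Since satisficing is effectively an algorithm, here’s a minimal Python sketch of the aspiration rule described above. The trousers data, the aspiration test, and the step for lowering the bar are all hypothetical – the point is the structure: take the first option that clears the bar, and only lower the bar, step by step, if nothing does.

```python
# Minimal sketch of the satisficing / aspiration-rule heuristic.
# Data and thresholds are hypothetical.

def satisfice(options, meets_aspiration):
    """Return the first option that is good enough, else None."""
    for option in options:
        if meets_aspiration(option):
            return option  # stop searching; no further comparison shopping
    return None

# Hypothetical example from the notes: black trousers that fit, under $100.
trousers = [
    {"color": "navy",  "fits": True,  "price": 80},
    {"color": "black", "fits": False, "price": 60},
    {"color": "black", "fits": True,  "price": 95},  # first match: buy it
    {"color": "black", "fits": True,  "price": 70},  # never even examined
]

budget = 100
pick = None
while pick is None:
    pick = satisfice(
        trousers,
        lambda t: t["color"] == "black" and t["fits"] and t["price"] <= budget,
    )
    if pick is None:
        budget += 25  # aspiration proved too high: relax it, step by step

print(pick)  # {'color': 'black', 'fits': True, 'price': 95}
```

Note that the search stops at the $95 pair even though a $70 pair sits further down the list – that’s the satisficer’s trade: a fast, “good enough” decision instead of an exhaustive hunt for the best.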

Related Books:

Thinking in Bets – Annie Duke
Stumbling on Happiness – Daniel Gilbert
Optionality – Richard Meadows

You can get Risk Savvy at the library for free. If you’d like to buy it instead, you can get a copy here.