Thanks for this useful corrective. The example that always got to me was the one where people conclude that it's more likely that Linda, after her activist youth, is a bank teller and a feminist than that she is a bank teller. In an everyday-reasoning sense, it IS more likely, even though statistically that makes no sense. But when I learned that about the only group of people who reliably get the Wason selection task (i.e., know how to disprove a conditional) are philosophy graduate students (as I was at the time), that cemented my sense that the critical studies of human rationality were probably looking for the wrong thing. If it just means that your thinking doesn't always conform to the canonical rules of logic, so what?
Yeah there's a big difference in performance of people in these sorts of abstract, artificial scenarios versus "ecologically valid" scenarios for sure. I'm currently working on a post that talks a bit about this and the Wason selection task!
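For anyone unfamiliar with the task mentioned above: you see four cards (say A, K, 4, 7), each with a letter on one side and a number on the other, and must pick which cards to flip to test the rule "if a card shows a vowel, the other side shows an even number." A minimal sketch of the logic (card faces and rule are the standard textbook version, not anything from the post):

```python
# Wason selection task: test the rule "if a card has a vowel on one side,
# it has an even number on the other." A card is worth flipping only if
# it could possibly falsify the rule (vowel paired with an odd number).

VOWELS = set("AEIOU")

def must_flip(face: str) -> bool:
    """Return True if flipping this card could reveal a rule violation."""
    if face.isalpha():
        # A letter card can only violate the rule if it's a vowel
        # hiding an odd number on the back.
        return face.upper() in VOWELS
    # A number card can only violate the rule if it's odd
    # and hiding a vowel on the back.
    return int(face) % 2 == 1

cards = ["A", "K", "4", "7"]
print([c for c in cards if must_flip(c)])  # ['A', '7']
```

Most people flip A and 4; the 4 tells you nothing (an odd letter behind it wouldn't break the rule), while the 7, which could hide a vowel, is the card people miss.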
When I lived in Africa, I asked why folks would go to witch doctors for healing when modern medicine was available. They explained that the witch doctor was much cheaper, and if that treatment didn't work, they could then spend more money on the stronger magic of the Western-trained doctor.
I suppose in the lab it's really hard to replicate the noise of daily life. For example, distraction by other things (stress and impulsivity being mood states that leak across contexts) must favour heuristics. That said, I appreciate the lab has its own unique cognitive pressures.
The marshmallow example is interesting because I've read that it predicts success in later life, but that might be because it really measures how stable your home environment is, which would itself be beneficial for success.
It can be both. If your environment teaches you to make very short term decisions, you're not going to do well in more stable situations where long term thinking is better.
This is an insightful analysis. I think people are often more rational than we give them credit for. We're just not always aware of all the issues going into their decisions. Even when they're actually being irrational, it's frequently far more nuanced once we're more aware of those factors.
Get a group of friends together and play this game. Ask them all what they think the height of the Eiffel Tower is. They should answer with a range, e.g. 1 m–1000 m, chosen so that they're 90% sure the true value falls inside it.
Collect all the answers. You'll typically find that only around 40% of the ranges actually contain the answer. People believe they are a lot more accurate than they are.
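The scoring step of the game above can be sketched in a few lines. The true height (~330 m with antennas) is real; the player names and guesses are made-up example data chosen to illustrate the ~40% hit rate:

```python
# Score the Eiffel Tower calibration game: each player gives a range
# meant to contain the true height with 90% confidence.

TRUE_HEIGHT_M = 330  # current height including antennas

guesses = {  # player -> (low, high) in metres; illustrative data only
    "ana":   (250, 400),
    "ben":   (100, 200),
    "carla": (300, 320),
    "dev":   (50, 500),
    "emma":  (150, 300),
}

# Count how many ranges actually contain the true value.
hits = sum(low <= TRUE_HEIGHT_M <= high for low, high in guesses.values())
hit_rate = hits / len(guesses)
print(f"{hit_rate:.0%} of ranges contained the answer")  # 40% of ranges contained the answer
```

If everyone were well calibrated, about 90% of ranges would contain the answer; a hit rate far below that is the overconfidence the game demonstrates.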
But then perhaps a lack of intellectual humility, or an intellectual arrogance, is incredibly beneficial for getting shit done.
Good post, it made me rethink a few things.
The "Bias Bias" bit is especially useful. It holds for news as well as "studies."
Very good
Honestly, I thought the opening was gonna be a "where's the goat" Monty Hall thing.