See Also: Book Notes, (me), Notes on Consciousness, Happiness Hypothesis, Consciousness and the Brain, Human: Makes Us Unique, Righteous Mind, Consciousness: Confessions, Blank Slate, Neuroscience of Human Relationships, Thinking, Fast and Slow

For No Good Reason

Musings on why human reasoning is unreasonable, in preparation for the First Parish Salon: Human Reasoning. Wikipedia: Confirmation bias; Wikipedia: Reason.

Why facts don’t change our minds, New Yorker, 2017

This article has lots of insights from books and research. The gist is that "people can’t think straight" ... "any graduate student with a clipboard can demonstrate that reasonable-seeming people are often totally irrational. Rarely has this insight seemed more relevant than it does right now. Still, an essential puzzle remains: How did we come to be this way?"

The article refers to the book “The Enigma of Reason” (Harvard), in which the cognitive scientists Hugo Mercier and Dan Sperber take a stab at answering this question...


"If reason is designed to generate sound judgments, then it’s hard to conceive of a more serious design flaw than confirmation bias."

Stripped of a lot of what might be called cognitive-science-ese, Mercier and Sperber’s argument runs, more or less, as follows: Humans’ biggest advantage over other species is our ability to coöperate. Coöperation is difficult to establish and almost as difficult to sustain. ... Reason developed not to enable us to solve abstract, logical problems or even to help us draw conclusions from unfamiliar data; rather, it developed to resolve the problems posed by living in collaborative groups.

This lopsidedness, according to Mercier and Sperber, reflects the task that reason evolved to perform, which is to prevent us from getting screwed by the other members of our group. Living in small bands of hunter-gatherers, our ancestors were primarily concerned with their social standing, and with making sure that they weren’t the ones risking their lives on the hunt while others loafed around in the cave. There was little advantage in reasoning clearly, while much was to be gained from winning arguments.

We are not good at spotting our own faulty reasoning. However, a clever experiment shows we are much better at finding problems in the reasoning of 'others', even when it is actually our very own reasoning that we are critiquing.

In step three, participants were shown one of the same problems, along with their answer and the answer of another participant, who’d come to a different conclusion. Once again, they were given the chance to change their responses. But a trick had been played: the answers presented to them as someone else’s were actually their own, and vice versa. About half the participants realized what was going on. Among the other half, suddenly people became a lot more critical. Nearly sixty per cent now rejected the responses that they’d earlier been satisfied with.

We have many, many biases. See below:
evolutionary benefit of “reason” is social network calculus / winning arguments
Much better at spotting others’ errors
Social intuitionism ballpark. Reason developed to help us convince others.

Good blog post on the Cognitive Science of Rationality

Cognitive Science of Rationality

A more academic take: the Evolution of Human Reasoning

Dominance Hierarchies and the Evolution of Human Reasoning
Denise Dellarosa Cummins

...the necessity of reasoning effectively about dominance hierarchies left an indelible mark on primate reasoning architectures, including that of humans. In order to survive in a dominance hierarchy, an individual must be capable of (a) making rank discriminations, (b) recognizing what is forbidden and what is permitted based on one’s rank, and (c) deciding whether to engage in or refrain from activities that will allow one to move up in rank.

"Smarter people are more vulnerable to these thinking errors" (biases)

The philosopher, it turns out, got it backward. A new study in the Journal of Personality and Social Psychology led by Richard West at James Madison University and Keith Stanovich at the University of Toronto suggests that, in many instances, smarter people are more vulnerable to these thinking errors. Although we assume that intelligence is a buffer against bias—that’s why those with higher S.A.T. scores think they are less prone to these universal thinking mistakes—it can actually be a subtle curse.

Twitter: @JonHaidt quoting @tylercowen: “Every time you frame something as good vs. evil, your IQ goes down 10 points.”


1. Anchoring bias. People are over-reliant on the first piece of information they hear. In a sales negotiation, whoever makes the first offer establishes a range of reasonable possibilities in each person's mind.

2. Availability heuristic. People overestimate the importance of information that is available to them. A person might argue that smoking is not unhealthy because they know someone who lived to 100 and smoked three packs a day.

3. Bandwagon effect. The probability of one person adopting a belief increases based on the number of people who hold that belief. This is a powerful form of groupthink and is one reason why meetings are often unproductive.

4. Blind-spot bias. Failing to recognize your own cognitive biases is a bias in itself. People notice cognitive and motivational biases much more in others than in themselves.

5. Choice-supportive bias. When you choose something, you tend to feel positive about it, even if that choice has flaws. Like how you think your dog is awesome, even if it bites people every once in a while.

6. Clustering illusion. This is the tendency to see patterns in random events. It is key to various gambling fallacies, like the idea that red is more or less likely to turn up on a roulette table after a string of reds.

7. Confirmation bias. We tend to listen only to information that confirms our preconceptions, one of the many reasons it's so hard to have an intelligent conversation about climate change. Wikipedia

8. Conservatism bias. Where people favor prior evidence over new evidence or information that has emerged. People were slow to accept that the Earth was round because they maintained their earlier understanding that the planet was flat.

9. Information bias. The tendency to seek information when it does not affect action. More information is not always better. With less information, people can often make more accurate predictions.

10. Ostrich effect. The decision to ignore dangerous or negative information by "burying one's head in the sand", like an ostrich. Research suggests that investors check the value of their holdings significantly less often during bad markets.

11. Outcome bias. Judging a decision based on the outcome, rather than how exactly the decision was made in the moment. Just because you won a lot in Vegas doesn't mean gambling your money was a smart decision.

12. Overconfidence. Some of us are too confident about our abilities, and this causes us to take greater risks in our daily lives. Experts are more prone to this bias than laypeople, since they are more convinced that they are right.

13. Placebo effect. When simply believing that something will have a certain effect on you causes it to have that effect. In medicine, people given fake pills often experience the same physiological effects as people given the real thing.

14. Pro-innovation bias. When a proponent of an innovation tends to overvalue its usefulness and undervalue its limitations. Sound familiar, Silicon Valley?

15. Recency. The tendency to weigh the latest information more heavily than older data. Investors often think the market will always look the way it looks today and make unwise decisions.

16. Salience. Our tendency to focus on the most easily recognizable features of a person or concept. When you think about dying, you might worry about being mauled by a lion, as opposed to what is statistically more likely, like dying in a car accident.

17. Selective perception. Allowing our expectations to influence how we perceive the world. An experiment involving a football game between students from two universities showed that one team saw the opposing team commit more infractions.

18. Stereotyping. Expecting a group or person to have certain qualities without having real information about the person. It allows us to quickly identify strangers as friends or enemies, but people tend to overuse and abuse it.

19. Survivorship bias. An error that comes from focusing only on surviving examples, causing us to misjudge a situation. For instance, we might think that being an entrepreneur is easy because we haven't heard of all those who failed.

20. Zero-risk bias. Sociologists have found that we love certainty, even if it's counterproductive. Eliminating a risk entirely means there is no chance of harm being caused.
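The clustering illusion (#6) makes a claim you can actually check: in a truly random sequence, a streak of reds says nothing about the next spin. Here is a minimal Python sketch of that check (my own illustration, not from any of the sources above; it assumes a simplified 50/50 red/black wheel with no green zero).

```python
import random

# Simulate a fair red/black wheel and compare the chance of red
# after three reds in a row with the overall chance of red.
random.seed(1)
spins = [random.choice("RB") for _ in range(200_000)]

# Collect every spin that immediately follows a run of three reds.
after_streak = [spins[i] for i in range(3, len(spins))
                if spins[i - 3:i] == ["R", "R", "R"]]

p_overall = spins.count("R") / len(spins)
p_after_streak = after_streak.count("R") / len(after_streak)

print(f"P(red) overall:          {p_overall:.3f}")
print(f"P(red) after three reds: {p_after_streak:.3f}")
# Both hover around 0.5: the streak carries no predictive information,
# even though runs of reds feel like a pattern when you watch the table.
```

Streaks of three or more show up thousands of times in 200,000 spins, which is exactly why they feel meaningful; the conditional frequency shows they are not.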

SOURCES: Brain Biases; Ethics Unwrapped; Explorable; Harvard Magazine; HowStuffWorks; LearnVest; Outcome bias in decision evaluation, Journal of Personality and Social Psychology; Psychology Today; The Bias Blind Spot: Perceptions of Bias in Self Versus Others, Personality and Social Psychology Bulletin; The Cognitive Effects of Mass Communication, Theory and Research in Mass Communications; The less-is-more effect: Predictions and tests, Judgment and Decision Making; The New York Times; The Wall Street Journal; You Are Not So Smart.

2019-07-21