Little Ideas with Big Impacts
Below is a copy of text originally published on the Collaborative Fund blog. A list of ideas, in no particular order and from different fields, that help explain how the world works:
Depressive Realism: Depressed people have a more accurate view of the world because they’re more realistic about how risky and fragile life is. The opposite of “blissfully unaware.”
Skill Compensation: People who are exceptionally good at one thing tend to be exceptionally poor at another.
Curse of Knowledge: The inability to communicate your ideas because you wrongly assume others have the necessary background to understand what you’re talking about.
Base Rates: The success rate of everyone who’s done what you’re about to try.
Base-Rate Neglect: Assuming the success rate of everyone who’s done what you’re about to try doesn’t apply to you, caused by overestimating the extent to which you do things differently than everyone else.
Compassion Fade: People have more compassion for small groups of victims than larger groups, because the smaller the group, the easier it is to identify individual victims.
System Justification Theory: Inefficient systems will be defended and maintained if they serve the needs of people who benefit from them – individual incentives can sustain systemic stupidity.
Three Men Make a Tiger: People will believe anything if enough people tell them it’s true. It comes from a Chinese proverb: if one person tells you there’s a tiger roaming around your neighborhood, you can assume they’re lying. If two people tell you, you begin to wonder. If three say it’s true, you’re convinced there’s a tiger in your neighborhood and you panic.
Buridan’s Ass: A thirsty donkey is placed exactly midway between two pails of water. It dies because it can’t make a rational decision about which one to choose. A form of decision paralysis.
Pareto Principle: The majority of outcomes are driven by a minority of events.
Sturgeon’s Law: “90% of everything is crap.” The obvious inverse of the Pareto Principle, but hard to accept in practice.
Cumulative Advantage: Social status snowballs in either direction because people like associating with successful people, so doors are opened for them, and avoid associating with unsuccessful people, for whom doors are closed.
Impostor Syndrome: Fear of being exposed as less talented than people think you are, often because talent is owed to cumulative advantage rather than actual effort or skill.
Anscombe’s Quartet: Four sets of numbers that look identical on paper (mean, variance, correlation, etc.) but look completely different when graphed. Describes a situation where exact calculations don’t offer a good representation of how the world works (a quick check of the quartet’s actual numbers is sketched below).
Ringelmann Effect: Members of a group become lazier as the size of their group increases, based on the assumption that “someone else is probably taking care of that.”
Semmelweis Reflex: Automatically rejecting evidence that contradicts your tribe’s established norms. Named after a Hungarian doctor who discovered that patients treated by doctors who washed their hands suffered fewer infections, but who struggled to convince other doctors that his finding was true.
False-Consensus Effect: Overestimating how widely held your own beliefs are, caused by the difficulty of imagining the experiences of other people.
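The quick check promised in the Anscombe’s Quartet item above: a minimal sketch in plain Python. The data values are Anscombe’s published 1973 quartet; the helper function and printout format are illustrative additions, not part of the original post.

```python
# Anscombe's quartet (1973): four x/y datasets with near-identical summary
# statistics that look completely different when plotted.
from statistics import mean, variance

x_123 = [10, 8, 13, 9, 11, 14, 6, 4, 12, 7, 5]  # shared x values for sets I-III
quartet = {
    "I":   (x_123, [8.04, 6.95, 7.58, 8.81, 8.33, 9.96, 7.24, 4.26, 10.84, 4.82, 5.68]),
    "II":  (x_123, [9.14, 8.14, 8.74, 8.77, 9.26, 8.10, 6.13, 3.10, 9.13, 7.26, 4.74]),
    "III": (x_123, [7.46, 6.77, 12.74, 7.11, 7.81, 8.84, 6.08, 5.39, 8.15, 6.42, 5.73]),
    "IV":  ([8, 8, 8, 8, 8, 8, 8, 19, 8, 8, 8],
            [6.58, 5.76, 7.71, 8.84, 8.47, 7.04, 5.25, 12.50, 5.56, 7.91, 6.89]),
}

def pearson(xs, ys):
    """Pearson correlation coefficient, computed from scratch."""
    mx, my = mean(xs), mean(ys)
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    sx = sum((a - mx) ** 2 for a in xs) ** 0.5
    sy = sum((b - my) ** 2 for b in ys) ** 0.5
    return cov / (sx * sy)

for name, (xs, ys) in quartet.items():
    print(f"{name:>3}: mean_x={mean(xs):.2f}  var_x={variance(xs):.2f}  "
          f"mean_y={mean(ys):.2f}  var_y={variance(ys):.2f}  r={pearson(xs, ys):.3f}")
# Every row prints roughly the same numbers (9.00, 11.00, ~7.50, ~4.12, ~0.816),
# yet scatter plots of the four datasets look nothing alike.
```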
Boomerang Effect: Trying to persuade someone to do one thing can make them more likely to do the opposite, because the act of persuasion can feel like someone stealing your freedom, and doing the opposite makes you feel like you’re taking your freedom back.
Chronological Snobbery: “The assumption that whatever has gone out of date is on that account discredited. You must find why it went out of date. Was it ever refuted (and if so by whom, where, and how conclusively) or did it merely die away as fashions do? If the latter, this tells us nothing about its truth or falsehood. From seeing this, one passes to the realization that our own age is also ‘a period,’ and certainly has, like all periods, its own characteristic illusions.” – C.S. Lewis
Planck’s Principle: “A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die and a new generation grows up that is familiar with it.”
McNamara Fallacy: A belief that rational decisions can be made with quantitative measures alone, when in fact the things you can’t measure are often the most consequential. Named after Defense Secretary Robert McNamara, who tried to quantify every aspect of the Vietnam War.
Courtesy Bias: Giving opinions that are likely to offend people the least, rather than what you actually believe.
Berkson’s Paradox: Strong correlations that hold within a selected subgroup can fall apart, or even reverse, in the larger population. Among hospital patients, motorcycle crash victims wearing helmets are more likely to be seriously injured than those not wearing helmets. But that’s because most crash victims saved by helmets did not need to become hospital patients, and those without helmets are more likely to die before becoming a hospital patient (a toy simulation of this selection effect appears at the end of this list).
Group Attribution Error: Incorrectly assuming that the views of a group member reflect those of the whole group.
Baader-Meinhof Phenomenon: Noticing an idea everywhere you look as soon as it’s brought to your attention, in a way that makes you overestimate its prevalence.
Ludic Fallacy: Falsely associating simulations with real life. Nassim Taleb: “Organized competitive fighting trains the athlete to focus on the game and, in order not to dissipate his concentration, to ignore the possibility of what is not specifically allowed by the rules, such as kicks to the groin, a surprise knife, et cetera. So those who win the gold medal might be precisely those who will be most vulnerable in real life.”
Normalcy Bias: Underestimating the odds of disaster because it’s comforting to assume things will keep functioning the way they’ve always functioned.
Actor-Observer Asymmetry: We judge others based solely on their actions, but when judging ourselves we have an internal dialogue that justifies our mistakes and bad decisions.
The 90-9-1 Rule: In social media networks, 90% of users just read content, 9% of users contribute a little content, and 1% of users contribute almost all the content. Gives a false impression of what most users actually think.
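The toy simulation promised in the Berkson’s Paradox item above, as a short Python sketch. Every threshold and probability below is invented for illustration rather than taken from the post or from real crash data; the point is only to show how conditioning on hospital admission can flip which group looks worse off.

```python
# Toy model: "severity" is how bad the crash was; helmets prevent deaths at the
# high end and keep minor crashes out of the hospital at the low end.
import random

random.seed(0)
N = 100_000

def simulate_rider():
    severity = random.random()                     # crash severity, 0..1
    helmet = random.random() < 0.5
    died = severity > (0.95 if helmet else 0.80)   # helmet lowers the death rate
    hospital = (not died) and severity > (0.50 if helmet else 0.30)
    serious = severity > 0.60                      # "seriously injured" overall
    return helmet, died, hospital, serious

riders = [simulate_rider() for _ in range(N)]

def share(rows, field):
    """Fraction of rows for which `field` is true."""
    return round(sum(field(r) for r in rows) / len(rows), 3)

helmeted   = [r for r in riders if r[0]]
unhelmeted = [r for r in riders if not r[0]]
print("All crash victims, death rate:")
print("  helmet:", share(helmeted, lambda r: r[1]),
      " no helmet:", share(unhelmeted, lambda r: r[1]))

hospital = [r for r in riders if r[2]]
print("Hospital patients only, share seriously injured:")
print("  helmet:", share([r for r in hospital if r[0]], lambda r: r[3]),
      " no helmet:", share([r for r in hospital if not r[0]], lambda r: r[3]))
# Helmets cut the overall death rate from ~0.20 to ~0.05, yet among hospital
# patients the helmeted group shows a higher share of serious injuries
# (~0.78 vs ~0.40), purely because of who ends up in the hospital at all.
```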