Categories: Psychology, Society

Will Storr – The Status Game

As a tribal species, our personal survival has always depended on our being accepted into a supportive community. Powerful emotions compel us to connect: the joy of belongingness and agony of rejection. But once inside a group, we’re rarely content to flop about on its lower rungs. We seek to rise. When we do, and receive acclaim from our people, we feel as if our lives have meaning and purpose and that we’re thriving.

Our need for status gives us a thirst for rank and a fear of its loss that deforms our thinking and denies us the possibility of reliable happiness. It’s why, even as we raise ourselves so high above the other animals that we appear to them as gods, we still behave like them – and worse. Always on alert for slights and praise, we can be petty, hateful, aggressive, grandiose and delusional.

This is why, I’ve come to believe, we make a fundamental error when we reflexively categorise our desire for status as shameful. A greater understanding of what helps drive us on our good days and bad must surely be useful. Digging beneath the flattering stories we like to tell of ourselves can help us see more clearly how we can become better, but also how easily we become tempted into delusion and tyranny.

We’re going to define three different forms of the status game – the dominance game, the virtue game and the success game – and ask how certain kinds of play can lead us into a fairer, wealthier tomorrow.

When asked why we do the things we do, we rarely say, ‘It’s because of status. I really love it.’ It can be distasteful to think of it as any kind of motivating force, let alone a vital one. It contradicts the heroic story we like to tell of ourselves. When we pursue the great goals of our lives, we tend to focus on our happy ending. We want the qualification, the promotion, the milestone, the crown. These motivations, which tend to spring to mind immediately, are known to researchers as ‘proximate’. They’re absolutely real and valid, but they have upstream ‘ultimate’ causes. Ultimate causes are often subconscious and so hidden from us: they’re the reason we want the qualification, the promotion, the milestone, the crown, in the first place.

Wherever psychologists look, they find a remarkably powerful link between status and wellbeing. One study of more than sixty thousand people across 123 countries found people’s wellbeing ‘consistently depended on the degree to which people felt respected by others’. Attainment of status or its loss was ‘the strongest predictor of long-term positive and negative feelings’.

Psychologists find that simply connecting with others and feeling accepted by them can be profoundly good for us. But equally revealing is how our minds and bodies react when we fail to connect. A wide range of research finds people with depression tend to belong to ‘far fewer’ groups than the rest of the population. Studies across time suggest the more a depressed person identifies with their group – the more of their own sense of self they invest in it – the more their symptoms lift.

In one study, participants were told they were taste-testing chocolate chip cookies. Before the test began, they were asked to mingle with other tasters, then choose two they’d like to work with. Some were told (falsely) that nobody had picked them; others that everyone had. The first group, who’d been socially rejected, went on to eat an average of nine cookies, nearly twice as many as the non-rejected. They even rated the taste of the cookies more highly, implying their rejection had actually altered their perception of the sugary food.

The epidemiologist Michael Marmot was surprised to discover that precisely how high a civil servant climbed in the game of the civil service predicted their health outcomes and mortality rates. This was not, as you might reasonably assume, to do with the wealthier individuals leading healthier and more privileged lifestyles. This effect, which Marmot calls the ‘status syndrome’, was entirely independent: a wealthy smoker just one rung below the very top of the status game was more likely to fall ill, as a result of their habit, than the smoker one rung above them.

One review of the scientific literature found that ‘perceiving oneself as having low rank compared to others is consistently linked to higher depressive symptoms’. Some psychologists argue that when we become depressed we ‘mentally withdraw from the competition for higher status’. This keeps us off ‘high-status individuals’ radars’ and conserves energy, helping us cope with the ‘reduced opportunities imposed by low status’.

Much of what seems inarguably real and true, in the space around us, is not. The actual world is monochrome and silent. Sounds, colours, tastes and smells exist only in the projection in our heads. What’s actually out there are vibrating particles, floating chemical compounds, molecules and colourless light waves of varying lengths.

A psychologically healthy brain excels at making its owner feel heroic. It does this by reordering our experiences, remixing our memories and rationalising our behaviour, using a battery of reality-warping weapons that make us believe we’re more virtuous, more correct in our beliefs and have more hopeful futures in store than others.

These apparently trite symbols matter. In one test, when participants were shown photos of people wearing ‘rich’ or ‘poor’ clothes, they automatically assumed those in wealthier looking outfits were significantly more competent and of higher status. This effect remained when they were warned upfront of the potential bias, when they were informed the clothing was definitely irrelevant and when they were told all the people worked in sales at a ‘mid-size firm in the Midwest’ and earned around US$80,000. It even remained when the participants were paid money to make an accurate guess.

The status detection system is highly evident in the behaviour of youngsters. Around three-quarters of arguments between children aged 18 to 30 months are over possessions, a figure that rises to 90 per cent when just two toddlers are present. For developmental psychologist Professor Bruce Hood, possession is a ‘means to establish where you are in the nursery pecking order’.

This has been found many times, with one study using data from twelve thousand British adults concluding ‘the ranked position of an individual’s income predicts general life satisfaction, whereas absolute income and reference income have no effect’.

These rules were essential because humans can often be greedy, dishonest and aggressive. One survey of sixty premodern societies uncovered seven common rules of play that are thought to be universal: help your family; help your group; return favours; be brave; defer to superiors; divide resources fairly; respect others’ property. These elemental rules dictate the ways humans keep their tribes working well.

In one study, 86 per cent of Australians rated their job performance as ‘above average’; in another, 96 per cent of Americans described themselves as ‘special’. East Asian games, by contrast, tend to be more collective.

The brain begins learning these rules in infancy. As 2-year-olds, we have around one hundred trillion connections between our brain cells, double that of an adult. This is because, when we’re born, we don’t know where we’re going to pop out. Baby brains are specialised for many environments, many games. At this age, we’re better than adults at recognising faces of other races and can hear tones in foreign languages that grown-ups are deaf to.

Much of the rest of human life comprises three varieties of status-striving, and so three varieties of game: dominance, virtue and success. In dominance games, status is coerced by force or fear. In virtue games, status is awarded to players who are conspicuously dutiful, obedient and moralistic. In success games, status is awarded for the achievement of closely specified outcomes, beyond simply winning, that require skill, talent or knowledge.

In Tanzania, Hadza hunters who share meat widely ‘gain great social status – prestige that can be parlayed into powerful social alliances, the deference of other men, and greater mating success’, writes Buss. People engage in ‘competitive altruism’, battling to be ‘seen by others as great contributors to the group’. Of course, status is awarded to the altruistic in more modern societies too: studies show those who donate to charity, for example, experience ‘a dramatic boost in prestige in the eyes of others’.

Chimpanzee troops have been found to be ‘several hundred to a thousand times’ more aggressive than even the most violent human societies.

One survey found 53 per cent of Americans saying they’d prefer instant death to the reputation of a child molester; 70 per cent opted for the amputation of their dominant hand over a swastika tattoo on their face; 40 per cent preferred a year in jail to the reputation of a criminal.

It’s in this way that children in countries such as India overcome the pain of eating spicy foods. Mimicking the actions of high-status people is so desirable, it’s argued, their brains reinterpret the pain signals as pleasurable. Children are thought to teach themselves to enjoy spice-burning foods using automatic prestige-driven imitation. They rarely have to be forced.

Status games run on powerlines of influence and deference that crackle up and down their hierarchy. This is why, of all the countless status symbols that exist in human life, influence is probably the most reliable. We often assume money or fancy possessions are the most certain symbols of a person’s rank, but the highest-status monk in the world may have less wealth, and fewer Hermès ties, than the most junior banker on Wall Street. Influence is different.

But serious violence among women and girls is comparatively rare. For psychologist Professor Jonathan Haidt, ‘girls and boys are equally aggressive but their aggression is different. Boys’ aggression revolves around the threat of violence: “I will physically hurt you” … but girls’ aggression has always been relational: “I will destroy your reputation or your relationships”.’ Researchers argue female aggression tends to be ‘indirect’.

Humiliation has been described by researchers as ‘the nuclear bomb of the emotions’ and has been shown to cause major depressions, suicidal states, psychosis, extreme rage and severe anxiety, ‘including ones characteristic of post-traumatic stress disorder’. Criminal violence expert Professor James Gilligan describes the experience of humiliation as an ‘annihilation of the self’.

If humans are players, programmed to seek connection and status, humiliation insults both our deepest needs.

The only way to recover is to find a new game even if that means rebuilding an entire life and self. ‘Many humiliated individuals find it necessary to move to another community to recover their status, or more broadly, to reconstruct their lives.’

An African proverb says, ‘the child who is not embraced by the village will burn it down to feel its warmth’. If the game rejects you, you can return in dominance as a vengeful God, using deadly violence to force the game to attend to you in humility.

Researchers find happiness isn’t closely linked to our socioeconomic status, which captures our rank compared with others across the whole of society, including class. It’s actually our smaller games that matter: ‘studies show that respect and admiration within one’s local group, but not socioeconomic status, predicts subjective well-being’.

The model said a person is compelled to act when three forces collide in a moment: motivation (we must want the thing); trigger (something must happen to trigger a desire to get more of it) and ability (it must be easy).
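
As a minimal sketch of the three-force model described above (in Python; the multiplicative rule and the threshold value are illustrative assumptions, not something specified in the passage):

```python
# Sketch of the three-force behaviour model described above (motivation,
# trigger, ability). The multiplicative rule and the threshold are
# illustrative assumptions, not figures from the book.

def behaviour_occurs(motivation: float, ability: float, triggered: bool,
                     threshold: float = 1.0) -> bool:
    """The action fires only when a trigger arrives while motivation and
    ability together clear some activation threshold."""
    return triggered and (motivation * ability) >= threshold

# Wanting something badly isn't enough if the action is hard...
print(behaviour_occurs(motivation=0.9, ability=0.5, triggered=True))   # False
# ...but make it easy and add a trigger, and the behaviour happens.
print(behaviour_occurs(motivation=0.9, ability=1.5, triggered=True))   # True
```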

more than any other to make them habitual. He described a way of issuing rewards such that they’d encourage compulsive behaviours. If a programmer wanted to create a certain action in a user, they should offer a symbol of reinforcement after the user had performed the desired ‘target behaviour’. But here was the trick: the positive reinforcement would be inconsistent. You wouldn’t always know what you were going to get.

‘To strengthen an existing behaviour, reinforcers are most effective when they are unpredictable,’ Fogg wrote in 2003.

We await replies, likes or upvotes and, just as a gambler never knows how the slot machine will pay out, we don’t know what reward we’ll receive for our contribution. Will we go up? Will we go down? The great prize changes every time. This variation creates compulsion. We just want to keep playing, again and again, to see what we’ll get.

Canny players sense the flaw in their elites and seek to improve their own rank with flattery. And flattery works: Tourish calls it a ‘perfumed trap’. A study of 451 CEOs found leaders who were exposed to more frequent and intense flattery and agreement rated their own abilities more highly, were less able to change course when things went wrong, and led firms that were more likely to suffer persistently poor performance.

Surprisingly, what made the most difference to their behaviour wasn’t the level of inequality in their game, but whether or not the inequality was visible. When players’ wealth was hidden, everyone, including the elites, became more egalitarian. But when wealth was displayed, players in every game became less friendly, cooperated ‘roughly half as much’ and the rich were significantly more likely to exploit the poor.

This is why poverty alone doesn’t tend to lead to revolutions. Revolutions – defined as mass movements to replace a ruling order in the name of social justice – have been found to occur in middle-income countries more than the poorest. Sociologist Professor Jack Goldstone writes, ‘what matters is that people feel they are losing their proper place in society for reasons that are not inevitable and not their fault’.

‘By the time we are thirteen,’ writes psychologist Professor Mitch Prinstein, ‘it seems as if there is nothing more important to us than this type of popularity. We talk about who has it. We strategise how to get it. We are devastated when we lose it. We even do things we know are wrong, immoral, illegal, or dangerous merely to obtain status, or to fiercely defend it.’

Psychologist Dr Lilliana Mason writes, ‘more often than not, citizens do not choose which party to support based on policy opinion; they alter their policy opinion according to which party they support. Usually they do not notice that this is happening, and most, in fact, feel outraged when the possibility is mentioned.’

The moral reality we live in is a virtue game. We use our displays of morality to manufacture status. It’s good that we do this. It’s functional. It’s why billionaires fund libraries, university scholarships and scientific endeavours; it’s why a study of 11,672 organ donations in the USA found only thirty-one were made anonymously. It’s why we feel good when we commit moral acts and thoughts privately and enjoy the approval of our imaginary audience. Virtue status is the bribe that nudges us into putting the interests of other people – principally our co-players – before our own.

When neuroscientist Professor Sarah Gimbel presented forty people with evidence their strongly held political beliefs were wrong, the response she observed in their brains was ‘very similar to what would happen if, say, you were walking through the forest and came across a bear’.

Professor Sam Gosling finds this when his students cluster into personality groups: ‘the extroverts don’t disguise their disdain for the uncommunicative introverts, who selfishly refuse to keep the discussion alive; they cannot fathom why their mute colleagues don’t do their bit to carry some of the conversational load. At the same time, the introverts have nothing but contempt for their garrulous counterparts; why not, they wonder, wait until you’ve got something worth saying before opening your mouth?’

For political psychologist Dr Lilliana Mason, part of the reason we keep warring for victory is that ‘people are compelled to think of their groups as better than others. Without that, they themselves feel inferior.’ At a ‘very primal level’ players are motivated ‘to view the world through a competitive lens, with importance placed on their own group’s superiority’. Humans love to become superior: to win.

A game’s command over its players strengthens when it flips into a mode of war.

For the vast majority of our time on earth, then, humans haven’t been subject to the tyranny of leaders. Instead, we lived in fear of what anthropologists call the ‘tyranny of the cousins’. These ‘cousins’ weren’t necessarily actual cousins. They’d usually be clan elders who, in these shallow hierarchies, passed for the elite.

In the end, she saved herself. At the time of writing, Templer’s company still exists, as does her blog. By conforming to the tyrannical cousins, and the frenzy spreading across the gossip networks of social media, she avoided being ‘cancelled’ – which is what we call it when internet mobs, unsatisfied by mockery, denunciation and humiliation meted out online, attempt to have their target de-graded as much as possible in the physical world.

A study of seventy million messages on the Chinese platform Weibo found the emotion that ‘travelled fastest and farthest through the social network’ was anger. Meanwhile, studies of mobbing events on Twitter find shamers increase their follower counts faster than non-shamers.

One investigation found those most likely to circulate ‘hostile political rumours’ including conspiracy theories and ‘fake news’ on social media were often ‘status-obsessed, yet socially marginalised’, their behaviour fuelled by a ‘thwarted desire for high status’, their aim, to ‘mobilise the audience against disliked elites’.

We found these same currents in the collective dreams of one of history’s most lethal games. The Nazis were Elliot Rodger, Ed Kemper and Ted Kaczynski on the level of a culture. They told a self-serving story that explained their catastrophic lack of status and justified its restoration in murderous attack. But it’s not just Germany that’s been possessed in this way. Nations the world over become dangerous when humiliated. One study of ninety-four wars since 1648 found 67 per cent were motivated by matters of national standing or revenge, with the next greatest factor – security – coming in at a distant 18 per cent.

Researchers find a primary motivation for suicide bombers is ‘the shame and humiliation induced by foreign troops in their country’.

When Algerians killed 103 French people following a riot, their colonialist masters sent aeroplanes to destroy forty-four villages, a cruiser to bombard coastal towns and commandos to slaughter on land: the French admit to 1,500 deaths, the Algerians claim 50,000. It’s for reasons like these that psychologist Dr Evelin Lindner has concluded that, ‘The most potent weapon of mass destruction’ is ‘the humiliated mind’.

Toxic morality is deeply implicated in these episodes: ‘genocide is highly moralistic’. Genocides are dominance-virtue games, carried out in the name of justice and fairness and the restoration of the correct order.

A ‘work ethic’ came into being, in which toil itself became prestigious. ‘This shift can be understood as the beginning of a work-centred society’, writes historian Professor Andrea Komlosy, ‘in which the diverse activities of all of its members are increasingly obliged to take on the traits of active production and strenuous exertion.’

We were getting our status from new kinds of games. Slowly, and in fits and starts, our focus had been juddering from duty to the clan towards individual competence and success. This changed our psychology, rewriting the cultural coding of our game-playing brains, turning us into new sorts of humans.

For Protestants, life was no longer a gruelling test for heaven or hell. God already knew where you were ending up. Believers were to look for clues of ‘assurance’ to see if they were saved or damned: signs of ‘elect status’ could be found in their own personal behaviour such as virtuous and sober living, but also in the accrual of wealth and rank on earth. Believers were said to have a personal ‘calling’. God had endowed them with special talents that they should seek to maximise by choosing the right occupation or vocation, then working hard in it.

Humans had been able to conquer the planet partly because we exist in a web of stored information. Each new individual didn’t have to learn everything afresh for themselves: knowledge was communicated by elders and passed down through the generations.

By connecting our ability to accumulate knowledge to our desire for status, they’d discovered the future.

This ‘Industrial Revolution’ was a status goldrush. It came to define the country’s mood and culture. Britons ‘became innovators because they adopted an improving mentality’, writes historian Dr Anton Howes. This mentality spread like a ‘disease’ that could infect ‘anyone … rich and poor, city-dwellers and rustics, Anglicans and dissenters, Whigs and Tories, skilled engineers and complete amateurs’.

One of the most famous, Scottish economist Adam Smith, is commonly known as the ‘Father of Capitalism’. The hyper-individualistic, self-interested, money-obsessed world we live in today is linked, perhaps more than to anyone else, to him and his theories of how free markets and competition generate prosperity. But Smith didn’t believe greed for wealth was the ultimate driver of economies. He thought something else was going on, something deeper in the human psyche. ‘Humanity does not desire to be great, but to be beloved,’ he wrote in 1759.

We win points for personal success throughout our lives, in the highly formalised and often precisely graded games of school, college and work. In the street, in the office and on social media we signal our accomplishments with appearance, possessions and lifestyles. We’re self-obsessed, because this is the game we’re raised to play.

Following the depression and world wars, the economies of the USA and Britain became more rule-bound, virtuous and group-focussed: it was an era of increasing regulation over banking and business, high taxation (topping out at 90 per cent in America in the 1940s and 1950s), broad unionisation and ‘big government’ innovations such as the New Deal, the Social Security Act, the minimum wage and the welfare state.

American and British players became concomitantly collective: the monkey-suited ‘Corporation Man’ of the 1950s suburbs gave birth to the even more collectively minded hippies, with their anti-materialistic values.

But in the 1980s, the game changed again. During the previous decade, the economies of the West had started to fail. New ways of playing were sought. The leaders of the UK and USA, Margaret Thatcher and Ronald Reagan, decided to make the game significantly more competitive. In 1981, Thatcher told journalists, ‘What’s irritated me about the whole direction of politics in the last thirty years is that it’s always been toward the collectivist society.’

We see this perfect human all around us, beaming with flawless teeth from advertising, film, television, media and the internet. Young, agreeable, visibly fit, self-starting, productive, popular, globally-minded, stylish, self-confident, extrovert, busy. Who is it, this person we feel so pressured to punch ourselves into becoming? It’s the player best equipped to win status in the game we’re in.

Led by psychologist Dr Thomas Curran, the researchers discovered all the forms of perfectionism they looked at had risen between 1989 and 2016. Social perfectionism had grown the most. The extent to which people felt they had to ‘display perfection to secure approval’ had soared by 32 per cent.

Today, sixty-nine of the hundred largest economies on earth are not nations but corporations. In the first quarter of 2021 alone, technology company Apple made more money than the annual GDP of 135 countries; its market valuation was higher than the GDP of Italy, Brazil, Canada, South Korea and Russia.

In just three years, between 2015 and 2018, support for capitalism among young Americans fell from 39 per cent to 30 per cent; a 2019 poll found 36 per cent of millennials saying they approve of Communism. Sociologist Professor Thomas Cushman writes, ‘anti-capitalism has become, in some ways, a central pillar of the secular religion of the intellectuals, the habitus of modern critical intellectuals as a status group’.

Between 1979 and 2005, the average real hourly wage for white working-class Americans without a high-school diploma declined by 18 per cent.

‘Education lies at the heart of this divide.’ Most of the 41 per cent of white millennials who voted for Trump in 2016 didn’t have college degrees. In all, white non-college voters comprised around three-fifths of Trump’s support in 2016; 74 per cent of people with no qualifications supported Brexit, the educational divide being greater than that of social class, income or age.

The person most credited with attempting to realise this dream is Vladimir Ilyich Ulyanov, better known as Lenin. His hatred for the bourgeoisie was blinding, violent and total; many contemporary historians see its genesis in the humiliation his own upper-middle-class family suffered after his brother, Sasha, was executed for a ‘laughably amateur’ but nearly successful assassination plot.

By 1920, 5.4 million were directly employed by the government. ‘There were twice as many officials as there were workers in Soviet Russia and these officials were the main social base of the new regime,’ writes Figes. ‘This was not a Dictatorship of the Proletariat but a Dictatorship of the Bureaucracy.’

During the Great Terror, the police were issued quotas for what percentage of their district was to be shot or sent to the camps. On 2 June 1937, it was ordered that 35,000 were to be ‘repressed’ in one district, 5,000 of whom were to be shot. Between 1937 and 1938, 165,200 priests were arrested, 106,800 of whom were shot. In the same period, an average of one and a half thousand people were executed daily. One and a half million ordinary Russians were arrested by the secret police; nearly seven hundred thousand were executed for ‘counter-revolutionary activities’.

Throughout the 1930s, there came into being a complex hierarchy of status. Stalin might have admitted there were now three classes, but sociologists found at least ten: the ruling elite; superior intelligentsia; general intelligentsia; working-class aristocracy; white collar; well-to-do peasants; average workers; average peasants; disadvantaged workers; forced labour.

The new elites gained access to special apartments and had the best goods automatically reserved for them. Their children were sent to exclusive summer camps. They received holidays, chauffeur-driven cars and money. It became ‘normal’ for them to have live-in servants.

The state itself argued that their privilege was temporary: soon all of the USSR would live like this. They were not a privileged elite, went the thinking; they were a vanguard.

More than two thousand years before the revolution, the Ancient Greek who’d first dreamed the Communist dream had been corrected by his student, Aristotle, who’d pointed out it wasn’t actually wealth or private ownership that created the human yearning to get ahead. That yearning was a part of our nature: ‘it is not possession but the desires of mankind which require to be equalized’.

To persuade us to push a penis in and out of a vagina, it invented orgasm. To persuade us to sacrifice our wellbeing for a screaming, shit-smeared infant, it made love. To persuade us to force mashed-up foreign objects down our throats, it evolved taste and appetite. To persuade us to engage in groupish, co-operative living, it conjured the obsessive joys of connection and acclaim. Follow the rules, and follow them well, and you can expect to feel great.

As we play for ever-greater status, for ourselves and our games, we weave a self-serving and highly motivating dream that writhes with saints and demons and irrational beliefs. This dream is presented to us as reality. It’s entirely convincing, in all its colour, noise and pristine focus. We see evidence everywhere that it’s true. It has the power to seduce us into the most depraved acts of hatred and barbarity. But it can also lead us into modes of play that truly make a better world.

Psychologists studying optimal self-presentation discuss a set of closely related ideas. Professor Susan Fiske argues that, when encountering others, people ask of them two fundamental questions: ‘What are their intentions?’ and ‘What’s their capacity to pursue them?’ If we want to supply the right answers, and so be received positively, Fiske finds we should behave in ways that imply warmth and competence. More recently it’s been argued a third component should be added.

For Professor Jennifer Ray, morality is ‘not only a critical and separable dimension … it may even be the primary dimension’. Elsewhere, ‘perceived sincerity’ has been found to be essential to successful ‘impression management’.

Throughout history, leaders have succeeded by telling a story that says their group is deserving of more status, which, under their direction, they’ll win. But it remains important this evangelical passion doesn’t morph into arrogance.

Tyrannies are virtue-dominance games. Much of their daily play and conversation will focus on matters of obedience, belief and enemies. Is the game you’re playing coercing people, both inside and outside it, into conforming to its rules and symbols? Does it attempt to silence its ideological foes? Does it tell a simplistic story that explains the hierarchy, deifying your group whilst demonising a common enemy? Are those around you obsessed with their sacred beliefs?

Some forms of status are easier to win than others. For those of us who aren’t pretty, virtue is probably the easiest to find of all. It’s as simple as judging people: because status is relative, their de-grading raises us up, if only in our minds.

Morality poisons empathy.

I believe we can all take consolation in the knowledge that nobody ever gets there, not the superstars, the presidents, the geniuses or the artists we gaze up at in envy and awe. That promised land is a mirage. In our lowest moments, we should remind ourselves of the truth of the dream: that life is not a story, but a game with no end. This means it isn’t a final victory we should seek but simple, humble progress: the never-ending pleasure of moving in the right direction. Nobody wins the status game. They’re not supposed to. The meaning of life is not to win, it’s to play.

Categories: Psychology, Society

Richard Thaler – Misbehaving

The core premise of economic theory is that people choose by optimizing. Of all the goods and services a family could buy, the family chooses the best one that it can afford.

Giving up the opportunity to sell something does not hurt as much as taking the money out of your wallet to pay for it. Opportunity costs are vague and abstract when compared to handing over actual cash.

I called this phenomenon the “endowment effect” because, in economists’ lingo, the stuff you own is part of your endowment, and I had stumbled upon a finding that suggested people valued things that were already part of their endowment more highly than things that could be part of their endowment, that were available but not yet owned.

Roughly speaking, losses hurt about twice as much as gains make you feel good.

The fact that a loss hurts more than an equivalent gain gives pleasure is called loss aversion. It has become the single most powerful tool in the behavioral economist’s arsenal.
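
One common way to picture this (a sketch of my own, not a formula given in the excerpt) is a kinked value function in which losses are weighted roughly twice as heavily as equivalent gains:

```python
# Illustrative kinked value function: losses loom about twice as large as
# equivalent gains. The factor of 2 is the rule of thumb from the passage
# above, not a fitted parameter.

LOSS_WEIGHT = 2.0

def felt_value(change_in_wealth: float) -> float:
    if change_in_wealth >= 0:
        return change_in_wealth
    return LOSS_WEIGHT * change_in_wealth   # negative changes hurt double

print(felt_value(100))    # gaining $100 feels like +100
print(felt_value(-100))   # losing $100 feels like -200
```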

“By default, the method of hypothetical choices emerges as the simplest procedure by which a large number of theoretical questions can be investigated. The use of the method relies on the assumption that people often know how they would behave in actual situations of choice, and on the further assumption that the subjects have no special reason to disguise their true preferences.”

Psychologists tell us that in order to learn from experience, two ingredients are necessary: frequent practice and immediate feedback.

Because learning takes practice, we are more likely to get things right at small stakes than at large stakes. This means critics have to decide which argument they want to apply. If learning is crucial, then as the stakes go up, decision-making quality is likely to go down.

Eventually I settled on a formulation that involves two kinds of utility: acquisition utility and transaction utility. Acquisition utility is based on standard economic theory and is equivalent to what economists call “consumer surplus.”

Humans, on the other hand, also weigh another aspect of the purchase: the perceived quality of the deal. That is what transaction utility captures. It is defined as the difference between the price actually paid for the object and the price one would normally expect to pay, the reference price.

With those provisos out of the way, we can proceed to the punch line. People are willing to pay more for the beer if it was purchased from the resort than from the convenience store. The median answers, adjusted for inflation, were $7.25 and $4.10.
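
A rough sketch of how the two utilities combine, using the beer example (the beer’s consumption value is an assumption added for illustration; only the $7.25 and $4.10 medians come from the text):

```python
# Sketch of acquisition utility (consumer surplus) plus transaction utility
# (the perceived quality of the deal). The consumption value is an assumed
# number for illustration.

def purchase_utility(consumption_value, price_paid, reference_price):
    acquisition = consumption_value - price_paid    # what the beer is worth minus what you pay
    transaction = reference_price - price_paid      # how the price compares with what you'd expect to pay
    return acquisition + transaction

beer_value = 8.00   # assumed value of a cold beer on a hot beach

# Same beer, same $7.25 price: at a fancy resort the price matches expectations,
# while at a run-down store it feels like a rip-off, so overall utility is lower.
print(purchase_utility(beer_value, price_paid=7.25, reference_price=7.25))
print(purchase_utility(beer_value, price_paid=7.25, reference_price=4.10))
```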

Because consumers think this way, sellers have an incentive to manipulate the perceived reference price and create the illusion of a “deal.” One example that has been used for decades is announcing a largely fictional “suggested retail price,” which actually just serves as a misleading suggested reference price. In America, some products always seem to be on sale, such as rugs and mattresses, and at some retailers, men’s suits.

Once you recognize the break-even effect and the house money effect, it is easy to spot them in everyday life. They occur whenever there are two salient reference points, for instance where you started and where you are right now. The house money effect—along with a tendency to extrapolate recent returns into the future—facilitates financial bubbles.

When most people think about Adam Smith, they think of his most famous work, The Wealth of Nations. This remarkable book—the first edition was published in 1776—created the foundation for modern economic thinking. Oddly, the most well-known phrase in the book, the vaunted “invisible hand,” mentioned earlier, appears only once, treated with a mere flick by Smith. He notes that by pursuing personal profits, the typical businessman is “led by an invisible hand to promote an end which was no part of his intention. Nor is it always the worse for the society that it was no part of it.”

The bulk of Smith’s writings on what we would now consider behavioral economics appeared in his earlier book The Theory of Moral Sentiments, published in 1759.

These worries led Strotz to engage in what has become an obligatory discussion of Homer’s tale of Odysseus and the Sirens. Almost all researchers on self-control—from philosophers to psychologists to economists—eventually get around to talking about this ancient story, and for once, I will follow the traditional path. Odysseus wanted to both hear the music and live to tell about it. He devised a two-part plan to succeed. The first part was to make sure that his crew did not hear the Sirens’ call, so he instructed them to fill their ears with wax. The second part of the plan was to have his crew bind him to the mast, allowing Odysseus to enjoy the show without risking the inevitable temptation to steer the ship toward the rocks.

At some point in pondering these questions, I came across a quote from social scientist Donald McIntosh that profoundly influenced my thinking: “The idea of self-control is paradoxical unless it is assumed that the psyche contains more than one energy system, and that these energy systems have some degree of independence from each other.”

One innovation was the rebate, introduced by Chrysler in 1975, and quickly followed by Ford and GM. The car companies would announce a temporary sale whereby each buyer of a car would receive some cash back, usually a few hundred dollars. A rebate seems to be just another name for a temporary sale, but rebates seemed to be more popular than an equivalent reduction in price, as one might expect based on mental accounting. Suppose the list price of the car was $14,800. Reducing the price to $14,500 did not seem like a big deal, not a just-noticeable difference. But by calling the price reduction a rebate, the consumer was encouraged to think about the $300 separately, which would intensify its importance.

In many situations, the perceived fairness of an action depends not only on who it helps or harms, but also on how it is framed. But firms don’t always get these things right. The fact that my MBA students think it is perfectly fine to raise the price of snow shovels after a blizzard should be a warning to all business executives that their intuitions about what seems fair to their customers and employees might need some fine-tuning.

Of course, if we look around, we see counterexamples to this result all the time. Some people donate to charities and clean up campgrounds, and quite miraculously, at least in America, most urban dog owners now carry a plastic bag when they take their dog for a “walk” in order to dispose of the waste. (Although there are laws in place supposedly enforcing this norm, they are rarely enforced.) In other words, some people cooperate, even when it is not in their self-interest to do so.

The discussion with Charlie and Vernon also led us to recognize that the endowment effect, if true, will reduce the volume of trade in a market. Those who start out with some object will tend to keep it, while those who don’t have such an object won’t be that keen to buy one.

We ran numerous versions of these experiments to answer the complaints of various critics and journal referees, but the results always came out the same. Buyers were willing to pay about half of what sellers would demand, even with markets and learning. Again we see that losses are roughly twice as painful as gains are pleasurable, a finding that has been replicated numerous times over the years.

And while loss aversion is certainly part of the explanation for our findings, there is a related phenomenon: inertia. In physics, an object in a state of rest stays that way, unless something happens. People act the same way: they stick with what they have unless there is some good reason to switch, or perhaps despite there being a good reason to switch. Economists William Samuelson and Richard Zeckhauser have dubbed this behavior “status quo bias.”

A paradigm shift is one of the rare cataclysmic events in science when people make a substantial break with the way the field has been progressing and pursue a new direction. The Copernican revolution, which placed the sun at the center of the solar system, is perhaps the most famous example. It replaced Ptolemaic thinking, in which all the objects in our solar system revolved around the Earth.

Economics is distinguished from other social sciences by the belief that most (all?) behavior can be explained by assuming that agents have stable, well-defined preferences and make rational choices consistent with those preferences in markets that (eventually) clear.

Categories: Society

Hans Rosling – Factfulness

Every group of people I ask thinks the world is more frightening, more violent, and more hopeless—in short, more dramatic—than it really is.

Only actively wrong “knowledge” can make us score so badly.

In short: Think about the world. War, violence, natural disasters, man-made disasters, corruption. Things are bad, and it feels like they are getting worse, right? The rich are getting richer and the poor are getting poorer; and the number of poor just keeps increasing; and we will soon run out of resources unless we do something drastic. At least that’s the picture that most Westerners see in the media and carry around in their heads. I call it the overdramatic worldview. It’s stressful and misleading. In fact, the vast majority of the world’s population lives somewhere in the middle of the income scale. Perhaps they are not what we think of as middle class, but they are not living in extreme poverty. Their girls go to school, their children get vaccinated, they live in two-child families, and they want to go abroad on holiday, not as refugees. Step-by-step, year-by-year, the world is improving. Not on every single measure every single year, but as a rule. Though the world faces huge challenges, we have made tremendous progress. This is the fact-based worldview.

The overdramatic worldview is so difficult to shift because it comes from the very way our brains work.

The human brain is a product of millions of years of evolution, and we are hard-wired with instincts that helped our ancestors to survive in small groups of hunters and gatherers. Our brains often jump to swift conclusions without much thinking, which used to help us to avoid immediate dangers. We are interested in gossip and dramatic stories, which used to be the only source of news and useful information. We crave sugar and fat, which used to be life-saving sources of energy when food was scarce. We have many instincts that used to be useful thousands of years ago, but we live in a very different world now.

Our quick-thinking brains and cravings for drama—our dramatic instincts—are causing misconceptions and an overdramatic worldview.

Uncontrolled, our appetite for the dramatic goes too far, prevents us from seeing the world as it is, and leads us terribly astray.

This chapter is about the first of our ten dramatic instincts, the gap instinct. I’m talking about that irresistible temptation we have to divide all kinds of things into two distinct and often conflicting groups, with an imagined gap—a huge chasm of injustice—in between. It is about how the gap instinct creates a picture in people’s heads of a world split into two kinds of countries or two kinds of people: rich versus poor.

Graphs showing levels of income, or tourism, or democracy, or access to education, health care, or electricity would all tell the same story: that the world used to be divided into two but isn’t any longer. Today, most people are in the middle. There is no gap between the West and the rest, between developed and developing, between rich and poor. And we should all stop using the simple pairs of categories that suggest there is. 

Of the world population, what percentage lives in low-income countries? The majority suggested the answer was 50 percent or more. The average guess was 59 percent. The real figure is 9 percent. Only 9 percent of the world lives in low-income countries.

To summarize: low-income countries are much more developed than most people think. And vastly fewer people live in them. The idea of a divided world with a majority stuck in misery and deprivation is an illusion. A complete misconception. Simply wrong.

So, to replace them, I will now suggest an equally simple but more relevant and useful way of dividing up the world. Instead of dividing the world into two groups I will divide it into four income levels, as set out in the image below.

Today the vast majority of people are spread out in the middle, across Levels 2 and 3, with the same range of standards of living as people had in Western Europe and North America in the 1950s. And this has been the case for many years.

The gap instinct makes us imagine division where there is just a smooth range, difference where there is convergence, and conflict where there is agreement. It is the first instinct on our list because it’s so common and distorts the data so fundamentally. If you look at the news or click on a lobby group’s website this evening, you will probably notice stories about conflict between two groups, or phrases like “the increasing gap.”

Of course, gap stories can reflect reality. In apartheid South Africa, black people and white people lived on different income levels and there was a true gap between them, with almost no overlap. The gap story of separate groups was absolutely relevant. But apartheid was very unusual. Much more often, gap stories are a misleading overdramatization. In most cases there is no clear separation of two groups, even if it seems like that from the averages. We almost always get a more accurate picture by digging a little deeper and looking not just at the averages but at the spread: not just the group all bundled together, but the individuals. Then we often see that apparently distinct groups are in fact very much overlapping.

We are naturally drawn to extreme examples, and they are easy to recall.

These stories of opposites are engaging and provocative and tempting—and very effective for triggering our gap instinct—but they rarely help understanding. There will always be the richest and the poorest, there will always be the worst regimes and the best. But the fact that extremes exist doesn’t tell us much. The majority is usually to be found in the middle, and it tells a very different story.

To control the gap instinct, look for the majority.

Beware comparisons of extremes. In all groups, of countries or people, there are some at the top and some at the bottom. The difference is sometimes extremely unfair. But even then the majority is usually somewhere in between, right where the gap is supposed to be.

Over the last 20 years, the proportion of people living in extreme poverty has almost halved. But in our online polls, in most countries, less than 10 percent knew this. Remember the four income levels from chapter 1? In the year 1800, roughly 85 percent of humanity lived on Level 1, in extreme poverty.

Level 1 is where all of humanity started. It’s where the majority always lived, until 1966. Until then, extreme poverty was the rule, not the exception.

Back in 1800, when Swedes starved to death and British children worked in coal mines, life expectancy was roughly 30 years everywhere in the world.

There’s a dip in the global life expectancy curve in 1960 because 15 to 40 million people—nobody knows the exact number—starved to death that year in China, in what was probably the world’s largest ever man-made famine. The Chinese harvest in 1960 was smaller than planned because of a bad season combined with poor governmental advice about how to grow crops more effectively. The local governments didn’t want to show bad results, so they took all the food and sent it to the central government. There was no food left. One year later the shocked inspectors were delivering eyewitness reports of cannibalism and dead bodies along roads. The government denied that its central planning had failed, and the catastrophe was kept secret by the Chinese government for 36 years. It wasn’t described in English to the outside world until 1996. (Think about it. Could any government keep the death of 15 million people a global secret today?)

Your own country has been improving like crazy too. I can say this with confidence even though I don’t know where you live, because every country in the world has improved its life expectancy over the last 200 years. In fact almost every country has improved by almost every measure.

In large part, it is because of our negativity instinct: our instinct to notice the bad more than the good. There are three things going on here: the misremembering of the past; selective reporting by journalists and activists; and the feeling that as long as things are bad it’s heartless to say they are getting better.

And thanks to increasing press freedom and improving technology, we hear more, about more disasters, than ever before. When Europeans slaughtered indigenous peoples across America a few centuries ago, it didn’t make the news back in the old world. When central planning resulted in mass famine in rural China, millions starved to death while the youngsters in Europe waving communist red flags knew nothing about it. When in the past whole species or ecosystems were destroyed, no one realized or even cared. Alongside all the other improvements, our surveillance of suffering has improved tremendously. This improved reporting is itself a sign of human progress, but it creates the impression of the exact opposite.

Factfulness is … recognizing when we get negative news, and remembering that information about bad events is much more likely to reach us. When things are getting better we often don’t hear about them. This gives us a systematically too-negative impression of the world around us, which is very stressful. 

To control the negativity instinct, expect bad news. 

  • Better and bad. Practice distinguishing between a level (e.g., bad) and a direction of change (e.g., better). Convince yourself that things can be both better and bad. 
  • Good news is not news. Good news is almost never reported. So news is almost always bad. When you see bad news, ask whether equally positive news would have reached you. 
  • Gradual improvement is not news. When a trend is gradually improving, with periodic dips, you are more likely to notice the dips than the overall improvement. 
  • More news does not equal more suffering. More bad news is sometimes due to better surveillance of suffering, not a worsening world. 
  • Beware of rosy pasts. People often glorify their early experiences, and nations often glorify their histories.

The world population today is 7.6 billion people, and yes, it’s growing fast. Still, the growth has already started to slow down, and the UN experts are pretty sure it will keep slowing down over the next few decades. They think the curve will flatten out at somewhere between 10 and 12 billion people by the end of the century.

The UN experts are not predicting that the number of children will stop increasing. They are reporting that it is already happening. The radical change that is needed to stop rapid population growth is that the number of children stops growing. And that is already happening. How could that be? That, everybody should know.

When I was born in 1948, women on average gave birth to five children each. After 1965 the number started dropping like it never had done before. Over the last 50 years it dropped all the way to the amazingly low world average of just below 2.5.

The large increase in population is going to happen not because there are more children. And not, in the main, because old folks are living longer. In fact the UN experts do predict that by 2100, world life expectancy will have increased by roughly 11 years, adding 1 billion old people to the total and taking it to around 11 billion. The large increase in population will happen mainly because the children who already exist today are going to grow up and “fill up” the diagram with 3 billion more adults. This “fill-up effect” takes three generations, and then it is done.

When combining all the parents living on Levels 2, 3, and 4, from every region of the world, and of every religion or no religion, together they have on average two children. No kidding! This includes the populations of Iran, Mexico, India, Tunisia, Bangladesh, Brazil, Turkey, Indonesia, and Sri Lanka, just to name a few examples.

Critical thinking is always difficult, but it’s almost impossible when we are scared. There’s no room for facts when our minds are occupied by fear.

Yet here’s the paradox: the image of a dangerous world has never been broadcast more effectively than it is now, while the world has never been less violent and more safe.

In fact, the number of deaths from acts of nature has dropped far below half. It is now just 25 percent of what it was 100 years ago.

This chapter has touched on terrifying events: natural disasters (0.1 percent of all deaths), plane crashes (0.001 percent), murders (0.7 percent), nuclear leaks (0 percent), and terrorism (0.05 percent). None of them kills more than 1 percent of the people who die each year, and still they get enormous media attention.

“In the deepest poverty you should never do anything perfectly. If you do you are stealing resources from where they can be better used.”

Today there are robust data sets for making the kinds of comparisons I made in Nacala on a global scale, and they show the same thing: It is not doctors and hospital beds that save children’s lives in countries on Levels 1 and 2. Beds and doctors are easy to count and politicians love to inaugurate buildings. But almost all the increased child survival is achieved through preventive measures outside hospitals by local nurses, midwives, and well-educated parents. Especially mothers: the data shows that half the increase in child survival in the world happens because the mothers can read and write. More children now survive because they don’t get ill in the first place.

I first discovered how useful the 80/20 rule is when I started to review aid projects for the Swedish government. In most budgets, around 20 percent of the lines sum up to more than 80 percent of the total. You can save a lot of money by making sure you understand these lines first. Doing just that is how I discovered that half the aid budget of a small health center in rural Vietnam was about to be spent on 2,000 of the wrong kind of surgical knives. It’s how I discovered that 100 times too much—4 million liters—of baby formula was about to be sent to a refugee camp in Algeria. And it is how I stopped 20,000 testicular prostheses from being sent to a small youth clinic in Nicaragua. In each case I simply looked for the biggest single items taking up 80 percent of the budget, then dug down into any that seemed unusual. In each case the problem was due to a simple confusion or tiny error such as a missing decimal point. The 80/20 rule is as easy as it seems. You just have to remember to use it.
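
A small sketch of that review habit (the budget lines and amounts below are invented for illustration; only the habit of scanning the biggest lines first comes from the text):

```python
# Sort a budget by line size and flag the few items that cover roughly 80%
# of total spending; those are the lines to scrutinise first. All figures
# here are made up for illustration.

budget = {
    "surgical knives": 200_000,
    "baby formula": 120_000,
    "vaccine cold chain": 40_000,
    "staff training": 15_000,
    "stationery": 3_000,
}

total = sum(budget.values())
running = 0
print("Review these lines first:")
for item, cost in sorted(budget.items(), key=lambda kv: kv[1], reverse=True):
    running += cost
    print(f"  {item}: {cost:,} ({cost / total:.0%} of the budget)")
    if running / total >= 0.8:   # stop once ~80% of spending is covered
        break
```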

By the end of this century, the UN expects there to have been almost no change in the Americas and Europe but 3 billion more people in Africa and 1 billion more in Asia. By 2100 the new PIN code of the world will be 1-1-4-5. More than 80 percent of the world’s population will live in Africa and Asia.

When I see a lonely number in a news report, it always triggers an alarm: What should this lonely number be compared to? What was that number a year ago? Ten years ago? What is it in a comparable country or region? And what should it be divided by? What is the total of which this is a part? What would this be per person? I compare the rates, and only then do I decide whether it really is an important number. 

Factfulness is … recognizing when a lonely number seems impressive (small or large), and remembering that you could get the opposite impression if it were compared with or divided by some other relevant number. 

To control the size instinct, get things in proportion. 

  • Compare. Big numbers always look big. Single numbers on their own are misleading and should make you suspicious. Always look for comparisons. Ideally, divide by something. 
  • 80/20. Have you been given a long list? Look for the few largest items and deal with those first. They are quite likely more important than all the others put together. 
  • Divide. Amounts and rates can tell very different stories. Rates are more meaningful, especially when comparing between different-sized groups. In particular, look for rates per person when comparing between countries or regions.
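
A tiny sketch of the “divide” rule in practice (the event counts are invented and the populations are ballpark figures, used only to show how an absolute number and a per-person rate can point in opposite directions):

```python
# The same comparison told two ways: absolute counts vs. per-million rates.
# Event counts are invented for illustration; populations are rough ballparks.

countries = {
    # name: (events, population)
    "Big Country":   (10_000, 1_400_000_000),
    "Small Country": (1_000, 10_000_000),
}

for name, (events, population) in countries.items():
    per_million = events / population * 1_000_000
    print(f"{name}: {events:,} events in total, {per_million:.0f} per million people")

# The larger absolute number belongs to the country with the *lower* rate,
# which is why rates are the more meaningful comparison.
```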

During the Second World War and the Korean War, doctors and nurses discovered that unconscious soldiers stretchered off the battlefields survived more often if they were laid on their fronts rather than on their backs. On their backs, they often suffocated on their own vomit. On their fronts, the vomit could exit and their airways remained open. This observation saved many millions of lives, not just of soldiers. The “recovery position” has since become a global best practice, taught in every first-aid course on the planet. (The rescue workers saving lives after the 2015 earthquake in Nepal had all learned it.) But a new discovery can easily be generalized too far. In the 1960s, the success of the recovery position inspired new public health advice, against most traditional practices, to put babies to sleep on their tummies. As if any helpless person on their back needed just the same help. The mental clumsiness of a generalization like this is often difficult to spot. The chain of logic seems correct. When seemingly impregnable logic is combined with good intentions, it becomes nearly impossible to spot the generalization error. Even though the data showed that sudden infant deaths went up, not down, it wasn’t until 1985 that a group of pediatricians in Hong Kong actually suggested that the prone position might be the cause. Even then, doctors in Europe didn’t pay much attention. It took Swedish authorities another seven years to accept their mistake and reverse the policy. Unconscious soldiers were dying on their backs when they vomited. Sleeping babies, unlike unconscious soldiers, have fully functioning reflexes and turn to the side if they vomit while on their backs. But on their tummies, maybe some babies are not yet strong enough to tilt their heavy heads to keep their airways open. (The reason the prone position is more dangerous is still not fully understood.)

Factfulness is … recognizing when a category is being used in an explanation, and remembering that categories can be misleading. We can’t stop generalization and we shouldn’t even try. What we should try to do is to avoid generalizing incorrectly.

To control the generalization instinct, question your categories. 

  • Look for differences within groups. Especially when the groups are large, look for ways to split them into smaller, more precise categories.
  • Look for similarities across groups. If you find striking similarities between different groups, consider whether your categories are relevant. 
  • Look for differences across groups. Do not assume that what applies for one group (e.g., you and other people living on Level 4 or unconscious soldiers) applies for another (e.g., people not living on Level 4 or sleeping babies).
  • Beware of “the majority.” The majority just means more than half. Ask whether it means 51 percent, 99 percent, or something in between. 
  • Beware of vivid examples. Vivid images are easier to recall but they might be the exception rather than the rule. 
  • Assume people are not idiots. When something looks strange, be curious and humble, and think, In what way is this a smart solution?

Almost every religious tradition has rules about sex, so it is easy to understand why so many people assume that women in some religions give birth to more children. But the link between religion and the number of babies per woman is often overstated. There is, though, a strong link between income and number of babies per woman.

Factfulness is … recognizing that many things (including people, countries, religions, and cultures) appear to be constant just because the change is happening slowly, and remembering that even small, slow changes gradually add up to big changes. 

To control the destiny instinct, remember slow change is still change. 

  • Keep track of gradual improvements. A small change every year can translate to a huge change over decades. 
  • Update your knowledge. Some knowledge goes out of date quickly. Technology, countries, societies, cultures, and religions are constantly changing. 
  • Talk to Grandpa. If you want to be reminded of how values have changed, think about your grandparents’ values and how they differ from yours. 
  • Collect examples of cultural change. Challenge the idea that today’s culture must also have been yesterday’s, and will also be tomorrow’s.

You probably know the saying “give a child a hammer and everything looks like a nail.” When you have valuable expertise, you like to see it put to use. Sometimes an expert will look around for ways in which their hard-won knowledge and skills can be applied beyond where they are actually useful. So, people with math skills can get fixated on the numbers. Climate activists argue for solar everywhere. And physicians promote medical treatment where prevention would be better.

Experts in maternal mortality who understand the point about hammers and nails can see that the most valuable intervention for saving the lives of the poorest mothers is not training more local nurses to perform C-sections, or better treatment of severe bleeding or infections, but the availability of transport to the local hospital. The hospitals were of limited use if women could not reach them: if there were no ambulances, or no roads for the ambulances to travel on. Similarly, educators know that it is often the availability of electricity rather than more textbooks or even more teachers in the classroom that has the most impact on learning, as students can do their homework after sunset.

Instead of comparing themselves with extreme socialist regimes, US citizens should be asking why they cannot achieve the same levels of health, for the same cost, as other capitalist countries that have similar resources. The answer is not difficult, by the way: it is the absence of the basic public health insurance that citizens of most other countries on Level 4 take for granted. Under the current US system, rich, insured patients visit doctors more than they need, running up costs, while poor patients cannot afford even simple, inexpensive treatments and die younger than they should. Doctors spend time that could be used to save lives or treat illness providing unnecessary, meaningless care. What a tragic waste of physician time.

People like me, who believe this, are often tempted to argue that democracy leads to, or is even a requirement for, other good things, like peace, social progress, health improvements, and economic growth. But here’s the thing, and it is hard to accept: the evidence does not support this stance. Most countries that make great economic and social progress are not democracies. South Korea moved from Level 1 to Level 3 faster than any country had ever done (without finding oil), all the time as a military dictatorship. Of the ten countries with the fastest economic growth in 2016, nine of them score low on democracy.

Our press may be free, and professional, and truth-seeking, but independent is not the same as representative: even if every report is itself completely true, we can still get a misleading picture through the sum of true stories reporters choose to tell. The media is not and cannot be neutral, and we shouldn’t expect it to be.

The urgency instinct makes us want to take immediate action in the face of a perceived imminent danger. It must have served us humans well in the distant past. If we thought there might be a lion in the grass, it wasn’t sensible to do too much analysis. Those who stopped and carefully analyzed the probabilities are not our ancestors. We are the offspring of those who decided and acted quickly with insufficient information.

The five that concern me most are the risks of global pandemic, financial collapse, world war, climate change, and extreme poverty.

The Spanish flu that spread across the world in the wake of the First World War killed 50 million people—more people than the war had, although that was partly because the populations were already weakened after four years of war.