Categories
Psychology Society

Will Storr – The Status Game

As a tribal species, our personal survival has always depended on our being accepted into a supportive community. Powerful emotions compel us to connect: the joy of belongingness and agony of rejection. But once inside a group, we’re rarely content to flop about on its lower rungs. We seek to rise. When we do, and receive acclaim from our people, we feel as if our lives have meaning and purpose and that we’re thriving.

Our need for status gives us a thirst for rank and a fear of its loss that deforms our thinking and denies us the possibility of reliable happiness. It’s why, even as we raise ourselves so high above the other animals we appear to them as gods, we still behave like them – and worse. Always on alert for slights and praise, we can be petty, hateful, aggressive, grandiose and delusional.

This is why, I’ve come to believe, we make a fundamental error when we reflexively categorise our desire for status as shameful. A greater understanding of what helps drive us on our good days and bad must surely be useful. Digging beneath the flattering stories we like to tell of ourselves can help us see more clearly how we can become better, but also how easily we become tempted into delusion and tyranny.

We’re going to define three different forms of the status game – the dominance game, the virtue game and the success game – and ask how certain kinds of play can lead us into a fairer, wealthier tomorrow.

When asked why we do the things we do, we rarely say, ‘It’s because of status. I really love it.’ It can be distasteful to think of it as any kind of motivating force, let alone a vital one. It contradicts the heroic story we like to tell of ourselves. When we pursue the great goals of our lives, we tend to focus on our happy ending. We want the qualification, the promotion, the milestone, the crown. These motivations, that tend to spring to mind immediately, are known by researchers as ‘proximate’. They’re absolutely real and valid but they have other upstream ‘ultimate’ causes. Ultimate causes are often subconscious and so hidden from us: they’re the reason we want the qualification, the promotion, the milestone, the crown, in the first place.

Wherever psychologists look, they find a remarkably powerful link between status and wellbeing. One study of more than sixty thousand people across 123 countries found people’s wellbeing ‘consistently depended on the degree to which people felt respected by others’. Attainment of status or its loss was ‘the strongest predictor of long-term positive and negative feelings’.

Psychologists find that simply connecting with others and feeling accepted by them can be profoundly good for us. But equally revealing is how our minds and bodies react when we fail to connect. A wide range of research finds people with depression tend to belong to ‘far fewer’ groups than the rest of the population. Studies across time suggest the more a depressed person identifies with their group – the more of their own sense of self they invest in it – the more their symptoms lift.

In one study, participants were told they were taste-testing chocolate chip cookies. Before the test began, they were asked to mingle with other tasters then choose two they’d like to work with. Some were told (falsely) that nobody had picked them; others that everyone had. The first group, who’d been socially rejected, went on to eat an average of nine cookies more than the non-rejected: nearly twice the number. Most of them even rated the taste of the cookies more highly, implying their rejection actually altered their perceptions of the sugary food.

Epidemiologist Sir Michael Marmot was surprised to discover that precisely how high a civil servant climbed in the game of the civil service predicted their health outcomes and mortality rates. This was not, as you might reasonably assume, to do with the wealthier individuals leading healthier and more privileged lifestyles. This effect, which Marmot calls the ‘status syndrome’, was entirely independent: a wealthy smoker just one rung below the very top of the status game was more likely to fall ill, as a result of their habit, than the smoker one rung above them.

One review of the scientific literature found that ‘perceiving oneself as having low rank compared to others is consistently linked to higher depressive symptoms’. Some psychologists argue that when we become depressed we ‘mentally withdraw from the competition for higher status’. This keeps us off ‘high-status individuals’ radars’ and conserves energy, helping us cope with the ‘reduced opportunities imposed by low status’.

Much of what seems inarguably real and true, in the space around us, is not. The actual world is monochrome and silent. Sounds, colours, tastes and smells exist only in the projection in our heads. What’s actually out there are vibrating particles, floating chemical compounds, molecules and colourless light waves of varying lengths.

A psychologically healthy brain excels at making its owner feel heroic. It does this by reordering our experiences, remixing our memories and rationalising our behaviour, using a battery of reality-warping weapons that make us believe we’re more virtuous, more correct in our beliefs and have more hopeful futures in store than others.

These apparently trite symbols matter. In one test, when participants were shown photos of people wearing ‘rich’ or ‘poor’ clothes, they automatically assumed those in wealthier looking outfits were significantly more competent and of higher status. This effect remained when they were warned upfront of the potential bias, when they were informed the clothing was definitely irrelevant and when they were told all the people worked in sales at a ‘mid-size firm in the Midwest’ and earned around US$80,000. It even remained when the participants were paid money to make an accurate guess.

The status detection system is highly evident in the behaviour of youngsters. Around three-quarters of arguments between children aged between 18 and 30 months are over possessions, a figure that rises to 90 per cent when just two toddlers are present. For developmental psychologist Professor Bruce Hood, possession is a ‘means to establish where you are in the nursery pecking order’.

This has been found many times, with one study using data from twelve thousand British adults concluding ‘the ranked position of an individual’s income predicts general life satisfaction, whereas absolute income and reference income have no effect’.

These rules were essential because humans can often be greedy, dishonest and aggressive. One survey of sixty premodern societies uncovered seven common rules of play that are thought to be universal: help your family; help your group; return favours; be brave; defer to superiors; divide resources fairly; respect others’ property. These elemental rules dictate the ways humans keep their tribes working well.

In one study, 86 per cent of Australians rated their job performance as ‘above average’; in another, 96 per cent of Americans described themselves as ‘special’. East Asian games tend to be more collective.

The brain begins learning these rules in infancy. As 2-year-olds, we have around one hundred trillion connections between our brain cells, double that of an adult. This is because, when we’re born, we don’t know where we’re going to pop out. Baby brains are specialised for many environments, many games. At this age, we’re better than adults at recognising faces of other races and can hear tones in foreign languages that grown-ups are deaf to.

Much of the rest of human life is comprised of three varieties of status-striving and three varieties of game: dominance, virtue and success. In dominance games, status is coerced by force or fear. In virtue games, status is awarded to players who are conspicuously dutiful, obedient and moralistic. In success games, status is awarded for the achievement of closely specified outcomes, beyond simply winning, that require skill, talent or knowledge.

In Tanzania, Hadza hunters who share meat widely ‘gain great social status – prestige that can be parlayed into powerful social alliances, the deference of other men, and greater mating success’, writes evolutionary psychologist David Buss. People engage in ‘competitive altruism’, battling to be ‘seen by others as great contributors to the group’. Of course, status is awarded to the altruistic in more modern societies too: studies show those who donate to charity, for example, experience ‘a dramatic boost in prestige in the eyes of others’.

Chimpanzee troops have been found to be ‘several hundred to a thousand times’ more aggressive than even the most violent human societies.

One survey found 53 per cent of Americans saying they’d prefer instant death to the reputation of a child molester; 70 per cent opted for the amputation of their dominant hand over a swastika tattoo on their face; 40 per cent preferred a year in jail to the reputation of a criminal.

It’s in this way that children in countries such as India overcome the pain of eating spicy foods. Mimicking the actions of high-status people is so desirable, it’s argued, their brains reinterpret the pain signals as pleasurable. Children are thought to teach themselves to enjoy spice-burning foods using automatic prestige-driven imitation. They rarely have to be forced.

Status games run on powerlines of influence and deference that crackle up and down their hierarchy. This is why, of all the countless status symbols that exist in human life, influence is probably the most reliable. We often assume money or fancy possessions are the most certain symbols of a person’s rank, but the highest-status monk in the world may have less wealth, and fewer Hermès ties, than the most junior banker on Wall Street. Influence is different.

But serious violence among women and girls is comparatively rare. For psychologist Professor Jonathan Haidt, ‘girls and boys are equally aggressive but their aggression is different. Boys’ aggression revolves around the threat of violence: “I will physically hurt you” … but girls’ aggression has always been relational: “I will destroy your reputation or your relationships”.’ Researchers argue female aggression tends to be ‘indirect’.

Humiliation has been described by researchers as ‘the nuclear bomb of the emotions’ and has been shown to cause major depressions, suicidal states, psychosis, extreme rage and severe anxiety, ‘including ones characteristic of post-traumatic stress disorder’. Criminal violence expert Professor James Gilligan describes the experience of humiliation as an ‘annihilation of the self’.

If humans are players, programmed to seek connection and status, humiliation insults both our deepest needs.

The only way to recover is to find a new game even if that means rebuilding an entire life and self. ‘Many humiliated individuals find it necessary to move to another community to recover their status, or more broadly, to reconstruct their lives.’

An African proverb says, ‘the child who is not embraced by the village will burn it down to feel its warmth’. If the game rejects you, you can return in dominance as a vengeful God, using deadly violence to force the game to attend to you in humility.

Researchers find happiness isn’t closely linked to our socioeconomic status, which captures our rank compared with others across the whole of society, including class. It’s actually our smaller games that matter: ‘studies show that respect and admiration within one’s local group, but not socioeconomic status, predicts subjective well-being’.

Behavioural scientist B.J. Fogg’s model said a person is compelled to act when three forces collide in a moment: motivation (we must want the thing); trigger (something must happen to prompt a desire to get more of it); and ability (it must be easy).
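The three-forces model above can be sketched as a toy predicate. This is an illustrative sketch only: the multiplicative rule, the threshold value and the function name are assumptions chosen for demonstration, not part of the model’s published form.

```python
# Toy sketch of the three-forces behaviour model: an action fires only
# when motivation, ability and a trigger coincide. The multiplicative
# combination and the 0.5 threshold are illustrative assumptions.

def behaviour_occurs(motivation: float, ability: float, triggered: bool,
                     threshold: float = 0.5) -> bool:
    """Motivation and ability trade off: a strong urge can carry a hard
    task, an easy task needs little urge, but no trigger means no action."""
    if not triggered:
        return False
    return motivation * ability >= threshold

print(behaviour_occurs(0.9, 0.7, True))   # wanted, easy, prompted -> True
print(behaviour_occurs(0.9, 0.2, True))   # wanted but too hard -> False
print(behaviour_occurs(0.9, 0.7, False))  # no trigger, no action -> False
```

Note how the trigger acts as a gate while motivation and ability compensate for one another, which matches the ‘three forces collide in a moment’ framing.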

Fogg described a way of issuing rewards such that they’d encourage compulsive behaviours. If a programmer wanted to create a certain action in a user, they should offer a symbol of reinforcement after the user had performed the desired ‘target behaviour’. But here was the trick: the positive reinforcement would be inconsistent. You wouldn’t always know what you were going to get.

‘To strengthen an existing behaviour, reinforcers are most effective when they are unpredictable,’ Fogg wrote in 2003.

We await replies, likes or upvotes and, just as a gambler never knows how the slot machine will pay out, we don’t know what reward we’ll receive for our contribution. Will we go up? Will we go down? The great prize changes every time. This variation creates compulsion. We just want to keep playing, again and again, to see what we’ll get.
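That slot-machine dynamic can be simulated in a few lines. This is an illustrative sketch only; the prize table and its probabilities are invented, and `check_feed` is a hypothetical name standing in for any refresh-and-see action.

```python
import random

# Sketch of a variable (unpredictable) reinforcement schedule like the
# one described above. The prizes and weights are invented: mostly
# nothing happens, occasionally there's a modest or big payoff.

PRIZES = (0, 0, 1, 5, 20)
WEIGHTS = (50, 25, 15, 8, 2)

def check_feed() -> int:
    """One 'refresh the feed' event: the reward is never predictable."""
    return random.choices(PRIZES, weights=WEIGHTS, k=1)[0]

random.seed(0)  # fixed seed so a demo run is repeatable
session = [check_feed() for _ in range(10)]
print(session)  # a run of mostly-nothing punctuated by occasional wins
```

The compulsion described in the text comes precisely from this variance: a fixed schedule (same prize every time) is quickly discounted, while an unpredictable one keeps the player pulling.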

Canny players sense the flaw in their elites and seek to improve their own rank with flattery. And flattery works: organisational scholar Dennis Tourish calls it a ‘perfumed trap’. A study of 451 CEOs found that leaders who were exposed to more frequent and intense flattery and agreement rated their own abilities more highly, were less able to change course when things went wrong, and led firms that were more likely to suffer persistently poor performance.

Surprisingly, what made the most difference to their behaviour wasn’t the level of inequality in their game, but whether or not the inequality was visible. When players’ wealth was hidden, everyone, including the elites, became more egalitarian. But when wealth was displayed, players in every game became less friendly, cooperated ‘roughly half as much’, and the rich were significantly more likely to exploit the poor.

This is why poverty alone doesn’t tend to lead to revolutions. Revolutions – defined as mass movements to replace a ruling order in the name of social justice – have been found to occur in middle-income countries more than the poorest. Sociologist Professor Jack Goldstone writes, ‘what matters is that people feel they are losing their proper place in society for reasons that are not inevitable and not their fault’.

‘By the time we are thirteen,’ writes psychologist Professor Mitch Prinstein, ‘it seems as if there is nothing more important to us than this type of popularity. We talk about who has it. We strategise how to get it. We are devastated when we lose it. We even do things we know are wrong, immoral, illegal, or dangerous merely to obtain status, or to fiercely defend it.’

Psychologist Dr Lilliana Mason writes, ‘more often than not, citizens do not choose which party to support based on policy opinion; they alter their policy opinion according to which party they support. Usually they do not notice that this is happening, and most, in fact, feel outraged when the possibility is mentioned.’

The moral reality we live in is a virtue game. We use our displays of morality to manufacture status. It’s good that we do this. It’s functional. It’s why billionaires fund libraries, university scholarships and scientific endeavours; it’s why a study of 11,672 organ donations in the USA found only thirty-one were made anonymously. It’s why we feel good when we commit moral acts and thoughts privately and enjoy the approval of our imaginary audience. Virtue status is the bribe that nudges us into putting the interests of other people – principally our co-players – before our own.

When neuroscientist Professor Sarah Gimbel presented forty people with evidence their strongly held political beliefs were wrong, the response she observed in their brains was ‘very similar to what would happen if, say, you were walking through the forest and came across a bear’.

Professor Sam Gosling finds this when his students cluster into personality groups: ‘the extroverts don’t disguise their disdain for the uncommunicative introverts, who selfishly refuse to keep the discussion alive; they cannot fathom why their mute colleagues don’t do their bit to carry some of the conversational load. At the same time, the introverts have nothing but contempt for their garrulous counterparts; why not, they wonder, wait until you’ve got something worth saying before opening your mouth?’

For political psychologist Dr Lilliana Mason, part of the reason we continually war for victory is that ‘people are compelled to think of their groups as better than others. Without that, they themselves feel inferior.’ At a ‘very primal level’ players are motivated ‘to view the world through a competitive lens, with importance placed on their own group’s superiority’. Humans love to become superior: to win.

A game’s command over its players strengthens when it flips into a mode of war.

For the vast majority of our time on earth, then, humans haven’t been subject to the tyranny of leaders. Instead, we lived in fear of what anthropologists call the ‘tyranny of the cousins’. These ‘cousins’ weren’t necessarily actual cousins. They’d usually be clan elders who, in these shallow hierarchies, passed for the elite.

In the end, she saved herself. At the time of writing, Templer’s company still exists, as does her blog. By conforming to the tyrannical cousins, and the frenzy spreading across the gossip networks of social media, she avoided being ‘cancelled’ – which is what we call it when internet mobs, unsatisfied by mockery, denunciation and humiliation meted out online, attempt to have their target de-graded as much as possible in the physical world.

A study of seventy million messages on the Chinese platform Weibo found the emotion that ‘travelled fastest and farthest through the social network’ was anger. Meanwhile, studies of mobbing events on Twitter find shamers increase their follower counts faster than non-shamers.

One investigation found those most likely to circulate ‘hostile political rumours’ including conspiracy theories and ‘fake news’ on social media were often ‘status-obsessed, yet socially marginalised’, their behaviour fuelled by a ‘thwarted desire for high status’, their aim, to ‘mobilise the audience against disliked elites’.

We found these same currents in the collective dreams of one of history’s most lethal games. The Nazis were Elliot Rodger, Ed Kemper and Ted Kaczynski on the level of a culture. They told a self-serving story that explained their catastrophic lack of status and justified its restoration in murderous attack. But it’s not just Germany that’s been possessed in this way. Nations the world over become dangerous when humiliated. One study of ninety-four wars since 1648 found 67 per cent were motivated by matters of national standing or revenge, with the next greatest factor – security – coming in at a distant 18 per cent.

Researchers find a primary motivation for suicide bombers is ‘the shame and humiliation induced by foreign troops in their country’.

When Algerians killed 103 French people following a riot, their colonialist masters sent aeroplanes to destroy forty-four villages, a cruiser to bombard coastal towns and commandos to slaughter on land: the French admit to 1,500 deaths, the Algerians claim 50,000. It’s for reasons like these that psychologist Dr Evelin Lindner has concluded that, ‘The most potent weapon of mass destruction’ is ‘the humiliated mind’.

Toxic morality is deeply implicated in these episodes: ‘genocide is highly moralistic’. Genocides are dominance-virtue games, carried out in the name of justice and fairness and the restoration of the correct order.

A ‘work ethic’ came into being, in which toil itself became prestigious. ‘This shift can be understood as the beginning of a work-centred society’, writes historian Professor Andrea Komlosy, ‘in which the diverse activities of all of its members are increasingly obliged to take on the traits of active production and strenuous exertion.’

We were getting our status from new kinds of games. Slowly, and in fits and starts, our focus had been juddering from duty to the clan towards individual competence and success. This changed our psychology, rewriting the cultural coding of our game-playing brains, turning us into new sorts of humans.

For Protestants, life was no longer a gruelling test for heaven or hell. God already knew where you were ending up. Believers were to look for clues of ‘assurance’ to see if they were saved or damned: signs of ‘elect status’ could be found in their own personal behaviour such as virtuous and sober living, but also in the accrual of wealth and rank on earth. Believers were said to have a personal ‘calling’. God had endowed them with special talents that they should seek to maximise by choosing the right occupation or vocation, then working hard in it.

Humans had been able to conquer the planet partly because we exist in a web of stored information. Each individual born didn’t have to learn everything afresh for themselves: knowledge was communicated by elders and passed down through the generations.

By connecting our ability to accumulate knowledge to our desire for status, they’d discovered the future.

This ‘Industrial Revolution’ was a status goldrush. It came to define the country’s mood and culture. Britons ‘became innovators because they adopted an improving mentality’, writes historian Dr Anton Howes. This mentality spread like a ‘disease’ that could infect ‘anyone … rich and poor, city-dwellers and rustics, Anglicans and dissenters, Whigs and Tories, skilled engineers and complete amateurs’.

One of the most famous, Scottish economist Adam Smith, is commonly known as the ‘Father of Capitalism’. The hyper-individualistic, self-interested, money-obsessed world we live in today is linked, perhaps more than to anyone else, to him and his theories of how free markets and competition generate prosperity. But Smith didn’t believe greed for wealth was the ultimate driver of economies. He thought something else was going on, something deeper in the human psyche. ‘Humanity does not desire to be great, but to be beloved,’ he wrote in 1759.

We win points for personal success throughout our lives, in the highly formalised and often precisely graded games of school, college and work. In the street, in the office and on social media we signal our accomplishments with appearance, possessions and lifestyles. We’re self-obsessed, because this is the game we’re raised to play.

Following the depression and world wars, the economies of the USA and Britain became more rule-bound, virtuous and group-focussed: it was an era of increasing regulation over banking and business, high taxation (topping out at 90 per cent in America in the 1940s and 1950s), broad unionisation and ‘big government’ innovations such as the New Deal, the Social Security Act, the minimum wage and the welfare state.

American and British players became concomitantly collective: the monkey-suited ‘Corporation Man’ of the 1950s suburbs gave birth to the even more collectively minded hippies, with their anti-materialistic values.

But in the 1980s, the game changed again. During the previous decade, the economies of the West had started to fail. New ways of playing were sought. The leaders of the UK and USA, Margaret Thatcher and Ronald Reagan, decided to make the game significantly more competitive. In 1981, Thatcher told journalists, ‘What’s irritated me about the whole direction of politics in the last thirty years is that it’s always been toward the collectivist society.’

We see this perfect human all around us, beaming with flawless teeth from advertising, film, television, media and the internet. Young, agreeable, visibly fit, self-starting, productive, popular, globally-minded, stylish, self-confident, extrovert, busy. Who is it, this person we feel so pressured to punch ourselves into becoming? It’s the player best equipped to win status in the game we’re in.

Led by psychologist Dr Thomas Curran, the researchers discovered all the forms of perfectionism they looked at had risen between 1989 and 2016. Social perfectionism had grown the most. The extent to which people felt they had to ‘display perfection to secure approval’ had soared by 32 per cent.

Today, sixty-nine of the hundred largest economies on earth are not nations but corporations. In the first quarter of 2021 alone, technology company Apple made more money than the annual GDP of 135 countries; its market valuation was higher than the GDP of Italy, Brazil, Canada, South Korea and Russia.

In just three years, between 2015 and 2018, support for capitalism among young Americans fell from 39 per cent to 30 per cent; a 2019 poll found 36 per cent of millennials saying they approve of Communism. Sociologist Professor Thomas Cushman writes, ‘anti-capitalism has become, in some ways, a central pillar of the secular religion of the intellectuals, the habitus of modern critical intellectuals as a status group’.

Between 1979 and 2005, the average real hourly wage for white working-class Americans without a high-school diploma declined by 18 per cent.

‘Education lies at the heart of this divide.’ Most of the 41 per cent of white millennials who voted for Trump in 2016 didn’t have college degrees. In all, white non-college voters comprised around three-fifths of Trump’s support in 2016; 74 per cent of people with no qualifications supported Brexit, the educational divide being greater than that of social class, income or age.

The person most credited with attempting to realise this dream is Vladimir Ilyich Ulyanov, better known as Lenin. His hatred for the bourgeoisie was blinding, violent and total; many contemporary historians see its genesis in the humiliation his own upper-middle-class family suffered after his brother, Sasha, was executed for a ‘laughably amateur’ but nearly successful assassination plot.

By 1920, 5.4 million were directly employed by the government. ‘There were twice as many officials as there were workers in Soviet Russia and these officials were the main social base of the new regime,’ writes Figes. ‘This was not a Dictatorship of the Proletariat but a Dictatorship of the Bureaucracy.’

During the Great Terror, the police were issued quotas for what percentage of their district was to be shot or sent to the camps. On 2 June 1937, it was ordered that 35,000 were to be ‘repressed’ in one district, 5,000 of whom were to be shot. Between 1937 and 1938, 165,200 priests were arrested, 106,800 of whom were shot. In the same period, an average of one and a half thousand people were executed daily. One and a half million ordinary Russians were arrested by the secret police, nearly seven hundred thousand were executed for ‘counter-revolutionary activities’.

Throughout the 1930s, there came into being a complex hierarchy of status. Stalin might have admitted there were now three classes, but sociologists found at least ten: the ruling elite; superior intelligentsia; general intelligentsia; working-class aristocracy; white collar; well-to-do peasants; average workers; average peasants; disadvantaged workers; forced labour.

The new elites gained access to special apartments and had the best goods automatically reserved for them. Their children were sent to exclusive summer camps. They received holidays, chauffeur-driven cars and money. It became ‘normal’ for them to have live-in servants.

As for the state itself, it argued their privilege was temporary: soon all of the USSR would live like this. They were not a privileged elite, went the thinking, they were a vanguard.

More than two thousand years before the revolution, the Ancient Greek who’d first dreamed the Communist dream, Plato, had been corrected by his student, Aristotle, who pointed out it wasn’t actually wealth or private ownership that created the human yearning to get ahead. That yearning was a part of our nature: ‘it is not possession but the desires of mankind which require to be equalized’.

To persuade us to push a penis in and out of a vagina, it invented orgasm. To persuade us to sacrifice our wellbeing for a screaming, shit-smeared infant, it made love. To persuade us to force mashed-up foreign objects down our throats, it evolved taste and appetite. To persuade us to engage in groupish, co-operative living, it conjured the obsessive joys of connection and acclaim. Follow the rules, and follow them well, and you can expect to feel great.

As we play for ever-greater status, for ourselves and our games, we weave a self-serving and highly motivating dream that writhes with saints and demons and irrational beliefs. This dream is presented to us as reality. It’s entirely convincing, in all its colour, noise and pristine focus. We see evidence everywhere that it’s true. It has the power to seduce us into the most depraved acts of hatred and barbarity. But it can also lead us into modes of play that truly make a better world.

Psychologists studying optimal self-presentation discuss a set of closely related ideas. Professor Susan Fiske argues that, when encountering others, people ask of them two fundamental questions: ‘What are their intentions?’ and ‘What’s their capacity to pursue them?’ If we want to supply the right answers, and so be received positively, Fiske finds we should behave in ways that imply warmth and competence. More recently it’s been argued a third component should be added.

For Professor Jennifer Ray, morality is ‘not only a critical and separable dimension … it may even be the primary dimension’. Elsewhere, ‘perceived sincerity’ has been found to be essential to successful ‘impression management’.

Throughout history, leaders have succeeded by telling a story that says their group is deserving of more status, which, under their direction, they’ll win. But it remains important this evangelical passion doesn’t morph into arrogance.

Tyrannies are virtue-dominance games. Much of their daily play and conversation will focus on matters of obedience, belief and enemies. Is the game you’re playing coercing people, both inside and outside it, into conforming to its rules and symbols? Does it attempt to silence its ideological foes? Does it tell a simplistic story that explains the hierarchy, deifying their group whilst demonising a common enemy? Are those around you obsessed with their sacred beliefs?

Some forms of status are easier to win than others. For those of us who aren’t pretty, virtue is probably the easiest to find of all. It’s as simple as judging people: because status is relative, their de-grading raises us up, if only in our minds.

Morality poisons empathy.

I believe we can all take consolation in the knowledge that nobody ever gets there, not the superstars, the presidents, the geniuses or the artists we gaze up at in envy and awe. That promised land is a mirage. In our lowest moments, we should remind ourselves of the truth of the dream: that life is not a story, but a game with no end. This means it isn’t a final victory we should seek but simple, humble progress: the never-ending pleasure of moving in the right direction. Nobody wins the status game. They’re not supposed to. The meaning of life is not to win, it’s to play.

Categories
Psychology Society

Richard Thaler – Misbehaving

The core premise of economic theory is that people choose by optimizing. Of all the goods and services a family could buy, the family chooses the best one that it can afford.
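The premise of choosing by optimizing can be made concrete with a toy brute-force search for the best affordable bundle. Everything here (the goods, prices, utilities and budget) is invented for illustration; real consumer theory works with continuous utility functions and marginal analysis, not a grid search.

```python
from itertools import product

# Toy 'choose by optimizing' sketch: among all affordable bundles of
# goods, pick the one with the highest total utility. All numbers are
# invented for illustration.

prices = {"bread": 2, "cheese": 5, "wine": 8}   # price per unit
utility = {"bread": 3, "cheese": 7, "wine": 10}  # utils per unit
budget = 20

best_bundle, best_utils = None, -1
for qty in product(range(11), repeat=3):         # up to 10 units of each
    bundle = dict(zip(prices, qty))
    cost = sum(prices[g] * q for g, q in bundle.items())
    if cost <= budget:                           # the bundle must be affordable
        utils = sum(utility[g] * q for g, q in bundle.items())
        if utils > best_utils:
            best_bundle, best_utils = bundle, utils

print(best_bundle, best_utils)
```

With these made-up numbers the utility-per-dollar of bread is highest, so the ‘optimizing’ family spends the whole budget on bread, which is exactly the kind of cold calculation the chapter goes on to contrast with how real humans behave.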

Giving up the opportunity to sell something does not hurt as much as taking the money out of your wallet to pay for it. Opportunity costs are vague and abstract when compared to handing over actual cash.

I called this phenomenon the “endowment effect” because, in economists’ lingo, the stuff you own is part of your endowment, and I had stumbled upon a finding that suggested people valued things that were already part of their endowment more highly than things that could be part of their endowment, that were available but not yet owned.

Roughly speaking, losses hurt about twice as much as gains make you feel good.

The fact that a loss hurts more than an equivalent gain gives pleasure is called loss aversion. It has become the single most powerful tool in the behavioral economist’s arsenal.
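The ‘twice as much’ asymmetry can be written as a minimal value function. The linear form and the coefficient of 2.0 are simplifying assumptions; the value function behavioural economists actually use also curves, with diminishing sensitivity to larger gains and losses.

```python
# Simplified prospect-theory-style value function: gains count at face
# value, losses are scaled up by a loss-aversion coefficient of ~2.
# Linear shape and the exact coefficient are illustrative assumptions.

def felt_value(change: float, loss_aversion: float = 2.0) -> float:
    """How a monetary change 'feels' relative to the reference point."""
    return change if change >= 0 else loss_aversion * change

print(felt_value(100))   # a $100 gain feels like +100
print(felt_value(-100))  # a $100 loss feels like -200
```

The asymmetry is the whole point: a coin flip offering +$100 or -$100 has a felt value of (100 - 200) / 2 = -50, which is why people refuse small symmetric gambles.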

“By default, the method of hypothetical choices emerges as the simplest procedure by which a large number of theoretical questions can be investigated. The use of the method relies on the assumption that people often know how they would behave in actual situations of choice, and on the further assumption that the subjects have no special reason to disguise their true preferences.”

Psychologists tell us that in order to learn from experience, two ingredients are necessary: frequent practice and immediate feedback.

Because learning takes practice, we are more likely to get things right at small stakes than at large stakes. This means critics have to decide which argument they want to apply. If learning is crucial, then as the stakes go up, decision-making quality is likely to go down.

Eventually I settled on a formulation that involves two kinds of utility: acquisition utility and transaction utility. Acquisition utility is based on standard economic theory and is equivalent to what economists call “consumer surplus.”

Humans, on the other hand, also weigh another aspect of the purchase: the perceived quality of the deal. That is what transaction utility captures. It is defined as the difference between the price actually paid for the object and the price one would normally expect to pay, the reference price

With those provisos out of the way, we can proceed to the punch line. People are willing to pay more for the beer if it was purchased from the resort than from the convenience store. The median answers, adjusted for inflation, were $7.25 and $4.10.
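Thaler's two-part utility can be sketched in a few lines. The dollar figures below are made up for illustration and are not taken from the beer study:

```python
def total_utility(value_of_good, price_paid, reference_price):
    """Acquisition utility is the consumer surplus (value minus price);
    transaction utility is the perceived quality of the deal
    (reference price minus price actually paid)."""
    acquisition = value_of_good - price_paid
    transaction = reference_price - price_paid
    return acquisition + transaction

# The same $5 beer, worth $6 to you, feels different depending on
# where it was bought: a fancy resort sets a high reference price,
# a run-down convenience store a low one.
print(total_utility(6, 5, 7))  # 3: surplus of 1 plus a $2 "deal"
print(total_utility(6, 5, 3))  # -1: surplus of 1 minus a $2 "rip-off"
```

The point of the sketch is that the good consumed is identical in both cases; only the perceived deal, anchored on the reference price, moves the total.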

Because consumers think this way, sellers have an incentive to manipulate the perceived reference price and create the illusion of a “deal.” One example that has been used for decades is announcing a largely fictional “suggested retail price,” which actually just serves as a misleading suggested reference price. In America, some products always seem to be on sale, such as rugs and mattresses, and at some retailers, men’s suits.

Once you recognize the break-even effect and the house money effect, it is easy to spot them in everyday life. It occurs whenever there are two salient reference points, for instance where you started and where you are right now. The house money effect—along with a tendency to extrapolate recent returns into the future—facilitates financial bubbles.

When most people think about Adam Smith, they think of his most famous work, The Wealth of Nations. This remarkable book—the first edition was published in 1776—created the foundation for modern economic thinking. Oddly, the most well-known phrase in the book, the vaunted “invisible hand,” mentioned earlier, appears only once, treated with a mere flick by Smith. He notes that by pursuing personal profits, the typical businessman is “led by an invisible hand to promote an end which was no part of his intention. Nor is it always the worse for the society that it was no part of it.”

The bulk of Smith’s writings on what we would now consider behavioral economics appeared in his earlier book The Theory of Moral Sentiments, published in 1759.

These worries led Strotz to engage in what has become an obligatory discussion of Homer’s tale of Odysseus and the Sirens. Almost all researchers on self-control—from philosophers to psychologists to economists—eventually get around to talking about this ancient story, and for once, I will follow the traditional path. Odysseus wanted to both hear the music and live to tell about it. He devised a two-part plan to succeed. The first part was to make sure that his crew did not hear the Sirens’ call, so he instructed them to fill their ears with wax. The second part of the plan was to have his crew bind him to the mast, allowing Odysseus to enjoy the show without risking the inevitable temptation to steer the ship toward the rocks.

At some point in pondering these questions, I came across a quote from social scientist Donald McIntosh that profoundly influenced my thinking: “The idea of self-control is paradoxical unless it is assumed that the psyche contains more than one energy system, and that these energy systems have some degree of independence from each other.”

One innovation was the rebate, introduced by Chrysler in 1975, and quickly followed by Ford and GM. The car companies would announce a temporary sale whereby each buyer of a car would receive some cash back, usually a few hundred dollars. A rebate seems to be just another name for a temporary sale, but they seemed to be more popular than an equivalent reduction in price, as one might expect based on mental accounting. Suppose the list price of the car was $14,800. Reducing the price to $14,500 did not seem like a big deal, not a just-noticeable difference. But by calling the price reduction a rebate, the consumer was encouraged to think about the $300 separately, which would intensify its importance.

In many situations, the perceived fairness of an action depends not only on who it helps or harms, but also on how it is framed. But firms don’t always get these things right. The fact that my MBA students think it is perfectly fine to raise the price of snow shovels after a blizzard should be a warning to all business executives that their intuitions about what seems fair to their customers and employees might need some fine-tuning.

Of course, if we look around, we see counterexamples to this result all the time. Some people donate to charities and clean up campgrounds, and quite miraculously, at least in America, most urban dog owners now carry a plastic bag when they take their dog for a “walk” in order to dispose of the waste. (Although there are laws in place supposedly enforcing this norm, they are rarely enforced.) In other words, some people cooperate, even when it is not in their self-interest to do so.

The discussion with Charlie and Vernon also led us to recognize that the endowment effect, if true, will reduce the volume of trade in a market. Those who start out with some object will tend to keep it, while those who don’t have such an object won’t be that keen to buy one.

We ran numerous versions of these experiments to answer the complaints of various critics and journal referees, but the results always came out the same. Buyers were willing to pay about half of what sellers would demand, even with markets and learning. Again we see that losses are roughly twice as painful as gains are pleasurable, a finding that has been replicated numerous times over the years.

And while loss aversion is certainly part of the explanation for our findings, there is a related phenomenon: inertia. In physics, an object in a state of rest stays that way, unless something happens. People act the same way: they stick with what they have unless there is some good reason to switch, or perhaps despite there being a good reason to switch. Economists William Samuelson and Richard Zeckhauser have dubbed this behavior “status quo bias.”

A paradigm shift is one of the rare cataclysmic events in science when people make a substantial break with the way the field has been progressing and pursue a new direction. The Copernican revolution, which placed the sun at the center of the solar system, is perhaps the most famous example. It replaced Ptolemaic thinking, in which all the objects in our solar system revolved around the Earth.

Economics is distinguished from other social sciences by the belief that most (all?) behavior can be explained by assuming that agents have stable, well-defined preferences and make rational choices consistent with those preferences in markets that (eventually) clear.

Categories
Philosophy

A Matter of Perspective

The author and podcast host Tim Ferriss often finishes his interviews with the question: “If you could have a gigantic billboard anywhere with anything on it, what would it say and why?” I was discussing this question with a friend the other day. This is what my answer came down to:

Start with yourself, before criticizing the world. Remember you are holding only a perspective, never the truth. Nobody knows, we are all just trying our best. 

Think of them as three intertwined components, where each part works best in the context of the other ones. Here is what they mean.

 

Start With Yourself, Before Criticizing the World

This is not exactly contrarian. Mother Teresa once said, “If each of us would only sweep our own doorstep, the whole world would be clean”; Nelson Mandela stated, “One of the most difficult things is not to change society but to change yourself”; and Gandhi’s philosophy is often summarized as “You must be the change you wish to see in the world”. It is nonetheless a message that is more needed than ever. Our world grows more polarized, with political and cultural ideologies becoming increasingly hostile to one another. Amid woke identity politics, cancel culture, climate denial, and right-wing populism, we have lost our sense of togetherness. People rarely seem to question themselves, and all too often they argue with a conviction that is both enviable and utterly unjustified given our modern world’s complexity.

Our public discourse nowadays often comes from a place of resentment and anger, with little attempt to understand other perspectives. Truly considering other perspectives is not an easy task. It requires letting go of your own ego, something that has to be learned. Thus, before proclaiming a big change in the world, it seems wise to practice locally. Start by being a good friend, brother or sister, son or daughter, husband or wife. Then grow beyond this. Many of us are greatly privileged. It is a modern tragedy how easily these privileges are forgotten and how much conscious effort it takes to bring them back into perspective. Growing to become your best self and nourishing gratitude for the person you are and the privileges you enjoy seems crucial, as one cannot be grateful and angry, furious, or resentful at the same time. And, as the naive argument goes, if all of us were to do this, our world would be saved. But honestly, wouldn’t it?


Remember You Are Holding Only a Perspective, Never the Truth

To start working on yourself before critiquing others requires some degree of epistemic humility, meaning the ability to recognize and accept our limited knowledge of the world. While there is some ground truth in hard sciences such as mathematics or physics, most things in life don’t have a right or wrong answer. There is no one “correct” political opinion, nor is there one set of “superior” moral values. Seldom are conflicts caused by only one person, and hardly ever can we know something for sure.

Don’t mistake this for a case of moral relativism; it’s not. The author and philosopher Sam Harris describes morality as a mountainous landscape. There is more than one moral peak and more than one valley, but without any doubt, there are better and worse opinions and values to hold. Yet it’s crucial to be open-minded. Think about the fundamental shifts in morality of the last 200 years, from the abolition of slavery to the rise of feminism. Not being open-minded might leave you stuck in a valley without even noticing it. To put it in the words of the Stoic philosopher Epictetus: “It is impossible for a man to learn what he thinks he already knows.”

Epistemic humility is also not an argument against forming opinions, which is essential to navigating life’s complexity. It does, however, put opinions into perspective. One way of pointing out the limitations of opinions is to think about them as stories. Stories help us make sense of our experiences. As reality is far too complex to be fully comprehended, narratives are our way of dealing with it. We simplify the experience, categorize it, and stitch together patterns in such a way that they form a coherent story. Yet, by definition, every story is a reduction of reality and its complexity. This has two implications. First, no single story is an accurate representation of the truth. Second, every story, depending on a person’s personality and past experiences, is unique. Thus, the same experience can and often does mean two fundamentally different things to two different people. In this sense, no two people live in the same world.


Nobody Knows, We Are All Just Trying Our Best 

It’s not just that our personal narratives aren’t wholly accurate; we have far less control over them than we believe. In fact, one might say it’s the stories that choose us. We don’t select our genes; we have to play the cards we are dealt. Nor do we choose the environment we are raised in. Entering adulthood and eventually moving out is just another way of transitioning into a bubble that is the result of our genes and previous environments. For better or worse, this leaves us with less agency than we commonly assume. Our limited ability to choose our stories implies that the conflicts and disagreements we encounter are often not caused by the malicious intentions we so eagerly attribute to others. Instead, people often behave the way they do because they simply can’t help but see things the way they do.
 

Putting Together The Puzzle

The above makes a case for humility, acceptance, and tolerance with regard to opinions and people. It does not, however, suggest any kind of passivity. That is to say, let us remember the power of conversation and change. Begin with yourself. Constantly question your own beliefs and opinions. Then engage with others from a position of openness. Never impose anything on anyone; instead, listen actively, ask openly, and suggest carefully. As every perspective is unique, everybody holds a piece of the puzzle we call truth. So maybe we can figure it out together?

Categories
Buddhism Philosophy

Alan Watts – The Meaning of Happiness

The point on which I have insisted in many different ways is, in brief, that this special and supreme order of happiness is not a result to be attained through action, but a fact to be realized through knowledge. The sphere of action is to express it, not to gain it.

In the terms of the great Oriental philosophies, man’s unhappiness is rooted in the feeling of anxiety which attends his sense of being an isolated individual or ego, separate from “life” or “reality” as a whole. On the other hand, happiness—a sense of harmony, completion, and wholeness—comes with the realization that the feeling of isolation is an illusion.

The Meaning of Happiness explains that the psychological equivalent of this doctrine is a state of mind called “total acceptance,” a yes-saying to everything that we experience, the unreserved acceptance of what we are, of what we feel and know at this and every moment.

Wisdom therefore consists in accepting what we are, rather than in struggling fruitlessly to be something else, as if it were possible to run away from one’s own feet.

But whether it is called the giving up of self, submitting to the will of God, accepting life, releasing the tension of striving for happiness or letting oneself go with the stream of life, the essential principle is one of relaxation.

Relaxation is something just as elusive as happiness; it is something which no amount of self-assertive striving can obtain, for as it is in a certain sense the absence of effort, any effort to achieve it is self-defeating.

These arise for two principal reasons: first, that twentieth-century, civilized man is so centered in his own limited self-consciousness that he is quite unaware of its origin, of the directing forces that lie beneath it; and second, that the real problem is not to bring about a state of affairs which does not as yet exist, but to realize something which is already happening—“as it was in the beginning, is now and ever shall be.” For although civilized man appears to live only from his self-conscious center, although he appears divorced from nature, from a spiritual point of view this is a mere conceit. In other words, at this very moment we have that union and harmony in spite of ourselves; we create spiritual problems simply through not being aware of it, and that lack of understanding causes and in turn is caused by the delusion of self-sufficiency. As Christianity would say, the Grace of God is always being freely offered; the problem is to get man to accept it and give up the conceit that he can save himself by the power of his ego, which is like trying to pick himself up by his own belt.

Both Oriental and Western psychology, however, state the problem in a rather different way. They say that if the ego can be made to look into itself, it will see that its own true nature is deeper than itself, that it derives its faculties and its consciousness from a source beyond individual personality. In other words, the ego is not really a self at all; it is simply a function of that inner universe.

In much the same way, speech is a function of the human being, and it is possible that one given only the sense of hearing might think that the voice is the man.

It is unusually complicated because in fact it is unusually simple; its solution lies so close to us and is so self-evident that we have the greatest difficulty in seeing it, and we must complicate it in order to bring it into focus and be able to discuss it at all.

“If you want to see, see directly into it; but when you try to think about it, it is altogether missed.”

A man and his wife had a mysterious goose that from time to time favored them by laying a golden egg. When this had been going on for some weeks they began to think it rather tiresome of the goose to part with its gold so gradually, for they imagined that it carried a store of such eggs inside itself. Not having the sense to weigh the creature first and find out if it was much heavier than a goose should be, they decided to kill it and cut it open. As might be expected, they found only one ordinary, dead goose, void of gold eggs and unable to produce any more.

For the meaning is in the whole, and not only the meaning but the very existence of the thing. Indeed, we are only aware of life and life is only able to manifest itself because it is divided into innumerable pairs of opposites: we know motion by contrast with stillness, long by short, light by darkness, heat by cold, and joy by sorrow.

Just as too much light blinds the eyes, too much pleasure numbs the senses; to be apparent it needs contrast.

For as the snail and the tortoise withdraw into their shells, man retires into his castle of illusion.

If we liked pain as much as pleasure we might shortly become extinct, for it is only this original fear of pain which urges us to self-preservation.

But note the term original fear. Man’s difficulty is that his fear is seldom original; it is once or many times removed from originality, being not just simple fear but the fear of being afraid.

Man does not like to admit to himself that he is afraid, for this weakens his self-esteem and shakes his faith in the security of his ego.

To accept fear would be like accepting death, so he runs from it, and this is the great unhappiness. Sometimes it is expressed in sheer unbridled terror, but more often it is a half-concealed, gnawing anxiety moving in vicious circles to an ever-greater intensity. It would have been better to say in the first place, “I am afraid, but not ashamed.”

The troubles which he tries to avoid are the only things which make him aware of his blessings, and if he would love the latter he must fear the former.

The isolation of the human soul from nature is, generally speaking, a phenomenon of civilization. This isolation is more apparent than real, because the more nature is held back by brick, concrete, and machines, the more it reasserts itself in the human mind, usually as an unwanted, violent, and troublesome visitor.

All men suffer, now as well as in ancient times, but not all are unhappy, for unhappiness is a reaction to suffering, not suffering itself. Therefore, generally speaking, the primitive was unhappy from his conflict with the external forces of nature. But the unhappiness of civilized man is chiefly the result of conflict with natural forces inside himself and inside human society, forces that are all the more dangerous and violent because they come in unrecognized and unwanted at the back door.

But it is not often realized that the apparent departure from nature which we have in civilization is an absolutely essential stage in man’s development. Without it we should remain like the elder son in the parable, jealous and unappreciative. For only those who have sinned can understand and appreciate the bliss of redemption.

The Hindus represent the evolution of man as a circle. Starting at the top he falls, instinctively and unconsciously, to the bottom, at which point they say he enters the extreme of materiality and self-consciousness, the age of Kali Yuga. From thereon he must climb up the second half of the circle and so return in full consciousness to the point from which he began. But truly to be united with nature again, he must first experience that absolute division between himself and the universe (or life).

Christianity differs from many other religions in according the existence of an immortal soul only to man. The rest of creation exists principally for man’s convenience, for no other living creature is of any special significance in the divine plan.

But in early Christian thought and practice there was, with few exceptions, an utter lack of concern for anything beyond the salvation of man.

It was not surprising, therefore, that Christianity took on an increasingly human or anthropomorphic conception of God.

Therefore when it has to accept an irrational impulse it rationalizes it in the course of putting it into effect. When the unregenerate Adam desires blood just for the sake of blood, the reasoning machine has to find a reasonable purpose for shedding blood, however specious.

Nevertheless, Freudian doctrine aroused little sympathy until after the Great War when it achieved sudden success, primarily through the ability of its method of psychological healing to cure cases of shell shock. But the outburst of the unregenerate Adam in the war itself made Freud’s ideas much more acceptable, though it is surprising how many intelligent people even today will refuse to admit that they have such a thing as an unconscious mind.

For to him the unconscious mind is personal only on its surface; essentially it is collective, racial, and perhaps universal, for Jung found that in their dreams modern men and women spontaneously produced myths and symbols thousands of years old of which they had no conscious knowledge.

Jung describes the ego (which we ordinarily regard as our central self) as a complex of the unconscious. That is to say, it is a device employed by the unconscious mind to achieve certain results; in the same way the apparently self-contained human body is a device employed by nature to achieve certain results.

Thus to the Hindus man’s self was identified with his individual person only because of his limited vision; they knew that if this vision could be enlarged, he would discover that his true self was Brahman. In other words, man’s ego is a trick or device (maya) to

For the peculiar thing is that both what we are trying to escape and what we are trying to find are inside ourselves. This, as we have seen, is almost more true of modern man than of the primitive, for our difficulty is what to do with ourselves rather than the external world.

Thus, at the risk of repeating a truism, it is obvious that unless we can come face to face with the difficulty in ourselves, everything to which we look for salvation is nothing more than an extra curtain with which to hide that difficulty from our eyes.

We have examined something of the meaning of unhappiness, of the war between the opposites in the human soul, of the fear of fear, of man’s consequent isolation from nature, and of the way in which this isolation has been intensified in the growth of civilization. We have also shown how man is intimately and inseparably connected with the material and mental universe, and that if he tries to cut himself off from it he must perish. In fact, however, he can only cut himself off in imagination, otherwise he would cease to exist, but we have yet to decide whether this elusive thing called happiness would result from acceptance of the fact of man’s union with the rest of life.

But if this is true we have to discover how such an acceptance may be made, whether it is possible for man to turn in his flight into isolation and overcome the panic which makes him try to swim against the current instead of with it. In the psychological realm this swimming against the current is called repression, the reaction of proud, conscious reason to the fears and desires of nature in man.

To return to our analogy: life is the current into which man is thrown, and though he struggles against it, it carries him along despite all his efforts, with the result that his efforts achieve nothing but his own unhappiness.

Finding it, he will understand that in fleeing from death, fear, and sorrow he is making himself a slave, for he will realize the mysterious truth that in fact he is free both to live and to die, to love and to fear, to rejoice and to be sad, and that in none of these things is there any shame. But man rejects his freedom to do them, imagining that death, fear, and sorrow are the causes of his unhappiness. The real cause is that he does not let himself be free to accept them, for he does not understand that he who is free to love is not really free unless he is also free to fear, and this is the freedom of happiness.

Hinayana Buddhism

The gist of its teaching is that when you realize that your personal self does not exist, then you are free of suffering, for suffering can arise only when there is a person to suffer.

Strictly speaking, a composer is inspired when melody emerges from the depths of his mind, how or why we do not know. To convey that melody to others he writes it down on paper, employing a technical knowledge which enables him to name the notes which he hears in his mind. This fact is important: his technical knowledge does not create the tune in his mind; it simply provides him with a complicated alphabet, and is no more the source of music than the literary alphabet and the rules of grammar are the sources of men’s ideas.

The spiritual genius works in the same way as the musical genius. He has a wider scope because his technique of expression, his alphabet, is every possible human activity. For some reason there arises in his soul a feeling of the most profound happiness, not because of some special event, but because of the whole of life. This is not necessarily contentment or joy; it is rather that he feels himself completely united to the power that moves the universe, whatever that may be. This feeling he expresses in two ways, firstly by living a certain kind of life, and secondly by translating his feeling into the form of thoughts and words. People who have not had this feeling make observations on his actions and words, and from them formulate the “rules” of religious morality and theology. But this involves a strange distortion, for as a rule the observer goes about his work in the wrong way.

But experience as such never made anyone either free or happy, and insofar as freedom and happiness are concerned with experience the important thing is not experience itself but what is learned from it. Some people learn from experience and others do not; some learn much from a little, others learn little from much. “Without going out of my house,” said the Chinese sage Lao Tzu, “I know the whole universe.”

And this is real freedom; it includes both freedom to move and to be moved; action and passivity are merged, and in spirituality as well as in marriage this is the fulfillment of love.

Wisdom is a quality of the psychological or spiritual relationship between man and his experience. When that relationship is wise and harmonious man’s experiences set him free, but when it is unwise and discordant his experiences bind him. Religion alone can deal with that relationship, and this is its essential function.

For what do we find left in religion when its quasi-scientific aspect is removed? There is the whole, vast problem of love or spiritual union which is contained in the question, “How can I learn to love life, whose source and essence we call God? How can I learn to be united with it in all its expressions, in living and dying, in love and fear, in the outer world of circumstances, and in the inner world of thought and feeling, so that in union with it I may find freedom?”

The will of God as expressed in morality is not a ukase which we should merely obey, for the purpose of His will is not that there should be morality, but that there should be love, and morality is just the “outward and visible sign of an inward and spiritual grace.”

But this kind of religion does not encourage the type of love upon which spirituality is founded. We have seen that its technique is imitative and thus unlikely to produce genuine, firsthand religious experience; we have also seen that its contempt of this world and its concentration on the life hereafter has little to do with the essentials of religion. This is not all, for not only has it little to do with such essentials; it is also a decided hindrance to spiritual growth because it encourages a “love” of God on a false basis. God is loved not because He has given us this world, but because He is said to have promised a much better world in the life after death.

Certainly all pleasures are transient; otherwise we should cease to appreciate them, but if this be made the excuse for refusing to enjoy them, one must suspect that man’s ideas of happiness are horribly confused. The secret of the enjoyment of pleasure is to know when to stop. Man does not learn this secret easily, but to shun pleasure altogether is cowardly avoidance of a difficult task. For we have to learn the art of enjoying things because they are impermanent. We do this every time we listen to music. We do not seize hold of a particular chord or phrase and shout at the orchestra to go on playing it for the rest of the evening; on the contrary, however much we may like that particular moment of music, we know that its perpetuation would interrupt and kill the movement of melody. We understand that the beauty of a symphony is less in these musical moments than in the whole movement from beginning to end. If the symphony tries to go on too long, if at a certain point the composer exhausts his creative ability and tries to carry on just for the sake of filling in the required space of time, then we begin to fidget in our chairs, feeling that he has denied the natural rhythm, has broken the smooth curve from birth to death and that though a pretense at life is being made it is in fact a living death.

religious ideas and practices (which are no more religion itself than any other activities) exist solely to promote a positive and loving attitude toward ordinary life and what it stands for, namely, God. Unless one happens to be a religious specialist, which is not necessarily the same thing as a spiritual person, religious practices are not ends in themselves. They are means to a fuller and greater life in this world, involving a positive and constructive attitude to pleasure and pain alike, and thus an increasing ability to learn happiness and freedom from every possible kind of experience.

An earlier myth than that of St. Michael and the Dragon tells of an encounter with a monster who for every one head slashed off by the hero’s sword grew seven new heads. Indeed, the problem of evil is not quite so straightforward as the accepted technique of “morality by battle” would assume. Those desires, feelings, and impulses in the soul which are called evil seem to thrive on resistance because resistance belongs to their own nature, and, as the Buddha said, “Hatred ceases not by hatred alone; hatred ceases but by love.” This seems reasonable enough when applied to persons, but somehow we find it difficult to believe that the impulse of hate can only be overcome by loving it. But, as with fear, the hate of hatred is only adding one hate to another, and its results are as contrary as those of the war which was fought to end war.

In Christianity the idea of total acceptance is somewhat hidden; it is only spoken of directly in some of the writings of the mystics, but it is soon discovered when we begin to make a thorough search into the symbolism of Christian doctrine. In the religions of the East, however, it is given particular emphasis; in fact, it is the fundamental principle of Vedantist, Buddhist, and Taoist philosophy. The chief difference between these Eastern religions and Christianity is that, on the surface at least, Christianity is concerned with belief in doctrines whereas the Eastern religions are concerned with states of mind. That is to say, Christianity tends to be a theological and ethical religion, while Buddhism, Taoism, and Vedanta are psychological religions.

In the ordinary way the aim of Christianity is to make the person of Jesus as described in the Gospels as vivid a reality as possible so that the believer may love, follow, and serve Him as if He were a real friend standing always at his side. The psychology of Christian faith is therefore one of the personal devotion of the disciple to his Lord and Master, and this expands into mysticism when the believer feels a relation of love to the cosmic as well as to the personal Christ.

The Christian belief that only one historical religious tradition is valid for man is a clear enough sign of this confusion; so much emphasis is placed on history and doctrine as the essentials of salvation that a psychology of religion independent of the person of Christ is not understood. In the three great Eastern religions this confusion does not exist, and from them we are able to form a much clearer idea of the essentials of religion, of the state of mind called spiritual experience as distinct from the “local color” of particular historical events. For if indeed this experience is attainable outside the Christian faith, apart from devotion to a particular personality, and even without reference to theology (as in certain forms of Buddhism), then Eastern religions have two important contributions to make to Western civilization. Firstly, they show the principles of an approach to spiritual experience on a purely psychological basis to those who have lost faith in the historical and theological tenets of Christianity; secondly, Christianity itself can be enriched and expanded in this sadly underdeveloped aspect of its experience, and perhaps led to a higher understanding of spirituality than even many of its own mystics have attained.

Anyone who has studied either Hinduism, Buddhism, or Taoism will know that the object of these religions is to attain a realization of the union between man and the Self of the universe.

As Bernard Shaw says, belief is a matter of taste and is quite unaffected by the objective truth or falsity of that belief. Our belief in a logical universe is a matter of taste, even though it may be objectively true; we say we are reasonable men, but we accept the pronouncements of our scientists with a faith quite as groveling as the faith of peasants who believe unquestioningly those who say they have seen gods and demons. How many people could prove such common beliefs as that the earth revolves round the sun or that atoms are composed of electrons and protons?

In conclusion we may say that for Western man acceptance means this: “Live and let live.” We see the root of our unhappiness in the war between ourselves and the universe, a war in which we so often feel tiny, impotent, and alone. The forces of nature, death, change, and unreasoning passion, seem to be against our most cherished longings, and by no trick or deceit can we get rid of our helpless solitude or of the battle between desire and destiny. Acceptance for us is therefore to say, “Let it live” to the whole situation, to the ego and its desires, to life and destiny, and also to the war between them.

For it seems as if the ego were the organizing faculty whose function is to “make sense” out of a collection of chaotic powers.

From one point of view it is true that almost everyone suffers from some form of neurosis, however mild, but the cure of neurosis by itself is not generally desirable unless one of two other conditions is involved: first, that the neurosis is unbearable, and second, that the psychology can supply a source of creative energy to take the place of the neurosis. In fact, we have neurosis to thank for some of the greatest human genius, for the very motive of escape from conflict has provided a driving force for artistic and scientific accomplishments very worthwhile in themselves, and possession by unconscious forces is the secret of many a creative genius. It may indeed be possible to attribute the masterpieces of Leonardo da Vinci to unresolved problems of infantile sexuality, and maybe the sonnets of Shakespeare were the work of a homosexual.

But the process does not consist simply in watching over one’s dreams; it is fundamentally a question of the conscious assimilation and acceptance of hitherto unconscious processes, in spite of their seeming irrationality and independence of the ego. When this has been carried out successfully for some time, a fundamental change is said to take place in the psyche. This Jung describes as a shifting of the center of personality from the ego to the self, a term which, in his system, has the special meaning of the center of the whole psyche as distinct from the center of consciousness, which is the ego. He explains the self as a “virtual point” between the conscious and the unconscious which gives equal recognition to the demands of both.

Whereas the neurotic genius finds his energy in escape and the natural genius in “possession” by unconscious forces (“which is to madness close allied”), the integrated genius would supposedly be able to draw upon the unconscious life-sources quite freely and consciously.

I think the re-creation of the personality might fairly be described as becoming conscious again of our plurality, of our many souls, and having them all contribute to our being instead of one at a time.

Jung has gone far more deeply into the nature of the unconscious than did Freud, and his system is bound up with aspects of the human soul which have a peculiar magic. Indeed, he goes so deeply that to follow him, not in ideas alone but in experience, is an extremely serious undertaking which involves the gravest risks for those whose feet are not planted on solid earth. And here “fools rush in where angels fear to tread.”

here again the view of such Oriental systems as Taoism and various forms of Buddhism is very suggestive. For here the object is not to reach any particular stage; it is to find the right attitude of mind in whatever stage one may happen to be. This, indeed, is a fundamental principle of those forms of Oriental psychology which we shall be considering. In the course of his evolution man will pass through an indefinite number of stages; he will climb to the crest of one hill to find his road leading on over the crest of another and another. No stage is final because the meaning of life is in its movement and not in the place to which it moves. We have a proverb that to travel well is better than to arrive, which comes close to the Oriental idea.

he is like the dunce who looked for fire with a lighted lantern. Sometimes the longest way round is the shortest way home.

For the unconscious is not, as some imagine, a mental refuse-pit; it is simply unfettered nature, demonic and divine, painful and pleasant, hideous and lovely, cruel and compassionate, destructive and creative. It is the source of heroism, love, and inspiration as well as of fear, hatred, and crime. Indeed, it is as if we carried inside of us an exact duplicate of the world we see around us, for the world is a mirror of the soul, and the soul a mirror of the world.

If anyone imagines Buddhism to be a religion of pure passivity, as we understand it, he should see some of the Chinese paintings of Achala! He might also do well to visit some of the living masters of Zen Buddhism. For the art of becoming reconciled to and at ease with those aspects of natural man which correspond to storm and thunder in the natural universe is to let them rage. Just as there is an incomparable beauty and majesty in thunder and lightning, so also there is something awe-inspiring in the abandoned and uninhibited anger of the sage, which is no mere loss of temper or petty irritability.

For acceptance is emptiness in the Buddhist sense of sunyata, which is sometimes likened to a crystal or a mirror. “The perfect man,” says Chuang Tzu, “employs his mind as a mirror. It grasps nothing; it refuses nothing; it receives, but does not keep.”

Those who have followed partial techniques know that in a life where there is nothing special to be unhappy about there is a kind of barrenness; it is like a wheel without a center, or a perfect lamp without a light. There is nothing to supply any creative fire.

Everything is going just as it should go; the daily routine may be a little dull, but it is by no means unbearable. Certainly there are troubles, but nothing overwhelming. As for one’s own character, well, that is quite normal. There are no serious neurotic troubles and no moral defects. For the most part life is quite agreeable and if death comes at the end of it, that is a matter of course for which nature will prepare us; when the time comes to die we shall be tired and ready to go. That is not a happy life, even though it may be contented; it is simple vegetation.

There is not that joyous response of the individual to the universe which is the essence of spirituality, which expresses itself in religious worship and adoration.

It is a symptom of our spiritual phlegmatism and torpidity that the dance is no longer a part of our ritual and that we worship in churches which, as often as not, resemble cattle pens where people sit in rows and pray by leaning forward in their seats and mumbling.

The Ecstasy of Creation

They, too, know the answer to that eternal question of philosophy, “Why does the universe exist?” They know that it exists for an almost childlike reason—for play, or what the Hindus called lila (which is nearly our own word “lilt”). Chesterton points out that when a child sees you do something wonderful, it asks you to do it again and again. So too he says that God made the earth and told it to move round the sun, and when it had moved round once He was pleased and said, “Do it again.”

Should we ask and expect the universe to conform with our standards of good behavior and doubt the existence of God in all things because He does not observe the ordinary standards of middle-class humanitarian morality?

For the truth is simply that without faith we are forever bashing our heads against an immovable wall. No self-deception, no trick of reason or science, no magic, no amount of self-reliance can make us independent of the universe and enable us to escape its destructive aspect.

Faith means that we give ourselves to it absolutely and utterly, without making conditions of any kind, that we abandon ourselves to God without asking anything in return, save that our abandonment to Him may make us feel more keenly the lilt of His playing. This abandonment is the freedom of the spirit.

That is the only promise which can be given for faith, but what a promise! It means that we share in the ecstasy of His creation and His destruction, and experience the mystery and the freedom of His power in all the aspects of life, in both the heights of pleasure and the depths of pain. It may seem illogical, but those who have once shared in this mystery have a gratitude that knows no bounds and are able to say again that God is Love, though with an altogether new meaning.

There is also the problem of the relation between nature and the ego. If we accept the universe and subordinate ourselves to it, if, instead of trying to live life, we let life live us, we are accepting one aspect of life only to deny another—the aggressive, self-asserting ego in which life has manifested itself.

It seems, therefore, that what we need is, as it were, a higher type of acceptance that includes both acceptance and escape, faith and suspicion, self-abandonment and egotism, surrender and aggressiveness, the Dragon and St. Michael.

The motivating power of the vicious circle is pride. In Christian terms we should say that man is not willing to be saved as he is; he feels that it is necessary for him to do something about it, to earn salvation by his own self-made spirituality and righteousness. The Grace of God is offered freely to all, but through pride man will not accept it. He cannot bear the thought that he is absolutely powerless to lift himself up and that the only chance of salvation is simply to accept something which is offered as freely to the saint as to the sinner.

When it is said that man will not let himself be saved as he is, this is another way of saying that he will not accept himself as he is; subtly he gets around this simple act by making a technique out of acceptance, setting it up as something which he should do in order to be a “good boy.” And as soon as acceptance is made a question of doing and technique we have the vicious circle. True acceptance is not something to be attained; it is not an ideal to be sought after—a state of soul which can be possessed and acquired, which we can add to ourselves in order to increase our spiritual stature.

In other words, as soon as we try to make the ideal state of mind called “acceptance” something different from the state of mind which we have at this moment, this is the pride which makes it so difficult to accept what we are now, the barrier that stands between man and that which we call God or Tao.

And so it happens that the very thing we are forever struggling to get away from, to outgrow, to change, and to escape, is the very thing which holds the much desired secret. That is why there is a vicious circle, why our search for happiness is this frantic running around, pursuing in ignorance that which we are trying to flee.

Bear always in mind that the doctrines of these ancient religions are the symbols of inward, personal experiences rather than attempts to describe metaphysical truth.

Hence Vedanta is also known as the system of Advaita (literally, “not two”) or nonduality, and nonduality in philosophy is the natural expression of total acceptance in psychology. Every object, being, and activity is Brahman in His (or Its) entirety, for Brahman alone is—the “One-without-a-second.”

Chinese saying that “between the All and the Void is only a difference of name.”

man can only become conscious of it, not as metaphysical truth but as spiritual freedom, by seeing his own nature as it is and relaxing that contraction (sankocha) of egoistic pride which will not let his nature be as it is, and which is forever trying to get away from it by making a virtue of acceptance.

Deliverance (kaivalya) or freedom is not the result of any course of action, whether mental or physical or moral; according to Vedanta it comes only by Knowledge in the special sense of gnana (Gk. γνῶσις) as the fruit of “meditation,” which is being rather than doing.

Our ordinary, partial experience is always limited: joy is conditioned by sorrow, pleasure by pain, life by death, and knowledge by ignorance. Therefore the Hindus conceived freedom as an experience which had no conditioning opposite and called it union with Brahman, the “One-without-a-second.”

The Buddha’s teaching is unique in its utter lack of theology; it concentrates wholly on the necessity of arriving at a personal, immediate experience and dispenses with the doctrinal symbol of that experience.

According to him the cause of discord or unhappiness was tanha or selfish craving, which is perhaps best understood as refusal to accept the “three signs of being.” These are:

1. Anicca—Change or Impermanence.
2. Anatta—Literally, “No-self.” The unreality of the ego as a permanent, self-contained, and self-directing unit.
3. Dukkha—In this context, suffering in its widest sense.

Total acceptance of the three signs of being culminates and fulfills itself in the experience of enlightenment or awakening (bodhi), which is the abrupt transition from the dual to the nondual view of life,

The Hinayanists looked upon Nirvana as an escape from the pains of life and death—a conception which to the Mahayanists with their Brahmanic background appeared as the old error of dualism.

But the Mahayanists gave their philosophy of nonduality practical expression in the ideal of the Bodhisattva, who attains liberation but remains in the world of birth and death to assist all other beings to enlightenment.

In a certain sense Buddhism is very much a philosophy and a psychology of the moment, for if we are asked what life is, and if our answer is to be a practical demonstration and not a theory, we can do no better than point to the moment—now! It is in the moment that we find reality and freedom, for acceptance of life is acceptance of the present moment now and at all times.

Acceptance of the moment is allowing the moment to live, which, indeed, is another way of saying that it is to allow life to live, to be what it is now (yathabhutam). Thus to allow this moment of experience and all that it contains freedom to be as it is, to come in its own time and to go in its own time, this is to allow the moment, which is what we are now, to set us free; it is to realize that life, as expressed in the moment, has always been setting us free from the very beginning, whereas we have chosen to ignore it and tried to achieve that freedom by ourselves.

Mahayana scriptures form the largest bible in the world. The whole Mahayana Canon comprises some sixteen hundred works, some of the longer ones, of which there are an appreciable number, running into as many as a hundred and twenty volumes! Even so, we are told that certain parts of it have been lost.

In form rather than content the native Chinese religion of Taoism presents a refreshing contrast. It has only four important scriptures, all of which are eminently readable, straightforward, and brief; these are the works of Lao Tzu, Chuang Tzu, Lieh Tzu, and Huai-nan Tzu.

Therefore toward the end of the eighth century AD the Chinese had evolved a form of Buddhism which combined all the virtues of Buddhism and Taoism, and, I cannot feel by mere chance, the rise of this Chinese school of Buddhism coincided with the golden age of Chinese culture in the dynasties of T’ang, Sung, and Yuan. In Chinese this school was known as Ch’an, but in the West it is more generally known by its Japanese name of Zen,

These stories are rather like jokes. The moment you try to explain a joke it falls flat, and you only laugh when you see the point directly. Thus to explain these stories is really to explain them away. Now Zen never explains; it only gives hints, for, as van der Leeuw has said, “The mystery of life is not a problem to be solved, but a reality to be experienced.”

More than the old Mahayana, more even than Taoism, Zen concentrates on the importance of seeing into one’s own nature now at this moment—not in five minutes when you have had time to “accept” yourself, nor ten years ahead when you have had time to retire to the mountains and meditate. The Zen masters resort to every possible means to direct your attention to yourself, your experience, your state of consciousness as it is now, for, as we have said before, there is no greater freedom than freedom to be what you are now.

The free man walks straight ahead; he has no hesitations and never looks behind, for he knows that there is nothing in the future and nothing in the past that can shake his freedom. Freedom does not belong to him; it is no more his property than the wind, and as he does not possess it he is not possessed by it. And because he never looks behind his actions are said to leave no trace, like the passage of a bird through the air.

Those who search for happiness do not find it because they do not understand that the object of their search is the seeker. We say that they are happy who have “found themselves” for the secret of happiness lies in the ancient saying, “Become what you are.”

This is why total acceptance, which seems to be a response to bondage, is actually a key to freedom, for when you accept what you are now you become free to be what you are now, and this is why the fool becomes a sage when he lets himself be free to be a fool.

Whereupon the ego and the unconscious, man and nature, oneself and life are seen as the two dancers who move in such close accord that it is impossible to say which moves and which responds, which is the active partner and which the passive.

Eckhart says that “the eye with which I see God is the same with which God sees me.” Realization is not predestined to come at a certain time because predestination is an utterly limited half-truth. It may come at any moment, for that union exists eternally.

At each moment the mystic accepts the whole of his experience, including himself as he is, his circumstances as they are, and the relationship between them as it is. Wholeness is his keyword; his acceptance is total, and he excludes no part of his experience, however unsavory it may be. And in this he discovers that wholeness is holiness, and that holiness is another name for acceptability. He is a holy man because he has accepted the whole of himself and thus made holy what he was, is, and shall be in every moment of his life.

Even in resisting her laws one obeys them; and one works with her even in desiring to work against her.…Love is her crown. Only through love does one come near her.…She has isolated all things so that she may bring all together.…All is eternally present in her, for she knows neither past nor future. For her the present is eternity.

But, as we have seen, as soon as you let life live you, you discover that you are living life with an altogether new fullness and zest. To return to the analogy of the dance, it is as if you allowed your partner, life, to swing you along until you so get the “feel” of the dance that you are doing the “swinging” just as much as your partner.

But just as music demands four voices for the full expression of melody and harmony, so the human being demands four fully grown faculties to express the complete possibilities of freedom—and even so they are still expressing only possibilities. Jung classifies the four faculties or functions of man as intuition, sensation, intellect, and feeling, and it is almost impossible that anyone should be awakened to all of them before the middle of life.

Therefore in the process of individuation the psyche may be said to grow a new “organ” which Jung calls the self as distinct from the ego on the one hand, and the unconscious on the other. This self, as the vehicle of freedom, appears as a rule only in the ripeness of years when freedom has become a habit and has shaped the human organism to suit its ends, just as perpetually running water carves out a permanent course in the rock. This is the fulfillment of personality.

In the understanding of our freedom we learn that however low we may sink, we can never separate ourselves from the power of life and the love of God.

Just as love is the meaning of man and woman and has its symbol in the child, so only love can explain all other opposites under the sun.

Without these many opposites there could no more be a universe than there could be melody without the sounding and silencing of notes, and only those who do not accept them can complain that the universe was unfortunately arranged.


Love, however, is not to be confused with liking; we may love the opposites, but because of our human nature we cannot always like them. Only the pervert actually likes suffering, but the love of suffering is known in giving freedom to your dislike of it; for without dislike on our part, suffering is no longer suffering.

This gratitude therefore demands expression in “works of love,” which is to say morality. It makes possible for the first time a genuine morality, for the free man is moral because he wants to be, not because he thinks he ought to be moral. Without gratitude morality is a mere discipline which keeps human society in a relatively stable condition until such time as men learn the freedom of love.

If you try to discover the secret of beauty by taking a flower to pieces, you will arrive at the somewhat unsatisfactory conclusion of having abolished the flower.

For beauty is beauty just because it is a mystery, and when ordinary life is known as a profound mystery then we are somewhere near to wisdom.

If a doctor explains the transformations undergone by food in his stomach, he does not cease to enjoy his dinner. If a scientist tells him that thunder is not the music of the gods but mere electrical disturbances, the thunder is for him no less wonderful. And if some Philistine tells him that playing a violin is only scraping cats’ entrails with horsehair, he simply marvels that melody can emerge from things so unprepossessing in appearance.

Categories
Psychology Relationships

Esther Perel – Mating in Captivity

We all share a fundamental need for security, which propels us toward committed relationships in the first place; but we have an equally strong need for adventure and excitement. Modern romance promises that it’s possible to meet these two distinct sets of needs in one place. Still, I’m not convinced. Today, we turn to one person to provide what an entire village once did: a sense of grounding, meaning, and continuity. At the same time, we expect our committed relationships to be romantic as well as emotionally and sexually fulfilling. Is it any wonder that so many relationships crumble under the weight of it all?

Love flourishes in an atmosphere of closeness, mutuality, and equality. We seek to know our beloved, to keep him near, to contract the distance between us. We care about those we love, worry about them, and feel responsible for them. For some of us, love and desire are inseparable. But for many others, emotional intimacy inhibits erotic expression. The caring, protective elements that foster love often block the unselfconsciousness that fuels erotic pleasure.

My belief, reinforced by twenty years of practice, is that in the course of establishing security, many couples confuse love with merging. This mix-up is a bad omen for sex. To sustain an élan toward the other, there must be a synapse to cross. Eroticism requires separateness. In other words, eroticism thrives in the space between the self and the other.

Love may be universal, but its constructions in each culture are defined, both literally and figuratively, in different languages. I was particularly sensitive to the conversations about child and adolescent sexuality because it is in messages to children that societies most reveal their values, goals, incentives, and prohibitions.

For those who aspire to accelerate their heartbeat periodically, I give them the score: excitement is interwoven with uncertainty, and with our willingness to embrace the unknown rather than to shield ourselves from it. But this very tension leaves us feeling vulnerable. I caution my patients that there is no such thing as “safe sex.”

Romantics value intensity over stability. Realists value security over passion. But both are often disappointed, for few people can live happily at either extreme.

In his book Can Love Last? the infinitely thoughtful psychoanalyst Stephen Mitchell offers a framework for thinking about this conundrum. As he explains it, we all need security: permanence, reliability, stability, and continuity. These rooting, nesting instincts ground us in our human experience. But we also have a need for novelty and change, generative forces that give life fullness and vibrancy. Here risk and adventure loom large. We’re walking contradictions, seeking safety and predictability on one hand and thriving on diversity on the other.

And what is true for human beings is true for every living thing: all organisms require alternating periods of growth and equilibrium. Any person or system exposed to ceaseless novelty and change risks falling into chaos; but one that is too rigid or static ceases to grow and eventually dies. This never-ending dance between change and stability is like the anchor and the waves.

Not so long ago, the desire to feel passionate about one’s husband would have been considered a contradiction in terms. Historically, these two realms of life were organized separately—marriage on one side and passion most likely somewhere else, if anywhere at all. The concept of romantic love, which came about toward the end of the nineteenth century, brought them together for the first time. The central place of sex in marriage, and the heightened expectations surrounding it, took decades more to arrive.

It is not that our human insecurity is greater today than in earlier times. In fact, quite the contrary may be true. What is different is that modern life has deprived us of our traditional resources, and has created a situation in which we turn to one person for the protection and emotional connections that a multitude of social networks used to provide. Adult intimacy has become overburdened with expectations.

There’s a powerful tendency in long-term relationships to favor the predictable over the unpredictable. Yet eroticism thrives on the unpredictable. Desire butts heads with habit and repetition. It is unruly, and it defies our attempts at control.

The motivational expert Anthony Robbins put it succinctly when he explained that passion in a relationship is commensurate with the amount of uncertainty you can tolerate.

Introducing uncertainty sometimes requires nothing more than letting go of the illusion of certitude. In this shift of perception, we recognize the inherent mystery of our partner.

In the words of Proust, “The real voyage of discovery consists not in seeking new landscapes but in having new eyes.”

In truth, we never know our partner as well as we think we do. Mitchell reminds us that even in the dullest marriages, predictability is a mirage. Our need for constancy limits how much we are willing to know the person who’s next to us. We are invested in having him or her conform to an image that is often a creation of our own imagination, based on our own set of needs.

We see what we want to see, what we can tolerate seeing, and our partner does the same. Neutralizing each other’s complexity affords us a kind of manageable otherness. We narrow down our partner, ignoring or rejecting essential parts when they threaten the established order of our coupledom. We also reduce ourselves, jettisoning large chunks of our personalities in the name of love.

In his book Open to Desire, the Buddhist psychoanalyst Mark Epstein explains that our willingness to engage that mystery keeps desire alive. Faced with the irrefutable otherness of our partner, we can respond with fear or with curiosity. We can try to reduce the other to a knowable entity, or we can embrace her persistent mystery. When we resist the urge to control, when we keep ourselves open, we preserve the possibility of discovery.

Eroticism resides in the ambiguous space between anxiety and fascination.

If love is an act of imagination, then intimacy is an act of fruition. It waits for the high to subside so it can patiently insert itself into the relationship. The seeds of intimacy are time and repetition. We choose each other again and again, and so create a community of two.

Love rests on two pillars: surrender and autonomy. Our need for togetherness exists alongside our need for separateness. One does not exist without the other. With too much distance, there can be no connection. But too much merging eradicates the separateness of two distinct individuals.

When people become fused—when two become one—connection can no longer happen. There is no one to connect with. Thus separateness is a precondition for connection: this is the essential paradox of intimacy and sex.

The dual (and often conflicting) needs for connection and independence are a central theme in our developmental histories. Throughout childhood we struggle to find a delicate balance between our profound dependence on our primary caregivers and our need to carve out a sense of independence.

Sexual desire does not obey the laws that maintain peace and contentment between partners. Reason, understanding, compassion, and camaraderie are the handmaidens of a close, harmonious relationship. But sex often evokes unreasoning obsession rather than thoughtful judgment, and selfish desire rather than altruistic consideration. Aggression, objectification, and power all exist in the shadow of desire, components of passion that do not necessarily nurture intimacy.

Love enjoys knowing everything about you; desire needs mystery. Love likes to shrink the distance that exists between me and you, while desire is energized by it. If intimacy grows through repetition and familiarity, eroticism is numbed by repetition. It thrives on the mysterious, the novel, and the unexpected. Love is about having; desire is about wanting.

Intimacy has become the sovereign antidote for lives of increasing isolation. Our determination to “reach out and touch someone” has reached a peak of religious fervor.

In our era of communication, intimacy has been redefined. No longer is it the deep knowledge and familiarity that develop over time and can be cultivated in silence. Instead, we think of intimacy primarily as a discursive process, one that involves self-disclosure, the trustful sharing of our most personal and private material—our feelings.

The hegemony of the spoken word has veered into a female bias that has, for once, put men in a position of inferiority. Men are socialized to perform, to compete, and to be fearless. The capacity to express feelings is not a prized attribute in the making of American manhood. Dare I say it’s not even considered a desirable one?—at least, not yet.

I am not convinced that unrestrained disclosure—the ability to speak the truth and not hide anything—necessarily fosters a harmonious and robust intimacy. Any practice can be taken to a ridiculous extreme.

Some couples take this one step farther, confusing intimacy with control. What passes for care is actually covert surveillance—a fact-finding approach to the details of a partner’s life. What did you eat for lunch? Who called? What did you guys talk about? This kind of interrogation feigns closeness and confuses insignificant details with a deeper sense of knowledge. I am often amazed at how couples can be up on the minute details of each other’s lives, but haven’t had a meaningful conversation in years. In fact, such transparency can often spell the end of curiosity. It’s as if this stream of questions replaces a more thoughtful and authentically interested inquiry.

When the impulse to share becomes obligatory, when personal boundaries are no longer respected, when only the shared space of togetherness is acknowledged and private space is denied, fusion replaces intimacy and possession co-opts love. It is also the kiss of death for sex. Deprived of enigma, intimacy becomes cruel when it excludes any possibility of discovery. Where there is nothing left to hide, there is nothing left to seek.

If commitment requires a trade-off of freedom for security, then eroticism is the gateway back to freedom. In the broad expansiveness of our imagination we uncover the freedom that allows us to tolerate the confines of reality.

The more we need, the angrier we are when we don’t get. Kids know this; lovers do, too. No one can bring us to the boiling point as quickly as our partner (except maybe our parents, the original locus of dependent rage). Love is always accompanied by hate.

Most fans of kinky sex, at least those I’ve encountered, are drawn by the erotics of power and not, as it may appear to an outsider, by violence or pain.

The social critic Camille Paglia sees this rise in domination and submission as a collective fantasy that tweaks the rough spots of our egalitarian culture. It seems to me that rituals of domination and submission are a subversive way to put one over on a society that glorifies control, belittles dependency, and demands equality. In cultures where these values are at a premium—America, for example—we find more and more people seeking to give up control, revel in dependency, and recognize the very inequities no one wants to talk about.

More often than not, the beauty and flow of a sexual encounter unfurl in a safe, noncompetitive, and non-result-oriented atmosphere. Sensuality simply doesn’t lend itself to the rigors of scorekeeping.

There’s an evolutionary anthropologist named Helen Fisher who explains that lust is metabolically expensive. It’s hard to sustain after the evolutionary payoff: the kids. You become so focused on the incessant demands of daily life that you short-circuit any electric charge between you.

We find the same polarities in every system: stability and change, passion and reason, personal interest and collective well-being, action and reflection (to name but a few). These tensions exist in individuals, in couples, and in large organizations. They express dynamics that are part of the very nature of reality.

The tension between security and adventure is a paradox to manage, not a problem to solve.

It’s also worth noting that in Europe, teenagers engage in sexual activity an average of two years later than their American counterparts, and teenagers give birth at a staggering one-eighth the American rate. How is it that American society, with such a clear bias against teen sex, produces such a statistical embarrassment?

No history has a more lasting effect on our adult loves than the one we write with our primary caregivers.

Our sexual preferences arise from the thrills, challenges, and conflicts of our early life. How these bear on our threshold for closeness and pleasure is the object of our excavation. What turns you on and what turns you off? What draws you in? What leaves you cold? Why? How much closeness can you stand to feel? Can you tolerate pleasure with the one you love?

Our physical and emotional dependence on our parents surpasses that of any other living species, in both magnitude and duration. It is so complete—and our need to feel safe is so profound—that we will do anything not to lose them.

It takes two people to create a pattern, but only one to change it.

Over the years I’ve met more than a few people like James and Stella, couples whose otherwise colorful relationship teeters on the brink of sensual austerity.

Safety and stability take on a whole new meaning when children enter the picture. Read any parenting book about infants and toddlers and what you’ll find over and over is an emphasis on routine, predictability, and regularity. For children to feel confident enough to go out into the world and explore on their own, they need a secure base. Parenthood demands that we become steady, dependable, and responsible.

What I see over and over is that the person who takes on the role of primary caretaker almost always undergoes changes similar to Stephanie’s: a total immersion in the lives and rhythms of the children, a loss of self, and a greater difficulty extricating himself or herself from chores (a compulsion that is simultaneously frustrating and grounding).

Our fantasies allow us to negate and undo the limits imposed on us by our conscience, by our culture, and by our self-image. If we feel insecure and unattractive, in our fantasies we are irresistible. If we anticipate a withholding woman, in fantasy she’s insatiable. If we fear our own aggression, in our internal reveries we can feel powerful without worrying that we might hurt another.

What turns us on often collides with our preferred self-image, or with our moral and ideological convictions. Ergo the feminist who longs to be dominated; the survivor of sexual abuse who infuses her personal erotics with her traumatic experiences; the husband who fantasizes about the au pair (the stripper, the masseuse, the porn star) in order to boost his enjoyment with his wife; the mother who finds the skin-to-skin contact with her baby sensuous and, yes, erotic; the wife who masturbates to images of hot sex with the psychopathic boyfriend she knew she was never going to marry; the lover who needs to think about the hunk he spotted at the gym in order to get off with his boyfriend.

The point about sexual fantasy is that it involves pretending. It’s a simulation, a performance—not the real thing, and not necessarily a desire for the real thing. Like dreams and works of art, fantasies are far more than what they appear to be on the surface. They’re complex psychic creations whose symbolic content mustn’t be translated into literal intent.

Heterosexual pornography, predominantly produced by and for men, concerns itself almost exclusively with what the sociologist Anthony Giddens calls “low emotion, high intensity sex.” In part, it meets the need of many men to compartmentalize their sexual and emotional lives, and to separate their secure relationships from their rash urges.

Our erotic imagination is an exuberant expression of our aliveness, and one of the most powerful tools we have for keeping desire alive. Giving voice to our fantasies can liberate us from the many personal and social obstacles that stand in the way of pleasure. Understanding what our fantasies do for us will help us understand what it is we’re seeking, sexually and emotionally. In our erotic daydreams, we find the energy that keeps us passionately awake to our own sexuality.

The bonds of wedlock are so heavy that it takes two to carry them, sometimes three. —Alexandre Dumas

Despite a 50 percent divorce rate for first marriages and 65 percent the second time around; despite the staggering frequency of affairs; despite the fact that monogamy is a ship sinking faster than anyone can bail it out, we continue to cling to the wreckage with absolute faith in its structural soundness.

Historically, monogamy was an externally imposed system of control over women’s reproduction. “Which child is mine? Who gets the cows when I die?” Fidelity, as a mainstay of patriarchal society, was about lineage and property; it had nothing to do with love.

The exclusiveness we seek in monogamy has roots in our earliest experience of intimacy with our primary caretakers. The feminist psychoanalyst Nancy Chodorow writes, “This primary tendency, I shall be loved always, everywhere, in every way, my whole body, my whole being—without any criticism, without the slightest effort on my part—is the final aim of all erotic striving.” In our adult love we seek to recapture the primordial oneness we felt with Mom. The baby knows no separateness.

In a culture where everything is disposable and downsizing confirms just how replaceable we really are, our need to feel secure in our primary relationship is all the greater. The smaller we feel in the world, the more we need to shine in the eyes of our partner. We want to know that we matter, and that, for at least one person, we are irreplaceable. We long to feel whole, to rise above the prison of our solitude.

I question the widespread view that infidelity is always a symptom of deeper problems in a relationship. Affairs are motivated by myriad forces; not all of them are directly related to flaws in the marriage. As it happens, plenty of adulterers are reasonably content in their relationships.

Most American couples therapists believe that affairs must be disclosed if intimacy is to be rebuilt. This idea goes hand in hand with our model of intimate love, which celebrates transparency—having no secrets, telling no lies, sharing everything. In fact, some people condemn the deception even more than the transgression: “It’s not that you cheated, it’s that you lied to me!” To the American way of thinking, respect is bound up with honesty, and honesty is essential to personal responsibility.

In other cultures, respect is more likely to be expressed with gentle untruths that aim at preserving the partner’s honor. A protective opacity is preferable to telling truths that might result in humiliation.

At the boundary of every couple lives the third. He’s the high school sweetheart whose hands you still remember, the pretty cashier, the handsome fourth-grade teacher you flirt with when you pick your son up at school. The smiling stranger on the subway is the third. So, too, are the stripper, the porn star, and the sex worker, whether touched or untouched. He is the one a woman fantasizes about when she makes love to her husband. Increasingly, she can be found on the Internet. Real or imagined, embodied or not, the third is the fulcrum on which a couple balances. The third is the manifestation of our desire for what lies outside the fence. It is the forbidden.

Laura Kipnis says, “What is more anxiogenic than a partner’s freedom, which might mean the freedom not to love you, or to stop loving you, or to love someone else, or to become a different person than the one who once pledged to love you always and now…perhaps doesn’t?”

For these couples, fidelity is defined not by sexual exclusivity but by the strength of their commitment. The boundaries aren’t physical but emotional. The primacy of the couple remains paramount. The couples stress emotional monogamy as a sine qua non, and from there they make all sorts of sexual allowances.

I’d like to suggest that we view monogamy not as a given but as a choice. As such, it becomes a negotiated decision. More to the point, if we’re planning to spend fifty years with one soul—and we want a happy jubilee—it may be wiser to review our contract at various junctures. Just how accommodating each couple may be to the third varies. But at least a nod is more apt to sustain desire with our one and only over the long haul—and perhaps even to create a new “art of loving” for the twenty-first century couple.

Even the biochemistry of passion is known to be short-lived. The evolutionary anthropologist Helen Fisher says that the hormonal cocktail of romance (dopamine, norepinephrine, and PEA) is known to last no more than a few years at best.

Oscar Wilde wrote, “In this world there are only two tragedies. One is getting what one wants, and the other is not getting it.”

I believe that longing, waiting, and yearning are fundamental elements of desire that can be generated with forethought, even in long-term relationships.

Like all couples, they go through periods when desire is dormant—when they are estranged from each other, or simply immersed in their own projects and in their own lives—but they don’t panic, terrified that something is fundamentally wrong with them. They know that erotic intensity waxes and wanes, that desire suffers periodic eclipses and intermittent disappearances. But given sufficient attention, they can bring the frisson back.

For them, love is a vessel that contains both security and adventure, and commitment offers one of the great luxuries of life: time. Marriage is not the end of their romance, it’s the beginning. They know that they have years in which to deepen their connection, to experiment, to regress, and even to fail. They see their relationship as something alive and ongoing, not a fait accompli. It’s a story that they are writing together, one with many chapters, and neither partner knows how it will end. There’s always a place they haven’t gone yet, always something about the other still to be discovered.

Categories
Personal Growth Psychology

Adam Grant – Think Again

With all due respect to the lessons of experience, I prefer the rigor of evidence. When a trio of psychologists conducted a comprehensive review of thirty-three studies, they found that in every one, the majority of answer revisions were from wrong to right. This phenomenon is known as the first-instinct fallacy.

Part of the problem is cognitive laziness. Some psychologists point out that we’re mental misers: we often prefer the ease of hanging on to old views over the difficulty of grappling with new ones. Yet there are also deeper forces behind our resistance to rethinking. Questioning ourselves makes the world more unpredictable. It requires us to admit that the facts may have changed, that what was once right may now be wrong. Reconsidering something we believe deeply can threaten our identities, making it feel as if we’re losing a part of ourselves.

When it comes to our knowledge and opinions, though, we tend to stick to our guns. Psychologists call this seizing and freezing. We favor the comfort of conviction over the discomfort of doubt, and we let our beliefs get brittle long before our bones. We laugh at people who still use Windows 95, yet we still cling to opinions that we formed in 1995. We listen to views that make us feel good, instead of ideas that make us think hard.

Mike Lazaridis dreamed up the idea for the BlackBerry as a wireless communication device for sending and receiving emails. As of the summer of 2009, it accounted for nearly half of the U.S. smartphone market. By 2014, its market share had plummeted to less than 1 percent.

When a company takes a nosedive like that, we can never pinpoint a single cause of its downfall, so we tend to anthropomorphize it: BlackBerry failed to adapt. Yet adapting to a changing environment isn’t something a company does—it’s something people do in the multitude of decisions they make every day.

Most of us take pride in our knowledge and expertise, and in staying true to our beliefs and opinions. That makes sense in a stable world, where we get rewarded for having conviction in our ideas. The problem is that we live in a rapidly changing world, where we need to spend as much time rethinking as we do thinking.

With advances in access to information and technology, knowledge isn’t just increasing. It’s increasing at an increasing rate. In 2011, you consumed about five times as much information per day as you would have just a quarter century earlier. As of 1950, it took about fifty years for knowledge in medicine to double. By 1980, medical knowledge was doubling every seven years, and by 2010, it was doubling in half that time.

Researchers have recently discovered that we need to rethink widely accepted assumptions about such subjects as Cleopatra’s roots (her father was Greek, not Egyptian, and her mother’s identity is unknown); the appearance of dinosaurs (paleontologists now think some tyrannosaurs had colorful feathers on their backs); and what’s required for sight (blind people have actually trained themselves to “see”—sound waves can activate the visual cortex and create representations in the mind’s eye, much like how echolocation helps bats navigate in the dark). Vintage records, classic cars, and antique clocks might be valuable collectibles, but outdated facts are mental fossils that are best abandoned.

Two decades ago my colleague Phil Tetlock discovered something peculiar. As we think and talk, we often slip into the mindsets of three different professions: preachers, prosecutors, and politicians. In each of these modes, we take on a particular identity and use a distinct set of tools. We go into preacher mode when our sacred beliefs are in jeopardy: we deliver sermons to protect and promote our ideals. We enter prosecutor mode when we recognize flaws in other people’s reasoning: we marshal arguments to prove them wrong and win our case. We shift into politician mode when we’re seeking to win over an audience: we campaign and lobby for the approval of our constituents. The risk is that we become so wrapped up in preaching that we’re right, prosecuting others who are wrong, and politicking for support that we don’t bother to rethink our own views.

The entrepreneurs arrived in Milan for a training program in entrepreneurship. Over the course of four months, they learned to create a business strategy, interview customers, build a minimum viable product, and then refine a prototype. What they didn’t know was that they’d been randomly assigned to either a “scientific thinking” group or a control group. The training for both groups was identical, except that one was encouraged to view startups through a scientist’s goggles.

From that perspective, their strategy is a theory, customer interviews help to develop hypotheses, and their minimum viable product and prototype are experiments to test those hypotheses. Their task is to rigorously measure the results and make decisions based on whether their hypotheses are supported or refuted.

Over the following year, the startups in the control group averaged under $300 in revenue. The startups in the scientific thinking group averaged over $12,000 in revenue. They brought in revenue more than twice as fast—and attracted customers sooner, too.

Mental horsepower doesn’t guarantee mental dexterity. No matter how much brainpower you have, if you lack the motivation to change your mind, you’ll miss many occasions to think again. Research reveals that the higher you score on an IQ test, the more likely you are to fall for stereotypes, because you’re faster at recognizing patterns. And recent experiments suggest that the smarter you are, the more you might struggle to update your beliefs.

My favorite bias is the “I’m not biased” bias, in which people believe they’re more objective than others. It turns out that smart people are more likely to fall into this trap. The brighter you are, the harder it can be to see your own limitations.

In the case of the BlackBerry, Mike Lazaridis was trapped in an overconfidence cycle. Taking pride in his successful invention gave him too much conviction. Nowhere was that clearer than in his preference for the keyboard over a touchscreen. It was a BlackBerry virtue he loved to preach—and an Apple vice he was quick to prosecute.

The legend of Apple’s renaissance revolves around the lone genius of Steve Jobs. It was his conviction and clarity of vision, the story goes, that gave birth to the iPhone. The reality is that he was dead-set against the mobile phone category. His employees had the vision for it, and it was their ability to change his mind that really revived Apple. Although Jobs knew how to “think different,” it was his team that did much of the rethinking.

Research shows that when people are resistant to change, it helps to reinforce what will stay the same. Visions for change are more compelling when they include visions of continuity. Although our strategy might evolve, our identity will endure.

You’ve probably met some football fans who are convinced they know more than the coaches on the sidelines. That’s the armchair quarterback syndrome, where confidence exceeds competence.

The opposite of armchair quarterback syndrome is impostor syndrome, where competence exceeds confidence.

We’re all novices at many things, but we’re not always blind to that fact. We tend to overestimate ourselves on desirable skills, like the ability to carry on a riveting conversation. We’re also prone to overconfidence in situations where it’s easy to confuse experience for expertise, like driving, typing, trivia, and managing emotions. Yet we underestimate ourselves when we can easily recognize that we lack experience—like painting, driving a race car, and rapidly reciting the alphabet backward. Absolute beginners rarely fall into the Dunning-Kruger trap.

It’s when we progress from novice to amateur that we become overconfident. A bit of knowledge can be a dangerous thing.

“Arrogance is ignorance plus conviction,” blogger Tim Urban explains. “While humility is a permeable filter that absorbs life experience and converts it into knowledge and wisdom, arrogance is a rubber shield that life experience simply bounces off of.”

Humility is often misunderstood. It’s not a matter of having low self-confidence. One of the Latin roots of humility means “from the earth.” It’s about being grounded—recognizing that we’re flawed and fallible.

You can be confident in your ability to achieve a goal in the future while maintaining the humility to question whether you have the right tools in the present. That’s the sweet spot of confidence.

Uncertainty primes us to ask questions and absorb new ideas. It protects us against the Dunning-Kruger effect. “Impostor syndrome always keeps me on my toes and growing because I never think I know it all,” Halla reflects, sounding more like a scientist than a politician.

Arrogance leaves us blind to our weaknesses. Humility is a reflective lens: it helps us see them clearly. Confident humility is a corrective lens: it enables us to overcome those weaknesses.

I found a Nobel Prize–winning scientist and two of the world’s top election forecasters. They aren’t just comfortable being wrong; they actually seem to be thrilled by it. I think they can teach us something about how to be more graceful and accepting in moments when we discover that our beliefs might not be true. The goal is not to be wrong more often. It’s to recognize that we’re all wrong more often than we’d like to admit, and the more we deny it, the deeper the hole we dig for ourselves.

In a classic paper, sociologist Murray Davis argued that when ideas survive, it’s not because they’re true—it’s because they’re interesting. What makes an idea interesting is that it challenges our weakly held opinions.

When a core belief is questioned, though, we tend to shut down rather than open up. It’s as if there’s a miniature dictator living inside our heads, controlling the flow of facts to our minds, much like Kim Jong-un controls the press in North Korea. The technical term for this in psychology is the totalitarian ego, and its job is to keep out threatening information.

It’s easy to see how an inner dictator comes in handy when someone attacks our character or intelligence. Those kinds of personal affronts threaten to shatter aspects of our identities that are important to us and might be difficult to change. The totalitarian ego steps in like a bodyguard for our minds, protecting our self-image by feeding us comforting lies.

As physicist Richard Feynman quipped, “You must not fool yourself—and you are the easiest person to fool.”

Neuroscientists find that when our core beliefs are challenged, it can trigger the amygdala, the primitive “lizard brain” that breezes right past cool rationality and activates a hot fight-or-flight response. The anger and fear are visceral: it feels as if we’ve been punched in the mind. The totalitarian ego comes to the rescue with mental armor.

Discovering I was wrong felt joyful because it meant I’d learned something. As Daniel Kahneman told me, “Being wrong is the only way I feel sure I’ve learned anything.”

The students who found it stressful didn’t know how to detach. Their opinions were their identities. An assault on their worldviews was a threat to their very sense of self.

A few years ago I surveyed hundreds of new teams in Silicon Valley on conflict several times during their first six months working together. Even if they argued constantly and agreed on nothing else, they agreed on what kind of conflict they were having. When their projects were finished, I asked their managers to evaluate each team’s effectiveness. The teams that performed poorly started with more relationship conflict than task conflict. They entered into personal feuds early on and were so busy disliking one another that they didn’t feel comfortable challenging one another. It took months for many of the teams to make real headway on their relationship issues, and by the time they did manage to debate key decisions, it was often too late to rethink their directions.

“The absence of conflict is not harmony, it’s apathy.”

Although productive disagreement is a critical life skill, it’s one that many of us never fully develop. The problem starts early: parents disagree behind closed doors, fearing that conflict will make children anxious or somehow damage their character. Yet research shows that how often parents argue has no bearing on their children’s academic, social, or emotional development. What matters is how respectfully parents argue, not how frequently. Kids whose parents clash constructively feel more emotionally safe in elementary school, and over the next few years they actually demonstrate more helpfulness and compassion toward their classmates.

In a classic study, highly creative architects were more likely than their technically competent but less original peers to come from homes with plenty of friction. They often grew up in households that were “tense but secure,” as psychologist Robert Albert notes: “The creative person-to-be comes from a family that is anything but harmonious, one with a ‘wobble.’”

Disagreeable people tend to be more critical, skeptical, and challenging—and they’re more likely than their peers to become engineers and lawyers. They’re not just comfortable with conflict; it energizes them. If you’re highly disagreeable, you might be happier in an argument than in a friendly conversation.

In one experiment, when people were criticized rather than praised by a partner, they were over four times more likely to request a new partner. Across a range of workplaces, when employees received tough feedback from colleagues, their default response was to avoid those coworkers or drop them from their networks altogether—and their performance suffered over the following year.

Agreeableness is about seeking social harmony, not cognitive consensus. It’s possible to disagree without being disagreeable. Although I’m terrified of hurting other people’s feelings, when it comes to challenging their thoughts, I have no fear. In fact, when I argue with someone, it’s not a display of disrespect—it’s a sign of respect.

My favorite demonstration is an experiment by my colleagues Jennifer Chatman and Sigal Barsade. Agreeable people were significantly more accommodating than disagreeable ones—as long as they were in a cooperative team. When they were assigned to a competitive team, they acted just as disagreeably as their disagreeable teammates.

When they argued about the propeller, the Wright brothers were making a common mistake. Each was preaching about why he was right and why the other was wrong. When we argue about why, we run the risk of becoming emotionally attached to our positions and dismissive of the other side’s. We’re more likely to have a good fight if we argue about how.

One difference was visible before anyone even arrived at the bargaining table. Prior to the negotiations, the researchers interviewed both groups about their plans. The average negotiators went in armed for battle, hardly taking note of any anticipated areas of agreement. The experts, in contrast, mapped out a series of dance steps they might be able to take with the other side, devoting more than a third of their planning comments to finding common ground.

The more reasons we put on the table, the easier it is for people to discard the shakiest one. Once they reject one of our justifications, they can easily dismiss our entire case.

Harish started by emphasizing common ground. When he took the stage for his rebuttal, he immediately drew attention to his and Debra’s areas of agreement. “So,” he began, “I think we disagree on far less than it may seem.” He called out their alignment on the problem of poverty—and on the validity of some of the studies—before objecting to subsidies as a solution.

Most people immediately start with a straw man, poking holes in the weakest version of the other side’s case. He does the reverse: he considers the strongest version of their case, which is known as the steel man.

“If you have too many arguments, you’ll dilute the power of each and every one,” he told me. “They are going to be less well explained, and I don’t know if any of them will land enough—I don’t think the audience will believe them to be important enough. Most top debaters aren’t citing a lot of information.”

If they’re not invested in the issue or they’re receptive to our perspective, more reasons can help: people tend to see quantity as a sign of quality. The more the topic matters to them, the more the quality of reasons matters. It’s when audiences are skeptical of our view, have a stake in the issue, and tend to be stubborn that piling on justifications is most likely to backfire. If they’re resistant to rethinking, more reasons simply give them more ammunition to shoot our views down.

Psychologists have long found that the person most likely to persuade you to change your mind is you. You get to pick the reasons you find most compelling, and you come away with a real sense of ownership over them.

In a heated argument, you can always stop and ask, “What evidence would change your mind?” If the answer is “nothing,” then there’s no point in continuing the debate. You can lead a horse to water, but you can’t make it think.

A few years ago, I argued in my book Originals that if we want to fight groupthink, it helps to have “strong opinions, weakly held.” Since then I’ve changed my mind—I now believe that’s a mistake. If we hold an opinion weakly, expressing it strongly can backfire. Communicating it with some uncertainty signals confident humility, invites curiosity, and leads to a more nuanced discussion.

Research shows that in courtrooms, expert witnesses and deliberating jurors are more credible and more persuasive when they express moderate confidence, rather than high or low confidence.

In every human society, people are motivated to seek belonging and status. Identifying with a group checks both boxes at the same time: we become part of a tribe, and we take pride when our tribe wins. In classic studies on college campuses, psychologists found that after their team won a football game, students were more likely to walk around wearing school swag.

Socially, there’s another reason stereotypes are so sticky. We tend to interact with people who share them, which makes them even more extreme. This phenomenon is called group polarization, and it’s been demonstrated in hundreds of experiments.

Citizens who start out with a clear belief on affirmative action and gay marriage develop more extreme views on these issues after talking with a few others who share their stance. Their preaching and prosecuting move in the direction of their politics. Polarization is reinforced by conformity: peripheral members fit in and gain status by following the lead of the most prototypical member of the group, who often holds the most intense views.

Upon returning from space, astronauts are less focused on individual achievements and personal happiness, and more concerned about the collective good. “You develop an instant global consciousness . . . an intense dissatisfaction with the state of the world, and a compulsion to do something about it,” Apollo 14 astronaut Edgar Mitchell reflected. “From out there on the moon, international politics looks so petty. You want to grab a politician by the scruff of the neck and drag him a quarter of a million miles out and say, ‘Look at that, you son of a b*tch.’”

In one experiment, psychologists randomly assigned Manchester United soccer fans a short writing task. They then staged an emergency in which a passing runner slipped and fell, screaming in pain as he held his ankle. He was wearing the T-shirt of their biggest rival, and the question was whether they would stop to help him. If the soccer fans had just written about why they loved their team, only 30 percent helped. If they had written about what they had in common with other soccer fans, 70 percent helped.

In an ideal world, learning about individual group members will humanize the group, but often getting to know a person better just establishes her as different from the rest of her group. When we meet group members who defy a stereotype, our first instinct isn’t to see them as exemplars and rethink the stereotype. It’s to see them as exceptions and cling to our existing beliefs.

We found that it was thinking about the arbitrariness of their animosity—not the positive qualities of their rival—that mattered. Regardless of whether they generated reasons to like their rivals, fans showed less hostility when they reflected on how silly the rivalry was.

Research suggests that there are more similarities between groups than we recognize. And there’s typically more variety within groups than between them.

Since people held unfounded fears about vaccines, it was time to educate them with a dose of the truth. The results were often disappointing. In a pair of experiments in Germany, introducing people to the research on vaccine safety backfired: they ended up seeing vaccines as riskier. Similarly, when Americans read accounts of the dangers of measles, saw pictures of children suffering from it, or learned of an infant who nearly died from it, their interest in vaccination didn’t rise at all. And when they were informed that there was no evidence that the measles vaccine causes autism, those who already had concerns actually became less interested in vaccinating.

Together, they developed the core principles of a practice called motivational interviewing. The central premise is that we can rarely motivate someone else to change. We’re better off helping them find their own motivation to change.

Motivational interviewing starts with an attitude of humility and curiosity. We don’t know what might motivate someone else to change, but we’re genuinely eager to find out.

Before Marie-Hélène left the hospital, she had Tobie vaccinated. A key turning point, she recalls, was when Arnaud “told me that whether I chose to vaccinate or not, he respected my decision as someone who wanted the best for my kids. Just that sentence—to me, it was worth all the gold in the world.”

Overall, motivational interviewing has a statistically and clinically meaningful effect on behavior change in roughly three out of four studies, and psychologists and physicians using it have a success rate of four in five. There aren’t many practical theories in the behavioral sciences with a body of evidence this robust.

Motivational interviewing isn’t limited to professional settings—it’s relevant to everyday decisions and interactions. One day a friend called me for advice on whether she should get back together with her ex. I was a fan of the idea, but I didn’t think it was my place to tell her what to do. Instead of offering my opinion, I asked her to walk through the pros and cons and tell me how they stacked up against what she wanted in a partner. She ended up talking herself into rekindling the relationship. The conversation felt like magic, because I hadn’t tried to persuade her or even given any advice.

To protect their freedom, instead of giving commands or offering recommendations, a motivational interviewer might say something along the lines of “Here are a few things that have helped me—do you think any of them might work for you?”

Motivational interviewing pioneers Miller and Rollnick have long warned that the technique shouldn’t be used manipulatively. Psychologists have found that when people detect an attempt at influence, they have sophisticated defense mechanisms. The moment people feel that we’re trying to persuade them, our behavior takes on a different meaning. A straightforward question is seen as a political tactic, a reflective listening statement comes across as a prosecutor’s maneuvering, an affirmation of their ability to change sounds like a preacher’s proselytizing.

We can all get better at asking “truly curious questions that don’t have the hidden agenda of fixing, saving, advising, convincing or correcting,” journalist Kate Murphy writes, and helping to “facilitate the clear expression of another person’s thoughts.”

As Betty muses, “Even the devil appreciates being listened to.”

Inverse charisma. What a wonderful turn of phrase to capture the magnetic quality of a great listener.

Hearing an opposing opinion doesn’t necessarily motivate you to rethink your own stance; it makes it easier for you to stick to your guns (or your gun bans). Presenting two extremes isn’t the solution; it’s part of the polarization problem. Psychologists have a name for this: binary bias. It’s a basic human tendency to seek clarity and closure by simplifying a complex continuum into two categories. To paraphrase the humorist Robert Benchley, there are two kinds of people: those who divide the world into two kinds of people, and those who don’t.

An antidote to this proclivity is complexifying: showcasing the range of perspectives on a given topic. We might believe we’re making progress by discussing hot-button issues as two sides of a coin, but people are actually more inclined to think again if we present these topics through the many lenses of a prism. To borrow a phrase from Walt Whitman, it takes a multitude of views to help people realize that they too contain multitudes.

If people read the binary version of the article, they defended their own perspective more often than they showed an interest in their opponent’s. If they read the complexified version, they made about twice as many comments about common ground as about their own views. They asserted fewer opinions and asked more questions. At the end of the conversation, they generated more sophisticated, higher-quality position statements—and both parties came away more satisfied.

Yet by 2018, only 59 percent of Americans saw climate change as a major threat—and 16 percent believed it wasn’t a threat at all.

To overcome binary bias, a good starting point is to become aware of the range of perspectives across a given spectrum. Polls suggest that on climate change, there are at least six camps of thought. Believers represent more than half of Americans, but some are concerned while others are alarmed. The so-called nonbelievers actually range from cautious to disengaged to doubtful to dismissive.

Although no more than 10 percent of Americans are dismissive of climate change, it’s these rare deniers who get the most press. In an analysis of some hundred thousand media articles on climate change between 2000 and 2016, prominent climate contrarians received disproportionate coverage: they were featured 49 percent more often than expert scientists. As a result, people end up overestimating how common denial is—which in turn makes them more hesitant to advocate for policies that protect the environment.

And multiple experiments have shown that when experts express doubt, they become more persuasive. When someone knowledgeable admits uncertainty, it surprises people, and they end up paying more attention to the substance of the argument.

In a series of experiments, psychologists demonstrated that when news reports about science included caveats, they succeeded in capturing readers’ interest and keeping their minds open.

If Peterson had bothered to read the comprehensive meta-analyses of studies spanning nearly two hundred jobs, he’d have discovered that—contrary to his claims—emotional intelligence is real and it does matter. Emotional intelligence tests predict performance even after controlling for IQ and personality. If Goleman hadn’t ignored those same data, he’d have learned that if you want to predict performance across jobs, IQ is more than twice as important as emotional intelligence (which accounts for only 3 to 8 percent of performance).

In a pair of experiments, randomly assigning people to reflect on the intentions and interests of their political opposites made them less receptive to rethinking their own attitudes on health care and universal basic income. Across twenty-five experiments, imagining other people’s perspectives failed to elicit more accurate insights—and occasionally made participants more confident in their own inaccurate judgments. Perspective-taking consistently fails because we’re terrible mind readers. We’re just guessing.

What works is not perspective-taking but perspective-seeking: actually talking to people to gain insight into the nuances of their views. That’s what good scientists do: instead of drawing conclusions about people based on minimal clues, they test their hypotheses by striking up conversations.

My favorite assignment of Erin’s is her final one. As a passionate champion of inquiry-based learning, she sends her eighth graders off to do self-directed research in which they inspect, investigate, interrogate, and interpret. Their active learning culminates in a group project: they pick a chapter from their textbook, choosing a time period that interests them and a theme in history that they see as underrepresented. Then they go off to rewrite it.

Evidence shows that if false scientific beliefs aren’t addressed in elementary school, they become harder to change later. “Learning counter-intuitive scientific ideas [is] akin to becoming a fluent speaker of a second language,” psychologist Deborah Kelemen writes. It’s “a task that becomes increasingly difficult the longer it is delayed, and one that is almost never achieved with only piecemeal instruction and infrequent practice.”

In a curriculum developed at Stanford, high school students are encouraged to critically examine what really caused the Spanish-American War, whether the New Deal was a success, and why the Montgomery bus boycott was a watershed moment. Some teachers even send students out to interview people with whom they disagree. The focus is less on being right, and more on building the skills to consider different views and argue productively about them.

Rethinking needs to become a regular habit. Unfortunately, traditional methods of education don’t always allow students to form that habit.

In this experiment the topic doesn’t matter: the teaching method is what shapes your experience. I expected active learning to win the day, but the data suggest that you and your roommate will both enjoy the subject more when it’s delivered by lecture. You’ll also rate the instructor who lectures as more effective—and you’ll be more likely to say you wish all your physics courses were taught that way.

In the physics experiment, the students took tests to gauge how much they had learned about statics and fluids. Despite enjoying the lectures more, they actually gained more knowledge and skill from the active-learning session. It required more mental effort, which made it less fun but led to deeper understanding.

A meta-analysis compared the effects of lecturing and active learning on students’ mastery of the material, cumulating 225 studies with over 46,000 undergraduates in science, technology, engineering, and math (STEM). Active-learning methods included group problem solving, worksheets, and tutorials. On average, students scored half a letter grade worse under traditional lecturing than through active learning—and students were 1.55 times more likely to fail in classes with traditional lecturing.

In North American universities, more than half of STEM professors spend at least 80 percent of their time lecturing, just over a quarter incorporate bits of interactivity, and fewer than a fifth use truly student-centered methods that involve active learning.

It turns out that although perfectionists are more likely than their peers to ace school, they don’t perform any better than their colleagues at work. This tracks with evidence that, across a wide range of industries, grades are not a strong predictor of job performance.

Achieving excellence in school often requires mastering old ways of thinking. Building an influential career demands new ways of thinking. In a classic study of highly accomplished architects, the most creative ones graduated with a B average. Their straight-A counterparts were so determined to be right that they often failed to take the risk of rethinking the orthodoxy. A similar pattern emerged in a study of students who graduated at the top of their class. “Valedictorians aren’t likely to be the future’s visionaries,” education researcher Karen Arnold explains. “They typically settle into the system instead of shaking it up.”

The following year, the class’s favorite idea took that rethinking a step further: the students hosted a day of “passion talks” on which anyone could teach the class about something he or she loved. We learned how to beatbox and design buildings that mesh with nature and make the world more allergy safe. From that point on, sharing passions has been part of class participation. All the students give a passion talk as a way of introducing themselves to their peers. Year after year, they tell me that it injects a heightened level of curiosity into the room, leaving them eager to soak up insights from each of their classmates.

When I was involved in a study at Google to identify the factors that distinguish teams with high performance and well-being, the most important differentiator wasn’t who was on the team or even how meaningful their work was. What mattered most was psychological safety.

I knew that changing the culture of an entire organization is daunting, while changing the culture of a team is more feasible. It starts with modeling the values we want to promote, identifying and praising others who exemplify them, and building a coalition of colleagues who are committed to making the change.

Some of the power distance evaporated—they were more likely to reach out to Melinda and other senior leaders with both criticism and compliments. One employee commented: In that video Melinda did something that I’ve not yet seen happen at the foundation: she broke through the veneer. It happened for me when she said, “I go into so many meetings where there are things I don’t know.” I had to write that down because I was shocked and grateful at her honesty. Later, when she laughed, like really belly-laughed, and then answered the hard comments, the veneer came off again and I saw that she was no less of Melinda Gates, but actually, a whole lot more of Melinda Gates.

Organizational learning should be an ongoing activity, but best practices imply it has reached an endpoint. We might be better off looking for better practices.

Focusing on results might be good for short-term performance, but it can be an obstacle to long-term learning. Sure enough, social scientists find that when people are held accountable only for whether the outcome was a success or failure, they are more likely to continue with ill-fated courses of action. Exclusively praising and rewarding results is dangerous because it breeds overconfidence in poor strategies, incentivizing people to keep doing things the way they’ve always done them.

Process accountability might sound like the opposite of psychological safety, but they’re actually independent. Amy Edmondson finds that when psychological safety exists without accountability, people tend to stay within their comfort zone, and when there’s accountability but not safety, people tend to stay silent in an anxiety zone. When we combine the two, we create a learning zone.

When we dedicate ourselves to a plan and it isn’t going as we hoped, our first instinct isn’t usually to rethink it. Instead, we tend to double down and sink more resources in the plan. This pattern is called escalation of commitment. Evidence shows that entrepreneurs persist with failing strategies when they should pivot, NBA general managers and coaches keep investing in new contracts and more playing time for draft busts, and politicians continue sending soldiers to wars that didn’t need to be fought in the first place. Sunk costs are a factor, but the most important causes appear to be psychological rather than economic. Escalation of commitment happens because we’re rationalizing creatures, constantly searching for self-justifications for our prior beliefs as a way to soothe our egos, shield our images, and validate our past decisions.

In some ways, identity foreclosure is the opposite of an identity crisis: instead of accepting uncertainty about who we want to become, we develop compensatory conviction and plunge head over heels into a career path. I’ve noticed that the students who are the most certain about their career plans at twenty are often the ones with the deepest regrets by thirty. They haven’t done enough rethinking along the way.

A first step is to entertain possible selves: identify some people you admire within or outside your field, and observe what they actually do at work day by day. A second step is to develop hypotheses about how these paths might align with your own interests, skills, and values. A third step is to test out the different identities by running experiments: do informational interviews, job shadowing, and sample projects to get a taste of the work. The goal is not to confirm a particular plan but to expand your repertoire of possible selves—which keeps you open to rethinking.

A second likely culprit is that we spend too much time striving for peak happiness, overlooking the fact that happiness depends more on the frequency of positive emotions than their intensity.

A third potential factor is that when we hunt for happiness, we overemphasize pleasure at the expense of purpose. This theory is consistent with data suggesting that meaning is healthier than happiness.

Psychologists find that passions are often developed, not discovered. In a study of entrepreneurs, the more effort they put into their startups, the more their enthusiasm about their businesses climbed each week. Their passion grew as they gained momentum and mastery. Interest doesn’t always lead to effort and skill; sometimes it follows them.

“Those only are happy,” philosopher John Stuart Mill wrote, “who have their minds fixed on some object other than their own happiness; on the happiness of others, on the improvement of mankind, even on some art or pursuit, followed not as a means, but as itself an ideal end. Aiming thus at something else, they find happiness by the way.”

It takes humility to reconsider our past commitments, doubt to question our present decisions, and curiosity to reimagine our future plans. What we discover along the way can free us from the shackles of our familiar surroundings and our former selves. Rethinking liberates us to do more than update our knowledge and opinions—it’s a tool for leading a more fulfilling life.

Define your identity in terms of values, not opinions. It’s easier to avoid getting stuck to your past beliefs if you don’t become attached to them as part of your present self-concept. See yourself as someone who values curiosity, learning, mental flexibility, and searching for knowledge. As you form opinions, keep a list of factors that would change your mind.

Don’t shy away from constructive conflict. Disagreements don’t have to be disagreeable. Although relationship conflict is usually counterproductive, task conflict can help you think again. Try framing disagreement as a debate: people are more likely to approach it intellectually and less likely to take it personally.

Ask “What evidence would change your mind?” You can’t bully someone into agreeing with you. It’s often more effective to inquire about what would open their minds, and then see if you can convince them on their own terms.

Ask how people originally formed an opinion. Many of our opinions, like our stereotypes, are arbitrary; we’ve developed them without rigorous data or deep reflection. To help people reevaluate, prompt them to consider how they’d believe different things if they’d been born at a different time or in a different place.

Acknowledge common ground. A debate is like a dance, not a war. Admitting points of convergence doesn’t make you weaker—it shows that you’re willing to negotiate about what’s true, and it motivates the other side to consider your point of view.


Matthew Dicks – Storyworthy

How could you develop an ego or agenda to become internet- or podcast-famous (actual things, swear to god)? It’s a little like wanting to have the biggest house on the tiny-home scene.

Elysha has this consistent, annoying confidence in my abilities. She assumes that I’m capable of almost anything, which both undermines her appreciation for my abject terror and sets expectations far too high for my liking.

Your story must reflect change over time. A story cannot simply be a series of remarkable events. You must start out as one version of yourself and end as something new. The change can be infinitesimal. It need not reflect an improvement in yourself or your character, but change must happen. Even the worst movies in the world reflect some change in a character over time.

Don’t tell other people’s stories. Tell your own. But feel free to tell your side of other people’s stories, as long as you are the protagonist in these tales.

A story is like a diamond with many facets. Everyone has a different relationship to it. If you can find a way of making your particular facet of the story compelling, you can tell that story as your own. Otherwise, leave the telling to someone else.

Lastly, the story must pass the Dinner Test. The Dinner Test is simply this: Is the story that you craft for the stage, the boardroom, the sales conference, or the Sunday sermon similar to the story you would tell a friend at dinner? This should be the goal.

This is why tiny moments like the one at my dining-room table with my wife and children often make the best stories. These are the moments that connect with people. These are the stories that touch people’s hearts.

I decided that at the end of every day, I’d reflect upon my day and ask myself one simple question: If I had to tell a story from today — a five-minute story onstage about something that took place over the course of this day — what would it be? As benign and boring and inconsequential as it might seem, what was the most storyworthy moment from my day?

I discovered that there is beauty and import in my life that I never would have imagined before doing my homework, and that these small, unexpected moments of beauty are oftentimes some of my most compelling stories.

All of this happens because I sit down every evening and ask myself: What is my story from today? What is the thing about today that has made it different from any previous day? Then I write my answer down.

As you start to see importance and meaning in each day, you suddenly understand your importance to this world. You start to see how the meaningful moments that we experience every day contribute to the lives of others and to the world. You start to sense the critical nature of your very existence. There are no more throwaway days. Every day can change the world in some small way. In fact, every day has been changing the world for as long as you’ve been alive. You just haven’t noticed yet.

It may take you a month, six months, or even a year to refine and focus your storytelling lens. You might give up five minutes of your day for an entire year and receive nothing in return. This process requires you to believe that eventually you will begin seeing these moments in your life, just as I and so many others have.

The reason is simple: We are the sum of our experiences, the culmination of everything that has come before. The more we know about our past, the better we know ourselves. The greater our storehouse of memory, the more complete our personal narrative becomes. Our life begins to feel full and complete and important.

There are many secrets to storytelling, but there is one fundamental truth above all others that must be understood before a storyteller can ever be successful: All great stories — regardless of length or depth or tone — tell the story of a five-second moment in a person’s life.

These five-second moments are the moments in your life when something fundamentally changes forever. You fall in love. You fall out of love. You discover something new about yourself or another person. Your opinion on a subject dramatically changes. You find forgiveness. You reach acceptance. You sink into despair. You grudgingly resign. You’re drowned in regret. You make a life-altering decision. Choose a new path. Accomplish something great. Fail spectacularly.

Many times storytellers fail to understand the importance of these five-second moments. They see the big when they should be looking for the small. They come to me and say, “I went to Tanzania last summer. I want to tell that story onstage.” My answer is always the same: No. Visiting Tanzania is not a story. Your ability to travel the world does not mean that you can tell a good story or even have a good story to tell. But if something happened in Tanzania that altered you in some deep and fundamental way, then you might have a story. If you experienced a five-second moment in Tanzania, you might have something. Think of it this way: If we remove Tanzania from the story, do you still have a story worth telling?

Like Jurassic Park, the real story isn’t about the big thing. In fact, when people talk to me about the story, they rarely mention the car accident or my near-death experience. Instead, they speak about my five-second moment, when I find myself alone in the emergency room two hours after the accident, waiting for surgeons to operate on my ruined legs. Upon hearing that I was in stable condition, my parents decide to check on the car before checking on me, leaving me alone, frightened, and in terrible pain in the corner of a cold, sterile emergency room. Except it turns out that I’m not alone, because my friends from McDonald’s find out about the accident and quickly fill the waiting room, making the kind of noise that only a gang of teenagers can make.

This was my five-second moment. It was the moment when I realized that I had family after all. My friends were my family, and they remained the only family I had and the only family I needed until I met my wife fifteen years later. It might be the greatest five-second moment of my life.

So how do you choose the right place to start a story? Simple. Ask yourself where your story ends. What is the meaning of your five-second moment? Say it aloud. In “Charity Thief,” I might say it like this: “I thought I was alone in this world, facing a lifetime of loneliness. Then I met a man who taught me that I knew very little about loneliness and never wanted to know loneliness the way that man knew it on that day and probably many, many days thereafter.”

Simply put, the beginning of the story should be the opposite of the end. Find the opposite of your transformation, revelation, or realization, and this is where your story should start. This is what creates an arc in your story. This is how a story shows change over time. I was once this, but now I am this. I once thought this, but now I think this. I once felt this, but now I feel this. Stories must reflect change of some kind.

Regardless of whether your change is infinitesimal or profound, positive or negative, your story must reflect change. You must begin and end your story in entirely different states of being. Change is key. The story of how you’re an amazing person who did an amazing thing and ended up in an amazing place is not a story. It’s a recipe for a douchebag. The story of how you’re a pathetic person who did a pathetic thing and remained pathetic is also not a story. It’s a recipe for a sad sack.

Listen to people in the world tell you stories. Often they start with a sentence like, “This is hilarious,” or “You need to hear this,” or “You’re not going to believe this.” This is always a mistake, for three reasons. First, it establishes potentially unrealistic expectations.

Second, starting your story with a thesis statement reduces your chances of surprising your audience. When you tell me that the story is hilarious, I’m already primed for humor.

Pay attention to the opening scenes of movies. So many of them use this strategy as well. We open on the protagonist or someone similarly important to the story. That person will be moving. Walking. Running. Driving. Flying. Climbing. Fleeing. Falling. Swimming. Crawling. Diving. Filmmakers want to immerse you into their world as quickly as possible. They want you to forget the theater and the popcorn and the jackass who is texting beside you. They want you to be absorbed by the story. They want you to forget that you even exist for the duration of the film.

Every story must have an Elephant. The Elephant is the thing that everyone in the room can see. It is large and obvious. It is a clear statement of the need, the want, the problem, the peril, or the mystery. It signifies where the story is headed, and it makes it clear to your audience that this is in fact a story and not a simple musing on a subject.

The audience doesn’t know why they are listening to the story or what is to come, so it’s easy to stop listening. If you don’t present a reason to listen very early on, you risk losing their attention altogether.

Eventually the Elephant in my story changes color. The story isn’t really about escaping New Hampshire at all. It’s really a story about understanding the nature of loneliness. I change the color of the Elephant halfway through this story. I present the audience with one Elephant, but then I paint it another color. I trick them. This is an excellent storytelling strategy: make your audience think they are on one path, and then when they least expect it, show them that they have been on a different path all along.

A Backpack is a strategy that increases the stakes of the story by increasing the audience’s anticipation about a coming event. It’s when a storyteller loads up the audience with all the storyteller’s hopes and fears in that moment before moving the story forward. It’s an attempt to do two things: 1. Make the audience wonder what will happen next. 2. Make your audience experience the same emotion, or something like the same emotion, that the storyteller experienced in the moment about to be described.

This is why heist movies like the Ocean’s Eleven franchise explain almost every part of the robbers’ plan before they ever make a move. If you understand their plan to rob the casino, you can experience the same level of frustration, worry, fear, and suspense that the characters feel when their plans go awry. The filmmakers put the audience on Danny Ocean’s team. They know the plan, so they feel as if they are a part of the heist themselves.

Storytellers use Breadcrumbs when we hint at a future event but only reveal enough to keep the audience guessing.

In “Charity Thief,” I drop a Breadcrumb when I say: But as I climb back into the car, I see my crumpled McDonald’s uniform on the backseat, and I suddenly have an idea.

This is the moment to use an Hourglass. It’s time to slow things down. Grind them to a halt when possible. When you know the audience is hanging on your every word, let them hang. Drag out the wait as long as possible.

It’s the perfect time to use an Hourglass. Stakes. The desire of an audience to hear the next sentence, made greater by the deliberate slowing down of action and pace. Find the moment in your story that everyone has been waiting for, then flip that Hourglass and let the sand run.

The Crystal Ball is the easiest of the strategies to deploy, because you already use Crystal Balls in everyday life. A Crystal Ball is a false prediction made by a storyteller to cause the audience to wonder if the prediction will prove to be true.

Memory is a slippery thing, and as storytellers, we must remember this. Research suggests that every time you tell a story, it becomes less true.

A lie of progression is when a storyteller changes the order of events in a story to make it more emotionally satisfying or comprehensible to the listener. In my experience, this is the least common lie told, and I have never done it myself, but I’ve recommended that other storytellers use it from time to time.

Storytellers use conflation to push all the emotion of an event into a single time frame, because stories are more entertaining this way. Rather than describing change over a long period, we compress all the intellectual and emotional transformation into a smaller bit of time, because this is what audiences expect from stories.

Stories are not supposed to start with thesis statements or overwrought aphorisms.

Let me say it again, because it’s that important: A great storyteller creates a movie in the mind of the audience. Listeners should be able to see the story in their mind’s eye at all times. At no point should the story become visually obscured or impossible to see.

Always provide a physical location for every moment of your story. That’s it. If the audience knows where you are at all times within your story, the movie is running in their minds.

The ideal connective tissue in any story is the words but and therefore, along with all their glorious synonyms. These buts and therefores can be either explicit or implied.

Just listen to someone tell you about their vacation to Europe or their weekend at the beach. It’s almost never a good story. It’s almost never something you want to hear. Why? “First we went here, and it was amazing, and then we went here, and it was also amazing, and then we saw this, which was so amazing.”

One other aspect of the but-and-therefore principle: the power of the negative. Oddly, the negative is almost always better than the positive when it comes to storytelling. Saying what something or someone is not is almost always better than saying what something or someone is. For example: I am dumb, ugly, and unpopular. I’m not smart, I’m not at all good-looking, and no one likes me.

The second sentence is better, isn’t it? Here’s why: it contains a hidden but. It presents both possibilities. Unlike the first sentence, which only offers single descriptors, the second sentence offers a binary.

This is the trick to telling a big story: it cannot be about anything big. Instead we must find the small, relatable, comprehensible moments in our larger stories. We must find the piece of the story that people can connect to, relate to, and understand.

Your big stories could be about a vacation to exotic locales or the birth of a child or your wedding day or the untimely death of a loved one. Any of these could be told well if you find a way to make the story smaller than it seems. This is hard to do. Rarely are stories of birth or death or weddings or vacations good. They are more often ordinary, expected, and boring. Cliché. But this need not be the case.

As Blaise Pascal first said, “If I had more time, I would have written you a shorter letter.” Brevity takes time, and brevity is almost always better.

The same thing happens later in that story, when I say, “Hi, I’m Matt, and I’m collecting money for Ronald McDonald Children’s Charities.” It’s the most surprising moment of the story. People either gasp or laugh when they hear me say those words. If you’ll remember, I accentuate this surprise with a Breadcrumb and an Hourglass. I give a hint about what is to come (a crumpled McDonald’s uniform), and I make the audience wait forever to hear it by slowing my speech and adding enormous amounts of unnecessary description and repetition. Can you imagine how much less surprising the moment would be if I had climbed into my car, spotted the crumpled McDonald’s uniform, and said, “I know. I’m going to go door-to-door pretending to be a charity worker.” Still surprising, perhaps, but not nearly so.

Avoid thesis statements in storytelling.

You must end your story on heart. Far too often I hear storytellers attempt to end their story on a laugh. A pun. A joke. A play on words. This is not why we listen to stories. We like to laugh; we want to laugh. But we listen to stories to be moved.

Humor is a combination of wit, speed, tonality, confidence, daring, nonconformity, flexibility with the language, understanding of your audience, and more. In a lot of ways, it’s all about the way you say something. Delivery is critical.

Like all other emotional responses (see the previous chapter), humor is based entirely on surprise. A combination of specific words spoken in a specific way at a specific moment initiates a surprise that sparks a smile, a giggle, or actual laughter.

Babies and Blenders is the idea that when two things that rarely or never go together are pushed together, humor often results.

In the story about the way that my grandmother pulled my loose teeth, I refer to her as a sadist. Grandmother and sadist are rarely seen together, so it’s funny.

My favorite storyteller in the world — Steve Zimmer — does this in a story entitled “Neighborhood Watch.” After Steve’s family is not invited to the neighborhood Hawaiian luau, they decide to host the Zimmer family barbecue, which features “Zimmers, pineapple-flavored ham, and despair.”

The ending of the story — your five-second moment — will tell you what the beginning of your story should be. The beginning will be the opposite of the end. If my story is about my realization that the world, and especially its people, are fundamentally unsafe and willing to hurt you for the pettiest of reasons, the beginning of my story needs to present my previous belief that people are basically good and the world is generally safe.

This is the magic of the present tense. It creates a sense of immediacy.

Rather than attempting to be grandiose about yourself or your success, you must undermine both you and it. This is because of two realities: First, human beings love underdog stories. The love for the underdog is universal. Underdogs are supposed to lose, so when they manage to pull out an unexpected or unbelievable victory, our sense of joy is more intense than if that same underdog suffers a crushing defeat.

I also suggested this: Can Tim’s story be about something other than Mount Everest? Can the climb to the summit be about something more personal? More interior? Perhaps a bit of individual growth that resulted from the climb? I know it sounds crazy to turn the summiting of Mount Everest into something other than the summiting of Mount Everest, but if I can turn a story about putting my head through a windshield and dying on the side of the road into a story about my friends taking the place of my family, why not?

Avoid phrases like “You guys!” for the same reason you shouldn’t ask rhetorical questions. When a storyteller says something like “You guys, you’re not going to believe this!” the bubble is instantly broken. Time travel has abruptly ended. The audience is keenly aware that someone is standing in front of them, speaking directly to them and the people sitting around them.

Phrases like, “But that’s a story for another day,” or “Long story short” serve to remind our audience that we are telling a story. If your audience knows that you’re telling a story, then they’re not time traveling.

The lesson here: Nervousness can be your friend. Too much of it is never good, but not being nervous at all isn’t good either. I bristle at the saying, “If you’re not nervous, you don’t care enough,” because I couldn’t care more about performing well, but there is some truth in this statement. It ain’t always bad to be nervous.

It’s hard to be authentic and vulnerable when you’re reciting lines. It’s also obvious to an audience when a storyteller is simply reciting a story instead of telling a story. Instead of memorizing your story word-for-word, memorize three parts to a story: 1. The first few sentences. Always start strong. 2. The last few sentences. Always end strong. 3. The scenes of your story.

Some people remember their scenes in a list, but I actually remember these scenes as circles in my mind. The size of the circle reflects the size of the scene. The color of the circle reflects the tone and tenor of the scene.

But when you can see your audience — in a classroom, a conference room, your aunt’s kitchen, a reception hall, or a faculty meeting — eye contact is important. You can’t speak to the middle distance and expect your audience to connect.

This is what I call the Spider-Man Principle of Meetings and Presentations (though Voltaire admittedly said it first): “With great power comes great responsibility.”

A first date is an interview of sorts. If you can make the person laugh, share a little vulnerability, and tell a good story in the process, your chances for second and third dates increase exponentially.

I believe that it is the teacher’s responsibility to provide a reason to learn. A meaningful, entertaining, engaging, thrilling, fly-by-the-seat-of-their-pants reason to keep their eyes and ears and minds open. This is why every lesson requires a hook. A hook is not a statement like “This material will be on Friday’s test” or “This is something you’ll use for the rest of your life.” A hook is an attempt to be entertaining, engaging, thought-provoking, surprising, challenging, daring, and even shocking. This can be done in dozens, and perhaps hundreds, of ways.

Four times I have stepped off the stage at a storytelling show and been approached by a woman who wanted to share the story of her miscarriage with me. I was speechless the first time this happened. I called Elysha immediately after the show to tell her. Elysha’s response was surprising. “Of course she wanted to tell you,” she said. “You stood on that stage and talked about one of the most difficult moments of your life with complete honesty. Your story made you safe to talk to. And she never needs to see you again. She could unburden herself of this secret to someone she knew she could trust, and she doesn’t have to see you at work or home the next day.” It made sense.

Categories
Personal Growth Psychology

James Clear – Atomic Habits

Goals are about the results you want to achieve. Systems are about the processes that lead to those results.

Changes that seem small and unimportant at first will compound into remarkable results if you’re willing to stick with them for years.

The impact created by a change in your habits is similar to the effect of shifting the route of an airplane by just a few degrees. Imagine you are flying from Los Angeles to New York City. If a pilot leaving from LAX adjusts the heading just 3.5 degrees south, you will land in Washington, D.C., instead of New York. Such a small change is barely noticeable at takeoff—the nose of the airplane moves just a few feet—but when magnified across the entire United States, you end up hundreds of miles apart.
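The drift in the airplane analogy can be sanity-checked with simple trigonometry. The sketch below is a back-of-the-envelope approximation, not a figure from the book: it assumes a roughly 2,450-mile straight-line LAX–New York distance and flat-plane geometry.

```python
import math

# Rough check of the 3.5-degree heading-error example.
# Assumptions (mine, not the book's): ~2,450 miles LAX to New York,
# constant heading error, flat-plane geometry.
distance_miles = 2450
heading_error_degrees = 3.5

# Lateral offset at the destination for a constant heading error.
offset = distance_miles * math.sin(math.radians(heading_error_degrees))
print(round(offset))  # roughly 150 miles off course
```

Under these assumptions the offset comes out to about 150 miles — on the order of the separation between New York City and Washington, D.C., which is consistent with the analogy's point that a barely perceptible change compounds over distance.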

Your outcomes are a lagging measure of your habits. Your net worth is a lagging measure of your financial habits. Your weight is a lagging measure of your eating habits. Your knowledge is a lagging measure of your learning habits. Your clutter is a lagging measure of your cleaning habits. You get what you repeat.

The purpose of setting goals is to win the game. The purpose of building systems is to continue playing the game. True long-term thinking is goal-less thinking. It’s not about any single accomplishment. It is about the cycle of endless refinement and continuous improvement. Ultimately, it is your commitment to the process that will determine your progress.

You do not rise to the level of your goals. You fall to the level of your systems.

Many people begin the process of changing their habits by focusing on what they want to achieve. This leads us to outcome-based habits. The alternative is to build identity-based habits. With this approach, we start by focusing on who we wish to become.

Behind every system of actions is a system of beliefs. The system of a democracy is founded on beliefs like freedom, majority rule, and social equality. The system of a dictatorship has a very different set of beliefs like absolute authority and strict obedience.

Behavior that is incongruent with the self will not last. You may want more money, but if your identity is someone who consumes rather than creates, then you’ll continue to be pulled toward spending rather than earning.

The ultimate form of intrinsic motivation is when a habit becomes part of your identity. It’s one thing to say I’m the type of person who wants this. It’s something very different to say I’m the type of person who is this.

True behavior change is identity change. You might start a habit because of motivation, but the only reason you’ll stick with one is that it becomes part of your identity.

The goal is not to read a book, the goal is to become a reader. The goal is not to run a marathon, the goal is to become a runner. The goal is not to learn an instrument, the goal is to become a musician.

Your behaviors are usually a reflection of your identity.

Research has shown that once a person believes in a particular aspect of their identity, they are more likely to act in alignment with that belief.

Once you have adopted an identity, it can be easy to let your allegiance to it impact your ability to change. Many people walk through life in a cognitive slumber, blindly following the norms attached to their identity. “I’m terrible with directions.” “I’m not a morning person.” “I’m bad at remembering people’s names.” “I’m always late.” “I’m not good with technology.” “I’m horrible at math.” … and a thousand other variations. When you have repeated a story to yourself for years, it is easy to slide into these mental grooves and accept them as a fact.

The biggest barrier to positive change at any level—individual, team, society—is identity conflict.

Your habits are how you embody your identity.

The more you repeat a behavior, the more you reinforce the identity associated with that behavior. In fact, the word identity was originally derived from the Latin words essentitas, which means being, and identidem, which means repeatedly. Your identity is literally your “repeated beingness.”

Each habit is like a suggestion: “Hey, maybe this is who I am.” If you finish a book, then perhaps you are the type of person who likes reading. If you go to the gym, then perhaps you are the type of person who likes exercise. If you practice playing the guitar, perhaps you are the type of person who likes music.

Every action you take is a vote for the type of person you wish to become. No single instance will transform your beliefs, but as the votes build up, so does the evidence of your new identity.

Putting this all together, you can see that habits are the path to changing your identity. The most practical way to change who you are is to change what you do. Each time you write a page, you are a writer. Each time you practice the violin, you are a musician. Each time you start a workout, you are an athlete. Each time you encourage your employees, you are a leader.

Decide the type of person you want to be. Prove it to yourself with small wins.

Habits are mental shortcuts learned from experience. In a sense, a habit is just a memory of the steps you followed to solve a problem in the past.

The primary reason the brain remembers the past is to better predict what will work in the future.

The process of building a habit can be divided into four simple steps: cue, craving, response, and reward.

As the psychologist Carl Jung said, “Until you make the unconscious conscious, it will direct your life and you will call it fate.”

The Japanese railway system is regarded as one of the best in the world. If you ever find yourself riding a train in Tokyo, you’ll notice that the conductors have a peculiar habit. As each operator runs the train, they proceed through a ritual of pointing at different objects and calling out commands. When the train approaches a signal, the operator will point at it and say, “Signal is green.” As the train pulls into and out of each station, the operator will point at the speedometer and call out the exact speed. When it’s time to leave, the operator will point at the timetable and state the time. Out on the platform, other employees are performing similar actions. Before each train departs, staff members will point along the edge of the platform and declare, “All clear!” Every detail is identified, pointed at, and named aloud.

This process, known as Pointing-and-Calling, is a safety system designed to reduce mistakes. It seems silly, but it works incredibly well. Pointing-and-Calling reduces errors by up to 85 percent and cuts accidents by 30 percent. The MTA subway system in New York City adopted a modified version that is “point-only,” and “within two years of implementation, incidents of incorrectly berthed subways fell 57 percent.”

Pointing-and-Calling is so effective because it raises the level of awareness from a nonconscious habit to a more conscious level. Because the train operators must use their eyes, hands, mouth, and ears, they are more likely to notice problems before something goes wrong.

The sentence they filled out is what researchers refer to as an implementation intention, which is a plan you make beforehand about when and where to act. That is, how you intend to implement a particular habit.

The cues that can trigger a habit come in a wide range of forms—the feel of your phone buzzing in your pocket, the smell of chocolate chip cookies, the sound of ambulance sirens—but the two most common cues are time and location.

The Diderot Effect states that obtaining a new possession often creates a spiral of consumption that leads to additional purchases. You can spot this pattern everywhere.

Environment is the invisible hand that shapes human behavior. Despite our unique personalities, certain behaviors tend to arise again and again under certain environmental conditions. In church, people tend to talk in whispers. On a dark street, people act wary and guarded. In this way, the most common form of change is not internal, but external: we are changed by the world around us. Every habit is context dependent.

In 1936, psychologist Kurt Lewin wrote a simple equation that makes a powerful statement: Behavior is a function of the Person in their Environment, or B = f(P, E).

The power of context also reveals an important strategy: habits can be easier to change in a new environment. It helps to escape the subtle triggers and cues that nudge you toward your current habits. Go to a new place—a different coffee shop, a bench in the park, a corner of your room you seldom use—and create a new routine there. It is easier to associate a new habit with a new context than to build a new habit in the face of competing cues.

A stable environment where everything has a place and a purpose is an environment where habits can easily form.

In 1971, as the Vietnam War was heading into its sixteenth year, congressmen Robert Steele from Connecticut and Morgan Murphy from Illinois made a discovery that stunned the American public. While visiting the troops, they had learned that over 15 percent of U.S. soldiers stationed there were heroin addicts. Follow-up research, led by psychiatric researcher Lee Robins, revealed that 35 percent of service members in Vietnam had tried heroin and as many as 20 percent were addicted — the problem was even worse than they had initially thought.

In a finding that completely upended the accepted beliefs about addiction, Robins found that when soldiers who had been heroin users returned home, only 5 percent of them became re-addicted within a year, and just 12 percent relapsed within three years. In other words, approximately nine out of ten soldiers who used heroin in Vietnam eliminated their addiction nearly overnight.

This finding contradicted the prevailing view at the time, which considered heroin addiction to be a permanent and irreversible condition. Instead, Robins revealed that addictions could spontaneously dissolve if there was a radical change in the environment.

When scientists analyze people who appear to have tremendous self-control, it turns out those individuals aren’t all that different from those who are struggling. Instead, “disciplined” people are better at structuring their lives in a way that does not require heroic willpower and self-control. In other words, they spend less time in tempting situations.

The people with the best self-control are typically the ones who need to use it the least. It’s easier to practice self-restraint when you don’t have to use it very often. So, yes, perseverance, grit, and willpower are essential to success, but the way to improve these qualities is not by wishing you were a more disciplined person, but by creating a more disciplined environment.

Bad habits are autocatalytic: the process feeds itself. They foster the feelings they try to numb. You feel bad, so you eat junk food. Because you eat junk food, you feel bad. Watching television makes you feel sluggish, so you watch more television because you don’t have the energy to do anything else.

Here’s the punch line: You can break a habit, but you’re unlikely to forget it. Once the mental grooves of habit have been carved into your brain, they are nearly impossible to remove entirely—even if they go unused for quite a while.

Scientists can track the precise moment a craving occurs by measuring a neurotransmitter called dopamine.

By implanting electrodes in the brains of rats, the researchers blocked the release of dopamine. To the surprise of the scientists, the rats lost all will to live. They wouldn’t eat. They wouldn’t have sex. They didn’t crave anything. Within a few days, the animals died of thirst.

Habits are a dopamine-driven feedback loop. Every behavior that is highly habit-forming—taking drugs, eating junk food, playing video games, browsing social media—is associated with higher levels of dopamine.

When it comes to habits, the key takeaway is this: dopamine is released not only when you experience pleasure, but also when you anticipate it.

Your brain has far more neural circuitry allocated for wanting rewards than for liking them. The wanting centers in the brain are large: the brain stem, the nucleus accumbens, the ventral tegmental area, the dorsal striatum, the amygdala, and portions of the prefrontal cortex. By comparison, the liking centers of the brain are much smaller. They are often referred to as “hedonic hot spots” and are distributed like tiny islands throughout the brain.

Temptation bundling works by linking an action you want to do with an action you need to do. In Byrne’s case, he bundled watching Netflix (the thing he wanted to do) with riding his stationary bike (the thing he needed to do).

Temptation bundling is one way to apply a psychology theory known as Premack’s Principle. Named after the work of professor David Premack, the principle states that “more probable behaviors will reinforce less probable behaviors.”

We imitate the habits of three groups in particular: The close. The many. The powerful.

Similarly, one study found that the higher your best friend’s IQ at age eleven or twelve, the higher your IQ would be at age fifteen, even after controlling for natural levels of intelligence. We soak up the qualities and practices of those around us.

Whenever we are unsure how to act, we look to the group to guide our behavior. We are constantly scanning our environment and wondering, “What is everyone else doing?” We check reviews on Amazon or Yelp or TripAdvisor because we want to imitate the “best” buying, eating, and travel habits. It’s usually a smart strategy. There is evidence in numbers.

You can make hard habits more attractive if you can learn to associate them with a positive experience. Sometimes, all you need is a slight mind-set shift. For instance, we often talk about everything we have to do in a given day. You have to wake up early for work. You have to make another sales call for your business. You have to cook dinner for your family. Now, imagine changing just one word: You don’t “have” to. You “get” to. You get to wake up early for work. You get to make another sales call for your business. You get to cook dinner for your family. By simply changing one word, you shift the way you view each event.

I once heard a story about a man who uses a wheelchair. When asked if it was difficult being confined, he responded, “I’m not confined to my wheelchair—I am liberated by it. If it wasn’t for my wheelchair, I would be bed-bound and never able to leave my house.” This shift in perspective completely transformed how he lived each day.

As Voltaire once wrote, “The best is the enemy of the good.”

Repeating a habit leads to clear physical changes in the brain. In musicians, the cerebellum—critical for physical movements like plucking a guitar string or pulling a violin bow—is larger than it is in nonmusicians. Mathematicians, meanwhile, have increased gray matter in the inferior parietal lobule, which plays a key role in computation and calculation. Its size is directly correlated with the amount of time spent in the field; the older and more experienced the mathematician, the greater the increase in gray matter. When scientists analyzed the brains of taxi drivers in London, they found that the hippocampus—a region of the brain involved in spatial memory—was significantly larger in their subjects than in non–taxi drivers. Even more fascinating, the hippocampus decreased in size when a driver retired. Like the muscles of the body responding to regular weight training, particular regions of the brain adapt as they are used and atrophy as they are abandoned.

One of the most common questions I hear is, “How long does it take to build a new habit?” But what people really should be asking is, “How many does it take to form a new habit?” That is, how many repetitions are required to make a habit automatic?

The purpose of resetting each room is not simply to clean up after the last action, but to prepare for the next action. “When I walk into a room everything is in its right place,” Nuckols wrote. “Because I do this every day in every room, stuff always stays in good shape …. People think I work hard but I’m actually really lazy. I’m just proactively lazy. It gives you so much time back.”

There are many ways to prime your environment so it’s ready for immediate use. If you want to cook a healthy breakfast, place the skillet on the stove, set the cooking spray on the counter, and lay out any plates and utensils you’ll need the night before. When you wake up, making breakfast will be easy. Want to draw more? Put your pencils, pens, notebooks, and drawing tools on top of your desk, within easy reach. Want to exercise? Set out your workout clothes, shoes, gym bag, and water bottle ahead of time.

Even when you know you should start small, it’s easy to start too big. When you dream about making a change, excitement inevitably takes over and you end up trying to do too much too soon. The most effective way I know to counteract this tendency is to use the Two-Minute Rule, which states, “When you start a new habit, it should take less than two minutes to do.”

Instead of trying to engineer a perfect habit from the start, do the easy thing on a more consistent basis. You have to standardize before you can optimize.

In the late 1990s, a public health worker named Stephen Luby left his hometown of Omaha, Nebraska, and bought a one-way ticket to Karachi, Pakistan.

Luby and his team realized that in an environment with poor sanitation, the simple habit of washing your hands could make a real difference in the health of the residents. But they soon discovered that many people were already aware that handwashing was important.

Everyone said handwashing was important, but few people made a habit out of it. The problem wasn’t knowledge. The problem was consistency. That was when Luby and his team partnered with Procter & Gamble to supply the neighborhood with Safeguard soap.

“In Pakistan, Safeguard was a premium soap,” Luby told me. “The study participants commonly mentioned how much they liked it.”

Within months, the researchers saw a rapid shift in the health of children in the neighborhood. The rate of diarrhea fell by 52 percent; pneumonia by 48 percent; and impetigo, a bacterial skin infection, by 35 percent. The long-term effects were even better.

Behavioral economists refer to this tendency as time inconsistency. That is, the way your brain evaluates rewards is inconsistent across time. You value the present more than the future. Usually, this tendency serves us well. A reward that is certain right now is typically worth more than one that is merely possible in the future. But occasionally, our bias toward instant gratification causes problems.
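The preference reversal behind time inconsistency is often modeled with hyperbolic discounting. A minimal Python sketch, where the formula V = A / (1 + k·D), the rate k, and the dollar amounts are all illustrative assumptions rather than anything from the book:

```python
# "You value the present more than the future" modeled with hyperbolic
# discounting: V = A / (1 + k * D). Rate and amounts are made up.
def discounted_value(amount, delay_days, k=0.05):
    """Subjective value of `amount` received after `delay_days`."""
    return amount / (1 + k * delay_days)

# Offered today, $100 now feels worth more than $120 in a month...
print(discounted_value(100, 0) > discounted_value(120, 30))    # True
# ...but viewed from a year away, the larger-later reward wins.
# This flip is the inconsistency across time described above.
print(discounted_value(120, 395) > discounted_value(100, 365))  # True
```

The same pair of rewards produces opposite preferences depending on when you evaluate them, which is exactly why the near-term choice so often undermines the long-term plan.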

The French economist Frédéric Bastiat explained the problem clearly when he wrote, “It almost always happens that when the immediate consequence is favorable, the later consequences are disastrous, and vice versa …. Often, the sweeter the first fruit of a habit, the more bitter are its later fruits.”

A stockbroker named Trent Dyrsmid began each morning with two jars on his desk. One was filled with 120 paper clips. The other was empty. As soon as he settled in each day, he would make a sales call. Immediately after, he would move one paper clip from the full jar to the empty jar and the process would begin again.

I like to refer to this technique as the Paper Clip Strategy and, over the years, I’ve heard from readers who have employed it in a variety of ways. One woman shifted a hairpin from one container to another whenever she wrote a page of her book. Another man moved a marble from one bin to the next after each set of push-ups.
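Mechanically, the jar-and-clip routine is just a visible counter. A small sketch, where the class name and the numbers are illustrative rather than anything from the book:

```python
# A minimal sketch of the Paper Clip Strategy: move one "clip" per
# completed rep and keep the progress visible. Numbers are illustrative.
class PaperClipTracker:
    def __init__(self, goal=120):
        self.remaining = goal  # clips still in the full jar
        self.done = 0          # clips moved to the empty jar

    def complete_rep(self):
        """Move one clip after each completed rep (a call, a page, a set)."""
        if self.remaining > 0:
            self.remaining -= 1
            self.done += 1

    def progress(self):
        return f"{self.done}/{self.done + self.remaining} done"

tracker = PaperClipTracker(goal=120)
for _ in range(25):  # say, 25 sales calls made so far today
    tracker.complete_rep()
print(tracker.progress())  # prints "25/120 done"
```

The physical version works for the same reason this one would: each move is an immediate, visible record of progress toward a fixed daily goal.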

In summary, habit tracking (1) creates a visual cue that can remind you to act, (2) is inherently motivating because you see the progress you are making and don’t want to lose it, and (3) feels satisfying whenever you record another successful instance of your habit. Furthermore, habit tracking provides visual proof that you are casting votes for the type of person you wish to become, which is a delightful form of immediate and intrinsic gratification.

You don’t realize how valuable it is to just show up on your bad (or busy) days.

This is sometimes referred to as Goodhart’s Law. Named after the economist Charles Goodhart, the principle states, “When a measure becomes a target, it ceases to be a good measure.”

All five characteristics in the Big Five model have biological underpinnings. Extroversion, for instance, can be tracked from birth. If scientists play a loud noise in the nursing ward, some babies turn toward it while others turn away. When the researchers tracked these children through life, they found that the babies who turned toward the noise were more likely to grow up to be extroverts. Those who turned away were more likely to become introverts.

As you explore different options, there are a series of questions you can ask yourself to continually narrow in on the habits and areas that will be most satisfying to you: What feels like fun to me, but work to others? The mark of whether you are made for a task is not whether you love it but whether you can handle the pain of the task more easily than most people.

What makes me lose track of time? Flow is the mental state you enter when you are so focused on the task at hand that the rest of the world fades away. This blend of happiness and peak performance is what athletes and performers experience when they are “in the zone.”

What comes naturally to me? For just a moment, ignore what you have been taught. Ignore what society has told you. Ignore what others expect of you. Look inside yourself and ask, “What feels natural to me? When have I felt alive? When have I felt like the real me?”

The Goldilocks Rule states that humans experience peak motivation when working on tasks that are right on the edge of their current abilities. Not too hard. Not too easy. Just right.

As Machiavelli noted, “Men desire novelty to such an extent that those who are doing well wish for a change as much as those who are doing badly.”

In psychology, this is known as a variable reward. Slot machines are the most common real-world example. A gambler hits the jackpot every now and then but not at any predictable interval. The pace of rewards varies. This variance leads to the greatest spike of dopamine, enhances memory recall, and accelerates habit formation.

I know of executives and investors who keep a “decision journal” in which they record the major decisions they make each week, why they made them, and what they expect the outcome to be. They review their choices at the end of each month or year to see where they were correct and where they went wrong.

Categories
Society

Hans Rosling – Factfulness

Every group of people I ask thinks the world is more frightening, more violent, and more hopeless—in short, more dramatic—than it really is.

Only actively wrong “knowledge” can make us score so badly.

In short: Think about the world. War, violence, natural disasters, man-made disasters, corruption. Things are bad, and it feels like they are getting worse, right? The rich are getting richer and the poor are getting poorer; and the number of poor just keeps increasing; and we will soon run out of resources unless we do something drastic. At least that’s the picture that most Westerners see in the media and carry around in their heads. I call it the overdramatic worldview. It’s stressful and misleading. In fact, the vast majority of the world’s population lives somewhere in the middle of the income scale. Perhaps they are not what we think of as middle class, but they are not living in extreme poverty. Their girls go to school, their children get vaccinated, they live in two-child families, and they want to go abroad on holiday, not as refugees. Step-by-step, year-by-year, the world is improving. Not on every single measure every single year, but as a rule. Though the world faces huge challenges, we have made tremendous progress. This is the fact-based worldview.

The overdramatic worldview is so difficult to shift because it comes from the very way our brains work.

The human brain is a product of millions of years of evolution, and we are hard-wired with instincts that helped our ancestors to survive in small groups of hunters and gatherers. Our brains often jump to swift conclusions without much thinking, which used to help us to avoid immediate dangers. We are interested in gossip and dramatic stories, which used to be the only source of news and useful information. We crave sugar and fat, which used to be life-saving sources of energy when food was scarce. We have many instincts that used to be useful thousands of years ago, but we live in a very different world now.

Our quick-thinking brains and cravings for drama—our dramatic instincts—are causing misconceptions and an overdramatic worldview.

Uncontrolled, our appetite for the dramatic goes too far, prevents us from seeing the world as it is, and leads us terribly astray.

This chapter is about the first of our ten dramatic instincts, the gap instinct. I’m talking about that irresistible temptation we have to divide all kinds of things into two distinct and often conflicting groups, with an imagined gap—a huge chasm of injustice—in between. It is about how the gap instinct creates a picture in people’s heads of a world split into two kinds of countries or two kinds of people: rich versus poor.

Graphs showing levels of income, or tourism, or democracy, or access to education, health care, or electricity would all tell the same story: that the world used to be divided into two but isn’t any longer. Today, most people are in the middle. There is no gap between the West and the rest, between developed and developing, between rich and poor. And we should all stop using the simple pairs of categories that suggest there is. 

Of the world population, what percentage lives in low-income countries? The majority suggested the answer was 50 percent or more. The average guess was 59 percent. The real figure is 9 percent. Only 9 percent of the world lives in low-income countries.

To summarize: low-income countries are much more developed than most people think. And vastly fewer people live in them. The idea of a divided world with a majority stuck in misery and deprivation is an illusion. A complete misconception. Simply wrong.

So, to replace them, I will now suggest an equally simple but more relevant and useful way of dividing up the world. Instead of dividing the world into two groups I will divide it into four income levels.

Today the vast majority of people are spread out in the middle, across Levels 2 and 3, with the same range of standards of living as people had in Western Europe and North America in the 1950s. And this has been the case for many years.

The gap instinct makes us imagine division where there is just a smooth range, difference where there is convergence, and conflict where there is agreement. It is the first instinct on our list because it’s so common and distorts the data so fundamentally. If you look at the news or click on a lobby group’s website this evening, you will probably notice stories about conflict between two groups, or phrases like “the increasing gap.”

Of course, gap stories can reflect reality. In apartheid South Africa, black people and white people lived on different income levels and there was a true gap between them, with almost no overlap. The gap story of separate groups was absolutely relevant. But apartheid was very unusual. Much more often, gap stories are a misleading overdramatization. In most cases there is no clear separation of two groups, even if it seems like that from the averages. We almost always get a more accurate picture by digging a little deeper and looking not just at the averages but at the spread: not just the group all bundled together, but the individuals. Then we often see that apparently distinct groups are in fact very much overlapping.

We are naturally drawn to extreme examples, and they are easy to recall.

These stories of opposites are engaging and provocative and tempting—and very effective for triggering our gap instinct—but they rarely help understanding. There will always be the richest and the poorest, there will always be the worst regimes and the best. But the fact that extremes exist doesn’t tell us much. The majority is usually to be found in the middle, and it tells a very different story.

To control the gap instinct, look for the majority.

Beware comparisons of extremes. In all groups, of countries or people, there are some at the top and some at the bottom. The difference is sometimes extremely unfair. But even then the majority is usually somewhere in between, right where the gap is supposed to be.

Over the last 20 years, the proportion of people living in extreme poverty has almost halved. But in our online polls, in most countries, less than 10 percent knew this. Remember the four income levels from chapter 1? In the year 1800, roughly 85 percent of humanity lived on Level 1, in extreme poverty.

Level 1 is where all of humanity started. It’s where the majority always lived, until 1966. Until then, extreme poverty was the rule, not the exception.

Back in 1800, when Swedes starved to death and British children worked in coal mines, life expectancy was roughly 30 years everywhere in the world.

There’s a dip in the global life expectancy curve in 1960 because 15 to 40 million people—nobody knows the exact number—starved to death that year in China, in what was probably the world’s largest ever man-made famine. The Chinese harvest in 1960 was smaller than planned because of a bad season combined with poor governmental advice about how to grow crops more effectively. The local governments didn’t want to show bad results, so they took all the food and sent it to the central government. There was no food left. One year later the shocked inspectors were delivering eyewitness reports of cannibalism and dead bodies along roads. The government denied that its central planning had failed, and the catastrophe was kept secret by the Chinese government for 36 years. It wasn’t described in English to the outside world until 1996. (Think about it. Could any government keep the death of 15 million people a global secret today?)

Your own country has been improving like crazy too. I can say this with confidence even though I don’t know where you live, because every country in the world has improved its life expectancy over the last 200 years. In fact almost every country has improved by almost every measure.

In large part, it is because of our negativity instinct: our instinct to notice the bad more than the good. There are three things going on here: the misremembering of the past; selective reporting by journalists and activists; and the feeling that as long as things are bad it’s heartless to say they are getting better.

And thanks to increasing press freedom and improving technology, we hear more, about more disasters, than ever before. When Europeans slaughtered indigenous peoples across America a few centuries ago, it didn’t make the news back in the old world. When central planning resulted in mass famine in rural China, millions starved to death while the youngsters in Europe waving communist red flags knew nothing about it. When in the past whole species or ecosystems were destroyed, no one realized or even cared. Alongside all the other improvements, our surveillance of suffering has improved tremendously. This improved reporting is itself a sign of human progress, but it creates the impression of the exact opposite.

Factfulness is … recognizing when we get negative news, and remembering that information about bad events is much more likely to reach us. When things are getting better we often don’t hear about them. This gives us a systematically too-negative impression of the world around us, which is very stressful. 

To control the negativity instinct, expect bad news. 

  • Better and bad. Practice distinguishing between a level (e.g., bad) and a direction of change (e.g., better). Convince yourself that things can be both better and bad. 
  • Good news is not news. Good news is almost never reported. So news is almost always bad. When you see bad news, ask whether equally positive news would have reached you. 
  • Gradual improvement is not news. When a trend is gradually improving, with periodic dips, you are more likely to notice the dips than the overall improvement. 
  • More news does not equal more suffering. More bad news is sometimes due to better surveillance of suffering, not a worsening world. 
  • Beware of rosy pasts. People often glorify their early experiences, and nations often glorify their histories.

The world population today is 7.6 billion people, and yes, it’s growing fast. Still, the growth has already started to slow down, and the UN experts are pretty sure it will keep slowing down over the next few decades. They think the curve will flatten out at somewhere between 10 and 12 billion people by the end of the century.

The UN experts are not predicting that the number of children will stop increasing. They are reporting that it is already happening. The radical change that is needed to stop rapid population growth is that the number of children stops growing. And that is already happening. How could that be? That, everybody should know.

When I was born in 1948, women on average gave birth to five children each. After 1965 the number started dropping like it never had done before. Over the last 50 years it dropped all the way to the amazingly low world average of just below 2.5.

The large increase in population is going to happen not because there are more children. And not, in the main, because old folks are living longer. In fact the UN experts do predict that by 2100, world life expectancy will have increased by roughly 11 years, adding 1 billion old people to the total and taking it to around 11 billion. The large increase in population will happen mainly because the children who already exist today are going to grow up and “fill up” the diagram with 3 billion more adults. This “fill-up effect” takes three generations, and then it is done.

When combining all the parents living on Levels 2, 3, and 4, from every region of the world, and of every religion or no religion, together they have on average two children. No kidding! This includes the populations of Iran, Mexico, India, Tunisia, Bangladesh, Brazil, Turkey, Indonesia, and Sri Lanka, just to name a few examples.

Critical thinking is always difficult, but it’s almost impossible when we are scared. There’s no room for facts when our minds are occupied by fear.

Yet here’s the paradox: the image of a dangerous world has never been broadcast more effectively than it is now, while the world has never been less violent and more safe.

In fact, the number of deaths from acts of nature has dropped far below half. It is now just 25 percent of what it was 100 years ago.

This chapter has touched on terrifying events: natural disasters (0.1 percent of all deaths), plane crashes (0.001 percent), murders (0.7 percent), nuclear leaks (0 percent), and terrorism (0.05 percent). None of them kills more than 1 percent of the people who die each year, and still they get enormous media attention.

“In the deepest poverty you should never do anything perfectly. If you do you are stealing resources from where they can be better used.”

Today there are robust data sets for making the kinds of comparisons I made in Nacala on a global scale, and they show the same thing: It is not doctors and hospital beds that save children’s lives in countries on Levels 1 and 2. Beds and doctors are easy to count and politicians love to inaugurate buildings. But almost all the increased child survival is achieved through preventive measures outside hospitals by local nurses, midwives, and well-educated parents. Especially mothers: the data shows that half the increase in child survival in the world happens because the mothers can read and write. More children now survive because they don’t get ill in the first place.

I first discovered how useful the 80/20 rule is when I started to review aid projects for the Swedish government. In most budgets, around 20 percent of the lines sum up to more than 80 percent of the total. You can save a lot of money by making sure you understand these lines first. Doing just that is how I discovered that half the aid budget of a small health center in rural Vietnam was about to be spent on 2,000 of the wrong kind of surgical knives. It’s how I discovered that 100 times too much—4 million liters—of baby formula was about to be sent to a refugee camp in Algeria. And it is how I stopped 20,000 testicular prostheses from being sent to a small youth clinic in Nicaragua. In each case I simply looked for the biggest single items taking up 80 percent of the budget, then dug down into any that seemed unusual. In each case the problem was due to a simple confusion or tiny error such as a missing decimal point. The 80/20 rule is as easy as it seems. You just have to remember to use it.
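Rosling’s procedure (sum the lines, flag the few biggest items that cover roughly 80 percent of the total, then dig into anything unusual among them) can be sketched in a few lines of Python. The budget items and figures below are invented for illustration:

```python
# A sketch of the 80/20 budget check: sort lines by cost and flag the
# few that together make up ~80% of the total. Figures are made up.
def top_lines(budget, threshold=0.8):
    """Return the largest budget lines covering `threshold` of the total."""
    total = sum(budget.values())
    running = 0.0
    flagged = []
    for item, cost in sorted(budget.items(), key=lambda kv: kv[1], reverse=True):
        if running >= threshold * total:
            break
        flagged.append(item)
        running += cost
    return flagged

# Hypothetical aid-project budget (illustrative numbers only).
budget = {
    "surgical knives": 50_000,
    "vaccines": 30_000,
    "staff training": 10_000,
    "stationery": 5_000,
    "transport": 5_000,
}
print(top_lines(budget))  # the two big lines worth checking first
```

Scrutinizing just those flagged lines is how a single misplaced decimal point in a large item gets caught before the money is spent.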

By the end of this century, the UN expects there to have been almost no change in the Americas and Europe but 3 billion more people in Africa and 1 billion more in Asia. By 2100 the new PIN code of the world will be 1-1-4-5. More than 80 percent of the world’s population will live in Africa and Asia.

When I see a lonely number in a news report, it always triggers an alarm: What should this lonely number be compared to? What was that number a year ago? Ten years ago? What is it in a comparable country or region? And what should it be divided by? What is the total of which this is a part? What would this be per person? I compare the rates, and only then do I decide whether it really is an important number. 

Factfulness is … recognizing when a lonely number seems impressive (small or large), and remembering that you could get the opposite impression if it were compared with or divided by some other relevant number. 

To control the size instinct, get things in proportion. 

  • Compare. Big numbers always look big. Single numbers on their own are misleading and should make you suspicious. Always look for comparisons. Ideally, divide by something. 
  • 80/20. Have you been given a long list? Look for the few largest items and deal with those first. They are quite likely more important than all the others put together. 
  • Divide. Amounts and rates can tell very different stories. Rates are more meaningful, especially when comparing between different-sized groups. In particular, look for rates per person when comparing between countries or regions.
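The “divide” advice can be made concrete with a toy example; both countries and all figures below are made up:

```python
# A sketch of "divide": compare rates per person, not raw counts.
# All populations and event counts are invented for illustration.
def per_capita(count, population):
    return count / population

big = {"events": 10_000, "population": 100_000_000}
small = {"events": 2_000, "population": 5_000_000}

# The raw counts make the big country look worse off...
assert big["events"] > small["events"]
# ...but per person, the small country's rate is four times higher.
print(per_capita(big["events"], big["population"]))      # 0.0001
print(per_capita(small["events"], small["population"]))  # 0.0004
```

The lonely number (10,000 events versus 2,000) gives exactly the opposite impression of the rates, which is the trap the size instinct sets.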

During the Second World War and the Korean War, doctors and nurses discovered that unconscious soldiers stretchered off the battlefields survived more often if they were laid on their fronts rather than on their backs. On their backs, they often suffocated on their own vomit. On their fronts, the vomit could exit and their airways remained open. This observation saved many millions of lives, not just of soldiers. The “recovery position” has since become a global best practice, taught in every first-aid course on the planet. (The rescue workers saving lives after the 2015 earthquake in Nepal had all learned it.) But a new discovery can easily be generalized too far. In the 1960s, the success of the recovery position inspired new public health advice, against most traditional practices, to put babies to sleep on their tummies. As if any helpless person on their back needed just the same help. The mental clumsiness of a generalization like this is often difficult to spot. The chain of logic seems correct. When seemingly impregnable logic is combined with good intentions, it becomes nearly impossible to spot the generalization error. Even though the data showed that sudden infant deaths went up, not down, it wasn’t until 1985 that a group of pediatricians in Hong Kong actually suggested that the prone position might be the cause. Even then, doctors in Europe didn’t pay much attention. It took Swedish authorities another seven years to accept their mistake and reverse the policy. Unconscious soldiers were dying on their backs when they vomited. Sleeping babies, unlike unconscious soldiers, have fully functioning reflexes and turn to the side if they vomit while on their backs. But on their tummies, maybe some babies are not yet strong enough to tilt their heavy heads to keep their airways open. (The reason the prone position is more dangerous is still not fully understood.)

Factfulness is … recognizing when a category is being used in an explanation, and remembering that categories can be misleading. We can’t stop generalization and we shouldn’t even try. What we should try to do is to avoid generalizing incorrectly.

To control the generalization instinct, question your categories. 

  • Look for differences within groups. Especially when the groups are large, look for ways to split them into smaller, more precise categories.
  • Look for similarities across groups. If you find striking similarities between different groups, consider whether your categories are relevant. 
  • Look for differences across groups. Do not assume that what applies for one group (e.g., you and other people living on Level 4 or unconscious soldiers) applies for another (e.g., people not living on Level 4 or sleeping babies).
  • Beware of “the majority.” The majority just means more than half. Ask whether it means 51 percent, 99 percent, or something in between. 
  • Beware of vivid examples. Vivid images are easier to recall but they might be the exception rather than the rule. 
  • Assume people are not idiots. When something looks strange, be curious and humble, and think, In what way is this a smart solution?

Almost every religious tradition has rules about sex, so it is easy to understand why so many people assume that women in some religions give birth to more children. But the link between religion and the number of babies per woman is often overstated. There is, though, a strong link between income and number of babies per woman.

Factfulness is … recognizing that many things (including people, countries, religions, and cultures) appear to be constant just because the change is happening slowly, and remembering that even small, slow changes gradually add up to big changes. 

To control the destiny instinct, remember slow change is still change. 

  • Keep track of gradual improvements. A small change every year can translate to a huge change over decades. 
  • Update your knowledge. Some knowledge goes out of date quickly. Technology, countries, societies, cultures, and religions are constantly changing. 
  • Talk to Grandpa. If you want to be reminded of how values have changed, think about your grandparents’ values and how they differ from yours. 
  • Collect examples of cultural change. Challenge the idea that today’s culture must also have been yesterday’s, and will also be tomorrow’s.

You probably know the saying “give a child a hammer and everything looks like a nail.” When you have valuable expertise, you like to see it put to use. Sometimes an expert will look around for ways in which their hard-won knowledge and skills can be applied beyond where they’re actually useful. So, people with math skills can get fixated on the numbers. Climate activists argue for solar everywhere. And physicians promote medical treatment where prevention would be better.

Experts in maternal mortality who understand the point about hammers and nails can see that the most valuable intervention for saving the lives of the poorest mothers is not training more local nurses to perform C-sections, or better treatment of severe bleeding or infections, but the availability of transport to the local hospital. The hospitals were of limited use if women could not reach them: if there were no ambulances, or no roads for the ambulances to travel on. Similarly, educators know that it is often the availability of electricity rather than more textbooks or even more teachers in the classroom that has the most impact on learning, as students can do their homework after sunset.

Instead of comparing themselves with extreme socialist regimes, US citizens should be asking why they cannot achieve the same levels of health, for the same cost, as other capitalist countries that have similar resources. The answer is not difficult, by the way: it is the absence of the basic public health insurance that citizens of most other countries on Level 4 take for granted. Under the current US system, rich, insured patients visit doctors more than they need, running up costs, while poor patients cannot afford even simple, inexpensive treatments and die younger than they should. Doctors spend time that could be used to save lives or treat illness providing unnecessary, meaningless care. What a tragic waste of physician time.

People like me, who believe this, are often tempted to argue that democracy leads to, or is even a requirement for, other good things, like peace, social progress, health improvements, and economic growth. But here’s the thing, and it is hard to accept: the evidence does not support this stance. Most countries that make great economic and social progress are not democracies. South Korea moved from Level 1 to Level 3 faster than any country had ever done (without finding oil), all the time as a military dictatorship. Of the ten countries with the fastest economic growth in 2016, nine of them score low on democracy.

Our press may be free, and professional, and truth-seeking, but independent is not the same as representative: even if every report is itself completely true, we can still get a misleading picture through the sum of true stories reporters choose to tell. The media is not and cannot be neutral, and we shouldn’t expect it to be.

The urgency instinct makes us want to take immediate action in the face of a perceived imminent danger. It must have served us humans well in the distant past. If we thought there might be a lion in the grass, it wasn’t sensible to do too much analysis. Those who stopped and carefully analyzed the probabilities are not our ancestors. We are the offspring of those who decided and acted quickly with insufficient information.

The five that concern me most are the risks of global pandemic, financial collapse, world war, climate change, and extreme poverty.

The Spanish flu that spread across the world in the wake of the First World War killed 50 million people—more people than the war had, although that was partly because the populations were already weakened after four years of war.

Categories
Morality Philosophy Psychology

Sam Harris – Free Will

Free will is an illusion. Our wills are simply not of our own making. Thoughts and intentions emerge from background causes of which we are unaware and over which we exert no conscious control. We do not have the freedom we think we have. Free will is actually more than an illusion (or less), in that it cannot be made conceptually coherent. Either our wills are determined by prior causes and we are not responsible for them, or they are the product of chance and we are not responsible for them.

If a man’s choice to shoot the president is determined by a certain pattern of neural activity, which is in turn the product of prior causes—perhaps an unfortunate coincidence of bad genes, an unhappy childhood, lost sleep, and cosmic-ray bombardment—what can it possibly mean to say that his will is “free”?

The popular conception of free will seems to rest on two assumptions: (1) that each of us could have behaved differently than we did in the past, and (2) that we are the conscious source of most of our thoughts and actions in the present. As we are about to see, however, both of these assumptions are false. 

The intention to do one thing and not another does not originate in consciousness—rather, it appears in consciousness, as does any thought or impulse that might oppose it.

One fact now seems indisputable: Some moments before you are aware of what you will do next—a time in which you subjectively appear to have complete freedom to behave however you please—your brain has already determined what you will do. You then become conscious of this “decision” and believe that you are in the process of making it.

The physiologist Benjamin Libet famously used EEG to show that activity in the brain’s motor cortex can be detected some 300 milliseconds before a person feels that he has decided to move.

Another lab extended this work using functional magnetic resonance imaging (fMRI): Subjects were asked to press one of two buttons while watching a “clock” composed of a random sequence of letters appearing on a screen. They reported which letter was visible at the moment they decided to press one button or the other. The experimenters found two brain regions that contained information about which button subjects would press a full 7 to 10 seconds before the decision was consciously made.

There is a distinction between voluntary and involuntary actions, of course, but it does nothing to support the common idea of free will (nor does it depend upon it). A voluntary action is accompanied by the felt intention to carry it out, whereas an involuntary action isn’t. Needless to say, this difference is reflected at the level of the brain. And what a person consciously intends to do says a lot about him. It makes sense to treat a man who enjoys murdering children differently from one who accidentally hit and killed a child with his car—because the conscious intentions of the former give us a lot of information about how he is likely to behave in the future.

Of course, this insight does not make social and political freedom any less important. The freedom to do what one intends, and not to do otherwise, is no less valuable than it ever was.

You are not controlling the storm, and you are not lost in it. You are the storm.

In the philosophical literature, one finds three main approaches to the problem: determinism, libertarianism, and compatibilism.

Today, the only philosophically respectable way to endorse free will is to be a compatibilist—because we know that determinism, in every sense relevant to human behavior, is true. Unconscious neural events determine our thoughts and actions—and are themselves determined by prior causes of which we are subjectively unaware.

However, the “free will” that compatibilists defend is not the free will that most people feel they have.

Compatibilists generally claim that a person is free as long as he is free from any outer or inner compulsions that would prevent him from acting on his actual desires and intentions.

If you want a second scoop of ice cream and no one is forcing you to eat it, then eating a second scoop is fully demonstrative of your freedom of will. The truth, however, is that people claim greater autonomy than this. Our moral intuitions and sense of personal agency are anchored to a felt sense that we are the conscious source of our thoughts and actions.

My mental life is simply given to me by the cosmos. Why didn’t I decide to drink a glass of juice? The thought never occurred to me. Am I free to do that which does not occur to me to do? Of course not.

And there is no way I can influence my desires—for what tools of influence would I use? Other desires? To say that I would have done otherwise had I wanted to is simply to say that I would have lived in a different universe had I been in a different universe. Compatibilism amounts to nothing more than an assertion of the following creed: A puppet is free as long as he loves his strings.

At this moment, you are making countless unconscious “decisions” with organs other than your brain—but these are not events for which you feel responsible. Are you producing red blood cells and digestive enzymes at this moment?

Your body is doing these things, of course, but if it “decided” to do otherwise, you would be the victim of these changes, rather than their cause. To say that you are responsible for everything that goes on inside your skin because it’s all “you” is to make a claim that bears absolutely no relationship to the feelings of agency and moral responsibility that have made the idea of free will an enduring problem for philosophy.

We know, in fact, that we sometimes feel responsible for events over which we have no causal influence. Given the right experimental manipulations, people can be led to believe that they consciously intended an action when they neither chose it nor had control over their movements. In one experiment, subjects were asked to select pictures on a screen using a computer’s cursor. They tended to believe that they had intentionally guided the cursor to a specific image even when it was under the full control of another person, as long as they heard the name of the image just before the cursor stopped. People who are susceptible to hypnosis can be given elaborate suggestions to perform odd tasks, and when asked why they have done these things, many will confabulate—giving reasons for their behavior that have nothing to do with its actual cause.

How can we be “free” as conscious agents if everything that we consciously intend is caused by events in our brain that we do not intend and of which we are entirely unaware? We can’t. To say that “my brain” decided to think or act in a particular way, whether consciously or not, and that this is the basis for my freedom, is to ignore the very source of our belief in free will: the feeling of conscious agency. People feel that they are the authors of their thoughts and actions, and this is the only reason why there seems to be a problem of free will worth talking about.

Consequently, some scientists and philosophers hope that chance or quantum uncertainty can make room for free will.

The sound of the leaf blower intrudes, but I can seize the spotlight of my attention in the next moment and aim it elsewhere. This difference between nonvolitional and volitional states of mind is reflected at the level of the brain—for they are governed by different systems. And the difference between them must, in part, produce the felt sense that there is a conscious self endowed with freedom of will.

The phrase “free will” describes what it feels like to identify with certain mental states as they arise in consciousness. Thoughts like “What should I get my daughter for her birthday? I know—I’ll take her to a pet store and have her pick out some tropical fish” convey the apparent reality of choices, freely made. But from a deeper perspective (speaking both objectively and subjectively), thoughts simply arise unauthored and yet author our actions.

And we know that the brain systems that allow us to reflect upon our experience are different from those involved when we automatically react to stimuli. So consciousness, in this sense, is not inconsequential.

As Dan Dennett and many others have pointed out, people generally confuse determinism with fatalism. This gives rise to questions like “If everything is determined, why should I do anything? Why not just sit back and see what happens?” This is pure confusion. To sit back and see what happens is itself a choice that will produce its own consequences. It is also extremely difficult to do: just try staying in bed all day waiting for something to happen.

Decisions, intentions, efforts, goals, willpower, etc., are causal states of the brain, leading to specific behaviors, and behaviors lead to outcomes in the world. Human choice, therefore, is as important as fanciers of free will believe. But the next choice you make will come out of the darkness of prior causes that you, the conscious witness of your experience, did not bring into being.

You are not in control of your mind—because you, as a conscious agent, are only part of your mind, living at the mercy of other parts. You can do what you decide to do—but you cannot decide what you will decide to do.

Many people worry that free will is a necessary illusion—and that without it we will fail to live creative and fulfilling lives. This concern isn’t entirely unjustified. One study found that having subjects read an argument against the existence of free will made them more likely to cheat on a subsequent exam. Another found such subjects to be less helpful and more aggressive.

Speaking from personal experience, I think that losing the sense of free will has only improved my ethics—by increasing my feelings of compassion and forgiveness, and diminishing my sense of entitlement to the fruits of my own good luck.

Our interests in life are not always served by viewing people and things as collections of atoms—but this doesn’t negate the truth or utility of physics.

Becoming sensitive to the background causes of one’s thoughts and feelings can—paradoxically—allow for greater creative control over one’s life. It is one thing to bicker with your wife because you are in a bad mood; it is another to realize that your mood and behavior have been caused by low blood sugar. This understanding reveals you to be a biochemical puppet, of course, but it also allows you to grab hold of one of your strings: A bite of food may be all that your personality requires. Getting behind our conscious thoughts and feelings can allow us to steer a more intelligent course through our lives (while knowing, of course, that we are ultimately being steered).

The great worry, of course, is that an honest discussion of the underlying causes of human behavior appears to leave no room for moral responsibility. If we view people as neuronal weather patterns, how can we coherently speak about right and wrong or good and evil?

To say that I was responsible for my behavior is simply to say that what I did was sufficiently in keeping with my thoughts, intentions, beliefs, and desires to be considered an extension of them. If I had found myself standing in the market naked, intent upon stealing as many tins of anchovies as I could carry, my behavior would be totally out of character; I would feel that I was not in my right mind, or that I was otherwise not responsible for my actions.

And it works this miracle even if the man’s subjective experience was identical to that of the psychopath in case 4—for the moment we understand that his feelings had a physical cause, a brain tumor, we cannot help seeing him as a victim of his own biology.

What we condemn most in another person is the conscious intention to do harm. Degrees of guilt can still be judged by reference to the facts of a case: the personality of the accused, his prior offenses, his patterns of association with others, his use of intoxicants, his confessed motives with regard to the victim, etc. If a person’s actions seem to have been entirely out of character, this might influence our view of the risk he now poses to others. If the accused appears unrepentant and eager to kill again, we need entertain no notions of free will to consider him a danger to society.

Once we recognize that even the most terrifying predators are, in a very real sense, unlucky to be who they are, the logic of hating (as opposed to fearing) them begins to unravel. Once again, even if you believe that every human being harbors an immortal soul, the picture does not change: Anyone born with the soul of a psychopath has been profoundly unlucky.

Our system of justice should reflect an understanding that any of us could have been dealt a very different hand in life. In fact, it seems immoral not to recognize just how much luck is involved in morality itself.

The urge for retribution depends upon our not seeing the underlying causes of human behavior.

Viewing human beings as natural phenomena need not damage our system of criminal justice. If we could incarcerate earthquakes and hurricanes for their crimes, we would build prisons for them as well. We fight emerging epidemics—and even the occasional wild animal—without attributing free will to them.

Clearly, vengeance answers to a powerful psychological need in many of us. We are deeply disposed to perceive people as the authors of their actions, to hold them responsible for the wrongs they do us, and to feel that these transgressions must be punished.

However, it may be that a sham form of retribution would still be moral—even necessary—if it led people to behave better than they otherwise would.

Even if you have struggled to make the most of what nature gave you, you must still admit that your ability and inclination to struggle is part of your inheritance. How much credit does a person deserve for not being lazy? None at all. Laziness, like diligence, is a neurological condition. Of course, conservatives are right to think that we must encourage people to work to the best of their abilities and discourage free riders wherever we can. And it is wise to hold people responsible for their actions when doing so influences their behavior and brings benefit to society. But this does not mean that we must be taken in by the illusion of free will.

We need only acknowledge that efforts matter and that people can change. We do not change ourselves, precisely—because we have only ourselves with which to do the changing—but we continually influence, and are influenced by, the world around us and the world within us. It may seem paradoxical to hold people responsible for what happens in their corner of the universe, but once we break the spell of free will, we can do this precisely to the degree that it is useful.

Not only are we not as free as we think we are—we do not feel as free as we think we do. Our sense of our own freedom results from our not paying close attention to what it is like to be us. The moment we pay attention, it is possible to see that free will is nowhere to be found, and our experience is perfectly compatible with this truth.