Adam Grant – Think Again

With all due respect to the lessons of experience, I prefer the rigor of evidence. When a trio of psychologists conducted a comprehensive review of thirty-three studies, they found that in every one, the majority of answer revisions were from wrong to right. Yet most test-takers hesitate to revise, convinced that their first answer is best, a misconception known as the first-instinct fallacy.

Part of the problem is cognitive laziness. Some psychologists point out that we’re mental misers: we often prefer the ease of hanging on to old views over the difficulty of grappling with new ones. Yet there are also deeper forces behind our resistance to rethinking. Questioning ourselves makes the world more unpredictable. It requires us to admit that the facts may have changed, that what was once right may now be wrong. Reconsidering something we believe deeply can threaten our identities, making it feel as if we’re losing a part of ourselves.

When it comes to our knowledge and opinions, though, we tend to stick to our guns. Psychologists call this seizing and freezing. We favor the comfort of conviction over the discomfort of doubt, and we let our beliefs get brittle long before our bones. We laugh at people who still use Windows 95, yet we still cling to opinions that we formed in 1995. We listen to views that make us feel good, instead of ideas that make us think hard.

Mike Lazaridis dreamed up the idea for the BlackBerry as a wireless communication device for sending and receiving emails. As of the summer of 2009, it accounted for nearly half of the U.S. smartphone market. By 2014, its market share had plummeted to less than 1 percent.

When a company takes a nosedive like that, we can never pinpoint a single cause of its downfall, so we tend to anthropomorphize it: BlackBerry failed to adapt. Yet adapting to a changing environment isn’t something a company does—it’s something people do in the multitude of decisions they make every day.

Most of us take pride in our knowledge and expertise, and in staying true to our beliefs and opinions. That makes sense in a stable world, where we get rewarded for having conviction in our ideas. The problem is that we live in a rapidly changing world, where we need to spend as much time rethinking as we do thinking.

With advances in access to information and technology, knowledge isn’t just increasing. It’s increasing at an increasing rate. In 2011, you consumed about five times as much information per day as you would have just a quarter century earlier. As of 1950, it took about fifty years for knowledge in medicine to double. By 1980, medical knowledge was doubling every seven years, and by 2010, it was doubling in half that time.

Researchers have recently discovered that we need to rethink widely accepted assumptions about such subjects as Cleopatra’s roots (her father was Greek, not Egyptian, and her mother’s identity is unknown); the appearance of dinosaurs (paleontologists now think some tyrannosaurs had colorful feathers on their backs); and what’s required for sight (blind people have actually trained themselves to “see”—sound waves can activate the visual cortex and create representations in the mind’s eye, much like how echolocation helps bats navigate in the dark). Vintage records, classic cars, and antique clocks might be valuable collectibles, but outdated facts are mental fossils that are best abandoned.

Two decades ago my colleague Phil Tetlock discovered something peculiar. As we think and talk, we often slip into the mindsets of three different professions: preachers, prosecutors, and politicians. In each of these modes, we take on a particular identity and use a distinct set of tools. We go into preacher mode when our sacred beliefs are in jeopardy: we deliver sermons to protect and promote our ideals. We enter prosecutor mode when we recognize flaws in other people’s reasoning: we marshal arguments to prove them wrong and win our case. We shift into politician mode when we’re seeking to win over an audience: we campaign and lobby for the approval of our constituents. The risk is that we become so wrapped up in preaching that we’re right, prosecuting others who are wrong, and politicking for support that we don’t bother to rethink our own views.

The entrepreneurs arrived in Milan for a training program in entrepreneurship. Over the course of four months, they learned to create a business strategy, interview customers, build a minimum viable product, and then refine a prototype. What they didn’t know was that they’d been randomly assigned to either a “scientific thinking” group or a control group. The training for both groups was identical, except that one was encouraged to view startups through a scientist’s goggles.

From that perspective, their strategy is a theory, customer interviews help to develop hypotheses, and their minimum viable product and prototype are experiments to test those hypotheses. Their task is to rigorously measure the results and make decisions based on whether their hypotheses are supported or refuted.

Over the following year, the startups in the control group averaged under $300 in revenue. The startups in the scientific thinking group averaged over $12,000 in revenue. They brought in revenue more than twice as fast—and attracted customers sooner, too.

Mental horsepower doesn’t guarantee mental dexterity. No matter how much brainpower you have, if you lack the motivation to change your mind, you’ll miss many occasions to think again. Research reveals that the higher you score on an IQ test, the more likely you are to fall for stereotypes, because you’re faster at recognizing patterns. And recent experiments suggest that the smarter you are, the more you might struggle to update your beliefs.

My favorite bias is the “I’m not biased” bias, in which people believe they’re more objective than others. It turns out that smart people are more likely to fall into this trap. The brighter you are, the harder it can be to see your own limitations.

In the case of the BlackBerry, Mike Lazaridis was trapped in an overconfidence cycle. Taking pride in his successful invention gave him too much conviction. Nowhere was that clearer than in his preference for the keyboard over a touchscreen. It was a BlackBerry virtue he loved to preach—and an Apple vice he was quick to prosecute.

The legend of Apple’s renaissance revolves around the lone genius of Steve Jobs. It was his conviction and clarity of vision, the story goes, that gave birth to the iPhone. The reality is that he was dead-set against the mobile phone category. His employees had the vision for it, and it was their ability to change his mind that really revived Apple. Although Jobs knew how to “think different,” it was his team that did much of the rethinking.

Research shows that when people are resistant to change, it helps to reinforce what will stay the same. Visions for change are more compelling when they include visions of continuity. Although our strategy might evolve, our identity will endure.

You’ve probably met some football fans who are convinced they know more than the coaches on the sidelines. That’s the armchair quarterback syndrome, where confidence exceeds competence.

The opposite of armchair quarterback syndrome is impostor syndrome, where competence exceeds confidence.

We’re all novices at many things, but we’re not always blind to that fact. We tend to overestimate ourselves on desirable skills, like the ability to carry on a riveting conversation. We’re also prone to overconfidence in situations where it’s easy to confuse experience for expertise, like driving, typing, trivia, and managing emotions. Yet we underestimate ourselves when we can easily recognize that we lack experience—like painting, driving a race car, and rapidly reciting the alphabet backward. Absolute beginners rarely fall into the Dunning-Kruger trap.

It’s when we progress from novice to amateur that we become overconfident. A bit of knowledge can be a dangerous thing.

“Arrogance is ignorance plus conviction,” blogger Tim Urban explains. “While humility is a permeable filter that absorbs life experience and converts it into knowledge and wisdom, arrogance is a rubber shield that life experience simply bounces off of.”

Humility is often misunderstood. It’s not a matter of having low self-confidence. One of the Latin roots of humility means “from the earth.” It’s about being grounded—recognizing that we’re flawed and fallible.

You can be confident in your ability to achieve a goal in the future while maintaining the humility to question whether you have the right tools in the present. That’s the sweet spot of confidence.

Uncertainty primes us to ask questions and absorb new ideas. It protects us against the Dunning-Kruger effect. “Impostor syndrome always keeps me on my toes and growing because I never think I know it all,” Halla reflects, sounding more like a scientist than a politician.

Arrogance leaves us blind to our weaknesses. Humility is a reflective lens: it helps us see them clearly. Confident humility is a corrective lens: it enables us to overcome those weaknesses.

I found a Nobel Prize–winning scientist and two of the world’s top election forecasters. They aren’t just comfortable being wrong; they actually seem to be thrilled by it. I think they can teach us something about how to be more graceful and accepting in moments when we discover that our beliefs might not be true. The goal is not to be wrong more often. It’s to recognize that we’re all wrong more often than we’d like to admit, and the more we deny it, the deeper the hole we dig for ourselves.

In a classic paper, sociologist Murray Davis argued that when ideas survive, it’s not because they’re true—it’s because they’re interesting. What makes an idea interesting is that it challenges our weakly held opinions.

When a core belief is questioned, though, we tend to shut down rather than open up. It’s as if there’s a miniature dictator living inside our heads, controlling the flow of facts to our minds, much like Kim Jong-un controls the press in North Korea. The technical term for this in psychology is the totalitarian ego, and its job is to keep out threatening information.

It’s easy to see how an inner dictator comes in handy when someone attacks our character or intelligence. Those kinds of personal affronts threaten to shatter aspects of our identities that are important to us and might be difficult to change. The totalitarian ego steps in like a bodyguard for our minds, protecting our self-image by feeding us comforting lies.

As physicist Richard Feynman quipped, “You must not fool yourself—and you are the easiest person to fool.”

Neuroscientists find that when our core beliefs are challenged, it can trigger the amygdala, the primitive “lizard brain” that breezes right past cool rationality and activates a hot fight-or-flight response. The anger and fear are visceral: it feels as if we’ve been punched in the mind. The totalitarian ego comes to the rescue with mental armor.

Discovering I was wrong felt joyful because it meant I’d learned something. As Daniel Kahneman told me, “Being wrong is the only way I feel sure I’ve learned anything.”

The students who found it stressful didn’t know how to detach. Their opinions were their identities. An assault on their worldviews was a threat to their very sense of self.

A few years ago I surveyed hundreds of new teams in Silicon Valley on conflict several times during their first six months working together. Even if they argued constantly and agreed on nothing else, they agreed on what kind of conflict they were having. When their projects were finished, I asked their managers to evaluate each team’s effectiveness. The teams that performed poorly started with more relationship conflict than task conflict. They entered into personal feuds early on and were so busy disliking one another that they didn’t feel comfortable challenging one another. It took months for many of the teams to make real headway on their relationship issues, and by the time they did manage to debate key decisions, it was often too late to rethink their directions.

“The absence of conflict is not harmony, it’s apathy.”

Although productive disagreement is a critical life skill, it’s one that many of us never fully develop. The problem starts early: parents disagree behind closed doors, fearing that conflict will make children anxious or somehow damage their character. Yet research shows that how often parents argue has no bearing on their children’s academic, social, or emotional development. What matters is how respectfully parents argue, not how frequently. Kids whose parents clash constructively feel more emotionally safe in elementary school, and over the next few years they actually demonstrate more helpfulness and compassion toward their classmates.

In a classic study, highly creative architects were more likely than their technically competent but less original peers to come from homes with plenty of friction. They often grew up in households that were “tense but secure,” as psychologist Robert Albert notes: “The creative person-to-be comes from a family that is anything but harmonious, one with a ‘wobble.’”

Disagreeable people tend to be more critical, skeptical, and challenging—and they’re more likely than their peers to become engineers and lawyers. They’re not just comfortable with conflict; it energizes them. If you’re highly disagreeable, you might be happier in an argument than in a friendly conversation.

In one experiment, when people were criticized rather than praised by a partner, they were over four times more likely to request a new partner. Across a range of workplaces, when employees received tough feedback from colleagues, their default response was to avoid those coworkers or drop them from their networks altogether—and their performance suffered over the following year.

Agreeableness is about seeking social harmony, not cognitive consensus. It’s possible to disagree without being disagreeable. Although I’m terrified of hurting other people’s feelings, when it comes to challenging their thoughts, I have no fear. In fact, when I argue with someone, it’s not a display of disrespect—it’s a sign of respect.

My favorite demonstration is an experiment by my colleagues Jennifer Chatman and Sigal Barsade. Agreeable people were significantly more accommodating than disagreeable ones—as long as they were in a cooperative team. When they were assigned to a competitive team, they acted just as disagreeably as their disagreeable teammates.

When they argued about the propeller, the Wright brothers were making a common mistake. Each was preaching about why he was right and why the other was wrong. When we argue about why, we run the risk of becoming emotionally attached to our positions and dismissive of the other side’s. We’re more likely to have a good fight if we argue about how.

One difference was visible before anyone even arrived at the bargaining table. Prior to the negotiations, the researchers interviewed both groups about their plans. The average negotiators went in armed for battle, hardly taking note of any anticipated areas of agreement. The experts, in contrast, mapped out a series of dance steps they might be able to take with the other side, devoting more than a third of their planning comments to finding common ground.

The more reasons we put on the table, the easier it is for people to discard the shakiest one. Once they reject one of our justifications, they can easily dismiss our entire case.

Harish started by emphasizing common ground. When he took the stage for his rebuttal, he immediately drew attention to his and Debra’s areas of agreement. “So,” he began, “I think we disagree on far less than it may seem.” He called out their alignment on the problem of poverty—and on the validity of some of the studies—before objecting to subsidies as a solution.

Most people immediately start with a straw man, poking holes in the weakest version of the other side’s case. He does the reverse: he considers the strongest version of their case, which is known as the steel man.

“If you have too many arguments, you’ll dilute the power of each and every one,” he told me. “They are going to be less well explained, and I don’t know if any of them will land enough—I don’t think the audience will believe them to be important enough. Most top debaters aren’t citing a lot of information.”

If they’re not invested in the issue or they’re receptive to our perspective, more reasons can help: people tend to see quantity as a sign of quality. The more the topic matters to them, the more the quality of reasons matters. It’s when audiences are skeptical of our view, have a stake in the issue, and tend to be stubborn that piling on justifications is most likely to backfire. If they’re resistant to rethinking, more reasons simply give them more ammunition to shoot our views down.

Psychologists have long found that the person most likely to persuade you to change your mind is you. You get to pick the reasons you find most compelling, and you come away with a real sense of ownership over them.

In a heated argument, you can always stop and ask, “What evidence would change your mind?” If the answer is “nothing,” then there’s no point in continuing the debate. You can lead a horse to water, but you can’t make it think.

A few years ago, I argued in my book Originals that if we want to fight groupthink, it helps to have “strong opinions, weakly held.” Since then I’ve changed my mind—I now believe that’s a mistake. If we hold an opinion weakly, expressing it strongly can backfire. Communicating it with some uncertainty signals confident humility, invites curiosity, and leads to a more nuanced discussion.

Research shows that in courtrooms, expert witnesses and deliberating jurors are more credible and more persuasive when they express moderate confidence, rather than high or low confidence.

In every human society, people are motivated to seek belonging and status. Identifying with a group checks both boxes at the same time: we become part of a tribe, and we take pride when our tribe wins. In classic studies on college campuses, psychologists found that after their team won a football game, students were more likely to walk around wearing school swag.

Socially, there’s another reason stereotypes are so sticky. We tend to interact with people who share them, which makes them even more extreme. This phenomenon is called group polarization, and it’s been demonstrated in hundreds of experiments.

Citizens who start out with a clear belief on affirmative action and gay marriage develop more extreme views on these issues after talking with a few others who share their stance. Their preaching and prosecuting move in the direction of their politics. Polarization is reinforced by conformity: peripheral members fit in and gain status by following the lead of the most prototypical member of the group, who often holds the most intense views.

Upon returning from space, astronauts are less focused on individual achievements and personal happiness, and more concerned about the collective good. “You develop an instant global consciousness . . . an intense dissatisfaction with the state of the world, and a compulsion to do something about it,” Apollo 14 astronaut Edgar Mitchell reflected. “From out there on the moon, international politics looks so petty. You want to grab a politician by the scruff of the neck and drag him a quarter of a million miles out and say, ‘Look at that, you son of a b*tch.’”

In one experiment, psychologists randomly assigned Manchester United soccer fans a short writing task. They then staged an emergency in which a passing runner slipped and fell, screaming in pain as he held his ankle. He was wearing the T-shirt of their biggest rival, and the question was whether they would stop to help him. If the soccer fans had just written about why they loved their team, only 30 percent helped. If they had written about what they had in common with other soccer fans, 70 percent helped.

In an ideal world, learning about individual group members will humanize the group, but often getting to know a person better just establishes her as different from the rest of her group. When we meet group members who defy a stereotype, our first instinct isn’t to see them as exemplars and rethink the stereotype. It’s to see them as exceptions and cling to our existing beliefs.

We found that it was thinking about the arbitrariness of their animosity—not the positive qualities of their rival—that mattered. Regardless of whether they generated reasons to like their rivals, fans showed less hostility when they reflected on how silly the rivalry was.

Research suggests that there are more similarities between groups than we recognize. And there’s typically more variety within groups than between them.

Since people held unfounded fears about vaccines, it was time to educate them with a dose of the truth. The results were often disappointing. In a pair of experiments in Germany, introducing people to the research on vaccine safety backfired: they ended up seeing vaccines as riskier. Similarly, when Americans read accounts of the dangers of measles, saw pictures of children suffering from it, or learned of an infant who nearly died from it, their interest in vaccination didn’t rise at all. And when they were informed that there was no evidence that the measles vaccine causes autism, those who already had concerns actually became less interested in vaccinating.

Together, they developed the core principles of a practice called motivational interviewing. The central premise is that we can rarely motivate someone else to change. We’re better off helping them find their own motivation to change.

Motivational interviewing starts with an attitude of humility and curiosity. We don’t know what might motivate someone else to change, but we’re genuinely eager to find out.

Before Marie-Hélène left the hospital, she had Tobie vaccinated. A key turning point, she recalls, was when Arnaud “told me that whether I chose to vaccinate or not, he respected my decision as someone who wanted the best for my kids. Just that sentence—to me, it was worth all the gold in the world.”

Overall, motivational interviewing has a statistically and clinically meaningful effect on behavior change in roughly three out of four studies, and psychologists and physicians using it have a success rate of four in five. There aren’t many practical theories in the behavioral sciences with a body of evidence this robust.

Motivational interviewing isn’t limited to professional settings—it’s relevant to everyday decisions and interactions. One day a friend called me for advice on whether she should get back together with her ex. I was a fan of the idea, but I didn’t think it was my place to tell her what to do. Instead of offering my opinion, I asked her to walk through the pros and cons and tell me how they stacked up against what she wanted in a partner. She ended up talking herself into rekindling the relationship. The conversation felt like magic, because I hadn’t tried to persuade her or even given any advice.

To protect their freedom, instead of giving commands or offering recommendations, a motivational interviewer might say something along the lines of “Here are a few things that have helped me—do you think any of them might work for you?”

Motivational interviewing pioneers Miller and Rollnick have long warned that the technique shouldn’t be used manipulatively. Psychologists have found that when people detect an attempt at influence, they have sophisticated defense mechanisms. The moment people feel that we’re trying to persuade them, our behavior takes on a different meaning. A straightforward question is seen as a political tactic, a reflective listening statement comes across as a prosecutor’s maneuvering, an affirmation of their ability to change sounds like a preacher’s proselytizing.

We can all get better at asking “truly curious questions that don’t have the hidden agenda of fixing, saving, advising, convincing or correcting,” journalist Kate Murphy writes, and helping to “facilitate the clear expression of another person’s thoughts.”

As Betty muses, “Even the devil appreciates being listened to.”

Inverse charisma. What a wonderful turn of phrase to capture the magnetic quality of a great listener.

Hearing an opposing opinion doesn’t necessarily motivate you to rethink your own stance; it makes it easier for you to stick to your guns (or your gun bans). Presenting two extremes isn’t the solution; it’s part of the polarization problem. Psychologists have a name for this: binary bias. It’s a basic human tendency to seek clarity and closure by simplifying a complex continuum into two categories. To paraphrase the humorist Robert Benchley, there are two kinds of people: those who divide the world into two kinds of people, and those who don’t.

An antidote to this proclivity is complexifying: showcasing the range of perspectives on a given topic. We might believe we’re making progress by discussing hot-button issues as two sides of a coin, but people are actually more inclined to think again if we present these topics through the many lenses of a prism. To borrow a phrase from Walt Whitman, it takes a multitude of views to help people realize that they too contain multitudes.

If people read the binary version of the article, they defended their own perspective more often than they showed an interest in their opponent’s. If they read the complexified version, they made about twice as many comments about common ground as about their own views. They asserted fewer opinions and asked more questions. At the end of the conversation, they generated more sophisticated, higher-quality position statements—and both parties came away more satisfied.

Yet by 2018, only 59 percent of Americans saw climate change as a major threat—and 16 percent believed it wasn’t a threat at all.

To overcome binary bias, a good starting point is to become aware of the range of perspectives across a given spectrum. Polls suggest that on climate change, there are at least six camps of thought. Believers represent more than half of Americans, but some are concerned while others are alarmed. The so-called nonbelievers actually range from cautious to disengaged to doubtful to dismissive.

Although no more than 10 percent of Americans are dismissive of climate change, it’s these rare deniers who get the most press. In an analysis of some hundred thousand media articles on climate change between 2000 and 2016, prominent climate contrarians received disproportionate coverage: they were featured 49 percent more often than expert scientists. As a result, people end up overestimating how common denial is—which in turn makes them more hesitant to advocate for policies that protect the environment.

And multiple experiments have shown that when experts express doubt, they become more persuasive. When someone knowledgeable admits uncertainty, it surprises people, and they end up paying more attention to the substance of the argument.

In a series of experiments, psychologists demonstrated that when news reports about science included caveats, they succeeded in capturing readers’ interest and keeping their minds open.

If Peterson had bothered to read the comprehensive meta-analyses of studies spanning nearly two hundred jobs, he’d have discovered that—contrary to his claims—emotional intelligence is real and it does matter. Emotional intelligence tests predict performance even after controlling for IQ and personality. If Goleman hadn’t ignored those same data, he’d have learned that if you want to predict performance across jobs, IQ is more than twice as important as emotional intelligence (which accounts for only 3 to 8 percent of performance).

In a pair of experiments, randomly assigning people to reflect on the intentions and interests of their political opposites made them less receptive to rethinking their own attitudes on health care and universal basic income. Across twenty-five experiments, imagining other people’s perspectives failed to elicit more accurate insights—and occasionally made participants more confident in their own inaccurate judgments. Perspective-taking consistently fails because we’re terrible mind readers. We’re just guessing.

What works is not perspective-taking but perspective-seeking: actually talking to people to gain insight into the nuances of their views. That’s what good scientists do: instead of drawing conclusions about people based on minimal clues, they test their hypotheses by striking up conversations.

My favorite assignment of Erin’s is her final one. As a passionate champion of inquiry-based learning, she sends her eighth graders off to do self-directed research in which they inspect, investigate, interrogate, and interpret. Their active learning culminates in a group project: they pick a chapter from their textbook, choosing a time period that interests them and a theme in history that they see as underrepresented. Then they go off to rewrite it.

Evidence shows that if false scientific beliefs aren’t addressed in elementary school, they become harder to change later. “Learning counter-intuitive scientific ideas [is] akin to becoming a fluent speaker of a second language,” psychologist Deborah Kelemen writes. It’s “a task that becomes increasingly difficult the longer it is delayed, and one that is almost never achieved with only piecemeal instruction and infrequent practice.”

In a curriculum developed at Stanford, high school students are encouraged to critically examine what really caused the Spanish-American War, whether the New Deal was a success, and why the Montgomery bus boycott was a watershed moment. Some teachers even send students out to interview people with whom they disagree. The focus is less on being right, and more on building the skills to consider different views and argue productively about them.

Rethinking needs to become a regular habit. Unfortunately, traditional methods of education don’t always allow students to form that habit.

In this experiment the topic doesn’t matter: the teaching method is what shapes your experience. I expected active learning to win the day, but the data suggest that you and your roommate will both enjoy the subject more when it’s delivered by lecture. You’ll also rate the instructor who lectures as more effective—and you’ll be more likely to say you wish all your physics courses were taught that way.

In the physics experiment, the students took tests to gauge how much they had learned about statics and fluids. Despite enjoying the lectures more, they actually gained more knowledge and skill from the active-learning session. It required more mental effort, which made it less fun but led to deeper understanding.

A meta-analysis compared the effects of lecturing and active learning on students’ mastery of the material, combining 225 studies with over 46,000 undergraduates in science, technology, engineering, and math (STEM). Active-learning methods included group problem solving, worksheets, and tutorials. On average, students scored half a letter grade worse under traditional lecturing than through active learning, and they were 1.55 times more likely to fail in classes with traditional lecturing.

In North American universities, more than half of STEM professors spend at least 80 percent of their time lecturing, just over a quarter incorporate bits of interactivity, and fewer than a fifth use truly student-centered methods that involve active learning.

It turns out that although perfectionists are more likely than their peers to ace school, they don’t perform any better than their colleagues at work. This tracks with evidence that, across a wide range of industries, grades are not a strong predictor of job performance.

Achieving excellence in school often requires mastering old ways of thinking. Building an influential career demands new ways of thinking. In a classic study of highly accomplished architects, the most creative ones graduated with a B average. Their straight-A counterparts were so determined to be right that they often failed to take the risk of rethinking the orthodoxy. A similar pattern emerged in a study of students who graduated at the top of their class. “Valedictorians aren’t likely to be the future’s visionaries,” education researcher Karen Arnold explains. “They typically settle into the system instead of shaking it up.”

The following year, the class’s favorite idea took that rethinking a step further: the students hosted a day of “passion talks” on which anyone could teach the class about something he or she loved. We learned how to beatbox and design buildings that mesh with nature and make the world more allergy-safe. From that point on, sharing passions has been part of class participation. All the students give a passion talk as a way of introducing themselves to their peers. Year after year, they tell me that it injects a heightened level of curiosity into the room, leaving them eager to soak up insights from each of their classmates.

When I was involved in a study at Google to identify the factors that distinguish teams with high performance and well-being, the most important differentiator wasn’t who was on the team or even how meaningful their work was. What mattered most was psychological safety.

I knew that changing the culture of an entire organization is daunting, while changing the culture of a team is more feasible. It starts with modeling the values we want to promote, identifying and praising others who exemplify them, and building a coalition of colleagues who are committed to making the change.

Some of the power distance evaporated—they were more likely to reach out to Melinda and other senior leaders with both criticism and compliments. One employee commented: “In that video Melinda did something that I’ve not yet seen happen at the foundation: she broke through the veneer. It happened for me when she said, ‘I go into so many meetings where there are things I don’t know.’ I had to write that down because I was shocked and grateful at her honesty. Later, when she laughed, like really belly-laughed, and then answered the hard comments, the veneer came off again and I saw that she was no less of Melinda Gates, but actually, a whole lot more of Melinda Gates.”

Organizational learning should be an ongoing activity, but best practices imply it has reached an endpoint. We might be better off looking for better practices.

Focusing on results might be good for short-term performance, but it can be an obstacle to long-term learning. Sure enough, social scientists find that when people are held accountable only for whether the outcome was a success or failure, they are more likely to continue with ill-fated courses of action. Exclusively praising and rewarding results is dangerous because it breeds overconfidence in poor strategies, incentivizing people to keep doing things the way they’ve always done them.

Process accountability might sound like the opposite of psychological safety, but they’re actually independent. Amy Edmondson finds that when psychological safety exists without accountability, people tend to stay within their comfort zone, and when there’s accountability but not safety, people tend to stay silent in an anxiety zone. When we combine the two, we create a learning zone.

When we dedicate ourselves to a plan and it isn’t going as we hoped, our first instinct isn’t usually to rethink it. Instead, we tend to double down and sink more resources into the plan. This pattern is called escalation of commitment. Evidence shows that entrepreneurs persist with failing strategies when they should pivot, NBA general managers and coaches keep investing in new contracts and more playing time for draft busts, and politicians continue sending soldiers to wars that didn’t need to be fought in the first place. Sunk costs are a factor, but the most important causes appear to be psychological rather than economic. Escalation of commitment happens because we’re rationalizing creatures, constantly searching for self-justifications for our prior beliefs as a way to soothe our egos, shield our images, and validate our past decisions.

In some ways, identity foreclosure is the opposite of an identity crisis: instead of accepting uncertainty about who we want to become, we develop compensatory conviction and plunge head over heels into a career path. I’ve noticed that the students who are the most certain about their career plans at twenty are often the ones with the deepest regrets by thirty. They haven’t done enough rethinking along the way.

A first step is to entertain possible selves: identify some people you admire within or outside your field, and observe what they actually do at work day by day. A second step is to develop hypotheses about how these paths might align with your own interests, skills, and values. A third step is to test out the different identities by running experiments: do informational interviews, job shadowing, and sample projects to get a taste of the work. The goal is not to confirm a particular plan but to expand your repertoire of possible selves—which keeps you open to rethinking.

A second likely culprit is that we spend too much time striving for peak happiness, overlooking the fact that happiness depends more on the frequency of positive emotions than on their intensity.

A third potential factor is that when we hunt for happiness, we overemphasize pleasure at the expense of purpose. This theory is consistent with data suggesting that meaning is healthier than happiness.

Psychologists find that passions are often developed, not discovered. In a study of entrepreneurs, the more effort they put into their startups, the more their enthusiasm about their businesses climbed each week. Their passion grew as they gained momentum and mastery. Interest doesn’t always lead to effort and skill; sometimes it follows them.

“Those only are happy,” philosopher John Stuart Mill wrote, “who have their minds fixed on some object other than their own happiness; on the happiness of others, on the improvement of mankind, even on some art or pursuit, followed not as a means, but as itself an ideal end. Aiming thus at something else, they find happiness by the way.”

It takes humility to reconsider our past commitments, doubt to question our present decisions, and curiosity to reimagine our future plans. What we discover along the way can free us from the shackles of our familiar surroundings and our former selves. Rethinking liberates us to do more than update our knowledge and opinions—it’s a tool for leading a more fulfilling life.

Define your identity in terms of values, not opinions. It’s easier to avoid getting stuck to your past beliefs if you don’t become attached to them as part of your present self-concept. See yourself as someone who values curiosity, learning, mental flexibility, and searching for knowledge. As you form opinions, keep a list of factors that would change your mind.

Don’t shy away from constructive conflict. Disagreements don’t have to be disagreeable. Although relationship conflict is usually counterproductive, task conflict can help you think again. Try framing disagreement as a debate: people are more likely to approach it intellectually and less likely to take it personally.

Ask “What evidence would change your mind?” You can’t bully someone into agreeing with you. It’s often more effective to inquire about what would open their minds, and then see if you can convince them on their own terms.

Ask how people originally formed an opinion. Many of our opinions, like our stereotypes, are arbitrary; we’ve developed them without rigorous data or deep reflection. To help people reevaluate, prompt them to consider how they’d believe different things if they’d been born at a different time or in a different place.

Acknowledge common ground. A debate is like a dance, not a war. Admitting points of convergence doesn’t make you weaker—it shows that you’re willing to negotiate about what’s true, and it motivates the other side to consider your point of view.