#8 Julia Galef: The Art of Changing Minds
Shane Parrish interviews Julia Galef, co-founder of the Center for Applied Rationality, about rationality, changing one's own mind and others', and filtering information. They discuss strategies for improving reasoning and decision-making.
Deep Dive Analysis
Topic Outline
Origins of Interest in Rationality: Parental Influence
Mission and Work of the Center for Applied Rationality (CFAR)
Defining Rationality: Epistemic vs. Instrumental
The Role of Intuition and Emotion in Rationality
Connection Between Art Appreciation and Rationality
Strategies for Changing Your Own Mind
Processing Information Rationally in a High-Information World
The Importance of Emotional Self-Awareness in Rational Thinking
Ethical and Unethical Strategies for Changing Other People's Minds
Combating Misinformation and Creating Uncertainty
Recommended Books for Profound Impact
Key Concepts
Epistemic Rationality
This type of rationality is about using reasoning processes that systematically lead to a more accurate model of how the world works. It acknowledges that perfect certainty is impossible due to limited information and time, but some processes are more reliable than others for building accurate beliefs.
Instrumental Rationality
This refers to making choices that are most likely to achieve your goals, given the best information available. Goals in this context are broad and can include anything one values, such as happiness, friendships, or making a positive impact on the world.
System One Thinking
This is the intuitive, fast, and automatic part of the mind, which is excellent at processing large amounts of information quickly, often unconsciously. While indispensable for daily functioning, it can be fallible and prone to biases.
System Two Thinking
This is the more recently evolved part of the mind responsible for slow, laborious, and effortful processes like logical reasoning, weighing abstract trade-offs, performing calculations, and long-term planning. It can do things System One cannot, but it requires more cognitive effort.
Straw Vulcan
This is an archetype or trope representing a weak caricature of rationality, often seen in fictional characters like Spock. Such characters are portrayed as logical but frequently make miscalculations because they irrationally expect others to behave rationally, despite evidence to the contrary.
Trigger-Action Plan
This is a concrete, habit-based approach designed to translate abstract advice into actionable steps for improving reasoning. It involves identifying a specific cue or trigger and then installing a predefined action to manifest a principle like open-mindedness, even if it initially requires conscious effort.
Questions Answered
What is the Center for Applied Rationality, and what is its mission?
CFAR is a nonprofit organization co-founded by Julia Galef, dedicated to developing and training people in strategies for reasoning and decision-making. Its mission is to improve default human processes for reasoning and decision-making with an eye towards having a positive impact on the world.
How did Julia Galef become interested in rationality?
Julia Galef's interest in rationality stemmed from her intellectually curious parents, who were unusually good at changing their minds when presented with good arguments or new facts. Her father also encouraged skeptical thinking by sometimes making up answers to her questions, prompting her to figure out discrepancies herself.
How can you get better at changing your own mind?
The first step is genuinely believing that changing your mind is desirable, acknowledging that everyone is often wrong, and actively seeking out new evidence. It also involves translating abstract advice like 'be open-minded' into concrete, habit-based 'trigger-action plans'.
What role do emotions play in rational thinking?
Emotions often operate in the background, leading to defensiveness, aggression, or subtle anxiety that can distort how information is processed and create blind spots. Developing self-awareness of these emotional reactions can provide important clues about where one is inadvertently creating biases or avoiding uncomfortable truths.
What are ethical and unethical strategies for changing other people's minds?
Strategies can range from intellectually honest approaches using good arguments and evidence to more insidious methods that exploit psychological biases, as described in Cialdini's 'Influence.' Julia Galef personally feels morally uncomfortable using methods that don't ground out in good argument or logic, even if they are effective.
How is misinformation created, and how can it be combated?
Misinformation can be created by selectively presenting a few studies or expert quotes that support a desired claim, even if the broader scientific consensus or evidence points elsewhere. Combating this is difficult, often requiring not just facts but also charismatic spokespeople, extensive advertising, and leveraging principles like familiarity to get the correct message out.
Actionable Insights
1. Embrace Mind-Changing as a Strength
Actively believe that changing your mind is a desirable trait, not a sign of weakness or stupidity, and recognize that you likely hold incorrect beliefs that, if updated, would lead to better decisions.
2. Actively Seek Disconfirming Evidence
Don’t just passively accept new evidence; actively seek out information that might challenge your existing beliefs, rather than only looking for evidence that supports what you already believe.
3. Cultivate Emotional Self-Awareness
Become more attuned to your emotional reactions (e.g., defensiveness, anxiety, subtle concerns) as they occur, as these can serve as important clues to your blind spots and areas where you might be flinching away from uncomfortable truths.
4. Cash Out Beliefs into Concrete Predictions
To clarify your thinking and test your beliefs, ask yourself: ‘What do I expect to see differently in the world if this claim is true?’ This forces you to make your beliefs concrete and potentially disprovable by evidence.
5. Implement Trigger-Action Plans
Translate abstract advice like ‘be open-minded’ into concrete, habit-based ‘trigger-action plans.’ For example, if you read an article you disagree with (trigger), consciously look for reasons to agree with it or not reject it (action).
6. Integrate Intuition and Logic
Understand the respective strengths and weaknesses of intuitive (System 1) and logical (System 2) thinking, and learn to get them to communicate, rather than ignoring or suppressing intuition.
7. Seek Reliable Information Sources
To build an accurate model of the world, prioritize reliable reasoning processes such as synthesizing opinions of top experts or looking at randomized controlled trials, rather than making things up or believing random people.
8. Eliminate Obviously Bad Choices
When unsure of the optimal path to achieve a goal, focus on ruling out clearly suboptimal or ‘stupid’ choices as a first step, as this is often low-hanging fruit and a practical way to improve decisions.
9. Avoid the “Straw Vulcan” Error
Do not expect others to behave rationally or change their minds solely based on facts and evidence, as this expectation itself is often irrational given human psychology.
10. Neutralize Personal Bias in Evaluation
When evaluating an argument from someone you dislike, mentally reframe it by imagining someone you like said the exact same thing to check if your reaction is unfairly biased by your feelings towards the person.
11. Observe Physical Manifestations of Emotion
Develop awareness of physical signs of emotional reactions (e.g., body tensing up, leaning aggressively or anxiously) as these can indicate how emotions are influencing your information processing.
12. Utilize Practices for Self-Awareness
Consider practices like meditation or martial arts to develop greater self-awareness, particularly in detecting physical and emotional states that influence reasoning.
13. Encourage Socratic Reasoning
When teaching or guiding others, or even for self-learning, lead through a reasoning process with questions rather than simply providing answers, to make the learning more satisfying and memorable.
14. Develop Skepticism
Cultivate an ability to notice when something doesn’t make sense, even when presented with a serious tone, to foster critical thinking and avoid being easily misled.
15. Practice Overcoming Biases
To change ingrained habits of thought and behavior, engage in longer-term practice on real-life issues, rather than just learning about biases or solving toy problems.
16. Define Broad Goals
When thinking about goals for instrumental rationality, include less obvious but important values like feeling connected to others, finding meaning, or experiencing pleasure (e.g., through art), as these are often neglected but crucial for well-being.
17. Recognize Selective Evidence Presentation
Be aware that it’s easy to create uncertainty or a false sense of certainty by selectively presenting studies or expert quotes that support a desired claim, even if the overall body of evidence suggests otherwise.
18. Be Cautious with “Scientific Evidence” from Quick Searches
When seeking evidence for a belief (e.g., a diet), be aware of the tendency to find only studies that support your pre-existing idea, and acknowledge that a broader search might reveal contradictory evidence.
19. Apply Rationality to Altruism
When trying to help the world, step back from automatic emotional reactions and use evidence to identify the most effective and efficient ways to make an impact, recognizing that some methods or charities are orders of magnitude more effective than others.
20. Be Aware of Persuasion Tactics
Understand that techniques like scarcity, social proof, and reciprocity (from Cialdini’s “Influence”) can effectively change minds without relying on facts or logic, and consider the ethical implications before using them or to protect yourself from them.
Key Quotes
If I was right, they wanted to realize that I was right.
Julia Galef
The best choice is not doing, you know, a half-assed job at your current job.
Julia Galef
Rationality is, it's less about ignoring or suppressing system one intuition, and more about understanding the sort of respective strengths and weaknesses of the two systems, and learning how to sort of get them to communicate with each other.
Julia Galef
If you look at the, the quintessential straw Vulcan with Spock himself, he is supposed to be the logical rational one, but he keeps making miscalculations because he expects other people to behave rationally and they don't.
Julia Galef
I don't expect people to change their minds based on facts and evidence. Cause I, I know that's not a thing.
Julia Galef
I'm not calling your paper, your claim wrong. It's not even wrong. It's like, doesn't even make enough sense to be wrong.
Julia Galef
There are literally like orders of magnitude difference in the effectiveness of different charities trying to do the same thing.
Julia Galef
Protocols
Trigger-Action Plan for Open-Mindedness (Julia Galef)
- Identify a trigger: You read an article that you disagree with.
- Take action: Instead of defaulting to finding reasons to reject the article, actively look for reasons to agree with it, or reasons not to reject it.