The Science of Learning & Speaking Languages | Dr. Eddie Chang

Episode 95 · Oct 24, 2022
Overview

Dr. Eddie Chang, UCSF neurosurgeon and professor, discusses the brain mechanisms of speech, language learning, and disorders such as stuttering. He shares his work decoding speech via brain-computer interfaces (BCIs) to restore communication for paralyzed patients, and touches on epilepsy treatment and the future of human-computer communication.

At a Glance

18 Insights · 23 Topics · 11 Concepts · 2h 31m Duration

Deep Dive Analysis

Neuroplasticity and Early Sound Exposure

White Noise Machines and Infant Brain Development

Awake Brain Surgery for Speech and Language Mapping

Brain Stimulation and Emotional Responses

Epilepsy: Causes, Medications, and Neurosurgery

Ketogenic Diet for Epilepsy Treatment

Different Types of Seizures: Absence and Nocturnal

Revisiting Broca's and Wernicke's Areas: New Discoveries

Language Lateralization, Handedness, and Brain Plasticity

Bilingualism and Shared Brain Language Circuits

Distinguishing Speech from Language

Anatomy and Mechanics of Voice Production

Neural Encoding of Speech Sounds: Consonants and Vowels

Plosive and Fricative Consonants in Speech

Reading, Writing, and Dyslexia: Brain Mechanisms

Evolution of Language and Speech Changes

Foreign Accent Syndrome After Stroke

Auditory and Motor Memory for Speech

Brain-Computer Interface for Restoring Speech in Paralysis

Augmenting Human Brain Function with Neurotechnology

Non-Verbal Communication and Avatars in BCI

Stuttering: Causes, Anxiety, and Treatment

Neurosurgeon's Practices for Maintaining Calm and Focus

Concepts

Critical Period (Sensitive Period)

An early period in brain development where the brain is highly susceptible to patterns it hears or sees, like speech sounds. During this time, the brain is open to plasticity and reorganization, but its maturation can be slowed if deprived of structured environmental sounds.

Brain Mapping

A technique used during awake brain surgery where small electrical currents stimulate different brain areas to identify regions critical for functions like language or movement. This helps surgeons protect essential areas while removing tumors or treating epilepsy.

Speech Arrest

A phenomenon observed during brain mapping where electrical stimulation of a specific brain area temporarily stops a person's ability to speak, even if they know what they want to say. This highlights the brain's physical role in speech production.

Broca's Area (Revised Understanding)

Traditionally thought to be the 'seat of articulation' for speech production. Modern research suggests the precentral gyrus (motor cortex) is more critical for formulating and expressing words, challenging the textbook view of Broca's area's primary role.

Wernicke's Area

Located in the posterior temporal lobe, this brain area is crucial for understanding spoken language and also plays a role in sending commands for speech production. Damage here can lead to aphasia, affecting comprehension and word recall.

Phonemes

The individual speech segments, such as consonants or vowels, that make up a language. Different languages have varying inventories of phonemes, which are combined to form words and meanings.

Articulatory Features

The fundamental, distinct movements of the vocal tract (tongue, jaw, lips, larynx) that generate speech sounds. These approximately 12 features, by themselves meaningless, combine in sequences to generate all words and possible meanings.
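The arithmetic behind "a small code generating essentially all meaning" is easy to sketch. The figure of roughly 12 features comes from the episode; the counting below is an illustrative simplification, since real phonotactics restricts which sequences are actually legal in a language:

```python
# Illustrative arithmetic only: with repetition allowed, the number of
# ordered sequences of length k drawn from ~12 articulatory features
# is 12 ** k, which grows exponentially with sequence length.
FEATURES = 12

for length in range(1, 7):
    print(f"sequences of length {length}: {FEATURES ** length:,}")
```

Even at length 6 this exceeds 2.9 million combinations, which is why a tiny inventory of meaningless movements suffices to encode every word.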

Visual Word Form Area

A specialized part of the brain in the back of the temporal lobe that becomes highly sensitive to seeing words (typed or handwritten) as a result of learning to read. It interfaces with the visual cortex to process written language.

Foreign Accent Syndrome

A rare condition where individuals, after a brain injury like a stroke, develop speech patterns that sound like a foreign accent, even though they are not actually speaking another language. This is due to altered vocal tract control.

Locked-In Syndrome

A severe form of paralysis where an individual has completely intact cognition and awareness but no voluntary movement or ability to speak. This condition leads to profound psychological and social isolation.

Augmentation (Neurotechnology)

The use of neurotechnology, such as brain-computer interfaces, to enhance human abilities beyond normal levels, like super memory or communication speeds. This field raises significant ethical and societal questions.

Questions Answered

How do early environmental sounds affect brain development and plasticity?

The patterns of sound we are exposed to from the earliest times, even in utero, influence how neural networks organize and forever structure how we hear sounds, with a critical period for language specialization.

Is using white noise machines for infant sleep beneficial or harmful?

While white noise can soothe babies, continuous exposure might deprive the brain of salient, structured sounds necessary for normal development, potentially slowing the maturation of the auditory cortex. More definitive studies are needed.

Can anxiety or other emotional states be caused by underlying brain activity or seizures?

Yes, in rare cases, specific brain areas like the amygdala or insula can evoke acute anxiety or disgust when stimulated or hyperactive due to seizures, which might be misdiagnosed as psychiatric conditions.

Are the traditional textbook understandings of Broca's and Wernicke's areas for speech and language still accurate?

While Wernicke's area's role in comprehension largely holds true, the traditional understanding of Broca's area as the sole 'seat of articulation' is fundamentally wrong; the precentral gyrus is more critical for speech production.

Is language heavily lateralized to one side of the brain, and what does the other side do?

For right-handed people, language is on the left side 99% of the time. For left-handers, it's about 70% on the left, with more cases of bilateral or right-sided language. The 'equivalent' areas on the opposite side likely contain the machinery for language but are not typically specialized for it in everyday use.

Do bilingual individuals use different brain areas for each language?

Bilinguals use shared machinery in the brain to process both languages, with significant overlap in brain activity patterns. However, the way the brain processes the sequences of sounds that give rise to words and meaning can be very different between languages.

What is the difference between 'speech' and 'language'?

Speech refers to the physical production of auditory signals through vocal tract movements. Language is a broader concept encompassing the extraction of meaning, semantics, syntax, and pragmatics from words, including forms like sign language or reading.

How does the brain encode speech sounds like consonants and vowels?

The ear decomposes sounds into frequencies, which are then processed by the cortex. Specific, focal sites in the speech cortex are tuned to particular consonants, vowels, or articulatory features (like plosives or fricatives), forming a 'salt and pepper' map of speech elements.
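The frequency decomposition described here can be pictured with a standard engineering analogy (not the episode's method): the cochlea's breakdown of sound into frequency bands resembles a short-time Fourier transform of the waveform. The toy "vowel" below is an invented example, a 100 Hz fundamental plus two harmonics, matching the male voice frequency cited later in the episode:

```python
import numpy as np

fs = 16_000                       # sample rate in Hz (assumed)
t = np.arange(0, 1.0, 1 / fs)
# Toy vowel-like signal: 100 Hz fundamental plus two weaker harmonics.
signal = (np.sin(2 * np.pi * 100 * t)
          + 0.5 * np.sin(2 * np.pi * 200 * t)
          + 0.25 * np.sin(2 * np.pi * 300 * t))

frame = signal[:1024] * np.hanning(1024)   # one windowed analysis frame
spectrum = np.abs(np.fft.rfft(frame))      # magnitude spectrum of the frame
freqs = np.fft.rfftfreq(1024, 1 / fs)      # frequency of each FFT bin
peak_hz = freqs[int(np.argmax(spectrum))]
print(f"dominant frequency ≈ {peak_hz:.0f} Hz")   # near the 100 Hz fundamental
```

The cortex then goes further than this sketch: as the answer above notes, focal sites respond not to raw frequencies but to whole articulatory features such as plosives or fricatives.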

How do reading and writing relate to the brain's speech and language circuits?

Reading and writing are human inventions that map onto existing brain architecture. The visual word form area processes written words, which then map to the auditory speech cortex, the fundamental area for processing word sounds (phonology).

Can a stroke cause someone to spontaneously speak a new language they never learned?

No, there's no evidence of a stroke causing a gain of function like spontaneously speaking a new language. However, a condition called Foreign Accent Syndrome can occur, where a stroke alters speech control, making a person sound like they have a foreign accent without actually speaking that language.

What is the current state of Brain-Computer Interfaces (BCI) for paralyzed individuals?

BCI technology can decode attempted speech signals from the brain's motor cortex, translating them into text or synthesized speech for paralyzed individuals. Clinical trials are underway, showing promise in restoring communication for those with locked-in syndrome.

Is stuttering caused by anxiety, and what are the treatments?

Stuttering is a speech condition affecting articulation and fluency, not necessarily caused by anxiety, though anxiety can provoke and worsen it. Treatment typically involves therapy to help individuals work through initiation problems and improve coordination of vocal tract movements, sometimes by altering auditory feedback.

Insights

1. Exercise for Mental State

Prioritize regular exercise, such as running or swimming, for its mental benefits, as it helps regulate mental state, provides a disconnect from worries, and positively impacts mood, focus, and interactions with others.

2. Create Focus Sanctuary

To achieve intense focus for important tasks, create a “sanctuary” environment by disconnecting from external distractions like cell phones, allowing for complete immersion and mental clarity.

3. Eliminate Distractions for Focus

For tasks requiring intense focus and precision, eliminate all distractions, including background music, to maintain complete mental clarity and concentration on the singular objective.

4. Maintain Hydration & Electrolytes

Ensure adequate hydration and electrolyte intake (sodium, magnesium, and potassium in correct ratios, without sugar) for optimal brain and body function: even slight dehydration diminishes cognitive and physical performance, and electrolytes are vital for cell function.

5. Huberman’s Electrolyte Protocol

Dissolve one packet of LMNT in 16 to 32 ounces of water first thing in the morning, and during any physical exercise, to ensure proper hydration and electrolyte intake.

6. Utilize NSDR for Energy

Practice yoga nidra or non-sleep deep rest (NSDR) sessions, even for just 10 minutes, to greatly restore levels of cognitive and physical energy.

7. Optimize Language Learning

To learn multiple languages effectively, prioritize early, intense, immersive, and prolonged exposure, ideally before age 12, and ensure learning involves real human interactions, as passive listening alone is insufficient.

8. Read Books to Improve Speech

Prioritize reading physical books (or on Kindle) to improve articulation, sentence structure, and the ability to construct coherent thoughts, as this can positively influence spoken communication and counteract the erosion of formal language in digital communication.

9. Utilize Facial Expressions

When communicating, be mindful of facial expressions and mouth movements, as they are essential nonverbal cues that convey emotion, improve speech intelligibility, and make interactions more natural and memorable.

10. Avoid Continuous White Noise

Avoid exposing infants to continuous white noise, as it can mask natural, structured sounds necessary for normal auditory cortex development and may slow brain maturation.

11. Use Structured Soothing Sounds

When soothing babies, opt for natural, structured sounds instead of continuous white noise, as white noise lacks structure and can mask essential environmental sounds for healthy brain development.

12. Discuss Neurotech Ethics

When considering emerging neurotechnologies for human augmentation, engage in thorough discussions about the ethical implications, whether such enhancements are truly desired for society, and how to ensure equitable access to these technologies.

13. Explore Ketogenic Diet for Epilepsy

For some individuals, particularly children with epilepsy, consider exploring the ketogenic diet as a potential treatment, as it can have beneficial effects for some, though its effectiveness varies.

14. Consider Epilepsy Surgery Options

If epilepsy seizures are not controlled after trying two or three medications, consult with a specialist about alternative treatments like surgery or brain stimulators, as further medication trials may be ineffective.

15. Address Phonological Awareness

If dealing with dyslexia, focus on therapies that improve phonological awareness, which involves mapping visual word forms to their corresponding speech sounds, as this is a foundational step for developing skilled reading.

16. Stuttering Therapy Strategies

If you experience stuttering, seek therapy to learn strategies and “tricks” for fluent word production, particularly focusing on overcoming initiation problems and managing anxiety that can exacerbate stuttering.

17. Develop Holistic Communication Avatars

For individuals with communication disabilities, explore and advocate for neuroprosthetic technologies that develop holistic avatars capable of decoding and displaying speech movements and facial expressions to enrich communication in digital and virtual spaces.

18. Avatar Feedback for Neuroprosthetics

When learning to use speech neuroprosthetics, prioritize systems that offer real-time visual feedback through an avatar, as this embodiment can accelerate learning and enhance the feeling of direct control over communication.

Notable Quotes

The thing that, to me, has been the most striking is that, you know, some of these areas you stimulate and, all together, you can shut down someone's talking.

Dr. Eddie Chang

I know what I want to say but I couldn't get the words out.

Dr. Eddie Chang

The brain is a physical organ, it's part of the body; outside of the veins on top of it, it doesn't look like a machine. But when you do something like that and you focally change the way it works, and you see that because a person can't talk anymore... you're confronted with this idea that that organ is the basis of speech and language.

Dr. Eddie Chang

I would say that with regard to the brain in particular, I would say about 50% gets it right and accurate and is helpful, but another 50% is just the approximation and oversimplification of what's going on.

Dr. Eddie Chang

It's not just about the genetic programming that specifies some of this sensitive period, but it's also a little bit about the nature of the sounds that we hear that help keep that window for the critical period open and closed.

Dr. Eddie Chang

Speech and language is part of who we are as humans, that's part of how we evolved, and it's hardwired and, you know, molded by experience. Reading and writing are human inventions.

Dr. Eddie Chang

You take these 12 movements and you put them in combinations and you start putting them in sequence; we as humans use that set of 12 features to generate all words, and because we can generate nearly an infinite number of words with that code of just 12 features, we have something that generates essentially all possible meaning.

Dr. Eddie Chang

The operating room for me is another space kind of like running or swimming where I'm disconnected from the rest of the world... that intense focus allows me to disconnect from all the other things that I'm worrying about.

Dr. Eddie Chang
Key Numbers

~100 hertz
Voice frequency (male): average frequency of vocal fold vibration during speech for men.

~200 hertz
Voice frequency (female): average frequency of vocal fold vibration during speech for women.

12 to 14
Phonemes in Hawaiian: inventory of different consonants and vowels in the Hawaiian language.

~40
Phonemes in English: inventory of different consonants and vowels in the English language.

~12
Articulatory features: number of fundamental movements of the tongue, jaw, lips, and larynx that generate all speech sounds.

99%
Language lateralization (right-handed): percentage of right-handed people whose language processing is primarily on the left side of the brain.

~70%
Language lateralization (left-handed): percentage of left-handed people whose language processing is primarily on the left side of the brain.

~1/3
Epilepsy patients uncontrolled by medication: proportion of people with epilepsy whose seizures are not controlled by modern medications.

15 years
BRAVO trial participant paralysis duration: the first participant in the BRAVO trial had been paralyzed for 15 years before receiving the BCI implant.

50 words
Initial BCI vocabulary: the initial vocabulary set used to train the brain-computer interface algorithm for the first participant.