11 Aug 2022

The pandemic changed the way we understand speech

A new study examines how certain now-common words influence what we expect to hear.


Our brains are great at filling in the blanks.

During the COVID-19 pandemic, we’ve been inundated with words and situations that were unfamiliar to many people before then. We’ve been in lockdowns, maintained social distance, worn masks, and gotten vaccines and boosters, and we’ve been talking about these topics seemingly nonstop. Life has looked very different for most people since the start of the pandemic, and new research suggests it has even altered the way we understand certain words.

Our study,1 recently published in PLOS ONE, shows how much more likely we are, as a result of the pandemic, to perceive these newly common words — to the point that we expect to hear words like “mask” and “isolation” even when a different but similar-sounding word is actually spoken. What word do you hear in these clips?

Now that we’ve lived through multiple years of the pandemic, you probably heard the speaker saying “lockdown,” “infection,” and “testing.” In reality, each recording is only a partial word: “--ockdown,” “in--ection,” and “te--ing,” with a cough replacing the missing sound in each word.

The pandemic presented a once-in-a-generation opportunity to study rapid changes in the way we process language, as those changes were occurring. The abrupt change to everyone’s lives, and to the words that were on everyone’s lips, gave us a naturalistic way to study how the human brain understands speech and engages in statistical language learning. It also allowed us to study how the brain perceives words in noisy situations — like in a bar or on a train — where it’s not always clear exactly what word someone is saying. This research helps us understand how our brains perform the highly complex task of understanding language, and it may also help us better train AI models tasked with understanding human speech.

From April 2020 through February 2021, a total of 899 subjects participated in four experiments, conducted on Amazon Mechanical Turk, testing how they understood words like “mask” and “isolation” — words that did not feature prominently in our speech before COVID, but have now become extremely common. We found drastic, long-lasting cognitive effects in the way our brains understand these words.

What was that you said? What our 10-minute experiments taught us over 10 months

As cognitive psychologists, we love thinking about language and human interaction, and what happens in the brain when we talk to one another. As it became obvious that the sudden, massive societal shift caused by COVID was also changing the frequency with which we heard certain words, we wondered if it would cause any lasting changes to how our brains process language — a critical component of what makes us human.

At the time, we had no idea how the pandemic would unfold, or that it would still be with us two and a half years later. That makes the timing of our first experiment, launched just weeks after the start of the soon-to-be-ubiquitous “lockdowns,” all the more prescient.

First, we decided on a set of 28 words that had become much more frequent after the onset of COVID – words such as “mask” and “lockdown.” To determine both the pre-pandemic frequency of these words (how often they were used between January and December 2019) and the post-pandemic-onset frequency of those same words (how often they were used between January and December 2020), we used the News on the Web (NOW) corpus — a dataset of thousands of newspaper and magazine articles containing billions of words, which, critically, records when each article was published and thus the date that each word was used. It was striking to see how much the frequency of individual words changed in such a short period of time: COVID-related words like “mask” were used three times as frequently during 2020 as they had been during 2019, even though similar-sounding words, like “map,” didn’t change at all.
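
To make that comparison concrete, here is a minimal sketch of the kind of calculation involved, assuming per-period word counts have already been extracted from a dated corpus such as NOW. The words, counts, and corpus sizes below are purely illustrative, not the study’s actual numbers.

```python
# Sketch: compare a word's normalized frequency before and after the onset of COVID.
# All counts and totals below are invented for illustration.
counts_2019 = {"mask": 4_100, "map": 9_800, "lockdown": 350, "knockdown": 330}
counts_2020 = {"mask": 12_700, "map": 10_100, "lockdown": 41_000, "knockdown": 340}

total_tokens_2019 = 1_500_000_000   # corpus size (tokens), Jan-Dec 2019
total_tokens_2020 = 1_600_000_000   # corpus size (tokens), Jan-Dec 2020

def per_million(count: int, total: int) -> float:
    """Normalize a raw count to occurrences per million tokens."""
    return count / total * 1_000_000

for word in sorted(counts_2019):
    f19 = per_million(counts_2019[word], total_tokens_2019)
    f20 = per_million(counts_2020[word], total_tokens_2020)
    print(f"{word:10s} 2019: {f19:6.2f}/M   2020: {f20:6.2f}/M   change: x{f20 / f19:.1f}")
```

In the study itself, the matched word pairs were chosen so that both members were comparably frequent in 2019 but diverged sharply in 2020.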

Our experiments used the phonemic restoration task to test which words listeners understand when they hear something ambiguous. We recorded a full word — for example, “knockdown” — removed one sound from the recording (here, the initial “kn” sound), and replaced the deleted sound with a noise, as you can hear in the sound clip at the top of the post. We then asked participants what word they heard when they listened to this now-incomplete, ambiguous recording. All the words we recorded were one sound away from a COVID-related word, such as “knockdown” instead of “lockdown,” and “task” instead of “mask.” And all of the recorded words were just as common in English as their COVID-related counterparts in 2019, but were much less common in 2020.
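
As an illustration of how such a stimulus can be assembled, here is a minimal sketch using NumPy and the soundfile library, assuming mono recordings at a shared sample rate and hand-annotated boundaries for the critical sound. The file names and timings are hypothetical, not the materials used in the study.

```python
import numpy as np
import soundfile as sf

# Hypothetical inputs: a mono recording of the full word and a mono cough.
word, sr = sf.read("knockdown.wav")
cough, sr_cough = sf.read("cough.wav")
assert sr == sr_cough, "resample the cough first if the rates differ"

# Hand-annotated start and end (in seconds) of the critical initial sound.
crit_start_s, crit_end_s = 0.00, 0.12
start, end = int(crit_start_s * sr), int(crit_end_s * sr)

# Delete the critical sound and splice the cough into the gap, producing an
# ambiguous "--ockdown" that could be heard as "knockdown" or "lockdown".
stimulus = np.concatenate([word[:start], cough, word[end:]])

sf.write("ambiguous_ockdown.wav", stimulus, sr)
```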

The roughly 10-minute-long experiments presented each qualified participant with ambiguous auditory inputs. For example, a participant would hear a spoken word accompanied by an overlapping cough, in much the same way we might hear a word spoken in a crowd.

The pandemic changed the ranking of certain words we perceive

We ran a set of four experiments over the course of 10 months, and found that people now understand a slew of spoken words differently. For example, now that “mask” is more common, an ambiguous recording of the similar-sounding word “task” is misunderstood as “mask” three times as often as an ambiguous recording of the word “tap” is misunderstood as “map.” Our study is the first to demonstrate long-lasting changes in lexical accessibility induced by rapid changes in real-world linguistic input.
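
For a sense of what that comparison looks like in practice, here is a minimal sketch of tallying listeners’ reports for two ambiguous items. The response data are invented to mirror the pattern described above, not the study’s actual responses.

```python
from collections import defaultdict

# Invented responses: (ambiguous item, word the participant reported hearing).
responses = [
    ("--ask", "mask"), ("--ask", "mask"), ("--ask", "mask"), ("--ask", "task"),
    ("--ap",  "map"),  ("--ap",  "tap"),  ("--ap",  "tap"),  ("--ap",  "tap"),
]

# The COVID-related interpretation of each item (the recorded word was the other one).
covid_word = {"--ask": "mask", "--ap": "map"}

tally = defaultdict(lambda: [0, 0])  # item -> [COVID-word responses, total responses]
for item, reported in responses:
    tally[item][1] += 1
    if reported == covid_word[item]:
        tally[item][0] += 1

for item, (covid, total) in sorted(tally.items()):
    print(f"{item}: {covid}/{total} ({covid / total:.0%}) heard the COVID-related word")
```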

More research will be needed over time to confirm whether these pandemic-related words will recede to their pre-pandemic frequencies in our mental lexicons. But the implications are clear: Our brains rapidly adapt to the changing linguistic statistics of the world around us, and we come to predict and expect more common words over less common ones.

This research helps us to better understand how the brain processes language input, and it adds to a growing body of research – including from our IBM Research colleagues studying other forms of sensory input – that may eventually inform the design of new AI models structured like our own brains. For example, this understanding of the brain’s ability to rapidly adapt to changing word frequencies in real-world input could help digital assistants adapt more effectively to individual users’ speech.


References

  1. Kleinman, D., Morgan, A.M., Ostrand, R., Wittenberg, E. Lasting effects of the COVID-19 pandemic on language processing. PLOS ONE. June 15, 2022.