
    Researchers Develop New Technique to Determine Speech Comprehension
    14 March 2018

    Scientists set out to find an EEG-based method that can measure hearing as well as speech understanding, completely automatically.

     

    Neuroscientists from KU Leuven (Katholieke Universiteit Leuven), a research university in Flanders, Belgium, working in collaboration with the University of Maryland, measured brainwaves to determine whether people understand what they hear, the university announced.

     

    The new technique was developed by Professor Tom Francart and his colleagues from the Department of Neurosciences at KU Leuven in collaboration with the University of Maryland. According to the announcement, the technique will allow for a more accurate diagnosis of patients who cannot actively participate in a speech understanding test because they’re too young, for instance, or because they’re in a coma. In the longer term, the method also holds potential for the development of smart hearing devices.

     

    A common complaint from people with a hearing aid is that they can hear speech but they can’t make out its meaning. Indeed, being able to hear speech and actually understanding what’s being said are two different things.

     

    The tests to determine whether you can hear soft sounds are well established. Just think of the test used by audiologists in which you have to indicate whether you hear “beep” sounds. An alternative makes use of EEG, which is often used to test newborns: click sounds are presented through small caps placed over the ears, and electrodes on the head then measure whether any brainwaves develop in response to these sounds.

     

    The great advantage of EEG is that it is objective and that the person undergoing the test doesn’t have to do anything. “This means that the test works regardless of the listener’s state of mind,” says co-author Jonathan Simon from the University of Maryland. “We don’t want a test that would fail just because someone stopped paying attention.”

     

    But to test speech understanding, the options are much more limited, explains lead author Tom Francart from KU Leuven: “Today, there’s only one way to test speech understanding. First, you hear a word or sentence. You then have to repeat it so that the audiologist can check whether you have understood it. This test obviously requires the patient’s active cooperation.”

     

    Therefore, scientists set out to find an EEG-based method that can measure hearing as well as speech understanding completely automatically.

     

    “And we’ve succeeded,” said Tom Francart. “Our technique uses 64 electrodes to measure someone’s brainwaves while they listen to a sentence. We combine all these measurements and filter out the irrelevant information. If you move your arm, for instance, that creates brainwaves as well. So we filter out the brainwaves that aren’t linked to the speech sound as much as possible. We compare the remaining signal with the original sentence. That doesn’t just tell us whether you’ve heard something but also whether you have understood it.”
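
    The announcement gives no implementation details, but a pipeline of this kind is commonly realised as a linear “backward” model: the 64 EEG channels are band-pass filtered down to the slow rhythms that track speech, and a regularised regression then combines the channels to reconstruct the speech envelope. The Python sketch below (NumPy/SciPy) is purely illustrative; the sampling rate, filter band, array shapes, and regularisation value are assumptions, not parameters from the study.

        import numpy as np
        from scipy.signal import butter, filtfilt

        # Illustrative stand-ins (not data from the study): 60 s of 64-channel
        # EEG and the amplitude envelope of the sentence, sampled at fs Hz.
        fs = 128
        rng = np.random.default_rng(0)
        eeg = rng.standard_normal((64, 60 * fs))         # (channels, samples)
        envelope = np.abs(rng.standard_normal(60 * fs))  # speech envelope

        # 1) "Filter out" activity unrelated to speech: keep the slow rhythms
        #    (roughly 1-9 Hz here) that are known to follow the speech envelope.
        b, a = butter(4, [1, 9], btype="bandpass", fs=fs)
        eeg_filt = filtfilt(b, a, eeg, axis=1)

        # 2) Combine all measurements: fit a linear backward model (ridge
        #    regression) mapping the 64 filtered channels to the envelope.
        #    A full decoder would also use time-lagged copies of each channel.
        X = eeg_filt.T                                   # (samples, channels)
        y = envelope
        lam = 1e3                                        # regularisation, assumed
        w = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

        # 3) Derive an "envelope" from the brainwaves for comparison with the
        #    original sentence.
        reconstruction = X @ w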

     

    The way this happens is quite similar to comparing two sound files on your computer: when you open the sound files, you sometimes see two figures with sound waves. Tom Francart: “Now, imagine comparing the original sound file of the sentence you’ve just heard and a different sound file derived from your brainwaves. If there is sufficient similarity between these two files, it means that you have properly understood the message.”
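
    In that spirit, the “sufficient similarity” check can be expressed as a correlation between the two signals. The toy example below uses made-up data and an arbitrary threshold to illustrate the idea only; the actual criterion used by the researchers is not given in the announcement.

        import numpy as np

        # Toy stand-ins (illustrative only): the envelope of the original
        # sentence and an "envelope" derived from the listener's brainwaves.
        rng = np.random.default_rng(1)
        original = np.abs(rng.standard_normal(5000))
        from_brainwaves = original + 2.0 * rng.standard_normal(5000)  # noisy copy

        # Express "sufficient similarity" as a Pearson correlation above some
        # criterion. The 0.1 threshold is an assumption for illustration, not
        # a value reported by the researchers.
        r = float(np.corrcoef(original, from_brainwaves)[0, 1])
        understood = r > 0.1
        print(f"correlation = {r:.3f} -> understood: {understood}")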

     

    This new technique makes it possible to objectively and automatically determine whether someone understands what’s being said. This is particularly useful in the case of patients who cannot respond, including patients in a coma.

     

    The findings can also help to develop ‘smart’ hearing aids and cochlear implants, Francart said: “Existing devices only ensure that you can hear sounds. But with built-in recording electrodes, the device would be able to measure how well you have understood the message and whether it needs to adjust its settings—depending on the amount of background noise, for instance.”

     

    This research was funded by the European Research Council (GA 637424), the Research Foundation–Flanders (FWO), and KU Leuven.

    Source: KU Leuven

    Image credit: KU Leuven, University of Maryland