
Unconscious Perception of Sounds: We Hear Differences Even without Listening

Neurobiologists from HSE University and the RAS Institute of Higher Nervous Activity and Neurophysiology proved that the human brain unconsciously distinguishes between even very similar sound signals during passive listening. The study was published in Neuropsychologia.

Our auditory system is able to process sounds at an implicit level. The brain can distinguish between even very similar sounds, but we do not always consciously recognise these differences. The researchers demonstrated this in their study of sound perception during passive listening (when the subject is not deliberately trying to hear the differences).

To investigate this, the researchers carried out an experiment with 20 healthy volunteers. The participants listened to sounds while the researchers used electroencephalography (EEG) to record their brain responses to the stimuli. The sounds were so similar that the participants could explicitly distinguish them with only 40% accuracy.

First, the volunteers listened to sequences of three sounds in which one sound was repeated often, while the two others appeared rarely. The participants were asked to press a key if they heard a difference in the sounds. Then, in passive listening mode, the same sounds appeared in more elaborate sequences: groups of five similar sounds and groups in which the fifth sound was different.

Two types of sound sequences were used in the experiment: those with local irregularities and those with global irregularities. In the first type, groups of similar sounds were often repeated, while a group with a different sound at the end appeared randomly and rarely. In the second type, groups with a different sound at the end appeared often and groups of similar sounds appeared rarely.
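
To illustrate the structure of these sequences, here is a minimal Python sketch of how a local–global paradigm of this kind could be assembled. The tone labels, group size, number of groups, and the 20% share of rare groups are assumptions made for the example, not the parameters used in the study.

```python
import random

# Two barely distinguishable tones, labelled 'A' and 'B' for illustration.
STANDARD, DEVIANT = "A", "B"

def make_group(ends_differently: bool) -> list[str]:
    """A group of five sounds; the fifth one differs only if requested."""
    return [STANDARD] * 4 + [DEVIANT if ends_differently else STANDARD]

def make_block(frequent_ends_differently: bool, n_groups: int = 100,
               rare_share: float = 0.2) -> list[list[str]]:
    """Build a block in which one group type is frequent and the other rare.

    frequent_ends_differently=False gives a 'local irregularity' block:
    uniform groups are common, groups ending in a different sound are rare.
    frequent_ends_differently=True gives a 'global irregularity' block:
    groups ending in a different sound are common, uniform groups are rare.
    """
    block = []
    for _ in range(n_groups):
        is_rare = random.random() < rare_share
        # Rare groups violate whatever rule the frequent groups establish.
        block.append(make_group(frequent_ends_differently ^ is_rare))
    return block

if __name__ == "__main__":
    local_block = make_block(frequent_ends_differently=False)
    global_block = make_block(frequent_ends_differently=True)
    print(local_block[:3])
    print(global_block[:3])
```

The `frequent_ends_differently` flag simply determines which of the two group types plays the role of the rare, rule-violating event in a given block.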

Detecting these two types of sound sequences requires attention at different levels. The brain reacts to them differently, and the EEG registers different types of potentials. A local irregularity can be detected without explicit attention and elicits the mismatch negativity (MMN) and P3a potentials. A global irregularity demands concentration and elicits the P3b potential, which reflects a higher level of conscious processing. The same potentials were registered in earlier experiments using the same methodology. What sets the current study by researchers from the HSE Institute for Cognitive Neuroscience and the RAS Institute of Higher Nervous Activity and Neurophysiology apart is that they used sounds that are barely distinguishable, whereas in earlier studies the stimuli (sounds or images) could be recognised with 100% accuracy.
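
The study's own analysis pipeline is not reproduced here; the sketch below only shows, with simulated NumPy arrays standing in for real recordings, how event-related responses such as those mentioned above are commonly estimated: epochs are averaged per condition, and the rare (deviant) average is compared against the frequent (standard) one.

```python
import numpy as np

# Hypothetical epoched EEG data: trials x time samples for one channel,
# time-locked to the onset of the fifth sound in each group.
rng = np.random.default_rng(0)
standard_epochs = rng.normal(size=(400, 300))  # frequent (expected) groups
deviant_epochs = rng.normal(size=(100, 300))   # rare (unexpected) groups

# Event-related potentials are obtained by averaging over trials; the
# difference wave (deviant minus standard) is where components such as
# MMN, P3a, P3b or N400 are typically measured.
erp_standard = standard_epochs.mean(axis=0)
erp_deviant = deviant_epochs.mean(axis=0)
difference_wave = erp_deviant - erp_standard

# Peak amplitude within an assumed latency window (samples 100-200 here).
window = slice(100, 200)
print("Peak difference amplitude:", difference_wave[window].min())
```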

Olga Martynova, Principal Researcher, Senior Research Fellow at the Institute for Cognitive Neuroscience and Academic Supervisor of the programme in Cognitive Neurobiology:

‘We made the sound sequences more complicated, assuming this would facilitate sound recognition, which we expected to see as an increased amplitude of the potential. But the result was unexpected: instead of the P3b potential in response to global irregularities, we observed an emerging N400 potential, which is related to explicit information processing but can also appear under implicit attention. The appearance of this potential is a sign of a hidden, implicit form of learning that is constantly happening in our lives.’

The appearance of the N400 potential supports an existing theory of how consciousness works. According to the theory of predictive coding, the brain creates a model of the environment based on its experience and uses predictions to optimise its operations. When faced with experiences that contradict these predictions, the brain updates its model of the world. This process forms the basis of implicit (unconscious) learning and serves to minimise prediction errors, enabling better adaptation and faster reaction to changes in the environment.
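
As a toy illustration of this idea (not the model used in the paper), the sketch below keeps a single running prediction and corrects it in proportion to the prediction error; the learning rate and the stimulus stream are arbitrary assumptions.

```python
# Toy predictive-coding update: the internal model predicts the next
# stimulus and corrects itself in proportion to the prediction error.
def update_prediction(prediction: float, observation: float,
                      learning_rate: float = 0.1) -> tuple[float, float]:
    error = observation - prediction      # prediction error
    prediction += learning_rate * error   # adjust to minimise future error
    return prediction, error

prediction = 0.0
stimuli = [1.0] * 20 + [0.2] * 5  # a stream that suddenly changes
for x in stimuli:
    prediction, error = update_prediction(prediction, x)
    # Large errors flag unexpected input and drive implicit learning.
    print(f"obs={x:.1f}  prediction={prediction:.2f}  error={error:+.2f}")
```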

The results of the study are important for fundamental science (since they support the predictive coding model) and have possible applications in clinical studies. For example, the P3b and N400 potentials can be used to evaluate the level of consciousness of patients who are unable to explicitly react to stimuli (such as in cases of Alzheimer’s disease, Parkinson’s disease, coma, etc.).

If you are interested in taking part in neurobiological studies, please contact the researchers via email at omartynova@hse.ru.

See also:

'Neurotechnologies Are Already Helping Individuals with Language Disorders'

On November 4-6, as part of Inventing the Future International Symposium hosted by the National Centre RUSSIA, the HSE Centre for Language and Brain facilitated a discussion titled 'Evolution of the Brain: How Does the World Change Us?' Researchers from the country's leading universities, along with health professionals and neuroscience popularisers, discussed specific aspects of human brain function.

‘Scientists Work to Make This World a Better Place’

Federico Gallo is a Research Fellow at the Centre for Cognition and Decision Making of the HSE Institute for Cognitive Neuroscience. In 2023, he won the Award for Special Achievements in Career and Public Life Among Foreign Alumni of HSE University. In this interview, Federico discusses how he entered science and why he chose to stay, and shares a secret to effective protection against cognitive decline in old age.

'Science Is Akin to Creativity, as It Requires Constantly Generating Ideas'

Olga Buivolova investigates post-stroke language impairments and aims to ensure that scientific breakthroughs reach those who need them. In this interview with the HSE Young Scientists project, she spoke about the unique Russian Aphasia Test and helping people with aphasia, and about her place of power in Skhodnensky district.

Neuroscientists from HSE University Learn to Predict Human Behaviour from Facial Expressions

Researchers at the Institute for Cognitive Neuroscience at HSE University are using automatic emotion recognition technologies to study charitable behaviour. In an experiment, scientists presented 45 participants with photographs of dogs in need and invited them to make donations to support these animals. Emotional reactions to the images were determined through facial activity using the FaceReader program. It turned out that the more strongly the participants felt sadness and anger, the more money they were willing to donate to charity funds, regardless of their personal financial well-being. The study was published in the journal Heliyon.

Spelling Sensitivity in Russian Speakers Develops by Early Adolescence

Scientists at the RAS Institute of Higher Nervous Activity and Neurophysiology and HSE University have uncovered how the foundations of literacy develop in the brain. To achieve this, they compared error recognition processes across three age groups: children aged 8 to 10, early adolescents aged 11 to 14, and adults. The experiment revealed that a child's sensitivity to spelling errors first emerges in primary school and continues to develop well into the teenage years, at least until age 14. Before that age, children are less adept at recognising misspelled words compared to older teenagers and adults. The study findings have been published in Scientific Reports.

Meditation Can Cause Increased Tension in the Body

Researchers at the HSE Centre for Bioelectric Interfaces have studied how physiological parameters change in individuals who start practicing meditation. It turns out that when novices learn meditation, they do not experience relaxation but tend towards increased physical tension instead. This may be the reason why many beginners give up on practicing meditation. The study findings have been published in Scientific Reports.

Processing Temporal Information Requires Brain Activation

HSE scientists used magnetoencephalography and magnetic resonance imaging to study how people store and process temporal and spatial information in their working memory. The experiment has demonstrated that dealing with temporal information is more challenging for the brain than handling spatial information. The brain expends more resources when processing temporal data and needs to employ additional coding using 'spatial' cues. The paper has been published in the Journal of Cognitive Neuroscience.

Neuroscientists Inflict 'Damage' on Computational Model of Human Brain

An international team of researchers, including neuroscientists at HSE University, has developed a computational model for simulating semantic dementia, a severe neurodegenerative condition that progressively deprives patients of their ability to comprehend the meaning of words. The neural network model represents processes occurring in the brain regions critical for language function. The results indicate that initially, the patient's brain forgets the meanings of object-related words, followed by action-related words. Additionally, the degradation of white matter tends to produce more severe language impairments than the decay of grey matter. The study findings have been published in Scientific Reports.

New Method Enables Dyslexia Detection within Minutes

HSE scientists have developed a novel method for detecting dyslexia in primary school students. It relies on a combination of machine learning algorithms, technology for recording eye movements during reading, and demographic data. The new method enables more accurate and faster detection of reading disorders, even at early stages, compared to traditional diagnostic assessments. The results have been published in PLOS ONE.

HSE University and Adyghe State University Launch Digital Ethnolook International Contest

The HSE Centre for Language and Brain and the Laboratory of Experimental Linguistics at Adyghe State University (ASU) have launched the first Digital Ethnolook International Contest in the Brain Art / ScienceArt / EtnoArt format. Submissions are accepted until May 25, 2024.