This brain-imaging headband can reveal how boring you are

Functional near-infrared spectroscopy (fNIRS) systems were built into a wearable headband to monitor brain activity in people listening to stories being told aloud
A test subject is fitted with an fNIRS (functional near-infrared spectroscopy) headband. Pat Greenhouse/The Boston Globe via Getty Images

Awkward miscommunications occur almost daily and, until now, you could never really know if the glazed look in someone's eye was due to your poor storytelling skills or the fact they simply had no idea what you were talking about.

To solve this social dilemma, researchers from Drexel University, in conjunction with Princeton University, designed a brain-scanning headband capable of monitoring our brains as we interact with others. The band, technically a "functional near-infrared spectroscopy" (fNIRS) system, uses light to measure the oxygenation and deoxygenation of blood in both the speaker's and the listener's brains.


The scientists were particularly interested in the prefrontal cortex, the region responsible for high-level functions such as memory, attention and problem-solving. When a person is paying attention or learning a new skill, neural activity increases in this region, evidenced by a higher percentage of oxygenation in the blood.

Hasan Ayaz, PhD, an associate research professor in Drexel's School of Biomedical Engineering, led the research team. He describes the project as a means of measuring how we communicate with each other in everyday situations.

“We live in a social world where everybody is interacting. And now we have a tool that can give us richer information about the brain during everyday tasks,” he says.

The research builds on that of Uri Hasson, an associate professor at Princeton, who has used functional magnetic resonance imaging (fMRI) to study the brain mechanisms involved in the comprehension of language. Hasson's original work used an fMRI scanner to determine that a listener's brain activity mirrors the speaker's when he or she tells a story about a real-life experience. From those initial results, he noted that stronger coupling between the two brains' activity often indicates better understanding.

However, the restrictions of MRI limited this earlier research. An MRI scanner is an artificial environment, enclosed and full of mechanical noise, so the findings could not be fully applied to natural interactions. Replacing fMRI with fNIRS was the vital leap from theory to real-world application.

During tests, a native English speaker and two native Turkish speakers each told an unrehearsed, real-life story in their native language while their brains were scanned using fNIRS. A group of 15 English speakers then listened to the recordings, as well as a recording taken at a live storytelling event.

From these results, the interdisciplinary team found that a listener's brain activity correlated only when listening to a story they understood, syncing to match the activity of the speaker. The technology could open up a wide range of opportunities in education and everyday interaction, measuring how we communicate and perhaps helping to improve our understanding of one another.
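The kind of speaker–listener coupling described above is commonly quantified as a correlation between the two brains' oxygenation time series. As a rough illustration only (the study's actual analysis pipeline is not detailed here), the sketch below computes a Pearson correlation between a speaker's signal and two hypothetical listeners' signals on synthetic data; all names and signals are invented for the example.

```python
import numpy as np

def coupling_score(speaker_hbo, listener_hbo):
    """Pearson correlation between two oxygenated-haemoglobin time series.

    A value near 1 means the listener's signal closely tracks the
    speaker's; a value near 0 means the two are unrelated.
    """
    speaker = np.asarray(speaker_hbo, dtype=float)
    listener = np.asarray(listener_hbo, dtype=float)
    return float(np.corrcoef(speaker, listener)[0, 1])

# Synthetic signals: 60 s sampled at 10 Hz. The "understood" listener
# tracks the speaker with added noise; the "confused" listener's signal
# is pure noise, unrelated to the speaker.
rng = np.random.default_rng(0)
t = np.linspace(0, 60, 600)
speaker = np.sin(0.5 * t) + 0.1 * rng.standard_normal(t.size)
listener_understood = speaker + 0.3 * rng.standard_normal(t.size)
listener_confused = rng.standard_normal(t.size)

print(coupling_score(speaker, listener_understood))  # high coupling
print(coupling_score(speaker, listener_confused))    # near zero
```

With enough samples, a listener who understands the story produces a markedly higher score than one who does not, which is the pattern the researchers report.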

The research is published in the journal Scientific Reports.

This article was originally published by WIRED UK