In 2019, we will leverage technology to create new senses

Data streams will soon be fed directly into our brains, enabling us to experience the world anew

You are about to develop some new senses. This idea requires a bit of unpacking. The first thing to appreciate is that the brain is locked in silence and darkness inside the vault of the skull. All it ever has are electrical and chemical signals racing around among its specialised cells – it doesn’t directly see or hear or touch anything. Whether the information coming in represents air-compression waves from a symphony, patterns of light from a snow-covered statue, molecules floating off a fresh apple pie or the pain of a wasp sting, it’s all represented by voltage spikes in brain cells. And to a first approximation, it all looks the same.

But this prompts an as-yet-unanswered question in neuroscience: why does vision feel so different from smell or taste? Why is it that you would never confuse the beauty of a waving pine tree with the taste of feta cheese? Or the feeling of sandpaper on your fingertips with the smell of fresh espresso?

One might imagine this has something to do with the structure of the brain: the parts involved in hearing are different from the parts involved in touch. But on closer examination, this hypothesis weakens. If you go blind, the part of the brain that we used to call the “visual cortex” gets taken over by touch and hearing. Looking at such a rewired brain, it is difficult to insist that there’s anything fundamentally visual about the “visual” cortex after all.

So a different hypothesis has emerged: that the internal subjective experience of a sense – also known as its “qualia” – is determined by the structure of the data itself. In other words, information coming from the two-dimensional sheet of the retina has a different structure from data coming from the one-dimensional signal at the eardrum or from the multidimensional receptor data from the fingertips. As a result, they all feel different.

This suggests that, if we could feed a new data stream directly into the brain – such as data from a mobile robot or the state of your spouse’s microbiome or long-range infrared temperature data – it would give rise to new qualia. It wouldn’t feel like vision or hearing or taste or touch or smell, but like something entirely novel.

It is difficult to imagine what such a new sense would be like. In fact, it is impossible to imagine it. By analogy, try imagining a new colour. It seems like it should be a simple task, but it’s impossible.

But next year, we will be able to experience new senses at first hand by feeding new data streams into the brain. This could be a real-time feed of data from a drone, such as its pitch, yaw, roll, heading and orientation. It could be the activity on a factory floor or a Twitter feed or the stock market. And the result would be that the brain would have a direct perceptual experience of the drone, of manufacturing, of hashtags or of the real-time economic movements of the planet.

This feels like pure fantasy but we are now finally at the point, technologically, where we can put it to the test.

There are two ways we can do this. The first is by implanting electrodes directly into the brain (or, soon enough, by stimulating the brain with small actuators in the bloodstream or with nanorobots inside cells). The second is to get signals to the brain non-invasively. My neuroscience laboratory and my company NeoSensory have together built wearable devices that deliver spatial patterns of vibration to the skin. Imagine wearing a wristband with multiple vibratory motors that stimulate different locations around your wrist to represent a data stream. When we establish a clear mapping between the information and the touch, people come to understand easily how to act on the new data – and this will eventually lead to entirely new qualia.
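
As a rough illustration of the kind of mapping described above, here is a minimal sketch in Python of how a single scalar from a data stream (say, a drone's pitch) might be spread across a ring of wrist motors as a pattern of intensities. The eight-motor layout, the 0–255 intensity scale and the send_pattern call are assumptions made for the example; this is not NeoSensory's actual interface.

```python
# Hypothetical sketch: map a scalar data stream to a spatial vibration
# pattern on an assumed 8-motor wristband. The device call at the end
# (wristband.send_pattern) is illustrative, not a real API.

def to_vibration_pattern(value, lo, hi, n_motors=8):
    """Spread a value in [lo, hi] across n_motors as intensities 0-255.

    The value's position within its range picks a "centre" motor;
    neighbouring motors vibrate more weakly, producing a spatial
    pattern the wearer can learn to read.
    """
    # Normalise the incoming value to 0..1, clamping out-of-range data.
    x = max(0.0, min(1.0, (value - lo) / (hi - lo)))
    centre = x * (n_motors - 1)
    # Intensity falls off linearly with distance from the centre motor.
    return [
        int(255 * max(0.0, 1.0 - abs(i - centre)))
        for i in range(n_motors)
    ]

# Example: encode a drone's pitch angle (-45 to 45 degrees) each frame.
pattern = to_vibration_pattern(12.5, -45, 45)
print(pattern)  # [0, 0, 0, 0, 134, 120, 0, 0]
# wristband.send_pattern(pattern)  # hypothetical device call
```

Richer data streams would need richer encodings, but the principle is the same: a consistent, learnable mapping from the data to spatial patterns of touch.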

Qualia develop over time. They are the brain’s way of summarising large amounts of data. Consider how babies “learn” how to use their ears by clapping their hands together or by babbling something out – and by catching the feedback in their ears. At first, the air compressions are just electrical activity in the brain; eventually they become experienced as sound. Such learning can also be seen with people who are born deaf and are fitted with cochlear implants as adults. At first, the experience of the cochlear implant is not like sound at all. A friend of mine described it as painless electrical shocks inside her head – she had no sense that it had anything to do with sound. But, after about a month, it became “sound”, albeit lousy sound, like a tinny and distorted radio. This is presumably the same process that happened to each of us when we were learning to use our ears. We simply don’t remember it.

If creating new senses proves possible, a striking consequence is that we won’t be able to explain a new sense to anyone who hasn’t experienced it. For example, you have to have experienced purple to know what purple is; no amount of academic description will enable a colour-blind person to understand purpleness. It is similarly futile to try to explain vision to someone born blind. To understand vision requires experiencing vision.

So it will go with the development of new senses. We will have to experience them to understand what they are like; and the only way to do this will be to experience the effect of data streams on our brains. Fortunately, in 2019, we’ll be able to plug in to find out.


David Eagleman is an adjunct professor in the department of Psychiatry & Behavioral Sciences at Stanford University and author of The Brain: The Story of You
