If surfing the web became as quick and effortless as thinking a thought, would the web not seem to be another aspect of the mind?
The more exposure we have to certain types of information or experience, the better we become at comprehending them, to the point where we needn’t think about them. That is, we learn. This is true not only of skills and basic knowledge, but also of perception.
We are born barely able to distinguish between two different objects in our field of vision. It is not our eyes that recognize that this chair is blue and sits partially behind another; they simply relay the signal received on their retinas to the brain, where it is then decoded.
Likewise, the ears, while having the physical tools to distinguish direction and separate frequencies, do not tell you that what you hear is a violin. You learn the specific frequencies and unique characteristics that define a violin and a trumpet, just as you learn how colors and shapes represent certain objects and their locations.
Your brain, through increased exposure to these types of information, learns how to interpret the data. Eventually we learn to recognize people, their voices, and the words they say; we understand they’re still the same person after a haircut, or when we haven’t seen them in several years. All this occurs through the brain’s ability to process the masses of information pouring through each of our sense organs, finding patterns and using them to interpret and predict new information.
We come preinstalled with eyes and ears, and they come prewired to certain areas of our brain. For instance, the eyes send a signal that passes through your head to the back of your brain, to the visual cortex. While we know where visual information is likely to end up, we don’t know how it will be processed and how that experience will rewire the brain—an essential process in learning.
New Avenues of Understanding
What if we wanted to start incorporating new types of data?
While we require eyes or ears to learn math, once we have it within the grasp of our minds, it would hardly be considered visual knowledge.
We are capable of learning this new subject, and the ideas and concepts it contains, thanks to an ability to work with different types of information. Eyes and ears are simply mediums through which we obtain data; the brain is where we figure out what that data represents, what it really means.
Consider neuroscientist David Eagleman, who is helping pioneer a field known as sensory substitution. In his experiments he attached a vibrating vest to participants, which traced certain patterns onto the torso to represent some piece of information.
While difficult to begin with, participants were soon able to interpret the patterns with greater ease. The result is a new way to transfer external data into the brain, where it can become knowledge. A device that can “sense” signals we cannot could convert them in real time into signals we can understand, while asking nothing of our ability to read, listen, or look at anything.
Some have installed magnetic implants into their fingertips, allowing them to sense magnetic fields as their fingers are slightly tugged and jostled about.
These are signals that use other senses—touch and proprioception—to portray new types of information. Using electrodes plugged into the brain, perhaps we can even skip the medium, to bypass the senses and send information right to the source.
Cochlear implants already do something similar, as they directly stimulate the cochlear nerve. There have also been retinal implants that allow the blind to see—although the results are crude and low in resolution. While these devices were built to replace familiar senses, they are evidence that we can build technology that directly interacts with our nervous system.
Plugging directly into the brain requires a good deal of knowledge about how the brain works, something we are only beginning to grasp. Conveying information through electrical stimulation of the brain is a complex task, but one the brain might be up to. The structure of the cortex—the outer layer of the brain, where most complex thought and higher functions occur—is largely the same throughout. It is there to decode whatever signal happens to come its way, so why not send it something new?
“Our neocortex is virgin territory when our brain is created. It has the capability of learning and therefore of creating connections between its pattern recognizers, but it gains those connections from experience.”
“We start learning immediately, and as soon as we’ve learned a pattern, we immediately start recognizing it.”
—Ray Kurzweil, How to Create a Mind
Any skill is challenging when we’ve had no experience with it. And any skill can become intuitive and automatic with increased experience. We ride a bike without thinking much about balance once we’ve fallen off enough times. So automatic can behaviors become that we might not even know they’re happening or have any recollection of doing them—forget what you had for breakfast? Can’t remember the drive home?
The brain becomes so familiar with some variety of information that it hardly works to interpret it at all. What’s more, data that becomes familiar and allows for such automaticity need not come from one of our familiar senses, as Eagleman and others are showing. As long as the information safely enters our nervous system, the brain will handle the rest.
Why Do We Fall?
How does the brain figure out how to interpret new data? Trial and error, most likely.
We fall off our bikes, cook bad meals, say the wrong thing, or take a wrong turn. Children poke things, put them in their mouths, break them apart or throw them across the table. When a dog is confused about what it sees, it tilts its head to change its perspective. We use one method until it stops working, until there is an inconsistency, then try to fix it.
When we are proven wrong, we learn. We update our mental models. We see the way we see and hear the way we hear because we’ve learned that it represents reality to a high degree. We reach out and grab things because we can see them in front of us, and understand that it is an object sitting on top of the table, it is movable, there is a liquid inside that I am able to drink by moving the object to my face.
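This update-when-wrong loop looks a lot like error-driven learning in machine learning. A minimal sketch of the delta rule, with a made-up target pattern, data, and learning rate:

```python
# Minimal error-driven learning (delta rule): predict, compare against
# reality, and nudge the "mental model" by the prediction error.
import random

random.seed(0)
w = [0.0, 0.0]   # the learner's model: two weights, initially ignorant
lr = 0.1         # learning rate (invented)

def predict(x):
    return w[0] * x[0] + w[1] * x[1]

# Reality: y = 2*x0 - 1*x1, the pattern the learner must discover.
for _ in range(2000):
    x = [random.uniform(-1, 1), random.uniform(-1, 1)]
    error = (2 * x[0] - 1 * x[1]) - predict(x)   # proven wrong by how much?
    w[0] += lr * error * x[0]                    # update the model
    w[1] += lr * error * x[1]

# After enough mistakes, w approaches [2, -1].
```

Each mistake moves the weights a little toward reality; with no mistakes there is no update, which is exactly the “we learn when we are proven wrong” point.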
But our interpretation of reality is just that. It is not necessarily accurate, but it gets the job done. Because we are seldom wrong about what we perceive, our mental models, by the time we’re adults, aren’t required to change much. But any exposure to the many illusions and visual tricks popularized by magicians will make the flaws in our perceptual interpretations evident.
We don’t know that what we experience as reality is anything like the experience of our neighbor. Perhaps their mind constructed something markedly different yet equally satisfactory. Both, though, are interpretations built around the data that has been fed in over the years.
“Instead of reality being passively recorded by the brain, it is actively constructed by it.”
Why should we not think that the brain could handle a new type of data and subsequently alter how we perceive the world? At first it may simply confuse us, but as we misinterpret it and make mistakes, we’ll come to a point when we start to understand it, what it is and what it represents.
Going Beyond Human
So the big, futuristic, mind-boggling question becomes, when do we start extending upon our current senses and creating entirely new ones?
We’re already doing it to an extent, though much of it is for restorative purposes, such as hearing aids and contact lenses. But these are nothing compared to what could happen.
Imagine upgrading your eyes to see more of the light spectrum; we could see colors never even imagined. How about being able to zoom in like a telescope on the planets in the night sky? Or making your ears sensitive to a greater range of frequencies, opening up a new realm of music? We might install a new nose that rivals a dog’s sense of smell, or intensify our sense of taste. What if we could sense direction or radio waves?
Better yet, if we somehow could connect to the internet, could we “sense” the weather report in Tokyo tomorrow? Could we “feel” a dialogue taking place on social media? This is pure speculation of course, but it’s not far from becoming a real possibility.
In The Brain Electric, Malcolm Gay talks to Kevin Warwick, a cybernetics researcher:
“If you link your brain to a computer brain with different sensory inputs and different mathematical abilities, you’re into this sort of thing where a computer can deal in multidimensional processing […] Instead of thinking as your human brain does in three dimensions, you can start thinking, potentially, in twenty or thirty dimensions. What does that mean? No idea! You’re into a whole different world really.”
From Sensing to Controlling
While our senses allow us to interpret data, we most often do so in order to adjust our behavior accordingly. We learn so that we can do.
And, if the brain can learn to interpret data from new sources, it stands to reason it should also be able to send data to new sources. This would allow us to control any item tasked with reacting to those signals, such as a robot arm or computer cursor.
So what happens when we plug electrodes into a monkey’s brain and link them to a robotic arm? The monkey learns how to use it. This arm need not be attached like our arms are today; in fact, it could be on the other side of the world.
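A toy sketch of the population-vector idea behind such decoders: each simulated neuron fires most for movements in its “preferred” direction, and the intended direction is recovered by summing preferred-direction vectors weighted by firing rate. The cosine tuning curve and all numbers are invented for illustration:

```python
# Toy population-vector decoder: recover an intended movement direction
# from the firing rates of direction-tuned neurons.
import math

N = 32
preferred = [2 * math.pi * i / N for i in range(N)]   # preferred angles

def firing_rates(intended_angle):
    # Cosine tuning: 10 Hz baseline, 8 Hz modulation (invented values).
    return [10 + 8 * math.cos(intended_angle - p) for p in preferred]

def decode(rates):
    # Sum each neuron's preferred-direction vector, weighted by how far
    # its rate sits above baseline.
    x = sum((r - 10) * math.cos(p) for r, p in zip(rates, preferred))
    y = sum((r - 10) * math.sin(p) for r, p in zip(rates, preferred))
    return math.atan2(y, x)

intended = math.radians(60)
decoded = decode(firing_rates(intended))   # recovers ~60 degrees
```

The decoder never needs to know what an “arm” is; it only needs a consistent mapping from neural activity to output, which is what the monkey’s brain learns to exploit.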
An arm is familiar, and while prosthetics are going to be a great solution to many of our ills, this is again an endeavor in restoration. We might also want to extend ourselves into new realms.
While the learning curve may be steeper, consider controlling the car as you sit in it; a drone as it circles the sky; or a full body separate from your own—all with thoughts alone.
As long as there is feedback and a way to route the signals between your brain and the object, your brain is likely to figure out how to do it. Processing power may become an issue at some point, of course, but who’s to say we won’t largely improve upon that?
What if we could scour the web without moving a finger? Will it become possible to think a search instead of type it? Further still, could we Google search an image from our imagination? Exchange thoughts with other connected individuals? Would we even need to use words? Would it be possible to think in concepts and exchange ideas irrespective of language barriers?
“…how about abstract thoughts? Given ample neural access, could we bypass spoken language altogether, doing away with its ambiguities and miscommunications in favor of direct neural exchange?”
If your dog can learn to sit, roll over, and shake hands on certain commands, might you be able to achieve the same thing with thoughts alone, provided you are wired up to each other?
Your dog also expresses himself to you when he barks, licks your face, and gives you the puppy eyes. Might your dog learn how to communicate to you using his own thoughts? Could we end up interpreting the thoughts of other species?
What Becomes of Me?
There’s a chance your self is made up of multiple interacting parts. If we split the brain down the middle, so that the two hemispheres are unable to communicate with each other, the result seems to be two conscious entities existing in one body, controlling opposite sides. We know this from experiments on patients whose brains had to be surgically split, but also from instances in which one half of the brain is anesthetized.
While nobody has definitively located consciousness in the brain, it is clear that the many facets of mind we associate with it are spread out. Attention, memory, abstract thought, daydreaming, speaking: they all require different areas of the brain to work in synchrony.
What happens when we add additional layers to this complexity through the internet? When I do math in my head, I know that “I” am doing the thinking. That is, some part of my brain is performing mathematical procedures and relaying the results to other areas. If there is a calculator app in the cloud, and I can use it with my thoughts alone, quickly and, as I have gained experience, intuitively, will it not be the case that my thoughts have in a real sense escaped my brain? Would it become indistinguishable from what I consider thinking today? Wouldn’t this thought be taking place in some borderland between neurons and computer servers?
Perhaps this would just be the beginning of moving our entire selves across this digital divide. The start of our journey towards digital immortality. Who knows. But wherever we end up, it’s sure to be one hell of a ride.
Check out the rest of the Digital Brain Series here