“Our brains renew themselves throughout life to an extent previously thought not possible.”
― Michael Gazzaniga, neuroscientist
I was just asked by a writer at Forbes about inventions that have “rewired the brain,” especially in reference to our widespread use of Google. I searched around under this term and discovered quite a few usages, too many, I think, because the term has a specific engineering reference, one that overestimates the direct effects of technology on human thinking and behavior.
Steven Johnson’s bright book on the implications of technological innovation, How We Got to Now: Six Innovations That Made the Modern World (2014), traces six major themes: glass, cold, sound, cleanliness, time, and light. Not one of these critical chapters from material history claims to have rewired anyone’s brain. Instead, each domain tells the story of inventions that altered human expectations and behavior worldwide, influencing the state of the art of civilization while also revealing the mechanisms of the exchange of ideas, the creative teams, the timelines of invention, and their applied use in society.
What has been studied with respect to thinking is the use of the internet search engine as a learning channel, but also as a storage device for memory. The operating assumption is that once something has been discovered through Google, the user devotes no effort to memorizing the material, because we know we can always revisit the source to refresh that memory. The same fear developed around the printed word beginning with Gutenberg’s press around 1450: that print would destroy memory, as the written word itself had been predicted to do some 4500 years before. Of course, what actually happened was a proliferation of ideas that fueled the Enlightenment, freed thought from the confines of church doctrine, and opened access to the riches of global knowledge.
In the same vein, artificial intelligence doesn’t mean we will stop using our own brains or the discipline of thinking. AI empowers our thought by amassing millions or billions of bytes into new patterns that inform, in great depth, the way we are able to see the world. Digital forms of information processing don’t make our brains digital; they simply extend our reach and grasp of data far too oversized to be absorbed through the normal senses. The neuroplasticity of our brains, essentially what separates and elevates us from our primate cousins, is custom-made to benefit from the depth and breadth of big data.
In the same way, the invention of reading lenses in monasteries 800 years ago didn’t rewire our ability to see and read ancient Latin manuscripts. It simply revealed the nearsightedness that could then be corrected, first by a sweeping market for spectacles, then by the microscope, telescope, camera, fiberglass, television, and film. The Roman invention of clear glass cleared the way for the scientific revolution, and glasses became a human technological wearable, the first since the invention of clothing.
“Rewiring” is used loosely to refer to the impact of technology on human behavior and culture. The brain is constantly reorganizing itself through neuroplasticity, forming new networks of connections between neurons, which it does all the time with new learning. This is a functional change, like those that occur under the influence of alcohol or depression, which alter the volume of white matter and grey matter. Gaming releases dopamine, which enhances attention and visuospatial skills, and it is addictive, requiring greater and greater activity to produce the same level of reward, the same effect produced by long-term use of pornography. A hard-core percentage of daily online and digital gamers (like day-traders) get regular infusions of dopamine that promote addiction. Meanwhile, the gains in attention, focus, and visuospatial skill bestow serious skill sets that find all kinds of uses in the world of work.
However, this is not really rewiring but adaptation: the brain adjusting to new stimuli or new situations that demand greater efficiency in one part of the brain versus another, which may lose potency as other areas take over. The ratio of white to grey matter in the brain’s makeup is shaped by habit and experience. Our brain seeks out rewards from the world around us, from TV, socializing, chocolate, smoking, sports betting, travel, or playing Tetris, and these unending explorations can produce just as many forms of addiction.
By contrast, rewiring would be a change in structure, in the way the system works, not just in how it adapts to new content. This is a more fundamental level of change. But the key trait of the human brain has always been its adaptability: to new circumstances, to a wide network of social demands, and to the acquisition and integration of new knowledge and the creation of new ways of thinking about both new and old datasets. That adaptability is what neuroplasticity implies.
If there is a single technology that could be said to have effected such a change, it would be the mastery of fire some 200,000 to 400,000 years ago, the game-changing master invention of humankind that eventually led to culture itself through a biological shift. The theory goes that the new ability to cook food over controlled heat made early humans far more efficient, because they could devote less time to hunting and gathering raw foods and to chewing and digesting them. The high protein of meat in greater quantities, rendered quickly digestible by searing with fire, could be ingested and absorbed. Meat-eating helped grow the brain to among the highest ratios of brain to body size in the animal kingdom, allowing the thinking revolution to begin that is the basis of human civilization.