When I began writing this column back in 1985, my page could hold up to 1,000 words. Over the years that number has shrunk, first to 800, then 700. As of this month, with Sojourners’ new design, I get a measly 600. But I’m not disgruntled. With fewer words to work with, I’ve had to choose the ones I use more carefully.
Besides, I know very well that in the “new media environment” I’m lucky to still be printed and read at all. Six hundred words is practically War and Peace compared to the 140-character limit of a Tweet.
When this column began, I knew that our media world was changing. CNN, MTV, and home video were already here. So was USA Today with its ultra-short stories and dumbed-down tone. Also, a few years earlier Sojourners had made the transition from typewriters to computers. When I saw my thoughts transmuted into pure light on the screen, I felt in my bones that something really new had arrived. But in 1985 I didn’t know what it was.
Now we know—it was the wired world of instant information. Soon the U.S. military’s digital network was opened to the public, and the inexorable process began that has led us to Facebook and Twitter. It’s an evolutionary (or devolutionary) process that is changing the way we shop and form relationships, and even the way we think.
Since the 1994 launch of the Web, I’ve spent lots of time with roomfuls of 18- to 22-year-olds, so I’ve watched this evolutionary process up close. And I’ve become convinced that the scientists are right: this is not just a cultural evolution (like the shift from oral tradition to print) but a biological one. People’s brains are changing.
UCLA neuroscientist Gary Small, in his book iBrain: Surviving the Technological Alteration of the Modern Mind, describes experiments in which researchers observed the brain activity of frequent and infrequent technology users while they were, alternately, surfing the Web and reading a printed text. Internet use stimulated activity in more areas of the brain, and frequent Internet users had more brain areas active even when reading printed text. Over time, Small says, these changes will become hardwired in our physical brains, just as, according to a Newsweek article about Small’s research, musicians’ brains are enlarged in the area responsible for finger movements. Small sees a big upside to this in our improved ability to process information rapidly and make snap decisions.
I’m sure all this is true, especially the part about frequent technology users having more areas of their brain at work while reading. But in my own unscientific observation, all that diffuse brain activity leads to boredom with, and quick abandonment of, a dreary printed text for something flashing and shiny. The tech-savvy new brains are also remarkably quick to tune out an actual human delivering a complicated explanation of an issue and look down, absorbed in the ever-flowing stream of text-message trivialities on their tiny screens.
I suppose the brain scientists are right that this new cognitive style is not better or worse, just different. But I also know that some things—for instance, why many Iranians don’t like us—won’t fit on the screen of a handheld digital device, much less into 140 characters. Understanding those kinds of complex realities requires sustained concentration and focused mental effort. When even our best-educated citizens can’t do that any more, we will all be in big trouble.
Danny Duncan Collum, a Sojourners contributing writer, teaches writing at Kentucky State University in Frankfort, Kentucky.