The article was published in The Atlantic Monthly, which might not be the most mainstream publication, but compared with the Harvard Business Review, where Carr got his start, it’s practically People magazine. It’s a fitting stage for his thesis: that extensive use of online media and search engines is slowly eroding our ability to concentrate and think deeply. You know the feeling: instead of immersing yourself in a good book, you’re itching to check e-mail. To Carr, that restlessness is an early warning sign of what may come.
“Thanks to the ubiquity of text on the Internet, not to mention the popularity of text-messaging on cell phones, we may well be reading more today than we did in the 1970s or 1980s, when television was our medium of choice. But it’s a different kind of reading, and behind it lies a different kind of thinking—perhaps even a new sense of the self,” he writes. “Our ability to interpret text, to make the rich mental connections that form when we read deeply and without distraction, remains largely disengaged.”
The long-term cognitive effects of extensive Internet use are impossible to predict, and apart from a vision of artificially intelligent search engines hard-wired to our brains, Carr barely tries to predict them. He simply makes the point – perhaps his most important one – that in our haste to move to the next Web page, we may be losing our ability to contemplate. Without that kind of sustained mental activity, we will have a harder time dealing with the ambiguities that make life interesting. And although he doesn’t spell it out, that loss could make idea generation that much more difficult. No wonder Intel, Apple and other vendors are now asking employees to devote some time to thinking away from their PC screens.
Contrast Carr’s piece with a recent New Yorker article by Malcolm Gladwell called “In the Air,” which tries to explain why big innovations so often occur to several people at once. He follows the efforts of former Microsoft executive Nathan Myhrvold, who sets up a group brainstorming network, and shows the value that comes from dedicating resources to such activities. His conclusion: that “insight could be orchestrated,” but primarily among scientists, not artists.
“You can’t pool the talents of a dozen Salieris and get Mozart’s Requiem. You can’t put together a committee of really talented art students and get Matisse’s ‘La Danse,’” he writes. “Our persistent inability to come to terms with the existence of multiples is the result of our misplaced desire to impose the paradigm of artistic invention on a world where it doesn’t belong. Shakespeare owned Hamlet because he created him, as none other before or since could. Alexander Graham Bell owned the telephone only because his patent application landed on the examiner’s desk a few hours before Gray’s. The first kind of creation was sui generis; the second could be re-created in a warehouse outside Seattle.”
If the Internet erodes our attention spans, however, it may not matter whether we are scientists or artists. It will become harder to have those eureka moments that are critical to major discoveries and accomplishments. Of course, the rub here is that both Carr’s and Gladwell’s articles are obviously the result of deep contemplation, and run thousands of words longer than most of what’s read online. So far, we are winning the war on distraction.
Carr has said the role of the IT manager will eventually fade away, but for the moment, at least, IT managers are the ones setting up the systems we use for managing information and, indirectly, the collected knowledge of employees. Just like users, they are bombarded with data and decisions to make, and one of those decisions may be whether to provide technology that not only makes workers more productive but also develops their capacity to ruminate and reason. “The Net’s intellectual ethic remains obscure,” Carr says. That of the IT manager cannot afford to be.