It stands to reason that if you're spending time in front of the computer, you're not watching TV. At least not with your full attention. Gussy that idea up and you have Cognitive Surplus by the appropriately named Clay Shirky:
He argues that the television sitcom—those comic soap operas that saturated the airwaves for decades—was the alcohol of post-war societies, "absorbing the lion's share of the free time available to the developed world." (The numbers are depressing: even today, Americans sit through a hundred million hours of TV commercials every weekend.) Instead of fretting about the dislocations of the Information Age, we sat on the couch and watched Gilligan's Island.

But now, Shirky says, the reign of television is coming to an end. For the first time in decades, a few select cohorts of those under the age of 30 seem to be watching less TV than their parents. (Shirky doesn't mention that overall television consumption is still rising. According to Nielsen's media tracking survey, the amount of time the average American spent in front of the tube reached 153 hours per month in 2009, the highest level ever recorded.) But if young people aren't watching quite as many mindless sitcoms, and they're not drunk in the streets, then what the hell are they doing?

They're online, prowling the world wide web. Shirky describes this shift in media consumption as a net "cognitive surplus," since our brains are no longer mesmerized by the boob tube. Needless to say, he describes this surplus as a wonderful opportunity, a chance to get back some of the productive social interactions that were lost when we all decided to watch TV alone.

Doing some of my brain-fragmenting surfing the other day, I found a very recent Centers for Disease Control report entitled Youth Risk Behavior Surveillance --- United States, 2009, in which you will learn that

During 2003--2009, a significant linear increase occurred in the percentage of students who used computers 3 or more hours per day (22.1%--24.9%). During 1999--2009, a significant linear decrease occurred in the percentage of students who watched 3 or more hours per day of television (42.8%--32.8%).

And while we're on the subject of what modern information technology does to us, Tyler Cowen took on the New York Times article that is receiving lots of attention:
Heavy multitaskers actually have more trouble focusing and shutting out irrelevant information, scientists say, and they experience more stress.

And scientists are discovering that even after the multitasking ends, fractured thinking and lack of focus persist. In other words, this is also your brain off computers.

“The technology is rewiring our brains,” said Nora Volkow, director of the National Institute on Drug Abuse and one of the world’s leading brain scientists. She and other researchers compare the lure of digital stimulation less to that of drugs and alcohol than to food and sex, which are essential but counterproductive in excess.

Cowen:
I've read the piece and I don't yet see the evidence. There are plenty of studies where the experimenter imposes his or her own version of multitasking on the participants and then sees their performance fall.
I'm simply not convinced or even moved in my priors by these studies. ... To sound intentionally petulant, the only multitasking that works for me is mine, mine, mine! Until I see a study showing that self-chosen multi-tasking programs lower performance, I don't see that the needle has budged.
I do see stronger evidence (as cited) that video games make people more aggressive. I also see overwhelming evidence that the internet gets people to read and write more. The latter is probably a good thing. I also believe the internet leads to less interest in long novels and more interest in non-fiction. I won't judge that one, but it's misleading to cite only the decline of interest in long novels and by the way don't forget Harry Potter, the form is hardly dead.