What’s So Good About the 21st Century? #Change11

I’m tired of the 21st Century, and I’m ready to skip ahead to the 22nd. I think we’ll have things figured out by then.

The 21st Century is clearly a Dark Age; human intelligence and culture are in retreat. We’ve traded the life of the mind for the buzz of the hive.

It is also the era of misplaced concreteness. In the philosophy of Alfred North Whitehead, the Fallacy of Misplaced Concreteness occurs whenever someone mistakes an abstract concept for a concrete phenomenon. We seem to think that the internet itself is a living, thinking connective tissue. Kevin Kelly is perhaps the biggest offender with his notion of “The Technium” becoming the 7th Kingdom of Life.

I’m confident that in another 100 years we’ll have put artificial intelligence in its rightful place, as a kind of hollow butler stuffed with our to-do lists, a slightly more realistic-looking Jeeves.

Artificial Intelligence is an oxymoron, since intelligence requires embodiment, demands biology. Computation is not intelligence. The only way to “artificially” replicate human intelligence is cloning. The most effective method of cloning is sexual reproduction. All of this progress, and we’re right back where we started: perfecting our pick-up lines.

It’s time to skip ahead to the 22nd Century when we’ll (hopefully) figure out the right balance between technology and humanity and between civilization and the environment.

Learning actually hasn’t changed that much; its delivery methods have. It still requires a whole mind, not a fractured one. It requires a real mind, not a hallucinated one.

Google Stoopid #Change11

In his Week 15 overview, Howard Rheingold warns that we should not believe internet skeptics, such as Nicholas Carr, “before empirical evidence corroborates them.” Having just finished Carr’s book The Shallows: What the Internet Is Doing to Our Brains, I would invite everyone to check out “Chapter 7: The Juggler’s Brain,” which is filled with solid empirical evidence (though no scientific consensus yet exists) of the negative effects the internet has on cognition.

For instance, Carr reports on UCLA developmental psychologist Patricia Greenfield, who reviewed fifty studies on media’s effects on intelligence and learning, publishing her results in Science. Greenfield claims we have gained improvements in visual-spatial skills at the cost of weaker capacities in “the kind of deep processing that underpins mindful knowledge acquisition, inductive analysis, critical thinking, imagination, and reflection.”

One of Carr’s major arguments is that internet usage is re-wiring our brains in ways that make us better at using the internet, but worse at deeper, reflective thinking. He cites a 2009 Stanford study that compared heavy media multitaskers with relatively light multitaskers. The study found that heavy multitaskers were “much more easily distracted,” “had less control over the contents of their working memory,” and were “less able to maintain their concentration.” While infrequent multitaskers “exhibited relatively strong top-down attentional control,” heavy users were, according to Stanford professor Clifford Nass, “suckers for irrelevancy.”

Carr also reports on multiple studies showing how reading comprehension suffers when readers must navigate through hyperlinks instead of reading in a traditional, linear fashion. The neuroscience suggests that the brain functions that support deep reading must “shut off” when a reader considers whether or not to click on a link. As John Medina writes in Brain Rules, our brains function best while performing one type of task at a time. The simple act of switching from deep reading to “link inspection mode” throws us off our game and wastes valuable brain energy. Multitaskers are always less efficient than monotaskers.

Nicholas Carr’s The Shallows is frightening stuff. It suggests that we have made massive changes to how we read, learn, live, and think. And that we have done so in a short period of time without considering the consequences.