If You’re Human, You’re a Slow Learner #change11

Sometimes the Web can make a beautiful, serendipitous nexus. Whilst pursuing two seemingly separate lines of thought in two seemingly separate universes (integral philosophy on Beams and Struts and education theory on the Change MOOC) I discovered a connection that makes me a little less schizophrenic and a little more dialectic.

Here’s my little self-absorbed tale of discovery: Jeremy Johnson commented on my Beams and Struts article (“The Singularity is Near-Sighted”) and recommended William Irwin Thompson’s wonderfully-titled “The Borg or Borges?” Here Thompson revisits one of his key concepts from Coming Into Being, that consciousness is a delay-space where different inputs from the senses are cross-referenced and their interactions stabilized, giving rise to a unique emergent self-awareness. Time is sort of slowed-down so that some of its components can get to know each other, exchange echoes, and establish a perspective.

In other words, human consciousness is the result of slowing down.

As Thompson so eloquently puts it:

Fast is fine for the programmed crystalline world of no surprises and no discoveries, but slow is better for the creative world of erotic and intellectual play.

This fits nicely with Clark Quinn’s Week 13 presentation on Slow Learning. Quinn writes in his opening blog post:

Really, I’m looking to start matching our technology more closely to our brains. Taking a page from the slow movement (e.g. slow X, where X = food, sex, travel, …), I’m talking about slow learning, where we start distributing our learning in ways that match the ways in which our brains work: meaningfulness, activation and reactivation, not separate but wrapped around our lives, etc.

Slow is the way to go. We’ve gotten so used to outsourcing our cognition to machines, to opening multiple tabs, and craving faster connection speeds that we’ve overlooked the exquisite work of evolution. Some see the brain as a vehicle for rapid computation. Perhaps that steam pouring out of our ears isn’t mere by-product. Maybe we’ll slow down and see it’s really the driving spirit, and we’ve been blowing it off and letting it dissipate as waste. Not the ghost in the machine, but the ghostly machine.

Forget machine. Forget ghost. We could call it, to paraphrase Yeats, a sustained glimpse out of Spiritus Mundi. Or it could simply be the dance of complexity teaching its steps to the dancer, inviting improvisation for the first time.

Thompson says it best, in conjunction with John Keats:

The field of consciousness has more to do with slowness and a higher dimensionality, even beyond the three of the physical volume of the brain, in which hyperspheres— or some other higher dimensional topology — involve simultaneity in a neuronal synchrony — in a pattern. A mind, in the opening words of Keats’s ‘Ode on a Grecian Urn’, is a ‘still unravished bride of quietness’, a ‘foster-child of silence and slow time’.

And now for the ironic part: I have made this connection between the cultural historian and mystically-minded Thompson and learning technology strategist Clark Quinn because of the internet, because I was taking on more than one field of study at once, and because of Twitter, blogs, and .pdf files.

In other words, I’m writing about slowing down because I’ve been living fast.

If there is a lesson here, it’s that we need a new term and a new understanding for how a person can live and think and create in relation to technology without having to adopt one of the two polarities of Luddite or Techie. If you’ve read my Beams and Struts article you know I’m skeptical of The Singularity. Still, our lives are interconnected with technology, and likely made better because of it. It’s a matter of how one stands in relation to technology. Is it a tool, or are you?

The writings of both Thompson and Quinn suggest giving precedence (and prescience) to human consciousness over its hyper technological extensions.

How Do We Prepare Students for a Real World that Doesn’t Yet Exist? #change11

I’ve previously written about my annoyance with the term “real world.” It really irks me like few things do. Perhaps I’m sensitive to being accused of not living there, despite the fact that each year I try to teach over 200 students how to write more effectively. Given that most work environments value good communication skills, and that most workers with college degrees spend 20% of their time writing, I think my job qualifies as “real world.”

Then why am I resistant to Jan Herrington’s notion of assigning “realistic tasks in academic settings”? (illustrated here in an excellent YouTube video of her matrix). After all, I’m currently finishing up a class, Writing for Business, where I’ve assigned a number of tasks (memos, info-graphics, instructional manuals, etc.) that fit her Lower Left quadrant.

Here’s my problem: in this rapidly changing world, “realistic tasks” quickly become fantastical (or, even worse, nostalgic).

A 2009 Fastweb article called “Ten Majors that Didn’t Exist Ten Years Ago” makes a good case that if we focus on the present, we’ll lose the future. Given that most biology textbooks are outdated in two years, and that the hot jobs five years from now might not exist today, does today’s “real world” have value for tomorrow’s life-long learner?

Here are a few things that are more crucial than real-world application (which might be a mere transitory illusion anyway) and should be emphasized as the foundation of any education (not an exhaustive list, just a doodle of sorts):

Critical-thinking

Creativity

Big-Picture thinking

Synthesis

Innovation

Research

Reflection

Adult psychological development

Emotional Intelligence

Discipline

Organization

Philosophy

Sure, any one of these could be utilized in real-world tasks, applied to a job, and be made to fit a specific major or profession. That’s fine. But, since most people change majors, jobs, careers, and callings, and since we don’t know what the job market will be looking for in the future, I think it makes more sense to focus on these foundational, underlying qualities that can be applied to anything, and that, in my opinion, signify what it means to be educated.

In Praise of Inauthentic Learning #change11

There is nothing authentic about me. Not that I can tell.

I climbed into a Bavarian stove once for an afternoon, and when I came out, wrote down the following:

“I stink, therefore I’ll bathe.”

It was my only insight, though one I share with James Baldwin’s Sonny, who smelled his own humanity in prison. He, at least, had the jazz piano to help orchestrate his riffs and runs into something unified.

I never found the author of my self. I never found my self. I’m glad I didn’t because the most inauthentic people I know are authentic.

After all my calculation, there was no sum, ergo, only a loose bundle of cogito held together by the thread of necessity. And since I didn’t have anywhere else to go that day, I untied the knot.

This is an inauthentic story. It is also the story of my inauthenticity.

Any so-called authentic learning must trade the reality of inauthenticity for the illusion of authenticity.

But what a useful illusion it is! I can’t help but agree with Jan Herrington on the appeal of this model. She writes:

Authentic learning is appealing as a pedagogical approach on at least four counts:

1. It situates knowledge in realistic contexts, thereby contextualising knowledge, and making it less likely to remain ‘inert’ when needed to solve problems;
2. Realistic tasks cognitively challenge learners to solve problems and think in the same ways as professionals working in real world contexts;
3. Technology-based cognitive tools can be used both in the processes and products of learning;
4. Complex tasks require the creation of real products and artefacts, and are more worthy of the investment of time and effort in higher education than decontextualised exercises and tasks.

The creation of genuine sharable products ensures that authentic learning is in a position to capitalize on the participatory culture afforded by social media.

In this age of complexity, where there seems to be a deficit of innovative problem-solving, why not encourage our students to get to work applying the latest tools to the most pressing problems? Besides, it’s fun to solve problems, and we learn by doing.

And yet, in this call for more realistic contexts, I sense a major problem: almost none of our problems are realistic. How do I know this? Because we can barely imagine the solutions. In fact, many existing solutions were unimaginable just a decade ago. Or, at least, they first came into focus through imagination. They had to, since they didn’t exist in the material world at the time. Imagination, not reality, is the birthplace of the new.

There’s a reason that IBM, that great 20th Century institution of the Knowledge Worker, has concluded that the 21st Century will be won by the Creative Class. In fact, in their 2010 study, 1,500 CEOs confirmed this, calling creativity “the most crucial factor for future success.” Creativity can certainly be realistic (or else it would have no value to companies like IBM). However, the root practice of creativity is about as inauthentic as you can get. It’s Steve Jobs dropping acid and taking a calligraphy course (Hey, Steve, why don’t you do something more realistic with your life!). It’s Leonardo da Vinci wasting years designing a helicopter. Charles Darwin sailing the world and gazing into coral reefs. (What pressing real world problem could have justified Darwin’s adventure?) It’s Tim Berners-Lee doodling with HTML on the side. It’s Homer (or whoever) setting Greek history to hexameters. And so on.

All of these creations have authentic (and monetary) real world effects, but the creative processes that inspired them were of a different sort.

They were, to quote Herrington, “decontextualized exercises,” and I think we need more, not less, of them.

The Singularity is Near-Sighted: Joseph Campbell’s Vision for the Internet Age

Here’s my article on Beams and Struts. Please read and enjoy.

If Joseph Campbell made a MOOC #change11

I’ve written previously about MOOC (Massive Open Online Course) design and how it shouldn’t follow the traditional academic structure, not given the potential for a more chaotic, connectivist, or rhizomatic learning style.

However, we can’t let our shoots and tubers grow wild forever. Spring is only one-quarter of the year.

Let’s keep the chaos, sure, but we can’t ignore form. Look at the world around us. Order arose from chaos, and certainly back into chaos it will go.

I think Joseph Campbell’s Hero’s Journey is a better template for course design. It still allows us to get lost, to struggle, to stumble in the darkness, but we can still test our progress against a very real and powerful psychological superstructure.

Narratives like Star Wars, Avatar, The Matrix, Harry Potter, and Lord of the Rings (all frequently read through the lens of Campbell’s Monomyth, and in some cases written with it in mind) have been so successful in our culture because they resonate with our eternal struggle for personal meaning and transformation, ideals emphasized too little in education.

In my Beams and Struts article “The Singularity is Near-Sighted: Joseph Campbell’s Vision for the Internet Age,” I argue that we’re getting lost in the stream of tweets, blog posts, and status updates. We’re too interested in making data-based connections at the expense of meaningful connections. We’re making a mess without making anything stick. We’re also clamoring as a nation to get back on track, and we know the status quo won’t do.

If education is all about growth, about becoming more critical, creative, flexible, and expansive thinkers, there’s hardly a better model than Campbell’s Hero’s Journey. You forge your own path, make your own map, and look for guidance from the wisest and most helpful people you can find, all in pursuit of personal transformation and with the assumption that once you find it, you’ll come back home and make things right in your village.

It doesn’t need to be implemented directly in the design of a MOOC. Any traveler can approach the course this way, regardless of its design. Some courses punish this kind of personal questing, especially traditional models that have strict expectations reflecting only the instructor’s pedagogy or some abstract set of university objectives.

This isn’t really how things work. We make our meaning, forge our own connections, and get out what we put in.


Hey, Brain, Let’s Slow Down and Get to Know Each Other #change11

Clark Quinn’s overview of Slow Learning is a lot more subversive than it appears, but I don’t mean subversive to traditional pedagogy. We tend to associate the word “subversive” with 1960s radicalism, or as some kind of challenge to the puritan ethics of work and prudence. I sense, however, in Slow Learning a different kind of subversion, one that is far more traditional than most learning theories that get trumpeted on MOOCs. The Slow Food movement is a good example of how tradition has been used to combat excesses of modernity, in particular the unhealthy, pre-packaged fast foods we consume. Slow Food is subversive by being retrograde. It is a conservative movement (though not necessarily “conservative” in the sense of American politics).

Any Slow Learning movement must inevitably take on the excesses of the Internet Age. It’s a movement that would inevitably look to figures like Jaron Lanier and Nicholas Carr who have dared to point out certain areas of exposed skin on the emperor. This article by Meris Stansbury addresses some of Carr’s complaints, mainly that we’re ignoring the negative effects of the internet on our brains and our quality of thinking:

As we use technologies like smart phones and the internet, our brains are changing as well, and Carr argued that although we acquire skills, such as increased visual-spatial intelligence (being aware of many moving parts at once), we also weaken our “mindful knowledge acquisition,” inductive analysis, critical thinking, imagination, and reflection.

“Our brain is becoming saturated with information, and it’s becoming harder for us to hold onto meaningful information, if we can even pick out what’s meaningful anymore,” he explained.

The need to acquire many bits of information is nothing new, Carr said. Supposedly, early man needed to gather as much information as quickly as possible just to survive.

The brain releases dopamine, a chemical associated with pleasure, every time we receive new information, said Carr. The printed page and reading eventually changed that, but now we’re reverting to old times all over again.

For example, Carr quoted a recent study showing that for an average adult, time devoted to looking at screens per day averaged 8.5 hours, whereas time devoted to reading from pages per day averaged 20 minutes.

“This is a problem, because our brain has a ‘bottleneck’ when we go back to these old habits, meaning that our working memory has a ‘cognitive overload’—which can negatively affect our long-term memory and our ability to evaluate information and distinguish what’s useful from what’s just trivia,” he said.

For a while now, I’ve thought that schools that ban cell phones and restrict computer use were living in the past and providing a disservice to their students, who need to be trained on the forward path, and fast. Now I’m wondering what all of this fast-forwarding is doing to our students. Right now, it seems subversive to use Twitter in the classroom and to click your way through pages of research. Soon, this will become the status quo, and we’ll be complaining of the negative effects of too much internet use on education. We’ll start to sound like those “backwards cranks” who are still teaching in the Industrial Age.

I agree, the Industrial Age model is over. But I think certain elements of the past will come back. I envision a new model that, from the outside, looks a lot like the one-room schoolhouse in the country. It will become subversive to sit in silence and read for two hours straight from a real book, or maybe an e-book with no internet access. It will be subversive to sit outside under a tree with your devices out of reach, to go for a walk and reflect on what you’re learning. It will be subversive to think deeply and make a connection with only yourself.

Of course, students will use computers and the internet, just a lot less often, with less frivolity, and after contemplation. They’ll use them to make their lives simple and efficient, not cluttered and wired. “Wired” is a terrible word to describe the ideal consciousness. It suggests a hyper, frenetic brain geared up and ready to do too much at once. In time, we’ll develop better metaphors. I don’t think “slow” is ideal, but “balance” is a good place to start.

Ditch the GPS; Get Lost Instead #change11

In Cognitive Surplus, Clay Shirky points out that major changes to society often happen so quickly they don’t leave time for anyone to adjust. This results in chaos that traditional solutions can’t fix. In fact, tradition has been displaced. There is no plan for going forward, but no way to go back. Shirky’s main example is the beginning of the Industrial Era in London, when a rapid influx of people into the city created social chaos, new opportunities for leisure, and mass drunkenness. Eventually, people began to use their surplus time to organize, become educated, and develop civic infrastructure. Ergo, we have modern democracy. (Sort of.)

We are entering an era of chaos. The traditional approaches to education are being challenged by rapid changes in technology and economic pressures. And if the old model falls, we’re simply not ready to replace it. We don’t have any good ideas. Or…we have a lot of good ideas but no big picture.

In his opening post for Week 13 of the #change11 MOOC, Clark Quinn addresses this problem:

I’m really arguing for the need to come up with a broader perspective on learning. I’ve been calling it learning experience design, but really it’s more. It’s a combination of performance support and learning (and it’s badly in need of some branding help). The notion is a sort-of personal GPS for your knowledge work. It knows where you want to go (since you told it), and it knows where you are geographically and semantically (via GPS and your calendar), and as it recognizes the context it can provide not only support in the moment, but layers on learning along the way. And I think that we don’t know really how to look at things this way yet; we don’t have design models (to think about the experience conceptually), we don’t have design processes (to go from goal to solution), and we don’t have tools (to deliver this integrated experience). Yet the limits are not technological; we have the ability to build the systems if we can conceptualize the needed framework.

[….] There’s lots more: addressing the epistemology of learners, mobile technologies, meta-learning & 21st C skills, and deep analytics and semantic systems, to name a few, but I think we need to start with the right conceptions.

Quinn suggests we think about “slow learning” as a way to make the reality of how our brains work match the pace and functional aspects of education design.

This sounds great. What I don’t like is GPS as a metaphor. The problem is that the GPS simultaneously knows too much and too little. Have you ever watched someone follow their GPS around and around the block, expecting the little robot to do all the work? Chances are, if the driver would just look up and read a few street signs or use common sense, he could save 15 minutes of wandering.

We need to get lost. We don’t need our locations constantly re-calibrated. Learning often means getting lost in the woods and finding your way out. It doesn’t mean having a controlling voice talking you through everything, measuring, assessing, re-assessing. That’s one of the big problems with education now. Let’s not replace a human program with a digital one. You can’t hear yourself think with such a dominant narrator.

A better metaphor, I think, is the Hero’s Journey of Joseph Campbell. Most of those myths would have been utterly destroyed with the careful directions and constant updates of a GPS. Heck, even Luke Skywalker, in his world of advanced technical wizardry, needed to close the blast shield and listen to his own voice.

It might be trite, but it might be true: do we need to focus more on the journey, less on the destination? (Destinations are good, too). The GPS won’t shut up until the correct result is reached. We become dependent. We need her every time we go somewhere. We never learn to shut her off and read the landscape directly.

Classroom as Shire; MOOC as Middle Earth #change11

In his final Week 11 post (sorry, I’m navigating the Change MOOC a tad asynchronously) Jon Dron reflects on the challenges of the MOOC (Massive Open Online Course) structure:

I’m not the first to observe that a big problem with connectivist-influenced MOOCs like this is that they are, well, chaotic and lacking in centre. People are contributing all over the place in a hundred different ways and certainly not in an orderly fashion. This is not your grandmother’s kind of course and that would be fine, apart from the fact that such a small percentage of people wind up getting fully engaged and so many drop out, one of the main reasons being the complexity of following and keeping up with the course. If we had drop-out rates of this magnitude in our universities there would be some very serious questions asked.

He then goes on to say we can’t apply the same standards since MOOCs are not meant to be formal and, in fact, need to be chaotic and freewheeling in order to make possible the kind of learning often lacking in the traditional models.

I like to use a metaphor from The Lord of the Rings: the traditional classroom is the shire. The MOOC is Middle Earth at large. It’s important to have a home base, a tradition, and a safe place to return to. But sometimes in order to grow we must leave the familiar confines and learn to navigate through unfamiliar land. This poses all sorts of dangers. However, if the goal is to learn more about ourselves and our world, we can’t stay stuck in one spot on the map.

Here’s the key: Frodo doesn’t go alone. He travels with a small band of helpers, friends, and skilled warriors. If we can take a small, structured classroom into a MOOC (instead of tossing people into Mordor on their own) we can have the best of both worlds.

A traditional course of 15-30 people should stick together while exploring a massive gathering of people and information. Someone will always have your back, and, of course, you’ll have your very own Wise Old Man or Wise Old Woman swooping in with white light.


Children Raised by Wolves are the Only True Self-Taught Learners

In Stumbling On Happiness, Daniel Gilbert says that all psychologists are required, at some point in their careers, to write a sentence that begins with “The human being is the only animal that…” According to Gilbert, the answer is “imagines the future.” My personal choice would be “wears socks with sandals,” but I haven’t done the field research to back that up.

A recent Discover Magazine article takes a crack at this and comes up with the following answer: humans are the only animals who teach.

At first, this seems like an overreach. Anyone who has watched chimpanzees interact has seen acts of imitation. It seems clear that other primates teach, and, just like humans, often do it for peanuts.

Discover Magazine is ready for this objection:

I know this may come as a surprise, but it does so because we tend to mix up teaching and learning. A young chimpanzee can learn how to smash nuts on a rock by watching an older chimpanzee in action. And when she grows up, her own children can learn by watching her. But in these situations, the students are on their own. They have to watch an action and try to tease apart the underlying rules.

I think we’re about to head down a rabbit hole having to do with “intention” and the depth of conscious awareness in primates. Ultimately, this would be a pointless debate. The question is, “Is our children learning?” No, sorry, old joke.

The question is, “Has Learning Occurred?” We aren’t necessarily going to know how it happened, who was responsible for it, or what the exact ingredients of the educational cocktail were. Therefore, it doesn’t matter whether a child learns to tie his shoes because he was taught or because he imitated an adult. In fact, these distinctions are merely linguistic.

This discussion makes me think of the so-called “self-taught” learner. That term is a misnomer. The only self-taught learners we have on record are children raised by wolves or neglected in massive orphanages. Those children have little or no human contact. They must teach themselves. They don’t do so well.

Just like you can’t avoid learning (it’s in our DNA), you can’t avoid teachers. The world is a teacher. Maybe not on purpose, but it is.

Using Twitter on the Half-Dipper Bridge

Based on I.S.T. (Internet Standard Time), this 2010 David Carr article about Twitter is ancient. However, I’ve just come around to David Carr after watching the brilliant Page One, so forgive me for coming late to the party.

Carr, who was initially a Twitter skeptic, has come to find great value in the micro-blogging software:

At first, Twitter can be overwhelming, but think of it as a river of data rushing past that I dip a cup into every once in a while. Much of what I need to know is in that cup: if it looks like Apple is going to demo its new tablet, or Amazon sold more Kindles than actual books at Christmas, or the final vote in the Senate gets locked in on health care, I almost always learn about it first on Twitter.

I find this to be true when preparing for class or exploring ideas for research. If you’re following the right people on Twitter, it can be an endless source for material. Carr’s quote also made me think of a passage from Shunryu Suzuki’s Zen Mind, Beginner’s Mind, where he explains the proper way to draw water from a stream:

If you go to Japan and visit Eiheiji monastery, just before you enter you will see a small bridge called Hanshaku-kyo, which means ‘half-dipper bridge’. Whenever Dogen-zenji dipped water from the river, he used only half a dipper, returning the rest to the river again, without throwing it away. That is why we call the bridge Hanshaku-kyo, ‘half-dipper bridge’. It may be difficult to understand why Dogen returned half of the water he dipped to the river. When we feel the beauty of the river, we intuitively do it in Dogen’s way. It is in our nature to do so.

I guess it’s best to avoid drinking too deeply from the stream of information, to let some of the water pass back into motion. Carr warns that Twitter’s power can wash you away:

All those riches do not come at zero cost: If you think e-mail and surfing can make time disappear, wait until you get ahold of Twitter, or more likely, it gets ahold of you. There is always something more interesting on Twitter than whatever you happen to be working on.

All that gurgling can also be misleading. Carr quotes Here Comes Everybody author Clay Shirky, who has long praised the wisdom of crowd-sourcing your problems and allowing the hive-mind to go to work (how’s that for larding up my prose with buzzwords?).

Twitter helps define what is important by what Mr. Shirky has called “algorithmic authority,” meaning that if all kinds of people are pointing at the same thing at the same instant, it must be a pretty big deal.

Maybe. You’ll see “Kim Kardashian” trending on Twitter more frequently than “Eurozone.” Collective intelligence is powerful, but so is collective ignorance. Sometimes the stream of consciousness is just water under the bridge.