Here at the frontier, the leaves fall like rain. Although my neighbors are all barbarians, and you, you are a thousand miles away, there are still two cups at my table.


Ten thousand flowers in spring, the moon in autumn, a cool breeze in summer, snow in winter. If your mind isn't clouded by unnecessary things, this is the best season of your life.

~ Wu-men ~


Tuesday, May 31, 2011

What We Lose

If you take hold of this, you must let go of that.

Below is an excerpt from an article that appeared in The New York Times on the effects of social media on our lives. The full article may be read here.

The Twitter Trap


Last week my wife and I told our 13-year-old daughter she could join Facebook. Within a few hours she had accumulated 171 friends, and I felt a little as if I had passed my child a pipe of crystal meth.

I don’t mean to be a spoilsport, and I don’t think I’m a Luddite. I edit a newspaper that has embraced new media with creative, prizewinning gusto. I get that the Web reaches and engages a vast, global audience, that it invites participation and facilitates — up to a point — newsgathering. But before we succumb to digital idolatry, we should consider that innovation often comes at a price. And sometimes I wonder if the price is a piece of ourselves.

Joshua Foer’s engrossing best seller “Moonwalking With Einstein” recalls one colossal example of what we trade for progress. Until the 15th century, people were taught to remember vast quantities of information. Feats of memory that would today qualify you as a freak — the ability to recite entire books — were not unheard of.

Then along came the Mark Zuckerberg of his day, Johannes Gutenberg. As we became accustomed to relying on the printed page, the work of remembering gradually fell into disuse. The capacity to remember prodigiously still exists (as Foer proved by training himself to become a national memory champion), but for most of us it stays parked in the garage.

Sometimes the bargain is worthwhile; I would certainly not give up the pleasures of my library for the ability to recite “Middlemarch.” But Foer’s book reminds us that the cognitive advance of our species is not inexorable.

My father, who was trained in engineering at M.I.T. in the slide-rule era, often lamented the way the pocket calculator, for all its convenience, diminished my generation’s math skills. Many of us have discovered that navigating by G.P.S. has undermined our mastery of city streets and perhaps even impaired our innate sense of direction. Typing pretty much killed penmanship. Twitter and YouTube are nibbling away at our attention spans. And what little memory we had not already surrendered to Gutenberg we have relinquished to Google. Why remember what you can look up in seconds?

Robert Bjork, who studies memory and learning at U.C.L.A., has noticed that even very smart students, conversant in the Excel spreadsheet, don’t pick up patterns in data that would be evident if they had not let the program do so much of the work.

“Unless there is some actual problem solving and decision making, very little learning happens,” Bjork e-mailed me. “We are not recording devices.”

Foer read that Apple had hired a leading expert in heads-up display — the transparent dashboards used by pilots. He wonders whether this means that Apple is developing an iPhone that would not require the use of fingers on keyboards. Ultimately, Foer imagines, the commands would come straight from your cerebral cortex. (Apple refused to comment.)

“This is the story of the next half-century,” Foer told me, “as we become effectively cyborgs.”

Basically, we are outsourcing our brains to the cloud. The upside is that this frees a lot of gray matter for important pursuits like FarmVille and “Real Housewives.” But my inner worrywart wonders whether the new technologies overtaking us may be eroding characteristics that are essentially human: our ability to reflect, our pursuit of meaning, genuine empathy, a sense of community connected by something deeper than snark or political affinity.

The most obvious drawback of social media is that they are aggressive distractions. Unlike the virtual fireplace or that nesting pair of red-tailed hawks we have been live-streaming on nytimes.com, Twitter is not just an ambient presence. It demands attention and response. It is the enemy of contemplation.

Every time my TweetDeck shoots a new tweet to my desktop, I experience a little dopamine spritz that takes me away from . . . from . . . wait, what was I saying?

4 comments:

walt said...

Nice photo. We used to have balloons like that land next to our business down in NorCal.

"But my inner worrywort wonders whether the new technologies overtaking us may be eroding characteristics that are essentially human..."

Ya think?
I'd rather turn your title around, and ask, "What do I want to keep?" Part of our e-r-r-o-r is to take things that come our way all on the same level, i.e., "just the next thing" ... so, what's the harm? But not everything is resilient enough to survive the encounters.

Take 'attention,' for instance, or what we generally call 'awareness.' To the degree that I indulge "myself" in technology -- in spite of its wonderful utility! -- I generally find myself more scattered and dispersed, not less.
For the things that interest me, that's not a *plus*.

Others have noticed this, as well. In the 4th century, Ge Hong wrote about "guarding the One":
Preserve the One, guard Truth
And you communicate with the whole Universe.
The One is not hard to know,
The difficulty is persistence.
Guard the One with no distraction
And you are eternally vital.

-- from the Book of the Master Who Embraces Simplicity

Rick Matz said...

I bet there's a twelve step program online. I'll google it ...

Georg said...

Did you find it? Please share. (Love your blog. I always read all your articles without distraction.)

best
Georg

Rick Matz said...

Dag nab it. I lost track of what I was doing when I saw some interesting looking links ...