Archive for the ‘computers’ Category


October 31, 2012

More good report presentations yesterday in CoPhi, including one from Michael and another from Jon that independently observed something important about how we live now: many of us are so busy crafting and projecting Platonically-ideal social media versions of ourselves that we’re actually lost in cyberspace. Danger, Will Robinson: those “friends” are not reliable, those experiences are not real.

Michael said we’re like Plato’s cave-dwellers, mistaking our own projected “forms” for reality. Jon said real Forms are all around us. Both were really saying, I think, that reality is immediate, embodied, personal, and subjectively experienced. I concur. So would William James, who said “the only form of thing we directly encounter, the only experience we concretely have, is our own personal life.”

“Impersonal experience” is an oxymoron. Virtual experience is better, but still not as direct or immediate or concrete as a walk in the woods or a face-to-face in exterior space. Or a hurricane, lest we forget that reality is not always more pleasant. But it is always more honest. More real. As a very old philosophy primer puts it:

If we ask the plain man, What is the real external world? the first answer that seems to present itself to his mind is this: Whatever we can see, hear, touch, taste, or smell…

So, I vote for the “plain” empiricists, as opposed to the flighty and speculative rationalists… for Aristotle over Plato, Locke and Hume (but not Berkeley) over Descartes and Leibniz. (But I like Spinoza, determinism aside.) I will continue to tweet and blog, but will also continue to resist full immersion in the second-hand, mediated world of clicks and strokes. Step away from the keyboard.

And now I really must turn to an immediate and concrete encounter with that pile of student essays. I’m sure it’ll be real.

What do you want to do today?

October 7, 2011

We had a good discussion in our James tutorial yesterday about The Sentiment of Rationality, and about the ambivalent “craving for further explanation” that makes philosophers perpetually discontented with every formulation.

As Schopenhauer says, “The uneasiness which keeps the never-resting clock of metaphysics in motion, is the consciousness that the non-existence of this world is just as possible as its existence.”

And that’s unnerving, especially when we draw out the personal implication: my non-existence is just as possible too.

We get that reminder every time a great person dies. Rest peacefully, Steve Jobs. The rest of us are that much closer to gone.

But he was no pessimist. Do you want to do what you’re about to do today? If not, he might just tell you to commence doing something better. The permanent possibility of change is hopeful, and we’re still here.

Internet curfew is a GOOD thing

August 16, 2011

I hate it when illness robs me of the dawn and of pre-dawn slumber, as it did yesterday. Making up for it today, though, with school back in session and alarms about to go off all over the house. The peace and quiet of 5 a.m. is at a premium once again. It’s 64 with no sun in sight.

The enforced discipline of the “mechanical servitor” (Thoreau’s version of an alarm clock, or a rooster) and the school-bell is a good thing, despite the hard transition from summer it imposes. You can’t scoff at the clock at night without paying in the morn.

I’d already decided to commence the new academic year with a self-imposed 9 pm Internet curfew. Then I learned of GOOD’s latest challenge, to unplug at 8 pm. I’m up for it, and not just ’til September either. It feels like a sane response to the information inundation (and idea deficit) people like Neal Gabler have been protesting lately, and it’ll get me to the dawn post on time. Maybe even stimulate some ideas.


January 6, 2011

If I were the sort of person who looks for signs and portents, I’d have been spooked by yesterday’s convergence of two events:

(1) the arrival at the public library of four titles I’d requested & held, and

(2) the malfunction of my shiny new Kindle, to which I’d only just begun to form an obsessive attachment.

The four titles:

  • Cognitive Surplus: Creativity and Generosity in a Connected Age
  • iBrain: surviving the technological alteration of the modern mind
  • The Dumbest Generation: How the Digital Age Stupefies Young Americans and Jeopardizes Our Future
  • Distracted: the erosion of attention and the coming dark age

But I’m not that sort of person, as I told Younger Daughter, who last night was researching a social studies assignment on her family’s religion. “My Dad’s a naturalist,” she wrote, “a person who thinks that what we call magic is just facts we don’t understand.”

Facts we don’t understand yet… I begin to understand, though, that the magic so many of us have been seeking in our various handheld gizmos is unreliable.

Also a fact: printed & bound books are still the most reliable handheld vehicle for text, and probably for connectivity too.

Excuse me, I have a stack of reading to attend to… and a new Kindle book on the iPod. It’s All Things Shining.


September 25, 2010

Where do good ideas– lower case– come from? “Connectivity,” says Steven Johnson. Not an eternal Platonic Idea of Connectivity, but actual episodes of connecting within and between individuals and communities. That’s what Chris Anderson was talking about too. And E.M. Forster. “Only connect.” Build bridges between our passions and our prose.

Johnson’s TED Talk:

Connectivity is also liquidity. Pour that latte, get those ideas, those neuronal networks, together and in sync. Deep thinking really isn’t The Thinker in his solitary slump, it’s the chaotic cacophony of the coffeehouse (or tavern). “Chance favors the connected mind.” That’s when things really begin to flow.


September 22, 2010

First, to follow up Monday’s impromptu discussion: I was wondering if books face a future of figurative immolation, not the literal burning of the Alexandrian library (or the crazy Gainesville pastor) but every bit as terminal. Our large-scale cultural turn to e-reading, away from traditional book authorship and publication, raises questions about the long-term durability of the printed word and, hence, of our ability to transmit any legacy at all to future generations.

John Updike had important thoughts about the future of books, late in his life. He disputed Kevin Kelly’s rosy vision of a future of literary mash-ups and “snippets” unmoored from their thus-marginalized and fungible authors.

Books traditionally have edges: some are rough-cut, some are smooth-cut, and a few, at least at my extravagant publishing house, are even top-stained. In the electronic anthill, where are the edges? The book revolution, which, from the Renaissance on, taught men and women to cherish and cultivate their individuality, threatens to end in a sparkling cloud of snippets.

Updike elaborated his concerns in this speech, released as a podcast.

Kevin Kelly, you may then think, is some kind of radical firebrand. But he doesn’t come across that way in our Clock of the Long Now reading today. The most sensible statement in today’s text, though, is Hillis’s response to Kelly’s report of the “complexity scientists” and their mocking of Long Now’s ambitions:

Believing in the future is not the same as believing you can predict or determine it. The Long Now Foundation is not about determining the destiny of our descendants, it is about leaving them with a chance to determine a destiny of their own.

(That’s exactly the point Harrison was making on Monday, right?)

Also, in Sunday’s Times Magazine special issue on the future of technology in education, Kelly’s conservative framing of computing as a tool we may pick up and put down at will is measured and reassuring. He quotes his previously home-schooled son, about to enter high school:

“I’m learning how to learn, but I can’t wait till next year when I have some real good teachers — better than me.”

He had learned the most critical thing: how to keep learning. A month ago he entered high school eager to be taught — not facts, or even skills, but a lifelong process that would keep pace with technology’s rapid, ceaseless teaching.

If we listen to technology, and learn to be proficient in its ways, then we’ll be able to harness this most powerful force in the world.

And if we don’t? Not so reassuring. But this seems right enough:

• The proper response to a stupid technology is to make a better one, just as the proper response to a stupid idea is not to outlaw it but to replace it with a better idea.

Jaron Lanier, who– we will read soon– insists that he’s not a gadget (and neither are you), also points out that education does what genes cannot, viz., transfer nongenetic information (“memes”) between generations:

To the degree that education is about the transfer of the known between generations, it can be digitized, analyzed, optimized and bottled or posted on Twitter. To the degree that education is about the self-invention of the human race, the gargantuan process of steering billions of brains into unforeseeable states and configurations in the future, it can continue only if each brain learns to invent itself. And that is beyond computation because it is beyond our comprehension. Learning at its truest is a leap into the unknown.

Leaping can be a good thing, it’s how we get somewhere. But, as Lanier cautions: “Trusting teachers too much also has its perils.” Danger, Will Robinson.

Don’t say I didn’t warn you.

But on the other hand, Will did always trust his Robot. It’s the duplicitous Dr. Smiths you really have to watch out for.

Apple addiction

June 8, 2010

Swung by the Apple store yesterday afternoon to fulfill Younger Daughter’s urgent order for an iPod Touch. She had enough birthday money.

I’m sure she’ll not be in any haste to follow my App suggestions and load up on educational tools, audio and e-books, and other cool mind-expanders along with the standard iTunes candy. Since we got home with it she’s been vastly amused merely by her new ability to manufacture virtual cupcakes on the thing, and to wield a sound-enhanced Star Wars light saber. Oh well.

It was personally very risky, going in there past all those seductive devices on the day of Steve Jobs’ official unveiling of the latest and greatest model iPhone.

I admit, I took sophomoric satisfaction in leaving a scary Times story behind on the screen of the iPad I played with. In it, a woman complains of her husband’s various computer addictions:

“It seems like he can no longer be fully in the moment.”

Scientists say juggling e-mail, phone calls and other incoming information can change how people think and behave. They say our ability to focus is being undermined by bursts of information.

These play to a primitive impulse to respond to immediate opportunities and threats. The stimulation provokes excitement — a dopamine squirt — that researchers say can be addictive. In its absence, people feel bored.

The resulting distractions can have deadly consequences, as when cellphone-wielding drivers and train engineers cause wrecks. And for millions of people these urges can inflict nicks and cuts on creativity and deep thought, interrupting work and family life…

Moderation sounds good, but real addicts can’t find the middle way.

Yet, there’s no going back. I don’t want to go back. I also don’t want to lose the moments of my life.

Is there an App for that?