Posts Tagged ‘Douglas Adams’

Day 1!

August 27, 2012

Opening Day is here: Happy New Year!

First day of class means a fresh start, a blank slate, a chance to sow “fresh seed” into our discussions. We’re like birds fluttering into a lighted hall to roost briefly before flying back out into the darkness.

We’re all whales wondering what’s happening as we whoosh towards that large unnamed expanse below.

But this is crucial: we’re birds of a feather, a plurality of plummeting whales, a surfeit of seed-sowers. We don’t have to wonder wordily in solitude; we can talk about our thoughts and experiences and the transient objects of our world.

We won’t always see eye-to-eye in philosophy class, but our arguments won’t just be exercises in mutual contradiction either. Though of course they can be.

In any event, it should all be eye-opening. “We are going to die, and that makes us the lucky ones…”

Ready, set…

Oh, wait. Those remarks are tailored to the Intro/CoPhilosophy course. I think they can readily be adapted to Environmental Ethics and Activism too, since collaboration very often does lead to ethically-rooted action in pursuit of shared goals like, say, sustainable ecosystems. I’ll talk about that a bit on the radio this afternoon.

Here’s where we begin in EEA:

Go!

The POV gun glimmers & twinkles too.

April 16, 2011

“It would be an awful universe if everything could be converted into words, words, words.” Those were William James’s own words.

Philosophy lives in words, but truth and fact well up into our lives in ways that exceed verbal formulation. There is in the living act of perception always something that glimmers and twinkles and will not be caught, and for which reflection comes too late. No one knows this as well as the philosopher. He must fire his volley of new vocables out of his conceptual shotgun, for his profession condemns him to this industry; but he secretly knows the hollowness and irrelevancy.

A “dumb region of the heart” may well be, as James said, “our deepest organ of communication with the nature of things.”

Ludwig Wittgenstein agreed: there’s much we ought to shut up about. Or at least restrict ourselves to pointing at. Show, don’t say. Stop wasting time trying to eff the ineffable.

Russell Goodman has written at length about the James-Wittgenstein connection, and shown that the younger philosopher held his elder in much greater regard than is commonly assumed. Wittgenstein liked James’s “nuanced and broad-minded” vision, and confessed to Bertrand Russell:

Whenever I have time now I read James’s Varieties of Religious Experience [and Principles of Psychology], it does me a lot of good.

I’ll bet Witty (as one of my clever former students dubbed him) would have benefited as well from occasionally swapping James’s conceptual shotgun for Douglas Adams’ Point of View gun. Most guys would. “Give me that thing.”

self-termination

February 7, 2011

We’ve created a culture of self-termination, says Daniel Wildcat in Red Alert! at the beginning of chapter three, because we don’t value bio-diversity. We don’t understand the extent of our dependence on non-human species. “Who the hell cares about the snail darter?” Or even, come to that, about polar bears, orangutans, sea turtles, woodpeckers? Maybe we can keep a few in zoos, for our amusement, but we don’t need them. We don’t need to be overly concerned with endangered hotspots. Do we?


There’s lots more on E.O. Wilson and biodiversity at TED, including a wonderful talk from the late Douglas Adams. Thanks for all the fish, indeed.

We think we need our gadgets and the networks they navigate, but Oscar Kawagley (who prayed to “the spiritual person of the universe” at book’s beginning) says he’s a “technological dunce and proud of it.” Should we emulate him? Surely not.

But are our computer models doing anything to help us roll back the ominous projections of melting ice caps and rising seas? Seems like maybe they are, actually.

Red Alert! is a consistent repudiation of attempts to dominate, control, subdue, or otherwise manage nature. I understand the sentiment, just as I understand the impulse to regard it all as sacred. But I’m still not convinced that this entails so stern a “hands off.” What’s the point of evolving a capacity for intelligence if you’re unwilling to use it?

Thinking “like a mountain,” slowly and with an eye on what Stewart Brand and his friends call the Long Now: will that help us think concretely and constructively about the future of life?

“Changes in our everyday mundane life activities”– with light bulbs, modes of transport, patterns of consumption generally– sound to some like too little, too late. But what else can it mean to “become the change” to which you aspire?

Making technology more attentive “to the life around us” sounds so smart and obvious, until you try to give the idea specific content. Are we really talking about technology at all? Isn’t it people who must be more attentive? The technology’s fine, in fact. We just have to learn how and when to lay it down and go out for a walk.

Aldo Leopold‘s “biotic community” included land, air, water, and all the forms of life attached thereto.  His “land ethic” is clear and hard to improve on:

A thing is right when it tends to preserve the integrity, stability, and beauty of the biotic community…

The land ethic simply enlarges the boundaries of the community to include soils, waters, plants, and animals, or collectively: the land.

This sounds simple: do we not already sing our love for and obligation to the land of the free and the home of the brave? Yes, but just what and whom do we love? Certainly not the soil, which we are sending helter-skelter downriver. Certainly not the waters, which we assume have no function except to turn turbines, float barges, and carry off sewage. Certainly not the plants, of which we exterminate whole communities without batting an eye. Certainly not the animals, of which we have already extirpated many of the largest and most beautiful species. A land ethic of course cannot prevent the alteration, management, and use of these ‘resources,’ but it does affirm their right to continued existence, and, at least in spots, their continued existence in a natural state. (Sand County Almanac) [more from Leopold]


Our form of life now includes Internet cafes, shopping malls, and gambling casinos. Is there not a way for them all to co-exist?

“Ecological pluralism” acknowledges the reality of religious diversity. Does it do more? Does it block religious disagreement? All are related, mitakuye oyasin, but some relations are easier to be with than others.

Another of Wildcat’s impressive but vague phrases: “life-enhancing nature-culture nexus.” I’m all for it, I’m sure. But what is it?

Once again: native wisdom on paper looks pragmatic to the core, all experimental and open-ended and respectful of nature. But does this tradition really challenge its own ancestral legacy? That’s what a self-critical, self-correcting, fallibilistic method of inquiry is supposed to do.

play ball

January 13, 2011

It’s another Opening Day for Intro to Philosophy: let the journey begin…

The syllabus has been posted in the Pipeline, for those who’ve matriculated at our Enormous State University. Your first assignment, STUDENTS, is to find it there and read it. (If you’ve already printed the version posted previously you’ll need to update the sequence of assignments, which has been revised.)

Thinking is serious business but it’s also meant to be fun and enlarging, so we’ll be playing with a graphic as well as textual approach to big questions about life, the universe, and everything. (That’s a lot to cover, admittedly; the universe is really big.)

What is philosophy? I still like William James’s answer: an unusually stubborn striving for clarity. Stubborn, but not inflexible or intransigent. Argumentative but not disagreeable. Philosophy in the classroom is conducted with words pondered, spoken, and heard. That’s why I also like Sally Brown’s experimental approach to this question.

On Day #1 we need to set the right tone by emphasizing the importance of listening not just to our own words but to others’ as well, respectfully and with an appropriate humility.

We all have much to be humble about, not only the limits of our own personal perspicacity but those of the very enterprise we are engaged in: using words to express thoughts and feelings that begin in inarticulate wonder.

Maybe that’s enough words to get us going.

avoid boring people

November 10, 2010

That was the ambiguous title of Double Helix co-discoverer Jim Watson’s book, and it’s also Jaron Lanier’s caution in today’s section of reading in FoL. If we allow ourselves to be assimilated by our software designs and the computing culture they’re locking in, that’s exactly what we’ll be. But the good news is, we’ll also be too flattened and objectified to notice.

As he promised early on, his manifesto is getting a bit cheerier near the end. He now admits that cybernetic totalism is useful for some purposes of understanding. He still wants to keep it out of our actual engineering designs.

He’d rather think of us as meaning-makers, and of our gadgets as mere tools; but he also sees the utility of computationalism, not as a culturally-pervasive  ideology but as a realistic model of the brain (more precisely, of brain-based personhood). It, and we, have been product-tested and honed by “a very large, very long, and very deep encounter with physical reality.” That’s Lanier’s creation story, maybe the one naturalistic account of birdsong and Shakespeare worth considering.

But first he has to get in another swipe at narrow computationalism, akin to the “Logical Positivism” we thought was moribund. Apparently it’s hot again, in Silicon Valley (and at MIT and Tufts). With tons of data hovering in the Cloud, just waiting to verify our sentences, the neo-Positivists say, human subjectivity is unnecessary.

Lanier, we already saw, wants to pin a big scarlet  “Z” (for Zombie) on Dan Dennett and his old collaborator Douglas “Strange Loop” Hofstadter for such thinking. Thing is, if they’re right we’re all zombies. It’d be the end of consciousness as we thought we knew it, and we’d feel fine. We’d be as thoughtful and creative and un-blood-lustful as we ever were.

And as loopy, musically and otherwise: look at Andrew Bird‘s amazing 1-man band at TED.

But Lanier is sure they can’t be right. He rejects the Turing test criterion of personhood. When we start finding ourselves indistinguishable from our gadgets “we make ourselves dull.” But on the other hand, a master storyteller like Richard Powers can make a Turing scenario very lively indeed. Read Galatea 2.2, a hugely clever updating of the Pygmalion (“My Fair Lady”) story, if you doubt it. (Dennett is a fan.) Modern sculptors beware: even if it walks and talks like a lady it may still be hardware.

It’s a very big deal to Lanier that our scanners can read faces now. Privacy may be out the window for good. Will anyone look up from their screens long enough to notice?

Finally, a couple of positively-tinged  speculations from Lanier:

Swearing is rooted in sniffing, the “old factory” olfactory system. Who knew? Probably not Artoo: “it would take a lot of wires to address all those entries in the mental smell dictionary.” Or the metal one?

And, automatic language translation may get good enough to begin breaking down ancient nationalistic hostilities. The universal translator is not just a pipe dream; we’re getting closer. But if the machines hiccup they could start a major conflagration, too. Remember Douglas Adams’ inter-galactic misunderstanding between the Vl’hurgs and the G’gugvunts, triggered by a malfunctioning Babel Fish (and ended when, thanks to a terrible miscalculation of scale, the entire battle fleet was swallowed by a small dog)? So, maybe you don’t want to stick it in your ear.

Finally, Lanier the humanist computer geek is worried about the future of language and literature as the Cloud expands. But if he was right about the “sexual display” component of good words, we shouldn’t have to worry. Persons seeking mates will never be entirely boring. And Wikipedia is still growing, but it’s nowhere near Borges’ Library of  Babel. Is it?

mostly harmless?

November 8, 2010

Jaron Lanier, the old fogey, wants to be put out to pasture. He doesn’t get why kids these days are so smitten with Linux and Wikipedia.

In general he thinks Web 2.0’s culture of “free & open” collaboration is “choking off” originality and “eating its own seed stock” by endlessly mashing and recycling old-media content. Whatever happened to real novelty and first-order creativity? As Mr. Emerson challenged, why should we not enjoy our own original relation to the universe? His timid young men hunched over ancient, dusty tomes in libraries have become wired young men and women hunched over keyboards, but the challenge stands. (Wonder what he’d think of this collaborative mashup of his own words, unsourced and de-contextualized?)

Is it really all about the Internet-as-Frankenstein fantasy? But the monster was always a freak, albeit a compelling and finally pitiable one. He was not simply the Next Big Thing. He was not the future of life that would render the rest of us obsolete and irrelevant.

Lanier’s problem with the Open Source movement is not that it’s free and available to all, but that it threatens to turn all into cyphers and drones. He thinks it kills creativity and stifles innovation. I don’t know about that. I’m sure grateful for OpenOffice and have been as creative with it as I know how.

I don’t know, either, about the claim that making mashups of old media content “accomplishes nothing.” Those clever Symphony of Science productions brought Carl Sagan’s inspiring cosmic consciousness to a new generation. That’s not nothing.

Speaking as an old fogey, I do agree with his verdict on  the last decade or so of pop music. It does seem pretty derivative and uninspired. Unreal. All too easily represented. Lifeless, disjoint, out of context, not conducive to human connection. But since when were older people expected to like younger people’s music, or even encouraged to listen? Lanier’s charge, though, is that young people don’t really have music of their own anymore. Is he wrong? I can tell you, there’s plenty on my kids’ iPods you won’t catch me listening to.

Then again, there’s Wikipedia and its legion of anonymous mostly-youthful creators. Everyone’s apparently gotten over– or just forgotten?– those early days of reckless, unaccountable slander. I wonder how many of us even remember the Seigenthaler fiasco?

On the other hand, there haven’t been any comparable high-profile hoaxes lately. The anonymous crowd-cloud seems to hold more water than it used to. The New York Times gushed about it yesterday: “Wikipedia is vitally important to the culture…”  Is it really making us smarter, Lanier asks? Or is it just exploiting our laziness?

In any event, Wikipedia’s not nearly as much fun as The Book. Those hitchhikers, whatever else you want to say about them, were distinctive individuals. (Zaphod Beeblebrox, doubly so.) Even Marvin the paranoid android.

“Fintlewoodlewix”

May 20, 2010

Our home-sweet-home? Continuing the Douglas Adams theme…

Bill McKibben says we’re not on Earth anymore. We need a new name for the warming new planet we’ve been carbonizing. He proposes Eaarth, with a suitably-alien pronunciation (in the fashion of Governor Terminator).

But that’s not weird enough for some. “Why not change it more?” asks a Guardian blogger.

It seems rather indulgent to write a whole book about this idea and only add one vowel. McKibben, who admits he liked the sci-fi look of the word, says it reflects the fact that the planet in question is “a lot like our own … but different enough”.

The word Earth apparently originates from the Anglo-Saxon word for ground or soil, erda. There are, of course, already hundreds of alternatives from different cultures and languages. Famous alternatives include Terra or Tierra, Gaia and Fintlewoodlewix, the name given by the original Golgafrincham inhabitants in Douglas Adams’ The Hitchhiker’s Guide to the Galaxy.

Okay, now might be a perfect time to panic after all.

But Bill is not panicking. His book is not ultimately as  bleak as you might imagine. We may fail to stem the tide of a planet no longer hospitable to our form of life, but we may succeed. He’s for trying. Me too.

rational optimism

May 19, 2010

People will ask what Matt Ridley’s been smoking.

Prosperity spreads, technology progresses, poverty declines, disease retreats, fecundity falls, happiness increases, violence atrophies, freedom grows, knowledge flourishes, the environment improves and wilderness expands.

That’s his line and his vision for the century ahead in The Rational Optimist, whose thesis is supposed to be “in your face”– or in the faces of those fashionable pessimists who insist that the end is nigh. (NYT review; Amazon)

The catch, for some of us, will be the book’s advocacy of unrestricted global trade and its implicit faith in perpetual growth and economic expansion. But the allure is the upbeat recognition that, for solid evolutionary reasons, we’re becoming better co-operators (or mutual enablers) and are living better (at least materially and medically), longer lives. “Everybody is working for everybody else.” (There’s a Hitchhiker’s Guide-style video blurb under that title on YouTube and at Ridley’s site.) Unlike Arthur Dent, Ridley’s not “gone off the idea of progress.”

We can quibble about particulars, and will in the “Future of Life” course this Fall. But on balance he’s right: by most tangible measures of species well-being we’re better off than our ancestors, and the past is no paradise. All who want to transport back to the 13th century and stay there, raise your hand.

Thought so. Of course, the future’s still in the balance. No guarantees. But don’t panic.

talked out

April 10, 2010

Thank goodness it’s Saturday, and I can shut up for a couple of days! A week of croaking and gasping through stressed vocal cords has me really feeling the truth of James’s complaint that “it would be an awful universe if everything could be converted into words, words, words.”

Philosophy lives in words, but truth and fact well up into our lives in ways that exceed verbal formulation. There is in the living act of perception always something that glimmers and twinkles and will not be caught, and for which reflection comes too late. No one knows this as well as the philosopher. He must fire his volley of new vocables out of his conceptual shotgun, for his profession condemns him to this industry; but he secretly knows the hollowness and irrelevancy.

A “dumb region of the heart” may well be, as James said, “our deepest organ of communication with the nature of things.”

But for those of us who don’t sign or read minds– (I loved the Tim McGraw line to Sandra Bullock in “The Blind Side,” our family Friday flick last night: “Tell me what’s on your mind, so I’ll know what I’m supposed to think.”)– communicating with people, in person, still requires vocalizing.

So if I’m going to vow silence for the weekend, it’d sure be nice to swap the conceptual shotgun for the POV gun. Give me that thing.

all fools

April 1, 2010

Would you believe (as Maxwell Smart would’ve put it) I designed our A&S syllabus to bring us to Dawkins’ discussion of “All Fools Day” precisely today?

If so, I’d be flattered. And you’d be gullible, in just the sense he’s about to explain.

But here we are in chapter six (“Hoodwink’d With Faery Fancy”) of Unweaving the Rainbow, back on one of his and one of my favorite themes, childhood indoctrination. I’ll bring baseball into it, if I get half a chance. [A prayer from Dawkins (!) for his daughter; God Delusion on childhood indoctrination]

On All Fools‘ Day one year, when my sister and I were children, our parents and our uncle and aunt played a simple trick on us…

The short version of this delightful recollection is that young Richard and his sister went for a blindfolded “aeroplane” flight, much as Red and Rover regularly do with eyes wide open. (American kids are more credulous, naturally.) Their father & uncle provided the sound-and-motion simulation to create a virtual experience they wouldn’t question, at that age. “We had simply been sitting on a garden seat… the tree branches brushing against us had been wielded by our mother and aunt… It had been fun while it lasted.”

Childhood is of course a time of natural credulity, hence vulnerability to nonsense. That’s good, because lots of childish nonsense is great fun. And it’s bad, because lots of childish nonsense paves the way for intransigent adult nonsense. “It never occurred to us to wonder why we must be blindfolded. Wouldn’t it have been natural to ask what was the point of going for a joyride if you couldn’t see anything?” No, not really. “We just didn’t have the sceptic’s turn of mind… such was our faith in our parents.”

That flight was on all fours with Santa, the tooth-fairy, angels, heaven, and so much more nonsense that adults in America don’t know how to question.

But there was a time in our species’ history when “an experimental and sceptical turn of mind” was more likely to get you dead. (Remember Douglas Adams’ whale?) Maybe that’s why so many of us continue to shun it, at our peril. But let’s admit: shunning skepticism is still more likely to get you invited to church and other modern forms of safe-haven inclusion. There’s a risk factor grown-ups (another name for skeptics) must swallow, to affirm their incredulity. Growing up is no bowl of petunias, as not only Dawkins’ pal Adams but also the author of Childhood’s End tried to tell us, but it’s crucial. Clarke’s Third Law (“Any sufficiently advanced technology is indistinguishable from magic”) won’t make the crate fly.

But there comes a time when we ought to notice, here on our pale blue dot (threw that in for you, James), that “the universe is much bigger than our prophets said, grander, more subtle, more elegant,” less magical and far more wondrous. A spiritually-mature worldview (let’s say) “that stressed the magnificence of the Universe as revealed by modern science might be able to draw forth reserves of reverence and awe hardly tapped by the conventional faiths.”

Meanwhile, “human children have wide open ears and eyes, and gaping, trusting minds for sucking up language” and folk wisdom. “It must be so because Mummy and Daddy said it was.” How sobering is that, parents! “Trusting credulity may be normal and healthy in a child but it can become an unhealthy and reprehensible gullibility in an adult. Growing up… should include the cultivation of a healthy scepticism.”

Also worth noting in today’s reading: all that talk of barcodes, by which Dawkins means to symbolize “precise analysis” rooted in a pervasively-digitized information environment, brings us closer to what Michael Shermer has called the “soul of science” and an echo of the claim Sam Harris has been trumpeting lately that scientific precision should also help clarify our values. Shermer:

Morality and purpose are inextricably interdigitated — you cannot have one without the other. Fortunately, nature grants us the capacity for both morality and purpose, culture affords us the liberty to reach for higher moral purposes, and history brings us to a place where we can employ both for the enrichment of all. Through natural evolution and man-made culture, we have inherited the mantle of life’s caretaker on earth. Rather than crushing our spirits, the realization that we exist together for a narrow slice of time and space elevates us to a higher plane of humanity and humility: a proud, albeit passing, act in the drama of the cosmos.

This suggests the next step on our evolutionary walk (or our next flight-destination), doesn’t it? Humility should make us more skeptical, less obstinately gullible, and a lot less stubbornly persistent in the delusions of childhood. But we’re going to have to stop giving our children away for those first seven years.

