On the Internet and becoming disconnected


A really interesting piece by Paul Miller at The Verge on what it’s like to take a year off the Internet, and then come back.

I’d read enough blog posts and magazine articles and books about how the internet makes us lonely, or stupid, or lonely and stupid, that I’d begun to believe them. I wanted to figure out what the internet was “doing to me,” so I could fight back. But the internet isn’t an individual pursuit, it’s something we do with each other. The internet is where people are.

When I return to the internet, I might not use it well. I might waste time, or get distracted, or click on all the wrong links. I won’t have as much time to read or introspect or write the great American sci-fi novel.

But at least I’ll be connected.

It’s for your own good

A review essay by Cass Sunstein on a new book which sounds like it poses serious and interesting challenges to the priority many of us (myself included) place on our ability, and freedom, to choose for ourselves:

Until now, we have lacked a serious philosophical discussion of whether and how recent behavioral findings undermine Mill’s harm principle and thus open the way toward paternalism. Sarah Conly’s illuminating book Against Autonomy provides such a discussion. Her starting point is that in light of the recent findings, we should be able to agree that Mill was quite wrong about the competence of human beings as choosers. “We are too fat, we are too much in debt, and we save too little for the future.” With that claim in mind, Conly insists that coercion should not be ruled out of bounds. She wants to go far beyond nudges. In her view, the appropriate government response to human errors depends not on high-level abstractions about the value of choice, but on pragmatic judgments about the costs and benefits of paternalistic interventions. Even when there is only harm to self, she thinks that government may and indeed must act paternalistically so long as the benefits justify the costs.

Conly is quite aware that her view runs up against widespread intuitions and commitments. For many people, a benefit may consist precisely in their ability to choose freely even if the outcome is disappointing. She responds that autonomy is “not valuable enough to offset what we lose by leaving people to their own autonomous choices.” Conly is aware that people often prefer to choose freely and may be exceedingly frustrated if government overrides their choices. If a paternalistic intervention would cause frustration, it is imposing a cost, and that cost must count in the overall calculus. But Conly insists that people’s frustration is merely one consideration among many. If a paternalistic intervention can prevent long-term harm—for example, by eliminating risks of premature death—it might well be justified even if people are keenly frustrated by it.

Net Wisdom

Lovely piece by Robert Cottrell in the FT on what a good time it is to be a reader:

‘It is a privilege to earn one’s living by writing but, as I discovered, it is also a privilege, and a less stressful one, to earn one’s living by reading.

My first contention: this is a great time to be a reader. The amount of good writing freely available online far exceeds what even the most dedicated consumer might have hoped to encounter a generation ago within the limits of printed media.

I don’t pretend that everything online is great writing. Let me go further: only 1 per cent is of value to the intelligent general reader, by which I mean the demographic that, in the mainstream media world, might look to the Economist, the Financial Times, Foreign Affairs or the Atlantic for information. Another 4 per cent of the internet counts as entertaining rubbish. The remaining 95 per cent has no redeeming features. But even the 1 per cent of writing by and for the elite is an embarrassment of riches, a horn of plenty, a garden of delights.’


Interview with Theodore Dalrymple

Eloquent as always. From an interview with The Coffee House Wall.

‘Have we seen a different type of person arise in the West … ? How else would you explain that the virtues of respect, duty, deference and self-sacrifice seem to have been universally derided if not abandoned?

Certainly I am worried about a shallowness in the human personality that, if I may so put it, appears to be deepening. Even such things as the electronic media of communication, for those unfortunate enough to have been brought up with them, seem to hollow out human relations, making them extensive rather than intensive. As to derided ideas such as humility, proper deference and so forth, I think we live in an age of inflamed egotism, and of individualism without individuality. Never has it been more necessary, and at the same time more difficult, to mark yourself out as an individual. The slightest subordination in any circumstances is therefore felt as a wound, because the ego is so fragile, and relies on such props as the brand of trainers you are wearing.’

On the rise of literary self-publishing

Interesting questions by Rick Archbold in the Literary Review of Canada:

‘[W]ill any of these future literary creations be works that last? The digital world has two cankers that constantly gnaw away at all notions of permanence: fragmentation and endless revisability. The former of these is our daily lament about our wired world: too much information, too many content providers, not enough time to begin to absorb any of it. The latter is less discussed. Yet the instant and infinite revisability of virtual text means that authors can continuously “improve” their work, perhaps in response to criticism, perhaps simply because writers are never truly ready to part with their creations. The notion of a definitive edition of an enduring work may soon disappear.

Is the rise of literary self-publishing the beginning of the death of literature, of works that become part of a culture’s DNA and pass from generation to generation? When the next Stone Angel or Fifth Business is published, how many of us will even know it exists? Will any of the fine novels now being brought into the world be read a hundred years from now?’

On food porn and pseudoscience

Here’s Steven Poole’s response to the inevitable “food porn” talk that Nigella’s caramel-covered face generated in December last year:

‘The fact that food-talk slips so easily these days into sex-talk might be interpreted as part of the more generalised pornification of everything; but I think it represents a different trend: the foodification of everything. Food is the vehicle through which we are now invited to take not only our erotic thrills but also our spiritual nourishment (count the number of cookbook “bibles” and purple paeans to the personal-growth aspects of stuffing yourself in memoirs such as Eat, Pray, Love), and even our education in history (the fad for food “archaeology”, cooking peculiar dishes from centuries-old recipes) or science (which Jamie Oliver says pupils can learn about through enforced cooking lessons). Food is now the grease-smeared lens through which we want to view the world. It’s an infantile ambition. A baby learns about the environment by putting things in its mouth. Are we all babies now?’

He concludes by asking ‘What if we began to care a little more about what we put into our minds than what we put into our mouths?’

Good question. And speaking of which, here is Poole more recently in the New Statesman with ‘Your brain on pseudoscience: the rise of popular neurobollocks’:

‘An intellectual pestilence is upon us. Shop shelves groan with books purporting to explain, through snazzy brain-imaging studies, not only how thoughts and emotions function, but how politics and religion work, and what the correct answers are to age-old philosophical controversies. The dazzling real achievements of brain research are routinely pressed into service for questions they were never designed to answer. This is the plague of neuroscientism – aka neurobabble, neurobollocks, or neurotrash – and it’s everywhere.

… Happily, a new branch of the neuroscience-explains-everything genre may be created at any time by the simple expedient of adding the prefix “neuro” to whatever you are talking about. Thus, “neuroeconomics” is the latest in a long line of rhetorical attempts to sell the dismal science as a hard one; “molecular gastronomy” has now been trumped in the scientised gluttony stakes by “neurogastronomy”; students of Republican and Democratic brains are doing “neuropolitics”; literature academics practise “neurocriticism”. There is “neurotheology”, “neuromagic” (according to Sleights of Mind, an amusing book about how conjurors exploit perceptual bias) and even “neuromarketing”. Hoping it’s not too late to jump on the bandwagon, I have decided to announce that I, too, am skilled in the newly minted fields of neuroprocrastination and neuroflâneurship.’

It’s a good rant. I’ll look forward to reading his forthcoming You Aren’t What You Eat.


Apocalyptic Daze (on a world of moral panics)

By Pascal Bruckner in the City Journal:

‘Around the turn of the twenty-first century, a paradigm shift in our thinking took place: we decided that the era of revolutions was over and that the era of catastrophes had begun. The former had involved expectation, the hope that the human race would proceed toward some goal. But once the end of history was announced, the Communist enemy vanquished, and, more recently, the War on Terror all but won, the idea of progress lay moribund. What replaced the world’s human future was the future of the world as a material entity. The long list of emblematic victims—Jews, blacks, slaves, proletarians, colonized peoples—was likewise replaced, little by little, with the Planet, the new paragon of all misery. No longer were we summoned to participate in a particular community; rather, we were invited to identify ourselves with the spatial vessel that carried us, groaning.

The fear that these intellectuals spread is like a gluttonous enzyme that swallows up an anxiety, feeds on it, and then leaves it behind for new ones. When the Fukushima nuclear plant melted down after the enormous earthquake in Japan in March 2011, it only confirmed a feeling of anxiety that was already there, looking for some content. In six months, some new concern will grip us: a pandemic, bird flu, the food supply, melting ice caps, cell-phone radiation.

The language of fear does not include the word “maybe.” It tells us, rather, that the horror is inevitable. Resistant to all doubt, it is satisfied to mark the stages of degradation. This is another paradox of fear: it is ultimately reassuring. At least we know where we are heading—toward the worst.’

All of which is a timely reminder to re-read Michael Crichton’s prescient “Environmentalism as Religion”.


The Great American Novel: will there ever be another?

By Roger Kimball in the Weekly Standard:

‘We live in an age when there is tremendous competition for—I was going to say “the reader’s attention,” but reading is part, a large part, of what has suddenly become negotiable. The Yale literary critic Geoffrey Hartman once wrote a book called The Fate of Reading: It is not, in my judgment, a very good book, but it would have been had Professor Hartman got around to addressing the subject announced in his provocative title. It is of course a subject that goes far beyond the issue of the American or any other sort of novel: The advent of television, the ubiquity of mass media, the eruption of the Internet and ebooks with their glorification of instantaneity—all this has done an extraordinary amount to alter the relationship between life and literature. Television lulled us into acquiescence, the Internet with its vaunted search engines and promise of the world at your fingertips made further inroads in seducing us to reduce wisdom to information: to believe that ready access to information was somehow tantamount to knowledge. I pause here to quote David Guaspari’s wise and amusing observation on this subject: “Comparing information and knowledge,” he writes, “is like asking whether the fatness of a pig is more or less green than the designated hitter rule.”

I am not, to be candid, quite sure what the “designated hitter rule” portends, but I am confident that it has nothing to do with being green or porcine plumpness. When I was in graduate school, I knew some students who believed that by making a Xerox copy of an article, they had somehow absorbed, or at least partly absorbed, its content. I suppose the contemporary version of that déformation professionnelle is the person who wanders around with a computer perpetually linked to Google and who therefore believes he knows everything. It reminds one of the old complaint about students at the elite French universities: They know everything, it was said; unfortunately that is all they know.’

Everyone’s a critic now

An older piece by Neal Gabler in The Observer, in which he argues that the Internet has simply actualised a very old idea:

‘It is certainly no secret that the internet has eroded the authority of traditional critics and substituted Everyman opinion on blogs, websites, even on Facebook and Twitter where one’s friends and neighbours get to sound off. What is less widely acknowledged is just how deeply this populist blowback is embedded in America and how much of American culture has been predicated on a conscious resistance to cultural elites. It is virtually impossible to understand America without understanding the long ongoing battle between cultural commissars who have always attempted to define artistic standards and ordinary Americans who take umbrage at those commissars and their standards.

… We live, then, in a new age of cultural populism – an age in which everyone is not only entitled to his opinion but is encouraged to share it. Nothing could be more American.’

Interesting take. Read it here.

On living in a post-idea world

By Neal Gabler, in The New York Times:

‘We prefer knowing to thinking because knowing has more immediate value. It keeps us in the loop, keeps us connected to our friends and our cohort. Ideas are too airy, too impractical, too much work for too little reward. Few talk ideas. Everyone talks information, usually personal information. Where are you going? What are you doing? Whom are you seeing? These are today’s big questions.

It is certainly no accident that the post-idea world has sprung up alongside the social networking world. Even though there are sites and blogs dedicated to ideas, Twitter, Facebook, Myspace, Flickr, etc., the most popular sites on the Web, are basically information exchanges, designed to feed the insatiable information hunger, though this is hardly the kind of information that generates ideas. It is largely useless except insofar as it makes the possessor of the information feel, well, informed. Of course, one could argue that these sites are no different than conversation was for previous generations, and that conversation seldom generated big ideas either, and one would be right.’

Read the full (sad) piece here.