I’d read enough blog posts and magazine articles and books about how the internet makes us lonely, or stupid, or lonely and stupid, that I’d begun to believe them. I wanted to figure out what the internet was “doing to me,” so I could fight back. But the internet isn’t an individual pursuit; it’s something we do with each other. The internet is where people are.
When I return to the internet, I might not use it well. I might waste time, or get distracted, or click on all the wrong links. I won’t have as much time to read or introspect or write the great American sci-fi novel.
A review essay by Cass Sunstein on a new book which sounds like it poses serious and interesting challenges to the priority many of us (myself included) place on our ability, and freedom, to choose for ourselves:
Until now, we have lacked a serious philosophical discussion of whether and how recent behavioral findings undermine Mill’s harm principle and thus open the way toward paternalism. Sarah Conly’s illuminating book Against Autonomy provides such a discussion. Her starting point is that in light of the recent findings, we should be able to agree that Mill was quite wrong about the competence of human beings as choosers. “We are too fat, we are too much in debt, and we save too little for the future.” With that claim in mind, Conly insists that coercion should not be ruled out of bounds. She wants to go far beyond nudges. In her view, the appropriate government response to human errors depends not on high-level abstractions about the value of choice, but on pragmatic judgments about the costs and benefits of paternalistic interventions. Even when there is only harm to self, she thinks that government may and indeed must act paternalistically so long as the benefits justify the costs.
Conly is quite aware that her view runs up against widespread intuitions and commitments. For many people, a benefit may consist precisely in their ability to choose freely even if the outcome is disappointing. She responds that autonomy is “not valuable enough to offset what we lose by leaving people to their own autonomous choices.” Conly is aware that people often prefer to choose freely and may be exceedingly frustrated if government overrides their choices. If a paternalistic intervention would cause frustration, it is imposing a cost, and that cost must count in the overall calculus. But Conly insists that people’s frustration is merely one consideration among many. If a paternalistic intervention can prevent long-term harm—for example, by eliminating risks of premature death—it might well be justified even if people are keenly frustrated by it.
‘It is a privilege to earn one’s living by writing but, as I discovered, it is also a privilege, and a less stressful one, to earn one’s living by reading.
My first contention: this is a great time to be a reader. The amount of good writing freely available online far exceeds what even the most dedicated consumer might have hoped to encounter a generation ago within the limits of printed media.
I don’t pretend that everything online is great writing. Let me go further: only 1 per cent is of value to the intelligent general reader, by which I mean the demographic that, in the mainstream media world, might look to the Economist, the Financial Times, Foreign Affairs or the Atlantic for information. Another 4 per cent of the internet counts as entertaining rubbish. The remaining 95 per cent has no redeeming features. But even the 1 per cent of writing by and for the elite is an embarrassment of riches, a horn of plenty, a garden of delights.’
‘Have we seen a different type of person arise in the West … ? How else would you explain that the virtues of respect, duty, deference and self-sacrifice seem to have been universally derided if not abandoned?
Certainly I am worried about a shallowness in the human personality that, if I may so put it, appears to be deepening. Even such things as the electronic media of communication, for those unfortunate enough to have been brought up with them, seem to hollow out human relations, making them extensive rather than intensive. As to derided ideas such as humility, proper deference and so forth, I think we live in an age of inflamed egotism, and of individualism without individuality. Never has it been more necessary, and at the same time more difficult, to mark yourself out as an individual. The slightest subordination in any circumstances is therefore felt as a wound, because the ego is so fragile, and relies on such props as the brand of trainers you are wearing.’
‘[W]ill any of these future literary creations be works that last? The digital world has two cankers that constantly gnaw away at all notions of permanence: fragmentation and endless revisability. The former of these is our daily lament about our wired world: too much information, too many content providers, not enough time to begin to absorb any of it. The latter is less discussed. Yet the instant and infinite revisability of virtual text means that authors can continuously “improve” their work, perhaps in response to criticism, perhaps simply because writers are never truly ready to part with their creations. The notion of a definitive edition of an enduring work may soon disappear.
Is the rise of literary self-publishing the beginning of the death of literature, of works that become part of a culture’s DNA and pass from generation to generation? When the next Stone Angel or Fifth Business is published, how many of us will even know it exists? Will any of the fine novels now being brought into the world be read a hundred years from now?’
‘The fact that food-talk slips so easily these days into sex-talk might be interpreted as part of the more generalised pornification of everything; but I think it represents a different trend: the foodification of everything. Food is the vehicle through which we are now invited to take not only our erotic thrills but also our spiritual nourishment (count the number of cookbook “bibles” and purple paeans to the personal-growth aspects of stuffing yourself in memoirs such as Eat, Pray, Love), and even our education in history (the fad for food “archaeology”, cooking peculiar dishes from centuries-old recipes) or science (which Jamie Oliver says pupils can learn about through enforced cooking lessons). Food is now the grease-smeared lens through which we want to view the world. It’s an infantile ambition. A baby learns about the environment by putting things in its mouth. Are we all babies now?’
He concludes by asking ‘What if we began to care a little more about what we put into our minds than what we put into our mouths?’
‘An intellectual pestilence is upon us. Shop shelves groan with books purporting to explain, through snazzy brain-imaging studies, not only how thoughts and emotions function, but how politics and religion work, and what the correct answers are to age-old philosophical controversies. The dazzling real achievements of brain research are routinely pressed into service for questions they were never designed to answer. This is the plague of neuroscientism – aka neurobabble, neurobollocks, or neurotrash – and it’s everywhere.
… Happily, a new branch of the neuroscience-explains-everything genre may be created at any time by the simple expedient of adding the prefix “neuro” to whatever you are talking about. Thus, “neuroeconomics” is the latest in a long line of rhetorical attempts to sell the dismal science as a hard one; “molecular gastronomy” has now been trumped in the scientised gluttony stakes by “neurogastronomy”; students of Republican and Democratic brains are doing “neuropolitics”; literature academics practise “neurocriticism”. There is “neurotheology”, “neuromagic” (according to Sleights of Mind, an amusing book about how conjurors exploit perceptual bias) and even “neuromarketing”. Hoping it’s not too late to jump on the bandwagon, I have decided to announce that I, too, am skilled in the newly minted fields of neuroprocrastination and neuroflâneurship.’
‘Around the turn of the twenty-first century, a paradigm shift in our thinking took place: we decided that the era of revolutions was over and that the era of catastrophes had begun. The former had involved expectation, the hope that the human race would proceed toward some goal. But once the end of history was announced, the Communist enemy vanquished, and, more recently, the War on Terror all but won, the idea of progress lay moribund. What replaced the world’s human future was the future of the world as a material entity. The long list of emblematic victims—Jews, blacks, slaves, proletarians, colonized peoples—was likewise replaced, little by little, with the Planet, the new paragon of all misery. No longer were we summoned to participate in a particular community; rather, we were invited to identify ourselves with the spatial vessel that carried us, groaning.
The fear that these intellectuals spread is like a gluttonous enzyme that swallows up an anxiety, feeds on it, and then leaves it behind for new ones. When the Fukushima nuclear plant melted down after the enormous earthquake in Japan in March 2011, it only confirmed a feeling of anxiety that was already there, looking for some content. In six months, some new concern will grip us: a pandemic, bird flu, the food supply, melting ice caps, cell-phone radiation.
The language of fear does not include the word “maybe.” It tells us, rather, that the horror is inevitable. Resistant to all doubt, it is satisfied to mark the stages of degradation. This is another paradox of fear: it is ultimately reassuring. At least we know where we are heading—toward the worst.’
‘We live in an age when there is tremendous competition for—I was going to say “the reader’s attention,” but reading is part, a large part, of what has suddenly become negotiable. The Yale literary critic Geoffrey Hartman once wrote a book called The Fate of Reading: It is not, in my judgment, a very good book, but it would have been had Professor Hartman got around to addressing the subject announced in his provocative title. It is of course a subject that goes far beyond the issue of the American or any other sort of novel: The advent of television, the ubiquity of mass media, the eruption of the Internet and ebooks with their glorification of instantaneity—all this has done an extraordinary amount to alter the relationship between life and literature. Television lulled us into acquiescence, the Internet with its vaunted search engines and promise of the world at your fingertips made further inroads in seducing us to reduce wisdom to information: to believe that ready access to information was somehow tantamount to knowledge. I pause here to quote David Guaspari’s wise and amusing observation on this subject: “Comparing information and knowledge,” he writes, “is like asking whether the fatness of a pig is more or less green than the designated hitter rule.”’
An older piece by Neal Gabler in The Observer, where he observes that the Internet has just actualised a very old idea:
‘It is certainly no secret that the internet has eroded the authority of traditional critics and substituted Everyman opinion on blogs, websites, even on Facebook and Twitter where one’s friends and neighbours get to sound off. What is less widely acknowledged is just how deeply this populist blowback is embedded in America and how much of American culture has been predicated on a conscious resistance to cultural elites. It is virtually impossible to understand America without understanding the long ongoing battle between cultural commissars who have always attempted to define artistic standards and ordinary Americans who take umbrage at those commissars and their standards.
… We live, then, in a new age of cultural populism – an age in which everyone is not only entitled to his opinion but is encouraged to share it. Nothing could be more American.’
‘We prefer knowing to thinking because knowing has more immediate value. It keeps us in the loop, keeps us connected to our friends and our cohort. Ideas are too airy, too impractical, too much work for too little reward. Few talk ideas. Everyone talks information, usually personal information. Where are you going? What are you doing? Whom are you seeing? These are today’s big questions.
It is certainly no accident that the post-idea world has sprung up alongside the social networking world. Even though there are sites and blogs dedicated to ideas, Twitter, Facebook, Myspace, Flickr, etc., the most popular sites on the Web, are basically information exchanges, designed to feed the insatiable information hunger, though this is hardly the kind of information that generates ideas. It is largely useless except insofar as it makes the possessor of the information feel, well, informed. Of course, one could argue that these sites are no different than conversation was for previous generations, and that conversation seldom generated big ideas either, and one would be right.’