Dealing with teenage disaffection used to be fairly simple. Get drunk, and/or channel Philip Larkin (preferably out loud). If you paid careful attention to the poet’s last verse, everything could even be alright in the end, and nothing would ever be your fault.
There was a short “film” (or ‘single channel video art piece’, as defined by Wikipedia) produced in 1973 by Richard Serra and Carlota Fay Schoolman called Television Delivers People, which made the then-provocative argument that commercial television essentially functioned to deliver people to advertisers, meaning the product of TV is you, rather than whatever slapstick show you happened to be watching:
So there’s a “study” making the rounds claiming to have found evidence of a link between consuming “ultra-processed foods” and developing cancer. It’s not the first time we’ve heard that processed stuff like bacon and pastrami leads to cancer, but this one expands the range of “processed” to the more scary “ultra-processed” to include the following (handily summarised by the BBC):
Leaving aside what exactly “foods made mostly or entirely from sugar, oils and fats” even are, this is an excellent example of the kind of rubbish headline that brings out the worst in social media, and of why people are rightly confused about what, or what not, to eat: it’s beautifully tweetable, but (scientifically speaking) mostly bullshit:
A review essay by Cass Sunstein on a new book which sounds like it poses serious and interesting challenges to the priority many of us (myself included) place on our ability, and freedom, to choose for ourselves:
Until now, we have lacked a serious philosophical discussion of whether and how recent behavioral findings undermine Mill’s harm principle and thus open the way toward paternalism. Sarah Conly’s illuminating book Against Autonomy provides such a discussion. Her starting point is that in light of the recent findings, we should be able to agree that Mill was quite wrong about the competence of human beings as choosers. “We are too fat, we are too much in debt, and we save too little for the future.” With that claim in mind, Conly insists that coercion should not be ruled out of bounds. She wants to go far beyond nudges. In her view, the appropriate government response to human errors depends not on high-level abstractions about the value of choice, but on pragmatic judgments about the costs and benefits of paternalistic interventions. Even when there is only harm to self, she thinks that government may and indeed must act paternalistically so long as the benefits justify the costs.
Conly is quite aware that her view runs up against widespread intuitions and commitments. For many people, a benefit may consist precisely in their ability to choose freely even if the outcome is disappointing. She responds that autonomy is “not valuable enough to offset what we lose by leaving people to their own autonomous choices.” Conly is aware that people often prefer to choose freely and may be exceedingly frustrated if government overrides their choices. If a paternalistic intervention would cause frustration, it is imposing a cost, and that cost must count in the overall calculus. But Conly insists that people’s frustration is merely one consideration among many. If a paternalistic intervention can prevent long-term harm—for example, by eliminating risks of premature death—it might well be justified even if people are keenly frustrated by it.
‘[W]ill any of these future literary creations be works that last? The digital world has two cankers that constantly gnaw away at all notions of permanence: fragmentation and endless revisability. The former of these is our daily lament about our wired world: too much information, too many content providers, not enough time to begin to absorb any of it. The latter is less discussed. Yet the instant and infinite revisability of virtual text means that authors can continuously “improve” their work, perhaps in response to criticism, perhaps simply because writers are never truly ready to part with their creations. The notion of a definitive edition of an enduring work may soon disappear.
Is the rise of literary self-publishing the beginning of the death of literature, of works that become part of a culture’s DNA and pass from generation to generation? When the next Stone Angel or Fifth Business is published, how many of us will even know it exists? Will any of the fine novels now being brought into the world be read a hundred years from now?’
‘We live in an age when there is tremendous competition for—I was going to say “the reader’s attention,” but reading is part, a large part, of what has suddenly become negotiable. The Yale literary critic Geoffrey Hartman once wrote a book called The Fate of Reading: It is not, in my judgment, a very good book, but it would have been had Professor Hartman got around to addressing the subject announced in his provocative title. It is of course a subject that goes far beyond the issue of the American or any other sort of novel: The advent of television, the ubiquity of mass media, the eruption of the Internet and ebooks with their glorification of instantaneity—all this has done an extraordinary amount to alter the relationship between life and literature. Television lulled us into acquiescence, the Internet with its vaunted search engines and promise of the world at your fingertips made further inroads in seducing us to reduce wisdom to information: to believe that ready access to information was somehow tantamount to knowledge. I pause here to quote David Guaspari’s wise and amusing observation on this subject: “Comparing information and knowledge,” he writes, “is like asking whether the fatness of a pig is more or less green than the designated hitter rule.”
I am not, to be candid, quite sure what the “designated hitter rule” portends, but I am confident that it has nothing to do with being green or porcine plumpness. When I was in graduate school, I knew some students who believed that by making a Xerox copy of an article, they had somehow absorbed, or at least partly absorbed, its content. I suppose the contemporary version of that déformation professionnelle is the person who wanders around with a computer perpetually linked to Google and who therefore believes he knows everything. It reminds one of the old complaint about students at the elite French universities: They know everything, it was said; unfortunately that is all they know.’