
Jumping into the boundless streams of Twitter is not very different from compulsively buying books

Jumping into the boundless streams of Twitter is not very different from compulsively buying books in the false hope that, one day, you might read them. Of course, you won’t, but this doesn’t matter: it’s the very brief encounter with that possibility that counts. The fire hose of social media tricks us into thinking that, for a fleeting moment, we can play God and conquer every link that is dumped upon us; it gives us that mad utopian hope that, with proper training, we can emerge victorious in the war on information overload.

Evgeny Morozov, “Only Disconnect”, The New Yorker (28 October 2013), 37.

Every magazine is addressed to a readership for whom what the magazine presents as attained is in truth aspirational

Every magazine is addressed to a readership for whom what the magazine presents as attained is in truth aspirational: Seventeen is read by twelve-year-olds, and no playboy has ever read Playboy. The explicit goal of the National Geographic Society, and of its house journal, was to show the world to the worldly, to enlarge the map, to support exploration with grants and medals. But the real task of National Geographic was to show white people who rarely got far from Cincinnati or San Francisco what lay beyond their ken.

Adam Gopnik, “Yellow Fever”, The New Yorker (22 April 2013), 104.

As a form of disposable entertainment, the apocalypse market is booming

As a form of disposable entertainment, the apocalypse market is booming. The question is why. The obvious answer is that these narratives tap into anxieties, conscious and otherwise, about the damage we’re doing to our species and to the planet. They allow us to safely fantasize about what might be required of us to survive.

Of course, people have been running around screaming about the end of the world for as long as we’ve been around to take notes. But in the past, the purpose of these stories was essentially prophetic. They were intended to bring man into accord with the will of God, or at least his own conscience.

The newest wave of apocalyptic visions, whether they’re intended to make us laugh or shriek, are nearly all driven by acts of sadistic violence. Rather than inspiring audiences to reckon with the sources of our potential planetary ruin, they proceed from the notion that the apocalypse will usher in an era of sanctified Darwinism: survival of the most weaponized.

There’s a deep cynicism at work here, one that stands in stark contrast to the voices of even a generation ago. And this cynicism has, I fear, become the default setting of a culture that lurches about within the shadow of its own extinction yet lacks the moral imagination to change its destiny.

Steve Almond, “A Culture That Lurches About Within the Shadow of Its Own Extinction”, The New York Times Magazine (29 September 2013), 48.


The More Successful Movie Studios Become, the More Risk-Averse They Are

The instinct to retrench and overemphasize strategies that have worked in the past is a common problem in companies as they get bigger and have more to lose, particularly as technologies change. Polaroid and BlackBerry doubled down on their time-tested formulas despite market changes, suggesting that this behavior can undermine even the most successful companies. “The more successful and larger they become, the more antibodies they develop to doing anything new,” said Alan MacCormack, a Harvard Business School professor. And this may explain why summer 2015 will see sequels in the franchises for “Batman,” “Superman,” “Avengers,” “Terminator,” “Independence Day,” “Pirates of the Caribbean” and “Smurfs.”

Catherine Rampell, “Revenge of the Nerds”, The New York Times Magazine (8 September 2013), 16.


Historically, there’s something suspect about a story told as a cliffhanger, but there’s something to celebrate about it

Cliffhangers are the point when the audience decides to keep buying—when, as the cinema-studies scholar Scott Higgins puts it, “curiosity is converted into a commercial transaction.” They are sensational, in every sense of the word. Historically, there’s something suspect about a story told in this manner, the way it tugs the customer to the next ledge. Nobody likes needy.

But there is also something to celebrate about the cliffhanger, which makes visible the storyteller’s connection to his audience—like a bridge made out of lightning. Primal and unashamedly manipulative, cliffhangers are the signature gambit of serial storytelling. They expose the intimacy between writer’s room and fan base, auteur and recapper—a relationship that can take seasons to develop, years marked by incidents of betrayal, contentment, and, occasionally, by a kind of ecstasy.

That’s not despite but because cliffhangers are fake-outs. They reveal that a story is artificial, then dare you to keep believing. If you trust the creator, you take that dare, and keep going.

Emily Nussbaum, “Tune in Next Week”, The New Yorker (30 July 2012), 70.


In the late nineties, television took a great leap forward

In the late nineties, television took a great leap forward. This story could be told in many ways: by focussing on the quality cable dramas, starting with “The Sopranos”; by emphasizing luminous genre myths like “Buffy the Vampire Slayer”; or by highlighting experimental sitcoms, such as the British version of “The Office,” themselves a reaction to the advent of reality television. Pugnacious auteurs emerged, resistant to TV formulas. The result was one innovation after another: juggled chronologies, the rise of antiheroes, and a new breed of challenging, tangled, ambitious serial narrative. Dramas often combined a plot of the week with longer arcs, a technique pioneered by “The X-Files,” allowing for subtler levels of irresolution. Some ambitious comedies incorporated serial elements, while others, like “Arrested Development,” satirized cliffhangers in much the way that “Soap” had.

Emily Nussbaum, “Tune in Next Week”, The New Yorker (30 July 2012), 74.


Everything that is said about the Internet’s destruction of “interiority” was said for decades about television

It is the wraparound presence, not the specific evils, of the machine that oppresses us. Simply reducing the machine’s presence will go a long way toward alleviating the disorder. Which points, in turn, to a dog-not-barking-in-the-nighttime detail that may be significant. In the Better-Never books, television isn’t scanted or ignored; it’s celebrated. When William Powers, in “Hamlet’s BlackBerry,” describes the deal his family makes to have an Unplugged Sunday, he tells us that the No Screens agreement doesn’t include television: “For us, television had always been a mostly communal experience, a way of coming together rather than pulling apart.” (“Can you please turn off your damn computer and come watch television with the rest of the family,” the dad now cries to the teenager.)

Yet everything that is said about the Internet’s destruction of “interiority” was said for decades about television, and just as loudly. Jerry Mander’s “Four Arguments for the Elimination of Television,” in the nineteen-seventies, turned on television’s addictive nature and its destruction of viewers’ inner lives; a little later, George Trow proposed that television produced the absence of context, the disintegration of the frame—the very things, in short, that the Internet is doing now. And Bill McKibben ended his book on television by comparing watching TV to watching ducks on a pond (advantage: ducks), in the same spirit in which Nicholas Carr leaves his computer screen to read “Walden.”

Now television is the harmless little fireplace over in the corner, where the family gathers to watch “Entourage.” TV isn’t just docile; it’s positively benevolent. This makes you think that what made television so evil back when it was evil was not its essence but its omnipresence. Once it is not everything, it can be merely something. The real demon in the machine is the tirelessness of the user. A meatless Monday has advantages over enforced vegetarianism, because it helps release the pressure on the food system without making undue demands on the eaters. In the same way, an unplugged Sunday is a better idea than turning off the Internet completely, since it demonstrates that we can get along just fine without the screens, if only for a day.

Adam Gopnik, “The Information”, The New Yorker (14 & 21 February 2011), 130.


Being up on the latest movies is not as important when one becomes a parent…

When I used to hear parents complain that they didn’t have even a few minutes to scan the newspaper, let alone finish a novel or catch all that year’s Oscar nominees, I inwardly scoffed (sometimes outwardly). Clearly, they just didn’t care that much about Syria or Hilary Mantel or Quentin Tarantino. I mean, really, an average newborn sleeps like 16 hours a day; you can’t squeeze in a Lydia Davis short story or a half-hour of “Girls”? Trying to talk to new moms and dads about culture felt to me like trying to talk to prison inmates about their favorite brunch spots.

Forgive me, fellow parents. I am now in that prison.

I now understand that, yes, a baby seems to sleep a lot, but that you’re watching them for many of those hours to make sure they’re still alive. (Newborn breathing, in my experience, can sound uncomfortably like a death rattle.) I’ve learned that, especially in those early months, time means nothing. Days dwindle to fleeting dots, but individual moments — like getting a sobbing kid into a car seat — can feel endless. And I’ve learned that, in the foggy, confusing, often frightening slipstream of a baby’s arrival, the things that vanish, no matter how much you love and think you need them, are the things that are not all that useful or necessary — even if they once seemed essential to your sense of self. Inessential things like antiquing or golf or, yes, going to the movies. I instantly became one of those fathers I mocked.

Jason McBride, “Two (Sucked) Thumbs Up”, The New York Times Magazine (1 September 2013), 44.


Enjoying cliffhangers done well versus those not done well

When done poorly, the cliffhanger is all about shoddy craftsmanship, the creepy manipulation by a storyteller who has run out of tricks. When done well, however, it can be about much more: surprise, shock, outrage, and pleasure—the sort of thing that might send you dancing off the sofa. The cliffhanger is part of some of the silliest shows on TV; it’s also key to understanding many of the greatest ones. It’s the visceral jolt that’s not so easily detached from television’s most erudite achievements. But, then, that’s the mind-body problem of TV, a conversation that has only just begun.

Emily Nussbaum, “Tune in Next Week”, The New Yorker (30 July 2012), 74.


The crucial revolution was not of print but of paper: with easier production of paper, lists were generated

Among Ever-Wasers, the Harvard historian Ann Blair may be the most ambitious. In her book “Too Much to Know: Managing Scholarly Information Before the Modern Age,” she makes the case that what we’re going through is like what others went through a very long while ago. Against the cartoon history of Shirky or Tooby, Blair argues that the sense of “information overload” was not the consequence of Gutenberg but already in place before printing began. She wants us to resist “trying to reduce the complex causal nexus behind the transition from Renaissance to Enlightenment to the impact of a technology or any particular set of ideas.” Anyway, the crucial revolution was not of print but of paper: “During the later Middle Ages a staggering growth in the production of manuscripts, facilitated by the use of paper, accompanied a great expansion of readers outside the monastic and scholastic contexts.” For that matter, our minds were altered less by books than by index slips. Activities that seem quite twenty-first century, she shows, began when people cut and pasted from one manuscript to another; made aggregated news in compendiums; passed around précis. “Early modern finding devices” were forced into existence: lists of authorities, lists of headings.

Everyone complained about what the new information technologies were doing to our minds. Everyone said that the flood of books produced a restless, fractured attention. Everyone complained that pamphlets and poems were breaking kids’ ability to concentrate, that big good handmade books were ignored, swept aside by printed works that, as Erasmus said, “are foolish, ignorant, malignant, libelous, mad.” The reader consulting a card catalogue in a library was living a revolution as momentous, and as disorienting, as our own. The book index was the search engine of its era, and needed to be explained at length to puzzled researchers—as, for that matter, did the Hermione-like idea of “looking things up.” That uniquely evil and necessary thing the comprehensive review of many different books on a related subject, with the necessary oversimplification of their ideas that it demanded, was already around in 1500, and already being accused of missing all the points. In the period when many of the big, classic books that we no longer have time to read were being written, the general complaint was that there wasn’t enough time to read big, classic books.

Adam Gopnik, “The Information”, The New Yorker (14 & 21 February 2011), 128-129.