Death of news, “We All Want to Save the World” edition


The following correction recently appeared on a website owned by the Boston Globe newspaper:

Earlier today, [we] published a piece suggesting Harvard Business School Professor Ben Edelman sent an email with racist overtones to Sichuan Garden.  We cannot verify that Edelman, in fact, sent the email.  We have taken this story down.

Consider that fatal phrase, “in fact.”  The defense of the news media rests entirely on the claim that editors vet the facts communicated by a news report.  Unlike, say, social media, journalism is supposed to be factual, empirical, objective.  On that single claim an ambitious ideology of news has been erected:  what’s good for the news business is said to be good for democracy.

So what happened?  The site had access to the Globe’s densely packed phalanx of editorial personnel.  How could it go public with a story that, in fact, it couldn’t verify?

I don’t know the details.  I never read the offending piece.  But, having studied the news business for some time, I can easily come up with a theory of the case.

You have an accusation of racism.  You have a business school professor.  Capitalist pig stereotype, meet racist monster stereotype:  the two belong happily together.  If you are a young reporter of the usual social background and political predilections, eager to expose injustice, this marriage of stereotypes assumes the authority of Platonic truth.  It is, in fact, cosmically verified.


The same devotion to ideal forms explains the fact-blindness displayed by Rolling Stone magazine in its story about a rape at the University of Virginia.  As with the pre-emptively racist professor, that blindness was self-induced.  Sabrina Rubin Erdely, the author, hitched her wagon to a higher truth – exposing the “rape culture” on American campuses – but paid no attention to the potholes in that lowly road to uncertainty most of us call “reality.”

Rubin Erdely, 42, has penned articles on transgender longings, Catholic “secret sex files,” and the abuse of gays by Republican evangelists.  She’s a woman on a mission.  The details of the rape she reported were horrific, but they were also a grotesque stereotype of frat boy behavior.  She had spent six weeks hunting just such a stereotype, and found it at UVA, with its “aura of preppy success” and “throngs of toned and overwhelmingly blond students.”  Plus, it was Southern.  You have stereotype heaven.  What more likely setting for a sexual horror show?

Rubin Erdely was reporting on the pictures of the world inside her head, as confirmed by her solitary source – the supposed victim, Jackie.  It is now clear that she lied about cross-checking Jackie’s account, and that Rolling Stone, for all its democracy-preserving editors, took the lie at face value.  Will Dana, the magazine’s managing editor, said in 2006:  “we’ll write what we believe.”  The rape story fit the pictures inside his head.

That the story has come undone – Jackie, it turns out, was lying too – is less surprising than the reaction of many right-thinking observers.  They seem upset that anyone would question even a single instance of sexual assault.  They believe in “rape denialists”:  people who deny rape exists.  In rape cases, they maintain, we “should believe, as a matter of default, what an accuser says.”  Facts matter less than theology.

The existence of this cult places Rubin Erdely’s high-risk deceptions in perspective.  She imagined that the only permissible response to a charge of rape was dogmatic applause.

It could be argued that Rolling Stone is a pop culture rag, and Sabrina Rubin Erdely is less a journalist than a manufacturer of moralistic fables.  But real people were accused.  Real damage was done.  All this was invisible to Rubin Erdely.  She failed to interview the alleged rapists because she already knew who they were:  a combination of the “overwhelmingly blond” bad frat guys from Animal House and the vicious serial torturer in Silence of the Lambs.


The aftermath demonstrated another peculiarity of journalists:  although they claim factuality and objectivity, they never reveal much about their choices, and they rarely accept responsibility for their errors.  Rubin Erdely, that fierce exposer of injustice, retreated behind Rolling Stone’s public relations staff.  Rolling Stone, in turn, initially shifted the blame onto Jackie, complaining that its trust in her had been “misplaced.”  This caused such an uproar that the magazine followed up with a “Note to Our Readers” blandly acknowledging bad “judgments” and “mistakes” with a palpable lack of regret.


Somehow, the voices from heaven never command that the news business expose itself to scrutiny.  Reporters cover everyone except other reporters.  When it comes to the anthropology of journalism, we don’t know what we don’t know.  Occasionally, however, someone breaks omertà, and what emerges is never pretty.

Matti Friedman, a former AP correspondent, has written a depressing piece for The Atlantic about news coverage of Israel.  Here, far afield, we find the same cast of characters that we have encountered before in Boston and at UVA:  men and women from a certain class, on a mission to “help.”  They work for NGOs like Amnesty International, international agencies like the UN – and, of course, for news purveyors like the AP.  They drink together, court, and no doubt bed each other.  They share information.  They come to think alike.

In these circles, in my experience [writes Friedman] a distaste for Israel has come to be something between an acceptable prejudice and a prerequisite for entry.  I don’t mean a critical approach to Israeli policies . . . but a belief that to some extent the Jews of Israel are a symbol of the world’s ills, particularly those connected to nationalism, militarism, colonialism, and racism – an idea quickly becoming one of the central elements of the Western “progressive” zeitgeist, spreading from the European left to American college campuses and intellectuals, including journalists.

The Jews are the white frat boys of the Middle East.  They are invisible as a people, and can be perceived by Western reporters solely in the guise of a crude stereotype:  the human monsters of their imagination.  Israel is the only democracy in a desert of despotism, but that truth is selected out of their field of vision.  Friedman cites Orwell:  “The argument that to tell the truth would be ‘inopportune’ or would ‘play into the hands’ of someone or other is felt to be unanswerable.”

“Inopportune” to those who “have largely assumed a role of advocacy of the Palestinians and against Israel.”  “Play into the hands” of the forces of evil – the devilish Jews.  Journalists covering the conflict have made a canonical choice.  Readers back home, they have decided, are best served when exposed to just one side of the story.  Friedman was forbidden from interviewing the head of a rare pro-Israel NGO:  “In my time as an AP writer moving through the local conflict, with its myriad lunatics, bigots, and killers, the only person I ever saw subjected to an interview ban was this professor.”

I have argued elsewhere that there’s no special category of information we can call “the news.”  This applies with particular strength to the Israeli-Palestinian struggle.  What we get isn’t a slice of reality, “new” information, or facts about important events, but rather “a kind of modern morality play in which the Jews of Israel are displayed more than any other people on earth as an example of moral failure.”  This attitude, Friedman reminds us, has “deep roots in Western civilization.”


The final act in the recent self-detonation of the news media may seem unconnected to the rest:  the fatal assault and battery on The New Republic by its owner, Chris Hughes.  Some, indeed, contend that no significant lessons about journalism can be derived from this episode.  It’s just another case of digital technology disrupting the print media business, or maybe “the story of one incompetent media mogul.”

Both explanations are valid – but let me suggest another.  The relationship of this particular disruptor-mogul to the news business was pathologically subjective, and can be explained only in the context of my other stories.

The facts of the matter are well known.  Hughes purchased TNR, the faded beauty queen of the intellectual left, at “fire sale” prices in 2012.  He has since managed to antagonize most of the staff, who felt that he lied to them and betrayed the historic mission of the magazine.  When Hughes fired Franklin Foer, the editor, after publicly endorsing him, all senior and contributing editors walked out the door – horrified, as one put it, that the dowdy old institution was being “vandalized” by ownership.

Think of Chris Hughes as Sabrina Rubin Erdely plus $700 million.  His wealth added a certain density to the fantasies inside his head.  His goals for TNR were famously captured by the CEO he brought in to run the magazine:  “We need to just break shit.”  Likely translation:  “Go find stereotypes of human devils we can smack around like piñatas.”

Hughes, 31, made his money by being Mark Zuckerberg’s roommate at Harvard, and has spent much of his short life trying to find a suitably idealistic way to justify this accident of fate.  Purchasing TNR was part of his personal quest.  He wanted to break shit, center stage.  He longed to hear the dogmatic applause of sectarians.  He could trample on the work and reputation of others because he was rich, but also because they were invisible and he was headed to a far better place.

The significance of the news business for Hughes – as for Rubin Erdely – has nothing to do with facts, objectivity, reality, democracy, or even the reader.  The news, for both, is a field of dreams for the game of social identity and personal justification.


Traditional news reports, Walter Lippmann warned long ago, should not be confused with truth.  But journalism today seems to have strayed into a labyrinth of blind subjectivity, where wish-fulfillment controls information and the sole permissible activity is the ritual slaughter of imaginary beings.

The sociopolitical implications aren’t trivial.  An institution once devoted to the manufacture of mass opinion has turned divisive and inquisitorial.  It plays favorites and lacks even the pretense of integrity, thus contributing to the contemporary delusion that public debate must mean making a vast noise of negation.

The difference between social media and the news, in fact, is that the rant appears honestly and openly in the one but comes sneakily, with a bad conscience, in the other.

Posted in death of news

Trolls and the trolled


You enter a dark forest and encounter a stranger.  You both carry something of value.  Neither of you can appeal to the police for protection.  What happens in the dark forest stays in the dark forest, forever.

How should you behave in that situation?  More fundamentally:  what’s the right principle of action?  Sympathy for a fellow traveler?  Hostility toward a potential criminal?  Aggression and theft, since these will go unpunished?

We spend a large chunk of our lives in that dark forest.  It’s the web.  It exists in a state of nature, as I noted as long ago as 2005.  The object of value is our new-found capacity to convey information to the world, no matter how far we stand from the centers of power and mass communications.  The dilemma of right action centers entirely on the character of that nameless stranger.


On the internet, there are worse things than to flirt, unawares, with a dog:  the worst, by far, is being stuck alone in that dark forest with a troll.

The troll is an animal without discernible features, except for a protective shroud of obscenities.  It joins virtual conversations only to destroy them, and jumps onto online controversies only to threaten and vilify.  Trolls can gather in lynch mobs or act as solitary vandals, but they are always bizarrely persistent:  they will continue to shout until they outlast you, though they have little to gain by doing so.

Academics have studied the condition without throwing much light on it.  One study finds that trolls display “sadism, psychopathy, and Machiavellianism.”  Another claims that they suffer from “antisocial personality disorder.”  These are words that describe what they seek to explain.  Yet I can produce a simple explanation without trying too hard.  Trolls may be who we are:  and by “we” I mean the Big We, our species, the human race.

Our ancestors evolved biologically in hunter-gatherer bands, and culturally in villages and small towns, where everybody knew everybody and everybody watched everybody else.  Even among the crowds of contemporary Manhattan, signs of public deviance are quickly detected and draw the police.  What happens when these constraints are removed?  Nobody could know.  We were never free to be our secret selves – to say exactly what we wanted to say – until our late migration into the dark forest of the web.

Now, the returns are in.  Some people, a meaningful number, love to say vicious and destructive things for the sheer hell of it.  They do it because they can:  for the fun, “for the lulz,” for the second-hand thrill of smashing things while nobody’s looking.

Here’s the important point:  trolls and trolling aren’t an incidental side-effect of an otherwise benevolent web, like a rash from penicillin.  They are intrinsic and foundational to it, emanating out of the heart of the “hacker ethic” that proclaims that all information wants to be free, and they have drawn the endless conversation that is the digital universe in the direction of reflexive negation and rhetorical violence.


Two powerful but opposite forces twist and warp the global information sphere:  one pulling toward community, the other toward nihilism.

The spontaneous clustering of communities around some shared topic of interest – video games, say, or a political cause – was an immediate and much commented-upon effect of the internet.  These communities represented the authentic tastes of the public, very consciously in opposition to the authoritative dictates of the elites.  The early blogosphere was thus fiercely anti-“MSM,” for example.  Wikipedia became the un-Britannica.

Communities of interest were a revolt against the top of the information order, but increasingly need protection from the depths.  Trolls surface, disrupt, and often drive away those who truly care about the topic.  A humble example:  when the Nationals baseball team came to Washington in 2005, a vibrant community sprang up around a number of new websites.  Ten years later, most Nationals sites had been killed off.  The aggravation of dealing with trolls outweighed the joy of talking baseball.

To stay on-topic, digital platforms must wage a Darwinian struggle with trolls.  Sites are “curated”:  that is, gated to a greater or lesser extent.  Comment sections get switched off.  Participants and interactions now depend on the will of someone in charge, so that what began as an assault on elite-based filtering becomes a filtering exercise of its own.  The new elites, like the old ones, are trapped in a version of the dictator’s dilemma:  the more they filter and control, the smaller their reach, and the less authentic the community.  In the age of Reddit and Twitter, the trolls will flame you regardless.

Trolls exert constant adaptive pressure on virtual behavior.  Authenticity is now identified with louder, nastier, angrier.  The more enduring and influential sites aren’t just against, but virulently so.  Tea Partiers warn, “Don’t tread on me.”  Feminists turn into “social justice warriors.”  Even Washington Nationals fans have come to be dominated online by angry ranters, nicknamed “LOD”:  the Legion of Doom.  The survival strategy seems to be to out-troll the trolls before they arrive.

There are exceptions.  You can form an idealistic community dedicated to Deirdre McCloskey’s theories on rhetoric and economics:  obscurity will be your friend.  Or you can participate in the vast positive enterprise of Wikipedia, where volunteer watchmen keep out the trolls.  That is still possible.  But obscurity is an uncertain gamble, Wikipedia has ever fewer watchmen along the city walls, and the digital barbarians never rest.


So what is the right way to act, if the stranger in the dark forest is revealed as a troll?  Options are limited.  You can unplug:  wear a hairshirt, hide in a wilderness, feed on locusts and wild honey.  Others have tried this.  Almost invariably, such information exiles return home.

You can follow the conventional wisdom that says, “Don’t feed the trolls.”  Meaning:  if you ignore them, they will go away.  Andrey Miroshnichenko argues, with sound logic, that the purpose of every interaction on the web is to “elicit a response”:  so you have theoretical justification for this approach.  Empirically, things look shakier.  To ignore the troll means to look away from the vandal heaving a rock through your window.  He now has an incentive to come back and try again.

You can anoint curators and erect thick walls around your community, but this, we saw, leads straight to the dictator’s dilemma and a new class of information gatekeepers.

Let me suggest another way to look at the vexing question of trolls and the trolled.

The existential threat posed by trolling is that of the entropy of systems.  Every system accumulates noise, and will become increasingly disorganized unless sanitary measures are taken.  Mass media applied those measures brutally, upfront.  Publishers and editors chopped information down to tiny, discrete, largely unconnected gobs.  It became a kind of code only elites could understand – but it cleaned out the noise.  Trolls don’t haunt the newsprint edition of the New York Times.

The digital information system works on a different principle.  It produces an astronomic volume of stuff every instant.  Publisher and audience, author and editor, are all assumed to be the same collective entity, the public, and to prefer signal to noise.  That preference holds even for the troll, who sends strong signals about his favorite hipster sunglasses and mobile devices.  He’s a part-time troll but a full-time member of the public, and thus a signal amplifier.

It is at the level of the public – not of the author, editor, or publisher – that digital information sorts out relevance from noise.  The public, in turn, can be many things:  a person, an extended family, a virtual community, a national or global cluster.  Each layer of the honeycomb extracts meaning out of the Amazonian flood of content sweeping the landscape.  When the system works to specs, the result is far easier access to and a deeper engagement with information than the products of mass media allow.

From this perspective, the encounter in the dark forest resolves into a personal choice.  You can deal with the troll as a special case, an irreconcilable enemy, a show-stopper – or as just another feature of a complex landscape, part of the cost of extracting signal from the system.  The first tactic is self-fulfilling:  the troll becomes a destroyer of digital worlds.  The second shrinks the troll to the dimensions of a pop-up ad:  he can’t be ignored, really, but in the long journey toward meaning he can be circumvented.

It’s even possible to squeeze signal out of trolling.   Under authoritarian rule, for example, online stooges often reveal the fears and pain-points of the regime.

The arrow of entropy moves in only one direction.  From the laws of thermodynamics, we learn that disorder must eventually overwhelm all systems, including the digital and physical universes.  In the cosmic long run, the trolls will triumph.  For the next few billion years, however, we retain the choice to push and shove in the opposite direction.

Posted in new media, the public, web

The Gruber revelations


The digital ink was scarcely dry on my last post when a controversy erupted to illustrate its main point.  I wrote that a growing distance between rulers and ruled lay at the heart of the crisis of liberal democracy.  Moments later, in a series of web videos, Jonathan Gruber delivered a virtuoso performance in the pathology of political distance.

Gruber is an economist at MIT beclouded with honors and titles:  a sort of pan-dimensional, hyper-intelligent being whose Big Thought was intellectual authorship of portions of President Obama’s 2010 health care law.  To anyone willing to listen, Professor Gruber ever since has insisted that passage of the law was possible only by means of the noble lie.

“The bill was written in a tortured way to make sure CBO did not score the mandate as taxes.  If CBO scored the mandate as taxes, the bill dies. . .  In terms of risk rated subsidies, if you had a bill which said that healthy people are going to pay in – you made explicit, healthy people pay in and sick people get money, it would not have passed.”

The professor went on:

“Lack of transparency is a huge political advantage.  And basically, call it the stupidity of the American voter or whatever, but basically that was really critical to getting the thing to pass.  Look, I wish…that we could make it all transparent, but I’d rather have this law than not.”

This wasn’t said in a shuttered smoke-filled room, to a cabal of shadowy political manipulators.  Nor was it a verbal outburst fueled by emotion or inebriation:  a “gaffe.”   Here and elsewhere, Gruber offered his considered reflections at open academic conferences, where they were received as such, without fuss.

The tenor and substance of Gruber’s argument is always the same.  Democracy, he regrets, isn’t up to the job.  The gap between the brilliant shepherds who rule the nation and the simple-minded sheep they must lead to pasture can be bridged only with reassuring lies.  To enact a necessary law, the public had to be deceived.

His audiences responded to such utterances in the same manner, too:  with nods of understanding and the occasional chuckle of mild amusement.  In an age of infinite offense-taking, it seems nobody felt offended by the trashing of American democracy.

That changed when videos of his talks went viral.  In the great angry noise that ensued, however, I have yet to hear anyone ask the most pertinent question:  how is this possible?  Jonathan Gruber isn’t a revolutionary or a neo-Nazi.  He’s a mainstream academic who means well for his country.  His audiences probably share the same description.  How can it be that they assume, collectively, without debate, the stupidity of the public, the failure of the democratic process, and the need for those in power to rule by trickery?

Conservatives like Charles Krauthammer find the explanation in “the arrogance of an academic liberalism that rules in the name of a citizenry it mocks, disdains and deliberately, contemptuously deceives.”  The charge has some merit.  Those who aspire to the title should ask how an attitude of scorn for public opinion and representative democracy and honest dealing can be construed as “progressive.”

Barack Obama’s blithe dismissal of the legitimacy of the midterm elections fits snugly into this pattern of illiberal liberalism.

But it was President Bush, compassionate conservative, who erected a police apparatus that treats ordinary Americans like enemies of the state in the public places of their own land.  There can hardly be greater disdain than that.

For Bush, it was security, for Obama, welfare:  the practical effect, surely intended, was a desperate clinging by both to the protective distance between power and the public.

Distance explains Professor Gruber and his untoward proclamations.  Gruber resides at the top of the pyramid of the expert class, where policy wonks schmooze with political players.  The worldview from the heights is more 1930s than new millennium.  The illusion of invisibility from the public persists:  distance makes blind.  The professor thought he was speaking strictly entre nous, among members of his club.

But why did he dwell on the stupidity of the public?  Well, the public doesn’t have all of his diplomas.  It doesn’t speak to presidents.  It doesn’t participate in legislation.  The public lacks the tribal tokens and credentials that buttress Gruber’s identity as an expert, and make him feel interesting and important.  That one could be smart and still lack them never enters his mind for an instant.

Gruber – like Obama and Bush, and all government elites – peers at the public from the wrong end of a telescope, until it resembles a tiny but multitudinous insect, a swarm of fire ants on the move, driven by primitive urges, but deprived of reasoning capacity.  The public is where terrorists and white supremacists and obesity come from. The job of government is to tame this beast.  The awkwardness of the democratic process, from the elites’ perspective, is that it demands that noble lie.

The tragedy of democracy, from a historical perspective, is that the public is a very different creature than the elites imagine.  The public is far from blind:  it body-scans the elites right back, in digital, searchable formats.  With a click of my laptop, for example, I can see Jonathan Gruber boast about lying to the voters, again and again.  He is deluded in his invisibility but exposed in his fraud.

The trouble with the noble lie is that it’s tough to make a case for nobility once the lie has been found out.

Trust between rulers and ruled functions much like Humpty Dumpty:  once broken, it can’t be put together again.  That is where we are today.  The lies are all found out.  The public mistrusts every word and act that comes from government – and the hyper-intelligent Professor Gruber’s revelations confirm that it has good reason to do so.

Posted in the public

Distance and the sickness of democracy


Let’s peer through the fog of events at the place where we now find ourselves.

Americans are unhappy with their politics, unhappy with their politicians, distrustful of their government to an almost pathological extent.  To people of the left, our system of government is a puppet show performed for the enrichment of Big Business oligarchs.  To people of the right, it is an instrument of tyranny wielded by Big Government elites.  The impact of the public’s distrust is such that these venerable categories, left and right, explode into bits.

What does a Tea Party activist have in common with Mitt Romney?  What ideals or habits of life does a “social justice warrior” share with Hillary Clinton?  Romney and Clinton ride the top of the pyramid:  Walter Lippmann, that sincere elitist, would call them “insiders.”  The others are on the outside, looking in.  Their wish to obliterate the hierarchy trumps their considerable differences in ideology.

Political unhappiness isn’t a uniquely American condition.  It’s universal among the old democracies.  The president of France, François Hollande, elected in 2012, has broken records in unpopularity.  The government of David Cameron barely avoided presiding over the dissolution of the United Kingdom.  Italy has been incapable of holding meaningful elections:  the current prime minister, Matteo Renzi, though popular for the moment, is essentially unelected.

Government as such is in crisis.  Democracy has lost the authorizing magic of legitimacy.  So:  those of us, here in the US or anywhere on planet Earth, who find far more good than evil in the democratic system of government, must account for its present sickness and specify the path back to good health.

Forget the Usual Suspects

The crisis of government isn’t about economic mismanagement.  Such mismanagement has been habitual and appalling:   no sane observer will dispute that.  Government blessed the drunken-sailor bets placed on “subprime” securities before 2008.  Government then enacted a trillion-dollar stimulus in 2009 that, by its own measures, achieved nothing.  This is an old story.

Franklin Roosevelt assumed the presidency under much grimmer economic conditions than we now experience, and eight years later, on the eve of World War II, little had improved under his management.  But the public kept faith.  The many new institutions FDR erected were mostly stage business:  but they possessed legitimacy.  Given trust and legitimacy, democratic government will survive mismanagement.

Nor is the crisis about our geopolitical difficulties in a world that “is exploding all over.”  John Kennedy was humiliated face to face by Nikita Khrushchev and more indirectly defeated at the Bay of Pigs, yet he kept his standing with the public.  Americans rallied to a president facing trouble in the world, and JFK could tap into this reservoir of trust and good will.

Certainly, the crisis of government isn’t about conspiracies spun by evil geniuses who manage to be invisible and irresistible all at once.  Self-interested conspiracies have always circled political power:  but consider the empirical claims being made.  Barack Obama isn’t George III.  The accusation is laughable at very many levels.  The Koch brothers haven’t purchased any portion of the US government.  Their total worth couldn’t cover one year of Medicare disbursements.

It is a judgment on the intellectual capacity of our age of distrust that a weak, floundering president and a pair of rich political dilettantes can be portrayed as American tyrants.

These conditions – a bad economy, a disordered world, the paranoid style – infect our democracy only because our democracy was already sickened, already stripped of immunity to the predictable troubles of political life.  The supposedly life-and-death issues that divide the electorate are in truth surface symptoms of an underlying malady.

Worry About Relations Between Rulers and Ruled

The supreme question that history has placed before us concerns a fundamental choice in politics:  the distance between rulers and ruled.

Great democratic institutions – presidents, parliaments, executive agencies and regulatory bodies – have come to perch atop structures that immensely magnify their remoteness from ordinary people.  Distance is manifested literally, physically.  President Obama moves about by helicopter and motorcade, protected by hundreds of armed Praetorians.  When he attends a baseball game, fans are told to show up hours early, so they can be duly frisked beforehand.

These procedures are a symbol of the profoundly undemocratic spirit of our democratic institutions.

Rulers and elites hide behind massive bureaucracies ringed by nervous security.  Nobody knows what they think, or why they think it.  On occasion, they allow themselves to be questioned by fellow elites from the media.  The questions are insider questions.  The answers are insider answers.  The public’s concerns, being a mystery to questioners and answerers alike, are never broached.

Congress enacts bills that are hundreds of pages long and written in incomprehensible language.  Bureaucrats interpret them, and though obscure and unelected, become the true legislators.  Regulators, also often unelected and unaccountable, add another degree of separation.  The Federal government plays favorites among the thicket of laws, choosing which to enforce and which to waive.

Elected officials of all stripes and denominations form a party of sorts:  the party of distance.  Government is lost in the Olympian heights, and the public encounters the democratic system at the level of the metal detector and the TSA body scan.

In the digital age, this will not stand.

Move Politics to the Deep End of the Pool

It will not stand because the public, too, forms a party of sorts:  the party of rejection.  The weakness of Hollande in France, Cameron in Britain, and Barack Obama in the US can be reckoned in direct proportion to the force of this rejection.

Once a polite audience, the public has leaped onto the stage of history and wishes to play the hero.  Its loud, persistent voice can be heard on digital platforms everywhere and at all times, reflexively cynical, offering the most sinister interpretations of government failure: attacking, condemning, rejecting.  Traditional issues of political life, like unemployment, like war and peace, have become proxies for the relentless struggle of the public against the established order.

Distance induces blindness in rulers, blind distrust in the ruled.  That is the crisis of government and the sickness of democracy, in brief.

This is not the place to describe the cure in detail.  The general approach, however, shouldn’t be much of a mystery.

Obviously, the steep Federal hierarchies will have to be flattened, to bring them into greater sympathy with our networked age.  Obviously, aristocratic perks and protections, the motorcades and the bodyguards, will have to be reduced to a minimum or eliminated entirely.  Obviously, laws will have to be of a length that can be read and understood by an ordinarily intelligent person, and written in American English.

But such ideas amount to navel-gazing of the most useless kind, unless American politicians dare to move out from the shallow end of the pool.  They will have no reason to do so until the public forces them.

Ideology is no barrier to action.  Those who believe in political power as a force for good need a trusted and legitimate government for their instrument.  Those who believe political power must be limited and bound to the people will find in the reduction of distance a congenial project.

Only trolls and nihilists will object to turning away from a self-destructive path:  but let’s remember that it is precisely the troll and the nihilist, the party of suicide, who often set the tone for American politics today.


Revolt of the public – Midterm elections report


The thesis of my new book can be gleaned from its title:  The Revolt of the Public and the Crisis of Authority in the New Millennium.  A hyperactive public, I maintain, has emerged from the dormant masses of the industrial era.  Gathered in networked communities, riding on digital platforms, this public has taken command of the information sphere and battered established institutions everywhere.

The consequences are all around us.  Proud political structures have been humiliated and stripped of legitimacy.  The pillars of an established order that has endured a century and a half visibly totter, and sometimes fall.

Despite all the evidence, I confess that when the book came out I felt a bit like someone afflicted with Tourette’s syndrome:  I was shouting incomprehensible profanities in a room full of people talking about something else.  The big conversation followed traditional topics.  Experts droned on about liberals and conservatives, Democrats and Republicans – about the president, the government, the cozy world of power wielders.  Nobody mentioned the public or its relentless assault on the institutions.  Few remarked on the crisis of legitimacy, or noticed the faint odor of political decay.

This has changed with startling rapidity.  The news media has discovered the chasm of distrust between ordinary Americans and their government.  Commentators have assumed an almost mandatory tone of despair about our present moment:  an era, they moan, of “great disruption” and “disgust.”  Even the New York Times has gotten into the doomsday business.

In bits and pieces, the revolt of the public is going mainstream – and now, perversely, I worry about what this portends for the health of liberal democracy.

“How Did We Lose Our Democracy?”

The ruling institutions are always being surprised.  From the bureaucratic heights, the public appears very far away, and trouble will be detected in advance only when internal alarms ring.  Under authoritarian regimes, this means mostly never.  Hosni Mubarak was ushered to prison in a state of utter befuddlement.  Videos of Muammar Qaddafi show him being led to his death in the same condition.  Neither had a clue about what had happened to them.

In representative democracies, however, certain elections attract the nervous attention of the elites.

The unruly temper of the public, expressed at the voting booth, has meant a wild gyration of parties and ideologies in power.  Whoever is in, the public wants out.  Since 2010, Britain, Spain, and France have each reversed the previous electoral mandate.  In the US, the last three national elections have swung in contradictory directions.

Now midterm elections are at hand, and conventional wisdom is taking for granted another mood swing, this time in favor of the Republicans.  Big media players like the NYT – surprised, as always, to find events rather than elites in the saddle – have gone out among the populace, searching for explanations.  What they have learned will sound familiar to my readers:  the present crisis of political authority, we are told, is the natural outcome of government failure.

Here is Peter Baker, in what was billed as “news analysis” for the gray old Times:

With every passing week or month it seems some agency or another has had a misstep or has been caught up in scandals that have deeply eroded public confidence.  The Internal Revenue Service targets political groups, the Border Patrol is overwhelmed by children illegally crossing the Rio Grande, the Department of Veterans Affairs covers up poor service, and the Secret Service fails to guard the president and his White House.

Now public esteem for the long-respected Centers for Disease Control and Prevention has plummeted with the arrival of Ebola on American shores…

[…] Polling by Gallup shows that since June 2009, in the heyday of the new Obama presidency, public confidence in virtually every major institution of American life has fallen, including organized religion, the military, the Supreme Court, public schools, newspapers, Congress, television news, the police, the presidency, the medical system, the criminal justice system and small business.

President Obama’s collapsing political fortunes are touched on, as well as Congress’s abysmal standing in public opinion.  Baker notes, correctly, that “the broader trend precedes Mr. Obama and extends beyond politics” – yet he offers no reason for this remarkable trajectory of institutional failure, beyond an obligatory mention of the “toxic environment” in Washington.  But the poison in Washington must have a source.  Polarization is an effect:  elites fighting ever more furiously over less and less.  The cause is uncertain, and the public is beyond caring.  American politics today resemble an exceptionally rancorous, penalty-riddled NFL game, played in an empty stadium.

For all its mock horror of partisanship, the news media is a ruthless participant in the game.  “What’s not to hate?  Start with the politicians on the ballot:  a surfeit of dim-bulb partisans pledged to further gridlock,” writes Timothy Egan in the NYT’s “Opinion Pages.”  Egan doesn’t hate all politicians equally.  He favors the president and the Democrats, and he maintains that only a conspiracy by moneyed interests can bring about the results predicted for the midterms:  “Oligarchs hiding behind front groups – Citizens for Fluffy Pillows – are pulling the levers of the 2014 campaign, and overwhelmingly aiding Republicans.”

If Egan’s conspiratorial cesspool faithfully represents the world, then American democracy, as currently practiced, is an illegitimate form of government.

How did we lose our democracy?  Slowly at first, then all at once.  This fall, voters are more disgusted, more bored and more cynical about the midterm elections than at any time in at least two decades.

So what is the alternative?  Egan doesn’t say.  He strikes out blindly, without much thought to the consequences.  That is typical of our moment.  He hates a possible electoral outcome, so he rejects the system that might bring it about.  That also is typical, and frequently met with.  Egan’s rant rests on an idea that has taken hold among the articulate elites.  If the public is truly that sheep-like and easy to manipulate, then the democratic process, under any circumstances, poses a fatal threat to good government.

Distrust of elections is by no means restricted to liberals or Democrats.  It burns just as brightly on the other side of the partisan divide.  Glenn Reynolds, libertarian and Obama-basher, constantly reminds readers of his blog that victory by Republicans must exceed the “margin of fraud” – perpetrated in this case by Democrats rather than oligarchs.

“Change Is Coming. Big Change”

The Revolt of the Public tells the story of an age that seems stuck in its own ending.  The public craves an escape from the status quo, but won’t spell out a new direction.  The institutions have been blinded and crippled by the speed of life in the 21st century.  Rulers, politicians, and bureaucrats can hear the crowds roaring outside the palace windows, but all they can do is repeat the same old formulas in the hope that things turn out better next time.  Change, radical change, seems inescapable, but isn’t:  instead a sort of zombie democracy, a political body without a soul, staggers from failure to failure.

Ron Fournier of National Journal agrees with digital politics guru Joe Trippi, whom he quotes as saying, “Look beneath the surface, and you’ll see this is more of an anti-incumbent, anti-establishment year than people realize.”  Concludes Trippi:  “Change is coming.  Big change.”  Not today or tomorrow, Fournier acknowledges, but maybe “a presidential cycle or two away.”

Like all who contemplate our peculiar moment, Fournier is struck by the power of technology to alter the social and commercial landscape – and he imagines, as many have, that politics must be next.

I ask you, how long before Americans recognize they’re no less equipped to disrupt politics and government?  How long before we stop settling for an inferior product in Washington and at statehouses?  When do we demand more and better from the Democratic and Republican parties – or create new political organizations to usurp the old?

[…] Huge majorities of Americans say the country is on the wrong track.  They see a grim future for themselves, their children, and their country.  They believe political leaders are selfish, greedy, and short-sighted – unable and/or unwilling to shield most people from wrenching economic and social change.

The heart of the matter, Fournier understands, is alienation, not polarization.  American voters are defecting from the mainstream parties.  The public is leaping out of the past into the dark, into nowhere and nothing:  that is my interpretation.  Fournier is a bit more sanguine.  Disruption of the system is “a matter of when, not if,” he proclaims with apparent cheerfulness.  The “Old Guard” he dismisses as “a ship of fools, living on borrowed time.”  That’s probably true.  But there’s an assumption, never stated, that the public is headed to a positive somewhere, that disruption isn’t just another word for nihilism:  and of that, I’m not so sure.

Fournier commits the cardinal sin of the middle-aged:  he drops the burden of his hopes on the shoulders of the unsuspecting young.  Specifically, he links political change to the millennial generation, which he imagines to be “relatively civic-minded, pragmatic, tolerant, diverse, and less interested in ideology than results.”  I have no idea whether anything intelligible can be said about a generation:  but these are empty virtues.  We learn exactly nothing from them about what millennials would seek to accomplish if, against all odds, they were allowed to influence national politics before the last Baby Boomer gets lowered into his grave.

To Fournier, change has become a good in itself.  No doubt he associates it with progress – but that is a terrible failure of the political imagination.  Change can take an infinite number of forms, many of them destructive of liberal democracy, tolerance, diversity, and all those millennial values he admires.  After all, the Weimar Republic changed.  German opinion determined that Weimar had failed beyond repair, and that anything other was an improvement.  That brought on Hitler.  Barbarians typically enter the city through open gates, left unguarded by citizens who despise their own institutions and are unable to imagine worse.  The Dark Ages were the most radical kind of change.

Change must be specified.  Democracy should be defended.  The impulse to nihilism we must combat in ourselves and confront in others, as a mark of decadence.  Candidates for the midterm elections, so far as I can tell, haven’t broached any of these large subjects.  Egan calls it the “disgust election.”  He’s projecting, and he’s almost certainly wrong.  The midterms of 2014, like their immediate precursors, will be a small election, fought with little feeling over small things – another lurch of the zombie, this time, if the prophets are to be believed, to the right.


Unbundling the nation-state


The most potent organizational form known to history, the nation-state, is fraying at the edges:  unbundling.  The recent “No” vote against independence in Scotland, analyzed correctly, showed every symptom of this unraveling.

Britain’s ruling elites panicked, and tried to bribe the Scots with offers of increased self-rule.  They betrayed a complete lack of confidence in their own legitimacy.  The Scottish public, consumed with grievance, seduced by negation, felt free to batter a political order that was defended by nobody.  The United Kingdom was preserved for now, but as a vexing question mark, a source of dissatisfaction and uncertainty, rather than as a settled political arrangement.  Separatists in Catalonia wishing to break away from Spain were encouraged, not disheartened, by the episode.

The Geopolitics of Dis-integration

The curious notion that all the land in the world, and every human being treading on it, must be assigned to some responsible national government, probably goes back to the Peace of Westphalia in 1648.  But the modern nation-state is an artifact of the industrial age:  it grinds on like an enormous factory floor, top-down, centralizing, standardizing, bureaucratic in form yet utopian in ambition.  Leviathan – government as horror movie monster – was born of structural and geopolitical pressures.  Administered by progressive elites, the monster became the nation.

The tide of history now runs in the opposite direction.

Geopolitical imperatives turned at the end of the Cold War.  West Germany absorbed the German Democratic Republic in 1990.  The following year, polarities reversed.  The Soviet Union fragmented into 15 separate “states,” most of them dysfunctional from the start.  Yugoslavia began a violent disintegration into a patchwork of statelets.  In 1993, Czechoslovakia split into its component parts.  Scotland and Catalonia, Flanders and Northern Italy, have since seen the rise of powerful political parties that clamor for secession from the center.  Belgium, a fictional country, could at any moment follow the Czechoslovak example.

The leap of faith that was the “Arab spring” shattered Syria, Iraq, Libya, and Yemen along ethnic or sectarian lines.  Jordan and Bahrain teeter on the edge.  In Lebanon – like Belgium, a country of convenience – every valley flies its own flag.  Sub-Saharan “failed states” have been carved up among rival warlords and ethnic groups.  Even China, the very model of a nation-state growing in power, appears more concerned with questions of legitimacy and internal cohesion than with throwing its weight around in the world.  The current trouble in Hong Kong is small potatoes to China’s rulers:  the specter that haunts them is dismemberment, the memory of the “warring states.”

For at least a generation, geopolitical stresses have been pulling nations apart.  The reasons are obscure.  I would guess that, from above, imperial concepts like “Europe” and “the caliphate” bled the nation-state of legitimacy without acquiring any political punch of their own.  From below, regions like Scotland undermined the ideal of the nation without providing an alternative:  the universal demand seems to be “we want out.”  Other geopolitical factors, I don’t doubt, are also in play.

The Structural Destiny of Unbundling

The historian understands that every era is bounded by certain structural limitations.  Leonardo designed flying machines but could never be an aviator, any more than Picasso could paint the Sistine ceiling or Genghis Khan could conquer America.  It is in this historical sense that one can speak of structure as destiny.

Industrial organization shaped the structural destiny of the last century.  Institutions became steeply hierarchical.  Elites strove to control the masses in the hope of leading them to some promised land.  Nation-states either organized on a near-military basis or were bullied by those who had done so.  Strong central governments thus seemed like a patriotic necessity.  The top-down model, so successful for industry, mesmerized politicians and bureaucrats.  It was believed to be guided by data and “science.”

If Bismarck was the godfather of the modern nation-state, its patron saint might have been Frederick Winslow Taylor, author of The Principles of Scientific Management.  Taylor advised managers to choreograph every movement of each factory worker.  Henry Ford was an admirer.  So was Lenin.

That historical moment ended with the new millennium.  The development of digital platforms made possible a new mode of organization:  the network.  It was immediately embraced by the public.  While the elites remained ensconced in bureaucratic institutions, a networked public tramped into these precincts of authority and felt free to muddy the carpets, break the crockery, mock mistakes of fact, rant at failures of policy, and condemn the entire institutional framework as a conspiracy of the few against the many.  The structural destiny of the digital age – bottom-up, egalitarian, anarchistic – resembled a bipolar leap away from the past.

From the commanding heights of the information sphere, the public can perceive the repeated failure of national governments, as well as the confusion and panic of the elites.  The effect has been revolutionary.  In Cairo and Madrid and Caracas and Kiev, the public has threatened or toppled the institutions of the old regime.

This global uprising of the networked public – analyzed, I’m obliged to note, in my new book – has begun to carve up the limbs and sinews of the modern nation-state.  The forces at play are structural, world-historical, and scarcely touched by patriotic or ideological concerns.  In a very real sense, Europe’s regional independence movements stand in the same relation to their central governments as did the indignados of Spain or the Tea Party in the US.  They detest and distrust national elites and ruling institutions, more than they fear the nation-state as such, or love their native soil.  Their energies are mobilized against the political status quo:  they want out.  But that is true of the larger public in their countries.

Labels of long pedigree have lost meaning here.  Two of the secessionist parties (in Scotland and Catalonia) are center-left, as we used to measure such things, while the other two (in Flanders and Northern Italy) are center-right.  All have in common a politics of pure negation.

An Arab public on the move has repeatedly collided with brittle top-down regimes, bringing about what one writer called “the chaos of an entire civilization that has broken down.”  In Syria and Libya and Iraq, where predatory governments had swallowed the nation-state whole, the mortal weakening of the one has induced the collapse of the other.  Governments in the region lack legitimacy, or even a notion of what legitimacy means.  A frustrated public can erupt in protest at any moment, mobilized by hostility toward the structures of power.

The unraveling of the Arab nation-state, I want to suggest, has only just begun.

Yet the pattern is global.  Everywhere, the bloated modern state has ingested national sovereignty.  The world of 2014 consists of a mosaic – not of nations, in truth, but of governments that claim to embody nations.  When a government fails and unravels, and is seen to fail and unravel, on center stage, by a public in command of the information sphere, old assumptions about nationhood are placed in doubt.  If government unbundles, how can the nation stay whole?  The fate of Syria and Libya and Iraq argues that it can’t.

That government is unbundling should be beyond dispute:  even our trillion-dollar Federal government has been compelled to take part in this disappearing act.  Parcel post service has unbundled to FedEx, for example.  Security for the State and Defense Departments has unbundled to private contractors like Blackwater.  Other examples probably fall under the rubric of “civilizational chaos.”  National borders keep nobody out or in.  Between legal immigrants and illegal trespassers, distinctions are problematic.  Citizenship itself, the badge of belonging, is now a commodity negotiated with a brain-dead bureaucracy rather than a semi-religious commitment.

A fourth of all Americans, and a third of the millennial generation, support the proposition that their state should secede from the US.  This speaks less to nostalgia for the Old Confederacy than to a new contempt for the failures of national government.

The Dialectics of Chaos

Marxist tradition maintained that, in the bourgeois era, “all that is solid melts into air,” including eventually the nation-state.  The phrase “withering away of the state” was Engels’s.  The concept received Lenin’s attention in The State and Revolution, but always hovered on the margins of Marxist political theory:  revolution came first, dictatorship second, while utopia was maybe for the great-grandchildren.  Still, the fact remains that somewhere within this ideological Pandora’s box – an industrial-era vision if ever there was one – we come across a prediction of the nation’s demise.

History, according to Marx, rode on the class struggle, and the great engine of the class struggle was contradiction or dialectics.  New forms of production engendered new social groups that collided with, and in time displaced, the old forms and their keepers.  The process was binary.  One class rose against another until, at the end of history, after revolution and dictatorship, only a single universal class remained.  Then utopia would arrive.  Absent the class struggle, human relations would be free of contradiction.  Without contradiction, there would be no need for the repressive machinery of the state:  like a dead tree, the nation-state would topple and disintegrate of its own accord.

As an intellectual exercise, it is possible to squeeze the political landscape of 2014 into this scheme.  Two groups, arrayed in radically different forms of organization, today collide everywhere:  the networked public and elites in top-down institutions.  They are inescapably hostile to each other.  The perturbing agent has been a change in technology and in control over information and communication, if not of the means of production.   If the triumph of new forms has been dialectically predetermined, then the public owns the future.  Hierarchical government must evolve into networked government – and networked government could unbundle itself out of existence.

Reality, however, has a way of breaking out of the creaky prison of Marxist teleology, and that is plainly the case here.  The public may be in revolt, but it isn’t a class.  It has no consciousness, no ideology, “no intention of governing” and “no capacity for exercising power.”  The public, in Walter Lippmann’s words, “is not a fixed body of individuals” but “merely those persons who are interested in an affair” to the extent of engaging in political action.  They can spring from the left or from the right.  They can be Occupiers or Tea Partiers.  Their abiding principle is a ferocious egalitarianism, and their driving impulse is a repugnance to the established order so profound that it borders on nihilism.

The elites running the old institutions are also not a class in the Marxian sense – and they aren’t much of a ruling class in any sense.  They feel legitimacy and authority bleeding out of a thousand failures, and they are rightly afraid.  Like the public, the elites lack a defining ideology or character.  They can be constitutionalists or despots, technocrats or murderous thugs.  All share a touching faith in miracles worked by top-down applications of political power, and a desperate hope that the world’s information, now spiraling out of control, will slow down enough to allow the lumbering hierarchies to catch up.

The struggle between the public and the elites isn’t binary but complex.  Hierarchy was never about the means of production:  it’s in our DNA.  Network can be fast and furious in orchestrating protests, but it can dissipate in an instant.  The public can overthrow, but never rule.  The institutions continue visibly to fail, but seem unable to change.  The structural destiny of our age is anti-authority, but that might be reversed in the next turn of the historical screw.  Nothing is certain or fated.

So the state isn’t predestined to wither away.  It may not even unbundle in the manner I have described.  If a deeper predictive principle can be extracted from the muddle of events, it is this:  so long as our moment lasts, the modern state, gluttonous Leviathan, will either disgorge ever more bits of the nation, or continue to sicken and fail.  It may be forced to do both.  Chaos could trump power:  it’s happened before.  We who inhabit quiet lands already can hear, not too far from home, the crack and rumble of that storm in which civilizations are broken.


A social media beheading


Sometime last week, James Foley, an American freelance journalist, was beheaded by a masked individual who claimed to belong to the Islamic State of Iraq and the Levant (ISIL).  This was a moral and political atrocity, requiring frank talk about the appropriate response.  But it was also an attempt at visual persuasion:  ISIL communicated the killing in a carefully produced YouTube video that condemned President Obama’s decision to bomb their advancing forces.  The group had a message.  Foley was murdered to ensure that it was heard.

I watched the video on my iPhone – which is to say, I saw little and heard nothing at all.  (I pieced together my description from a transcript as much as from what I saw.)  President Obama was shown saying that US power would be used to contain ISIL in Iraq.  Video followed of a smart bomb blowing up what was presumably an ISIL vehicle.  Foley, in a prison-orange gown, on his knees, was then forced to make an anti-US statement.  “I guess all in all I wish I wasn’t American,” he concluded, tragically, yet from what I could see without much emotion.  His killer stood behind him, dressed in black and waving a knife.  “Any attempt by you, Obama, to deny Muslims their right of living in safety under the Islamic caliphate will result in the bloodshed of your people,” he blustered in a British accent.  Then he put his knife to Foley’s throat.

Beheading videos are a grisly tradition among insurgents in that part of the world.  Al Qaeda in Iraq, a precursor organization to ISIL, delighted in such productions under Abu Musab al Zarqawi.  Because of my work, I viewed some of them at the time.  I wanted to make sense of the message:  what possible political advantage could AQI expect from such displays of savagery?  They instilled fear, certainly, and gave the group a terrifying, gangster-like reputation.  Mostly, however, the videos seemed to reflect nothing more substantial or strategic than Zarqawi’s bloodlust.  I was not alone in this assessment:  Ayman al Zawahiri, then Al Qaeda’s second in command, admonished Zarqawi in a captured letter that “scenes of slaughtering the hostages” would never be “palatable” to Muslims.

The last AQI video I could stomach showed a massacre of Nepalese workers, men totally innocent of the conflict in Iraq.  One by one, they were in a very literal sense slaughtered by Zarqawi using a small knife.  I was told by a Muslim colleague that this resembled the ritual sacrifice of animals during holy days.  Not only had the Nepalese workers been terrorized and killed, in the process they were denied their humanity.

James Foley died in the same manner.  In this case, the motive for murder was transparent.  The killers, who wished to be considered “an Islamic army, and a state” rather than a gang of violent men, sought to frighten the US public and warn off further US military attacks by snuffing out an American life.  In the dark mind of the ISIL zealot – personified by the ninja-clad, British-sounding assassin – the killing was tit for tat.

Predictably, the video of Foley’s beheading went viral on the information sphere.  Just as predictably, a strong backlash resulted against showing such a repulsive act.  By the time I got to my laptop, YouTube had taken down the video.  The company had every right to do so.  Foley’s family then requested that the public not watch, share, or use any images produced by ISIL.  That was understandable.   With regard to the beheading video, and excluding the usual mindless trolls, the public largely obliged.

But soon a digital frenzy ensued against showing any images of the video, anywhere.  A campaign by Twitter user “Hend” to “blackout” ISIL media garnered immediate support.  Feeling the pressure, Twitter agreed to cancel the accounts of users who posted images from the beheading video.  A great deal of online rage was aimed at those, like the New York Post, that displayed Foley in his last minutes of life.  With typical herd instinct, most of the news media engaged in self-censorship on the matter.

The reasons given were not devoid of merit, but sounded strangely out of tune with our informational reality.  The beheading video was said to be propaganda for a murderous group – true enough, but it was unclear how looking away in horror would defeat or out-persuade the murderers.  The images were out there, in any case.  ISIL supporters were said to revel in them.  The information sphere is irrevocable.

Respect for the feelings of family members was also offered as a reason for blocking images of Foley’s death.  Twitter generalized this into a new policy regarding the removal of images of “deceased individuals” at the request of family members.  It was a kindly meant but somewhat misguided gesture.  The pain of loss, after all, is not in images but in the flesh and the heart.  Death is the ultimate irrevocable.  Murder, I should think, trumps insensitivity.  Reality must be confronted and grappled with.  It would have been a bizarre response to 9/11, if we had decided to ban the images of that horrible day.

In fact, a strange moral and political myopia afflicted the digital chatter about the murder of James Foley.  A violent, disgusting act had been perpetrated, but most of the anger was directed at those who shared images of it.  It was as if social media users cared only about social media – or worse:  as if by blocking out pictures of the crime, we could somehow avoid dealing with a world that contained dangerous criminals.

For the record, I incline to the second explanation.  The reflex to blot out shocking information, particularly of the visual kind, has been a well-documented trait of the elites.

At the time of Egypt’s bloody street revolt, state-owned TV ran footage of happy consumers in shopping malls.  During violent police repression of anti-Erdogan protests in Turkey, CNN Turk – owned by Erdogan allies – showed a documentary about penguins.  A government report on the 2011 London riots concluded that the “single most important reason” for the disorders was “the perception, relayed by television as well as social media,” that police had lost control of the streets.  In every case, the idea was that if the mediated images of unpleasant events were blocked, the reality behind them would be nullified.  This is not magical thinking, merely a throwback to an era when hierarchical institutions controlled the flow of information.

That era is long gone.  I find myself perplexed to observe so many participants in social media – the unruly public, gifted amateurs who have broken the information monopoly of the institutions – grasping for the illusion of control.

We can’t give James Foley his life back, or wish ISIL out of existence, by closing our eyes to the moral horror of their collision.  Instead, we should ponder the relationship between our current response and the next video:  the next horror.  We should gaze into the heart of darkness of that knife-wielding killer and his kind.  They seem to think Americans can be stampeded by fear.  To the extent that we treat images of their crimes as a reality too unbearably painful for us to behold or discuss or accept – to that extent exactly, I suspect, they will feed us more.

Posted in new media, Uncategorized, visual persuasion | Leave a comment