The locus of morality

July 21, 2015


This blog is predicated on the notion that freedom requires self-discipline:  a moral framework by means of which we restrain the force of our desires.  Otherwise, political power, in the form of the police, will compel restraint.  Either we rule ourselves or government will regulate our behavior – ultimately, at the point of a gun.

All this seems trivially true to me.

Yet it rubs against the intellectual grain of the times.  Power has swallowed morality.  That is taken for granted in our everyday talk and in our great public decisions.  Virtue descends from above – it’s a question of budgets and policies, wholly divorced from behavior.  Personal responsibility has become social responsibility.  Conscience, now also socialized, is no longer focused on my private failings:  it’s all about yours.

Even justice has been depersonalized.  People speak of social justice and economic justice – foggy abstractions that can only be realized, if ever, with massive applications of power.

So we have two possible loci of morality.  One is the individual, judged on his behavior.  The other is government, judged politically.  The two are not commensurate.  If morality pertains to the individual, I can still judge the individuals in government by the morality of their actions.  Metaphorically, I can even treat the government as an individual, and pass moral judgment on its actions.

But if good and evil are government responsibilities like taxation and war-making, my personal behavior becomes morally irrelevant.  On my own, as a private person, I can play no part in moral decision-making any more than I can tax my co-worker with the BMW or make war on annoying foreigners.  All that is required of me is the right political posture – or failing that, obedience to those who wield political authority.

The transfer of morality to state power must be paid for in the coin of personal freedom and ultimately of morality itself:  so I will argue in this post.


Morality, as I use the term, consists of those shared models of behavior that make social life possible.  These models evolved in the harsh landscape of history, and are enforced far more stringently by custom and convention than by law.

I thus inherited an ideal of how to be a good husband and father.  My character is judged by the degree to which I approximate the ideal.  I judge myself by how far I can stretch the limits of my personality in the direction of the ideal.  I could, of course, invent a whole new model of fatherhood, and if I’m a moral genius my innovation might spread.  But that’s a bad bet on both counts.  Moral genius is rare, and moral innovations must compete with battle-tested adaptive behaviors.

The human condition is partial, flawed:  tragic.  I will never be a perfect husband or father.  Ideals are direction markers, not final destinations.  I can only inch toward perfection.  The same applies to you, good reader – and our shared ideals help us to harmonize our progress.  But we do have choices:  without them, there can be no morality.  We can inch forward or slip back.  We can be strong or weak.  We can do good or evil.  The drama of life may be a struggle against imperfection, but we have a say in the plot.

Morality is most persuasive where the power of convention is greatest:  in the “small world” of family, friends, neighborhood, and church.  Ideals at this level are often shared to a minute level of detail – food and clothes can be moralized, for example.  Personal histories are known in some depth, and moral judgments have immediate, and personal, consequences.  I can forbid my kids from playing in my weird neighbor’s house, but I still have to stare at him across the driveway.

The small world is regulated almost entirely by moral conduct.  We feel it a violation of the right order of things when siblings or neighbors take to the courts to settle disputes.


Moral communities are constituted by adherence to some system of shared ideals.  Such communities usually transcend the small world but are never equivalent to society.  A modern nation, therefore, is a patchwork of competing moral communities, with conflict over fundamental principles baked into the arrangement.  In the past, despotic rulers used brute force to pick winners.  Louis XIV, a Catholic, abolished the rights of his Protestant subjects and sent the dragoons after those who objected.

Liberal democracy must take a less direct approach.

The political system that became liberal democracy emerged from Europe’s wars of religion.  Its founding document was John Locke’s Letter Concerning Toleration.  Then as now, the dilemma was how to manage “the divisions that are amongst sects.”  The liberal trick was to detach personal morality from legal and political compulsion to the greatest extent possible.

Government, Locke argued, existed to protect persons and property.  Morality – the “care of each man’s soul” – was “left entirely to every man’s self.”  The path to heaven could be determined only by the private person.  Government officials, who dealt in worldly affairs, lacked any special knowledge on the subject.

But there were limits to toleration.  Liberal democracy has never endorsed the principle that anything goes.  Eventually you came to a boundary.  For Locke, a Protestant, Catholics and atheists were beyond the pale.  So were communities that promoted public disturbance.  Today, of course, we are puzzled by Locke’s choices.  Catholics and atheists, we know, belong to perfectly legitimate moral communities.  Public disturbance, seventeenth-century style, we often identify with free expression.

So why did Locke draw the boundaries where he did?  The answer is that they were self-evident, if you grant his purposes.  Locke was pleading for an unprecedented expansion of tolerance and moral diversity.  He proscribed the absolute minimum acceptable to an educated English Protestant of his time and place.

Convention – the gravitational pull of opinion across history – determined the limits of behavior for him as it does for us.

We may consider Locke’s judgments exclusionary, but we still follow his method.  As citizens of a liberal democracy that is forced to adjudicate between competing moral communities, we tolerate most practices and prohibit an absolute minimum.  The arbiter is again convention.  To a future historian, our boundaries will seem no less arbitrary than Locke’s.

We now embrace gay marriage but prohibit polygamy.  What principles support either judgment?  We tolerate much public nudity but have expanded the definition of rape.  What’s the underlying logic?

At the national level, under liberal democracy, we appeal to grand principles but the principles contradict each other.  If we compel Catholic hospitals to perform abortions, we are trampling on freedom of religion on behalf of the right to privacy.  If we accept that white supremacists must enjoy freedom of expression, we threaten racial equality.

Present-day moral communities can turn to God or some universal principle to justify banning unwanted behaviors.  The same was true of traditional monarchy and twentieth-century totalitarian dictatorship:  both systems were, in essence, the moral tyranny of one sect over all others.

Liberal democracy has chosen to tread on thinner ice:  being secular and tolerant, it can only appeal to public opinion.  The principles behind prohibition, however universal in scope, can be enforced only on conventional grounds.  The public must support them over rival principles – and, as with race, abortion, or gay marriage, it retains the right to change its mind.

“There is something fundamentally indeterminate about democracy,” Pierre Rosanvallon observed, quite correctly, in Counter-Democracy.


Governments of every stripe rest uneasily upon this tangle of fault lines.  Their job is to preserve a peace constantly threatened by competing communities and principles – Locke’s “divisions that are amongst sects.”

The church of social justice, currently dominant over our institutions, would invert the terms of Locke’s equation, have government assume moral leadership, and apply political power to crush evil.  Sexuality, for example, would be subject to minute regulation much like, say, the pharmaceutical industry.

The unanswered question is how, precisely, this will be achieved.

One way to locate morality in government is the Louis XIV way:  winner take all.  But cuius regio, eius religio is hardly a principle of social justice.

A second way is Platonic:  the rule of experts who manipulate vast social forces for the benefit of the majority or, alternatively, of those who are now marginalized.  But this assumes that such a class exists.  Consider 2008.  Consider Iraq and weapons of mass destruction.  Those still in doubt should read Philip Tetlock.  Faith in experts is much easier to disprove empirically than faith in a personal God.

A third way is Rosanvallon’s “counter-democracy”:  enlightened groups browbeat the government into right action, usually on an issue-by-issue basis.  Here public opinion turns against itself, and the indeterminacy of liberal democratic government becomes its only virtue.

This approach manages to be at once sectarian and universal, controlling and anti-establishment, unbendingly orthodox and – Rosanvallon’s term – radical:

Radicalism no longer looks forward to un grand soir, a “great night” of revolutionary upheaval; to be radical is to persist in criticizing the powerful of the world in moral terms and to seek to awaken passive citizens from their slumbers.  To be radical is to point the finger of blame every day; it is to twist a knife in each of society’s wounds.

But the rule of radical opinion is just a form of moral tyranny by one sect over all others, no different, in this respect, from the purges and final solutions perpetrated by the “vanguards” of the last century.  To implement the model systematically would mean the end of pluralism, of liberalism, in our politics.

Advocates might assert that a higher morality based on social justice trumps, and should trump, a false consciousness of individual freedom.  This assumes that governments could bring about social justice, if only they tried.  The longish track record of modern government refutes this claim.  It doesn’t have a clue about, for example, how to bring about income equality or racial integration in housing.  Neither do scientists, economists, entrepreneurs, the markets, the socially conscious, or the socially unconscious.

In the complexities of the “big world” beyond our immediate circle, no person or class or institution has a clue about consequences.  (For the empirical evidence, look up Paul Ormerod and my own book on the matter.)  So the expectation of utopia by government action forever fuels the rage of the righteous over government failure.


Personal morality depends on local knowledge applied to the small world where personal life plays out.  Consequences are felt directly, yet matter less than character.  If I try to save a drowning man, I risk drowning myself – but it’s still the right thing to do.

Government can only work statistically, and bet on the law of large numbers.  Given what we know about complex systems, that is rarely a winning game.  Consequences are experienced by the public, never by officials or lawmakers.  If this is morality, it’s morality for the masses, shot full of waivers and exemptions.

To the extent that power controls personal choices and we acquiesce, we are infantilized:  that is, we become like children in the care of parents, and cease to be moral or political agents.  That is a servile condition.  Masters have always stereotyped slaves as carefree but irresponsible children.

To the extent that morality becomes a political prize, the consent necessary for legitimacy will fracture along the lines of the moral communities.  Catholics will demand the re-imposition of the ban on abortion.  Progressives will demand the silencing of hate speech.  Feminists will transform the sex act into a series of elaborate legal agreements, with appropriate punishment for violators.

With every step, politics will become less political, morality less moral, and our poor, battered democracy less liberal.


What is the way out?  I don’t pretend to know with certainty, but I suspect that it will be found in the relationship between morality, reality, and fear.

Morality is a striving against reality.  We direct our behavior toward some ideal that, given the ways of the world, can never be perfected.  Morality can’t be a denial of reality.  It can’t set the standard at utopia, complete happiness, or brilliant self-actualization.  To insist that the human condition be other than what nature and history have made it is a sign of immaturity:  a temper tantrum against the universe.

Every 12-year-old wants to be a superhero.  They all grow up to be accountants and college administrators.

Part of the way out, then, is acceptance of the tragic dimension in striving against reality.

Morality is the enemy of fear.  The individual’s sense of right has always been a bulwark against the predations of naked power.  Conversely, state terror is the enemy of morality.  The same is true of the lynch mob.  Both use fear of force to end the argument in their favor.

We live in a moment dominated by the internet mob.  Grown-up children, outraged by reality, make up this mob.  They are self-anointed enforcers of official morality, as they imagine it ought to be.  They patrol social media for offending statements – and, when these are found, they indulge in orgies of digital hatred.  Threats of death, violence, rape, loss of employment, violation of privacy, all are flung in defense of grand humanitarian principles.  It’s a short walk from utopia to the jungle.

People in high places and low have become afraid to contradict the mob.  I encounter this more and more.  People fear for their jobs and reputations:  they don’t want controversy, they don’t want to be at the center of a public shitstorm.  So they measure their words.  They tailor their opinions.  They keep quiet when they disagree with official doctrine.

Part of the way out, I submit, is for all of us to show the courage to say and act as we think is right, and let the mob be damned.

History = Entropy (2)

June 16, 2015

In my last post I argued that history is entropy:  the inexorable but random disintegration of human systems under the pressure of time and change.  Odd bits endure or are remembered.  Millions of lives are consigned to an everlasting silence.

Entropy is the irreversible loss of available energy:  it defines the arrow of time as moving from difference to sameness, from action to stillness, from imbalance to equilibrium.  According to the second law of thermodynamics, the entropy of the universe is increasing and must continue to increase.  It all ends in utter dissipation:  “heat death.”

This vision of the universe as a process of perpetual decline would have been congenial to the ancient Greeks, who believed, with Hesiod, that we have fallen from a happy golden age to an age of iron men and iron hearts.  And, to be sure, to the extent that human beings are chunks of energetic molecules they must abide by the second law.  Every person, in the end, dissipates into stillness and atomic dust.

But there are counterexamples to consider.  The law of entropy posits that change must always be from complexity to simplicity.  As clever observers like Alicia Juarrero have pointed out, however, this doesn’t seem to cover the evolution of organic or cultural life.

I want to focus here on the weird trajectory of human history.

To the extent that we are clusters of symbols, we seem to be able to hold entropy at bay, and reverse the arrow of time in the direction of complexity.  It isn’t just that we began with fire and flint and are now worried about the laws of thermodynamics.  Even members of hunter-gatherer communities, living in relatively simple social structures, perceive the world in extraordinarily complex ways.

History, then, can be understood as progress.  The human race has multiplied and grown greatly in numbers.  Human knowledge has built upon itself and expanded into ever more specialized domains.  Health, wealth, literacy and education, life expectancy, access to music, literature, art:  all are measurably superior today to the time of our Upper Paleolithic great-grandfathers.

But history is also tragedy.  Entropy can be kept at bay, but only for so long.  Every human group has had an allotted time-span.  Every symbol will grow hollow and die.  I can wave my hand rhetorically at Ozymandias and the looted tombs of the pharaohs, but the truth is that most societies that ever existed, with all their toil and yearning for an ideal life, have been swallowed into that everlasting silence.

There is no progress in feeling or in wisdom.  A single play by Aeschylus or Shakespeare is far superior to all the bloviations produced today.

Progress and tragedy are positive and negative poles:  between them flows the alternating current of human history.


The capacity of symbolic systems to push back entropy, even for a moment, needs to be explained.  This doesn’t happen to the law of gravity.  No social arrangement or science or art form will allow us to levitate.

My candidate for an explanation is straightforward and requires no mathematical formulas.  The great symbolic systems developed in a harsh, savagely competitive environment.  Each represented the struggle of a community, a way of life, not only against the entropy of time but also against other communities, other ways of life.  Even when people were thin on the ground, the world was always crowded with symbol.

In such an environment, the system of all human systems would be what N. N. Taleb calls antifragile.  It turned disaster to advantage.  It used entropy to clear a space for greater complexity:  that is, for development at a higher level.  When a community perished, and its symbols lapsed into silence, that wasn’t the end of the story.  A frontier was now open to another community, another set of symbols, another model of how to be human.

The result wasn’t always progress:  but often enough, it was.

The reason for this is also straightforward.  Unlike, say, a salt molecule, human beings possess intentionality.  They act to a purpose.  The sum of individual purposes is demonstrably reproduction, expansion, social complexity.  At the level of the whole community, the grand ideals, those models of existence, organized purpose beyond the individual and across centuries.

Ideals of courage, sexual behavior, love of country, etc., provide the motivational fuel that drives symbolic systems forward.  For a time – not forever – they defy and defeat entropy, a law binding on the lifeless and purposeless only.


It used to be said that history was written by the winners.  We now call the manipulative authors “oppressors.”  This implies that there are things that we should remember – the role of women in society, for example – and other things that we have leave to forget.

But if history is entropy, the question of what should be remembered has no meaning.  We might as well ask what the most effective style of levitation is.  In the short term, no doubt, our squabbles will stagger and stumble around favored bits of memory.  In the end, however, those bits and much more will be devoured by time and change.  Entropy is an equal opportunity destroyer of identity, an eraser of groups.

So the alternative to history isn’t a more inclusive and just history.  It’s silence.  The real question, let me suggest, is what to do with the history we’ve got.

History, like memory, has many legitimate uses:  I will touch on two that subsume many of the others.

Let’s accept that history is not a record of the past, faithful or otherwise.  Rather, it’s a glimpse into the remarkable struggle that defines the human race:  that between the symbolic systems that identify communities and the inexorable advance of entropy.  Greek city-states and Chinese imperial courts didn’t deal in the issues that obsess 21st-century social thinkers.  They faced survival or dissolution as ways of life in their own environments.  Both endured for a time.  The random pieces they have left behind suggest their endurance rested on radically different models of humanity.

From this perspective, history is a fragmentary vision of the attempts of some communities to cheat the silence.  It’s a patchwork of tricks and ruses, compounded of will and imagination, and amounting, for the living, to a sort of progress.

Since we, too, are dangling over the chasm, and must ultimately be swallowed by it, these flashes of tenacity should be of abiding interest to us.

The second perspective concerns the cost of the struggle.  I shouldn’t have to say it, but I will:  there is always a cost.  That was true in the days of cave paintings and it is true in the age of the iPhone.  There is always a cost – sometimes, a terrible one.  Individuals and communities have compromised their purposes, and absorbed much suffering and heartache, to keep the silence at bay.

It’s perfectly licit to note that the ancient Greeks exploited their slaves and shut away their women.  It’s nonsensical to chide them as if they were elected officials of a liberal democracy in the year 2015.  Our complaints say more about us than them:  we have lost sight of entropy, to the extent that we feel free to trample on the few remaining shards of human memory.

We live inside a system of symbols that instructs us how to behave:  a moral structure.  By our lights, the Greeks did wrong.  But the Greeks never knew our moral ideals.  Like all communities at all times, they followed their own.  They paid the price in their own coin, not in dollars or euros.  If we find their way of life reprehensible for our own good reasons, it’s not difficult to imagine that they would have found our way equally repellent, for reasons that were entirely theirs.

At a basic level, the study of history is that of the moral and material cost of persisting in an entropic universe.  It can be likened to an inquest into the bits and pieces that remain from a tragic accident.

Dead societies are easy enough to ignore or condemn, but very hard, maybe impossible, to understand.  The effort is made worthwhile by the need to see the moment of action, our decisive now, in depth rather than in a single dimension.  Without history, we have nothing with which to compare ourselves, no way to measure or judge the cost we bear, no idea whether we live in the best or worst of possible worlds.

Human systems differ in their ideals and strategies, but the struggle, pitting purpose against silence, stays the same.  We children of the digital age haven’t been granted a cosmic pass.  We can’t pretend to be conscientious objectors.

The tragic accident – that’s us, and we forget that to our peril.

History = Entropy

June 9, 2015


The question of what becomes history – what gets remembered, and why – has been troubling me lately.

I recently visited Sicily and rooted around the ruins of classical Greek civilization there.  In Agrigento, a series of temples built on a ridge above the sea remain in an astonishing state of preservation.  In Taormina, an ancient theater looks down from the vertiginous heights on a receding shoreline and the sparkling Mediterranean.  In Syracuse, the ribs of the old temple of Athena stick out of the Baroque cathedral of Santa Lucia.

Of all that aspiration and glory, almost nothing is remembered.

From the heights of Taormina I could gaze on the site of Naxos, located on the next headland, just across the bay.  Naxos was the original Greek toehold in Sicily, founded in the mid-700s BC.  It had some sort of an existence for 250 years or so, then was destroyed by the tyrant Dionysius of Syracuse, a man who knew Plato personally.  The city walls were razed and the inhabitants sold into slavery.

For functional purposes, much of ancient Sicily has no history.  If we wish to understand how people lived, or why they died, we have little to go on.  About Naxos we know next to nothing.  Agrigento has the temples and tales of a couple of tyrants.  Even Syracuse, the chief city, has mere patches of historical narrative, a few names and events, stretched over a vast chasm of silence.

Taormina, founded by people from Naxos, has no history at all:  only a magnificently sited ancient theater to lure the 21st-century tourist.

It’s a truism that history is told by the winners.  But that’s only for the short term, and in a limited sense.  The Israelites were among history’s great losers, and their story is still being told.  The Athenians lost their big war to the Spartans, but who cares about Sparta today?  (Even 300 was based on Herodotus, a product, if not a native, of Athens.  We know the Spartans through Athenian eyes.)

The winners tell their story:  what happens then?  Time happens.  Change happens.  Wars, migrations, fire, plagues, earthquakes, all these happen, and they happen inexorably but at random.  There’s no predicting disaster, no predicting the next brilliant cultural wave that will sweep the past out of fashion and into obscurity.  The winners tell their story but that story, too, is forgotten more often than it is recalled.

Naxos was a city for 250 years, practically the lifespan of the United States of America.  Then it was lost to silence.  Taormina had no history, but out of that nothingness a beautiful theater endures.

History is entropy.  It is in the human spirit to engender systems and to tell stories about those systems, and because these are important to their makers, and sometimes to whole populations, they wish to believe that they will last forever.  We build:  we seek order, harmony, balance, progress.  Time and nature break things down:  cities into rubble, memories into silence, life into death.

Ancient peoples were deeply aware of this struggle, since disintegration was ever close to the surface of things.  That was one of the great themes in Aeschylus:  the human will to impose order colliding disastrously with the chaotic forces that ruled the universe.  Cities like Naxos were sacked and destroyed.  The people and their memories were exterminated.

In The Seven Against Thebes, the women wail about what will happen if the city falls to the enemy.  They weep for young, unmarried girls who will be dragged off “like horses by the hair” to become playthings of foreign thugs.

I suspect that we, the beneficiaries of the Industrial Revolution and the great enrichment that followed, take for granted that our systems are indestructible.  Modern life works off that premise.  When the system works as it should, we are merely grumpy.  When it fails in any respect, we become enraged.

Everyone feels that, in the nature of things, something called “a job,” or even better, “a career,” should be offered to them.  Everyone, regardless of condition or talent, demands the right to satisfaction and self-expression.  Everyone must enjoy orgiastic sex and a sublimely happy family life – with a spouse or a partner of this sex or that – with just the right number of children, or none.

Our systems are eternal and our claims on life are many and aggressive.  The implicit idea is that entropy has folded its bloody beak in its great black wings and fallen asleep.

That is the sort of prideful error Aeschylus would make the starting-point of tragedy.

Greek flesh, modern dreams

April 14, 2015

We of the 21st century are a people of the dream, engaged in constant self-transformation.  The dreams we pursue are usually of better, sometimes of different.  If I am poor, I want to become rich.  That’s the American dream.  If I am an outcast, I demand equality.  That was the dream of Martin Luther King.

If I don’t like my body or my face, I’ll assert my right to change them.  If I don’t like my sex, I can change that, too – and expect all around me to adapt to my new identity.

Limits are never acknowledged.  Costs are equated with injustice.  We believe reality to be soft and pliable, a construct that can – and should – be easily deconstructed to harmonize with the desires of the will.

Yet there’s a penalty to be paid for embracing the proposition that reality must conform to private dreams.  We often feel uncertain of who we are or where we stand.  We are distrustful of spin from politicians and the media, afraid of “identity theft” and false-identity web predators, desperately hungry for authenticity, fixity, clarity – elements of that reality we, with our modern dreams, have claimed to transcend.

Worse:  because billions are dreaming the same dreams, they collide with one another, block one another, and force the dreamer to awaken to the fact that he is a limited, bounded creature most of whose hopes will never come true.  In past times, that was a truism that suffused human existence with sadness.  In ours, it is cause for much frustration and shouting.

A people of the dream will be condemned, always and necessarily, to the Age of the Rant.


I’m presently researching the works of the ancient Greeks, and I have been struck by how fundamentally unlike us they were on this question.

The Greeks felt reality as a crushing weight on their shoulders.  Transformations required divine interference, and were seldom happy occasions.  King Midas and his donkey’s ears, inflicted by the god Apollo, can stand as a fairly restrained example.

The world was ruled by “fortune,” “the gods,” or “necessity,” so that even the most cruel and unjust events formed part of a mysterious moral balance.  Truth was thus to be accepted on its own harsh terms, not ours.  Spin and propaganda were considered insults against the sacred order of things:  hubris.  They would bring down retribution.

Oedipus was innocent in his motives.  He belonged to the most applauded modern category:  victimhood.  Yet Oedipus never spins his story, never plays the victim, but proclaims with a certain pride, “Of all men, I alone can bear this guilt.”

The Greeks, arguably the most brilliant people in history, were unwilling to imagine themselves other than they were.  They were a people of the flesh:  and it may be useful to explore the conditions that shaped their identity.


To begin with, Greek city-states were minuscule.  You could get a quorum of the Athenian assembly with only 6,000 citizens.  Most cities were much smaller than Athens.  Your nation was like your neighborhood:  it was impossible to get lost in the crowd.

Greek life was lived outdoors and in public.  Citizens exercised naked and marched next to one another in the battle line.  You knew where you stood with everyone else.  Everyone else knew where you stood, too.  Fakery was futile.

The classical Greeks are often portrayed as idealists rather than realists, but what we call idealism meant, to the Greeks, mastery at an extraordinarily high level of excellence.  This ideal rested on reality:  truth was beauty, and truth was given, not dreamed.  For the individual, this must have meant a tragic foreclosing of possibilities.

The Athenians of the time of Pericles abounded with genius, but lacked silence and solitude.  Public life swallowed the private person to an extent that makes Jeff Jarvis look like a desert hermit by comparison.  Purely personal expression, beyond very tight boundaries, was frowned upon as subversive and quickly punished.  Socrates suffered this fate – so did his beloved Alcibiades, a very different personality.  The oracle’s admonition, “Know yourself,” translated into something like “Understand your limits and live within them.”

For that there was a good reason.  Existence was precarious.  City-states were poor and weak, war was a constant, and defeat often meant that your city got wiped off the map – the men massacred, the women sold into slavery.  Thus the state’s survival had to be ensured before personal business could be attended to.  Every male citizen was a soldier, no matter how refined his intellect.  The playwright Aeschylus was a hero of Marathon.  Socrates was admired for his endurance while on campaign.

Consider the contrast with modern life.  Our actions are cushioned by political and economic developments that would have astounded the Greeks.  We hand off to others the power to run our government, the duty to fight our wars, then we expect, as a birthright, to enjoy personal security and affluence.  When we fail, we suffer a spell of individual unhappiness:  nobody dies.  So it doesn’t really matter whether we are right or wrong about the nature of reality.  We can afford to dream.

In the unforgiving environment that shaped the Greeks, getting reality wrong was fatal.  The stories they told harped constantly on that theme.  Whoever dreamed of happiness would be rudely awakened.  Whoever rose too high would be brought low.  Dream bowed to fate, and fate, they knew, was perverse.

A profound sense of tragedy, impossible for the modern mind to comprehend, darkened every aspect of Greek life.

Because we float as transient tourists over a playground world, we come to crave authenticity.  The citizen of Athens or Sparta enjoyed a totally authentic existence, but felt pinned to the earth by his own flesh.  He had conquered political freedom, first of our species to do so – but individually he was a prisoner shackled to that cruel jailer, fate.

From Plato to the Stoics, Greek moral philosophy can be understood as a series of desperate attempts to escape into metaphysics from the narrow prison of reality.



Study with an innocent eye the youthful faces that stare at us out of the Elgin Marbles.  They are exquisitely beautiful, but stamped with the sneer of cold command.  They lack something – some quality we latter-day people of the dream find essential to social life.  They seem devoid of sympathy.

The agony of truth and the aspiration to mastery were expressed by the Greeks in an unparalleled burst of genius, but also, and regularly, in sickening displays of brutality.  Our Athenian, for all his brilliance, celebrated his victories with a massacre of enemies.  He abandoned his imperfect offspring to die – a state policy of Sparta’s, heartily endorsed by Plato.  It was Plato, Athenian aristocrat, archetype of the philosopher (“lover of truth”), who recommended cross-breeding citizens “like watchdogs” for desirable political characteristics.

I think the outcome of this biological program would have looked much like the young athletes of the Elgin Marbles.

In the end, I believe, the Greeks became what they most condemned, and failed by their own measure.  A pitiless realism was an unrealistic arrangement for community life.  The triumph of the flesh meant war to the death against all rivals, internal and external.  In their moment of greatness, as individuals and as a people, the Greeks were stuck with what they were.  It was impossible to advance, to move beyond.

No nation equaled them in talent, or could defeat them in battle.  That didn’t matter.  Like all doomed tragic characters, Greek civilization destroyed itself.


For us, danger threatens from the opposite direction.  Our lives are personal dramas.  We are endlessly fascinated with our internal states.  Reality, as I have noted, is expected to yield without a struggle to our emotional requirements:  that’s the millennial dream of transformation.  When the world fails us, we are outraged, and in our rage we smash and batter at whatever happens to be in our way, without a thought for consequences.

Already truth is becoming alienated from facts.  We want reality to be as we command it to be.  That is another way of saying that we, too, are escaping into metaphysics, though in a far less rigorous and innovative way than the Greeks.  They were system-makers, while we just play at make-believe.

The modern dream was once – maybe still remains – a mighty engine of progress.  Unlike the Greeks, we could move beyond ourselves.  We could rise higher and reach for private happiness without incurring hubris or deserving punishment by god or man.

But reality is our common ground.  The flight into fantasy ruptures every social bond and leaves us vulnerable to those who, in the Greek manner, feel earth-bound and flesh-bound, and play the game of life authentically, for keeps.  Our viral outrage, torn loose from reality or any tolerance of the human condition, must in time transform the dreamer into his own assassin – the nihilist – and the dream of self-betterment into a nightmare of barbarism and bloodshed.

Andrey Miroshnichenko’s review of The Revolt of the Public

March 18, 2015

Andrey Miroshnichenko is the author of Human as Media, a book I have cited often in this blog.  He and I became connected as fellow scholars who turn out to have an astoundingly similar perspective on the impact of media on politics.  I say astounding because we wrote our books in complete isolation from each other yet arrived at almost exactly the same place, used many of the same words, and even reached for the same obscure citations (Ortega y Gasset being a favorite).

Andrey and I take the similarities to be meaningful.  We think we are on to something.  We may even be advancing on that phantom, truth.

Andrey has just produced a brilliant and thoughtful review of The Revolt of the Public.  As old-fashioned bloggers used to say:  read the whole thing.  It’s worth it.

For those who want a foretaste, here is Andrey’s version of how he and I came to recognize each other as kindred spirits:

Reading Gurri’s book was for me a particularly fascinating experience, because of the many overlaps between his ideas and those presented in my book “Human as media. The emancipation of authorship.”(Miroshnichenko, 2013). Gurri and I were not familiar with each other’s work until I came across his book and wrote to him. Our understanding of the present moment is so strikingly similar that we both turned to the same, regretfully obscure, “mass man as a spoiled child” quotation taken from Ortega y Gasset (Ortega y Gasset, 1930).

We have since discussed this similarity in our analysis, and we agree that if two independent researchers can see and describe their subject so similarly, that subject in all likelihood has been correctly portrayed. I find this to be an exciting development, particularly since neither physics nor math are involved; there are no “objective” laws of nature applied… Or are there? In the age of accelerated information, the social-informational sphere is so mediated by technologies and so alienated from an observer that it can probably be caught by an inquisitive mind like something “physically given”.

As might be expected, he and I don’t agree on every point regarding the great media-driven transformation of social and political life – what I have called the crisis of authority.  But I find his disagreements to be the most interesting aspect of the review:  they are fascinating, instructive – and probably correct.

Gurri has researched the manner in which the Fifth Wave influences politics. But at some point, with growing Internet penetration, media ceases to be just a factor of the political process; quite the reverse, political processes become internal parts of the media environment.

The review concludes on a thought-provoking note:

Societies accustomed to the conditions supported by broadcast-style, top-down media – i.e., the Fourth Wave, according to Gurri – including those societies that just recently began to experience those conditions, have suddenly found themselves sinking in the Fifth Wave, which is the environment of engaged media, where everyone has the technical capability to express publicly their personal reactions.

This new environment has thus far emancipated technical authorship for about 2.5 billion persons. Considering the spread of information technologies, the “normal” rate of Internet penetration, and population growth, we can predict that number of emancipated authors who can communicate reactions beyond their physical circle will reach 8-10 billion within the next 30 years.

We are at the moment in the middle of the explosion of mass authorship. Books like Gurri’s Revolt of the Public are extraordinarily helpful and necessary if we are to understand the present and prepare for what is to come. The next wave of turbulence caused by emancipated authorship is coming, and it will not be Tahrir-like. The likelihood is that it will develop the characteristics of the recent conflicts in Ukraine and Ferguson, Missouri, the first hints of which could be observed in the London riots of 2011. These future collisions should be analyzed in the context of media ecology as well as of political science.

Go read.  Now.

Andrey Miroshnichenko

Morality for a conventional animal

February 18, 2015



How We Got Stuck in the Funhouse

What is is less important to us than what ought to be.  The reason is simple.  We judge the one by the other.

At the elemental moment of experience, we impose a moral scale on reality.  This is unavoidable.  We can’t possibly stand outside ourselves.  Even those who seek God in revelation must do so through the dark mirror of their humanity.  Even for them, man is the measure of all things.

If morality is the judge of reality, the question would seem to be what system of morality brings the greatest justice or completeness to the trial.  This has been a reflexive tic of educated persons in my lifetime:  the judgment of morality, ending always in a guilty verdict and the death sentence.  In the rush, a step has been missed.  The old morality lies dead by our own hand, yet the superior alternative is nowhere to be found.

Those marching toward that land of milk and honey were lured into a conceptual funhouse from which they never emerged.

We can’t transcend the world to judge the world.  We need a moral standard to discover a superior morality:  but in that case, the decision is already made.  Once I need a standard to find the standard I get lost in an infinite regress.

The fine spirits of my generation took the verdict against morality for granted, but this reflected a sense of the historical moment, the zeitgeist, and beyond that the oppositional herd instinct of the intellectuals.  A positive alternative was lacking.  A platform from which to condemn the old morality was lacking, so the accusers made strange gestures toward universality and science, and ended exactly where they began – appealing to the old morality they had discarded as if it were a universal or scientific truth.

German and French philosophers paved the road to the funhouse.  They dismissed reality as mere convention, and repudiated convention as a mask for exploitation and abuse.  But this indictment could only be derived from the most conventional of moral precepts, directly descended from Christianity.  The philosophers were looking for an Artist-Tyrant beyond all morality but instead found the middle-class parson who had troubled their childhoods.  They spoke of authenticity and existential freedom but they were stuck in the funhouse, going round and round.

The rest of us followed without a moment’s hesitation.  That’s the seductive power of style and fashion.


How Convention Trumped Perfection, and Always Will

We have been driven to the edge of moral dementia by the uniquely modern quarrel with convention.  In the past, convention was just the way things were done.  To us, it’s the curse of history – a masked lie – a magician’s cloak dropped over reality to conceal the immorality of power.  We think ourselves worthy of lives that are perfect and true, and we look on our existing social arrangements with a ferocious loathing.

Those who dispensed the lethal injection to the old morality intended to clear the ground for an authentic moral order.  But nothing is happening.  The world-historical clock remains stuck at a minute before midnight, while the new dispensation refuses to be born.  Such an absurdity can be explained only in the context of history, which is to say of tradition and convention.

In principle, morality and convention differ sharply.  The one aims for high ideals, the other for compromises so we can get along.  Morality is absolute and universal, convention relative and local.  Morality commands “You shall not kill,” but convention finds many reasons to do so.

The sophisticated and super-educated of our times, in their simplicity, believed they could rip away the veil of convention, and dwell in a world ruled by lofty ideals.  Behind the veil, however, they found – nothing.  That world does not exist.  It has never existed.

Functionally, the gap between morality and convention narrows to the vanishing point.  We are born with certain behavioral predilections.  This has been called the moral sense but is really an instinct for rightness in social encounters.  It resembles the language instinct in being a generic endowment “tuned” to the specific idiom of the community.

The mechanism for moral tuning is imperfectly understood, but seems to involve observation, imitation, and powerful emotional tags that convey the feeling of right and wrong in many settings and circumstances.   In this manner, ought becomes enthroned at the gateway of experience.

Abstract principles play no part in the tuning process.  They are supplied after the fact:  that “all men are created equal” was to Jefferson a self-evident description of reality, rather than a newly discovered axiom from which all justice must follow.  Once articulated, such principles get absorbed into the moral language – but the fit with the community’s ideals of behavior is always imperfect, always riddled with exceptions.

Any attempt to formulate morality, like geometry, from axiomatic principles will necessarily fail.  The cause of failure isn’t our fallen state or conspiracies by billionaires.  Geometrical morality must fail because ours is a deeply conventional species.  If we rip away convention we slip insensibly back into convention – never pure morality.  If we deny the legitimacy of history, we set in motion a tragicomedy of unintended consequences rising out of the depths of historical causation.

That describes our present condition.  We have rebelled against history and failed, and now we are sick with vertigo in that funhouse of unintended effects.


Conventional Morality and the Morality of Convention

The strictures of convention must be treated with great care, not because they are right in every instance but because they tap directly into potent emotions.  When displaced, these emotions can explode into nihilistic violence.  The twentieth century, a long experiment in the trampling of convention, invented the killing fields and the extermination camp.  The new millennium, at war with history, seems to be staggering in the same direction:  we are the first to crash commercial aircraft into skyscrapers.

Syria 2015

But this is a call to caution, not to fatalism or moral inertia.  Conventions change:  they must, if they are to adapt to a highly unstable human and technological environment.  The break with the past is always difficult, often traumatic.  Many of the conventions that have evolved since 9/11 and the emergence of social media would have been considered dueling offenses by our great-grandfathers.

The only meaningful question before us isn’t whether the old conventions were superior to the new, but how on earth we can arrive at such a judgment.  The way out of the funhouse, for those who wish to leave, consists of discovering the authority or criterion that entitles us to call some aspect of conventional morality right or wrong, better or worse.

The pleasure principle and its obverse, the precautionary or victim principle, are applied by default in the public arena.  Both point the way to the funhouse.  We have been pleasured and victimized by special pleaders into a state of utter bewilderment.

Morality consists of behaviors found to be successful and right in the past.  “Successful” means that the behavior binds the individual to the community.  “Right” means that it moves in the direction the community has set for itself.  All this is given.  We confront morality from childhood as a series of tragic choices, by which we amputate our most urgent desires so we can learn to live with others.

But there are choices.  Without repudiating morality or veering from the historic direction of the community, we assert ourselves.  Moral boundaries fit no person perfectly:  in a gray uncertain hour, each of us must cross the frontier.  Many do so out of selfishness.  That is almost always a false step.  But morality can judge morality.  Convention can oppose convention.  Human nature, our native sense of rightness, will set boundaries beyond which no social arrangements can function.

Motherhood is an ancient moral ideal for women.  Success in the pursuit of an interesting career is a more recent ideal.  Choices must be made.  Something will be lost.  So long as the pleasure principle has been discarded, so long as the community’s perspective has been considered, an individual is free to wrestle with her private circumstances.  Her path through the wilderness will be her own.

Independence is a necessary condition in a democracy.  Sustenance is necessary for life.  Dependence on government handouts will threaten personal freedom, but may ensure survival.  Choices must be made.  To the extent that the individual has risen above a childish self-indulgence, he is free to make a call.

The same is true of private choices like abortion and of more public decisions like a stand on gay marriage.  History and the community must have their say.  Selfishness can play no part.  Beyond that the individual must search his private store of wisdom, and judge.


The Amorality of Government, and Who Is Responsible

Conventional morality is nothing more than the sum of such judgments.  It isn’t tidy, but it allows free range to trial and error, success and failure, and so opens the way for our moral thinking to evolve gracefully with our changing circumstances.  It is also morality as actually practiced in an open society, as opposed to our fond geometric illusions or the circular condemnations of the philosophers.

The top-down approach that unleashed a torrent of rights and prohibitions for political power to impose has conspicuously failed.  Absolutist zeal is no match for the hardness of reality – or the digestive capacity of history.  The lost highway of radical transformation dead ends in disorientation and nausea.

Morality works on a more human scale.  Modern democratic government, for example, connects to office-holders, factions, and voters, never directly to morality.  A president or an attorney general can act immorally.  The same can be said of a bureaucrat or a voter.  Government, considered objectively, is just machinery.

While it’s true that convention pertains to community, and reality, and history, responsibility ultimately must always take the form of a personal decision.  The only legitimate player in the moral drama is the individual.

In praise of (my) ignorance

January 21, 2015


The truth – the whole truth – lies beyond the reach of the human race.  God may perceive all of reality at a glance, but the rest of us are doomed to a point of view.

Science, the modern form of revelation, deals with a limited set of facts and relationships.  Its truth is esoteric, partial, and passive.  Research can deliver a life-saving vaccine:  but caring about human life is a moral proposition, unprovable by scientific methods.  The brilliant scientists of Nazi Germany perfected the jet engine, rocket propulsion, and the mass production of murder at Auschwitz.

Truth isn’t up for grabs, of course.  It isn’t socially constructed or invented by evil white men.  If you believe that, stand for just a fraction of a second in front of an onrushing truck.  The world can kill you.  We learn that lesson early, or we are crushed by it.

Truth is perspectival.  I can see you only from where I stand – never as you see yourself, never as others see you, never totally, never absolutely.

Being human, I perceive truth under a specific aspect:  that of my perspective.  So I know a bit, but I miss a lot.  Manhattan from the top of the Empire State Building is a radically different experience from street level.  In each case, we think we know what we’re looking at, but we miss a lot.

To speak with absolute certainty is to strike an affectation.  I find this to be the comic failing of our moment in history.  A great revolution in information and communication has placed all authority in crisis, yet people tend to speak in dogmatic assertions, as if they had access to the whole truth.  They concede that you and I are blinded by a point of view, and miss a lot:  but not them.  Since “them” means us, any discussion quickly takes a sour, theological turn.

The translation of the Bible into the vernacular spawned a host of sectarian prophets who insisted they alone understood its message.  In our time, the unprecedented dispersal and democratization of information has had a similar effect.  Facile minds are amazed by how much they know, and find easy patterns, and preach to the multitudes the one true way in politics, or economics, or war and peace, or food consumption, or social relations.

Latter-day sectarians think they own the lever with which to move the world.  But the reality is, they don’t.  They believe they can ordain the future.  But they can’t.  Failure is inevitable but always interpreted as betrayal.  Truth then becomes something ordinary people can’t handle:  a murky conspiracy, the rule of secretive vampires.

Much of the anger consuming our public debates pours out of a conviction that truth really is up for grabs – that nothing exists unless powerful people wish it so.  As an explanation of the world, this is largely self-refuting:  every accusation must be understood to be a manipulation, around and around.  But as an exercise in shifting blame for failure and moral inertia, it’s emotionally satisfying.

An age of affectation feigns to see through the truth:  but we are still clueless children playing games in the street, while that truck, loaded with randomness, bears down on us.

Accepting that truth is perspectival doesn’t entail wobbliness of any sort.  It entails humility.  It’s a sharp reminder of the human condition.  Every day I must wrestle with the angel of doubt.  On every question I am forced to measure, dismayed, the vast abyss of my ignorance.

I’m an analyst.  I spout assertions all over the place.  Should I hem and haw and qualify?  That would be indigestible.  Should I assume the mantle of authority?  That would be dishonest.  Is there a middle ground?  Probably, but it would mean absorbing as many different perspectives on the subject as my limited mental bandwidth can hold.  That’s time-consuming and hard.

Is the payoff worth it?  Much of the time, the effort seems disproportionate to the output.  I labor mightily to get at truth, but deliver a point of view.

Yet all this turns out to be healthy:  possibly for me, certainly for us.  Humility is the beginning of true science, the foundation of what Karl Popper called the open society.  Ignorance invites tolerance:  I want to get to where I’m going, and your directions, though contrary to mine, may actually take me there.

I don’t need a mystical feeling of fraternity to hear you out.  I just have to remember that the world is very large, and that I am small.

Failure, too, becomes a source of enraged cynicism only if I presume a right to God-like infallibility.  Failure is information plugged into the one process that has ever propelled the human race forward:  trial and error.  This method isn’t called “trial and glory” for a reason.  In human affairs, most things fail.  We then have a choice:  learn and advance toward the light, or scratch the itch of wounded vanity and blame the world’s injustice.

Nor is this choice in any sense determined.  If ignorance is infinite, so are the paths that lead to truth.  We decide on direction.  As an analyst, I decide.  Reality contains an element of freedom, and for this reason acquires a distinctly historical flavor.  (The universe, without changing a single atom, resembled a clockworks in the eighteenth century, a folded sheet in the twentieth, a crazy dynamic system today).

Truth comes with a history and a genealogy.  Those who aspire to an unchanging realm of perfect certainty misunderstand the human adventure, which is a great migration into the unknown.
