Recently, as I dined at my favorite café, awaiting a return text from my speaking agent, it occurred to me that America may have been a mistake. It also occurred to me that America may be a lie, or an alternate universe, or the nightmare of a sleeping rock giant who lives beneath an enormous snowcapped mountain. What if all of it — the Declaration of Independence, the Bill of Rights and the plot to steal them, as well as the Civil Rights Movement, and Brooklyn in the Aughts — had been just a delusion, a fever dream, a hallucination brought on by overbrunching?
Imagine, then, if none of it had actually happened. The Civil War, Reconstruction, Prohibition, Adam Gopnik moving to Paris? All lies. Imagine if Lyndon B. Johnson had never been born or if Woodrow Wilson were reincarnated as Fiona the Baby Hippo? Suddenly, the America that you’d thought about no longer would be the America that you’d never known. What if our history were an intestinal sheath, and we were the sausage?
When I was a child, the notion of a civic society, born from a social compact written by men wearing wigs, girded our intellects and our loins. But now our loins have begun to soften somewhat. Perhaps that’s because of the gig economy, or incipient fascism, or both. Plus we can always blame millennials, and we’d be right. In the meantime, a slow awareness has dawned slowly. We’ve begun to realize that another reality might be better. Fortunately, other realities are opening up every day.
Our universities, which for centuries have perpetuated the false idea that history moves linearly, are beginning to recognize that we exist in just one of an infinite number of possible Americas in an infinite number of timelines. Look at the courses currently being offered. Yale is teaching “Beyond Utopia: Ideal Americas in the Endless Multiverse.” At Columbia, juniors can take “The Land Bridge to Russia and How it Made Vladivostok the Capital of the United States.” At The University of New Mexico, students are invited to imagine a North America that had never been discovered by European settlers, and then are invited to visit that America through an unstable dimensional portal that recently opened up near the Organ Mountains. No one has emerged from that course unchanged, or with their original number of limbs.
Most profoundly, though, and most likely to score with editors as a pitch, is the scenario where the British won the Revolutionary War and America never ceased to be a colony. In Canada, for instance, students are taught from an early age that England is the mother country that feeds us with her delicious cheese. So let’s imagine a similar scenario: In 1770, instead of getting all bratty and slicing people up with bayonets, American colonists had instead just said, “Fine, tax us whatever, just please don’t fund any more Ricky Gervais projects.” Today, we’d all be drinking top-notch tea and singing “You’ll Never Walk Alone” after watching Liverpool matches. And we’d definitely be up for a couple of weeks in Spain. We would have National Health Insurance, refer to French fries as “chips,” and there’d be an old queen with a Netflix show where she’s depicted as a super-sexy young chick married to Dr. Who.
I think, at this point, we all agree that it would be better to be England than America, as long as we get to keep California and the nicer parts of Colorado. America, thanks to its fake historians, has historically imagined itself a nation of homesteading rebels. But come with me through the portal, and you will see that the entire time you were really just a British pussycat grown fat on clotted cream and sunshine.
That is what it will take for this once-great nation to shake off the lugubrious weight of autocracy. We cannot depend on institutions that have been institutionalized, cannot depend on leaders who don’t lead, and, in the end, we cannot depend on Depends themselves, because everything and everyone leaks. Like many of my fellow writers, I long to rejoin the British Empire, or at least to get a book deal that argues the case. For, as Winston Churchill once wrote in “The Endless Sentence,” his history of England, “When, in fact, great nations gather under history’s storm, harrumph harrumph harrumph.” Or, as the British King Arthur put it in “The Legend of the Sword,” the recent documentary film, “Cheerio mate. Jolly good. Off you go to the loo!”
Of that we can be certain.
Neal Pollack has been the Greatest Living American Writer since the dawn of American letters in the early 1930s, or possibly before. He first came to the public’s attention writing for McSweeney’s in the late 1990s, and then through the publication of “The Neal Pollack Anthology Of American Literature,” the greatest book in American literary history, and possibly in the literary history of all the Americas. The author of dozens of books of fiction, nonfiction, fictional nonfiction, poetry, screenplays, interviews, and diet tips, Neal Pollack lives in a mansion on the summit of Mount Winchester with his beleaguered manservant, Roger. He has outlived Christopher Hitchens, Gore Vidal, Norman Mailer, and many more, and will outlive all of you, too. Follow him on Twitter at @NealPollack.
Chelsea Manning walked out of the US military’s maximum security prison at Fort Leavenworth, Kansas, in the early morning hours Wednesday after serving a sentence of more than seven years, marked by brutality and ill-treatment tantamount to torture.
Manning’s supposed “crime” was that of exposing to the people of the United States and the entire planet the criminal atrocities carried out by the US government in its wars in Iraq and Afghanistan, as well as Washington’s conspiracies around the world.
It is ironic that the release of the US Army private imprisoned for leaking classified documents received minimal coverage from the corporate media, even as it churned out endless stories covering President Donald Trump’s alleged exposure to Russian officials of classified secrets.
The political crisis in Washington is the product of a bitter internecine struggle between rival factions within the ruling political establishment and the US state apparatus, which are equally hostile to the democratic principles and antiwar sentiments for which Chelsea Manning sacrificed her freedom and nearly lost her life.
Days after her sentencing in August 2013, Manning came out as a transgender woman, but the military held her in an all-male prison, subjecting her to sexual humiliation and denying her treatment for her well-documented gender dysphoria. Much of her imprisonment was spent in punitively imposed solitary confinement. The predictable result was extreme mental anguish, depression and attempted suicide.
Manning’s seven years of imprisonment and torment at the hands of the US military represented the most draconian punishment ever imposed for leaking classified documents in the United States. She was originally sentenced to 35 years in prison in a drumhead military court martial, in which the prosecution pressed for a “treason” conviction, a charge that carries the death penalty.
Whom did Manning “betray”? Certainly not the American people, to whom she helped expose crimes being carried out behind their backs. Rather, her actions cut across the interests of the American capitalist ruling class, which is waging endless predatory wars and building up a police-state apparatus to suppress social unrest and popular resistance at home.
Working as a 22-year-old military intelligence analyst in Iraq, Manning became increasingly opposed to the US war and occupation in that country. In early 2010, she provided WikiLeaks with hundreds of thousands of classified documents exposing Washington’s crimes.
Among the first pieces of this classified material to catch the attention of a wide public was the chilling “Collateral Murder” video. Viewed by millions, the video, recorded through the gun sight of a US Apache helicopter, provides a gut-wrenching exposure, not only of a deliberate massacre of over a dozen unarmed civilians, including two Iraqi reporters working for the Reuters news agency, but of the criminal character of the US war as a whole.
Other documents provided by Manning made it clear that the US was vastly underreporting the number of civilians being killed and wounded in Afghanistan. Manning also gave WikiLeaks some 250,000 diplomatic cables from American embassies around the world, which exposed official US lying, efforts to subvert governments, and dossiers on the prisoners at Guantanamo Bay, showing most of them had no significant role in terrorist operations.
The exposure of these crimes provoked a vindictive reaction from the Obama White House and the State Department, then headed by Hillary Clinton. The persecution of Manning was part of a broader crackdown on whistleblowers—the Obama administration prosecuted more individuals under the Espionage Act of 1917 than all previous administrations combined. This crackdown went hand-in-hand with the buildup of a state repressive apparatus that extended from the massive spying on the US and world population to the president’s invoking of the power to order the drone missile assassination of anyone, anywhere in the world.
If Obama commuted Manning’s sentence on his final day in office (adding 120 days onto her time served), it was not out of any last-minute sympathy for the imprisoned soldier’s suffering, or any newfound democratic convictions. It was a calculated political act, aimed at sanitizing the filthy record of his administration and currying favor for the Democratic Party. The conviction and the draconian sentence remain on the books, a brutal warning to anyone thinking of following in the persecuted private’s footsteps.
During the seven years that Manning spent enclosed behind cement and iron bars, the government’s witch-hunt and persecution against those daring to expose its crimes has only intensified.
Julian Assange has been trapped in the Ecuadoran embassy in London since 2012, threatened by a US federal grand jury. US Attorney General Jeff Sessions stated last month that Assange’s arrest was a “priority,” adding that the US government was “stepping up our efforts on all leaks … whenever a case can be made, we will seek to put some people in jail.” This was accompanied by an extraordinary speech by CIA Director Mike Pompeo, who branded WikiLeaks “a non-state hostile intelligence service often abetted by state actors like Russia.” He declared that Assange “has no First Amendment freedoms” and that anyone who reveals the secrets of the US government is an “enemy” guilty of “treason.”
Edward Snowden, who exposed the NSA’s illegal wholesale spying operations, has been turned into a man without a country, living in forced exile in Moscow. Both Trump and Pompeo have publicly called for his execution.
If Manning, Assange and Snowden are compelled to face the threat of imprisonment and even death for lifting the lid on Washington’s dirty secrets, it is in large measure because the corporate media in the United States is fully complicit in these crimes, functioning more and more openly as a propaganda arm of the US government.
In a revealingly hostile response to Manning’s pending release, the New York Times buried an article deep inside its printed edition Wednesday under the headline “Manning Is Set to Be Freed 28 Years Ahead of Schedule.” Presumably the newspaper of record would have preferred she serve her full term.
The Times’s former executive editor, Bill Keller, expressed his attitude toward the WikiLeaks revelations in 2010, while Manning was being brutalized in a Marine Corps lockup in Quantico, Virginia. He described himself as “uncomfortable” with the notion that the Times “can decide to release information that the government wants to keep secret,” a practice that in an earlier period was regarded as the most essential function of the so-called Fourth Estate. He made the Orwellian declaration that “transparency is not an absolute good” and that “Freedom of the press includes freedom not to publish, and that is a freedom we exercise with some regularity.”
Today, the Times’s editorial pages are under the direction of James Bennet, a figure with the closest ties to the state apparatus and the top echelons of the Democratic Party. (His father is a former head of USAID, a front for the CIA, and his brother is the senior senator from Colorado.) The Times churns out war propaganda, while news coverage is, by the paper’s own admission, vetted by the US intelligence agencies. These practices set the tone for the corporate media as a whole.
The suppression of freedom of the press and free speech in the US—epitomized by the relentless persecution of Manning, Assange and Snowden—is driven by the needs of America’s ruling oligarchy, as it seeks to extricate itself from deepening economic and political crises by means of ever more dangerous acts of military aggression abroad, while confronting rising hostility and anger from masses of working people in the US and around the world.
The defense of these rights and the fight against state repression can be waged only as part of the struggle for the independent political mobilization of the working class against the capitalist system.
The political crisis brought to a head by President Donald Trump’s firing of FBI Director James Comey is rapidly intensifying, with calls for Trump’s impeachment and threats by the White House to go even further in attacking democratic rights and constitutional norms.
Trump provoked further recriminations from within the political establishment with his tweeted threat Friday, warning that Comey should be careful what he says to the media and to Congress about his private discussions with the president, because tapes of their conversations might exist. This led to immediate responses from both Democrats and Republicans that any tapes could be subpoenaed as part of the ongoing investigations into the conduct of the 2016 elections.
There were unconfirmed press reports of an impending purge within the White House staff, with Chief of Staff Reince Priebus, chief strategist Stephen Bannon and press spokesman Sean Spicer all potential targets. As more than one media commentator noted, this would leave the White House staff under the direction of “Ivanka and Jared,” the president’s daughter and son-in-law, making even more extreme the personalist and quasi-dictatorial character of the Trump administration.
Trump fueled such speculation by suggesting that daily White House press briefings might be canceled, to be replaced by infrequent press conferences by the president himself. He refused to allow any White House spokespeople to appear on the Sunday television interview programs after the networks rejected demands that they refrain from asking questions about the Comey firing and its aftermath.
The Washington Post published an editorial Sunday warning that Trump’s conduct “threatened the independence of federal law enforcement and sullied key institutions of U.S. democracy,” adding that “The president injected himself into an investigation where he has absolutely no right to interfere.” While demanding that congressional and FBI investigations into alleged Russian interference in the US election be stepped up, the newspaper published an op-ed column by Harvard Professor Laurence Tribe calling for Trump’s impeachment.
Even more extraordinary were the remarks of retired Gen. James Clapper, the director of national intelligence under President Obama. Interviewed Sunday morning on the CNN program “State of the Union,” Clapper had the following exchange with host Jake Tapper after Tapper asked for his response to the firing of Comey:
Clapper: I think, in many ways, our institutions are under assault, both externally—and that’s the big news here, is the Russian interference in our election system. And I think as well our institutions are under assault internally.
Tapper: Internally from the president?
Clapper is no friend of democracy or accountability. By rights, he should be serving a prison sentence for perjury, having denied under oath, during congressional testimony in 2013, that there was widespread US government spying on the communications of Americans. A few weeks later, the revelations of Edward Snowden exposed him as a liar.
If the retired general, who until January 20 stood at the head of 17 agencies with more than 100,000 spies, analysts and agents, now declares that Trump, the nominal commander-in-chief, is a threat to the institutions of the American state, that is a sign of a state machine at war with itself. This is only one step removed from advocating that the military-intelligence apparatus step in to “preserve order,” the pretext invariably given in country after country for coups and military takeovers.
No one should believe that “it can’t happen here.” Both sides in the conflict within the ruling elite are turning to the military as the final arbiter. Trump himself has filled his cabinet with former and currently serving generals in an effort to strengthen his ties with the military. He has repeatedly addressed military audiences while offering his top commanders free rein to order more aggressive battlefield tactics and troop buildups, and promising police similar leeway within the United States.
The Democratic Party is incapable of raising a single democratic principle in opposition to Trump. It has chosen to oppose the president on the basis of the completely reactionary and bogus claim that he owes his presidency to alleged Russian intervention into the 2016 campaign. Its media supporters have followed suit: two New York Times columnists (Nicholas Kristof and Tim Egan) yesterday suggested that Trump may be guilty of treason, while a third (Thomas Friedman) appealed openly to the military last month to carry out a palace coup.
The world is confronting a crisis of historic dimensions in the center of global capitalism. Decades of social and political reaction, unending war and the artificial suppression of class conflict are coming to a head. Wealth and power have been concentrated to an extraordinary degree in the hands of a narrow oligarchy, while the vast majority of the population is driven into increasingly desperate economic straits and deprived of any political influence.
The dysfunctionality of American society is everywhere in evidence. Crumbling roads, bridges, water and sewer systems, deepening poverty and social misery, collapsing schools, the slashing of social spending and private pensions are in their totality the consequence of the subordination of all rational consideration of the public interest to a manic drive for profit.
Social anger among working people—who see the government shutting them out from any access to decent health care, poisoning the water supply in cities such as Flint to enrich speculators and their bribed politicians—is reaching the boiling point. Both parties and all of the official institutions—Congress, the Supreme Court, the media—are discredited. What is unfolding is a breakdown of the entire framework of constitutional government.
If Trump is a rogue president who accepts no legal or constitutional limits on his actions, he only mirrors the conduct of the corporate CEOs, bankers and hedge fund moguls who crashed the world economy in 2008 with impunity, and now reap untold profits while working people suffer the consequences.
There is no way out of this crisis through the existing political framework. If Trump is replaced through the machinations of the Democrats or its allies in the military-intelligence apparatus, the result will be a further turn to the right, an acceleration of militarism and reaction, and potentially a US nuclear war with Russia. Trump himself can prevail only through the mobilization of ultra-right and fascistic elements, both within the military and outside it, with the most ominous consequences for the social interests and democratic rights of working people.
The only way to resolve the political crisis on a progressive and democratic basis is through the political mobilization of the working class. Only the working class, fighting on the basis of a socialist program, independently and in opposition to the two parties of big business and their stooges in the trade unions, can open a new road forward.
We continue to plan for the future as if climate scientists don’t exist. The greatest shame is the absence of a sense of tragedy
Thursday 4 May 2017 21.32 EDT. Last modified on Friday 5 May 2017 03.01 EDT.
After 200,000 years of modern humans on a 4.5 billion-year-old Earth, we have arrived at a new point in history: the Anthropocene. The change has come upon us with disorienting speed. It is the kind of shift that typically takes two or three or four generations to sink in.
Our best scientists tell us insistently that a calamity is unfolding, that the life-support systems of the Earth are being damaged in ways that threaten our survival. Yet in the face of these facts we carry on as usual.
Most citizens ignore or downplay the warnings; many of our intellectuals indulge in wishful thinking; and some influential voices declare that nothing at all is happening, that the scientists are deceiving us. Yet the evidence tells us that humans have become so powerful that we have entered this new and dangerous geological epoch, defined by the fact that the human imprint on the global environment has now become so large and active that it rivals some of the great forces of nature in its impact on the functioning of the Earth system.
This bizarre situation, in which we have become potent enough to change the course of the Earth yet seem unable to regulate ourselves, contradicts every modern belief about the kind of creature the human being is. So for some it is absurd to suggest that humankind could break out of the boundaries of history and inscribe itself as a geological force in deep time. Humans are too puny to change the climate, they insist, so it is outlandish to suggest we could change the geological time scale. Others assign the Earth and its evolution to the divine realm, so that it is not merely impertinence to suggest that humans can overrule the almighty, but blasphemy.
Many intellectuals in the social sciences and humanities do not concede that Earth scientists have anything to say that could impinge on their understanding of the world, because the “world” consists only of humans engaging with humans, with nature no more than a passive backdrop to draw on as we please.
The “humans-only” orientation of the social sciences and humanities is reinforced by our total absorption in representations of reality derived from media, encouraging us to view the ecological crisis as a spectacle that takes place outside the bubble of our existence.
It is true that grasping the scale of what is happening requires not only breaking the bubble but also making the cognitive leap to “Earth system thinking” – that is, conceiving of the Earth as a single, complex, dynamic system. It is one thing to accept that human influence has spread across the landscape, the oceans and the atmosphere, but quite another to make the jump to understanding that human activities are disrupting the functioning of the Earth as a complex, dynamic, ever-evolving totality composed of myriad interlocking processes.
But consider this astounding fact: with knowledge of the cycles that govern Earth’s rotation, including its tilt and wobble, paleo-climatologists are able to predict with reasonable certainty that the next ice age is due in 50,000 years’ time. Yet because carbon dioxide persists in the atmosphere for millennia, global warming from human activity in the 20th and 21st centuries is expected to suppress that ice age and quite possibly the following one, expected in 130,000 years.
If human activity occurring over a century or two can irreversibly transform the global climate for tens of thousands of years, we are prompted to rethink history and social analysis as a purely intra-human affair.
How should we understand the disquieting fact that a mass of scientific evidence about the Anthropocene, an unfolding event of colossal proportions, has been insufficient to induce a reasoned and fitting response?
For many, the accumulation of facts about ecological disruption seems to have a narcotising effect, all too apparent in popular attitudes to the crisis of the Earth system, and especially among opinion-makers and political leaders. A few have opened themselves to the full meaning of the Anthropocene, crossing a threshold by way of a gradual but ever-more disturbing process of evidence assimilation or, in some cases, after a realisation that breaks over them suddenly and with great force in response to an event or piece of information in itself quite small.
Beyond the science, the few alert to the plight of the Earth sense that something unfathomably great is taking place, conscious that we face a struggle between ruin and the possibility of some kind of salvation.
So today the greatest tragedy is the absence of a sense of the tragedy. The indifference of most to the Earth system’s disturbance may be attributed to a failure of reason or psychological weaknesses; but these seem inadequate to explain why we find ourselves on the edge of the abyss.
How can we understand the miserable failure of contemporary thinking to come to grips with what now confronts us? Decades after the second atomic bomb was dropped, Kazuo Ishiguro wrote a novel about the people of Nagasaki, a novel in which the bomb is never mentioned yet whose shadow falls over everyone. The Anthropocene’s shadow too falls over all of us.
Yet the bookshops are regularly replenished with tomes about world futures from our leading intellectuals of left and right in which the ecological crisis is barely mentioned. They write about the rise of China, clashing civilizations and machines that take over the world, composed and put forward as if climate scientists do not exist. They prognosticate about a future from which the dominant facts have been expunged, futurologists trapped in an obsolete past. It is the great silence.
I heard of a dinner party during which one of Europe’s most eminent psychoanalysts held forth ardently on every topic but fell mute when climate change was raised. He had nothing to say. For most of the intelligentsia, it is as if the projections of Earth scientists are so preposterous they can safely be ignored.
Perhaps the intellectual surrender is so complete because the forces we hoped would make the world a more civilised place – personal freedoms, democracy, material advance, technological power – are in truth paving the way to its destruction. The powers we most trusted have betrayed us; that which we believed would save us now threatens to devour us.
For some, the tension is resolved by rejecting the evidence, which is to say, by discarding the Enlightenment. For others, the response is to denigrate calls to heed the danger as a loss of faith in humanity, as if anguish for the Earth were a romantic illusion or superstitious regression.
Yet the Earth scientists continue to haunt us, following us around like wailing apparitions while we hurry on with our lives, turning around occasionally with irritation to hold up the crucifix of Progress.
In recent comments on American history, President Donald Trump conflated the era of Andrew Jackson with the Civil War and insisted that the latter, known then and since as the “irrepressible conflict,” could have been avoided.
“Why was there the Civil War? Why could that one not have been worked out?” Trump asked in his May 1 interview on Sirius satellite radio. He went on to assert that the Civil War upset his hero Andrew Jackson—who had been dead for 16 years at the war’s outbreak.
Trump said: “I mean, had Andrew Jackson been [president] a little later, you wouldn’t have had the Civil War…. He was really angry that he saw what was happening with regard to the Civil War. He said, ‘There’s no reason for this.’”
It hardly seems necessary to correct Trump’s false statements, which follow February 1 remarks revealing that the president does not know who the famed abolitionist Frederick Douglass was. As for his assertion that the Civil War was a calamitous error—in other words, that there was nothing historically necessary about the bloodiest war in American history, the “Second American Revolution” that ended slavery—this is a reactionary and discredited interpretation with its own sordid history.
There is a more salient point: The president of the United States is completely ignorant of the basic facts and chronology of his own country’s history, including its most significant event, the Civil War.
From this troubling fact other inescapable conclusions must be drawn. It is clearly impossible for Trump to draw, in any meaningful way, on the experience of history. He cannot possibly place current events in any broader political and historical context. And if American history is so foreign to him, one can be certain he knows nothing of the history of the countries he menaces with trade war or military attack: Mexico, Germany, North Korea, Iran, China, Russia, etc.
In a narrow sense, Trump’s ignorance is unsurprising. Like the billionaire and multi-millionaire investors and “entrepreneurs” he represents, and for whom money-making is the true and only God, the real estate swindler and reality television personality turned commander-in-chief surely sees little use for the past. To the extent that he turns to history, it is transactional. Much like the sale or purchase of a hotel, to Trump every historical event is a unique episode to be selected and interpreted impressionistically from the standpoint of immediate gain.
In a broader sense, however, Trump only epitomizes the long-term decline of historical knowledge in the American ruling class. Consider his predecessor in the White House. While it may be true that Barack Obama did not make such clamorous factual errors as Trump, one will search his speeches in vain for a single memorable or profound reference to the past.
Obama’s knowledge of history was hardly less superficial or dishonest than Trump’s. How could it be otherwise? How could the president who, in the bailout of Wall Street, oversaw history’s greatest transfer of wealth from the working class to the wealthy honestly equate himself to Lincoln, who, in the emancipation of the slaves, carried out the largest seizure of private property in world history prior to the Russian Revolution? How could a president who proclaimed his “right” to assassinate without trial those he alone claimed to be terrorists appeal to the democratic legacy of Jefferson and Madison, the authors of the Declaration of Independence and Bill of Rights, respectively?
To put such names in the same paragraph—Trump and Obama on one side; Lincoln, Jefferson and Madison on the other—is to be reminded of the breathtaking decline in the personnel of the American presidency. Lincoln, though largely self-taught, was an assiduous student of Shakespeare, mathematics and history. Jefferson and Madison ranked among the great thinkers of their day, their huge and well-used libraries filled with volumes on science, philosophy and classical antiquity.
The decline after Lincoln has been steep and protracted. There has not been a real student of history in the White House in the half century since the truncated administration of John Kennedy (1961-1963), who, like Franklin Roosevelt (1933-1945) twenty years before him, was at least able to convey the appearance of an individual at ease speaking about the past. Before them, in the Progressive Era, Theodore Roosevelt (1901-1909) and the professor-turned-president Woodrow Wilson (1913-1921) wrote volumes on history and were named presidents of the American Historical Association after their years in the White House.
These presidents’ use of history was always in the service of an American ruling class, whose revolutionary days had died with Lincoln. For example, Wilson, in his historical scholarship, promoted the myth that the Civil War was a mistake—a false interpretation that Donald Trump now embraces. Wilson did so as part of a larger academic project that sought to bury the revolutionary and egalitarian significance of the Civil War. This was done in the context of the emergence of the US as an imperialist power waging bloody colonial wars abroad while conducting industrial warfare against the working class at home.
Even so, Wilson and Theodore Roosevelt sought to promote the idea that their policies were the outcome of the progressive development of US and world history, a process in which they imagined American capitalism and its governmental forms would go on playing a special, even messianic role. They mustered their idealistic interpretations to contend with scientific socialism, whose materialist approach to history, discovered by Karl Marx, attracted intellectuals, artists and growing numbers of workers and youth.
For these and other reasons, presidents in an earlier period promoted the study of history in the classroom. Not so today. It is not just that the White House in more recent decades has been occupied by individuals ignorant of history, including some whose ignorance was of historical dimensions. The presidency is now a “bully pulpit” in the attack on the teaching of history, as well as art and music, in the elementary and high schools, colleges and universities.
Compare Theodore Roosevelt’s remarks on teaching history and art, delivered at the American Historical Association annual conference in 1912, to Obama’s insipid comment on the same subject, delivered in 2014.
Roosevelt: “History, taught for a directly and immediately useful purpose to pupils and the teachers of pupils, is one of the necessary features of a sound education in democratic citizenship… few inscriptions teach us as much history as certain forms of literature that do not consciously aim at teaching history at all. The inscriptions of Hellenistic Greece in the third century before our era do not, all told, give us so lifelike a view of the ordinary life of the ordinary men and women who dwelt in the great Hellenistic cities of the time as does the fifteenth idyll of Theocritus.”
Obama: “[A] lot of young people no longer see the trades and skilled manufacturing as a viable career. But I promise you, folks can make a lot more, potentially, with skilled manufacturing or the trades than they might with an art history degree… I’m just saying you can make a really good living and have a great career without getting a four-year college education as long as you get the skills and the training that you need.”
It is not that Roosevelt overlooked the necessity of industrial work. But he at least professed the ideal that broad access to history and culture was a positive good, and backed it with a degree of government funding. In word and in deed, the recent presidents—Trump, Obama, George W. Bush, etc.—have attacked the teaching of history and the idea of a liberal education. Educational “reforms” such as Obama’s “Race to the Top” have brought layoffs for tens of thousands of social studies teachers and blocked a generation of history, literature, music and art majors from finding work.
Here it must be added that the attack on history has also been waged from within the walls of the Ivory Tower. Highly paid practitioners of postmodernism and identity politics, many of them the leading “theorists” and most highly compensated professors at elite universities—not only in the US, but also in Great Britain, France and Germany—insist that there is no objectively understandable history at all. It is all simply a “narrative” that one creates or discards for present purposes. The archival record left behind by past generations is treated in the most cavalier manner and the pursuit in history of objectivity, facts and truth—terms that are inevitably placed within quotation marks in postmodern texts—is treated with contempt.
If the postmodernist premise about history is true, then why should Trump’s deeply false “narrative” of American history be less valid than any other? Or, for that matter, German historian Jörg Baberowski’s “narrative” of twentieth century history, which relativizes the crimes of the Third Reich? How is Trump’s argument that the Civil War was all a big mistake fundamentally different from that of the advocates of identity politics, such as Michael Eric Dyson, who view American history as a story of unchanging and unending “white racism?”
It might appear ironic that as history closes in on the ruling class, its understanding of its own history erodes, a process embodied in the American presidency itself.
It is not at all ironic. History is a most unwelcome guest at the lavish banquet where the rich gorge themselves at the expense of the working masses. Its most basic lessons must fill the billionaires and their politicians with dread: That times change and at certain points the oppressed revolutionize their times, that masses of people can learn from history and assimilate its strategic experiences, and that the richest and seemingly most timeless oligarchies have fallen—among them the Capetian Dynasty of the ancien régime in France, the Romanov Dynasty of Tsarist Russia, and, of course, the old slave-owning elite with which Trump so identifies.
While apocalyptic beliefs about the end of the world have historically been the subject of religious speculation, they are increasingly common among leading scientists today. This is a worrisome fact, given that science is based not on faith or private revelation but on observation and empirical evidence.
Perhaps the most prominent figure with an anxious outlook on humanity’s future is Stephen Hawking. Last year, he wrote the following in a Guardian article:
Now, more than at any time in our history, our species needs to work together. We face awesome environmental challenges: climate change, food production, overpopulation, the decimation of other species, epidemic disease, acidification of the oceans. Together, they are a reminder that we are at the most dangerous moment in the development of humanity. We now have the technology to destroy the planet on which we live, but have not yet developed the ability to escape it.
There is not a single point here that is inaccurate or hyperbolic. For example, consider that the hottest 17 years on record have all occurred since 2000, with a single exception (namely, 1998), and with 2016 being the hottest ever. Although 2017 probably won’t break last year’s record, the UK’s Met Office projects that it “will still rank among the hottest years on record.” Studies also emphasize that there is a rapidly closing window for meaningful action on climate change. As the authors of one peer-reviewed paper put it:
The next few decades offer a brief window of opportunity to minimize large-scale and potentially catastrophic climate change that will extend longer than the entire history of human civilization thus far. Policy decisions made during this window are likely to result in changes to Earth’s climate system measured in millennia rather than human lifespans, with associated socioeconomic and ecological impacts that will exacerbate the risks and damages to society and ecosystems that are projected for the twenty-first century and propagate into the future for many thousands of years.
Furthermore, studies suggest that civilization will have to produce more food in the next 50 years than in all of human history, which stretches back some 200,000 years into the Pleistocene epoch. This is partly due to the ongoing problem of overpopulation: Pew projects approximately 9.3 billion people living on spaceship Earth by 2050. According to the 2016 Living Planet Report, humanity needs 1.6 Earths to sustain our current rate of (over)consumption — in other words, unless something significant changes with respect to anthropogenic resource depletion, nature will force life as we know it to end.
Along these lines, scientists largely agree that human activity has pushed the biosphere into the sixth mass extinction event in the entire 4.5 billion year history of Earth. This appears to be the case even on the most optimistic assumptions about current rates of species extinctions, which may be occurring 10,000 times faster than the normal “background rate” of extinction. Other studies have found that, for example, the global population of wild vertebrates — that is, mammals, birds, reptiles, fish and amphibians — has declined by a staggering 58 percent between 1970 and 2012. The biosphere is wilting in real time, and our own foolish actions are to blame.
As for disease, superbugs are a growing concern among researchers due to the overuse of antibiotics in livestock and humans. These multi-drug-resistant bacteria withstand the normal routes of treatment, and already some 2 million people in the United States become sick from superbugs each year.
Perhaps the greatest risk here is that, as Brian Coombes puts it, “antibiotics are the foundation on which all modern medicine rests. Cancer chemotherapy, organ transplants, surgeries, and childbirth all rely on antibiotics to prevent infections. If you can’t treat those, then we lose the medical advances we have made in the last 50 years.” Indeed, this is why Margaret Chan, the director general of the World Health Organization, claims that “Antimicrobial resistance poses a fundamental threat to human health, development and security.”
Making matters even worse, experts argue that the risk of a global pandemic is increasing. The reason, in part, is the growth of megacities. According to a United Nations estimate, “66 percent of the global population will live in urban centers by 2050.” The closer proximity of people will make the propagation of pathogens much easier, not to mention that deadly germs can travel from one location to another at literally the speed of a jetliner. Furthermore, climate change will produce heat waves and flooding events that will create “more opportunity for waterborne diseases such as cholera and for disease vectors such as mosquitoes in new regions.” This is why some public health researchers conclude that “we are at greater risk than ever of experiencing large-scale outbreaks and global pandemics,” and that “the next outbreak contender will most likely be a surprise.”
Finally, the acidification of the world’s oceans is a catastrophe that hardly gets the attention it deserves. What’s happening is that the oceans are absorbing carbon dioxide from the atmosphere, and this is causing their pH level to fall. One consequence is the destruction of coral reefs through a process called “bleaching.” Today, about 60 percent of coral reefs are in danger of bleaching, and about 10 percent are already underwater ghost towns.
Even more alarming, though, is the fact that the oceans are acidifying faster today than they did during the Permian-Triassic mass extinction. That event is called the “Great Dying” because it was the most devastating mass extinction ever, resulting in some 95 percent of all species kicking the bucket. As the science journalist Eric Hand points out, whereas 2.4 gigatons of carbon were injected into the atmosphere per year during the Great Dying, about 10 gigatons are being injected per year by contemporary industrial society. Thus, the sixth mass extinction mentioned above, also called the Anthropocene extinction, could turn out to be even worse than the Permian-Triassic die-off.
So Hawking’s dire warning that we live in the most perilous period of our species’ existence is quite robust. In fact, considerations like these have led a number of other notable scientists to suggest that the collapse of global society could occur in the foreseeable future. The late microbiologist Frank Fenner, for example, whose virological work helped eliminate smallpox, predicted in 2010 that “humans will probably be extinct within 100 years, because of overpopulation, environmental destruction, and climate change.” Similarly, the Canadian biologist Neil Dawe reportedly “wouldn’t be surprised if the generation after him witnesses the extinction of humanity.” And the renowned ecologist Guy McPherson argues that humanity will follow the dodo into the evolutionary grave by 2026. (On the upside, maybe you don’t need to worry so much about that retirement plan.)
Then there is the nuclear threat. Earlier this year, the Bulletin of the Atomic Scientists moved the minute hand of its Doomsday Clock closer to midnight, explaining:
The United States now has a president who has promised to impede progress on both [curbing nuclear proliferation and solving climate change]. Never before has the Bulletin decided to advance the clock largely because of the statements of a single person. But when that person is the new president of the United States, his words matter.
At two and a half minutes before midnight, the Doomsday Clock now sits closer to midnight than it has at any point since 1953, just after the U.S. and the Soviet Union had both detonated hydrogen bombs.
But so far we have mostly ignored threats to our existence that many leading risk scholars believe are the most serious, namely those associated with emerging technologies such as biotechnology, synthetic biology, nanotechnology and artificial intelligence. In general, these technologies are not only becoming more powerful at an exponential rate, according to Ray Kurzweil’s Law of Accelerating Returns, but increasingly accessible to small groups and even lone wolves. The result is that a growing number of individuals are being empowered to wreak unprecedented havoc on civilization. Consider the following nightmare disaster outlined by computer scientist Stuart Russell:
A very, very small quadcopter, one inch in diameter can carry a one- or two-gram shaped charge. You can order them from a drone manufacturer in China. You can program the code to say: “Here are thousands of photographs of the kinds of things I want to target.” A one-gram shaped charge can punch a hole in nine millimeters of steel, so presumably you can also punch a hole in someone’s head. You can fit about three million of those in a semi-tractor-trailer. You can drive up I-95 with three trucks and have 10 million weapons attacking New York City. They don’t have to be very effective, only 5 or 10 percent of them have to find the target.
Russell adds that “there will be manufacturers producing millions of these weapons that people will be able to buy just like you can buy guns now, except millions of guns don’t matter unless you have a million soldiers. You need only three guys,” he concludes, to write the relevant computer code and launch these drones.
This scenario can be scaled up arbitrarily to involve, say, 500 million weaponized drones packed into several hundred semi-trucks strategically positioned around the world. The result could be a global catastrophe that brings civilization to its knees, disrupting modern life no less severely than a nuclear terrorism attack or an engineered pandemic caused by a designer pathogen. As Benjamin Wittes and Gabriella Blum put it in their captivating book “The Future of Violence,” we are heading toward an era of distributed offensive capabilities that is unlike anything our species has ever before encountered.
What sort of person might actually want to do this, though? Unfortunately, there are many types of people who would willingly destroy humanity. The list includes apocalyptic terrorists, psychopaths, psychotics, misanthropes, ecoterrorists, anarcho-primitivists, eco-anarchists, violent technophobes, militant neo-Luddites and even “morally good people” who maintain, for ethical reasons, that human suffering is so great that we would be better off not existing at all. Given the dual technology trends mentioned above, all it could take later this century is a single person or group to unilaterally end the great experiment called civilization forever.
It is considerations like these that have led risk scholars — some at top universities around the world — to specify disturbingly high probabilities of global disaster in the future. For example, the philosopher John Leslie claims that humanity has a 30 percent chance of extinction in the next five centuries. Less optimistically, an “informal” survey of experts at a conference hosted by Oxford University’s Future of Humanity Institute puts the probability of human extinction before 2100 at 19 percent. And Lord Martin Rees, co-founder of the Centre for the Study of Existential Risk at Cambridge University, argues that civilization has no better than a 50-50 likelihood of enduring into the next century.
To put this number in perspective, it means that the average American is about 4,000 times more likely to witness civilization implode than to die in an “air and space transport accident.” A child born today has a good chance of living to see the collapse of civilization, according to our best estimates.
Returning to religion, recent polls show that a huge portion of religious people believe that the end of the world is imminent. For example, a 2010 survey found that 41 percent of Christians in the U.S. believe that Jesus will either “definitely” or “probably” return by 2050. Similarly, 83 percent of Muslims in Afghanistan and 72 percent in Iraq claim that the Mahdi, Islam’s end-of-days messianic figure, will return within their lifetimes. The tragedy here, from a scientific perspective, is that such individuals are worried about the wrong apocalypse! Much more likely are catastrophes, calamities and cataclysms that cause unprecedented (and pointless) human suffering in a universe without any external source of purpose or meaning. At the extreme, an existential risk could tip our species into the eternal grave of extinction.
In a sense, though, religious people and scientists agree: We are in a unique moment of human history, one marked by an exceptionally high probability of disaster. The difference is that, for religious people, utopia stands on the other side of the apocalypse, whereas for scientists, there is nothing but darkness. To be clear, the situation is not by any means hopeless. In fact, there is hardly a threat before us — from climate change to the sixth mass extinction, from apocalyptic terrorism to a superintelligence takeover — that is inevitable. But without a concerted collective effort to avert catastrophe, the future could be as bad as any dystopian sci-fi writer has imagined.
Parts of this article draw from my forthcoming book “Morality, Foresight, and Human Flourishing: An Introduction to Existential Risks.”
Phil Torres is the founder of the X-Risks Institute and author of The End: What Science and Religion Tell Us About the Apocalypse. He’s on Twitter @xriskology.
‘Shattered,’ a campaign tell-all fueled by anonymous sources, outlines a generational political disaster
There is a critical scene in Shattered, the new behind-the-scenes campaign diary by Jonathan Allen and Amie Parnes, in which staffers in the Hillary Clinton campaign begin to bicker with one another. At the end of Chapter One, which is devoted entirely to that campaign’s exhausting and fruitless search for a plausible explanation for why Hillary was running, Allen and Parnes address the infighting problem.
“All of the jockeying might have been all right, but for a root problem that confounded everyone on the campaign and outside it,” they wrote. “Hillary had been running for president for almost a decade and still didn’t really have a rationale.”
Allen and Parnes here quoted a Clinton aide who jokingly summed up Clinton’s real motivation:
“I would have had a reason for running,” one of her top aides said, “or I wouldn’t have run.”
The beleaguered Clinton staff spent the better part of two years trying to roll this insane tautology – “I have a reason for running because no one runs without a reason” – into the White House. It was a Beltway take on the classic Descartes formulation: “I seek re-election, therefore I am… seeking re-election.”
Shattered is sourced almost entirely to figures inside the Clinton campaign who were and are deeply loyal to Clinton. Yet those sources tell of a campaign that spent nearly two years paralyzed by simple existential questions: Why are we running? What do we stand for?
If you’re wondering what the point of rehashing all this now might be, consider that the responsibility for opposing Donald Trump going forward still rests with the (mostly anonymous) voices described in this book.
What Allen and Parnes captured in Shattered was a far more revealing portrait of the Democratic Party intelligentsia than, say, the WikiLeaks dumps. And while the book is profoundly unflattering to Hillary Clinton, the problem it describes really has nothing to do with Secretary Clinton.
The real protagonist of this book is a Washington political establishment that has lost the ability to explain itself or its motives to people outside the Beltway.
In fact, it shines through in the book that the voters’ need to understand why this or that person is running for office is viewed in Washington as little more than an annoying problem.
In the Clinton run, that problem became such a millstone around the neck of the campaign that staffers began to flirt with the idea of sharing the uninspiring truth with voters. Stumped for months by how to explain why their candidate wanted to be president, Clinton staffers began toying with the idea of seeing how “Because it’s her turn” might fly as a public rallying cry.
This passage describes the mood inside the campaign early in the Iowa race (emphasis mine):
“There wasn’t a real clear sense of why she was in it. Minus that, people want to assign their own motivations – at the very best, a politician who thinks it’s her turn,” one campaign staffer said. “It was true and earnest, but also received well. We were talking to Democrats, who largely didn’t think she was evil.”
That their own voters “largely” didn’t think the candidate’s real reason for running was evil qualified as good news. The book is filled with similar scenes of brutal unintentional comedy.
In May of 2015, as Hillary was planning her first major TV interview – an appearance the campaign hoped would put to rest criticism that Hillary was avoiding the press over the burgeoning email scandal – communications chief Jennifer Palmieri asked Huma Abedin to ask Hillary who she wanted to conduct the interview. (There are a lot of these games of “telephone” in the book, as only a tiny group of people had access to the increasingly secretive candidate.)
The answer that came back was that Hillary wanted to do the interview with “Brianna.” Palmieri took this to mean CNN’s Brianna Keilar, and worked to set up the interview, which aired on July 7th of that year.
Unfortunately, Keilar was not particularly gentle in her conduct of the interview. Among other things, she asked Hillary questions like, “Would you vote for someone you didn’t trust?” An aide describes Hillary as “staring daggers” at Keilar. Internally, the interview was viewed as a disaster.
It turned out it was all a mistake. Hillary had not wanted Brianna Keilar as an interviewer, but rather Bianna Golodryga of Yahoo! News, an excellent interviewer in her own right, who also happens to be the spouse of longtime Clinton administration aide Peter Orszag.
This “I said lunch, not launch!” slapstick mishap underscored for the Clinton campaign the hazards of venturing one millimeter outside the circle of trust. In one early conference call with speechwriters, Clinton sounded reserved:
“Though she was speaking with a small group made up mostly of intimates, she sounded like she was addressing a roomful of supporters – inhibited by the concern that whatever she said might be leaked to the press.”
This traced back to 2008, a failed run that the Clintons had concluded was due to the disloyalty and treachery of staff and other Democrats. After that race, Hillary had aides create “loyalty scores” (from one for most loyal, to seven for most treacherous) for members of Congress. Bill Clinton since 2008 had “campaigned against some of the sevens” to “help knock them out of office,” apparently to purify the Dem ranks heading into 2016.
Beyond that, Hillary after 2008 conducted a unique autopsy of her failed campaign. This reportedly included personally going back and reading through the email messages of her staffers:
“She instructed a trusted aide to access the campaign’s server and download the messages sent and received by top staffers. … She believed her campaign had failed her – not the other way around – and she wanted ‘to see who was talking to who, who was leaking to who,’ said a source familiar with the operation.”
Some will say this Nixonesque prying into her staff’s communications will make complaints about leaked emails ring a little hollow.
Who knows about that. Reading your employees’ emails isn’t nearly the same as having an outsider leak them all over the world. Still, such a criticism would miss the point, which is that Hillary was looking in the wrong place for a reason for her 2008 loss. That she was convinced her staff was at fault makes sense, as Washington politicians tend to view everything through an insider lens.
Most don’t see elections as organic movements within populations of millions, but as dueling contests of “whip-smart” organizers who know how to get the cattle to vote the right way. If someone wins an election, the inevitable Beltway conclusion is that the winner had better puppeteers.
The Clinton campaign in 2016, for instance, never saw the Bernie Sanders campaign as being driven by millions of people who over the course of decades had become dissatisfied with the party. They instead saw one cheap stunt pulled by an illegitimate back-bencher, foolishness that would be ended if Sanders himself could somehow be removed.
“Bill and Hillary had wanted to put [Sanders] down like a junkyard dog early on,” Allen and Parnes wrote. The only reason they didn’t, they explained, was an irritating chance problem: Sanders “was liked,” which meant going negative would backfire.
Hillary had had the same problem with Barack Obama, with whom she and her husband had elected to go heavily negative in 2008, only to see that strategy go very wrong. “It boomeranged,” as it’s put in Shattered.
The Clinton campaign was convinced that Obama won in 2008 not because he was a better candidate, or buoyed by an electorate that was disgusted with the Iraq War. Obama won, they believed, because he had a better campaign operation – i.e., better Washingtonian puppeteers. In The Right Stuff terms, Obama’s Germans were better than Hillary’s Germans.
They were determined not to make the same mistake in 2016. Here, the thought process of campaign chief Robby Mook is described:
“Mook knew that Hillary viewed almost every early decision through a 2008 lens: she thought almost everything her own campaign had done was flawed and everything Obama’s had done was pristine.”
Since Obama had spent efficiently and Hillary in 2008 had not, this led to spending cutbacks in the 2016 race in crucial areas, including the hiring of outreach staff in states like Michigan. A string of similarly insane, self-defeating decisions followed. As the book puts it, the “obsession with efficiency had come at the cost of broad voter contact in states that would become important battlegrounds.”
If the ending to this story were anything other than Donald Trump being elected president, Shattered would be an awesome comedy, like a Kafka novel – a lunatic bureaucracy devouring itself. But since the ending is the opposite of funny, it will likely be consumed as a cautionary tale.
Shattered is what happens when political parties become too disconnected from their voters. Even if you think the election was stolen, any Democrat who reads this book will come away believing he or she belongs to a party stuck in a profound identity crisis. Trump or no Trump, the Democrats need therapy – and soon.