The 2,000-Year History of GPS Tracking

Tue Apr. 15, 2014 3:00 AM PDT
Egyptian geographer Claudius Ptolemy and Hiawatha Bray’s “You Are Here”

Boston Globe technology writer Hiawatha Bray recalls the moment that inspired him to write his new book, You Are Here: From the Compass to GPS, the History and Future of How We Find Ourselves. “I got a phone around 2003 or so,” he says. “And when you turned the phone on—it was a Verizon dumb phone, it wasn’t anything fancy—it said, ‘GPS’. And I said, ‘GPS? There’s GPS in my phone?’” He asked around and discovered that yes, there was GPS in his phone, due to a 1994 FCC ruling. At the time, cellphone usage was increasing rapidly, but 911 and other emergency responders could only accurately track the location of land line callers. So the FCC decided that cellphone providers like Verizon must be able to give emergency responders a more accurate location of cellphone users calling 911. After discovering this, “It hit me,” Bray says. “We were about to enter a world in which…everybody had a cellphone, and that would also mean that we would know where everybody was. Somebody ought to write about that!”

So he began researching transformative events that led to our new ability to navigate (almost) anywhere. In addition, he discovered the military-led GPS and government-led mapping technologies that helped create new digital industries. The result of his curiosity is You Are Here, an entertaining, detailed history of how we evolved from primitive navigation tools to our current state of instant digital mapping—and, of course, governments’ subsequent ability to track us. The book was finished prior to the recent disappearance of Malaysia Airlines flight 370, but Bray says gaps in navigation and communication like that are now “few and far between.”

Here are 13 pivotal moments in the history of GPS tracking and digital mapping that Bray points out in You Are Here:

1st century: The Chinese begin writing about mysterious ladles made of lodestone. The ladle handles always point south when used during fortune-telling rituals. In the following centuries, lodestone’s magnetic abilities lead to the development of the first compasses.

Image: ladle

Model of a Han Dynasty south-indicating ladle Wikimedia Commons

2nd century: Ptolemy’s Geography is published and sets the standard for maps that use latitude and longitude.

Image: Ptolemy map

Ptolemy’s 2nd-century world map (redrawn in the 15th century) Wikimedia Commons

1473: Abraham Zacuto begins working on solar declination tables. They take him five years, but once finished, the tables allow sailors to determine their latitude on any ocean.

Image: declination tables

The Great Composition by Abraham Zacuto. (A 17th-century copy of the manuscript originally written by Zacuto in 1491.) Courtesy of The Library of The Jewish Theological Seminary

1887: German physicist Heinrich Hertz creates electromagnetic waves, proof that electricity, magnetism, and light are related. His discovery inspires other inventors to experiment with radio and wireless transmissions.

Image: Hertz

The Hertz resonator John Jenkins. Sparkmuseum.com

1895: Italian inventor Guglielmo Marconi, one of those inventors inspired by Hertz’s experiments, attaches his radio transmitter antenna to the earth and sends telegraph messages miles away. Bray notes that there were many people before Marconi who had developed means of wireless communication. “Saying that Marconi invented the radio is like saying that Columbus discovered America,” he writes. But sending messages over long distances was Marconi’s great breakthrough.

Image: Marconi

Inventor Guglielmo Marconi in 1901, operating an apparatus similar to the one he used to transmit the first wireless signal across the Atlantic Wikimedia Commons

1958: Approximately six months after the Soviets launched Sputnik, Frank McClure, the research director at the Johns Hopkins Applied Physics Laboratory, calls physicists William Guier and George Weiffenbach into his office. Guier and Weiffenbach had been using radio receivers to listen to Sputnik’s consistent electronic beeping and calculate the Soviet satellite’s location; McClure wants to know whether the process can work in reverse, allowing a listener on the ground to use a satellite’s signal to determine his own position on Earth. The foundation for GPS tracking is born.

1969: A pair of Bell Labs scientists named Willard Boyle and George Smith create a silicon chip that records light and converts it into digital data. It is called a charge-coupled device, or CCD, and serves as the basis for the digital photography used in spy and mapping satellites.

1976: The top-secret, school-bus-size KH-11 satellite is launched. It uses Boyle and Smith’s CCD technology to take the first digital spy photographs. Prior to this digital technology, actual film was used for making spy photographs. It was a risky and dangerous venture for pilots like Francis Gary Powers, who was shot down while flying a U-2 spy plane and taking film photographs over the Soviet Union in 1960.

Image: KH-11 image

KH-11 satellite photo showing construction of a Kiev-class aircraft carrier Wikimedia Commons

1983: Korean Air Lines flight 007 is shot down after leaving Anchorage, Alaska, and veering into Soviet airspace. All 269 people on board are killed, including Georgia Democratic Rep. Larry McDonald. Two weeks after the attack, President Ronald Reagan directs the military’s GPS technology to be made available for civilian use so that similar tragedies will not be repeated. Bray notes, however, that GPS technology had always been intended to be made public eventually.

1989: The US Census Bureau releases (PDF) TIGER (Topologically Integrated Geographic Encoding and Referencing) into the public domain. The digital map data allows any individual or company to create virtual maps.

1994: The FCC declares that wireless carriers must find ways for emergency services to locate mobile 911 callers. Cellphone companies choose to use their cellphone towers to comply. However, entrepreneurs begin to see the potential for GPS-integrated phones, as well. Bray highlights SnapTrack, a company that figures out early on how to squeeze GPS systems into phones—and is purchased by Qualcomm in 2000 for $1 billion.

1996: GeoSystems launches an internet-based mapping service called MapQuest, which uses the Census Bureau’s public-domain mapping data. It attracts hundreds of thousands of users and is purchased by AOL four years later for $1.1 billion.

2004: Google buys Australian mapping startup Where 2 Technologies and American satellite photography company Keyhole for undisclosed amounts. The next year, they launch Google Maps, which is now the most-used mobile app in the world.

2012: The Supreme Court ruling in United States v. Jones (PDF) restricts police usage of GPS to track suspected criminals. Bray tells the story of Antoine Jones, who was convicted of dealing cocaine after police placed a GPS device on his wife’s Jeep to track his movements. The court’s decision in his case is unanimous: The GPS device had been placed without a valid search warrant. Despite the unanimous decision, just five justices signed off on the majority opinion. Others wanted further privacy protections in such cases—a mixed decision that leaves future battles for privacy open to interpretation.

 

http://www.motherjones.com/mixed-media/2014/04/you-are-here-book-hiawatha-bray-gps-navigation

What We Lose When We Rip the Heart Out of Arts Education



It’s National Poetry Month, but if the Common Core has its way, our children will hardly know what poetry is.

Photo Credit: Aaron Amat via Shutterstock.com

“No, no. You’ve got something the test and machines will never be able to measure: you’re artistic. That’s one of the tragedies of our times, that no machine has ever been built that can recognize that quality, appreciate it, foster it, sympathize with it.” —Paul Proteus to his wife Anita in Kurt Vonnegut’s Player Piano

“So much depends upon a red wheel barrow glazed with rain water beside the white chickens” is, essentially, a grammatical sentence in the English language. While the syntax is somewhat out of the norm, the diction is accessible to small children—the hardest word likely being “depends.” But “The Red Wheelbarrow” by William Carlos Williams is much more than a sentence; it is a poem:

so much depends
upon

a red wheel
barrow

glazed with rain
water

beside the white
chickens.

A relatively simple sentence shaped into purposeful lines and stanzas becomes poetry. And like Langston Hughes’ “Harlem” and Gwendolyn Brooks’ “We Real Cool,” it sparks in me a profoundly important response each time I read these poems: I wish I had written that. It is the same awe and wonder I felt as a shy, self-conscious teenager when I bought, collected and read comic books, marveling at the artwork I wished I had drawn.

Will we wake one morning soon to find the carcasses of poems washed up on the beach by the tsunami of the Common Core?

That question, especially during National Poetry Month, haunts me more every day, notably because of the double impending doom augured by the Common Core: the rise of nonfiction in (and the concurrent erasing of poetry and fiction from) the ELA curriculum, and the mantra-of-the-moment, “close reading” (the sheep’s clothing for that familiar old wolf New Criticism).

We have come to a moment in the history of the U.S. when we no longer even pretend to care about art. And poetry is the most human of the arts—the very human effort to make order out of chaos, meaning out of the meaningless: “Daddy, daddy, you bastard, I’m through” (Sylvia Plath, “Daddy”).

***

The course was speech, taught by Mr. Brannon. I was a freshman at a junior college just 15-20 miles from my home. Despite that proximity, my father insisted I live on campus. But that class and those first two years of college were more than living on campus; they were the essential beginning of my life.

In one of the earliest classes, Mr. Brannon read aloud and gave us a copy of “[in Just-]” by e. e. cummings. I imagine that moment was, for me, what many people describe as a religious experience. That was more than 30 years ago, but I own two precious books that followed from that day in class: cummings’ Complete Poems and Selected Poems. Several years later, Emily Dickinson’s Complete Poems would join my commitment to reading every poem by those poets who made me respond over and over, I wish I had written that.

But my introduction to cummings was more than just finding the poetry I wanted to read; it was when I realized I was a poet. Now, when the words “j was young&happy” come to me, I know there is work to do—I recognize the gift of poetry.

***

As a high school English teacher, I divided my academic year into quarters by genre/form: nonfiction, poetry, short fiction, and novels/plays. The poetry quarter, when announced to students, initially received moans and even direct complaints: “I hate poetry.” That always broke my heart. Life and school had already taken something very precious from these young people:

children guessed (but only a few
and down they forgot as up they grew…
                              (“[anyone lived in a pretty how town],” e.e. cummings)

I began to teach poetry in conjunction with popular songs. Although my students in rural South Carolina were overwhelmingly country music fans, I focused my nine weeks of poetry on the songs of alternative group R.E.M. At first, that too elicited moans from students in those early days of exploring poetry (see that unit on the blog “There’s time to teach”).

Concurrently, throughout my high school teaching career, students would gather in my room during our long mid-morning break and lunch (much to the chagrin of administration). And almost always, we played music, even closing the door so two of my students could dance and sing and laugh along with the Violent Femmes.

Many of those students are in their 30s and 40s, but it is common for them to contact me—often on Facebook—and recall fondly R.E.M. and our poetry unit. Those days meant something to them that lingers, that matters in ways that cannot be measured. It was an oasis of happiness in their lives at school.

***

e.e. cummings begins “since feeling is first,” and then adds:

my blood approves,
and kisses are better fate
than wisdom
lady i swear by all flowers. Don’t cry
—the best gesture of my brain is less than
your eyelids’ flutter….

Each year when my students and I examined this poem, we would discuss that cummings—in Andrew Marvell fashion—offers an argument that is profoundly unlike what parents, teachers, preachers, and politicians claim.

I often paired this poem with Coldplay’s “The Scientist,” focusing on:

I was just guessing at numbers and figures
Pulling your puzzles apart
Questions of science, science and progress
Do not speak as loud as my heart

Especially for teenagers, this question, this tension between heart and mind, mattered. Just as it recurs in the words of poets and musicians over decades, centuries. Poetry, as with all art, is the expressed heart—that quest to rise above our corporeal humanness:

               Bold Lover, never, never canst thou kiss,
Though winning near the goal yet, do not grieve;
       She cannot fade, though thou hast not thy bliss,
               For ever wilt thou love, and she be fair!
                                            (“Ode on a Grecian Urn,” John Keats)

***

I have loved a few people intensely—so deeply that my love, I believe, resides permanently in my bones. One such love is my daughter, and she now carries the next human who will add to that ache of being fully human—loving another beyond words.

And that is poetry.

Poetry is not identifying iambic pentameter on a poetry test or discussing the nuances of enjambment in an analysis of a Dickinson poem.

Poems are not fodder for close reading.

Poetry is the ineluctable “Oh my heart” that comes from living fully in the moment, the moment that draws us to words as well as inspires us toward words.

We read a poem, we listen to a song, and our hearts rise out of our eyes as tears.

That is poetry.

Like the picture books of our childhood, poetry must be a part of our learning, essential to our school days—each poem an oasis of happiness that “machines will never be able to measure.”

***

Will we wake one morning to find the carcasses of poems washed up on the beach by the tsunami of the Common Core?

Maybe the doomsayers are wrong. Maybe poetry will not be erased from our classrooms. School with less poetry is school with less heart. School with no poetry is school with no heart.

Both are tragic mistakes, because if school needs anything, it is more heart. And poetry? Oh my heart.

This piece originally appeared on the Becoming Radical blog.

New study finds US to be ruled by oligarchic elite

by Jerome Roos on April 17, 2014

Post image for New study finds US to be ruled by oligarchic elite

Political scientists show that the average American has “near-zero” influence on policy outcomes, but their groundbreaking study is not without problems.

 

It’s not every day that an academic article in the arcane world of American political science makes headlines around the world, but then again, these aren’t normal days either. On Wednesday, various mainstream media outlets — including even the conservative British daily The Telegraph — ran a series of articles with essentially the same title: “Study finds that US is an oligarchy.” Or, as the Washington Post summed up: “Rich people rule!” The paper, according to the review in the Post, “should reshape how we think about American democracy.”

The conclusion sounds like it could have come straight out of a general assembly or drum circle at Zuccotti Park, but the authors of the paper in question — two Professors of Politics at Princeton and Northwestern University — aren’t quite of the radical dreadlocked variety. No, like Piketty’s book, this article is real “science”. It’s even got numbers in it! Martin Gilens of Princeton and Benjamin Page of Northwestern University took a dataset of 1,779 policy issues, ran a bunch of regressions, and basically found that the United States is not a democracy after all:

Multivariate analysis indicates that economic elites and organized groups representing business interests have substantial independent impacts on U.S. government policy, while average citizens and mass-based interest groups have little or no independent influence. The results provide substantial support for theories of Economic Elite Domination and for theories of Biased Pluralism, but not for theories of Majoritarian Electoral Democracy or Majoritarian Pluralism.
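
To make the shape of that analysis concrete, here is a minimal, purely illustrative sketch in Python. It is not Gilens and Page’s code, data or exact model; the variable names and numbers below are hypothetical and synthetic, and the sketch only mimics the basic setup of regressing policy adoption on the preferences of median-income respondents, affluent respondents and organized interest groups.

    # Illustrative only: NOT Gilens and Page's code, data or exact model.
    # It mimics the shape of their multivariate setup: regress whether a
    # policy change was adopted on the preferences of median-income
    # respondents, affluent respondents and organized interest groups.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 1779  # the number of policy issues in the Gilens/Page dataset

    # Synthetic predictors (hypothetical names): support for each policy at
    # the 50th and 90th income percentiles, plus a net interest-group score.
    pref_median = rng.uniform(0, 1, n)
    pref_elite = rng.uniform(0, 1, n)
    interest_groups = rng.normal(0, 1, n)

    # Synthetic outcome, constructed so that elites and organized interests
    # dominate -- the qualitative pattern the paper reports.
    log_odds = -1.5 + 0.2 * pref_median + 2.0 * pref_elite + 1.0 * interest_groups
    adopted = rng.binomial(1, 1 / (1 + np.exp(-log_odds)))

    X = sm.add_constant(np.column_stack([pref_median, pref_elite, interest_groups]))
    result = sm.Logit(adopted, X).fit(disp=False)
    print(result.summary(xname=["const", "pref_median", "pref_elite", "interest_groups"]))

On synthetic data like this the fitted coefficients simply recover the pattern that was built in; the substance of the paper lies in finding the same pattern in real survey and policy data.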

The findings, of course, are both very interesting and very obvious. What Gilens and Page claim to have empirically demonstrated is that policy outcomes by and large favor the interests of business and the wealthiest segment of the population, while the preferences of the vast majority of Americans are of little to no consequence for policy outcomes. As the authors show, this new data backs up the conclusions of a number of long-forgotten studies from the 1950s and 1960s — not least the landmark contributions by C.W. Mills and Ralph Miliband — that tried to debunk the assertion of mainstream pluralist scholars that no single interest group dominates US policymaking.

But while Gilens and Page’s study will undoubtedly be considered a milestone in the study of business power, there’s also a risk in focusing too narrowly on the elites and their interest groups themselves; namely the risk of losing sight of the broader set of social relations and institutional arrangements in which they are embedded. What I am referring to, of course, is the dreaded C-word: capitalism — a term that appears only once in the main body of Gilens and Page’s text, in a superficial reference to The Communist Manifesto, whose claims are quickly dismissed as empirically untestable. How can you talk about oligarchy and economic elites without talking about capitalism?

What’s missing from the analysis is therefore precisely what was missing from C.W. Mills’ and Miliband’s studies: an account of the nature of the capitalist state as such. By branding the US political system an “oligarchy”, the authors conveniently sidestep an even thornier question: what if oligarchy, as opposed to democracy, is actually the natural political form in capitalist society? What if the capitalist state is by its very definition an oligarchic form of domination? If that’s the case, the authors have merely proved the obvious: that the United States is a thoroughly capitalist society. Congratulations for figuring that one out! They should have just called a spade a spade.

That, of course, wouldn’t have raised many eyebrows. But it’s worth noting that this was precisely the critique that Nicos Poulantzas leveled at Ralph Miliband in the New Left Review in the early 1970s — and it doesn’t take an Althusserian structuralist to see that he had a point. Miliband’s study of capitalist elites, Poulantzas showed, was very useful for debunking pluralist illusions about the democratic nature of US politics, but by focusing narrowly on elite preferences and the “instrumental” use of political and economic resources to influence policy, Miliband’s empiricism ceded way too much methodological ground to “bourgeois” political science. By trying to painstakingly prove the existence of a causal relationship between instrumental elite behavior and policy outcomes, Miliband ended up missing the bigger picture: the class-bias inherent in the capitalist state itself, irrespective of who occupies it.

These methodological and theoretical limitations have consequences that extend far beyond the academic debate: at the end of the day, these are political questions. The way we perceive business power and define the capitalist state will inevitably have serious implications for our political strategies. The danger with empirical studies that narrowly emphasize the role of elites at the expense of the deeper structural sources of capitalist power is that they will end up reinforcing the illusion that simply replacing the elites and “taking money out of politics” would be sufficient to restore democracy to its past glory. That, of course, would be profoundly misleading. If we are serious about unseating the oligarchs from power, let’s make sure not to get carried away by the numbers and not to lose sight of the bigger picture.

Jerome Roos is a PhD candidate in International Political Economy at the European University Institute, and founding editor of ROAR Magazine.

The New Economic Events Giving Lie to the Fiction That We Are All Selfish, Rational Materialists

The commons lies at the heart of a major cultural and social shift now underway.

Photo Credit: AllanGregg; Screenshot / YouTube.com

Jeremy Rifkin’s new book, “The Zero Marginal Cost Society,” brings welcome new attention to the commons just as it begins to explode in countless new directions. His book focuses on one of the most significant vectors of commons-based innovation — the Internet and digital technologies — and documents how the incremental costs of nearly everything are rapidly diminishing, often to zero. Rifkin explored the sweeping implications of this trend in an excerpt from his book, pointing to the “eclipse of capitalism” in the decades ahead.

But it’s worth noting that the commons is not just an Internet phenomenon or a matter of economics. The commons lies at the heart of a major cultural and social shift now underway. People’s attitudes about corporate property rights and neoliberal capitalism are changing as cooperative endeavors — on digital networks and elsewhere — become more feasible and attractive. This can be seen in the proliferation of hackerspaces and Fablabs, in the growth of alternative currencies, in many land trusts and cooperatives and in seed-sharing collectives and countless natural resource commons.

Beneath the radar screen of mainstream politics, which remains largely clueless about such cultural trends on the edge, a new breed of commoners is building the vision of a very different kind of society, project by project. This new universe of social activity is being built on the foundation of a very different ethics and social logic than that of homo economicus — the economist’s fiction that we are all selfish, utility-maximizing, rational materialists.

Durable projects based on social cooperation are producing enormous amounts of wealth; it’s just that this wealth is generally not monetized or traded. It’s socially or ecologically embedded wealth that is managed by self-styled commoners themselves. Typically, such commoners act more as stewards of their common wealth than as owners who treat it as private capital. Commoners realize that a life defined by impersonal transactions is not as rich or satisfying as one defined by abiding relationships. The larger trends toward zero-marginal-cost production make it perfectly logical for people to seek out commons-based alternatives.

You can find these alternatives popping up all over: in the 10,000-plus open access scientific journals whose research is freely shareable to anyone and in community gardens that produce both fresh vegetables and neighborliness. In hundreds of “timebanks” that let people meet basic needs through time-barters, and in highly productive, ecologically minded commons-based agriculture.

Economists tend to ignore such wealth because it generally doesn’t involve market activity. No cash is exchanged, no legal contracts signed and no measurable Gross Domestic Product is generated. But the wealth of the commons is not accumulated like capital; its vitality comes from being circulated. As I describe in my new book, “Think Like a Commoner,” the story of our time is the rise of the commons as a new way to emancipate oneself from predatory markets and to collaborate with peers to protect and expand one’s shared wealth. This is a story that is being played out in countless digital arenas, as Rifkin documents, but also in such diverse contexts as cities, farming, museums, theaters and indigenous communities.

One reason that so many commons arise and flourish is because they help their participants meet important basic needs in fair, responsive and socially satisfying ways. That’s quite attractive to those who are otherwise held captive by conventional, predatory markets. Big agriculture is more concerned with efficiency and profit than ecological stewardship. Large transnationals are more interested in rip-and-run resource extraction (mining, fracking, timber) than in the protection of sacred lands and time-honored ways of life. “Copyright industries” like Hollywood and record labels want to treat all of culture as tightly controlled “product,” not as something that is freely shared and built upon.

Nowadays the commons has a special appeal for people of the global South who are often victimized by the “enclosures” inflicted by neoliberal investment and trade policies. Enclosures are the act of privatizing and commodifying previously shared resources. For example, millions of acres of land in Africa, Asia and Latin America are currently being seized by investors in a massive international land grab. Hedge funds and even the governments of South Korea, Saudi Arabia and China are enacting an eerie replay of the English enclosure movement. Commoners who have worked the land for generations as a customary right are being forced to migrate to cities in search of work, where they often end up as paupers and sweatshop employees: a modern-day replay of Charles Dickens’ novels.

By the lights of modern economic theory, it’s all for the best because it promotes “development” (i.e., consumerism and other market dependencies). But many commoners are now fighting the dispossession and dependencies that enclosures entail by struggling to retain some measure of dignity and self-determination through their commons. The International Land Alliance estimates that 2 billion people around the world depend upon subsistence commons of forests, fisheries, arable land, water and wild game to meet their everyday needs.

Strangely, the leading introductory economics textbooks in the U.S. virtually ignore the commons except for the obligatory warning about the “tragedy of the commons.” They prefer not to recognize that the commons represents an entirely viable but different paradigm of “development” – one that can transcend the unsustainable consumerism, cultural disintegration and economic growth of our time. As the late Nobel Prize winner Elinor Ostrom showed, commons are an entirely sustainable, ecologically friendly model of resource management, contrary to the “tragedy” parable.

Commoners are not all alike. They have many profound differences in their governance systems, management practices and cultural values. And commons are not without their conflicts, struggles and failures. That said, most commoners tend to share fundamental commitments to participation, openness, inclusiveness, social equity, ecological respect and human rights.

The politics of the commons movement can be confounding to conventional observers because political goals are not the paramount priority; protection of the commons is. Commoners tend to be more focused on “prepolitical” social activity and relationships, which is why commons are embraced by such a wide variety of people. As German commons advocate Silke Helfrich notes in The Wealth of the Commons, “Commons draw from the best of all political ideologies.” Conservatives like the tendency of commons to promote responsibility. Liberals are pleased with the focus on equality and basic social entitlement. Libertarians like the emphasis on individual initiative. And leftists like the idea of limiting the scope of the Market.

It is important to realize that the commons is not a discussion about objects, but a discussion about who we are and how we treat each other. What decisions are being made about our resources? Does economic activity satisfy basic human needs and honor human rights and dignity? These kinds of discussions are not often heard in conventional business and policy circles, alas.

To conventional minds, the idea of the commons as a paradigm of social governance appears either utopian or communistic, or at the very least, impractical. But a diverse, eclectic universe of commons around the world demonstrates otherwise. It is the neoliberal project of ever-expanding consumption on a global scale that is the utopian, totalistic dream. It manifestly cannot fulfill its mythological vision of human progress through ubiquitous market activity and greater heaps of private consumption, if only because it demands more from Nature than it can possibly deliver – while inflicting too much social inequity and disruption as well.

Fortunately, the Internet and indigenous peoples, the re-localization movement and hackers, community foresters and fishing cooperatives and many, many others, are showing that the commons can be an effective vehicle for social and political emancipation. Jeremy Rifkin’s astute analysis of this powerful trend will help open up a much-needed discussion in the stodgy precincts of conventional economics.

David A. Bollier is an author, activist, blogger and independent scholar with a primary focus on “the commons” as a new paradigm for economics, politics, and culture. He is the founding editor of Onthecommons.org (2002-2010), co-founder and principal of the international consulting project Commons Strategy Group, and co-director of the Commons Law Project. Bollier is the author of numerous books, including “Think Like a Commoner: A Short Introduction to the Life of the Commons.”

 http://www.alternet.org/economy/were-about-enter-whole-new-era-economics-and-its-going-make-everyone-feel-lot-more-wealthy?akid=11716.265072.WdcnEx&rd=1&src=newsletter981596&t=7&paging=off&current_page=1#bookmark

How the Internet Is Taking Away America’s Religion

Back in 1990, about 8 percent of the U.S. population had no religious preference. By 2010, this percentage had more than doubled to 18 percent. That’s a difference of about 25 million people, all of whom have somehow lost their religion.

That raises an obvious question: how come? Why are Americans losing their faith?

Today, we get a possible answer thanks to the work of Allen Downey, a computer scientist at the Olin College of Engineering in Massachusetts, who has analyzed the data in detail. He says that the demise is the result of several factors but the most controversial of these is the rise of the Internet. He concludes that the increase in Internet use in the last two decades has caused a significant drop in religious affiliation.

Downey’s data comes from the General Social Survey, a widely respected sociological survey carried out by the University of Chicago that has regularly measured people’s attitudes and demographics since 1972.

In that time, the General Social Survey has asked people questions such as: “what is your religious preference?” and “in what religion were you raised?” It also collects data on each respondent’s age, level of education, socioeconomic group, and so on. And in the Internet era, it has asked how long each person spends online. The total data set that Downey used consists of responses from almost 9,000 people.

Downey’s approach is to determine how the drop in religious affiliation correlates with other elements of the survey such as religious upbringing, socioeconomic status, education, and so on.
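
As a rough illustration of what such an analysis can look like in practice, here is a minimal sketch in Python. It is not Downey’s code: the data and column names below are synthetic and hypothetical, and a real replication would start from the General Social Survey files and the model described in his paper.

    # Minimal sketch, assuming hypothetical column names and synthetic data;
    # this is not Downey's analysis, only the general kind of model described.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(42)
    n = 2000  # synthetic "respondents"

    df = pd.DataFrame({
        "raised_relig": rng.binomial(1, 0.85, n),   # religious upbringing
        "college":      rng.binomial(1, 0.27, n),   # any college education
        "internet_hrs": rng.gamma(2.0, 2.5, n),     # weekly hours online
    })

    # Build a synthetic outcome in which upbringing raises the odds of
    # affiliation while education and internet use lower them -- the
    # qualitative pattern the article describes.
    log_odds = 0.5 + 1.2 * df["raised_relig"] - 0.3 * df["college"] - 0.08 * df["internet_hrs"]
    df["affiliated"] = rng.binomial(1, 1 / (1 + np.exp(-log_odds)))

    # Each coefficient estimates how one factor shifts the log-odds of
    # affiliation while holding the other factors constant.
    model = smf.logit("affiliated ~ raised_relig + college + internet_hrs", data=df).fit(disp=False)
    print(model.params)

In real survey data the interesting question, as the article goes on to note, is how much of the observed drop each factor can account for once the others are controlled for.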

He finds that the biggest influence on religious affiliation is religious upbringing—people who are brought up in a religion are more likely to be affiliated to that religion later.

However, the number of people with a religious upbringing has dropped since 1990. It’s easy to imagine how this inevitably leads to a fall in the number who are religious later in life. In fact, Downey’s analysis shows that this is an important factor. However, it cannot account for all of the fall or anywhere near it; the data indicate that it explains only about 25 percent of the drop.

He goes on to show that college-level education also correlates with the drop. Once again, it’s easy to imagine how contact with a wider group of people at college might contribute to a loss of religion.

The fraction of people receiving a college-level education has increased from 17.4 percent in the 1980s to 27.2 percent in the 2000s. So it’s not surprising that this is reflected in the drop in numbers claiming religious affiliation today. But although the correlation is statistically significant, it can only account for about 5 percent of the drop, so some other factor must also be involved.

That’s where the Internet comes in. In the 1980s, Internet use was essentially zero, but in 2010, 53 percent of the population spent two hours per week online and 25 percent surfed for more than 7 hours.

This increase closely matches the decrease in religious affiliation. In fact, Downey calculates that it can account for about 25 percent of the drop.

That’s a fascinating result. It implies that since 1990, the increase in Internet use has had as powerful an influence on religious affiliation as the drop in religious upbringing.

At this point, it’s worth spending a little time talking about the nature of these conclusions. What Downey has found is correlations and any statistician will tell you that correlations do not imply causation. If A is correlated with B, there can be several possible explanations. A might cause B, B might cause A, or some other factor might cause both A and B.

But that does not mean that it is impossible to draw conclusions from correlations, only that they must be properly guarded. “Correlation does provide evidence in favor of causation, especially when we can eliminate alternative explanations or have reason to believe that they are less likely,” says Downey.

For example, it’s easy to imagine that a religious upbringing causes religious affiliation later in life. However, it’s impossible for the correlation to work the other way round. Religious affiliation later in life cannot cause a religious upbringing (although it may color a person’s view of their upbringing).

It’s also straightforward to imagine how spending time on the Internet can lead to religious disaffiliation. “For people living in homogeneous communities, the Internet provides opportunities to find information about people of other religions (and none), and to interact with them personally,” says Downey. “Conversely, it is harder (but not impossible) to imagine plausible reasons why disaffiliation might cause increased Internet use.”

There is another possibility, of course: that a third unidentified factor causes both increased Internet use and religious disaffiliation. But Downey discounts this possibility. “We have controlled for most of the obvious candidates, including income, education, socioeconomic status, and rural/urban environments,” he says.

If this third factor exists, it must have specific characteristics. It would have to be something new that was increasing in prevalence during the 1990s and 2000s, just like the Internet. “It is hard to imagine what that factor might be,” says Downey.

That leaves him in little doubt that his conclusion is reasonable. “Internet use decreases the chance of religious affiliation,” he says.

But there is something else going on here too. Downey has found three factors—the drop in religious upbringing, the increase in college-level education and the increase in Internet use—that together explain about 50 percent of the drop in religious affiliation.

But what of the other 50 percent? In the data, the only factor that correlates with this is date of birth—people born later are less likely to have a religious affiliation. But as Downey points out, year of birth cannot be a causal factor. “So about half of the observed change remains unexplained,” he says.

So that leaves us with a mystery. The drop in religious upbringing and the increase in Internet use seem to be causing people to lose their faith. But something else about modern life that is not captured in this data is having an even bigger impact.

What can that be? Answers please in the comments section.

Ref: http://arxiv.org/abs/1403.5534: Religious Affiliation, Education and Internet Use

 

http://www.technologyreview.com/view/526111/how-the-internet-is-taking-away-americas-religion/

Why Atheists Like Dawkins and Hitchens Are Dead Wrong



Acolytes of Dawkins & Hitchens pretend that ignorant evangelicals represent all of religion. Here’s what they miss.

Photo Credit: ollyy/Shutterstock.com

I’m supposed to hate science. Or so I’m told.

I spent my childhood with my nose firmly placed between the pages of books on reptiles, dinosaurs, marine life and mammals. When I wasn’t busy wondering if I wanted to be more like Barbara Walters or Nancy Drew, I was busy digging holes in my parents’ backyard hoping to find lost bones of some great prehistoric mystery. I spent hours sifting through rocks that could possibly connect me to the past or, maybe, a hidden crystalline adventure inside. Potatoes were both a part of a delicious dinner and batteries for those ‘I got this’ moments; magnets repelling one another were a sorcery I needed to, somehow, defeat. The greatest teachers I ever had were Miss Frizzle and Bill Nye the Science Guy.

I also spent my childhood reciting verses from the Qur’an and a long prayer for everyone — in my family and the world — every night before going to bed. I spoke to my late grandfather, asking him to save me a spot in heaven. I went to the mosque and stepped on the shoes resting outside a prayer hall filled with worshippers. I tried fasting so I could be cool like my parents; played with prayer beads and always begged my mother to tell me more stories from the lives of the Abrahamic prophets.

With age, my wonder with religion and science did not cease. Both were, to me, extraordinary portals into the life around me that left me constantly bewildered, breathless and amazed.

Science would come to dominate my adolescent and early teenage years: papier mache cigarettes highlighting the most dangerous carcinogens, science fair projects on the virtues of chocolate consumption during menstruation; lamb lung and eye dissections, color coded notes, litmus tests on pretty papers, and disturbingly thorough study guides for five-question quizzes. My faith, too, remained operational in my day-to-day life: longer conversations with my late grandfather and all 30 Ramadan fasts, albeit with begrudging pre-dawn prayers. I attended Qur’anic recitation classes where I could not, for the life of me, recite anything that was not in English. I still read and listened to the stories of the prophets, with perhaps a greater sense of historical wonder and on occasion I would perform some of the daily prayers. Unsupervised access to the internet also led to the inevitable debates in Yahoo chat rooms about how Islam did not subjugate me as a woman. At the age of 16, I was busting out Quranic verses and references from the traditions of the Prophet Muhammad to shut up internet dwellers like Crusade563 and PopSmurf1967.

It never once occurred to me during those years, and later, that there could be any sort of a conflict between my faith and science; to me both were part of the same things: This universe and my existence within it.

And yet, here we are today being told that the two are irreconcilable; that religion begets an anti-science crusade and science pushes anti-religion valor. When did this become the only conversation on religion and science that we’re allowed to have?

This current discourse that pits faith and science against one another like Nero’s lions versus Christians — inappropriate analogy intended — borrows directly from the conflation of all religious traditions with the history and experience of Euro-American Christianity, specifically of the evangelical variety.

In my own religious tradition, Islam, there is a vibrant history of religion and science not just co-existing but informing one another intimately. Astrophysicists, chemists, biologists, alchemists, surgeons, psychologists, geographers, logicians, mathematicians – amongst so many others – would often function as theologians, saints, spiritual masters, jurists and poets as much as they would as scientists. Indeed, a quick survey of some of the most well known Muslim intellectuals of the past 1,400 years illustrates their masterful polymathy, their ability to reach across fields of expertise without blinking at any supposed “dissonance.” And, of course, this is not something exclusive to Islam; across the religious terrain we can find countless polymaths who delved into the worlds of God and science.

Despite the history of the intellectual output of, well, the whole rest of the world, contemporary discussions in this country on the relationship between science and religion take religion to consist solely, again, of Euro-American Evangelical Christianity.  Thus “religious perspectives on human origins” are not really all that encompassing. Muslims, for instance, do not believe in Christian creationism and, actually, have differences on the nature of human origin. The Muslim creationism movement, headed by Turkish author and creationist activist Adnan Oktar (known popularly by the pseudonym Harun Yahya), is actually relatively recent and borrows much from Christian creationism – including even directly copied passages and arguments from anti-evolution Christian literature.

The absence of a centralized religious clergy and authority in Sunni Islam allows for individual and scholarly theological negotiation – meaning that there is not, necessarily, a “right” answer embedded in Divine Truth to social and political questions. Some of the most influential and fundamental Islamic legal texts are filled with arguments and counter-arguments which all come from the same source (divine revelation), just different approaches to it.

In other words: There’s plenty of wiggle room and then some. On anything that is not established as theological Truth (e.g. God’s existence, the finality of Prophethood, pillars and articles of faith), there is ample room for examination, debate and disagreement, because it does not undercut the fabric of faith itself.

Muslims, generally, accept evolution as a fundamental part of the natural process; they differ, however, on human evolution – specifically the idea that humans and apes share an ancestor in common.  In the 13th century, Shi’i Persian polymath Nasir al-din al-Tusi discussed biological evolution in his book “Akhlaq-i-Nasri” (Nasirean Ethics). While al-Tusi’s theory of evolution differs from the one put forward by Charles Darwin 600 years later and the theory of evolution that we have today, he argued that the elemental source of all living things was one. From this single elemental source came four attributes of nature: water, air, soil and fire – all of which would evolve into different living species through hereditary variability. Hierarchy would emerge through differences in learning how to adapt and survive. Al-Tusi’s discussion on biological evolution and the relationship of synchronicity between animate and inanimate (how they emerge from the same source and work in tandem with one another) objects is stunning in its observational precision as well as its fusion with theistic considerations. Yet it is, at best, unacknowledged today in the Euro-centric conversation on religion and science. Why?

My point here in this conversation about religion and science’s falsely created incommensurability isn’t about the existence of God – I would like to think that ultimately there is space for belief and disbelief. I would like to also believe, however, that the conversation on belief and disbelief can move beyond the Dawkinsean vitriol that disguises bigotry as a self-righteous claim to the sanctity of science; a claim that makes science the proudly held property of the Euro-American civilization and experience.

Hoisted into popular culture by the Holy Trinity of Dawkins-Hitchens-Harris, New Atheism mirrors the very religious zealotry it claims is at the root of so much moral, political and social decay. In particular, these authors and their posse of followers have – as Nathan Lean characterized it in this publication back in March of last year – taken a particular penchant for “flirting with Islamophobia.” Instead of engaging with Islamic theology, New Atheists – the most prominent figurehead being Richard Dawkins – are more interested in ridiculing Muslims and Islam by employing the use of the same tired, racist talking points and images that situate Muslims in need of ‘enlightenment’ – or, salvation.

The Evangelical Christian Right is a formidable force to be reckoned with in American national politics; there are legitimate fears by believing, non-believing and non-caring Americans that the course of the nation, from women’s rights to education, can and will be significantly set back because of the whims of a loud and large group of citizens who refuse to acknowledge certain facts and changing realities and want the lives of all citizens to be subservient to their own will. This segment of the world’s religious topography, however, does not represent Religion or, in particular, Religion’s relationship with science.

Religion is a vast historical experience between human communities, its individual parts, the environment and something Sacred that acts as that elemental glue between everything. Science and religion are not incommensurable – and it’s time we stop treating them like they are.

 

Sana Saeed is a writer on politics with an interest in minority politics, media critique and religion in the public sphere. Follow her on Twitter @SanaSaeed.

http://www.alternet.org/belief/why-atheists-dawkins-and-hitchens-are-dead-wrong?akid=11690.265072.L-s5s2&rd=1&src=newsletter978792&t=11&paging=off&current_page=1

A Hard Rain: Noah, Revised

by LOUIS PROYECT

More Tolkien than Torah, Darren Aronofsky’s “Noah” is a cinematic tour de force that combines breathtaking CGI-based imaginary landscapes with a film score by Clint Mansell that hearkens back to Hollywood’s golden age of Bernard Herrmann and Max Steiner. Even without a single minute of dialog, the film would achieve the mesmerizing quality of Godfrey Reggio’s Qatsi trilogy, especially the last installment Naqoyqatsi, the Hopi word for “Life at War”.

Like other films that view the bible as a theme to riff on in the manner of Miles Davis improvising on a banal tune like “Billy Boy”, Aronofsky takes the material of Genesis 5:32-10:1 and shapes it according to his own aesthetic and philosophical prerogatives. As might be expected, the Christian fundamentalists are not happy with the film since it turns Noah into something of a serial killer on an unprecedented scale, acting on what he conceives of as “the Creator’s” instructions, namely to bring the human race to an end. Religious Jews who have a literalist interpretation of the bible have been far less vocal, no doubt a function of the Hasidic sects viewing all movies as diversions from Torah studies. (For those unfamiliar with Jewish dogma, the Torah encompasses the first five books of the Old Testament, which are replete with fables such as the Great Flood, many of which have inspired some classic cinematography, such as Charlton Heston splitting the Red Sea.)

Unlike the fable it is based on, Aronofsky’s Noah never received instructions about being fruitful and multiplying. His intention is to leave the planet to the animals and wind down the human race’s participation in the tree of life, to use the title of Terrence Malick’s overrated 2011 film. In my view, Aronofsky has much deeper thoughts and more sure-handed cinematic instincts than Malick could ever hope for. To pick only one scene, the massive moving carpet of animals headed toward the Ark is a CGI tour de force. Instead of a stately procession in circus parade fashion, it is more like a zoological tsunami that anticipates the great tsunami soon to follow.

Clint Mansell, whose orchestral accompaniment to this and other key scenes is so effective, has an interesting background. He was the lead singer and guitarist for the band Pop Will Eat Itself, a group that originated in 1981 and whose style incorporated hip-hop and industrial rock at one point or another. Mansell made the transition to film score composer in 1998, working on Aronofsky’s first film “Pi”, a surrealist thriller about a character named Maximillian Cohen who believed that everything in nature could be understood through numbers.

Speaking of numbers, Russell Crowe was cast perfectly as Noah given his past leading roles. As mathematician John Nash in A Beautiful Mind, who suffered from schizophrenia, he played a man hearing voices after the fashion of Noah. The voices in Nash’s head told him that he had to save the world from the Commies, while those in Noah’s assured him that “the Creator” needed to kill everybody on earth except Noah and his immediate family. Which character was more insane? That’s the real question.

Another role that prepared Crowe for his latest was as Captain of the HMS Surprise, a British warship led on an Ahab-like pursuit of a French rival during the Napoleonic wars. As Captain Jack Aubrey, Crowe was ready to sacrifice his crew and himself for the greater glory of the British monarchy just as Noah was ready to do for “the Creator”, an entity that never makes much of an appearance in Aronofsky’s film, unlike the typical Biblical epic.

One of the two revisionist elements of Aronofsky’s film that have merited the most controversy is his inclusion of a character named Tubal-Cain who is a descendant of Adam’s bad son just as Noah is a descendant of the good son Seth. Played by Ray Winstone, Tubal-Cain is the warlord ruling over all those wicked people the Creator is bent on destroying, just like an artist who burns a painting from earlier in his career that he deems inferior to his latest. Unlike a movie based on the tale of “Sodom and Gomorrah”, it is not quite clear what enraged God. After all, there are no sadomasochistic orgies going on in Tubal-Cain’s camp as he lays siege to Noah’s Ark (not that there is really anything wrong with sadomasochistic orgies). All we know from the Torah is that “The Lord saw how great the wickedness of the human race had become on the earth.” If you read the bible carefully, you’ll understand that the deity gets much more pissed off at worshipping false idols than he does over murder, theft, rape, and other acts normal people consider far more wicked. Indeed, Tubal-Cain is convinced that Noah is a mad man since his fundamentally “deep ecology” views on the need to rid the planet of the pestilent homo sapiens are at odds with God making man in his own image and giving him “dominion over the fish of the sea and over the birds of the heavens and over the livestock and over all the earth and over every creeping thing that creeps on the earth.” What’s wrong with that? Animal rights lovers and vegetarians need not apply.

The other element is “the watchers”, who are Ent-like creatures that help Noah and his family ward off Tubal-Cain’s warriors while serving as carpenters on the Ark. Instead of being tree-like monsters, they are giants made of stone who happen to be “fallen angels” trying to get on the Creator’s good side after their past transgressions. Unlike the characters in John Milton’s “Paradise Lost”, these angels seem perfectly reasonable and no threat to the established order. As is persistent throughout the film and the Old Testament itself, the Creator’s moral compass often seems more broken than those he holds dominion over.

That fundamentally strikes me as the underlying philosophical issue of Aronofsky’s film, namely the impossibility of living a “good life” on the basis of biblical myths, legends, and fables. The moral relativism of “Noah” was likely to have angered those who believe that the bible was literally written by God, even if it was close to the mark.

The film also resonates with current-day concerns over a new threat to the continued existence of humanity, namely the climate change that is capable of a new Great Flood that will unfortunately only kill the innocent rather than the wicked. What the bible never makes clear is that god is merciful to those who have capital rather than pure hearts.

Unlike the past five extinctions, the sixth that is posed by climate change and other looming environmental disasters will be as a result of human intervention rather than a deus ex machina like a meteor.

Interestingly enough, there is some scholarly support for the idea that a great flood occurred in the distant past, one that is evoked not only in the Torah but in the Babylonian Epic of Gilgamesh and Plato’s Timaeus as well.

In an article titled “Noah’s Flood Reconsidered” for the autumn 1964 issue of Iraq, a scholarly journal, M.E.L. Mallowan concluded that the flood depicted in the Epic of Gilgamesh—the obvious inspiration for Noah—occurred some time prior to 2650 BC.

Indeed, archaeologists working in the ancient city of Ur in 1928-29 found evidence of two deep pits that exposed a stratum of “clean water-laid clay”, proving in their eyes that a Noachian-type flood had occurred. However, neither the Epic of Gilgamesh nor the archaeologists viewed the flood as impacting all of humanity, only a great city and civilization that existed at the dawn of history. Despite Iraq’s reputation as desert-like, it is also subject to powerful storms that wash away everything in their path—a natural catastrophe rivaling the man-made catastrophe of George W. Bush.

It has been many years since I looked at Plato’s Timaeus—48 in fact, when I was avoiding the draft in the New School Graduate Philosophy program—but I took a quick look in preparing this article.

Like the rest of his work, this is a Socratic dialog in which the principals are sounding boards for Plato’s idealism. One of them, an Athenian named Timaeus, describes a Creator who is a lot more human than the cruel and capricious figure of the Old Testament: “Why did the Creator make the world?…He was good, and therefore not jealous, and being free from jealousy he desired that all things should be like himself.” And, like the hero of Darren Aronofsky’s “Pi”, Plato’s creator sees the natural world as one based on numbers. After creating three major entities of the existing world—body, soul, and essence—god proceeded to divide the entire mass into portions related to one another in the ratios of 1, 2, 3, 4, 9, 8, and 27.

Once Timaeus establishes the ratios that govern the known universe, he drills down into the less than perfect reality that govern our daily lives, such as those inflicted on our bodies: “When on the other hand the body, though wasted, still holds out, then the bile is expelled, like an exile from a factious state, causing associating diarrhoeas and dysenteries and similar disorders.”

Critias, another Athenian, weighs in on the ever-present danger of natural catastrophes including the one that befell Atlantis:

Now in this island of Atlantis there was a great and wonderful empire which had rule over the whole island and several others, and over parts of the continent, and, furthermore, the men of Atlantis had subjected the parts of Libya within the columns of Heracles as far as Egypt, and of Europe as far as Tyrrhenia….But afterwards there occurred violent earthquakes and floods; and in a single day and night of misfortune all your warlike men in a body sank into the earth, and the island of Atlantis in like manner disappeared in the depths of the sea. For which reason the sea in those parts is impassable and impenetrable, because there is a shoal of mud in the way; and this was caused by the subsidence of the island.

Perhaps someday archaeologists will discover evidence of a great flood that destroyed Atlantis just as they have found evidence of the flood depicted in the Epic of Gilgamesh. In late January divers discovered perfectly preserved stone-age tools between 10,000 and 11,000 years old in the Swedish bay of Hanö. Södertörn University’s Björn Nilsson, the leader of the research team, was annoyed by comparisons made in the popular press to Atlantis:

Nilsson admitted that “lousy Swedish tabloids” had blown the story out of the water by labelling the find “Sweden’s Atlantis”, even though the remnants never belonged to an actual village. The people were all nomadic at the time, he explained, so there was no village. He trumpeted, however, that the finds so far were “world-class” and “one-of-a-kind”. He added that it was extremely rare to find evidence from the Stone Age so unspoiled.

We’ll probably never know what caused these nomads to be swept away by floods, but we do know what might put Manhattan under the Atlantic in the not-too-distant future. We cannot go back in history to change the circumstances that led to such disasters, but we can control our own fate in order to save both animals and the human race. For that effort we need to rely on science and radical politics, not the Creator.

Louis Proyect blogs at http://louisproyect.org and is the moderator of the Marxism mailing list. In his spare time, he reviews films for CounterPunch.

 

http://www.counterpunch.org/2014/04/04/a-hard-rain-2/

 

The conscience and courage of Chelsea Manning

by Nozomi Hayase on April 4, 2014


Four years after WikiLeaks’ release of the Collateral Murder video, Manning’s contagious courage continues to reveal the dehumanized colonizer within.

Four years have passed since WikiLeaks’ sensational release of the classified US military video titled Collateral Murder. On April 5, 2010, the raw footage was published, depicting airstrikes by a US Army helicopter gunship in the Iraqi suburb of New Baghdad. The soldiers attacked Iraqis, killing about a dozen men wandering down a street, including two Reuters staffers, Namir Noor-Eldeen and Saeed Chmagh, in the first of three reckless attacks involving civilians.

The video opened with a quote from George Orwell: “Political language … is designed to make lies sound truthful and murder respectable, and to give the appearance of solidity to pure wind.” It gained global attention, with viewership reaching into the millions, shattering the euphemism of ‘collateral damage’ and revealing the true state of modern warfare behind the warping shield of propaganda.

Much of the media focus at the time went to analyzing whether some of the Iraqis in the video were carrying rocket-propelled grenades or AK-47s, and arguments ensued about the rules of engagement. The unfolding of these scenes calls for recognition, for us to look at these wars from a wider perspective than the narrow view offered by the establishment media lens.

Before anyone talks about the laws of armed conflict and whether the rules of engagement were broken or not, we need to ask why these armed crews were even there in the first place. We should be examining the legality of the Iraq War itself. Speaking in defense of the disclosure of classified US military documents on the Iraq War, WikiLeaks founder Julian Assange pointed out how “most wars that are started by democracies involve lying,” and noted how “the start of the Iraq war involved very serious lies that were repeated and amplified by some parts of the press.”

Iraq has never been shown to have threatened the United States, and it is common knowledge that the premise of this war was based on blatant lies. Colin Powell’s fabrications at the UN Security Council about Iraq’s supposed weapons of mass destruction were a particular low point for the US in its base war propaganda. The International Military Tribunal at Nuremberg defined a ‘war of aggression’ as an attack on another nation or people without any justification of self-defense and listed it as a major international war crime.

In a report given at a New York Commission Hearing on May 11, 1991, attorney and President Emeritus of the Center for Constitutional Rights Michael Ratner seriously questioned the conduct of the United States against Iraq:

As people living in the United States we have an obligation not to close our eyes, cover our ears and remain silent. We must not and cannot be ‘good Germans.’ We must be, as Bertrand Russell said about the crimes committed by the U.S. in Vietnam, ‘Against the Crime of Silence.’ We must bear witness to the tens of thousands of deaths for whom our government and its leaders bear responsibility and ask the question, ‘Has the United States committed war crimes with regard to its initiation and conduct of the war against Iraq?’

The questions raised by the graphic, video-game, turkey-shoot nature of this video need to be placed within their larger context, along with an examination of the justification for, or potential war crimes in, each incident in the video.

The moving imagery in the video revealed a particular mindset displayed by these US military-trained soldiers. It is the consciousness behind the gun-sight. The mind is generally blind to biases behind a perception that is trained to look at the world through the crosshairs of a gun-sight. From a broader historical perspective, one could say it is a colonial mind that controls an inception point, setting its own rules of engagement and defining the course of events and destiny of those caught in it.

“Let’s shoot. Light ‘em all up. Come on, fire!” In a series of air-to-ground attacks, a helicopter crew excitedly found a target. One man can be heard saying, “Oh, yeah, look at those dead bastards,” and another responds, “Nice.” When they find a wounded individual trying to crawl away, another man simply says, “All you gotta do is pick up a weapon,” expressing his wish to shoot him.

After finding that there were kids in the minivan that they had engaged, simply on their way to school, one man can clearly be heard blaming the victims: “It’s their fault for bringing their kids into a battle.” These civilians are no longer seen as victims and the permission to engage is manufactured by the aggressors attacking “targets” who are just trying to get away.

In the original 38-minute video recording of the scenes in New Baghdad on July 12, 2007, the past century lingers to haunt our global society. The dark shadow of colonization is carried over into the military-industrial age of the 20th century with its outward-thrusting brutality. The cynical naming of the ‘Apache’ helicopter evokes the memory of the genocide of Native Americans long ago. Native American activist Winona LaDuke once spoke of how it is common military-speak, when you leave a base in a foreign country, to say that you are heading ‘out into Indian Country.’ The brutal projection of US power into the oil-rich Middle East contains echoes of these historical ‘Indian Wars’. The unfolding scenes appear almost as if the US is glorifying and continuing these crimes against humanity from the past.

This colonial mentality and injustice, never atoned for, is now expanding into a global web of military forces that more and more serve hidden corporate goals and agendas. In Discourse on Colonialism, the French poet and author Aimé Césaire wrote how colonization brutalizes and de-civilizes even the colonizer himself:

[C]olonization … dehumanizes even the most civilized man; that colonial activity, colonial enterprise, colonial conquest, which is based on contempt for the native and justified by that contempt, inevitably tends to change him who undertakes it; that the colonizer, who in order to ease his conscience gets into the habit of seeing the other man as an animal, accustoms himself to treating him like an animal, and tends objectively to transform himself into an animal.

The real scenes of modern war on the ground stand like a mirror. Reflected in the graphic WikiLeaks video, we begin to see something about each one of us that has long escaped consciousness. In the raw image of this cruel scene, we can see a part of our culture’s collective shadow, as the barbarian degraded in the effort of ‘civilizing’ those ‘others’. Descending into torture, drone attacks on wedding parties and other acts of collateral murder, this barbarism is clothed in the rhetoric of civility and self-defense, yet reveals the unredeemed colonizer within.

What is it that is shattering the armament around the hearts of so many? The conscience of Chelsea Manning, the source behind the leak of Collateral Murder, was the spark for a worldwide awakening. Her act of conscience shattered the abstraction and opened the gate that guarded this inception point, allowing the public to bear witness to uncensored images of modern warfare and decide for themselves how to see it. In the unfolding images, we were able to see what Chelsea Manning saw.

At the pretrial hearing in her prosecution for leaking the largest trove of secret documents in US history, Manning read out a personal statement to the court in Fort Meade, Maryland, describing how she came to download hundreds of thousands of classified documents and videos from military databases and submit them to the whistleblowing website WikiLeaks. She spoke about the facts regarding the 12 July 2007 aerial weapons team video, the footage depicting the incident in New Baghdad.

Manning began her statement by saying how at first, having already seen countless similar combat scenes, she didn’t think the video was very special. Yet she came to be troubled by “the recording of audio comments by the aerial weapons team crew and the second engagement in the video of an unarmed bongo truck.” Then she spoke of the attitudes of the soldiers in the helicopter: “The most alarming aspect of the video to me … was the seemly delightful bloodlust they appeared to have.” She continued:

They dehumanized the individuals they were engaging and seemed to not value human life by referring to them as “dead bastards” and congratulating each other on the ability to kill in large numbers. At one point in the video there is an individual on the ground attempting to crawl to safety. The individual is seriously wounded. Instead of calling for medical attention to the location, one of the aerial weapons team crew members verbally asks for the wounded person to pick up a weapon so that he can have a reason to engage. For me, this seems similar to a child torturing ants with a magnifying glass.

Manning furthermore spoke about the specific moment where the father driving his kids to school in a van stopped and attempted to assist the wounded:

While saddened by the aerial weapons team crew’s lack of concern about human life, I was disturbed by the response of the discovery of injured children at the scene. In the video, you can see that the bongo truck [was] driving up to assist the wounded individual. In response the aerial weapons team crew — as soon as the individuals are a threat, they repeatedly request for authorization to fire on the bongo truck and once granted they engage the vehicle at least six times.

She further pointed to the attitude of the aerial weapons team when they learned about the injured children in the van, noting how their actions showed no remorse or sympathy for those they killed or injured, even exhibiting pleasure when a vehicle drove over one of the bodies.

Manning had come to see this everyday reality in Iraq from the perspective of those who have been conjured into the designation of ‘enemy’. From that moment, she began to see these unfolding human tragedies increasingly from the point of view of those she was trained to see as others; those who have been methodically demonized throughout this war of terror.

How should we understand this sudden awakening of conscience? In elucidating the etymology of the word conscience, the Jungian psychoanalyst Edward Edinger related it to the concept of consciousness:

Conscious derives from con or cum, meaning ‘with’ or ‘together,’ and scire, ‘to know’ or ‘to see’. It has the same derivation as conscience. Thus the root meaning of both consciousness and conscience is ‘knowing with’ or ‘seeing with’ an ‘other’. In contrast, the word science, which also derives from scire, means simply knowing, i.e., knowing without ‘withness.’ … The experience of knowing with can be understood to mean the ability to participate in a knowing process simultaneously as subject and object, as knower and known. This is only possible within a relationship to an object that can also be a subject.

Conscience first engages the empathic imagination, breaking down walls of separation. One can begin to feel another person’s pain as if it were one’s own. The moment Manning saw other human beings who she had been trained to see as ‘enemy combatants’ in the gunsight, she freed them from a perception enslaved by the subject position of US supremacy; a perception that had made these human beings into lifeless objects. Here, the other perspective that had long been denied was brought back to consciousness. Manning saw another human being whose life was as precious as hers; not an enemy, but a victim of an unjust war waged by an imperialist military-industrial complex.

In the famous chat log with hacker Adrian Lamo that led to her arrest, Manning recounted how she wanted “people to see the truth… regardless of who they are… because without information, you cannot make informed decisions as a public… We’re human… and we’re killing ourselves…”

Manning saw what people too often fail to see: she saw those who had been branded ‘enemy combatants’ as human beings like herself. The same thing happened to US soldier Ethan McCord, who rescued the little girl from the bongo truck in the Collateral Murder video, and who realized she was no different from his own daughter.

Manning’s deed of whistleblowing was an act of conscience: knowledge gained by placing herself in a relationship with others; putting herself in the other’s shoes. She was willing to sacrifice her safety to restore a lost image; an inception point and authentic act of courage from a place of our common humanity.

Manning’s courage to act out of her conscience interrupted a trajectory of history that had been moving in a particular direction. The memory started to flow, reaching back before the invasion of Iraq, before 9/11 and even before the nation’s addiction to oil began — to the genocide of the natives, the moment when those who were made into enemies became dehumanized in our eyes.

Before anyone even starts talking about justification for acts of war, we should all be asking: who are these Iraqis and Afghans, these Libyans or Syrians who are so often portrayed as “putting America in danger”? In that iconic leaked footage from a fateful day in New Baghdad, who did we see or fail to see? Unfolding images of the Reuters journalists gunned down from the Apache helicopter confront us with a question: are we truly civilized? Who are the people who have been dehumanized, turned into enemies and made into inferior beings?

One ordinary person with extraordinary courage offered the possibility to restart a genuine conversation about the legitimacy of Western “civilization” that has until now been operating as a monologue. Manning created a possibility for real dialogue, one that is long overdue. Her courage, and the tireless work of those at WikiLeaks, calls us to truly see these events beyond the political language that makes lies sound truthful and murder respectable.

Are we able to witness what is really happening — ongoing collateral murder carried out in our name — even right in this very moment? Manning’s conscience awakened her heart. We, too, can awaken our hearts, for courage is contagious.

Cable news is living in an alternate universe!

Obsessed with missing planes and a “new Cold War,” TV news people are missing the stories right under their noses

 

This piece originally appeared on TomDispatch.

Isn’t there something strangely reassuring when your eyeballs are gripped by a “mystery” on the news that has no greater meaning and yet sweeps all else away?  This, of course, is the essence of the ongoing tale of the disappearance of Malaysia Airlines Flight 370.  Except to the relatives of those on board, it never really mattered what happened in the cockpit that day.  To the extent that the plane’s disappearance was solvable, the mystery could only end in one of two ways: it landed somewhere (somehow unnoticed, a deep unlikelihood) or it crashed somewhere, probably in an ocean.  End of story.  It was, however, a tale with thrilling upsides when it came to filling airtime, especially on cable news.  The fact that there was no there there allowed for the raising of every possible disappearance trope – from Star Trekkian black holes to the Bermuda Triangle to Muslim terrorists — and it had the added benefit of instantly evoking a popular TV show.  It was a formula too good to waste, and wasted it wasn’t.

The same has been true of the story that, in the U.S., came to vie with it for the top news spot: the devastating mudslide in Washington State.  An act of nature, sweeping out of nowhere, buries part of a tiny community, leaving an unknown but possibly large number of people dead.  Was anyone still alive under all that mud?  (Such potential “miracles” are like manna from heaven for the TV news.)  How many died?  These questions mattered locally and to desperate relatives of those who had disappeared, but otherwise had little import.  Yes, unbridled growth, lack of attention to expected disasters, and even possibly climate change were topics that might have been attached to the mudslide horror.  As a gruesome incident, it could have stood in for a lot, but in the end it stood in for nothing except itself and that was undoubtedly its abiding appeal.



Both stories had the added benefit (for TV) of an endless stream of distraught relatives: teary or weeping or stoic or angry faces in desperately tight close-ups making heartfelt pleas for more information.  For the media, it was like the weather before climate change came along.

In response, just about anything else that could pass for news was swept aside.  Given a media that normally rushes heedlessly from one potential 24/7 story to another, this was striking.  In the case of Flight 370, for instance, on the 21st day after its disappearance, it still led NBC’s Nightly News with Brian Williams (with the mudslide, one week after it happened, the number two story).

In those weeks, only one other story broke their stranglehold on the news.  It was the seemingly critical question of what in the world was going on in Ukraine.  There was the Russian military move into the Crimea, the referendum on that peninsula, its annexation, the alarm of the U.S. and the European Union, the imposition of (modest) sanctions, and various warnings of a Russian military build-up and possible invasion of eastern Ukraine.  Unlike the other two stories, it seemed consequential enough.  And yet in some eerie way, it, too, came to resemble them.  It was as if with the news on Ukraine we were being sucked back into another era — that of the superpower-run twentieth century.

The question that seemed to loom was this: Are we in a new (i.e., the old) Cold War?  It was so front and center that it sent opinion pollsters scrambling and they promptly discovered that half of all Americans thought we were — itself less a testament to American opinion than to the overwhelming media narrative that we were indeed living through the Cold War redux.

Was the Soviet Union being raised from the dead?  Think of this as the Flight 370 of global political coverage.  It had everything a story needed: people in the square; a foreign leader who glowered just like a movie villain should and, for once in the twenty-first century, wasn’t a U.S. president or vice president; and fears of Russian troops entering the rest of Ukraine, with Lithuania, Estonia, or some other former satellite of the Soviet Union next in line.  Where would it end?  How could Vladimir Putin’s juggernaut be stopped?

As a story, it was a time warp miracle all its own.  After so many years, an American president was denouncing not al-Qaeda, or the Taliban, or the Iraqis, or the Iranians, but the Russians.  Once again, as in the good old bad old days, U.S. officials could decry the tyranny of a major state and its dangers to the globe with a straight face.  There was finally a black-and-white tale of international morality in which Americans could denounce an invasion.  It had the comfy familiarity of an old-fashioned script, one whose ending everybody already knew.  It implied that the world was once again easy to grasp, that everything was finally back in order — the good guys and the bad guys, East and West, freedom and tyranny.

As an old script, it had all the fearsome charms of familiarity.  While signaling danger, it actually helped tame a world that otherwise looked unsettling indeed.

As it happens, however, Soviet armies will never again threaten to plunge through the Fulda Gap.  The Warsaw Pact is long gone, never to be revived, and Germany will remain a united powerhouse, not a divided land.  Argue as you will about whether the Russians or Putin are “evil,” one thing is certain: there is no “empire” to go with it.  President Obama was on the mark recently when he referred to Putin’s Russia as a “regional power” and not a superpower at all.  Not even close.  If anything, it’s a country that, thanks to NATO, the U.S., and the European Union, already had its back to the wall, with its former “satellites” long ago stripped away, and Ukraine looking like it was about to go, too.  (After all, an American diplomat, talking tough, was secretly recorded seemingly sorting out a future Ukrainian government with the local American ambassador!)

Russia may not even quite be a regional powerhouse.  Its economy is shaky and, unlike the Soviet Union, it is now largely an oil and gas state and, worse yet, its energy reserves are expected to be in decline in future decades.

A Planet for the Taking

So, no, Virginia, Flight 370 was not commandeered by aliens and Vladimir Putin is not Joseph Stalin’s younger brother.  The U.S. is not in a new Cold War, its troops do not stand in any danger of going toe-to-toe with Russian invaders, and a two-superpower world is dead and buried, but so, it seems, is a one superpower world.  History is a powerful tool, but sometimes when lost stories and old scripts dominate the headlines, it’s worth asking whether, behind the scrim of the familiar and the empty, there might not lurk an unnerving world, a new age that no one cares to focus on.

As with a magician, sometimes you have to look where he isn’t pointing to catch sight of reality.  With that in mind, I’d like to nominate British journalist Patrick Cockburn for a prize.  In the midst of the recent headlines, in the most important article no one noticed, he pointed out something genuinely unnerving about our world.

Yes, we’re all aware that the U.S. invasion of Iraq didn’t exactly work out as planned and that Afghanistan has been a nearly 13-year disaster, even though the U.S. faced the most ragtag of minority insurgencies in both places.  What, however, about the monumental struggle that used to be called the Global War on Terror?  After all, we got Osama bin Laden.  It took a while, but SEAL Team 6 shot him down in his hideout in Pakistan.  And for years, thanks to the CIA’s drone assassination campaigns in the Pakistani tribal borderlands, Yemen, and Somalia (as well as a full scale hunter-killer operation in Iraq while we were still occupying that country), we’ve been told that endless key al-Qaeda “lieutenants” have been sent to their deaths and that al-Qaeda in Afghanistan has been reduced to 50-100 members.

Yet Cockburn concludes: “Twelve years after the ‘war on terror’ was launched it has visibly failed and al-Qaeda-type jihadis, once confined to a few camps in Afghanistan, today rule whole provinces in the heart of the Middle East.”  Look across that region today and from Pakistan to Libya, you see the rise, not the fall, of jihadis of every type.  In Syria and parts of Iraq, groups that have associated themselves with al-Qaeda now have a controlling military presence in territories the size of, as Cockburn points out, Great Britain.  He calls al-Qaeda’s recent rise as the jihadi brand name of choice and the failure of the U.S. campaign against it “perhaps the most extraordinary development of the 21st century.”  And that, unlike the claims we’ve been hearing at the top of the news for weeks now, might not be an exaggeration.

Looked at another way, despite what had just happened to the Pentagon and those towers in New York, on September 12, 2001, the globe’s “sole superpower” had remarkably few enemies.  Small numbers of jihadis scattered mostly in the backlands of the planet and centered in an impoverished, decimated country — Afghanistan — with the most retro regime on Earth.  There were, in addition, three rickety “rogue states” (North Korea, Iraq, and Iran) singled out for enemy status but incapable of harming the U.S., and that was that.

The world, as Dick Cheney & Co. took for granted, looked ready to be dominated by the only (angry) hyperpower left after centuries of imperial rivalry.  The U.S. military, its technological capability unrivaled by any state or possible grouping of states, was to be let loose to bring the Greater Middle East to heel in a decisive way.  Between that regular military and para-militarizing intelligence agencies, the planet was to be scoured of enemies, the “swamp drained” in up to 60 countries.  The result would be a Pax Americana in the Middle East, and perhaps even globally, into the distant future.  It was to be legendary.  And no method — not torture, abuse, kidnapping, the creation of “black sites,” detention without charges, assassination, the creation of secret law, or surveillance on a previously unimaginable scale — was to be left out of the toolkit used to birth this new all-American planet.  The “gloves” were to be taken off in a big way.

Thirteen years later, those plans, those dreams are down the drain.  The Greater Middle East is in chaos.  The U.S. seems incapable of intervening in a meaningful way just about anywhere on Earth despite the fact that its military remains unchallenged on a global level.  It’s little short of mind-blowing.  And it couldn’t have been more unexpected for those in power in Washington and perhaps for Americans generally.  This is perhaps why, despite changing American attitudes on interventions and future involvement abroad, it’s been so hard to take in, so little focused upon here — even in the bogus, politicized discussions of American “strength” and “weakness” which circle around the latest Russian events, as they had previously around the crises in Iran and Syria.

Somehow, with what in any age would have seemed like a classic winning hand, Washington never put a card on that “table” (on which all “options” were always being kept open) that wasn’t trumped.  Events in Ukraine and the Crimea seem to be part of this.

The Chinese had an evocative phrase for times of dynastic collapse: “chaos under heaven.”  Moments when it seems as if the planet itself is shifting on its axis don’t come often, but they may indeed feel like chaos under heaven — an increasingly apt phrase for a world in which no country seems to exert much control, tensions are rising in hard to identify ways, and the very climate, the very habitability of the planet is increasingly at risk.

When The Losers Are The Winners

Even in a losing game, there are usually winners.  One of the conundrums of this particular moment, however, is that the winners in our American world are exactly those who have repeatedly been playing the losing hands.  Their reward for one self-defined disaster after another has been yet more money, yet wider areas of everyday life to control, and yet more power.  No matter how inept they may prove as imperial players on a world stage, they can essentially do no wrong domestically when it comes to embedding themselves ever more deeply in our lives in the name of our “security” and our “safety.”  It’s a remarkable tale.  Legendary, one might almost say.  As the power of American power to accomplish seemingly anything fades, the power of the national security state only grows.

It’s true that, in the wake of the Edward Snowden revelations, the managers of our secret state have had to pull back in a few areas — especially the gathering and holding of phone metadata for the complete U.S. population.  But the significance of this is easy to exaggerate.  It’s worth remembering that in the wake of the Watergate era, the last time we went through a round of “reforms” of an out-of-control secret world, the national security state somehow ended up with its own secret court system and secret body of law to which all citizens became accountable even though they could know nothing about it.  Four decades later, in a situation in which that secret state is so much stronger, such reforms may once again turn out only to enhance its power.

It’s true as well that the CIA has had to pull back on some of the methods it used to such disastrous effect after 9/11, in particular closing those “black sites” it set up (though some may still exist, possibly in Somalia and perhaps on U.S. naval ships) and on the use of torture.  Nonetheless, the recent spectacle of the Agency’s attack on Senator Dianne Feinstein and the staff of the Senate Intelligence Committee over a still-secret 6,300-page critical report on its Bush-era torture and black site programs should be instructive.  After all, Feinstein has made her reputation, in part, as the senator from the national security state.  She has typically supported the NSA’s secret programs as well as those of other intelligence outfits, straight down the line.  There has perhaps been no one more sympathetic among Democratic representatives in Congress.

On a single issue, a single set of programs by a single agency, however, she chose to differ and offer genuine criticism.  You might think that, under the circumstances, she would still be handled by the secret state with kid gloves.  Instead, the CIA referred her committee staff to the Justice Department for possible crimes, while she was attacked as if she were the Great Satan and finally driven to the Senate floor to denounce the CIA for potential criminal acts and infringing the Constitution.  Even the president didn’t come to her aid.

Think of this as a reasonable yardstick for measuring the real power relations between Washington’s official overseers and those who are supposed to be overseen.

Think of the overseen as now negotiating from a position of significant strength the details of their future benefits package.  And we can count on one thing: whatever changes are made, they will be largely cosmetic.  The many parts of America’s growing shadow government – secret law, secret surveillance, secret power, and the secret state — are here to stay.

From the 9/11 attacks on, that secret state and the militarized world of Washington that goes with it have shown themselves, even by their own standards, woefully incapable of handling a new and puzzling world.  Their actions have repeatedly undermined the usual sort of imperial control, instead facilitating spreading chaos.  Post-9/11, they have had a remarkable knack for creating not just blowback — the CIA term of tradecraft that scholar Chalmers Johnson put into our vocabulary — but something for which we have no word.  Think of it perhaps as just the “blow” part of that term.

The orderliness of secret power in Washington and chaos under heaven, the growth of a police state and a planet run riot, turn out to be two sides of the same coin.  If you want a news story that will glue eyes, then think of it this way: on September 12, 2001, the national security state entered the cockpit of (to modernize a phrase) the plane of state, hijacked it, and steered it directly for the Bermuda Triangle — and here was the strangest thing of all: no one even noticed.

Tom Engelhardt, co-founder of the American Empire Project, runs the Nation Institute’s TomDispatch.com. His latest book, “The United States of Fear” (Haymarket Books), has just been published.

 

http://www.salon.com/2014/04/04/cable_news_is_living_in_an_alternate_universe_partner/?source=newsletter

Supreme Court’s abomination: How the McCutcheon decision will destroy American politics

Thanks to Scalia and co., the rich will now be able to buy politicians as effortlessly as they buy anything else

Supreme Court Justices Antonin Scalia and John Roberts (Credit: Reuters/Brendan McDermid/AP/Larry Downing/photo collage by Salon)

“Money talks,” Elvis Costello once observed, “and it’s persuasive.” The belief that this is especially true in the world of politics led to the passage of the Federal Election Campaign Act. In the aftermath of Watergate the FECA was strengthened in an attempt to limit the corrupting influence of money on politics, and, until 2010, the Supreme Court largely upheld Congress’s power to do so.

That year the Citizens United case, which essentially found that the free speech rights of corporations were more important than legislative attempts to keep money from corrupting the political process, occasioned a great deal of outrage. But that case marked merely the beginning of what is likely to prove to be a series of increasingly successful assaults on campaign finance laws.

And now, on Wednesday, the next blow against attempts to keep the rich from buying politicians as effortlessly as they purchase anything else was struck in McCutcheon v. FEC, a Supreme Court case dealing with limits on how much money individuals can contribute to candidates.

McCutcheon has now struck down overall limits on individual campaign contributions. This latest outburst of judicial activism in the struggle to render campaign finance laws completely toothless is merely accelerating a historical process that is coming to seem almost inevitable.

To see why, consider the practical implications of the theory that weak or nonexistent limits on campaign finance will allow the rich to transform what is putatively a democratic republic into an unapologetic plutocracy.

If money can buy the political outcomes desired by the super-wealthy oligarchs at the apex of our increasingly unequal economy, then there are only two possible ways to avoid this result. First, we can assume that there is a strong distinction between law and politics, that judges make legal rather than political decisions, and that legal decisions, unlike political outcomes, cannot be bought.

Or, in the alternative, we can construct a society that does not tolerate the sort of vast accumulations of individual and family wealth that would allow a tiny economic elite to buy both the legislative and the judicial processes.



Pursuing the first alternative is obviously naive.  If stupendous wealth is free to buy electoral results, it will be free to buy judicial decisions as well. And, as the outcome in Citizens United demonstrated, this will not even require anything as crude as straightforward bribery.

If the Koch brothers want the First Amendment to mean that rich people have a constitutional right to buy unlimited political influence, they and their ilk will use their wealth to eventually bring about the social and political conditions that will guarantee that five people who sincerely agree with them on this point will be sitting on the Supreme Court.

The second alternative to avoiding a frank plutocracy is (perhaps) more plausible, as it does not depend on the fiction that the judicial interpretation of our laws is not itself a political process.

Yet an important new book by economist Thomas Piketty points to a pessimistic answer in this regard. Drawing on hundreds of years of economic data (some of which has only recently become available to researchers), Piketty reaches a simple but disturbing conclusion: In the long run, the return on capital tends to be greater than the growth rate of the economies in which that capital is located.

What this means is that in a modern market economy the increasing concentration of wealth in the hands of the already-rich is as natural as water flowing downhill, and can only be ameliorated by powerful political intervention, in the form of wealth redistribution via taxes, and to a lesser extent laws that systematically protect labor from capital. (Piketty argues that, because of historical circumstances that are unlikely to be repeated, this sort of intervention happened in the Western world in general, and in America in particular, between the First World War and the early 1970s.)
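Piketty’s point is, at bottom, compound arithmetic: if accumulated wealth earns a return r while national income grows at only g, and r stays above g, that wealth grows faster than the economy it sits in. Here is a minimal sketch of the compounding in Python, using illustrative rates rather than Piketty’s own estimates and assuming, unrealistically, that all returns are reinvested:

# Illustrative only: compare a stock of wealth compounding at r
# with an economy growing at g, where r > g.
r = 0.05   # assumed annual return on capital (illustrative, not Piketty's figure)
g = 0.015  # assumed annual growth rate of the economy (illustrative)

capital = 100.0   # starting wealth, in arbitrary units
economy = 1000.0  # starting size of the whole economy, same units

for years in (10, 30, 50):
    k = capital * (1 + r) ** years   # wealth after compounding at r
    e = economy * (1 + g) ** years   # economy after growing at g
    print(f"after {years} years, capital is {k / e:.1%} of the economy "
          f"(it started at {capital / economy:.1%})")

On these assumed numbers the wealth share rises from 10 percent of the economy to roughly 14, 28, and 54 percent after 10, 30, and 50 years, which is the water-flowing-downhill dynamic described above; only the political interventions Piketty names push it back.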

Readers can already guess the dire conclusion that flows from combining Piketty’s theory with the plausible assumption that unregulated wealth leads to plutocracy: If the only way to avoid plutocracy would be to employ political processes that the plutocrats themselves will eventually buy lock, stock and barrel, then the only way to avoid being ruled by the Lords of Capital is to become one of them. This, in effect, is the contemporary GOP’s economic creed in a nutshell.

Paul Campos is a professor of law at the University of Colorado at Boulder.