Letter To The Millennials

A Boomer Professor talks to his students

Written by

  • Director, USC Annenberg Innovation Lab. Producer, “Mean Streets”, “The Last Waltz”, “Until the End Of the World”, “To Die For”

So we are about to embark on a sixteen-week exploration of innovation, entertainment, and the arts. This course is going to be about all three, but I’m going to start with the “art” part — because without the art, no amount of technological innovation or entertainment marketing savvy is going to get you to go to the movie theater. However, I think there’s also a deeper, more controversial claim to be made along these same lines: Without the art, none of the innovation matters — and indeed, it may be impossible — because the art is what gives us vision, and what grounds us to the human element in all of this. Although there will be lectures, during which I’ll do my best to share what I’ve learned about the way innovation, entertainment, and the arts fit together, the most crucial part of the class is the dialogue between us, and specifically the insights coming from you as you teach me about your culture and your ideals. The bottom line is that the world has come a long way, but from my perspective, we’re also living in uniquely worrisome times; my generation had dreams of how to make a better life that have remained woefully unfulfilled (leaving many of us cynical and disillusioned), but at the same time your generation has been saddled with the wreckage of our attempts and is now facing what may seem to be insurmountable odds. I’m writing this letter in the hopes that it will help set the stage for a truly cross-generational dialogue over the next sixteen weeks, in which I help you understand the contexts and choices that have brought us where we are today, and in which you help me, and one another, figure out the best way to move forward from here.

When I was your age, I had my heart broken and my idealism challenged multiple times by the assassinations of my political heroes: namely, John and Bobby Kennedy and Martin Luther King. Many in my generation turned away from politics and found our solace in works of art and entertainment. So one of the things I want to teach you about is a time from 1965–1980 when the artists really ruled both the music and the film industries. Some said “the lunatics had taken over the asylum” (and, amusingly enough, David Geffen named his record company Asylum), but if you look at the quality of work that was produced, it was extraordinary; in fact, most of it is still watched and listened to today. Moreover, in that period the most artistic work also sold the best: The Beatles’ Sgt. Pepper was without doubt the best record of the year but also the best selling, and The Godfather was similarly both best movie of the year and the biggest box office hit. That’s not happening right now, and I want to try to understand why that is. I want to explore, with you, what the implications of this shift might be, and whether this represents a problem. It may be that those fifteen years your parents and I were lucky enough to experience were one of those renaissance moments that only come along once every century, so perhaps it’s asking too much to expect that I’ll see it occur again in my lifetime. Nevertheless, I do hope it happens at least once in yours.

I spoke of the heartbreak of political murder that has permanently marked me and my peers, but we have also been profoundly disappointed by politics’ failure to improve the lives of the average citizen. In 1969, the median salary for a male worker was $35,567 (in 2012 dollars). Today, it is $33,904. So for 44 years, while wages for the top 10% have continued to climb, most Americans have been caught in a “Great Stagnation,” bringing into question the whole purpose of the American capitalist economy (and, along the way, shattering our faith in the “American Dream”). The Reagan-era notion that what benefited the 1% — “the establishment” — would benefit everyone has by now been thoroughly discredited, yet it seems that we are still struggling to pick up the pieces after this failed experiment.

Seen through this lens, the savage partisanship of the current moment makes an odd kind of sense. What were the establishment priorities that moved inexorably forward in both Republican and Democratic administrations? The first was a robust and aggressive foreign policy. As Stephen Kinzer wrote about those in power during the 1950s, “Exceptionalism — the view that the United States has a right to impose its will because it knows more, sees farther, and lives on a higher moral plane than other nations — was to them not a platitude, but the organizing principle of daily life and global politics.”

From Eisenhower to Obama, this principle has been the guiding light of our foreign policy, bringing with it annual defense expenditures that dwarf those of all the world’s major powers combined. The second principle of the establishment was that “what is good for Wall Street is good for America.” Despite Democrats’ efforts to paint the GOP as the party of Wall Street, one would only have to look at the track record of Clinton’s treasury secretaries Rubin and Summers (specifically, their zealous efforts to kill the Glass-Steagall Act and deregulate the big banks and the commodities markets) to see that both major parties are guilty of sucking up to money; apparently, the establishment rules no matter who is in power. Was it any surprise, then, that Obama appointed the architects of bank deregulation, Summers and Geithner, to clean up the mess their policies had caused? Was it any surprise that they failed? Was it any surprise that establishment ideas about the surveillance state were not challenged by Obama? The good news is that, as a nation, we have grown tired of being the world’s unpaid cop, and we are tired of dancing to Wall Street’s tune. Slowly, we are learning that these policies may benefit the 1%, but they don’t benefit the people as a whole. My guess is the 2016 election may be fought on this ground, and we may finally begin to see real change, but the fact remains that we — both your generation and mine — are right now deeply mired in the fallout of unfulfilled promises and the failures of the political system.

So this is the source of boomer disillusionment. But even if we are cynical about political change, we can try to imagine together a future where great artistic work continues to flourish; this, then, is the Innovation and Entertainment part of the course. It’s not that I want you to give up on politics — in fact the events of the last few weeks in Ferguson only reinforce my belief that when people disdain politics, their anger gets channeled into violence. But what I do want you to think about is that art and culture are more plastic — they can be molded and changed more easily than politics. There is a sense in which art, politics, and economics are all inextricably and symbiotically tied together, but history has proven to us that art serves as a powerful corrective against the dangers of the establishment. There is a system of checks and balances in which, even though the arts may rely on the social structures afforded by strong economic and political systems, artists can also inspire a culture to move forward, to reject the evils of greed and prejudice, and to reconnect to its human roots. If we are seeking a political and economic change, then, an authentic embrace of the arts may be key. Part of your role as communication scholars is to look more closely at the communication surrounding us and think critically about the effects it’s having, whose agenda is being promoted, and whether that’s the agenda that will serve us best. One of the tasks we’ll wrestle with in this class will be how we can get the digital fire hose of social media to really support artists, not just brands.

In 2011, the screenwriter Charlie Kaufman (Being John Malkovich, Adaptation) gave a lecture at the British Film Institute. He said something both simple and profound:

People all over the world spend countless hours of their lives every week being fed entertainment in the form of movies, TV shows, newspapers, YouTube videos and the Internet. And it’s ludicrous to believe that this stuff doesn’t alter our brains.

It’s also equally ludicrous to believe that — at the very least — this mass distraction and manipulation is not convenient for the people who are in charge. People are starving. They may not know it because they’re being fed mass-produced garbage. The packaging is colorful and loud, but it’s produced in the same factories that make Pop Tarts and iPads, by people sitting around thinking, “What can we do to get people to buy more of these?”

And they’re very good at their jobs. But that’s what it is you’re getting, because that’s what they’re making. They’re selling you something. And the world is built on this now. Politics and government are built on this, corporations are built on this. Interpersonal relationships are built on this. And we’re starving, all of us, and we’re killing each other, and we’re hating each other, and we’re calling each other liars and evil because it’s all become marketing and we want to win because we’re lonely and empty and scared and we’re led to believe winning will change all that. But there is no winning.

I think Charlie is right. People are starving, so we give them bread and circuses.

But I think Charlie is wrong when he says “there is no winning”. In fact I think we are really in a “winner-take-all” society. Look at the digital pop charts. 80% of the music streams are for 1% of the content. That means that Jay-Z and Beyoncé are billionaires, but the average musician can barely make a living. Bob Dylan’s first album only sold 4,000 copies. In this day and age, he would have been dropped by his label before he created his greatest work.

A writer I greatly admired, Gabriel García Márquez, died recently. For me, Márquez embodied the role of the artist in society, marked by the refusal to believe that we are incapable of creating a more just world. Utopias are out of favor now. Yet Márquez never gave up believing in the transformational power of words to conjure magic and seize the imagination. The other crucial aspect of Márquez’s work is that he teaches us the importance of regionalism. In a commercial culture of sameness where you can stroll through a mall in Shanghai and forget that you’re not in Los Angeles, Márquez’s work was distinctly Latin American. His work was as unique as the songs of Gilberto Gil, or the cinema of Alejandro González Iñárritu. In a culture like ours that has so long advocated a “melting pot” philosophy that papers over our differences, it is valuable to recognize that there is a difference between allowing our differences to serve as barriers and appreciating the things that make each culture unique, situated in time and space and connected to its people. What’s more, young artists also need to have the sense of history that Márquez celebrated when he said, “I cannot imagine how anyone could even think of writing a novel without having at least a vague idea of the 10,000 years of literature that have gone before.” Cultural amnesia only leads to cultural death.

With these values in mind, my hope is to lead you in a discussion of politics and culture in the context of 250 years of America’s somewhat utopian battle to build “a city on a hill.” I think many in my generation had this utopian impulse (which is, it should be observed, different than idealism), but it is slipping away like a short-term memory. I did not aspire to be that professor who quotes Dr. King, but I feel I must. He said the night before he was assassinated, “I may not get there with you, but I believe in the promised land.” My generation knew that the road towards a better society would be long, but we hoped our children’s children might live in that land, even if we weren’t able to get there with you. It may take even longer than we imagined, but I know your generation believes in justice and equality, and that fills me with hope that the dream of some sort of promised land is not wholly lost. The next step, then, is to figure out how to work together, to learn from the past while living in the present moment in order to secure a better future, and I believe this class offers us an incredible opportunity to do precisely that.

So what are the skills that we can develop together in order to open a real cross-generational dialogue? First, I would hope we would learn to improvise. I want you to challenge me, just as I encourage and challenge you. Improvisation means sometimes throwing away your notes and just responding from your gut to the ideas being presented. It takes both courage and intelligence, but I’m pretty sure you have deep stores of both qualities, which will help you show leadership both in class and throughout the rest of your life. Leadership is more than just bravery and intellect, however; it also requires vulnerability and compassion, skills that I hope we can similarly cultivate together. I want you to know that I don’t have all the answers — and, more importantly, I know that I don’t have all the answers. I am somewhat confused by our current culture and I am looking to you for insight. You need to have that same vulnerability with your peers, and you also need to treat them with compassion as you struggle together to understand this new world of disruption. I know these four elements — courage, intelligence, vulnerability, and compassion — may seem like they are working at cross-purposes, but we will need all four qualities if we are to take on the two tasks before us. One of our tasks is to try to restore a sense of excellence in our culture — the belief that great art and entertainment can also be popular. The second task is for baby boomer parents and their millennial children to form a natural political alliance going forward. As I’ve said, I don’t think the notion that we will get to “the promised land” is totally dead, and with your energy and the tools of the new media ecosystem to help us organize, we can keep working towards a newly hopeful society, culture, and economy, in spite of the mess we have left you with.

This is, at least, the plan. Of course, as the great critic James Agee once said, “Performance, in which the whole fate and terror rests, is another matter.”

 

 


Companies sell mobile phone spying tools to governments worldwide


By Thomas Gaist
26 August 2014

Cell phone location tracking technologies long used by the US National Security Agency and British GCHQ are increasingly available for purchase by other governments throughout the world, the Washington Post reported Monday.

Cell phone location data tracking systems, which include a range of associated intelligence gathering capabilities, are constantly being developed and marketed by private security contractors. The technology enables governments and private entities to track the movements of cell phone users across national boundaries, in many cases pinpointing users’ precise locations within a few meters.

One surveillance firm, called Defentek, boasts on its web page that its Infiltrator Global Real-Time Tracking System can “locate and track any phone number in the world.” The Infiltrator System is “a strategic solution that infiltrates and is undetected and unknown by the network, carrier, or the target,” the site says.

Analysis of cell phone location tracking software by the watchdog group Privacy International highlighted the role of Verint, a sophisticated Israeli-American private security and intelligence contractor that employs former government agents, including special forces soldiers.

Verint reports on its web page that the company’s systems are used by “more than 10,000 organizations in over 180 countries,” the Washington Post reported.

The spread of such cutting-edge surveillance systems by private security and intelligence firms is taking place with the help of the major telecommunications corporations. Verint states that it has installed location data capture software on cellular networks in numerous countries with the knowledge and cooperation of major telecommunications providers.

A confidential Verint advertising brochure posted online by Privacy International detailed the wide array of surveillance capabilities offered by Verint to clients. According to its advertising material, Verint’s “Solution’s Portfolio” includes “Cellular Interception and Control, Mobile Satellite Interception, Global Cellular Location, and IP Interception and Tampering.” The brochure notes that the company sells “Monitoring Centres that can operate at nationwide levels and has been known to have had installations in Slovakia, Ivory Coast, India and Vietnam.”

For the right price, Verint will also carry out and/or facilitate a number of other intelligence-related operations on behalf of its clients, including:

* Identifying potential targets and building an intelligence picture over cellular networks

* Passively and covertly collecting cellular traffic in an area and analyzing it in real time to identify potential targets

* Identifying suspicious communication patterns using a range of analysis tools, including Location, Speech Recognition, Link Analysis, Text Matching

* Intercepting voice calls and text messages of potential targets

* Identifying, intercepting, decoding, manipulating and analyzing WiFi-enabled devices such as tablets, smartphones, and laptops

Verint also claims that it can break into encrypted communications and remotely activate microphones on cell phones, and the company offers training sessions simulating a range of tactical scenarios with its in-house veteran military and intelligence personnel.

Reports from the summer of 2013 showed that Verint provided systems used by the Mexican government during the administration of President Felipe Calderon to capture and analyze all types of communications in that country beginning in 2007, as part of operations initiated in coordination with the US State Department.

In its report, the Washington Post noted that surveillance agencies and private companies are increasingly deploying “IMSI catchers,” also referred to as StingRays, which enable users to send fake text messages, inject malware into targeted phones, and intercept the content of various forms of cellphone-based communications.

In addition to using StingRays, surveillance agencies can tap directly into cell phone towers to identify movement patterns of nearby telephone users. Location data from cell phone towers, moreover, is regularly transferred in bulk to federal, state, and local security agencies across the US through a procedure known as “tower dumps.”

Revelations from December of 2013 have already shown that the NSA’s CO-TRAVELLER program gathers around 5 billion pieces of cell phone location data worldwide on a daily basis, and has been capable of tracking the location of cellphones, even when switched off, since 2004. Location data gathered by the NSA allows the agency to map the overall movement pattern of targeted individuals, their daily routes and habitual meeting places.

The US uses related technology to orchestrate its drone wars in Afghanistan, Pakistan, Yemen and elsewhere. As part of a program codenamed GILGAMESH, the NSA’s “Geo Cell” program, which sports the motto “We Track ‘Em, You Whack ‘Em,” guides drone strikes against alleged terrorists by tracking the location of SIM cards inside their cellphones.

All of these surveillance and tracking programs are part of the efforts of the US and other imperialist states to compile comprehensive databases on their respective populations in response to growing popular opposition to the growth of social inequality and attacks on democratic rights.

Princeton Study: U.S. No Longer An Actual Democracy


Asking “[w]ho really rules?” researchers Martin Gilens and Benjamin I. Page argue that over the past few decades America’s political system has slowly transformed from a democracy into an oligarchy, where wealthy elites wield most power.

Using data drawn from over 1,800 different policy initiatives from 1981 to 2002, the two conclude that rich, well-connected individuals on the political scene now steer the direction of the country, regardless of or even against the will of the majority of voters.

TPM Interview: Scholar Behind Viral ‘Oligarchy’ Study Tells You What It Means

“The central point that emerges from our research is that economic elites and organized groups representing business interests have substantial independent impacts on U.S. government policy,” they write, “while mass-based interest groups and average citizens have little or no independent influence.”

As one illustration, Gilens and Page compare the political preferences of Americans at the 50th income percentile to preferences of Americans at the 90th percentile as well as major lobbying or business groups. They find that the government—whether Republican or Democratic—more often follows the preferences of the latter groups than those of the median earner.
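The shape of that comparison can be sketched in a few lines of code. The numbers below are invented for illustration, and the real study applies multivariate regression to roughly 1,800 policy cases rather than this simple thresholding; the sketch only shows the kind of question being asked: when a group favors a policy, how often does the policy actually pass?

```python
# Toy illustration of the Gilens-Page comparison. All data here is invented.
# Each tuple: (support at 50th income percentile,
#              support at 90th percentile, policy adopted?)
policies = [
    (0.72, 0.30, False),
    (0.65, 0.40, False),
    (0.35, 0.80, True),
    (0.20, 0.75, True),
    (0.55, 0.60, True),
    (0.60, 0.25, False),
]

def adoption_rate(policies, column, threshold=0.5):
    """Among policies a group supports (support > threshold),
    what share were actually adopted?"""
    favored = [p for p in policies if p[column] > threshold]
    return sum(p[2] for p in favored) / len(favored) if favored else 0.0

median_rate = adoption_rate(policies, column=0)   # policies median earners want
affluent_rate = adoption_rate(policies, column=1) # policies the affluent want
```

With these made-up numbers, policies favored by the affluent pass far more often than those favored by the median earner, which is the pattern the study reports at much larger scale.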

The researchers note that this is not a new development caused by, say, recent Supreme Court decisions allowing more money in politics, such as Citizens United or this month’s ruling on McCutcheon v. FEC. As the data stretching back to the 1980s suggests, this has been a long-term trend, and is therefore harder for most people to perceive, let alone reverse.

“Ordinary citizens,” they write, “might often be observed to ‘win’ (that is, to get their preferred policy outcomes) even if they had no independent effect whatsoever on policy making, if elites (with whom they often agree) actually prevail.”

http://talkingpointsmemo.com/livewire/princeton-experts-say-us-no-longer-democracy

Are We Out of Big Ideas?


“The Gods have died!”, the villagers cried.

“What should we do? Mourn? Grieve? Plead?”, the Chief asked the Priest, desperate.

“No”, said the Priest, looking at the sky, afraid.

**

Here’s a tiny question. Are we idearupt? As in: bankrupt of great ideas?

Go ahead. Name me an “ism” that still works.

I’ll wait.

Conservatism? #LOL. Liberalism? #lol. Capitalism, or what’s left of it? Sure, maybe for billionaires. “Libertarianism”? I invite you to Mogadishu, good sir. Socialism…syndicalism…anarchism…mercantilism…revanchism…shit!!

Wait. What about…Bronyism?

Perhaps you see my point.

We’re living through a kind of implosion. Not just of institutions—that much is obvious. But a collapse of institutions that was detonated by an implosion.

Of ideas.

Yesterday’s ideas about how to organize societies and economies simply don’t work anymore.

And so we’re left in a vacuum. What’s a vacuum? A void. An emptiness. An absence. We’re out of good ideas about how societies, democracies, and economies should be organized and managed.

But not just “how”. More deeply, by whom—and why.

What’s the point, you often wonder. Of your life. Of the sheer goddamned futility of it all.

Working harder on stuff that doesn’t matter to buy junk you can’t afford to impress people you don’t like obeying the orders of robots programmed by assholes who’ve never read a book in their lives that oversee the entire economy purely for the production of “profit” not real things that actually benefit human lives which are getting poorer so they’re just one paycheck away from disaster…and even if you do somehow win the infernal contest of all the above, what’s the jackpot at the end of the rainbow? A life that’s totally meaningless in the first place.

What the fuck?

If you think all that’s…futile…you’re not wrong. You’re precisely right. It is. Yesterday’s great “isms” do not offer enough, to enough, for enough, from enough.

Whether it is “liberalism” or “conservatism”, the result is the same.

The middle class implodes; the rich grow incalculably richer; the poor are trampled. What’s the result? To pay for social services, the assets of the state are “privatized”; but that can only go on for so long. Eventually, ninety percent plus of people see their incomes stagnate and their wealth vanish; economies stall as people grow poorer. Society can no longer afford public goods, as tax bases dry up; public and private debts grow; and currencies are devalued. People’s lives go from prosperous and stable to precarious and impoverished in a generation or two.

See the pattern? The collapse of great ideas about how to organize stuff isn’t merely…an idea. It’s reality.

Consider the twentieth century. The world created international law, international development, international trade, and international human rights. These were tremendous, astonishing human accomplishments. The kind that mankind might never have even dreamed of a few short centuries ago.

And now? What do we consider “great ideas”? Cruising to your less-than-minimum-wage temp gig at a robo-warehouse in your self-driving car share checking how many “friends” Spot made on the latest doggy dating app hoping you got another heart on yours?

Those aren’t great ideas. They’re clever businesses, and for that we should applaud them. But we must recognize. You can’t Tinder your way to a better world. You can’t even Tinder your way to a life worth living.

All the great “isms” are winking out. And so. The world is starting to burn. Nations are fracturing. Social contracts are being torn apart. In most of the world’s richest nations, not one but two generations will be lost. The global economy is stagnating.

And already from that witches’ cauldron is rising the smoke. Of violence, animosity, extremism, hatred. Which will eventually, if the fire is left untended, kindle into a wildfire of war.

All this is not inevitable. Yet. But it is predictable. For a single, simple reason.

We no longer have ideas powerful enough to organize the world. Yesterday’s “isms” are vanishing. And in their place is left a vacuum.

Here’s the catch.

You.

You probably believe that something always fills a vacuum. For you’ve been trained to be an obedient believer in progress; in advancement; in growth; in efficiency; in spontaneous order; in self-organization; in automaticity; in manifest destiny; and in the inevitability of it all.

In other words, you’re a True Believer in…the Big Idea: the idea of the progress of ideas.

Something always fills a vacuum, right? A bigger, better idea?

Wrong.

Sometimes, nothing does. For a very long while.

Sometimes, there is no progress of ideas.

Sometimes the darkness stays. And lasts. And deepens. Into an endless, frozen midnight. An abyss of collapsing ideas; from which mankind must escape.

We call those times Dark Ages. And my worry is that we’re stumbling headlong into one.

**

“The Gods have died!”, the villagers cried.

“What should we do? Mourn? Grieve? Plead?”, the Chief asked the Priest, desperate.

“No”, said the Priest, looking at the sky, afraid. “We must pray!”, he shouted, angrily.

“Pray?”, the villagers muttered to themselves, confused. “To whom?”

“To the Gods”, the Priest whispered.

“But the Gods are dead”, the Chief protested.

“Who do you think killed them?”, the Priest demanded.

“Gods who were more powerful still. And it is to them we must pray”.

“New Gods! But who are they?”, the villagers asked one another, astonished, anxious, afraid.

“They will reveal themselves. But only if our prayers prove worthy. Come. Let us pray!”, the Priest commanded.

“We are saved!”, cried the villagers.

“Glory!”, cried the Chief.

The Priest smiled.

He raised his hands to the heavens; and they all bowed beneath the perfect sky the new Gods hid behind.

The sun rose high. There was not a cloud to be seen.

It was how every Dark Age begins.


BLOGGER COMMENT:  The “big ideas” of today are silly multimillion dollar phone apps and dumping ice cubes on your head in front of the new Audi.

I recall having many conversations like this in the Sixties. I’m happy there are those beginning to question the status quo in the 21st century.  Perhaps one might begin by looking back to Aristotle, Plato, Socrates…

What Facebook doesn’t show you

BLOGGER COMMENT:  Interaction with your FB “friends” is relatively insignificant. So what’s the point of “social media?” Data gathering for corporations. Certainly not socializing…
August 18

When you spend a day with something that knows you in ways you don’t know yourself, you learn that maybe you aren’t quite as interested in the things you think you are.

Here’s what I learned about myself: It seems I don’t much care about my hometown or the people in it, I’m far more interested in feminist blogs than I am in technology or sports, I’m still hung up on New York after moving away last spring, and I’m apparently very interested in the goings on of someone I worked with at Pizza Hut when I was 16.

What was the source of these revelatory, self-image-shifting facts? The same place you probably went when you got to work this morning: Facebook, which we can’t stop feeding and which obsessively tracks our every online movement.

Over the course of five or six hours on July 17, I pored over my News Feed, endlessly scrolling and refreshing until every piece of content that appeared was a repeat. I cataloged each post, totaling 1,417 status updates, photos, links, Likes, event RSVPs and more, creating an assortment of everything Facebook thinks I care about.

But for all those link shares and wall posts, I still wasn’t sure exactly why I was seeing what I was seeing, or if I was even seeing what I wanted to see. (A Pizza Hut co-worker? Really?) So I went through my whole Facebook network – all of my 403 friends and the 157 Pages I Like – and recorded every single thing they posted on July 17.

Spoiler: My News Feed showed me only a fraction of my network’s total activity, most of what it showed me was old, and what I was shown was often jarringly unexpected.

Facebook says roughly one in seven people on the planet log in at least once a month. And yet, how News Feed works remains bafflingly opaque, like a secret box of technology, algorithms and magic that remains one of tech’s bigger mysteries. An entire consulting industry is built around trying to game it (think SEO for Google), and publishers invest enormous amounts of energy into succeeding on it, but as soon as people start to figure it out Facebook tweaks its secret recipe and everything goes out the window.

What we know is this: The more popular a piece of content posted in your network becomes, the more likely it is to spill into your News Feed; and the friends and Pages you interact most with are the ones you’ll see most frequently, according to Justin Lafferty, editor of InsideFacebook.com.

“Mark Zuckerberg wants News Feed to be like a newspaper,” he said. “The top stories are curated based on relevancy and the user’s connection to that page or friend,” he said, adding that like a printed newspaper or magazine, older stories can still be germane.
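The two signals Lafferty describes — a post’s overall popularity and how much you interact with its author — can be sketched as a toy ranking function. Everything here is hypothetical: the field names, the weights, and the decay curve are invented for illustration, since Facebook’s actual algorithm is not public.

```python
from dataclasses import dataclass
import math
import time

@dataclass
class Post:
    author: str          # friend or Page that posted it
    engagements: int     # Likes/comments/shares across the network
    posted_at: float     # Unix timestamp

# Hypothetical interaction history: how often I engage with each author.
my_interactions = {"close_friend": 120, "denver_post": 2, "jezebel": 45}

def score(post: Post, now: float) -> float:
    """Toy relevance score: affinity x popularity, decayed by age."""
    affinity = 1 + my_interactions.get(post.author, 0)
    popularity = 1 + math.log1p(post.engagements)
    age_hours = max(0.0, (now - post.posted_at) / 3600)
    decay = 1 / (1 + age_hours / 24)   # older stories fade but don't vanish
    return affinity * popularity * decay

now = time.time()
feed = sorted(
    [Post("denver_post", 500, now - 3600),      # popular, but low affinity
     Post("close_friend", 3, now - 7200),       # obscure, but high affinity
     Post("jezebel", 40, now - 86400)],         # day-old, medium affinity
    key=lambda p: score(p, now),
    reverse=True,
)
```

Even in this crude sketch, a close friend’s barely-Liked status outranks a newspaper’s popular story, and a day-old post from a frequently visited Page can still beat fresher content — the same counterintuitive behavior described in the experiment above.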

But beyond that, not much is known, and the further you dig into what Facebook thinks about you, the more odd things can get.

For example: I lived in Denver until I was 20 and still consider it home. Throughout my day on Facebook, I didn’t see a single story from The Denver Post, even though that Page posted 17 pieces of unique content. The same was true for Westword, a Denver alt-weekly I used to read religiously; a handful of local TV news stations I Like; and high school friends, acquaintances and even people I still consider close friends who live there. Do I not care about my home as much as I thought? Despite letting Facebook track me basically wherever and whenever it wants to, it still doesn’t think I’m interested in Denver or what goes on there?

On the other hand, women-oriented blogs such as Jezebel, Refinery29 and The Cut at times dominated my News Feed, with a whopping 40 posts between them. The Verge, which I thought was among my favorite blogs, barely showed up.

And even as I was doing my experiment, I could see subtle shifts in what appeared, which, in turn, perhaps changes who Facebook thinks I am. Status updates from those same high school friends I hadn’t interacted with in years suddenly started popping up toward the end of the day. The same went for Pages I liked long ago and forgot about, and parties in New York I wasn’t invited to but saw close friends RSVP to.

The day had become an oddly pointed reminder of a past I don’t seem to care about, and a distressing collection of everything I’m missing out on today.

By midnight, after almost six hours of scrolling, refreshing and note-taking throughout the day, I had consumed 1,417 unique events. Posts from July 17 became rare as older posts crept in, and eventually everything I was seeing in my News Feed I had seen before. I had exhausted my well of Facebook content, I thought – a triumph! I had conquered Facebook!

Well, no: I wasn’t even close. After going back to record every single event that happened in my entire network on July 17, I saw that 2,593 pieces of new content had been produced. I saw 738 of them, or about 29%. The other 679 posts that appeared in my News Feed were old news by the time I saw them, sometimes by more than two days.

So that means that after doing everything possible to see all of the activity in my network, I saw less than a third of it. Considering the average U.S. user spends around 40 minutes on Facebook per day – or about one-tenth of the time I spent in my News Feed – it’s easy to imagine that percentage dipping far, far below my 29%.
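The arithmetic behind those figures is worth laying out. A quick sketch using only the numbers reported above (nothing here comes from Facebook itself):

```python
# Back-of-the-envelope check of the News Feed numbers reported in this piece.
# All figures come from the essay, not from any Facebook API.

total_new_posts = 2593   # everything the network produced on July 17
fresh_posts_seen = 738   # new posts that actually reached the News Feed
stale_posts_seen = 679   # posts shown that were already old news

# Total unique events consumed during the day of scrolling
consumed = fresh_posts_seen + stale_posts_seen
print(consumed)          # 1417, matching the count above

# Share of the network's new activity that was actually surfaced
share = fresh_posts_seen / total_new_posts
print(f"{share:.1%}")    # a bit under a third, the "about 29%" figure
```

At 40 minutes a day instead of six hours, the fraction of a network's activity a typical user sees would plausibly be far smaller still.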

But that might be the point.

Greg Marra, a product manager on News Feed at Facebook, told me that it is fundamentally a reflection of the user and his or her interests.

“News Feed is made by you,” Marra said. “It tries to show the most interesting things possible for you, it’s a very personalized system,” he said, adding, “We try to let users take control.”

Marra said there are countless signals that tell Facebook what to pump into a person’s News Feed, including relationships with other users, the topic of content in a given link, how long a user spends reading a story he or she found through Facebook, if and how many times X user visits Y user’s profile, friends’ activity on a certain post, all of our previous activity and more.

“We learn based on what you’ve done in the past,” Marra said. “And we try to quickly learn about the things that you’re interested in.”

(Remember that Facebook’s learning can sometimes result in disastrous PR.)

So after a full day spent on Facebook, what was I left with? In the end, not much. A heap of work for myself to complete this story; a still-muddled understanding of how News Feed works; and a slightly different view of what I think I care about.

Fittingly enough: The final post I saw on my Endless Day of Facebook was a status update about a flash flood warning that was more than 40 hours old.

It was for Denver.

http://www.washingtonpost.com/news/the-intersect/wp/2014/08/18/what-facebook-doesnt-show-you/?Post+generic=%3Ftid%3Dsm_twitter_washingtonpost

 

Facebook, email and the neuroscience of always being distracted

I used to be able to read for hours without digital interruption. Now? That’s just funny. I want my focus back!


I’m enough of a distraction addict that a low-level ambient guilt about not getting my real work done hovers around me for most of the day. And this distractible quality in me pervades every part of my life. The distractions—What am I making for dinner?, Who was that woman in “Fargo”?, or, quite commonly, What else should I be reading?—are invariably things that can wait. What, I wonder, would I be capable of doing if I weren’t constantly worrying about what I ought to be doing?

And who is this frumpy thirty-something man who has tried to read “War and Peace” five times, never making it past the garden gate? I took the tome down from the shelf this morning and frowned again at those sad little dog-ears near the fifty-page mark.

Are the luxuries of time on which deep reading is reliant available to us anymore? Even the attention we deign to give to our distractions, those frissons, is narrowing.

It’s important to note this slippage. As a child, I would read for hours in bed without the possibility of a single digital interruption. Even the phone (which was anchored by wires to the kitchen wall downstairs) was generally mute after dinner. Our two hours of permitted television would come to an end, and I would seek out the solitary refuge of a novel. And deep reading (as opposed to reading a Tumblr feed) was a true refuge. What I liked best about that absorbing act was the fact that books became a world unto themselves, one that I (an otherwise powerless kid) had some control over. There was a childish pleasure in holding the mysterious object in my hands; in preparing for the story’s finale by monitoring what Austen called a “tell-tale compression of the pages”; in proceeding through some perfect sequence of plot points that bested by far the awkward happenstance of real life.

The physical book, held, knowable, became a small mental apartment I could have dominion over, something that was alive because of my attention and then lived in me.

But now . . . that thankful retreat, where my child-self could become so lost, seems unavailable to me. Today there is no room in my house, no block in my city, where I am unreachable.

Eventually, if we start giving them a chance, moments of absence reappear, and we can pick them up if we like. One appeared this morning, when my partner flew to Paris. He’ll be gone for two weeks. I’ll miss him, but this is also my big break.



I’ve taken “War and Peace” back down off the shelf. It’s sitting beside my computer as I write these lines—accusatory as some attention-starved pet.

You and me, old friend. You, me, and two weeks. I open the book, I shut the book, and I open the book again. The ink swirls up at me. This is hard. Why is this so hard?

* * *

Dr. Douglas Gentile, a friendly professor at Iowa State University, recently commiserated with me about my pathetic attention span. “It’s me, too, of course,” he said. “When I try to write a paper, I can’t keep from checking my e-mail every five minutes. Even though I know it’s actually making me less productive.” This failing is especially worrying for Gentile because he happens to be one of the world’s leading authorities on the effects of media on the brains of the young. “I know, I know! I know all the research on multitasking. I can tell you absolutely that everyone who thinks they’re good at multitasking is wrong. We know that in fact it’s those who think they’re good at multitasking who are the least productive when they multitask.”

The brain itself is not, whatever we may like to believe, a multitasking device. And that is where our problem begins. Your brain does a certain amount of parallel processing in order to synthesize auditory and visual information into a single understanding of the world around you, but the brain’s attention is itself only a spotlight, capable of shining on one thing at a time. So the very word multitask is a misnomer. There is rapid-shifting minitasking, there is lame-spasms-of-effort-tasking, but there is, alas, no such thing as multitasking. “When we think we’re multitasking,” says Gentile, “we’re actually multiswitching.”

We can hardly blame ourselves for being enraptured by the promise of multitasking, though. Computers—like televisions before them—tap into a very basic brain function called an “orienting response.” Orienting responses served us well in the wilderness of our species’ early years. When the light changes in your peripheral vision, you must look at it because that could be the shadow of something that’s about to eat you. If a twig snaps behind you, ditto. Having evolved in an environment rife with danger and uncertainty, we are hardwired to always default to fast-paced shifts in focus. Orienting responses are the brain’s ever-armed alarm system and cannot be ignored.

Gentile believes it’s time for a renaissance in our understanding of mental health. To begin with, just as we can’t accept our body’s cravings for chocolate cake at face value, neither can we any longer afford to indulge the automatic desires our brains harbor for distraction.

* * *

It’s not merely difficult at first. It’s torture. I slump into the book, reread sentences, entire paragraphs. I get through two pages and then stop to check my e-mail—and down the rabbit hole I go. After all, one does not read “War and Peace” so much as suffer through it. It doesn’t help that the world at large, being so divorced from such pursuits, is often aggressive toward those who drop away into single-subject attention wells. People don’t like it when you read “War and Peace.” It’s too long, too boring, not worth the effort. And you’re elitist for trying.

In order to finish the thing in the two weeks I have allotted myself, I must read one hundred pages each day without fail. If something distracts me from my day’s reading—a friend in the hospital, a magazine assignment, sunshine—I must read two hundred pages on the following day. I’ve read at this pace before, in my university days, but that was years ago and I’ve been steadily down-training my brain ever since.

* * *

Another week has passed—my “War and Peace” struggle continues. I’ve realized now that the subject of my distraction is far more likely to be something I need to look at than something I need to do. There have always been activities—dishes, gardening, sex, shopping—that derail whatever purpose we’ve assigned to ourselves on a given day. What’s different now is the addition of so much content that we passively consume.

Only this morning I watched a boy break down crying on “X Factor,” then regain his courage and belt out a half-decent rendition of Beyoncé’s “Listen”; next I looked up the original Beyoncé video and played it twice while reading the first few paragraphs of a story about the humanity of child soldiers; then I switched to a Nina Simone playlist prepared for me by Songza, which played while I flipped through a slide show of American soldiers seeing their dogs for the first time in years; and so on, ad nauseam. Until I shook myself out of this funk and tried to remember what I’d sat down to work on in the first place.

* * *

If I’m to break from our culture of distraction, I’m going to need practical advice, not just depressing statistics. To that end, I switch gears and decide to stop talking to scientists for a while; I need to talk to someone who deals with attention and productivity in the so-called real world. Someone with a big smile and tailored suits, such as organizational guru Peter Bregman. He runs a global consulting firm that gets CEOs to unleash the potential of their workers, and he’s also the author of the acclaimed business book “18 Minutes,” which counsels readers to take a minute out of every work hour (plus five minutes at the start and end of the day) to do nothing but set an intention.

Bregman told me he sets his watch to beep every hour as a reminder that it’s time to right his course again. Aside from the intention setting, Bregman counsels no more than three e-mail check-ins a day. This notion of batch processing was anathema to someone like me, used to checking my in-box so constantly, particularly when my work feels stuck. “It’s incredibly inefficient to switch back and forth,” said Bregman, echoing every scientist I’d spoken to on multitasking. “Besides, e-mail is, actually, just about the least efficient mode of conversation you can have. And what we know about multitasking is that, frankly, you can’t. You just derail.”

“I just always feel I’m missing something important,” I said. “And that’s precisely why we lose hours every day, that fear,” Bregman replied. He argues that it’s people who can get ahead of that fear who end up excelling in the business world he spends his own days in. “I think everyone is more distractible today than we used to be. It’s a very hard thing to fix. And as people become more distracted, we know they’re actually doing less, getting less done. Your efforts just leak out. And those who aren’t—aren’t leaking—are going to be the most successful.”

I hate that I leak. But there’s a religious certainty required in order to devote yourself to one thing while cutting off the rest of the world. We don’t know that the inbox is emergency-free; we don’t know that the work we’re doing is the work we ought to be doing. But we can’t move forward in a sane way without having some faith in the moment we’ve committed to. “You need to decide that things don’t matter as much as you might think they matter,” Bregman suggested as I told him about my flitting ways. And that made me think there might be a connection between the responsibility-free days of my youth and that earlier self’s ability to concentrate. My young self had nowhere else to be, no permanent anxiety nagging at his conscience. Could I return to that sense of ease? Could I simply be where I was and not seek out a shifting plurality to fill up my time?

* * *

It happened softly and without my really noticing.

As I wore a deeper groove into the cushions of my sofa, so the book I was holding wore a groove into my (equally soft) mind. Moments of total absence began to take hold more often; I remembered what it was like to be lost entirely in a well-spun narrative. There was the scene where Anna Mikhailovna begs so pitifully for a little money, hoping to send her son to war properly dressed. And there were, increasingly, more like it. More moments where the world around me dropped away and I was properly absorbed. A “causeless springtime feeling of joy” overtakes Prince Andrei; a tearful Pierre sees in a comet his last shimmering hope; Emperor Napoleon takes his troops into the heart of Russia, oblivious to the coming winter that will destroy them all…

It takes a week or so for withdrawal symptoms to work through a heroin addict’s body. While I wouldn’t pretend to compare severity here, doubtless we need patience, too, when we deprive ourselves of the manic digital distractions we’ve grown addicted to.

That’s how it was with my Tolstoy and me. The periods without distraction grew longer, I settled into the sofa and couldn’t hear the phone, couldn’t hear the ghost-buzz of something else to do. I’m teaching myself to slip away from the world again.

* * *

Yesterday I fell asleep on the sofa with a few dozen pages of “War and Peace” to go. I could hear my cell phone buzzing from its perch on top of the piano. I saw the glowing green eye of my Cyclops modem as it broadcast potential distraction all around. But on I went past the turgid military campaigns and past the fretting of Russian princesses, until sleep finally claimed me and my head, exhausted, dreamed of nothing at all. This morning I finished the thing at last. The clean edges of its thirteen hundred pages have been ruffled down into a paper cabbage, the cover is pilled from the time I dropped it in the bath. Holding the thing aloft, trophy style, I notice the book is slightly larger than it was before I read it.

It’s only after the book is laid down, and I’ve quietly showered and shaved, that I realize I haven’t checked my e-mail today. The thought of that duty comes down on me like an anvil.

Instead, I lie back on the sofa and think some more about my favorite reader, Milton – about his own anxieties around reading. By the mid-1650s, he had suffered that larger removal from the crowds: he had lost his vision entirely and could not read at all—at least not with his own eyes. From within this new solitude, he worried that he could no longer meet his potential. One sonnet, written shortly after the loss of his vision, begins:

When I consider how my light is spent,

Ere half my days, in this dark world and wide,

And that one Talent which is death to hide

Lodged with me useless . . .

Yet from that position, in the greatest of caves, he began producing his greatest work. The epic “Paradise Lost,” a totemic feat of concentration, was dictated to aides, including his three daughters.

Milton already knew, after all, the great value in removing himself from the rush of the world, so perhaps those anxieties around his blindness never had a hope of dominating his mind. I, on the other hand, and all my peers, must make a constant study of concentration itself. I slot my ragged “War and Peace” back on the shelf. It left its marks on me the same way I left my marks on it (I feel awake as a man dragged across barnacles on the bottom of some ocean). I think: This is where I was most alive, most happy. How did I go from loving that absence to being tortured by it? How can I learn to love that absence again?

This essay is adapted from “The End of Absence” by Michael Harris, published by Current / Penguin Random House.

 

http://www.salon.com/2014/08/17/war_and_peace_tortured_me_facebook_email_and_the_neuroscience_of_always_being_distracted/?source=newsletter

Eight (No, Nine!) Problems With Big Data


BIG data is suddenly everywhere. Everyone seems to be collecting it, analyzing it, making money from it and celebrating (or fearing) its powers. Whether we’re talking about analyzing zillions of Google search queries to predict flu outbreaks, or zillions of phone records to detect signs of terrorist activity, or zillions of airline stats to find the best time to buy plane tickets, big data is on the case. By combining the power of modern computing with the plentiful data of the digital era, it promises to solve virtually any problem — crime, public health, the evolution of grammar, the perils of dating — just by crunching the numbers.

Or so its champions allege. “In the next two decades,” the journalist Patrick Tucker writes in the latest big data manifesto, “The Naked Future,” “we will be able to predict huge areas of the future with far greater accuracy than ever before in human history, including events long thought to be beyond the realm of human inference.” Statistical correlations have never sounded so good.

Is big data really all it’s cracked up to be? There is no doubt that big data is a valuable tool that has already had a critical impact in certain areas. For instance, almost every successful artificial intelligence computer program in the last 20 years, from Google’s search engine to the I.B.M. “Jeopardy!” champion Watson, has involved the substantial crunching of large bodies of data. But precisely because of its newfound popularity and growing use, we need to be levelheaded about what big data can — and can’t — do.

The first thing to note is that although big data is very good at detecting correlations, especially subtle correlations that an analysis of smaller data sets might miss, it never tells us which correlations are meaningful. A big data analysis might reveal, for instance, that from 2006 to 2011 the United States murder rate was well correlated with the market share of Internet Explorer: Both went down sharply. But it’s hard to imagine there is any causal relationship between the two. Likewise, from 1998 to 2007 the number of new cases of autism diagnosed was extremely well correlated with sales of organic food (both went up sharply), but identifying the correlation won’t by itself tell us whether diet has anything to do with autism.
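The point about correlation without causation is easy to reproduce: any two series that trend in the same direction over the same years will correlate strongly, causal link or not. A minimal sketch, using made-up numbers rather than the actual murder-rate or browser-share statistics:

```python
import numpy as np

# Two invented, steadily declining yearly series for 2006-2011.
# Neither has anything to do with the other; they merely share a trend.
years = np.arange(2006, 2012)
murder_rate = np.array([5.8, 5.7, 5.4, 5.0, 4.8, 4.7])    # hypothetical values
ie_share = np.array([65.0, 58.0, 50.0, 44.0, 39.0, 35.0])  # hypothetical values

# Pearson correlation between the two unrelated series
r = np.corrcoef(murder_rate, ie_share)[0, 1]
print(round(r, 3))  # close to 1.0: a strong correlation, and a meaningless one
```

The correlation coefficient comes out near 1.0 simply because both series decline; the data alone cannot tell you which correlations deserve a causal story.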

Second, big data can work well as an adjunct to scientific inquiry but rarely succeeds as a wholesale replacement. Molecular biologists, for example, would very much like to be able to infer the three-dimensional structure of proteins from their underlying DNA sequence, and scientists working on the problem use big data as one tool among many. But no scientist thinks you can solve this problem by crunching data alone, no matter how powerful the statistical analysis; you will always need to start with an analysis that relies on an understanding of physics and biochemistry.

Third, many tools that are based on big data can be easily gamed. For example, big data programs for grading student essays often rely on measures like sentence length and word sophistication, which are found to correlate well with the scores given by human graders. But once students figure out how such a program works, they start writing long sentences and using obscure words, rather than learning how to actually formulate and write clear, coherent text. Even Google’s celebrated search engine, rightly seen as a big data success story, is not immune to “Google bombing” and “spamdexing,” wily techniques for artificially elevating website search placement.
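A toy version of such a grader makes the gaming concrete. Suppose (purely for illustration; this is not any real grading product's formula) the score is just a blend of average sentence length and average word length:

```python
def naive_essay_score(text):
    """Score an essay on surface features only: longer sentences and
    longer words earn more points, regardless of meaning."""
    sentences = [s for s in text.split('.') if s.strip()]
    words = text.replace('.', ' ').split()
    avg_sentence_len = len(words) / len(sentences)          # words per sentence
    avg_word_len = sum(len(w) for w in words) / len(words)  # letters per word
    return avg_sentence_len + 2 * avg_word_len

# Clear, direct prose versus deliberately inflated prose saying the same thing.
clear = "The data was wrong. We fixed it. The model then worked."
gamed = ("Notwithstanding antecedent inaccuracies pervading the dataset, "
         "remediation procedures facilitated satisfactory model performance.")

print(naive_essay_score(clear) < naive_essay_score(gamed))  # True
```

The padded, jargon-heavy version outscores the clear one, which is exactly the incentive students discover once they learn what the proxy measures.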

Fourth, even when the results of a big data analysis aren’t intentionally gamed, they often turn out to be less robust than they initially seem. Consider Google Flu Trends, once the poster child for big data. In 2009, Google reported — to considerable fanfare — that by analyzing flu-related search queries, it had been able to detect the spread of the flu as accurately as, and more quickly than, the Centers for Disease Control and Prevention. A few years later, though, Google Flu Trends began to falter; for the last two years it has made more bad predictions than good ones.

As a recent article in the journal Science explained, one major contributing cause of the failures of Google Flu Trends may have been that the Google search engine itself constantly changes, such that patterns in data collected at one time do not necessarily apply to data collected at another time. As the statistician Kaiser Fung has noted, collections of big data that rely on web hits often merge data that was collected in different ways and with different purposes — sometimes to ill effect. It can be risky to draw conclusions from data sets of this kind.

A fifth concern might be called the echo-chamber effect, which also stems from the fact that much of big data comes from the web. Whenever the source of information for a big data analysis is itself a product of big data, opportunities for vicious cycles abound. Consider translation programs like Google Translate, which draw on many pairs of parallel texts from different languages — for example, the same Wikipedia entry in two different languages — to discern the patterns of translation between those languages. This is a perfectly reasonable strategy, except for the fact that with some of the less common languages, many of the Wikipedia articles themselves may have been written using Google Translate. In those cases, any initial errors in Google Translate infect Wikipedia, which is fed back into Google Translate, reinforcing the error.

A sixth worry is the risk of too many correlations. If you look 100 times for correlations between two variables, you risk finding, purely by chance, about five bogus correlations that appear statistically significant — even though there is no actual meaningful connection between the variables. Absent careful supervision, the magnitudes of big data can greatly amplify such errors.
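That back-of-the-envelope claim (roughly five false positives per hundred tests at the conventional 0.05 threshold) can be simulated directly with variables that are unrelated by construction:

```python
import numpy as np

rng = np.random.default_rng(0)

n_tests = 100    # pairs of variables to compare
n_samples = 30   # observations per variable

# For n = 30, |r| > 0.361 is roughly the two-tailed p < 0.05 cutoff
# (a standard critical value for the Pearson correlation).
critical_r = 0.361

false_positives = 0
for _ in range(n_tests):
    x = rng.normal(size=n_samples)   # independent noise
    y = rng.normal(size=n_samples)   # more independent noise
    r = np.corrcoef(x, y)[0, 1]
    if abs(r) > critical_r:
        false_positives += 1

# Expect about 5 "significant" correlations out of 100 -- all of them bogus.
print(false_positives)
```

Scale the number of candidate variable pairs up to big data proportions and, absent corrections for multiple comparisons, the absolute number of bogus "discoveries" grows with it.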

Seventh, big data is prone to giving scientific-sounding solutions to hopelessly imprecise questions. In the past few months, for instance, there have been two separate attempts to rank people in terms of their “historical importance” or “cultural contributions,” based on data drawn from Wikipedia. One is the book “Who’s Bigger? Where Historical Figures Really Rank,” by the computer scientist Steven Skiena and the engineer Charles Ward. The other is an M.I.T. Media Lab project called Pantheon.

Both efforts get many things right — Jesus, Lincoln and Shakespeare were surely important people — but both also make some egregious errors. “Who’s Bigger?” claims that Francis Scott Key was the 19th most important poet in history; Pantheon has claimed that Nostradamus was the 20th most important writer in history, well ahead of Jane Austen (78th) and George Eliot (380th). Worse, both projects suggest a misleading degree of scientific precision with evaluations that are inherently vague, or even meaningless. Big data can reduce anything to a single number, but you shouldn’t be fooled by the appearance of exactitude.

FINALLY, big data is at its best when analyzing things that are extremely common, but often falls short when analyzing things that are less common. For instance, programs that use big data to deal with text, such as search engines and translation programs, often rely heavily on something called trigrams: sequences of three words in a row (like “in a row”). Reliable statistical information can be compiled about common trigrams, precisely because they appear frequently. But no existing body of data will ever be large enough to include all the trigrams that people might use, because of the continuing inventiveness of language.

To select an example more or less at random, a book review that the actor Rob Lowe recently wrote for this newspaper contained nine trigrams such as “dumbed-down escapist fare” that had never before appeared anywhere in all the petabytes of text indexed by Google. To witness the limitations that big data can have with novelty, Google-translate “dumbed-down escapist fare” into German and then back into English: out comes the incoherent “scaled-flight fare.” That is a long way from what Mr. Lowe intended — and from big data’s aspirations for translation.
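The trigram idea itself is simple to illustrate. A sketch of extracting trigrams from text and checking them against a tiny, hypothetical table standing in for the enormous ones real systems compile:

```python
def trigrams(text):
    """Return every sequence of three consecutive words in the text."""
    words = text.lower().split()
    return [tuple(words[i:i + 3]) for i in range(len(words) - 2)]

# A tiny stand-in for a real system's trigram statistics table.
seen_before = {("in", "a", "row"), ("three", "words", "in")}

for tri in trigrams("sequences of three words in a row"):
    print(tri, tri in seen_before)

# A novel phrase produces a trigram no table has statistics for:
print(trigrams("dumbed-down escapist fare"))
# [('dumbed-down', 'escapist', 'fare')]
```

Common trigrams hit the table and get reliable statistics; a freshly coined phrase like Mr. Lowe’s misses entirely, which is why the round-trip translation comes out mangled.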

Wait, we almost forgot one last problem: the hype. Champions of big data promote it as a revolutionary advance. But even the examples that people give of the successes of big data, like Google Flu Trends, though useful, are small potatoes in the larger scheme of things. They are far less important than the great innovations of the 19th and 20th centuries, like antibiotics, automobiles and the airplane.

Big data is here to stay, as it should be. But let’s be realistic: It’s an important resource for anyone analyzing data, not a silver bullet.
