Neglecting the Lessons of Cypherpunk History


Over the course of the Snowden revelations there have been a number of high profile figures who’ve praised the merits of encryption as a remedy to the quandary of mass interception. Companies like Google and Apple have been quick to publicize their adoption of cryptographic countermeasures in an effort to maintain quarterly earnings. This marketing campaign has even convinced less credulous onlookers like Glenn Greenwald. For example, in a recent Intercept piece, Greenwald claimed:

“It is well-established that, prior to the Snowden reporting, Silicon Valley companies were secret, eager and vital participants in the growing Surveillance State. Once their role was revealed, and they perceived those disclosures threatening to their future profit-making, they instantly adopted a PR tactic of presenting themselves as Guardians of Privacy. Much of that is simply self-serving re-branding, but some of it, as I described last week, are genuine improvements in the technological means of protecting user privacy, such as the encryption products now being offered by Apple and Google, motivated by the belief that, post-Snowden, parading around as privacy protectors is necessary to stay competitive.”

So, while he concedes the role of public relations in the ongoing cyber security push, Greenwald concurrently believes encryption is a “genuine” countermeasure. In other words, what we’re seeing is mostly marketing hype… except for the part about strong encryption.

With regard to the promise of encryption as a privacy cure-all, history tells a markedly different story. Guarantees of security through encryption have often proven illusory, a magic act. Seeking refuge in a technical quick fix can be hazardous for a number of reasons.

Monolithic corporations aren’t our saviors — they’re the central part of the problem.

Tech Companies Are Peddling a Phony Version of Security, Using the Govt. as the Bogeyman

This week the USA Freedom Act was blocked in the Senate as it failed to garner the 60 votes required to move forward. Presumably the bill would have imposed limits on NSA surveillance. Careful scrutiny of the bill's text, however, reveals yet another mere gesture of reform, one that would codify and entrench existing surveillance capabilities rather than eliminate them.

Glenn Greenwald, commenting from his perch at the Intercept, opined:

“All of that illustrates what is, to me, the most important point from all of this: the last place one should look to impose limits on the powers of the U.S. government is . . . the U.S. government. Governments don’t walk around trying to figure out how to limit their own power, and that’s particularly true of empires.”

Anyone who followed the sweeping deregulation of the financial industry during the Clinton era (the Gramm–Leach–Bliley Act of 1999, which effectively repealed Glass-Steagall, and the Commodity Futures Modernization Act of 2000) immediately sees through Greenwald's impromptu dogma. Let's not forget the energy market deregulation in California and the subsequent manipulation that resulted in blackouts throughout the state. Ditto that for the latest rollback of arms export controls that opened up markets for the defense industry. And never mind all those hi-tech companies that want to loosen H1-B restrictions.

The truth is that the government is more than happy to cede power and authority… just as long as doing so serves the corporate factions that have achieved state capture. The “empire” Greenwald speaks of is a corporate empire. In concrete analytic results that affirm Thomas Ferguson’s Investment Theory of Party Competition, researchers from Princeton and Northwestern University conclude that:

“Multivariate analysis indicates that economic elites and organized groups representing business interests have substantial independent impacts on U.S. government policy, while average citizens and mass-based interest groups have little or no independent influence.”

Glenn’s stance reveals a broader libertarian theme. One that the Koch brothers would no doubt find amenable: the government is suspect and efforts to rein in mass interception must therefore arise from the corporate entities. Greenwald appears to believe that the market will solve everything. Specifically, he postulates that consumer demand for security will drive companies to offer products that protect user privacy, adopt “strong” encryption, etc.

The Primacy of Security Theater

Certainly large hi-tech companies care about quarterly earnings. That definitely explains all of the tax evasion, wage ceilings, and the slave labor. But these same companies would be hard pressed to actually protect user privacy because spying on users is a fundamental part of their business model. Like government spies, corporate spies collect and monetize oceans of data.

Furthermore, hi-tech players don't need to actually bullet-proof their products to win back customers. It's far more cost-effective to simply manufacture the perception of better security: slap on some crypto, flood the news with public relations pieces, and get some government officials (e.g. James Comey, Robert Hannigan, and Stewart Baker) to whine visibly about the purported enhancements in order to lend the marketing campaign credibility. The techno-libertarians of Silicon Valley are masters of Security Theater.

Witness, if you will, Microsoft’s litany of assurances about security over the years, followed predictably by an endless train of critical zero-day bugs. Faced with such dissonance it becomes clear that “security” in high-tech is viewed as a public relations issue, a branding mechanism to boost profits. Greenwald is underestimating the contempt that CEOs have for the credulity of their user base, much less their own workers.

Does allegedly “strong” cryptography offer salvation? Cryptome’s John Young thinks otherwise:

“Encryption is a citizen fraud, bastard progeny of national security, which offers malware insecurity requiring endless ‘improvements’ to correct the innately incorrigible. Its advocates presume it will empower users rather than subject them to ever more vulnerability to shady digital coders complicit with dark coders of law in exploiting fear, uncertainty and doubt.”

Business interests, having lured customers in droves with a myriad of false promises, will go back to secretly cooperating with government spies as they always have: introducing subtle weaknesses into cryptographic protocols, designing backdoors that double as accidental zero-day bugs, building rootkits which hide in plain sight, and handing over user data. In other words all of the behavior that was described by Edward Snowden’s documents. Like a jilted lover, consumers will be pacified with a clever sales pitch that conceals deeper corporate subterfuge.
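The "backdoor that doubles as an accidental zero-day bug" usually takes the form of a quietly crippled source of randomness, as in the 2008 Debian OpenSSL incident, where a patch left keys effectively derived from a process ID. The toy sketch below (illustrative numbers only, not the actual bug) shows why such keys look perfectly normal to users yet are trivially enumerable by anyone in on the flaw:

```python
# Toy illustration of a "subtle weakness": keys derived from a seed with
# only 4,096 possible values look random, but anyone who knows about the
# flaw can enumerate the entire keyspace in a fraction of a second.
import hashlib

def weak_keygen(process_id: int) -> bytes:
    # The only entropy is a value in range(4096) -- far too small,
    # though the output is an ordinary-looking 256-bit key.
    return hashlib.sha256(process_id.to_bytes(2, "big")).digest()

victim_key = weak_keygen(1234)  # indistinguishable from a strong key

# An insider who knows the weakness recovers it by brute force:
recovered = next(weak_keygen(pid) for pid in range(4096)
                 if weak_keygen(pid) == victim_key)
```

To an auditor without the tip-off, the weakness reads as a plausible coding mistake, which is precisely what makes this class of backdoor deniable.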

Ultimately it’s a matter of shared class interest. The private sector almost always cooperates with the intelligence services because American spies pursue the long-term prerogatives of neoliberal capitalism: open markets and access to resources the world over. Or perhaps someone has forgotten the taped phone call of Victoria Nuland selecting the next prime minister of Ukraine as the IMF salivates over austerity measures? POTUS caters to his constituents, the corporate ruling class, which transitively conveys its wishes to clandestine services like the CIA. Recall Ed Snowden’s open letter to Brazil:

“These programs were never about terrorism: they’re about economic spying, social control, and diplomatic manipulation. They’re about power.”

To confront the Deep State Greenwald is essentially advocating that we elicit change by acting like consumers instead of constitutionally endowed citizens. This is a grave mistake because profits can be decoupled from genuine security in a society defined by secrecy, propaganda, and state capture. Large monolithic corporations aren’t our saviors. They’re the central part of the problem. We shouldn’t run to the corporate elite to protect us. We should engage politically to retake and remake our republic.


Bill Blunden is an independent investigator whose current areas of inquiry include information security, anti-forensics, and institutional analysis.

You should actually blame America for everything you hate about internet culture

November 21

The tastes of American Internet-users are both well-known and much-derided: Cat videos. Personality quizzes. Lists of things that only people from your generation/alma mater/exact geographic area “understand.”

But in France, it turns out, even viral-content fiends are a bit more … sophistiqués.

“In France, articles about cats do not work,” Buzzfeed’s Scott Lamb told Le Figaro, a leading Parisian paper. Instead, he explained, Buzzfeed’s first year in the country has shown it that “the French love sharing news and politics on social networks – in short, pretty serious stuff.”

This is interesting for two reasons: first, as conclusive proof that the French are irredeemable snobs; second, as a crack in the glossy, understudied facade of what we commonly call “Internet culture.”

When the New York Times’s David Pogue tried to define the term in 2009, he ended up with a series of memes: the “Star Wars” kid, the dancing baby, rickrolling, the exploding whale. Likewise, if you look to anyone who claims to cover the Internet culture space — not only Buzzfeed, but Mashable, Gawker and, yeah, yours truly — their coverage frequently plays on what Lamb calls the “cute and positive” theme. They’re boys who work at Target and have swoopy hair, videos of babies acting like “tiny drunk adults,” hamsters eating burritos and birthday cakes.

That is the meaning we’ve assigned to “Internet culture,” itself an ambiguous term: It’s the fluff and the froth of the global Web.

But Lamb’s observations on Buzzfeed’s international growth would actually seem to suggest something different. Cat memes and other frivolities aren’t the work of an Internet culture. They’re the work of an American one.

American audiences love animals and “light content,” Lamb said, but readers in other countries have reacted differently. Germans were skeptical of the site’s feel-good frivolity, he said, and some Australians were outright “hostile.” Meanwhile, in France — land of la mode and le Michelin — critics immediately complained, right at Buzzfeed’s French launch, that the articles were too fluffy and poorly translated. Instead, Buzzfeed quickly found that readers were more likely to share articles about news, politics and regional identity, particularly in relation to the loved/hated Paris, than they were to share the site’s other fare.

A glance at Buzzfeed’s French page would appear to bear that out. Right now, its top stories “Ça fait le buzz” — that’s making the buzz, for you Américains — are “21 photos that will make you laugh every time” and “26 images that will make you rethink your whole life.” They’re not making much buzz, though. Neither has earned more than 40,000 clicks — a pittance for the reigning king of virality, particularly in comparison to Buzzfeed’s versions on the English site.

All this goes to show that the things we term “Internet culture” are not necessarily born of the Internet, itself — the Internet is everywhere, but the insatiable thirst for cat videos is not. If you want to complain about dumb memes or clickbait or other apparent instances of socially sanctioned vapidity, blame America: We started it, not the Internet.

Appelons un chat un chat.

Caitlin Dewey runs The Intersect blog, writing about digital and Internet culture. Before joining the Post, she was an associate online editor at Kiplinger’s Personal Finance.

William Gibson: I never imagined Facebook

The brilliant science-fiction novelist who imagined the Web tells Salon how writers missed social media’s rise


Even if you’ve never heard of William Gibson, you’re probably familiar with his work. Arguably the most important sci-fi writer of his generation, Gibson’s cyber-noir imagination has shaped everything from the Matrix aesthetic to geek culture to the way we conceptualize virtual reality. In a 1982 short story, Gibson coined the term “cyberspace.” Two years later, his first and most famous novel, “Neuromancer,” helped launch the cyberpunk genre. By the 1990s, Gibson was writing about big data, imagining Silk Road-esque Internet enclaves, and putting his characters on reality TV shows — a full four years before the first episode of “Big Brother.”

Prescience is flashy, but Gibson is less an oracle than a kind of speculative sociologist. A very contemporary flavor of dislocation seems to be his specialty. Gibson’s heroes shuttle between wildly discordant worlds: virtual paradises and physical squalor; digital landscapes and crumbling cities; extravagant wealth and poverty.

In his latest novel, “The Peripheral,” which came out on Tuesday, Gibson takes this dislocation to new extremes. Set in mid-21st century Appalachia and far-in-the-future London, “The Peripheral” is partly a murder mystery, and partly a time-travel mind-bender. Gibson’s characters aren’t just dislocated in space, now. They’ve become unhinged from history.

Born in South Carolina, Gibson has lived in Vancouver since the 1960s. Over the phone, we spoke about surveillance, celebrity and the concept of the eternal now.

You’re famous for writing about hackers, outlaws and marginal communities. But one of the heroes of “The Peripheral” is a near-omniscient intelligence agent. She has surveillance powers that the NSA could only dream of. Should I be surprised to see you portray that kind of character so positively?

Well, I don’t know. She’s complicated, because she is this kind of terrifying secret police person in the service of a ruthless global kleptocracy. At the same time, she seems to be slightly insane and rather nice. It’s not that I don’t have my serious purposes with her, but at the same time she’s something of a comic turn.

Her official role is supposed to be completely terrifying, but at the same time her role is not a surprise. It’s not like, “Wow, I never even knew that that existed.”

Most of the characters in “The Peripheral” assume that they’re being monitored at all times. That assumption is usually correct. As a reader, I was disconcerted by how natural this state of constant surveillance felt to me.

I don’t know if it would have been possible 30 years ago to convey that sense to the reader effectively, without the reader already having some sort of cultural module in place that can respond to that. If we had somehow been able to read this text 30 years ago, I don’t know how we would even register that. It would be a big thing for a reader to get their head around without a lot of explaining. It’s a scary thing, the extent to which I don’t have to explain why [the characters] take that surveillance for granted. Everybody just gets it.

You’re considered a founder of the cyberpunk genre, which tends to feature digital cowboys — independent operators working on the frontiers of technology. Is the counterculture ethos of cyberpunk still relevant in an era when the best hackers seem to be working for the Chinese and U.S. governments, and our most famous digital outlaw, Edward Snowden, is under the protection of Vladimir Putin?

It’s seemed to me for quite a while now that the most viable use for the term “cyberpunk” is in describing artifacts of popular culture. You can say, “Did you see this movie? No? Well, it’s really cyberpunk.” Or, “Did you see the cyberpunk pants she was wearing last night?”

People know what you’re talking about, but it doesn’t work so well describing human roles in the world today. We’re more complicated. I think one of the things I did in my early fiction, more or less for effect, was to depict worlds where there didn’t really seem to be much government. In “Neuromancer,” for example, there’s no government really on the case of these rogue AI experiments that are being done by billionaires in orbit. If I had been depicting a world in which there were governments and law enforcement, I would have depicted hackers on both sides of the fence.

In “Neuromancer,” I don’t think there’s any evidence of anybody who has any parents. It’s kind of a very adolescent book that way.

In “The Peripheral,” governments are involved on both sides of the book’s central conflict. Is that a sign that you’ve matured as a writer? Or are you reflecting changes in how governments operate?

I hope it’s both. This book probably has, for whatever reason, more of my own, I guess I could now call it adult, understanding of how things work. Which, I suspect, is as it should be. People in this book live under governments, for better or worse, and have parents, for better or worse.

In 1993, you wrote an influential article about Singapore for Wired magazine, in which you wondered whether the arrival of new information technology would make the country more free, or whether Singapore would prove that “it is possible to flourish through the active repression of free expression.” With two decades of perspective, do you feel like this question has been answered?

Well, I don’t know, actually. The question was, when I asked it, naive. I may have posed innocently a false dichotomy, because some days when you’re looking out at the Internet both things are possible simultaneously, in the same place.

So what do you think is a better way to phrase that question today? Or what would have been a better way to phrase it in 1993?

I think you would end with something like “or is this just the new normal?”

Is there anything about “the new normal” in particular that surprises you? What about the Internet today would you have been least likely to foresee?

It’s incredible, the ubiquity. I definitely didn’t foresee the extent to which we would all be connected almost all of the time without needing to be plugged in.

That makes me think of “Neuromancer,” in which the characters are always having to track down a physical jack, which they then use to plug themselves into this hyper-futuristic Internet.

Yes. It’s funny, when the book was first published, when it was just out — and it was not a big deal the first little while it was out, it was just another paperback original — I went to a science fiction convention. There were guys there who were, by the standards of 1984, far more computer-literate than I was. And they very cheerfully told me that I got it completely wrong, and I knew nothing. They kept saying over and over, “There’s never going to be enough bandwidth, you don’t understand. This could never happen.”

So, you know, here I am, this many years later with this little tiny flat thing in my hand that’s got more bandwidth than those guys thought was possible for a personal device to ever have, and the book is still resonant for at least some new readers, even though it’s increasingly hung with the inevitable obsolescence of having been first published in 1984. Now the resonance isn’t really in the particulars, but in the broader outline.

You wrote “Neuromancer” on a 1927 Hermes typewriter. In an essay of yours from the mid-1990s, you specifically mention choosing not to use email. Does being a bit removed from digital culture help you critique it better? Or do you feel that you’re immersed in that culture, now?

I no longer have the luxury of being as removed from it as I was then. I was waiting for it to come to me. When I wrote [about staying off email], there was a learning curve involved in using email, a few years prior to the Web.

As soon as the Web arrived, I was there, because there was no learning curve. The interface had been civilized, and I’ve basically been there ever since. But I think I actually have a funny kind of advantage, in that I’m not generationally of [the Web]. Just being able to remember the world before it, some of the perspectives are quite interesting.

Drones and 3-D printing play major roles in “The Peripheral,” but social networks, for the most part, are obsolete in the book’s fictional future. How do you choose which technological trends to amplify in your writing, and which to ignore?

It’s mostly a matter of which ones I find most interesting at the time of writing. And the absence of social media in both those futures probably has more to do with my own lack of interest in that. It would mean a relatively enormous amount of work to incorporate social media into both those worlds, because it would all have to be invented and extrapolated.

Your three most recent novels, before “The Peripheral,” take place in some version of the present. You’re now returning to the future, which is where you started out as a writer in the 1980s. Futuristic sci-fi often feels more like cultural criticism of the present than an exercise in prediction. What is it about the future that helps us reflect on the contemporary world?

When I began to write science fiction, I already assumed that science fiction about the future is only ostensibly written about the future, that it’s really made of the present. Science fiction has wound up with a really good cultural toolkit — an unexpectedly good cultural toolkit — for taking apart the present and theorizing on how it works, in the guise of presenting an imagined future.

The three previous books were basically written to find out whether or not I could use the toolkit that I’d acquired writing fictions about imaginary futures on the present, but use it for more overtly naturalistic purposes. I have no idea at this point whether my next book will be set in an imaginary future or the contemporary present or the past.

Do you feel as if sci-fi has actually helped dictate the future? I was speaking with a friend earlier about this, and he phrased the question well: Did a book like “Neuromancer” predict the future, or did it establish a dress code for it? In other words, did it describe a future that people then tried to live out?

I think that the two halves of that are in some kind of symbiotic relationship with one another. Science fiction ostensibly tries to predict the future. And the people who wind up making the future sometimes did what they did because they read a piece of science fiction. “Dress code” is an interesting way to put it. It’s more like … it’s more like attitude, really. What will our attitude be toward the future when the future is the present? And that’s actually much more difficult to correctly predict than what sort of personal devices people will be carrying.

How do you think that attitude has changed since you started writing? Could you describe the attitude of our current moment?

The day the Apple Watch was launched, late in the day someone on Twitter announced that it was already over. They cited some subject, they linked to something, indicating that our moment of giddy future shock was now over. There’s just some sort of endless now, now.

Could you go into that a little bit more, what you mean by an “endless now”?

Fifty years ago, I think now was longer. I think that the cultural and individual concept of the present moment was a year, or two, or six months. It wasn’t measured in clicks. Concepts of the world and of the self couldn’t change as instantly or in some cases as constantly. And I think that has resulted in there being a now that’s so short that in a sense it’s as though it’s eternal. We’re just always in the moment.

And it takes something really horrible, like some terrible, gripping disaster, to lift us out of that, or some kind of extra-strong sense of outrage, which we know that we share with millions of other people. Unfortunately, those are the things that really perk us up. This is where we get perked up, perked up for longer than over a new iPhone, say.

The worlds that you imagine are enchanting, but they also tend to be pretty grim. Is it possible to write good sci-fi that doesn’t have some sort of dystopian edge?

I don’t know. It wouldn’t occur to me to try. The world today, considered in its totality, has a considerable dystopian edge. Perhaps that’s always been true.

I often work in a form of literature that is inherently fantastic. But at the same time that I’m doing that, I’ve always shared concerns with more naturalistic forms of writing. I generally try to make my characters emotionally realistic. I do now, at least; I can’t say I always have done that. And I want the imaginary world they live in and the imaginary problems that they have to reflect the real world, and to some extent real problems that real people are having.

It’s difficult for me to imagine a character in a work of contemporary fiction who wouldn’t have any concerns with the more dystopian elements of contemporary reality. I can imagine one, but she’d be a weird … she’d be a strange character. Maybe some kind of monster. Totally narcissistic.

What makes this character monstrous? The narcissism?

Well, yeah, someone sufficiently self-involved. It doesn’t require anything like the more clinical forms of narcissism. But someone who’s sufficiently self-involved as to just not be bothered with the big bad things that are happening in the world, or the bad things — regular-size bad things — that are happening to one’s neighbors. There certainly are people like that out there. The Internet is full of them. I see them every day.

You were raised in the South, and you live in Vancouver, but, like Philip K. Dick, you’ve set some of your most famous work in San Francisco. What is the appeal of the city for technological dreamers? And how does the Silicon Valley of today fit into that Bay Area ethos?

I’m very curious to go back to San Francisco while on tour for this book, because it’s been a few years since I’ve been there, and it was quite a few years before that when I wrote about San Francisco in my second series of books.

I think one of the reasons I chose it was that it was a place that I would get to fairly frequently, so it would stay fresh in memory, but it also seemed kind of out of the loop. It was kind of an easy canvas for me, an easier canvas to set a future in than Los Angeles. It seemed to have fewer moving parts. And that’s obviously no longer the case, but I really know contemporary San Francisco now more by word of mouth than I do from first-person experience. I really think it sounds like a genuinely new iteration of San Francisco.

Do you think that Google and Facebook and this Silicon Valley culture are the heirs to the Internet that you so presciently imagined in the 1980s? Or do they feel like they’ve taken the Web in different directions than what you expected?

Generally it went in directions that didn’t occur to me. It seems to me now that if I had been a very different kind of novelist, I would have been more likely to foresee something like Facebook. But you know, if you try to imagine that somebody in 1982 writes this novel that totally and accurately predicted what it would be like to be on Facebook, and then tried to get it published? I don’t know if you would be able to get it published. Because how exciting is that, or what kind of crime story could you set there?

Without even knowing it, I was limited by the kind of fiction of the imaginary future that I was trying to write. I could use detective gangster stories, and there is a real world of the Internet that’s like that, you know? Very much like that. Although the crimes are so different. The ace Russian hacker mobs are not necessarily crashing into the global corporations. They’re stealing your Home Depot information. If I’d put that as an exploit in “Neuromancer,” nobody would have gotten it. Although it would have made me seem very, very prescient.

You’ve written often and eloquently about cults of celebrity and the surrealness of fame. By this point you’re pretty famous yourself. Has writing about fame changed the way you experience it? Does experiencing fame change the way you write about it?

Writers in our society, even today, have a fairly homeopathic level of celebrity compared to actors and really popular musicians, or Kardashians. I think in [my 1993 novel] “Virtual Light,” I sort of predicted Kardashian. Or there’s an implied celebrity industry in that book that’s very much like that. You become famous just for being famous. And you can keep it rolling.

But writers, not so much. Writers get just a little bit of it on a day-to-day basis. Writers are in an interesting place in our society to observe how that works, because we can be sort of famous, but not really famous. Partly I’d written about fame because I’d seen little bits of it, but the bigger reason is the extent to which it seems that celebrity is the essential postmodern product, and the essential post-industrial product. The so-called developed world pioneered it. So it’s sort of inherently in my ballpark. It would be weird if it wasn’t there.

You have this reputation of being something of a Cassandra. I don’t want to put you on the spot and ask for predictions. But I’m curious: For people who are trying to understand technological trends, and social trends, where do you recommend they look? What should they be observing?

I think the best advice I’ve ever heard on that was from Samuel R. Delany, the great American writer. He said, “If you want to know how something works, look at one that’s broken.” I encountered that remark of his before I began writing, and it’s one of my fridge magnets for writing.

Anything I make, and anything I’m describing in terms of its workings — even if I were a non-literary futuristic writer of some kind — I think that statement would be very resonant for me. Looking at the broken ones will tell you more about what the thing actually does than looking at one that’s perfectly functioning, because then you’re only seeing the surface, and you’re only seeing what its makers want you to see. If you want to understand social media, look at troubled social media. Or maybe failed social media, things like that.

Do you think that’s partly why so much science fiction is crime fiction, too?

Yeah, it might be. Crime fiction gives the author the excuse to have a protagonist who gets her nose into everything and goes where she’s not supposed to go and asks questions that will generate answers that the author wants the reader to see. It’s a handy combination. Detective fiction is in large part related to literary naturalism, and literary naturalism was quite a radical concept that posed that you could use the novel to explore existing elements of society which had previously been forbidden, like the distribution of capital and class, and what sex really was. Those were all naturalistic concerns. They also yielded to detective fiction. Detective fiction and science fiction are an ideal cocktail, in my opinion.

AT&T and Verizon use “supercookies” to track users’ online activities

By Thomas Gaist
7 November 2014

Telecommunications corporations Verizon and AT&T automatically monitor and record all Internet activity by users accessing their cellular data networks, according to reports published this week by the Washington Post and privacy groups. The tracking system has been referred to as a “supercookie” because it is nearly impossible for users to disable it.

AT&T and Verizon secretly tracked Internet activity by more than 100 million customers using the “supercookie” system, according to figures cited by the Washington Post. All users accessing AT&T and Verizon networks are subject to tracking and logging of their Internet browsing, regardless of whether they are customers with AT&T or Verizon, the Post reported.

Corporate and government clients are not subject to tracking with the “supercookie,” according to assurances given by Verizon.

The X-UIDH supercookie, which Verizon says was first activated in November 2012, allows Verizon and AT&T to keep a record of every single website a user visits, even when the user has enabled common security features such as “Private Browsing” mode or is using encryption technology.
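The mechanism behind such a header is worth sketching. Because the carrier sits between the handset and the web server, it can rewrite any unencrypted HTTP request in transit; nothing stored on the phone is involved, which is why clearing cookies or switching to private browsing cannot remove the identifier. A minimal sketch follows (the token derivation and names are hypothetical; the real injection happens in carrier network equipment, not application code):

```python
# Sketch of carrier-side header injection: the tracking token is derived
# from the subscriber account on the network side, so no on-device state
# (cookies, private-browsing mode) has any effect on it.
import hashlib

def inject_tracking_header(raw_request: bytes, subscriber_id: str) -> bytes:
    """Append an X-UIDH-style header to an unencrypted HTTP request."""
    # Hypothetical derivation: a stable per-subscriber token.
    token = hashlib.sha256(subscriber_id.encode()).hexdigest()[:24]
    head, _, body = raw_request.partition(b"\r\n\r\n")
    return head + b"\r\nX-UIDH: " + token.encode() + b"\r\n\r\n" + body

request = (b"GET /index.html HTTP/1.1\r\n"
           b"Host: example.com\r\n\r\n")
tagged = inject_tracking_header(request, "subscriber-555-0100")
# The same token now rides on every plain-HTTP request this subscriber
# sends, so any site -- or any third party it shares logs with -- can
# correlate the user's browsing across the entire web.
```

Because the token is identical on every request, a single advertising network that sees it on two different sites has already linked the visits, with no opt-in and no cookie to delete.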

Privacy groups note that data collected by the companies can easily be transferred to the NSA and other state surveillance agencies, and that even more advanced data tracking software is currently in development.

In 2012, Verizon launched Precision Market Insights (PMI), a subsidiary firm that sells information to marketing companies to tailor their advertising strategies based on Verizon customers’ Internet use patterns. PMI’s official literature touts the “PrecisionID” system, described as “an anonymous unique device identifier, which can be used to reach the right audiences on mobile through demographic, interest and geographic targeting.”

While the company maintains secrecy about its PMI operations, previous comments from top executives make clear the eagerness of Verizon’s corporate leadership to profit by spying on its customers.

“We realized we had a latent asset. We have information about how customers are using their mobile phones,” PMI vice president Colson Hillier told FierceMobileIT in October 2012.

Changes to Verizon’s privacy policy in 2011-12 enabled PMI to “take insights from the network … and create a series of tools that companies can use to better understand their consumers,” Hillier said.

“There’s a stampede by the cable companies and wireless carriers to expand data collection,” Jeffrey Chester of the Center for Digital Democracy told the Washington Post.

“They all want to outdo Google,” Chester said.

PMI executive Bill Diggins bragged, “We are able to view just everything that they [cell phone users] do,” while speaking to the Paley Center’s “Data to Dollars” media symposium in 2012.

Verizon executive Thomas J. Tauke told a 2008 congressional hearing that Verizon would seek “meaningful, affirmative consent from consumers” before tracking their Internet usage with cookies.

Instead of positive consent, however, all users are subject to tracking by default, according to company sources cited by the Post, and Verizon continues to track and record all web activity even by customers who have “opted out” of the data tracking.

Once the data is collected, advertising companies can still use “de-anonymizing” technologies to identify and use data from customers who opted out, the Post reported.
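The reason opting out offers so little protection is structural: the identifier is still injected into traffic, so any third party that observes it can regroup an “anonymous” user’s visits into a single profile, with no cooperation from the carrier’s opt-out machinery. A hypothetical sketch of that re-linking, with all names and values illustrative:

```python
# Hypothetical sketch of "de-anonymization" via a persistent injected ID.
# An ad network embedded on many sites sees the same identifier on each
# request and can rebuild a browsing profile by grouping on it alone.

from collections import defaultdict

# Requests as observed by a third-party ad network (illustrative data).
observed_requests = [
    {"x_uidh": "abc123", "site": "clinic.example"},
    {"x_uidh": "abc123", "site": "news.example"},
    {"x_uidh": "zzz999", "site": "shop.example"},
]

def rebuild_profiles(requests):
    """Group browsing history by the injected identifier alone."""
    profiles = defaultdict(list)
    for r in requests:
        profiles[r["x_uidh"]].append(r["site"])
    return dict(profiles)

profiles = rebuild_profiles(observed_requests)
# "abc123" now links two sites to one device with no cookie involved,
# which is why clearing cookies or opting out does not break the chain.
```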

Taken together with the growing mountain of evidence that the US government surveillance operations benefit from active collaboration with the major technology and communications companies, the latest revelations further show that the US corporate establishment views the privacy and democratic rights of the population with contempt.

Despite the public relations efforts of the companies to distance themselves from the mass surveillance programs run by the US and other governments, the “supercookie” exposures show that the most powerful telecoms are running data mining operations that are easily comparable to those of the government.

Aside from AT&T and Verizon, all of the other major tech and communications companies have been implicated in the US government’s global surveillance operations. Apple, Google, Microsoft, Yahoo, Facebook, AOL, Skype and YouTube all allowed the NSA’s PRISM program to collect e-mails, video and audio recordings, documents, photos, and other forms of data from their central servers, over a period of years, as part of secret agreements signed with the US government.

The NSA’s corporate partners are well compensated for their involvement in the mass spying. The NSA’s Corporate Partner Access Program paid some $280 million to tech companies to access and spy on their “high volume circuit and packet-switched networks” in 2012 alone, Snowden leaks from August 2013 showed.

Growing apart on Facebook: I moved out of my blue-collar hometown and left my friends behind

When I moved away to write novels, I thought I was graduating to a better life. Social media told a different story


I grew up in the 1970s in a pack of girls unmoved by feminism. In the Blue Ridge Mountains and green valley where we lived, we hot-rolled our hair, mooned over boys and spent our weekends tanning. I loved my girlfriends as much as I loved those hillsides of family farms and fields of wildflowers. Even then, though, I knew I would leave Roanoke, my blue-collar hometown in Virginia.

In those days the downtown was filled with boarded-up storefronts. Businesses had moved out to the malls, but even the malls were struggling. Cheap ranch houses and duplex complexes were springing up along the highway. The mountains in the distance were still lush and lovely, and there were still occasional farmhouses and open meadows, but what they were doing to Roanoke, to me, seemed awful.

Probably Roanoke troubled me more than it troubled my girlfriends because my parents were from up north. I’d spent my earliest years in Connecticut and Philadelphia. It left me sensitive to the notes of sexism, machismo and racism running beneath the Southern charm — to the problem with high school boys regularly beating each other bloody at Saturday night parties, say, or with casual use of the N-word. To my friends, steeped from the earliest days in the town’s habits and rituals, this was only life, nothing more.

My own mom, a former beauty queen from Albany, New York, opted to embrace the culture of Friday night football games and all-you-can-eat restaurants, and tried to instill in me the local wisdom that looking good, dressing well, was the ultimate path to happiness — to finding a man, that is.

My father, on the other hand, was an unapologetic liberal. He gave me books like George Orwell’s “Animal Farm” to read and encouraged me to study The New York Times Sunday Book Review. Many Saturday nights, I’d leave my father reading Freudian case studies for the only entertainment in town — the keggers, where all the boys wore cowboy hats and chewed tobacco and fought. They made me long for a wider world: French films and abstract art, the latest literary novels and indie-rock bands. At the time, artists seemed like gods to me, making beauty out of thin air. If I could get closer to the people who created culture, I’d be fortified, even ennobled.

As soon as I graduated from high school, I headed north for college, in Baltimore, while my friends stayed down south at more traditional schools. Soon I got letters detailing sorority-pledge hazings and Saturday afternoon football games. While my friends gushed about dressing up in formal gowns and white gloves, I was in sweats, wearing a ponytail, learning to write fiction.

The summer after my sophomore year, I interned in Washington, D.C. I invited two of my girlfriends up to visit. I took them to the 9:30 Club to see the Violent Femmes. The lead singer wore his leather jacket around his head like a turban and between songs yelled, “We’re not queer, we’re cool!” Back in my room my friends huddled together, their eyes huge. One told me she felt like she’d been “down into hell.” The other said the band members “looked like vampires.” I tried to soothe them by explaining how cool 9:30 was — Hüsker Dü, Root Boy Slim and even The Police had played there! I was trying to share my new world with them. But they could only see it as evil.

I mostly lost track of my girlfriends after that. I figured it would only upset me to keep up close contact. By the time I came home to Roanoke for my 20th high school reunion, I was living in Brooklyn and making my way as a novelist. I was also divorced and struggling to raise my daughter as a single mom.

At the reunion, one of the same friends I’d taken to the 9:30 Club years earlier broke into every conversation to say how sad it was I was getting divorced. She lived in Alabama now, was married to a banker, and had two children. Her parents had been part of the evangelical Moral Re-armament Movement and, after a brief wild phase, she had joined their ranks. The fact that I felt smothered and unhappy in my marriage was, to her, a pathetic excuse. And the fact that I lavished affection on my daughter, Abbie, did nothing to soften the tragedy of divorce. As I flew back to New York afterward, I was thankful, once again, to have real distance between myself and my high school girlfriends.

So when I joined Facebook a few years ago and my old girlfriends started to friend me, I hesitated. Wouldn’t hearing from them just continue to make it clear how different we’d become? Curiosity won out and I was soon logging on to see a flurry of religious posts that seemed to imagine God as a rich, attentive boyfriend: “God causes things to happen at exactly the right time,” and “Good morning Jesus! Thanks for waking me up with your touch of love!” It’s not that I was an unbeliever. I’d moved closer to God, too, in the years since high school. But faith to me had been a complicated journey; it was a world away from theirs, which seemed so much more open, joyous and doubt-free.

After a friend’s post warned that every American would be tagged with a microchip for Obamacare by 2017, I seriously considered un-friending all of them. It seemed Facebook hadn’t really managed to connect me with my old friends. If anything, their posts only alienated me from them more. Instead of being distant memories from another time in my life, they were a daily, digitized presence in my life. I found myself growing increasingly judgmental.

I realized that if I wanted to really reconnect with them after all this time, it wasn’t going to be enough to follow their posts and status updates. I’d need to reengage with my past. So instead of un-friending, I wrote my way back to them, in a novel about my teenage years in Roanoke, and the life I’d cut myself off from so long ago.

I recalled our early phase of unconditional intimacy: the time my friends and I broke into the Econo Lodge pool to swim in the middle of the night and drove up the side of a muddy, red dirt hill to watch the porno movies at the 220 Drive In; or how we went to Kiss concerts at the Civic Center and hung out at the Hardee’s waiting for the older boys we had crushes on to show up.

In high school I’d worshipped these girls. To me they were like sticks of butter, golden and sweet. They had long hair and wore gauzy blouses, pastel cords and clogs. We traded clothes, slept in the same beds, told each other our deepest desires. They’d also helped me through adolescence in a town where I did not always fit in. I was not a cheerleader or even on the drill team. I said odd, inchoate things about characters in the books I was reading. I stuttered. But they defended me from bullies, and their gentle teasing taught me not only how to dress but also what was and was not OK to talk about in polite company.

I also remembered how that first period of oneness had come under strain and frayed. Every one of these girls had a challenge at home; one’s dad was an alcoholic Vietnam veteran, and another’s a gambling addict. All our mothers were discombobulated. They’d been raised to be housewives, but the new culture insisted they get jobs and personal checking accounts. Seventies culture glorified working women like Mary Tyler Moore and Hill Street Blues’ Joyce Davenport. Our moms had become outmoded, female versions of Willy Loman, with a skill set no longer valued. One mom was so depressed she got up only for an hour at meal times. After heating up the TV dinners she went back to bed. We never talked about these problems, but we were inside each other’s houses and the darkness was palpable.

A few of our mothers went to work or back to school, but most of them, disoriented and threatened, put even more pressure on us, their daughters, to be traditional, to define ourselves through boyfriends and, later, husbands, to hold our looks as our most valuable asset and to uphold the sanctity of the traditional home. As each of my girlfriends got a boyfriend, the boy became the center of her universe. I had a boyfriend too, a tall blond god who looked like one of the guys from “The Dukes of Hazzard.” I painted my fingernails and teased up my hair for our dates, but in my heart I knew I’d soon blow past him. I didn’t mean to lead a double life, but at 17, my friends already longed to nest, talking about marriage and what good fathers their boyfriends would be. For me, the idea that these sullen, tobacco chewing, fistfighting, Jeep-driving young men would be good dads seemed unlikely at best.

In fact, I realized in writing my novel that my experience of mothering has been a direct response to my mom’s misery. I remember her crying to my dad that she felt she had nothing, that she was invisible, a zero. I wanted my daughter to see that her mother had agency in the world, that I was not solely defined by my care for her. From their posts, my old girlfriends seem to be more akin to our mothers’ models, parents first and people second. At times their smiles seemed hollow and fake to me. But at other times I thought I saw an authentic satisfaction in them that I craved. I went away from motherhood to mother better; they seemed to have found a way to embrace parenthood with less ambivalence.

I was nearly finished with my book when I got another Facebook request from an old friend. Jill and I had lived in the same Roanoke duplex complex in the early ’70s when we were just starting junior high. Her father had died in a motorcycle accident and her mother, who worked several jobs, had rarely been around. Jill, though, had a zillion ingenious and energetic plans. One we shared was living off the land; we spent hours pretending to keep house in the forest in back of our duplexes.

When I saw her page, at first I was disappointed. There were the usual Bible verses and photos of grown kids. But as I clicked down I saw pictures of fledgling bluebirds, a cedar waxwing eating berries, an errant heron catching a mole. Jill was a dental hygienist but also a wildlife rehabilitator. In pictures, giving a screech owl hydrotherapy, feeding a baby fox with a bottle and holding a raptor on her gloved arm, her face was transfigured. And I was jealous. Without leaving Roanoke or cutting herself off from her childhood, Jill had discovered profound work she could give herself to completely.

My friends’ posts — Tea Party petitions, exercise logs — can still annoy me, as I am sure my posts — notices for my daughter’s punk rock band and links to feminist articles on Jezebel — sometimes bug them. I’m much less sure now, though, that my interests in Brooklyn’s food trends, hot yoga and the indie movies at BAM must be more nourishing than their prayer meetings, hunting trips and car shows. Leaving Roanoke might have changed me, given me more creative opportunities and connected me to my vocation, but it wasn’t, I see now, a fix-all. The lifelong search for meaning, for coming to terms with death and maturing spiritually, is mysterious and more complicated than watching foreign films and reading postmodern novels.

My life is intellectually rich. Sure, it has more capital C culture in it. But then, the lives of the friends I left behind are marked by a continuity of place and community — and by a kind of peace and conviction — that can still elude me here in hip Brooklyn. I wouldn’t trade places with them. But I would say we’ve come out even in the end.

Darcey Steinke is the author of four novels, two of which were New York Times Notable Books of the Year. Her non-fiction has been featured in Vogue, the Washington Post, the Chicago Tribune, the Village Voice, Spin, and the New York Times Magazine. She lives with her daughter in Brooklyn.

NSA chief calls for more “permeable” barrier between state and tech corporations

By Thomas Gaist
31 October 2014

In two speeches this month, US National Security Agency (NSA) Director Admiral Mike Rogers called for further integration between the NSA and major technology and communications companies.

Speaking to more than 500 corporate, military and academic leaders at the Cyber Education Summit at Georgia Regents University earlier this month, Rogers argued that the needs of cybersecurity are rendering obsolete traditional distinctions between the private sector, the civilian government and the military-intelligence apparatus.

“Traditionally, in our structure as a nation, we have tried to very strongly differentiate between what is a private sector function, what is a governmental function and what is a function that really falls under national security. I would argue cyber crosses all three of those lines,” Rogers said.

In line with the demand advanced in his inaugural address as NSA head for a more “permeable membrane” between the national security agencies and private corporations, Rogers went on to call for effective merging of US government cyber systems with those of major corporations, allowing for direct, continuous communication between corporate databases and the NSA.

“In the end what we have got to get to, I believe, is real-time automated machine-to-machine interface,” Rogers said.

These remarks come as America’s top surveillance agency faces a succession of media exposés of legally dubious for-profit ventures by its top officials.

Last week, the NSA announced that the agency’s top official for signals intelligence (SIGINT), Teresa Shea, will leave the agency after reports that she was running a side business specializing in electronic intelligence, Telic Networks. Shea’s husband also works for a major SIGINT contractor called DRS Signals Solutions.

Shea’s departure follows revelations that the NSA’s Chief Technical Officer Patrick Dowd has been working at least 20 hours per week for ex-NSA director General Keith Alexander’s own private cybersecurity venture, IronNet Cybersecurity. IronNet has already taken in millions in revenues by offering services based on technology patented by General Alexander during his tenure at NSA.

In addition to his own business activities, disclosure documents from early October show that Alexander made substantial investments in a number of tech companies while serving as NSA director. Because their operations focused on information security, the companies that received investment funds from Alexander stood to benefit from the NSA head’s hyping of the threat of cyberwarfare and terrorist attacks against US targets in the wake of the Snowden leaks, stoking accusations of conflict of interest against Alexander.

In a speech delivered Tuesday at the Third Annual Cybersecurity Summit at the US Chamber of Commerce, however, Rogers defended the private sector initiatives by NSA personnel, saying that the agency required continuous exchange of personnel and “flow of partnerships and information back and forth” with private tech firms.

“We’ve got to create a world where people from NSA can leave us for a while and go work in the private sector. And I would also like to create a world where the private sector can come spend a little time with us,” Rogers said.

The systematic transfer of data and technology between the US government and corporations—and between government agencies themselves, both within the US and internationally—is becoming ever more streamlined.

The NSA leadership, moreover, has already infiltrated undercover agents into tech companies worldwide, as recent leaked documents furnished by Edward Snowden have shown, and has developed secret contracts with major communications providers to ensure cooperation with the US government’s data trawling programs (See: “Leaked documents expose secret contracts between NSA and tech companies”).

The NSA is also making its vast electronic warehouses of personal data accessible to the intelligence agencies of other imperialist powers. Britain’s Government Communications Headquarters (GCHQ) receives transfers of “unlimited bulk intelligence” from the NSA without any warranting process, according to a report published Wednesday by UK-based human rights group Liberty.

“British intelligence agencies can trawl through foreign intelligence material without meaningful restrictions and can keep such material, which includes both communications content and metadata, for up to two years,” the report states.

The drive of the NSA to implement total information sharing between the state and corporations reflects the unwillingness of the military-intelligence establishment to tolerate any obstacles, real or imaginary, to its ability to spy on the entire population at will. For the ruling elite, it is not sufficient to have secret contracts with the corporations and a panoply of surveillance programs mining their data.

On top of all of this, plans are now clearly afoot to implement automatic bulk data sharing between the corporations and the government, in direct violation of the US Constitution and international law.

