Free your brain: How Silicon Valley denies us the freedom to pay attention

A continual quest for attention both drives and compromises Silicon Valley’s techno-utopian vision

In late June, Mark Zuckerberg announced the new mission of Facebook: “To give people the power to build community and bring the world closer together.”

The rhetoric of the statement is carefully selected, centered on empowering people, and in so doing, ushering in world peace, or at least something like it. Tech giants across Silicon Valley are adopting similarly utopian visions, casting themselves as the purveyors of a more connected, more enlightened, more empowered future. Every year, these companies articulate their visions onstage at internationally streamed pep rallies, Apple’s WWDC and Google’s I/O being the best known.

But companies like Facebook can only “give people the power” because we first ceded it to them, in the form of our attention. After all, that is how many Silicon Valley companies thrive: Our attention, in the form of eyes and ears, provides a medium for them to advertise to us. And the more time we spend staring at them, the more money Facebook and Twitter make — in effect, it’s in their interest that we become psychologically dependent on the self-esteem boost from being wired in all the time.

This quest for our eyeballs doesn’t mesh well with Silicon Valley’s utopian visions of world peace and people power. Earlier this year, many sounded alarm bells when a “60 Minutes” exposé revealed the creepy cottage industry of “brain-hacking,” industrial psychology techniques that tech giants use and study to make us spend as much time staring at screens as possible.

Indeed, it is Silicon Valley’s continual quest for attention that both motivates its utopian dreams and compromises them from the start. As a result, the tech industry’s ethics are often compromised when it comes to product design.

Case in point: At January’s Consumer Electronics Show – a sort of Mecca for tech start-ups dreaming of making it big – I found myself in a suite with one of the largest kid-tech (children’s toys) developers in the world. A small flock of PR reps, engineers and executives hovered around the entryway as one development head walked my photographer and me through the mock setup. They were showing off the first voice assistant developed solely with kids in mind.

At the end of the tour, I asked if the company had researched or planned to research the effects of voice assistant usage on kids. After all, parents had been using tablets to occupy their kids for years by the time evidence of their less-than-ideal impact on children’s attention, behavior and sleep emerged.

The answer I received was gentle but firm: No, because we respect parents’ right to make decisions on behalf of their children.

This free-market logic – which holds that the consumer alone arbitrates the value of a product – is pervasive in Silicon Valley. What consumer, after all, is going to argue they can’t make their own decisions responsibly? But a free market only functions properly when consumers operate with full agency and access to information, and tech companies are working hard to limit both.

During the “60 Minutes” story on brain-hacking, Tristan Harris, a former Google product manager, said, “There’s always this narrative that technology’s neutral. And it’s up to us to choose how we use it.”

The problem, according to Harris, is that “this is just not true… [Developers] want you to use it in particular ways and for long periods of time. Because that’s how they make their money.”

Harris was homing in on the fact that, increasingly, it isn’t the price tag on the platform itself that earns companies money, but the attention they control on said platform – whether it’s a voice assistant, operating system, app or website. We literally “pay” attention to ads or sponsored content in order to access websites.

But Harris went on to explain that larger platforms, using systems of rewards similar to slot machines, are working not only to monetize our attention, but also to monopolize it. And with that monopoly comes incredible power.

If Facebook, for instance, can control hours of people’s attention daily, it can not only determine the rate at which it will sell that attention to advertisers, but also decide which advertisers or content creators it will sell to. In other words, in an attention economy Facebook becomes a gatekeeper for content – one that mediates not only personalized advertising, but also news and information.

This sort of monopoly brings the expected fiscal payoff, and also the amassing of immeasurable social and cultural power.

So how does Facebook’s new mission statement fit into this attention economy?

Think of it in terms of optics. For Facebook, as for the other tech giants of Silicon Valley, brand is the carotid artery. Brand ubiquity means Facebook is the first thing people check when they take their phones out of their pockets, or when they open Chrome or Safari (brought to you by Google and Apple, respectively). It means Prime Day is treated like a real holiday. Just as Kleenex means tissues and Xerox means copy, online search has become synonymous with Google.

Yet all these companies are painfully aware of what a brand-gone-bad can do – or undo. The current generation of online platforms is built on the foundations of empires that rose and fell while the attention economy was still incipient. Today’s companies have maintained their centrality by consistently copying (Instagram Stories, a clone of Snapchat) or outright purchasing (YouTube) their fiercest competitors – all to maintain or expand their brand.

And perhaps as important, tech giants have made it near impossible to imagine a future without them, simply by being the most prominent public entities doing such imagining.

Facebook’s mission fixes the company in our shared future, and it also injects the company with a moral, or at least charitable, sensibility – even if only in the form of “bring[ing] the world closer together”-type platitudes.

So how should we as average consumers respond?

In his award-winning essay “Stand Out of Our Light: Freedom and Persuasion in the Attention Economy,” James Williams argues, “We must … move urgently to assert and defend our freedom of attention.”

To assert our freedom is to recognize and honestly evaluate the demands on our attention that all these devices and digital services represent. To defend that freedom entails two forms of action. The first is individual: not unplugging completely, as the self-styled prophets of Facebook and Twitter encourage (before logging back on after a few months of asceticism), but unplugging partially, habitually and ruthlessly.

Attention is the currency upon which tech giants are built. And the power of agency and free information is the power we cede when we turn over our attention wholly to platforms like Facebook.

But individual consumers can only do so much. The second way we must defend our freedom is through our demand for ethical practices from Silicon Valley.

Some critics believe government regulation is the only way to rein in Silicon Valley developers. The problem is, the federal agencies that closely monitor the effects of product usage on consumers don’t yet have a good category for online platforms. The Food and Drug Administration (FDA) tracks medical technology. The Consumer Product Safety Commission (CPSC) focuses on physical risk to consumers. The Federal Communications Commission (FCC) focuses on content — not platforms. In other words, we have no precedent for monitoring social media or other online platforms and their methods for retaining users.

Currently, there is no counterpart agency that leads dedicated research into the effects of platforms like Facebook on users. There is no Surgeon General’s warning. There is no real protection for consumers from unethical practices by tech giants — as long as those practices fall in the cracks between existing ethics standards.

While it might seem idealistic to hold out for the creation of a new government agency that monitors Facebook (especially given the current political regime), the first step toward curbing Silicon Valley’s power is simple: We must acknowledge freedom of attention as an inalienable right — one inextricable from our freedom to pursue happiness. So long as the companies producing the hardware surrounding us and the platforms orienting social life online face no strictures, they will actively work to control how users think, slowly eroding our society’s collective free will.

With so much at stake, and with so little governmental infrastructure in place, checking tech giants’ ethics might seem like a daunting task. The U.S. government, after all, has so far demonstrated a consistent aversion to challenging Silicon Valley’s business and consumer-facing practices.

But while we fight for better policy and stronger ethics-enforcing bodies, we can take one more practical step: “pay” attention to ethics in Silicon Valley. Read about Uber’s legal battles and the most recent research on social media’s effects on the brain. Demand more ethical practices from the companies we patronize. Why? The best moderators of technology ethics thus far have been tech giants themselves — when such moderation benefits the companies’ brands.

In Silicon Valley, money talks, but attention talks louder. It’s time to reclaim our voice.

http://www.salon.com/2017/08/05/free-your-brain-how-silicon-valley-denies-us-the-freedom-to-pay-attention/?source=newsletter

CNN’s “The Nineties”: Empty nostalgia for a decade we should let die

CNN delves into a decade of pat neoliberalism and hollow spectacle and, unsurprisingly, comes up with nothing

To anyone who came of age in the 1990s, the current cultural ascent of fidget spinners is likely to induce an acute pang of recognition — equal parts wistful nostalgia, anxiety and woozy terror. The ‘90s were, as any certified “Nineties Kid” can attest, a decade marked by a succession of asinine schoolyard fads.

One can imagine an alternative timeline of the decade that marks time not by year, but the chronology of crazes: the Year of the Beanie Baby, the Year of the Tamagotchi, the Years of the Snap-Bracelet, the Macarena, the Baggy Starter Jacket, the Painstakingly Layered “The Rachel” Hairdo, and so on. What’s most remarkable about our culture’s whirring fidget spinner fetish is that it didn’t happen sooner; that this peak fad didn’t emerge from among the long, rolling sierra of hollow amusements that defined the 1990s.

Surveying the current pop-culture landscape, one gets the sense that the ‘90s — with all their flash-in-the-pan fads and cooked-up crazes — never ended. On TV, “The Simpsons” endures into its 28th season, while David Lynch and Mark Frost’s oddball ABC drama “Twin Peaks” enjoys a highly successful, and artistically fruitful, premium-cable revival. The Power Rangers, Ninja Turtles, Transformers and Treasure Trolls have graduated from small-screen Saturday morning silliness to blockbuster entertainments.

Elsewhere, the “normcore”/“dadcore”/“lazycore” fashion of singers like Mac DeMarco has made it OK (even haute) to dress up like a “Home Improvement”-era Jonathan Taylor Thomas. And Nintendo recently announced its latest money-printing scheme, in the form of the forthcoming SNES Classic Mini: a miniature throwback video game console chock-full of nostalgia-baiting Console Wars standbys like “Donkey Kong Country,” “F-Zero” and “StarFox.” Content mills like BuzzFeed, Upworthy and their ilk bolster their bottom line churning out lists and quizzes reminding you that, yes, the show “Rugrats” existed.

To quote a nostalgic ’97-vintage hit single, which was itself a throwback to ‘60s jazz-pop, it’s all just a little bit of history repeating.

It’s natural to long for the past: to trip down memory lane, get all dewy-eyed, pine for the purity of the long-trampled gardens of innocence, and go full Proust on the bric-a-brac of youth that manages to impress itself on the soft, still-maturing amber of the adolescent mind, even if that stuff was total crap like Moon Shoes or a Street Shark or a Totally Hair Barbie doll or a bucket of Nickelodeon-brand goo called “Gak.” The 1990s, however, offered a particularly potent nostalgia trap, something revealed by watching CNN’s new TV documentary miniseries “event,” fittingly called “The Nineties.”

A follow-up to CNN’s previous history-of-a-decade events (“The Sixties,” “The Seventies” and “The Eighties”) and co-produced by Tom Hanks, the series provides some valuable insight into the nature of ’90s nostalgia. The two-part series opener, called “The One About TV,” threads the needle, examining the ways in which television of the era shifted the standards of cultural acceptability, be it in Andy Sipowicz’s expletive-laden racism, Homer Simpson’s casual stranglings of his misfit son or the highbrow, Noel Coward-in-primetime farces of “Frasier.”

To believe CNN’s procession of talking heads, damn near every TV show to debut after midnight on Jan. 1, 1990, was “revolutionary.” “The Simpsons” was revolutionary for the way it hated TV. “Twin Peaks” was revolutionary for the way it subverted it. “Seinfeld” ignored (or subtracted, into its famous “Show About Nothing” ethic) the conventions of the sitcom. “Frasier” elevated them. “Will & Grace,” “Ellen” and “The Real World” bravely depicted gay America. Ditto “Arsenio,” “Fresh Prince” and “In Living Color” in representing black America. “OZ” was revolutionary for its violence. “The Sopranos” was revolutionary in how it got you to root for the bad guy. “Friends” was revolutionary because it showed the day-to-day lives of, well, some friends. If the line of argumentation developed by “The Nineties” is to be believed, the TV game was being changed so frequently that it was becoming impossible to keep up with the rules.

Despite seeming argumentatively fallacious (if everything is subversive or game-changing, then, one might argue, nothing is), and further debasing the concept of revolution itself, such an argument cuts to the heart of ‘90s nostalgia. In pop culture, it was an era of seeming possibility, where it became OK to talk about masturbation (in one of “Seinfeld’s” more famous episodes) or even anal sex (as on “Sex & the City”), where “Twin Peaks” and “The Sopranos” spoke to the rot at the core of American life. “The Nineties” paints a flattering, borderline obsequious portrait of Gen-X ’90s kids as too hip, savvy and highly educated to be suckered in by the gleam and obvious propaganda that seemed to define “The Eighties.” (The ’90s kid finds a generational motto in the tagline offered by Fox’s conspiratorial cult sci-fi show “The X-Files”: trust no one.)

What “The Nineties” misses — very deliberately, one imagines — is the guiding cynicism of such revolutions in television. Far from being powered by a kind of radical politics of inclusivity, TV was (and remains) guided by its ability to deliver certain demographics to advertisers. In the 1990s, these demographics splintered, becoming more specialized. Likewise, entertainment streams split. The bully “mean girls” watched “90210,” the bullied watched “My So-Called Life,” and the kids bullied by the bullied watched “Buffy the Vampire Slayer.” Then on Thursday night, everyone watched “Seinfeld.”

This parade of prime-time cultural revolutions betrayed the actual guiding political attitude of the decade: stasis. The second episode of “The Nineties” turns to the scandal-plagued political life of Bill Clinton. “A new season of American renewal has begun!” beams Clinton, thumb pressed characteristically over a loosely clenched fist, early in the episode. For the Democrats, Bill Clinton seemed like a new hope: charming, charismatic, hip, appearing in sunglasses on Arsenio to blow his saxophone. But like so many of TV’s mock-insurgencies, the Clinton presidency was a coup in terms of aesthetics, and little else.

Beyond the sundry accusations of impropriety against him (Whitewater, the Paula Jones and Monica Lewinsky sex scandals, etc.), Clinton supported the death penalty, “three strikes” sentencing, NAFTA, “don’t ask, don’t tell” and countless other policies that alienated him from his party’s left-progressive wing. Clinton embodied the emerging neoliberal ethic: cozying up to big banks and supporting laissez-faire economic policies that further destabilized the American working and middle classes, while largely avoiding the jingoist militarism, nationalism and family values moralism of ‘80s Reaganomics. Clinton’s American renewal was little more than a face-lift.

“The Simpsons,” naturally, nailed this devil-you-know distinction in a 1996 Halloween episode, which saw the bodies of Bill Clinton and then-presidential rival Bob Dole inhabited by slithering extraterrestrials. Indistinguishable in terms of tone and policy, the body snatching alien candidates beguiled the easily duped electorate with nonsensical stump speeches about moving “forward, not backward; upward, not forward; and always twirling, twirling, twirling towards freedom.”

A 1992 book by the American political scientist Francis Fukuyama summed up the ’90s’ neoliberal approach to politics. In “The End of History and the Last Man,” Fukuyama posited that the collapse of the Soviet Union following the Cold War had resolved any grand ideological and historical conflicts in world politics. Liberal democracy and capitalism had won the day. Free market democracy was humanity’s final form. History — or at least the concept of history as a process of sociological evolution and conflict between competing political systems — had run its course.

Following the publication of “The End of History,” Fukuyama became an institutional poli-sci Svengali (John Gray at the New Statesman dubbed him the “court philosopher of global capitalism”), his ideas holding significant sway in political circles. The 1990s in America, and the Clinton presidency in particular, were a self-styled realization of the “end of history.” In the wake of the Cold War and the collapse of the Berlin Wall, the president’s position was largely that of a functionary: enable the smooth functioning of markets and the free flow of capital. Such was the horizon of political thought.

Fukuyama’s book has been subjected to thorough criticism for its shortsightedness — not least for the way its central argument serves to consolidate and naturalize the authority of the neoliberal elite. More concretely, 9/11 and its aftermath are often cited as signals of the “revenge of history,” which introduced new, complicated clashes of world-historical ideologies.

Though it’s often touted for its triumphalism, as a cheerleading handbook for the success of Westernized global capitalism, Fukuyama’s end-of-history theory is suffused with a certain melancholy. There’s one passage, often overlooked, that speaks to the general content and character of the ‘90s (and “The Nineties”). “The end of history will be a very sad time,” he writes. “In the post-historical period there will be neither art nor philosophy, just the perpetual caretaking of the museum of human history. I can feel in myself, and see in others around me, a powerful nostalgia for the time when history existed. Such nostalgia, in fact, will continue to fuel competition and conflict even in the post-historical world for some time to come.”

Our fresh new millennium has been marked, in political terms, by cultural clashes between decadent Western liberalism and militant Islamism (both sides bolstering their positions with the hollow rhetoric of religious zealotry), the abject failure of both the Democratic and Republican parties, the reappearance of white supremacist and ethno-nationalist thinking, the thorough criticism of neoliberalism, and the rise of a new progressive-left (signaled by the popularity of Jeremy Corbyn and Bernie Sanders), alongside a similarly invigorated form of moderatism referred to as “the extreme centre.” Amid such wild vicissitudes, the placid neoliberal creep of Fukuyama’s “post-history” feels downright quaint.

This is the sort of modern nostalgia that CNN’s “The Nineties” taps into: a melancholy for the relative stability of a decade that was meant to mark the end of history itself. Not only did things seem even-keeled, but everything (a haircut, a Game Boy game about tiny Japanese cartoon monsters, a sitcom episode about waiting for a table) seemed radical, revolutionary and, somehow, deeply profound. We are, perhaps invariably, prone to feeling elegiac for even the hollowness of A Decade About Nothing. That is particularly so because the 1990s abide in our politicians, our ideologies, our prime-time entertainments, our blockbuster movies and even, yes, our faddish toys, designed to ease our fidgety anxiety about the muddled present and keep us twirling, twirling back into the memory of a simpler, stupider past.

John Semley lives and works in Toronto. He is a books columnist at the Globe & Mail newspaper and the author of “This Is A Book About The Kids In The Hall” (ECW Press).

The techie is the new hipster — but what is tech culture?

The archetype of the “techie” has become commonplace in the past decade in art and in real life. But what is it?

If you live in any major city in the world, you probably know the type: they roam the clean parts of town, lattes in hand, wearing American Apparel hoodies emblazoned with the logos of vowel-deficient startups. Somehow, in the past decade, a profession turned into a lifestyle and a culture, with its own customs, habits and even lingo. In film, television and literature, the techie archetype is mocked and recycled, reduced to a stereotype (as in Mike Judge’s sitcom “Silicon Valley”), elevated to a radical hero (as in “Mr. Robot”), or both (as in “The Circle”).

If, as many claim, the hipster died at the end of the 2000s, the techie seems to have taken its place in the 2010s — not quite an offshoot, but rather a mutation. Consider the similarities: Like hipsters, techies are privy to esoteric knowledge, though of obscure code rather than obscure bands. They both seem to love kale. They tend to rove in packs, are associated with gentrification, and are overwhelmingly male. There are some fashion similarities: the tight jeans, the hoodie fetish, the predilection for modernist Scandinavian furniture. And like “hipster,” the term “techie” is often considered a slur, a pejorative that you lob at someone you want to depict as out of touch, rarefied and elite — not a fellow prole, in other words.

Yet there are differences, too: The techie often brings with him or her a certain worldview and language that attempts to describe the world in computational terms; the transformation of the word “hack” into an everyday verb attests to this. Some techies view their own bodies as merely machines that require food the way computers need electricity, a belief system exemplified by the popularity of powdered foods like Soylent. This happens in exercise, too — the rush to gamify health and wellness by tracking steps, calories and heartbeats turns the body into a spreadsheet.

How does a profession mutate into a culture? David Golumbia, an associate professor of digital studies at Virginia Commonwealth University and author of “The Cultural Logic of Computation,” suggests that some of the cultural beliefs common to those in the tech industry about the utopian promise of computers trickle down into what we may think of as tech culture at large. Golumbia describes the basic idea, “computationalism,” as “the philosophical idea that the brain is a computer” as well as “a broader worldview according to which people or society are seen as computers, or that we might be living inside of a simulation.”

“You frequently find people who avoid formal education for some reason or another and then educate themselves through reading a variety of online resources that talk about this, and they subscribe to it as quasi-religious truth, that everything is a computer,” Golumbia said. “It’s appealing to people who find the messiness of the social and human world unappealing and difficult to manage. There’s frustration . . . expressed when parts of the world don’t appear to be computational, by which I mean, when their actions can’t be represented by algorithms that can be clearly defined.”

“It’s very reductive,” Golumbia added.

Mapping the social world onto the algorithmic world seems to be where tech culture goes astray. “This is part of my deep worry about it — we are heading in a direction where people who really identify with the computer are those who have a lot of trouble dealing with other people directly. People who find the social world difficult to manage often see the computer as the solution to their problems,” Golumbia said.

But tech culture isn’t confined to screen time anymore. It’s become part of everyday life, argues Jan English-Lueck, a professor of anthropology at San Jose State University and a distinguished fellow at the Institute for the Future. English-Lueck wrote an ethnographic account of Silicon Valley culture, “Cultures@SiliconValley,” and studies the people and culture of the region.

“We start to see our civic life in a very technical way. My favorite example of that is people going to a picnic and looking at some food and asking if that’s ‘open source’ [available to all]. So people use those technological metaphors to think about everyday things in life,” she said.

English-Lueck says the rapid pace of the tech field trickles down into tech culture, too. “People are fascinated with speed and efficiency, they’re enthusiastic and optimistic about what technology can accomplish.”

Golumbia saw the aspects of tech culture firsthand: Prior to being a professor, he worked in information technology for a software company on Wall Street. His convictions about computationalism were borne out in his colleagues. “What I saw was that there were at least two kinds of employees — there was a programmer type, who was very rigid but able to do the tasks that you put in front of them, and there were the managerial types who were much more flexible in their thinking.”

“My intuition in talking to [the] programmer types [was that] they had this very black-and-white mindset, that everything was or should be a computer,” he said. “And the managers, who tended to have taken at least a few liberal arts classes in college, and were interested in history of thought, understood you can’t manage people the way you manage machines.”

Yet the former worldview — that everything is a computer — seems to have won out. “When I started, I thought it was this minor small subgroup of society” that believed that, he told Salon. “But nowadays I think many executives in Silicon Valley have some version of this belief.”

For evidence that the metaphor of the human body as a computer has gone mainstream, look no further than our gadgetry. Devices like the Fitbit and the Apple Watch monitor the wearer’s movement and activity constantly, producing data that the wearer can obsess over or study. “There is a small group of people who become obsessed with quantification,” Golumbia told Salon. “Not just about exercise, but like, about intimate details of their life — how much time spent with one’s kids, how many orgasms you have — most people aren’t like that; they do counting for a while [and] then they get tired of counting. The counting part seems oppressive.”

But this counting obsession, a trickle-down ideology from tech culture, is no longer optional: In many gadgets, it is now imposed from above. My iPhone counts my steps whether I like it or not. And other industries and agencies love the idea that we should willingly be tracked and monitored constantly, including the NSA and social media companies who profit off knowing the intimate details of our lives and selling ads to us based on it. “Insurers are trying to get us to do this all the time as part of wellness programs,” Golumbia said. “It’s a booming top-down control thing that’s being sold to us as the opposite.”

Golumbia marvels at a recent ad for the Apple Watch that features the Beyoncé song “Freedom” blaring in the background. “How did we get to this world where freedom means having a device on you that measures what you do at all times?”

Keith A. Spencer is a cover editor at Salon.

Facebook and Twitter ‘harm young people’s mental health’

Poll of 14- to 24-year-olds shows Instagram, Facebook, Snapchat and Twitter increased feelings of inadequacy and anxiety

Four of the five most popular forms of social media harm young people’s mental health, with Instagram the most damaging, according to research by two health organisations.

Instagram has the most negative impact on young people’s mental wellbeing, a survey of almost 1,500 14- to 24-year-olds found, and the health groups accused it of deepening young people’s feelings of inadequacy and anxiety.

The survey, published on Friday, concluded that Snapchat, Facebook and Twitter are also harmful. Among the five only YouTube was judged to have a positive impact.

The four platforms have a negative effect because they can exacerbate children’s and young people’s body image worries, and worsen bullying, sleep problems and feelings of anxiety, depression and loneliness, the participants said.

The findings follow growing concern among politicians, health bodies, doctors, charities and parents about young people suffering harm as a result of sexting, cyberbullying and social media reinforcing feelings of self-loathing, and even the risk of suicide.

“It’s interesting to see Instagram and Snapchat ranking as the worst for mental health and wellbeing. Both platforms are very image-focused and it appears that they may be driving feelings of inadequacy and anxiety in young people,” said Shirley Cramer, chief executive of the Royal Society for Public Health, which undertook the survey with the Young Health Movement.

She demanded tough measures “to make social media less of a wild west when it comes to young people’s mental health and wellbeing”. Social media firms should bring in a pop-up image to warn young people that they have been using it a lot, while Instagram and similar platforms should alert users when photographs of people have been digitally manipulated, Cramer said.

The 1,479 young people surveyed were asked to rate the impact of the five forms of social media on 14 different criteria of health and wellbeing, including their effect on sleep, anxiety, depression, loneliness, self-identity, bullying, body image and the fear of missing out.

Instagram emerged with the most negative score. It rated badly for seven of the 14 measures, particularly its impact on sleep, body image and fear of missing out – and also for bullying and feelings of anxiety, depression and loneliness. However, young people cited its upsides too, including self-expression, self-identity and emotional support.

YouTube scored very badly for its impact on sleep but positively in nine of the 14 categories, notably awareness and understanding of other people’s health experience, self-expression, loneliness, depression and emotional support.
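To make the scoring concrete, here is a small, purely hypothetical sketch of how per-platform net scores might be tallied from ratings like these. The -2 to +2 scale and the numbers are invented for illustration; they are not the RSPH survey’s actual data or method.

```python
# Hypothetical illustration only: average invented ratings (one per
# wellbeing criterion, on a -2 very negative to +2 very positive scale)
# into a net score per platform. Not the RSPH survey's real data.
from statistics import mean

ratings = {
    "Instagram": [-2, -1, -1, 1, 0, -1, -2],
    "YouTube": [1, 1, -1, 2, 0, 1, 1],
}

for platform, scores in ratings.items():
    print(f"{platform}: net score {mean(scores):+.2f}")
```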

However, the leader of the UK’s psychiatrists said the findings were too simplistic and unfairly blamed social media for the complex reasons why the mental health of so many young people is suffering.

Prof Sir Simon Wessely, president of the Royal College of Psychiatrists, said: “I am sure that social media plays a role in unhappiness, but it has as many benefits as it does negatives. We need to teach children how to cope with all aspects of social media – good and bad – to prepare them for an increasingly digitised world. There is real danger in blaming the medium for the message.”

Young Minds, the charity which Theresa May visited last week on a campaign stop, backed the call for Instagram and other platforms to take further steps to protect young users.

Tom Madders, its director of campaigns and communications, said: “Prompting young people about heavy usage and signposting to support they may need, on a platform that they identify with, could help many young people.”

However, he also urged caution in how content accessed by young people on social media is perceived. “It’s also important to recognise that simply ‘protecting’ young people from particular content types can never be the whole solution. We need to support young people so they understand the risks of how they behave online, and are empowered to make sense of and know how to respond to harmful content that slips through filters.”

Parents and mental health experts fear that platforms such as Instagram can make young users feel worried and inadequate by facilitating hostile comments about their appearance or reminding them that they have not been invited to, for example, a party many of their peers are attending.

May, who has made children’s mental health one of her priorities, highlighted social media’s damaging effects in her “shared society” speech in January, saying: “We know that the use of social media brings additional concerns and challenges. In 2014, just over one in 10 young people said that they had experienced cyberbullying by phone or over the internet.”

In February, Jeremy Hunt, the health secretary, warned social media and technology firms that they could face sanctions, including through legislation, unless they did more to tackle sexting, cyberbullying and the trolling of young users.

https://www.theguardian.com/society/2017/may/19/popular-social-media-sites-harm-young-peoples-mental-health

Confronting the Great Mass Addiction of Our Era

This examination of today’s tech-zombie epidemic is worth putting your phone down for – at least for a while.

Are you addicted to technology? I’m certainly not. In my first sitting reading Adam Alter’s Irresistible, an investigation into why we can’t stop scrolling and clicking and surfing online, I only paused to check my phone four times. Because someone might have emailed me. Or texted me. One time I stopped to download an app Alter mentioned (research) and the final time I had to check the shares on my play brokerage app, Best Brokers (let’s call this one “business”).

Half the developed world is addicted to something, and Alter, a professor at New York University, informs us that, increasingly, that something isn’t drugs or alcohol, but behaviour. Recent studies suggest the most compulsive behaviour we engage in has to do with cyber connectivity; 40% of us have some sort of internet-based addiction – whether it’s checking your email (on average workers check it 36 times an hour), mindlessly scrolling through other people’s breakfasts on Instagram or gambling online.

Facebook was fun three years ago, Alter warns. Now it’s addictive. This tech zombie epidemic is not entirely our fault. Technology is designed to hook us, and to keep us locked in a refresh/reload cycle so that we don’t miss any news, cat memes or status updates from our friends. Tristan Harris, a “design ethicist” (whatever that is) tells the author that it’s not a question of willpower when “there are a thousand people on the other side of the screen whose job it is to break down the self-regulation you have”. After all, Steve Jobs gave the world the iPad, but made very sure his kids never got near one. Brain patterns of heroin users just after a hit and World of Warcraft addicts starting up a new game are nearly identical. The tech innovators behind our favourite products and apps understood that they were offering us endless portals to addiction. We’re the only ones late to the party.

Addiction isn’t inherent or genetic in certain people, as was previously thought. Rather, it is largely a function of environment and circumstance. Everyone is vulnerable; we’re all just a product or substance away from an uncomfortable attachment of some kind. And the internet, Alter writes, with its unpredictable but continuous loop of positive feedback, simulation of connectivity and culture of comparison, is “ripe for abuse”.

For one thing, it’s impossible to avoid; a recovering alcoholic can re-enter the slipstream of his life with more ease than someone addicted to online gaming – the alcoholic can avoid bars while the gaming addict still has to use a computer at work, to stay in touch with family, to be included in his micro-society.

Secondly, it’s bottomless. Everything is possible in the ideology of the internet – need a car in the middle of the night? Here you go. Want to borrow a stranger’s dog to play with for an hour, with no long-term responsibility for the animal? Sure, there’s an app for that. Want to send someone a message and see when it reaches their phone, when they read it and whether they like it? Even BlackBerry could do that.

Thirdly, it’s immersive – and even worse, it’s mobile. You can carry your addiction around with you. Everywhere. You don’t need to be locked in an airless room or unemployed in order to spend hours online. Moment, an app designed to track how often you pick up and look at your phone, estimates that the average smartphone user spends two to three hours on his or her mobile daily.

I downloaded Moment (the research I mentioned earlier) and uninstalled it after it informed me that, by noon, I had already fiddled away an hour of my time on the phone.

Though the age of mobile tech has only just begun, Alter believes that the signs point to a crisis. Microsoft Canada found that in 2000 our average attention span was 12 seconds long; by 2013, it was eight seconds. Goldfish, by comparison, manage nine. Our ability to empathise, a slow-burning skill that requires immediate feedback on how our actions affect others, suffers the more we disconnect from real-life interaction in favour of virtual interfacing. Recent studies found that this decline in compassion was more pronounced among young girls. One in three teenage girls say their peers are cruel online (only one in 11 boys agree).

Sure, communication technology has its positives. It’s efficient and cheap, it can teach creatively, raise money for worldwide philanthropic causes and disseminate news under and over the reach of censors. But the corrosive culture of online celebrity, fake news and trolling has a downside of its own – namely that we can’t seem to get away from it.

There is a tinge of first world problems in Irresistible. World of Warcraft support groups; a product Alter writes about called Realism (a plastic frame resembling a screenless smartphone, which you can hold to temper your raging internet addiction, but can’t actually use); a spike in girl gaming addicts fuelled by Kim Kardashian’s Hollywood app – it’s difficult to see why these things should elicit much sympathy while one in 10 people worldwide still lack access to clean drinking water. This very western focus on desire and goal orientation is one that eastern thinkers might consider a wrong view of the world and its material attachments, but Alter’s pop-scientific approach still makes for an entertaining break away from one’s phone.

Irresistible is published by Bodley Head.

Internet of Things isn’t fun anymore

IoT’s growing faster than the ability to defend it

The recent DDoS attack was a wake-up call for the IoT, which will get a whole lot bigger this holiday season

This article was originally published by Scientific American.

With this year’s approaching holiday gift season, the rapidly growing “Internet of Things” or IoT — which was exploited to help shut down parts of the Web very recently — is about to get a lot bigger, and fast. Christmas and Hanukkah wish lists are sure to be filled with smartwatches, fitness trackers, home-monitoring cameras and other internet-connected gadgets that upload photos, videos and workout details to the cloud. Unfortunately these devices are also vulnerable to viruses and other malicious software (malware) that can be used to turn them into virtual weapons without their owners’ consent or knowledge.

The recent distributed denial of service (DDoS) attacks — in which tens of millions of hacked devices were exploited to jam and take down internet computer servers — are an ominous sign for the Internet of Things. A DDoS is a cyberattack in which large numbers of devices are programmed to request access to the same website at the same time, creating data traffic bottlenecks that cut off access to the site. In this case, the attackers used malware known as “Mirai” to hack into devices whose passwords they could guess, because the owners either could not or did not change the devices’ default passwords.
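The Mirai vector described above is mundane: the malware simply tries factory-default usernames and passwords. As a rough, hypothetical illustration of that weakness (not Mirai’s actual code), a defensive audit can flag devices still running on defaults; the device inventory and credential pairs below are invented examples.

```python
# Hypothetical audit sketch: flag devices still using factory-default
# credentials, the weakness Mirai exploited. The inventory and the
# default list here are invented for illustration.
COMMON_DEFAULTS = {("admin", "admin"), ("root", "root"), ("admin", "1234")}

devices = [
    {"name": "lobby-camera", "username": "admin", "password": "admin"},
    {"name": "home-router", "username": "admin", "password": "long unique passphrase"},
]

for device in devices:
    if (device["username"], device["password"]) in COMMON_DEFAULTS:
        print(f"{device['name']}: still on a factory default; change it")
    else:
        print(f"{device['name']}: custom credentials set")
```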

The IoT is a vast and growing virtual universe that includes automobiles, medical devices, industrial systems and a growing number of consumer electronics devices. These include video game consoles, smart speakers such as the Amazon Echo and connected thermostats like the Nest, not to mention the smart home hubs and network routers that connect those devices to the internet and one another. Technology items have accounted for more than 73 percent of holiday gift spending in the United States each year for the past 15 years, according to the Consumer Technology Association. This year the CTA expects about 170 million people to buy presents that contribute to the IoT, and research and consulting firm Gartner predicts these networks will grow to encompass 50 billion devices worldwide by 2020. With Black Friday less than one month away, it is unlikely makers of these devices will be able to patch the security flaws that opened the door to the DDoS attack.

Before the IoT attack that temporarily paralyzed the internet across much of the Northeast and other broad patches of the United States, there had been hints that such a large assault was imminent. In September a network, or “botnet,” of Mirai-infected IoT devices launched a DDoS that took down the KrebsOnSecurity website run by investigative cybersecurity journalist Brian Krebs. A few weeks later someone published the source code for Mirai openly on the internet for anyone to use. Within days Mirai was at the heart of the latest attacks, against Dynamic Network Services, or Dyn, a U.S. domain name system (DNS) service provider. Dyn’s computer servers act like an internet switchboard by translating a website address into its corresponding internet protocol (IP) address. A browser needs that IP address to find and connect to the server hosting that site’s content.
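To see why knocking out a DNS provider makes sites unreachable even though their own servers keep running, here is a minimal sketch of that lookup step, using only the Python standard library; the domain is a placeholder.

```python
# Minimal illustration of a DNS lookup: translate a hostname into the
# IP address a browser needs before it can open a connection.
import socket

def resolve(hostname: str) -> str:
    """Ask the system's DNS resolver for the IP address behind a hostname."""
    return socket.gethostbyname(hostname)

if __name__ == "__main__":
    host = "example.com"  # placeholder domain
    print(f"{host} resolves to {resolve(host)}")
    # If the DNS servers answering these queries are overwhelmed,
    # resolution fails and the site appears "down" to users, even
    # though the web server hosting it is perfectly healthy.
```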

The attacks kept the Sony PlayStation Network, Twitter, GitHub and Spotify’s web teams busy most of the day but had little impact on the owners of the devices hijacked to launch the attacks. Most of the people whose cameras and other digital devices were involved will never know, said Matthew Cook, a co-founder of Panopticon Laboratories, a company that specializes in developing cybersecurity for online games. Cook was speaking on a panel at a cybersecurity conference in New York last week.

But consumers will likely start paying more attention when they realize that someone could spy on them by hacking into their home’s web cameras, said another conference speaker, Andrew Lee, CEO of security software maker ESET North America. An attacker could use a web camera to learn occupants’ daily routines — and thus know when no one is home — or even to record passwords as they are typed into computers or mobile devices, Lee added.

The IoT is expanding faster than device makers’ interest in cybersecurity. In a report released last week by the National Cyber Security Alliance and ESET, only half of the 15,527 consumers surveyed said that concerns about the cybersecurity of an IoT device have discouraged them from buying one. Slightly more than half of those surveyed said they own up to three devices — in addition to their computers and smartphones — that connect to their home routers, with another 22 percent having between four and 10 additional connected devices. Yet 43 percent of respondents reported either not having changed their default router passwords or not being sure if they had. Also, some devices’ passwords are difficult to change and others have permanent passwords coded in.

With little time for makers of connected devices to fix security problems before the holidays, numerous cybersecurity researchers recommend consumers at the very least make sure their home internet routers are protected by a secure password.

The lie of white “economic insecurity”: Race, class and the rise of Donald Trump

The media loves to promote the lie that the white working class supports Trump and the GOP for economic reasons

Questions of race and class have cast a heavy shadow over a presidential campaign in which “economic insecurity” has been repeatedly identified (quite incorrectly) by the mainstream news media as the driving force behind the rise of Donald Trump. In response, there has been a flurry of recent articles and essays exploring how matters of race and class are influencing the decision by “white working class” voters to support Donald Trump’s fascist, racist and nativist campaign for the White House.

Writing at The Guardian, sociologist Arlie Hochschild offers a devastating critique of how race and class intersect for white working-class American voters. In “How the Great Paradox of American Politics Holds the Secret to Trump’s Success,” Hochschild explores how white voters in the South and elsewhere rationalize their support for a Republican Party and a “small government” ethos that has devastated their lives and communities. She tells this story by focusing on one person, Lee Sherman, and his journey from pipefitter at a petrochemical plant to environmental activist and whistleblower to eventual Tea Party activist. Hochschild writes:

Yet over the course of his lifetime, Sherman had moved from the left to the right. When he lived as a young man in Washington State, he said proudly, “I ran the campaign of the first woman to run for Congress in the state.” But when he moved from Seattle to Dallas for work in the 1950s, he shifted from conservative Democrat to Republican, and after 2009, to the Tea Party. So while his central life experience had been betrayal at the hands of industry, he now felt – as his politics reflected – most betrayed by the federal government. He believed that PPG and many other local petrochemical companies at the time had done wrong, and that cleaning the mess up was right. He thought industry would not “do the right thing” by itself. But still he rejected the federal government. Indeed, Sherman embraced candidates who wanted to remove nearly all the guardrails on industry and cut the EPA. The Occupational Safety and Health Administration had vastly improved life for workmen such as Sherman – and he appreciated those reforms – but he felt the job was largely done.

Lee Sherman’s story is all too common. Because of political socialization by the right-wing media, the Christian evangelical movement, and closed personal and social networks, many white conservative voters are unable to practice the systems level thinking necessary to connect their day-to-day struggles with the policies put in place by the Republican Party.

While this way of seeing and understanding the social and political world (what Walter Lippmann influentially described as “the pictures inside of people’s heads”) may be at odds with the type of critical thinking and evidence-based reasoning that liberals and progressives take for granted, it still exerts a powerful hold over many millions of conservatives. This alternate reality is, not surprisingly, anchored in place by the right-wing disinformation machine and Fox News.

Hochschild’s essay is further evidence of what I suggested in an earlier piece here at Salon: Republicans and the broader right-wing movement profit from a Machiavellian relationship where the more economic pain and suffering they inflict on red-state America, the more popular and powerful they become with those voters. This is political sadism as a campaign strategy.

Politico’s “What’s Going on With America’s White People?” features commentary by leading scholars and journalists such as Anne Case, Angus Deaton, Nancy Isenberg, Carol Anderson and J.D. Vance, whose collective work examines the relationships between race, class and white America. The piece highlights how death anxieties greatly influence the political calculations and decision-making of white conservatives in red-state America. These people use their own broken communities — places that are awash in prescription drug addiction, have high rates of out-of-wedlock births and divorce, and see deaths of despair (suicides involving guns and alcohol; chronic untreated illnesses) reign — to draw incorrect conclusions about America as a whole. These anxieties have combined with increasing levels of authoritarianism, racial resentment and old-fashioned racism among white conservatives and right-leaning independents to fuel extreme political polarization and make the emergence of a demagogue such as Donald Trump a near inevitability.

If the fever swamps that birthed Donald Trump are to be drained, there needs to be a renewed focus on the dynamics of race and class for white (conservative) voters during this 2016 presidential election. But these analyses should also be accompanied by several qualifiers.

First, liberals and progressives are often easily seduced by a narrative, popularized by Thomas Frank and others, in which white working-class and poor Americans are depicted as having been hoodwinked into voting for the Republican Party. In this argument, white poor and working-class red-state voters chose “culture war” issues over economic policies. However, as compellingly demonstrated by political scientist Larry Bartels (and complemented by fellow political scientist Andrew Gelman), poor and other lower-income voters tend to vote for the Democratic Party while middle- and upper-income voters tend to vote for the Republican Party. Poor and lower-income (white) voters also participate in formal politics less frequently than middle- and upper-income voters. Moreover, “culture war” issues did not drive a mass defection of white working-class voters from the Democratic Party to the GOP. In sum, it is white economic and political elites, and not the white poor and working classes, who are largely responsible for the political and social dysfunction that plagues American politics today.

Second, since its very founding America has been struggling with two powerful impulses. On one hand, there is a truly progressive and left-wing type of pluralism that seeks to work across lines of race and class in order to create an inclusive democracy where upward mobility and the fruits of full citizenship are equally attainable for all people. This type of pluralism is embodied by Bernie Sanders — and to a lesser degree Hillary Clinton and the broader Democratic Party. Juxtaposed against this is a right-wing and reactionary type of pluralism that is exclusive and not inclusive, stokes the fires of racial and ethnic division, and offers a vision of America where white people stand on the necks of non-whites in order to elevate themselves. This is embodied by Donald Trump and a Republican Party that functions as the United States’ largest de facto white identity organization.

Most importantly, the white “working-class” and poor voters featured in the recent pieces by Politico and The Guardian possess agency. It has long been fashionable for liberals and progressives to suggest that the white poor and working classes are confused by “false consciousness” as demonstrated by their allegiance to America’s racial hierarchy and an economic system that often disadvantages people like them. In reality, the white poor and working class are keenly aware of the psychological and material advantages that come with whiteness and white privilege.

Whiteness is a type of property in the United States. For centuries, white people, across lines of class and gender, have coveted and fiercely protected it. The white working class and poor are not victims in this system; they have benefited greatly from it at the expense of non-whites. Ultimately, as Americans try to puzzle through their current political morass, a renewed emphasis on race and class is invaluable because it serves as a reminder of how simple binaries (one must choose between discussing either “race” or “class”) and crude essentialism (“a focus on class inequality will do more good than confronting racism!”) often disguise and confuse more than they reveal.

Chauncey DeVega is a politics staff writer for Salon. His essays can also be found at Chaunceydevega.com. He also hosts a weekly podcast, The Chauncey DeVega Show. Chauncey can be followed on Twitter and Facebook.