Millennial children use smartphones to spy on their parents

Smartphones’ tracking features have caused a role-reversal; one woman observed her parents lie about dinner plans


Buried in a Wall Street Journal story about savvy young Americans using technology to get one up on their parents lies a tale of Millennial family drama.

This generation of young adults is leading the anti-Trump charge. They’re killing industry after industry, but keeping libraries alive. Most importantly, though, they’re living with their parents because, despite their education, they can’t find jobs to keep them afloat.

So the generation that matured in tandem with the internet can use its parents’ phones to determine where those parents are at all times. The Wall Street Journal looked at this trend and found a number of teens and young adults using location-tracking apps for Ferris Bueller-like debauchery — throwing sleepovers for friends and cleaning up just before their parents got home, for example. But one story stood out, because it’s simply the sad tale of a couple — one woman’s parents — who just want a cheap night out.

Alexa McDonald of Columbus, Ohio, discovered that her parents fib a bit. The 24-year-old call-center dispatcher’s app revealed that, while claiming to be stuck in traffic, they were sometimes at a restaurant. “I never called them out on it, but inside in my head I was like, ‘I’m hungry, I would have loved to have been included in that.’”

“I guess we’ve done that,” says her mother, Claudia McDonald of Mount Vernon, Ohio. One reason: “Whatever the dish is, she always wants shrimp on it,” she says. “It’s one of the more expensive things on the menu.”

Millennials are ruining family dinners.


Trump Damaged Democracy, Silicon Valley Will Finish It Off

‘BRAIN HACKING’

Donald Trump’s rise is, in a sense, just one symptom of the damage the tech oligarchs are doing to America.

When Democrats made their post-election populist “Better Deal” pitch, they took a strong stance against pharmaceutical and financial monopolies. But they conspicuously left out the most profound antitrust challenge of our time—the tech oligarchy.

The information sector, notes The Economist, is now the most consolidated sector of the American economy.

Silicon Valley and its Puget Sound annex, dominated by Google, Apple, Facebook, Amazon, and Microsoft, increasingly resemble the pre-gas-crisis Detroit of the Big Three. Tech’s Big Five all enjoy overwhelming market shares—for example, Google controls upwards of 80 percent of global search—and the capital to either acquire or crush any newcomers. They are bringing us an age that is hardly gilded: one not of broad prosperity but of depressed competition, economic stagnation, and, increasingly, a chilling desire to control the national conversation.

Jeff Bezos harrumphs through his chosen megaphone, The Washington Post, about how “democracy dies in darkness.” But if Bezos—the world’s third richest man, who used the Post first to undermine Bernie Sanders and then to wage ceaseless war on the admittedly heinous Donald Trump—really wants to identify the biggest long-term threat to individual and community autonomy, he should turn on the lights and look in the mirror.

Trump’s election and volatile presidency may pose a more immediate menace, but when he is gone, or neutered by lack of support, the oligarchs’ damage to our democracy and culture will continue to metastasize.

Killing the Old Silicon Valley

Americans justifiably take pride in the creative and entrepreneurial genius of Silicon Valley. The tech sector has been, along with culture, agriculture, and energy, one of our most competitive industries, one defined by risk-taking and intense competition between firms in the Valley, and elsewhere.

This old model is fading. All but shielded from antitrust laws, the new Silicon Valley is losing its entrepreneurial yeastiness—which, ironically enough, was in part spawned by government efforts against old-line monopolists such as AT&T and IBM. While the industry still promotes the myth of stalwart tinkerers in their garages seeking to build the next great company, the model now is for founders to get funding so that their company can be acquired by Facebook or one of the other titans. As one recent paper demonstrates, these “super platforms” depress competition, squeeze suppliers and reduce opportunities for potential rivals, much as the monopolists of the late 19th century did. The rush toward artificial intelligence, requiring vast reservoirs of both money and talent, may accelerate this consolidation. A few firms, such as Tesla or Uber, may join the oligarchy over time, but these are controlled by the same investors as the current Big Five.

This new hierarchy is narrowing the path to riches, or even to the middle class. Rather than expand opportunity, the Valley increasingly creates jobs in the “gig economy,” which promises not a way into the middle class, much less riches, but entry into the rising precariat of part-time, conditional workers. This gig economy will likely expand with the digitization of retail, which could cost millions of working-class jobs.

For most Americans, the once promising “New Economy” has meant a descent, as MIT’s Peter Temin recently put it, toward a precarious position usually associated with developing nations. Workers in the gig economy, unlike the old middle and working classes, have little chance, for example, of buying a house, once a sure sign of upward mobility—something that is depressingly evident in the Bay Area, along the California coast, and in parts of the Northeast.

Certainly the chances of striking out on one’s own have diminished. Sergey Brin, Google’s co-founder, recently suggested that startups would be better off moving from Silicon Valley to areas that are less expensive and less heavily regulated, and where the competition for talent is not dominated by a few behemoths who can gobble up potential competitors—Instagram, WhatsApp, Skype, LinkedIn, Oculus—or slowly crush them, as may be happening to Snap, a firm that followed the old model, refused to be swallowed by Facebook and went through with its own public offering. Now the Los Angeles-based company is under assault from the social media giant, which is using its Instagram unit, itself an acquisition, to duplicate Snap’s trademark technologies and features.

Snap’s problems are not an isolated case. The result is that the number of high-tech startups is down by almost half from just two years ago; overall, the National Venture Capital Association reports that the number of deals is now at its lowest level since 2010. Outsiders, the supposed lifeblood of entrepreneurial development, are becoming irrelevant in an increasingly closed system.

The New Hierarchy

For all its talk about “disruption,” Silicon Valley is increasingly about three things: money, hierarchy, and conformity. Tech entrepreneurs have long enjoyed financial success, but their dominance in the ranks of the ultra-rich has never been so profound. They now account for three of the world’s five richest people—Bill Gates, Jeff Bezos, and Mark Zuckerberg—and dominate the list of billionaires under 40.

Unlike the often ruthless and unpleasant moguls of the 20th century, the Silicon Valley elite has done relatively little for the country’s lagging productivity or to create broad-based opportunity. The information sector has overall been a poor source of new jobs—roughly 70,000 since 2010—with the gains concentrated in just a few places. This comes as the number of generally more middle-class jobs tied to producing equipment has fallen by half since 1990 and most new employment opportunities have been in low-wage sectors like hospitality, medical care, and food preparation.

The rich, that is, have gotten richer, in part by taking pains to minimize their tax exposure. Now they are talking grandly about having the government provide all the newly “excess” humans with a guaranteed minimum income. The titans who have shared or spread so little of their own wealth are increasingly united in the idea that the government—i.e., middle-class taxpayers—should spread more around.

Not at all coincidentally, the Bay Area itself—once a fertile place of grassroots and middle-class opportunity—now boasts an increasingly bifurcated economy. San Francisco, the Valley’s northern annex, regularly clocks in as among the most unequal cities in the country, with both extraordinary wealth and a vast homeless population.

The more suburban Silicon Valley now suffers a poverty rate of nearly 20 percent, above the national average. It also has its own large homeless population living in what KQED has described as “modern nomadic villages.” In recent years, income gains in the region have flowed overwhelmingly to the top quintile of income-earners, who have seen their wages increase by over 25 percent since 1989, while income levels have declined for low-income households.

Despite endless prattling about diversity, African Americans and Hispanics, who make up roughly one-third of the valley’s population, hold barely 5 percent of jobs in the top Silicon Valley firms. Between 2009 and 2011, earnings dropped 18 percent for blacks in the Valley and 5 percent for Latinos, according to a 2013 Joint Venture Silicon Valley report.

Similarly, the share of women in the tech industry is barely half of their 47 percent share of the total workforce, and their ranks may even be shrinking. Stanford researcher Vivek Wadhwa describes the Valley as still “a boys’ club that regarded women as less capable than men and subjected them to negative stereotypes and abuse.”

While the industry hasn’t done much to actually employ women or minorities, it has both self-righteously and opportunistically fed the outrage industry by booting right-wing voices from various platforms and pushing out people like former Google staffer James Damore and, before him, Mozilla co-founder Brendan Eich, forced out in 2014 after he made a small contribution to a measure banning gay marriage. Skepticism, once the benchmark of technology development, is now increasingly unwelcome in much of the Valley.

This marks a distinct change from the ’80s and ’90s, when the tech companies—then still involved in the manufacturing of physical products in the United States—tended toward libertarian political views. As late as the 1980s, moderate Republicans frequently won elections in places like San Mateo and Santa Clara. Now the area has evolved into one of the most one-sidedly progressive bastions in the nation. Over 70 percent of Bay Area residents are Democrats, up from 55 percent in the 1970s. Today the Calexit backers, many based in the Valley, even think the rest of the country is too dunderheaded, and suggest they represent “different,” and morally superior, values.

The Danger to Democracy

If these were policies adopted by an ice-cream chain or a machine-tool maker, they might be annoying. But coming from the tech giants, with their vast and growing power to shape opinion, they represent an existential threat. Mark Zuckerberg, whose Facebook is now the largest source of media for younger people, has emerged, in the words of one European journalist, as “the world’s most powerful editor.” These platforms have been the primary carriers of “fake news” and have done as much as any institution to erode the old values (and economics) of journalism.

Both Facebook and Google now offer news “curated” by algorithms. Bans are increasingly used by Facebook and Twitter to keep out unpopular or incendiary views, especially in the echo chamber of the Bay Area. This is sometimes directed at conservatives, such as Prager University, whose content may be offensive to some but is hardly subversive or “fake.” The real crime now is simply to question the dominant ideology of Silicon Valley gentry progressivism.

Even at their most powerful, the industrial-age moguls could not control what people knew. They might back a newspaper, or later a radio or television station, but they never secured absolute control of media. Competing interests still tussled in a highly regionalized and diverse media market. In contrast, the digital universe, dominated by a handful of players located in just a few locales, threatens to make a pluralism of opinions a thing of the past. The former Google design ethicist Tristan Harris suggests that “a handful of tech leaders at Google and Facebook have built the most pervasive, centralized systems for steering human attention that has ever existed.”

Ultimately, particularly after the disasters associated with the Trump regime, the oligarchs seem certain to expand their efforts to control the one institution that could challenge their hegemony: government. Once seen as politically marginal, the oligarchs achieved a dominant role in the Democratic Party, in part by financing President Obama and later supporting Hillary Clinton. In the Obama years Google operatives were in fact fairly ubiquitous, leading at least one magazine to label it “the Android Administration.” Since then a stream of Obama people have headed to Silicon Valley, working for firms such as Apple, Uber, and Airbnb. Obama has even mused about becoming a venture capitalist himself.

Of course, with Trump in power, the oligarchs are mostly on the outs, although the twitterer in chief tried to recruit them. Now many of Silicon Valley’s power players are supporting the “resistance” and lending their expertise to Democratic campaigns. Unlike undocumented immigrants or other victims of Trumpism, they can count on many GOP politicians to watch their flank until the lunatic storm recedes.

In a future Democratic administration, as is already evident in places like California, the tech titans will use their money, savvy, and new dominance over our communications channels to steer and even dictate America’s political and cultural agendas, wielding power in ways that even the likes of J.P. Morgan or John D. Rockefeller would envy.

What started as a brilliant, and profoundly non-political, extension of the information revolution, notes early Google and Facebook investor Roger McNamee, now looms as “a menace,” part of a systematic “brain hacking” on a massive scale. We can choose to confront this reality—as the early 20th century progressives did—or stand aside and let the oligarchs chart our future without imposing any curbs on their seemingly inexorable hegemony.

http://www.thedailybeast.com/trump-damaged-democracy-silicon-valley-will-finish-it-off?via=newsletter&source=Weekend

Free your brain: How Silicon Valley denies us the freedom to pay attention

A continual quest for attention both drives and compromises Silicon Valley’s techno-utopian vision


In late June, Mark Zuckerberg announced the new mission of Facebook: “To give people the power to build community and bring the world closer together.”

The rhetoric of the statement is carefully selected, centered on empowering people, and in so doing, ushering in world peace, or at least something like it. Tech giants across Silicon Valley are adopting similarly utopian visions, casting themselves as the purveyors of a more connected, more enlightened, more empowered future. Every year, these companies articulate their visions onstage at internationally streamed pep rallies, Apple’s WWDC and Google’s I/O being the best known.

But companies like Facebook can only “give people the power” because we first ceded it to them, in the form of our attention. After all, that is how many Silicon Valley companies thrive: Our attention, in the form of eyes and ears, provides a medium for them to advertise to us. And the more time we spend staring at our screens, the more money Facebook and Twitter make — in effect, it’s in their interest that we become psychologically dependent on the self-esteem boost from being wired in all the time.

This quest for our eyeballs doesn’t mesh well with Silicon Valley’s utopian visions of world peace and people power. Earlier this year, many sounded alarm bells when a “60 Minutes” exposé revealed the creepy cottage industry of “brain-hacking,” industrial psychology techniques that tech giants use and study to make us spend as much time staring at screens as possible.

Indeed, it is Silicon Valley’s continual quest for attention that both motivates its utopian dreams and compromises them from the start. As a result, the tech industry often has compromised ethics when it comes to product design.

Case in point: At January’s Consumer Electronics Show – a sort of Mecca for tech start-ups dreaming of making it big – I found myself in a suite with one of the largest kid-tech (children’s toys) developers in the world. A small flock of PR reps, engineers and executives hovered around the entryway as one development head walked my photographer and me through the mock setup. They were showing off the first voice assistant developed solely with kids in mind.

At the end of the tour, I asked if the company had researched or planned to research the effects of voice assistant usage on kids. After all, parents had been using tablets to occupy their kids for years by the time evidence of their less-than-ideal impact on children’s attention, behavior and sleep emerged.

The answer I received was gentle but firm: No, because we respect parents’ right to make decisions on behalf of their children.

This free-market logic – that says the consumer alone arbitrates the value of a product – is pervasive in Silicon Valley. What consumer, after all, is going to argue they can’t make their own decisions responsibly? But a free market only functions properly when consumers operate with full agency and access to information, and tech companies are working hard to limit both.

During a “60 Minutes” story on brain hacking, former product manager at Google Tristan Harris said, “There’s always this narrative that technology’s neutral. And it’s up to us to choose how we use it.”

The problem, according to Harris, is that “this is just not true… [Developers] want you to use it in particular ways and for long periods of time. Because that’s how they make their money.”

Harris was homing in on the fact that, increasingly, it isn’t the price tag on the platform itself that earns companies money, but the attention they control on said platform – whether it’s a voice assistant, operating system, app or website. We literally “pay” attention to ads or sponsored content in order to access websites.

But Harris went on to explain that larger platforms, using systems of rewards similar to slot machines, are working not only to monetize our attention, but also to monopolize it. And with that monopoly comes incredible power.

If Facebook, for instance, can control hours of people’s attention daily, it can not only determine the rate at which it will sell that attention to advertisers, but also decide which advertisers or content creators it will sell to. In other words, in an attention economy Facebook becomes a gatekeeper for content – one that mediates not only personalized advertising, but also news and information.

This sort of monopoly brings the expected fiscal payoff, and also the amassing of immeasurable social and cultural power.

So how does Facebook’s new mission statement fit into this attention economy?

Think of it in terms of optics. The carotid artery of Facebook, along with the other tech giants of Silicon Valley, is brand. Brand ubiquity means Facebook is the first thing people check when they take their phones out of their pockets, or when they open Chrome or Safari (brought to you by Google and Apple, respectively). It means Prime Day is treated like a real holiday. Just like Kleenex means tissues and Xerox means copy, online search has literally become synonymous with Google.

Yet all these companies are painfully aware of what a brand-gone-bad can do – or undo. The current generation of online platforms is built on the foundations of empires that rose and fell while the attention economy was still incipient. Today’s companies have maintained their centrality by consistently copying (Instagram Stories, a clone of Snapchat) or outright purchasing (YouTube) their fiercest competitors – all to maintain or expand their brand.

And perhaps as important, tech giants have made it near impossible to imagine a future without them, simply by being the most prominent public entities doing such imagining.

Facebook’s mission affixes the company in our shared future, and also injects it with a moral or at least charitable sensibility – even if it’s only in the form of “bring[ing] the world closer together”-type vagaries.

So how should we as average consumers respond?

In his award-winning essay “Stand Out of Our Light: Freedom and Persuasion in the Attention Economy,” James Williams argues, “We must … move urgently to assert and defend our freedom of attention.”

To assert our freedom is to sufficiently recognize and evaluate the demands on our attention that all these devices and digital services represent. To defend our freedom entails two forms of action. The first is individual: not unplugging completely, as the self-styled prophets of Facebook and Twitter encourage (before logging back on after a few months of asceticism), but rather unplugging partially, habitually and ruthlessly.

Attention is the currency upon which tech giants are built. And the power of agency and free information is the power we cede when we turn over our attention wholly to platforms like Facebook.

But individual consumers can only do so much. The second way we must defend our freedom is through our demand for ethical practices from Silicon Valley.

Some critics believe government regulation is the only way to rein in Silicon Valley developers. The problem is, federal agencies that closely monitor the effects of product usage on consumers don’t yet have a good category for online platforms. The Food and Drug Administration (FDA) tracks medical technology. The Consumer Product Safety Commission (CPSC) focuses on physical risk to consumers. The Federal Communications Commission (FCC) focuses on content — not platform. In other words, we don’t have a precedent for monitoring social media or other online platforms and their methods for retaining users.

Currently, there is no corollary agency that leads dedicated research into the effects of platforms like Facebook on users. There is no Surgeon General’s warning. There is no real protection for consumers from unethical practices by tech giants — as long as those practices fall in the cracks between existing ethics standards.

While it might seem idealistic to hold out for the creation of a new government agency that monitors Facebook (especially given the current political regime), the first step toward curbing Silicon Valley’s power is simple: We must acknowledge freedom of attention as an inalienable right — one inextricable from our freedom to pursue happiness. So long as the companies producing the hardware surrounding us and the platforms orienting social life online face no strictures, they will actively work to control how users think, slowly eroding our society’s collective free will.

With so much at stake, and with so little governmental infrastructure in place, checking tech giants’ ethics might seem like a daunting task. The U.S. government, after all, has demonstrated a consistent aversion to challenging Silicon Valley’s business and consumer-facing practices.

But while we fight for better policy and stronger ethics-enforcing bodies, we can take one more practical step: “pay” attention to ethics in Silicon Valley. Read about Uber’s legal battles and the most recent research on social media’s effects on the brain. Demand more ethical practices from the companies we patronize. Why? The best moderators of technology ethics thus far have been tech giants themselves — when such moderation benefits the companies’ brands.

In Silicon Valley, money talks, but attention talks louder. It’s time to reclaim our voice.

http://www.salon.com/2017/08/05/free-your-brain-how-silicon-valley-denies-us-the-freedom-to-pay-attention/?source=newsletter

CNN’s “The Nineties”: Empty nostalgia for a decade we should let die

CNN delves into a decade of pat neoliberalism and hollow spectacle and, unsurprisingly, comes up with nothing


To anyone who came of age in the 1990s, the current cultural ascent of fidget spinners is likely to induce an acute pang of recognition — equal parts wistful nostalgia, anxiety and woozy terror. The ‘90s were, as any certified “Nineties Kid” can attest, a decade marked by a succession of asinine schoolyard fads.

One can imagine an alternative timeline of the decade that marks time not by year, but the chronology of crazes: the Year of the Beanie Baby, the Year of the Tamagotchi, the Years of the Snap-Bracelet, the Macarena, the Baggy Starter Jacket, the Painstakingly Layered “The Rachel” Hairdo, and so on. What’s most remarkable about our culture’s whirring fidget spinner fetish is that it didn’t happen sooner; that this peak fad didn’t emerge from among the long, rolling sierra of hollow amusements that defined the 1990s.

Surveying the current pop-culture landscape, one gets the sense that the ‘90s— with all its flash-in-the-pan fads and cooked-up crazes — never ended. On TV, “The Simpsons” endures into its 28th season, while David Lynch and Mark Frost’s oddball ABC drama “Twin Peaks” enjoys a highly successful, and artistically fruitful, premium-cable revival. The Power Rangers, Ninja Turtles, Transformers and Treasure Trolls have graduated from small-screen Saturday morning silliness to blockbuster entertainments.

Elsewhere, the “normcore”/“dadcore”/“lazycore” fashion of singers like Mac DeMarco has made it OK (even haute) to dress up like a “Home Improvement”-era Jonathan Taylor Thomas. And Nintendo recently announced its latest money-printing scheme, the forthcoming SNES Classic Mini: a miniature throwback video game console chock-full of nostalgia-baiting Console Wars standbys like “Donkey Kong Country,” “F-Zero” and “Star Fox.” Content mills like BuzzFeed, Upworthy and their ilk bolster their bottom line churning out lists and quizzes reminding you that, yes, the show “Rugrats” existed.

To quote a nostalgic ’97-vintage hit single, which was itself a throwback to ‘60s jazz-pop, it’s all just a little bit of history repeating.

It’s natural to long for the past: to trip down memory lane, get all dewy-eyed, pine for the purity of the long-trampled gardens of innocence, and go full Proust on the bric-a-brac of youth that manages to impress itself on the soft, still-maturing amber of the adolescent mind, even if that stuff was total crap like Moon Shoes or a Street Shark or a Totally Hair Barbie doll or a bucket of Nickelodeon-brand goo called “Gak.” The 1990s, however, offered a particularly potent nostalgia trap, something revealed by watching CNN’s new TV documentary miniseries “event,” fittingly called “The Nineties.”

A follow-up to CNN’s previous history-of-a-decade events (“The Sixties,” “The Seventies” and “The Eighties”) and co-produced by Tom Hanks, the series provides some valuable insight into the nature of ’90s nostalgia. The two-part series opener, called “The One About TV,” threads the needle, examining the ways in which television of the era shifted the standards of cultural acceptability, be it in Andy Sipowicz’s expletive-laden racism, Homer Simpson’s casual stranglings of his misfit son or the highbrow, Noel Coward-in-primetime farces of “Frasier.”

To believe CNN’s procession of talking heads, damn near every TV show to debut after midnight on Jan. 1, 1990, was “revolutionary.” “The Simpsons” was revolutionary for the way it hated TV. “Twin Peaks” was revolutionary for the way it subverted it. “Seinfeld” ignored (or subtracted, into its famous “Show About Nothing” ethic) the conventions of the sitcom. “Frasier” elevated them. “Will & Grace,” “Ellen” and “The Real World” bravely depicted gay America. Ditto “Arsenio,” “Fresh Prince” and “In Living Color” in representing black America. “OZ” was revolutionary for its violence. “The Sopranos” was revolutionary in how it got you to root for the bad guy. “Friends” was revolutionary because it showed the day-to-day lives of, well, some friends. If the line of argumentation developed by “The Nineties” is to be believed, the TV game was being changed so frequently that it was becoming impossible to keep up with the rules.

Despite seeming argumentatively fallacious (if everything is subversive or game-changing, then, one might argue, nothing is), and further debasing the concept of revolution itself, such an argument cuts to the heart of ‘90s nostalgia. In pop culture, it was an era of seeming possibility, where it became OK to talk about masturbation (in one of “Seinfeld’s” more famous episodes) or even anal sex (as on “Sex & the City”), where “Twin Peaks” and “The Sopranos” spoke to the rot at the core of American life. “The Nineties” paints a flattering, borderline obsequious portrait of Gen-X ’90s kids as too hip, savvy and highly educated to be suckered in by the gleam and obvious propaganda that seemed to define “The Eighties.” (The ’90s kid finds a generational motto in the tagline offered by Fox’s conspiratorial cult sci-fi show “The X-Files”: trust no one.)

What “The Nineties” misses — very deliberately, one imagines — is the guiding cynicism of such revolutions in television. Far from being powered by a kind of radical politics of inclusivity, TV was (and remains) guided by its ability to deliver certain demographics to advertisers. In the 1990s, these demographics splintered, becoming more specialized. Likewise, entertainment streams split. The bully “mean girls” watched “90210,” the bullied watched “My So-Called Life,” and the kids bullied by the bullied watched “Buffy the Vampire Slayer.” Then on Thursday night, everyone watched “Seinfeld.”

This parade of prime-time cultural revolutions betrayed the actual guiding political attitude of the decade: stasis. The second episode of “The Nineties” turns to the scandal-plagued political life of Bill Clinton. “A new season of American renewal has begun!” beams Clinton, thumb pressed characteristically over a loosely clenched fist, early in the episode. For the Democrats, Bill Clinton seemed like a new hope: charming, charismatic, hip, appearing in sunglasses on Arsenio to blow his saxophone. But like so many of TV’s mock-insurgencies, the Clinton presidency was a coup in terms of aesthetics, and little else.

Beyond the sundry accusations of impropriety against him (Whitewater, the Paula Jones and Monica Lewinsky sex scandals, etc.), Clinton supported the death penalty, “three strikes” sentencing, NAFTA, “don’t ask, don’t tell” and countless other policies that alienated him from his party’s left-progressive wing. Clinton embodied the emerging neoliberal ethic: cozying up to big banks and supporting laissez-faire economic policies that further destabilized the American working and middle classes, while largely avoiding the jingoist militarism, nationalism and family-values moralism of ’80s Reaganomics. Clinton’s American renewal was little more than a face-lift.

“The Simpsons,” naturally, nailed this devil-you-know distinction in a 1996 Halloween episode, which saw the bodies of Bill Clinton and then-presidential rival Bob Dole inhabited by slithering extraterrestrials. Indistinguishable in terms of tone and policy, the body snatching alien candidates beguiled the easily duped electorate with nonsensical stump speeches about moving “forward, not backward; upward, not forward; and always twirling, twirling, twirling towards freedom.”

A 1992 book by the American political scientist Francis Fukuyama summed up the ’90s’ neoliberal approach to politics. In “The End of History and the Last Man,” Fukuyama posited that the collapse of the Soviet Union following the Cold War had resolved any grand ideological and historical conflicts in world politics. Liberal democracy and capitalism had won the day. Free market democracy was humanity’s final form. History — or at least the concept of history as a process of sociological evolution and conflict between competing political systems — had run its course.

Following the publication of “The End of History,” Fukuyama became an institutional poli-sci Svengali (John Gray at the New Statesman dubbed him the “court philosopher of global capitalism”), with his ideas holding significant sway in political circles. The 1990s in America, and the Clinton presidency in particular, were a self-styled realization of the “end of history.” In the wake of the Cold War and the collapse of the Berlin Wall, the president’s position was largely functionary: enable the smooth functioning of markets and the free flow of capital. Such was the horizon of political thought.

Fukuyama’s book has been subjected to thorough criticism for its shortsightedness — not least of all for the way in which its central argument only serves to consolidate and naturalize the authority of the neoliberal elite. More concretely, 9/11 and its aftermath are often cited as signals of the “revenge of history,” which introduced new, complicated clashes of world-historical ideologies.

Though it’s often touted for its triumphalism, as a cheerleading handbook for the success of Westernized global capitalism, Fukuyama’s end-of-history theory is suffused with a certain melancholy. There’s one passage, often overlooked, which speaks to the general content and character of the ’90s (and “The Nineties”). “The end of history will be a very sad time,” he writes. “In the post-historical period there will be neither art nor philosophy, just the perpetual caretaking of the museum of human history. I can feel in myself, and see in others around me, a powerful nostalgia for the time when history existed. Such nostalgia, in fact, will continue to fuel competition and conflict even in the post-historical world for some time to come.”

Our fresh new millennium has been marked, in political terms, by cultural clashes between decadent Western liberalism and militant Islamism (both sides bolstering their positions with the hollow rhetoric of religious zealotry), the abject failure of both the Democratic and Republican parties, the reappearance of white supremacist and ethno-nationalist thinking, the thorough criticism of neoliberalism, and the rise of a new progressive-left (signaled by the popularity of Jeremy Corbyn and Bernie Sanders), alongside a similarly invigorated form of moderatism referred to as “the extreme centre.” Amid such wild vicissitudes, the placid neoliberal creep of Fukuyama’s “post-history” feels downright quaint.

This is the sort of modern nostalgia that CNN’s “The Nineties” taps into: a melancholy for the relative stability of a decade that was meant to mark the end of history itself. Not only did things seem even-keeled, but everything (a haircut, a Game Boy game about tiny Japanese cartoon monsters, a sitcom episode about waiting for a table) seemed radical, revolutionary and, somehow, deeply profound. We are, perhaps invariably, prone to feeling elegiac for even the hollowness of A Decade About Nothing. That is particularly so because the 1990s abide in our politicians, our ideologies, our prime-time entertainments, our blockbuster movies and even, yes, our faddish toys, designed to ease our fidgety anxiety about the muddled present and keep us twirling, twirling back into the memory of a simpler, stupider past.

John Semley lives and works in Toronto. He is a books columnist at the Globe & Mail newspaper and the author of “This Is A Book About The Kids In The Hall” (ECW Press).

The techie is the new hipster — but what is tech culture?

The archetype of the “techie” has become commonplace in the past decade in art and in real life. But what is it?


If you live in any major city in the world, you probably know the type: they roam the clean parts of town, lattes in hand, wearing American Apparel hoodies emblazoned with logos of vowel-deficient startups. Somehow, in the past decade, a profession turned into a lifestyle and a culture, with its own customs, habits and even lingo. In film, television and literature, the techie archetype is mocked and recycled, rendered as a stereotype (as in Mike Judge’s sitcom “Silicon Valley”), as a radical hero (as in “Mr. Robot”), or as both (as in “The Circle”).

If, as many claim, the hipster died at the end of the 2000s, the techie seems to have taken its place in the 2010s — not quite an offshoot, but rather a mutation. Consider the similarities: Like hipsters, techies are privy to esoteric knowledge, though of obscure code rather than obscure bands. They both seem to love kale. They tend to rove in packs, are associated with gentrification, and are overwhelmingly male. There are some fashion similarities: the tight jeans, the hoodie fetish, the predilection for modernist Scandinavian furniture. And like “hipster,” the term “techie” is often considered a slur, a pejorative that you lob at someone you want to depict as out of touch, rarefied and elite — not a fellow prole, in other words.

Yet there are differences, too: The techie often brings with him or her a certain worldview and language that attempts to describe the world in computational terms; the transformation of the word “hack” into an everyday verb attests to this. Some techies view their own bodies as merely machines that require food the way computers need electricity, a belief system exemplified by the popularity of powdered foods like Soylent. This happens in exercise, too — the rush to gamify health and wellness by tracking steps, calories and heartbeats turns the body into a spreadsheet.

How does a profession mutate into a culture? David Golumbia, an associate professor of digital studies at Virginia Commonwealth University and author of “The Cultural Logic of Computation,” suggests that some of the cultural beliefs common to those in the tech industry about the utopian promise of computers trickle down into what we may think of as tech culture at large. Golumbia describes the basic idea, “computationalism,” as “the philosophical idea that the brain is a computer” as well as “a broader worldview according to which people or society are seen as computers, or that we might be living inside of a simulation.”

“You frequently find people who avoid formal education for some reason or another and then educate themselves through reading a variety of online resources that talk about this, and they subscribe to it as quasi-religious truth, that everything is a computer,” Golumbia said. “It’s appealing to people who find the messiness of the social and human world unappealing and difficult to manage. There’s frustration . . . expressed when parts of the world don’t appear to be computational, by which I mean, when their actions can’t be represented by algorithms that can be clearly defined.”

“It’s very reductive,” Golumbia added.

Mapping the social world onto the algorithmic world seems to be where tech culture goes astray. “This is part of my deep worry about it — we are heading in a direction where people who really identify with the computer are those who have a lot of trouble dealing with other people directly. People who find the social world difficult to manage often see the computer as the solution to their problems,” Golumbia said.

But tech culture isn’t confined to screen time anymore. It’s become part of everyday life, argues Jan English-Lueck, a professor of anthropology at San Jose State University and a distinguished fellow at the Institute for the Future. English-Lueck wrote an ethnographic account of Silicon Valley culture, “Cultures@SiliconValley,” and studies the people and culture of the region.

“We start to see our civic life in a very technical way. My favorite example of that is people going to a picnic and looking at some food and asking if that’s ‘open source’ [available to all]. So people use those technological metaphors to think about everyday things in life,” she said.

English-Lueck says the rapid pace of the tech field trickles down into tech culture, too. “People are fascinated with speed and efficiency, they’re enthusiastic and optimistic about what technology can accomplish.”

Golumbia saw the aspects of tech culture firsthand: Prior to being a professor, he worked in information technology for a software company on Wall Street. His convictions about computationalism were borne out in his colleagues. “What I saw was that there were at least two kinds of employees — there was a programmer type, who was very rigid but able to do the tasks that you put in front of them, and there were the managerial types who were much more flexible in their thinking.”

“My intuition in talking to [the] programmer types [was that] they had this very black-and-white mindset, that everything was or should be a computer,” he said. “And the managers, who tended to have taken at least a few liberal arts classes in college, and were interested in history of thought, understood you can’t manage people the way you manage machines.”

Yet the former worldview — that everything is a computer — seems to have won out. “When I started, I thought it was this minor small subgroup of society” that believed that, he told Salon. “But nowadays I think many executives in Silicon Valley have some version of this belief.”

For evidence that the metaphor of the human body as a computer has gone mainstream, look no further than our gadgetry. Devices like the Fitbit and the Apple Watch monitor the wearer’s movement and activity constantly, producing data that users can obsess over or study. “There is a small group of people who become obsessed with quantification,” Golumbia told Salon. “Not just about exercise, but like, about intimate details of their life — how much time spent with one’s kids, how many orgasms you have — most people aren’t like that; they do counting for a while [and] then they get tired of counting. The counting part seems oppressive.”

But this counting obsession, a trickle-down ideology from tech culture, is no longer optional: In many gadgets, it is now imposed from above. My iPhone counts my steps whether I like it or not. And other industries and agencies love the idea that we should willingly be tracked and monitored constantly, including the NSA and social media companies who profit off knowing the intimate details of our lives and selling ads to us based on it. “Insurers are trying to get us to do this all the time as part of wellness programs,” Golumbia said. “It’s a booming top-down control thing that’s being sold to us as the opposite.”

Golumbia marvels at a recent ad for the Apple Watch that features the Beyoncé song “Freedom” blaring in the background. “How did we get to this world where freedom means having a device on you that measures what you do at all times?”

Keith A. Spencer is a cover editor at Salon.

Facebook and Twitter ‘harm young people’s mental health’

Poll of 14- to 24-year-olds shows Instagram, Facebook, Snapchat and Twitter increased feelings of inadequacy and anxiety


Four of the five most popular forms of social media harm young people’s mental health, with Instagram the most damaging, according to research by two health organisations.

Instagram has the most negative impact on young people’s mental wellbeing, a survey of almost 1,500 14- to 24-year-olds found, and the health groups accused it of deepening young people’s feelings of inadequacy and anxiety.

The survey, published on Friday, concluded that Snapchat, Facebook and Twitter are also harmful. Among the five only YouTube was judged to have a positive impact.

The four platforms have a negative effect because they can exacerbate children’s and young people’s body image worries, and worsen bullying, sleep problems and feelings of anxiety, depression and loneliness, the participants said.

The findings follow growing concern among politicians, health bodies, doctors, charities and parents about young people suffering harm as a result of sexting, cyberbullying and social media reinforcing feelings of self-loathing – and even being at risk of suicide.

“It’s interesting to see Instagram and Snapchat ranking as the worst for mental health and wellbeing. Both platforms are very image-focused and it appears that they may be driving feelings of inadequacy and anxiety in young people,” said Shirley Cramer, chief executive of the Royal Society for Public Health, which undertook the survey with the Young Health Movement.

She demanded tough measures “to make social media less of a wild west when it comes to young people’s mental health and wellbeing”. Social media firms should bring in a pop-up image to warn young people that they have been using it a lot, while Instagram and similar platforms should alert users when photographs of people have been digitally manipulated, Cramer said.

The 1,479 young people surveyed were asked to rate the impact of the five forms of social media on 14 different criteria of health and wellbeing, including their effect on sleep, anxiety, depression, loneliness, self-identity, bullying, body image and the fear of missing out.

Instagram emerged with the most negative score. It rated badly for seven of the 14 measures, particularly its impact on sleep, body image and fear of missing out – and also for bullying and feelings of anxiety, depression and loneliness. However, young people cited its upsides too, including self-expression, self-identity and emotional support.

YouTube scored very badly for its impact on sleep but positively in nine of the 14 categories, notably awareness and understanding of other people’s health experience, self-expression, loneliness, depression and emotional support.

However, the leader of the UK’s psychiatrists said the findings were too simplistic and unfairly blamed social media for the complex reasons why the mental health of so many young people is suffering.

Prof Sir Simon Wessely, president of the Royal College of Psychiatrists, said: “I am sure that social media plays a role in unhappiness, but it has as many benefits as it does negatives. We need to teach children how to cope with all aspects of social media – good and bad – to prepare them for an increasingly digitised world. There is real danger in blaming the medium for the message.”

Young Minds, the charity which Theresa May visited last week on a campaign stop, backed the call for Instagram and other platforms to take further steps to protect young users.

Tom Madders, its director of campaigns and communications, said: “Prompting young people about heavy usage and signposting to support they may need, on a platform that they identify with, could help many young people.”

However, he also urged caution in how content accessed by young people on social media is perceived. “It’s also important to recognise that simply ‘protecting’ young people from particular content types can never be the whole solution. We need to support young people so they understand the risks of how they behave online, and are empowered to make sense of and know how to respond to harmful content that slips through filters.”

Parents and mental health experts fear that platforms such as Instagram can make young users feel worried and inadequate by facilitating hostile comments about their appearance or reminding them that they have not been invited to, for example, a party many of their peers are attending.

May, who has made children’s mental health one of her priorities, highlighted social media’s damaging effects in her “shared society” speech in January, saying: “We know that the use of social media brings additional concerns and challenges. In 2014, just over one in 10 young people said that they had experienced cyberbullying by phone or over the internet.”

In February, Jeremy Hunt, the health secretary, warned social media and technology firms that they could face sanctions, including through legislation, unless they did more to tackle sexting, cyberbullying and the trolling of young users.

https://www.theguardian.com/society/2017/may/19/popular-social-media-sites-harm-young-peoples-mental-health

Confronting the Great Mass Addiction of Our Era

BOOKS
This examination of today’s tech-zombie epidemic is worth putting your phone down for – at least for a while.


Are you addicted to technology? I’m certainly not. In my first sitting reading Adam Alter’s Irresistible, an investigation into why we can’t stop scrolling and clicking and surfing online, I only paused to check my phone four times. Because someone might have emailed me. Or texted me. One time I stopped to download an app Alter mentioned (research) and the final time I had to check the shares on my play brokerage app, Best Brokers (let’s call this one “business”).

Half the developed world is addicted to something, and Alter, a professor at New York University, informs us that, increasingly, that something isn’t drugs or alcohol, but behaviour. Recent studies suggest the most compulsive behaviour we engage in has to do with cyber connectivity; 40% of us have some sort of internet-based addiction – whether it’s checking your email (on average workers check it 36 times an hour), mindlessly scrolling through other people’s breakfasts on Instagram or gambling online.

Facebook was fun three years ago, Alter warns. Now it’s addictive. This tech-zombie epidemic is not entirely our fault. Technology is designed to hook us and to keep us locked in a refresh/reload cycle so that we don’t miss any news, cat memes or status updates from our friends. Tristan Harris, a “design ethicist” (whatever that is), tells the author that it’s not a question of willpower when “there are a thousand people on the other side of the screen whose job it is to break down the self-regulation you have”. After all, Steve Jobs gave the world the iPad but made very sure his kids never got near one. Brain patterns of heroin users just after a hit and of World of Warcraft addicts starting up a new game are nearly identical. The tech innovators behind our favourite products and apps understood that they were offering us endless portals to addiction. We’re the only ones late to the party.

Addiction isn’t inherent or genetic in certain people, as was previously thought. Rather, it is largely a function of environment and circumstance. Everyone is vulnerable; we’re all just a product or substance away from an uncomfortable attachment of some kind. And the internet, Alter writes, with its unpredictable but continuous loop of positive feedback, simulation of connectivity and culture of comparison, is “ripe for abuse”.

For one thing, it’s impossible to avoid; a recovering alcoholic can re-enter the slipstream of his life with more ease than someone addicted to online gaming – the alcoholic can avoid bars while the gaming addict still has to use a computer at work, to stay in touch with family, to be included in his micro-society.

Secondly, it’s bottomless. Everything is possible in the ideology of the internet – need a car in the middle of the night? Here you go. Want to borrow a stranger’s dog to play with for an hour, with no long-term responsibility for the animal? Sure, there’s an app for that. Want to send someone a message and see when it reaches their phone, when they read it and whether they like it? Even BlackBerry could do that.

Thirdly, it’s immersive – and even worse, it’s mobile. You can carry your addiction around with you. Everywhere. You don’t need to be locked in an airless room or unemployed in order to spend hours online. Moment, an app designed to track how often you pick up and look at your phone, estimates that the average smartphone user spends two to three hours on his or her mobile daily.

I downloaded Moment (the research I mentioned earlier) and uninstalled it after it informed me that, by noon, I had already fiddled away an hour of my time on the phone.

Though the age of mobile tech has only just begun, Alter believes that signs point to a crisis. In 2000, Microsoft Canada found that our average attention span was 12 seconds long. By 2013, it was eight seconds long. Goldfish, by comparison, can go nine seconds. Our ability to empathise, a slow-burning skill that requires immediate feedback on how our actions affect others, suffers the more we disconnect from real-life interaction in favour of virtual interfacing. Recent studies found that this decline in compassion was more pronounced among young girls. One in three teenage girls say their peers are cruel online (only one in 11 boys agree).

Sure, communication technology has its positives. It’s efficient and cheap, it can teach creatively, raise money for worldwide philanthropic causes and disseminate news under and over the reach of censors. But it has a downside, too – the corrosive culture of online celebrity, fake news and trolling – and we can’t seem to get away from it.

There is a tinge of first world problems in Irresistible. World of Warcraft support groups; a product Alter writes about called Realism (a plastic frame resembling a screenless smartphone, which you can hold to temper your raging internet addiction, but can’t actually use); a spike in girl gaming addicts fuelled by Kim Kardashian’s Hollywood app – it’s difficult to see why these things should elicit much sympathy while one in 10 people worldwide still lack access to clean drinking water. This very western focus on desire and goal orientation is one that eastern thinkers might consider a wrong view of the world and its material attachments, but Alter’s pop-scientific approach still makes for an entertaining break away from one’s phone.

Irresistible is published by Bodley Head.