Street art in Mexico
by artist Ramed NC.
Photo by Ramed NC.
Rent and housing costs in most major cities have skyrocketed since the financial crisis, cutting deeply into workers’ standard of living and prompting concerns about a new global housing bubble. Driving the soaring cost of rent is a global financial system that is being pumped full of cheap credit by all the major central banks at the expense of workers around the world.
Prices in some areas boggle the mind. San Francisco’s average asking price for a one-bedroom apartment went from $1,258 per month in January 2010 to $4,126 in February 2016. In London, the average home price has doubled since 2009, from about £300,000 ($437,600 USD) to £600,000 ($875,100).
Hong Kong’s housing market, which largely avoided the US real estate crash, saw its average sale price more than triple between 2004 and today. The city is now considered the least affordable place in the world, with the median Hong Kong home price worth 19 times the city’s average skilled white-collar worker’s annual salary.
Housing and rental prices have climbed so high that the Swiss bank UBS estimates that the majority of the world’s urban real estate markets are now “significantly overvalued.”
What is most striking about the colossal increase in prices, however, is how divorced it is from the incomes of the vast majority of the global population, which are moving in the opposite direction.
Historically, rent prices have tended to move with income and inflation. For example, in the United States the median home price adjusted for inflation remained largely flat between 1970 and 1998, fluctuating slightly above and below $160,000. This was a period in which workers’ incomes were also flat. After 1998, however, the housing market skyrocketed, with the median home price rising from about $160,000 in 1998 to $275,000 in 2006, the peak of the finance-driven boom. This jump was driven by all manner of financial speculation, including rampant criminal behavior, which had been let loose by the lowering of interest rates by the US Federal Reserve.
The housing market today is going through a new version of the bubble that peaked in 2006. Unlike that earlier bubble, however, this process is global. Nearly every major capitalist government in the world is pursuing a policy of near-zero interest rates, encouraging rampant speculation in both the stock market and the real estate market.
This trend can be seen clearly in the United States. Between 2001 and 2014, the average real rental price rose 7 percent nationwide according to Harvard University’s Joint Center for Housing Studies. During that same period, median household income dropped by 9 percent.
In Los Angeles, the second largest city in the US, 40 percent of families either make poverty wages or are unemployed. As families and individuals increasingly struggle to make ends meet, rent has increased sharply in LA. In January 2010, the average one-bedroom apartment went for $1,224 a month. Six years later, the cost was $1,935. And the worst is not over. A 2016 forecast by USC Casden Multifamily predicts that in the next few years rent will “soar.” It is no wonder that homelessness in the city grew by 16 percent in just two years between 2013 and 2015.
Another way of capturing the growing divide between wages and rent for hundreds of millions of workers around the world is the Median Multiple, the ratio of the median home price to the median household income. According to the Demographia International Housing Affordability 2016 Survey, a Median Multiple of three and under is considered affordable (e.g., a family making $50,000 a year buying a house at $150,000 or less). A multiple exceeding five is considered “severely unaffordable.”
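The arithmetic behind the Median Multiple is simple enough to sketch in a few lines. This is only an illustration (the function names are my own, and Demographia’s intermediate bands between “affordable” and “severely unaffordable” are elided, since the text names only the two endpoints):

```python
def median_multiple(median_home_price, median_household_income):
    """Ratio of median home price to median household income."""
    return median_home_price / median_household_income

def affordability(multiple):
    """Rough Demographia bands: 3.0 and under is affordable;
    above 5.0 is 'severely unaffordable' (intermediate bands omitted)."""
    if multiple <= 3.0:
        return "affordable"
    elif multiple <= 5.0:
        return "unaffordable"
    else:
        return "severely unaffordable"

# The worked example from the survey: a $50,000 income, a $150,000 house.
m = median_multiple(150_000, 50_000)
print(m, affordability(m))  # 3.0 affordable

# Hong Kong's figure from the same survey.
print(affordability(19.0))  # severely unaffordable
```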
In 2015 Demographia surveyed 367 cities inside the UK, US, Canada, Australia, New Zealand, Ireland and Japan. According to the group, the 10 most unaffordable cities were: Hong Kong with a Median Multiple of 19.0; Sydney (12.2); Vancouver (10.8); Melbourne (9.7); Auckland (9.7); San Jose (9.7); San Francisco (9.4); London (8.5); Los Angeles (8.1) and San Diego (8.1). All of these cities have experienced a doubling or even tripling of their Median Multiple since 1998.
The surge in prices and collapse in income have pushed more people into the rental market, since buying has moved out of reach. In the United States, between 2005 and 2015, there were 9 million new renter households, the largest gain on record for a 10-year period according to Harvard University’s Joint Center for Housing Studies. In 2015, 37 percent of all US households rented, the highest level since the mid-1960s, and up from 31 percent in 2005.
Workers are now becoming trapped in this situation, as they spend more of their income paying for rent and are less and less likely to be able to buy a house. In 2001 in the US, 41 percent of renters spent 30 percent of their income or more on rent. This rose to 49 percent in 2014. In the same year, 26 percent of the renting population spent more than half of their income on rent. In the UK, a fifth of all young adults now stay in their parents’ home until they are at least 26. In 2015, 31.5 percent of US young people aged 18 to 34 lived at home, up from 27 percent in 2005.
While workers suffer under crushing rent burdens, landlords and investors are raking in millions if not billions. This year, a total of 184 billionaires made their wealth through real estate. This was up by 22 individuals from the year before, even as the overall number of billionaires went down from 1,826 to 1,810 individuals.
Those who make money off of rents do not add anything to the productive system. While a certain amount of money can go to maintenance and upkeep, vast and increasing sums of money made by real estate are from the pure monopoly status of owning land.
The wealth of these billionaires principally comes from the unsavory fact that, in order to keep the global economy afloat, central banks around the world have pumped the major banks full of cheap credit.
As UBS notes in its 2015 Global Real Estate Bubble Index, “Loose monetary policy has prevented a normalization of housing markets and encouraged local bubble risks to grow.” The bank writes that much of the “overvaluation” in the global housing stock comes from a “dependence on low interest rates.”
“Price-to-rent (PR) multiples are greatest in Zurich, Vancouver, Hong Kong, Geneva and Singapore. The extremely high PR multiples indicate an undue dependence of housing prices on low interest rates. Paris, London and Sydney follow suit and form a trio of cities with PR multiples around 30. House prices in these cities are vulnerable to a sharp correction should interest rates rise.”
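The price-to-rent multiple UBS cites is simply a home’s purchase price divided by a year’s rent for it, i.e., how many years of rent would equal the price of buying. A minimal sketch (the figures below are hypothetical, chosen only to show what a multiple “around 30” means, not taken from the report):

```python
def price_to_rent(home_price, monthly_rent):
    """Home price divided by annual rent: the number of years of
    rent payments it would take to equal the purchase price."""
    return home_price / (12 * monthly_rent)

# Hypothetical example: a $900,000 flat renting at $2,500 a month.
print(round(price_to_rent(900_000, 2_500), 1))  # 30.0
```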
In other words, the deluge of cheap credit provided by the world’s central banks to the major commercial banks has unleashed an orgy of speculation. The world’s richest are getting even richer by doing nothing as their real estate investments shoot through the roof. Meanwhile the vast majority of the world’s population must pay increasingly obscene amounts just to have a place to live.
As Lenin noted in his work Imperialism, in capitalism’s state of decay there is an “extraordinary growth of a class, or rather, of a stratum of rentiers, i.e., people who live by ‘clipping coupons’, who take no part in any enterprise whatever, whose profession is idleness.”
This describes exactly the parasitic layer of real estate moguls, whose money comes not from producing anything of value to the world economy, but by sucking away money from the system in the form of rent. There is no one who benefits from high rents except the small layer of people who control the vast majority of the world’s property.
What does it look like when an ideology dies? As with most things, fiction can be the best guide. In Red Plenty, his magnificent novel-cum-history of the Soviet Union, Francis Spufford charts how the communist dream of building a better, fairer society fell apart.
Even while they censored their citizens’ very thoughts, the communists dreamed big. Spufford’s hero is Leonid Kantorovich, the only Soviet ever to win a Nobel prize for economics. Rattling along on the Moscow metro, he fantasises about what plenty will bring to his impoverished fellow commuters: “The women’s clothes all turning to quilted silk, the military uniforms melting into tailored grey and silver: and faces, faces the length of the car, relaxing, losing the worry lines and the hungry looks and all the assorted toothmarks of necessity.”
But reality makes swift work of such sandcastles. The numbers are increasingly disobedient. The beautiful plans can only be realised through cheating, and the draughtsmen know it better than any dissidents. This is one of Spufford’s crucial insights: that long before any public protests, the insiders led the way in murmuring their disquiet. Whisper by whisper, memo by memo, the regime is steadily undermined from within. Its final toppling lies decades beyond the novel’s close, yet can already be spotted.
When Red Plenty was published in 2010, it was clear the ideology underpinning contemporary capitalism was failing, but not that it was dying. Yet a similar process as that described in the novel appears to be happening now, in our crisis-hit capitalism. And it is the very technocrats in charge of the system who are slowly, reluctantly admitting that it is bust.
You hear it when the Bank of England’s Mark Carney sounds the alarm about “a low-growth, low-inflation, low-interest-rate equilibrium”. Or when the Bank for International Settlements, the central banks’ central bank, warns that “the global economy seems unable to return to sustainable and balanced growth”. And you saw it most clearly last Thursday from the IMF.
What makes the fund’s intervention so remarkable is not what is being said – but who is saying it and just how bluntly. In the IMF’s flagship publication, three of its top economists have written an essay titled “Neoliberalism: Oversold?”.
The very headline delivers a jolt. For so long mainstream economists and policymakers have denied the very existence of such a thing as neoliberalism, dismissing it as an insult invented by gap-toothed malcontents who understand neither economics nor capitalism. Now here comes the IMF, describing how a “neoliberal agenda” has spread across the globe in the past 30 years. What they mean is that more and more states have remade their social and political institutions into pale copies of the market. Two British examples, suggests Will Davies – author of The Limits of Neoliberalism – would be the NHS and universities “where classrooms are being transformed into supermarkets”. In this way, the public sector is replaced by private companies, and democracy is supplanted by mere competition.
The results, the IMF researchers concede, have been terrible. Neoliberalism hasn’t delivered economic growth – it has only made a few people a lot better off. It causes epic crashes that leave behind human wreckage and cost billions to clean up, a finding with which most residents of food bank Britain would agree. And while George Osborne might justify austerity as “fixing the roof while the sun is shining”, the fund team defines it as “curbing the size of the state … another aspect of the neoliberal agenda”. And, they say, its costs “could be large – much larger than the benefit”.
Two things need to be borne in mind here. First, this study comes from the IMF’s research division – not from those staffers who fly into bankrupt countries, haggle over loan terms with cash-strapped governments and administer the fiscal waterboarding. Since 2008, a big gap has opened up between what the IMF thinks and what it does. Second, while the researchers go much further than fund watchers might have believed, they leave in some all-important get-out clauses. The authors even defend privatisation as leading to “more efficient provision of services” and less government spending – to which the only response must be to offer them a train ride across to Hinkley Point C.
Even so, this is a remarkable breach of the neoliberal consensus by the IMF. Inequality and the uselessness of much modern finance: such topics have become regular chew toys for economists and politicians, who prefer to treat them as aberrations from the norm. At last a major institution is going after not only the symptoms but the cause – and it is naming that cause as political. No wonder the study’s lead author says that this research wouldn’t even have been published by the fund five years ago.
From the 1980s the policymaking elite has waved away the notion that they were acting ideologically – merely doing “what works”. But you can only get away with that claim if what you’re doing is actually working. Since the crash, central bankers, politicians and TV correspondents have tried to reassure the public that this wheeze or those billions would do the trick and put the economy right again. They have riffled through every page in the textbook and beyond – bank bailouts, spending cuts, wage freezes, pumping billions into financial markets – and still growth remains anaemic.
And the longer the slump goes on, the more the public tumbles to the fact that not only has growth been feebler, but ordinary workers have enjoyed much less of its benefits. Last year the rich countries’ thinktank, the OECD, made a remarkable concession. It acknowledged that the share of UK economic growth enjoyed by workers is now at its lowest since the second world war. Even more remarkably, it said the same or worse applied to workers across the capitalist west.
Red Plenty ends with Nikita Khrushchev pacing outside his dacha, to where he has been forcibly retired. “Paradise,” he exclaims, “is a place where people want to end up, not a place they run from. What kind of socialism is that? What kind of shit is that, when you have to keep people in chains? What kind of social order? What kind of paradise?”
Economists don’t talk like novelists, more’s the pity, but what you’re witnessing amid all the graphs and technical language is the start of the long death of an ideology.
Street art in Argentina (RUTA 11 General Lavalle)
by artist Guri.
Photo by Guri.
Last week’s G7 summit in Japan was dominated by two interconnected issues: the deepening crisis of global capitalism and the drive to war, in particular the growing danger of a clash between China and the United States in the South China Sea. The inability of the major powers to offer the slightest resolution of the economic breakdown is fuelling national antagonisms and the slide toward conflict.
The US and Japan pressed hard at the G7 gathering for a strong communiqué critical of China that would justify the ramping up of provocative American military incursions within the 12-nautical-mile territorial limit around Chinese-claimed islets. Earlier this month, the US navy conducted a third so-called “freedom of navigation” operation near Fiery Cross Reef in the South China Sea, producing an angry reaction from Beijing and declarations that it would beef up its defences in the area.
In the campaigns currently underway for the US presidency and the Australian federal election, a conspiracy of silence reigns over the preparations for war, aimed at deadening the consciousness of the population to the rising danger of nuclear conflict. Two nuclear-armed powers are facing off not only in the South China Sea, but other dangerous flashpoints such as North Korea and Taiwan, each of which has been greatly exacerbated by Washington’s “pivot to Asia” and aggressive military build-up throughout the region.
An arms race is underway that finds its most acute expression in the arena of nuclear weaponry, delivery systems and associated technologies. Determined to maintain its supremacy in Asia and globally, the US is planning to spend $1 trillion over the next three decades to develop a broader range of sophisticated nuclear weapons and means for delivering them to their targets. The unstated aim of the Pentagon is to secure nuclear primacy—that is, the means for obliterating China’s nuclear arsenal and thus its ability to mount a counterattack. The Chinese response, which is just as reactionary, is to ensure it retains the ability to strike back in a manner that would kill tens of millions in the United States.
The reality of these dangers was underscored last week by the release of a report by the US-based Union of Concerned Scientists (UCS). It chillingly warned:
“Twenty-four hours a day, 365 days a year, the governments of the United States and the People’s Republic of China are a few poor decisions away from starting a war that could escalate rapidly and end in a nuclear exchange. Mismatched perceptions increase both the possibility of war and the likelihood it will result in the use of nuclear weapons. Miscommunication or misunderstanding could spark a conflict that both governments may find difficult to stop.”
While appealing for the two sides to acknowledge the risks and heighten diplomatic efforts to prevent conflict, the UCS analysis offered not the slightest hope that such steps would be taken. The report bleakly declared:
“Lack of mutual trust and a growing sense that their differences may be irreconcilable incline both governments to continue looking for military solutions—for new means of coercion that help them feel more secure. Establishing the trust needed to have confidence in diplomatic resolutions to the disagreements, animosities, and suspicions that have troubled leaders of the United States and the PRC [China] for almost 70 years is extremely difficult when both governments see every effort to up the technological ante as an act of bad faith.”
The intensifying military competition is an unequal one, which only heightens tensions and the danger of war. In the field of nuclear armaments, China is outgunned and outnumbered. While desperately seeking to catch up, the Chinese military is generations behind in the capability of its weaponry and fields an estimated 260 warheads, compared to about 7,000 for the US. Its prime objective is to ensure a credible nuclear deterrent would survive a US first strike. Unlike Beijing, Washington has never ruled out the first use of nuclear weapons.
The Guardian reported last week that China is poised to send submarines armed with nuclear weapons on patrol in the Pacific for the first time. Such a move signals a break with the current policy, under which warheads and missiles were stored separately under the strict control of the top leadership. Armed missiles will now be loaded onto nuclear submarines to enable their immediate launch against continental America in the event of war.
The Chinese leadership has been driven to such measures by the US military build-up in North East Asia, especially the deployment of anti-ballistic missile systems aimed at neutralising China’s ability to strike back. China’s nuclear submarines, however, are comparatively noisy, making them vulnerable to detection and destruction by US attack subs. A new scenario is unfolding in which a jittery Chinese commander could misunderstand an order and, fearing imminent attack, unleash the submarine’s missiles against pre-determined targets.
Nuclear war will not be averted through the diplomacy of major powers, worthless posturing about international nuclear disarmament or the vain hope that nuclear war is so terrible as to be unthinkable. Nuclear strategists have been “thinking the unthinkable” for more than half a century. The last world war ended with the atom bombing of Hiroshima and Nagasaki, killing some 200,000 people. President Barack Obama’s refusal last week in Hiroshima to offer an apology for those monstrous crimes of US imperialism is a sure sign that new ones are being prepared.
The relentless drive toward a new world war between nuclear-armed combatants stems from the crisis of capitalism and its irresolvable contradictions. Only the working class can end the danger of war by putting an end to the profit system and its outmoded nation state system.
Call them soldiers, call them monks, call them machines: so they were but happy ones, I should not care.
—Jeremy Bentham, 1787
Housed in a triumph of architectural transparency in Cambridge, Massachusetts, is the Media Lab complex at MIT, a global hub of human-machine research. From the outside of its newest construction, you can see clear through the building. Inside are open workspaces, glittering glass walls, and screens, all encouragement for researchers to peek in on one another. Everybody always gets to observe everybody else.
Here, computational social scientist Alex Pentland, known in the tech world as the godfather of wearables, directs a team that has created technology applied in Google Glass, smart watches, and other electronic or computerized devices you can wear or strap to your person. In Pentland’s quest to reshape society by tracking human behavior with software algorithms, he has discovered you don’t need to look through a glass window to find out what a person is up to. A wearable device can trace subliminal signals in a person’s tone of voice, body language, and interactions. From a distance, you can monitor not only movements and habits; you can begin to surmise thoughts and motivations.
In the mid-2000s Pentland invented the sociometric badge, which looks like an ID card and tracks and analyzes the wearer’s interactions, behavior patterns, and productivity. It became immediately clear that the technology would appeal to those interested in a more hierarchical kind of oversight than that enjoyed by the gurus of MIT’s high-tech playgrounds. In 2010 Pentland cofounded Humanyze, a company that offers employers the chance to find out how employee behavior affects their business. It works like this: A badge hanging from your neck embedded with microphones, accelerometers, infrared sensors, and a Bluetooth connection collects data every sixteen milliseconds, tracking such matters as how far you lean back in your chair, how often you participate in meetings, and what kind of conversationalist you are. Each day, four gigabytes’ worth of information about your office behavior is compiled and analyzed by Humanyze. This data, which then is delivered to your supervisor, reveals patterns that supposedly correlate with employee productivity.
Humanyze CEO Ben Waber, a former student of Pentland’s, has claimed to take his cues from the world of sports, where “smart clothes” are used to measure the mechanics of a pitcher’s throw or the launch of a skater’s leap. He is determined to usher in a new era of “Moneyball for business,” a nod to baseball executive Billy Beane, whose data-driven approach gave his team, the Oakland Athletics, a competitive edge. With fine-grained biological data points, Waber promises to show how top office performers behave—what happy, productive workers do.
Bank of America hired Humanyze to use sociometric badges to study activity at the bank’s call centers, which employ more than ten thousand souls in the United States alone. By scrutinizing how workers communicated with one another during breaks, analysts came to the conclusion that allowing people to break together, rather than in shifts, reduced stress. This was indicated by voice patterns picked up by the badge, processed by the technology, and reported on an analyst’s screen. Employees grew happier. Turnover decreased.
The executives at Humanyze emphasize that minute behavior monitoring keeps people content. So far, the company has focused on loaning the badges to clients for limited study periods, but as Humanyze scales up, corporate customers may soon be able to use their own in-house analysts and deploy the badges around the clock.
The optimists’ claim: technologies that monitor every possible dimension of biological activity can create faster, safer, and more efficient workplaces, full of employees whose behavior can be altered in accordance with company goals.
Widespread implementation is already underway. Tesco employees stock shelves with greater speed when they wear armbands that register their rate of activity. Military squad leaders are able to drill soldiers toward peak performance with the use of skin patches that measure vital signs. On Wall Street, experiments are ongoing to monitor the hormones of stock traders, the better to encourage profitable trades. According to cloud-computing company Rackspace, which conducted a survey in 2013 of four thousand people in the United States and United Kingdom, 6 percent of businesses provide wearable devices for workers. A third of the respondents expressed readiness to wear such devices, which are most commonly wrist- or head-mounted, if requested to do so.
Biological scrutiny is destined to expand far beyond on-the-job performance. Workers of the future may look forward to pre-employment genetic testing, allowing a business to sort potential employees based on disposition toward anything from post-traumatic stress disorder to altitude sickness. Wellness programs will give employers reams of information on exercise habits, tobacco use, cholesterol levels, blood pressure, and body mass index. Even the monitoring of brain signals may become an office commonplace: at IBM, researchers bankrolled by the military are working on functional magnetic-resonance imaging, or fMRI, a technology that can render certain brain activities into composite images, turning thoughts into fuzzy external pictures. Such technology is already being used in business to divine customer preferences and detect lies. In 2006 a San Diego start-up called No Lie MRI expressed plans to begin marketing the brain-scanning technology to employers, highlighting its usefulness for employee screening. And in Japan, researchers at ATR Computational Neuroscience Laboratories have a dream-reading device in the pipeline that they claim can predict what a person visualizes during sleep. Ryan Hurd, who serves on the board of the International Association for the Study of Dreams, says such conditioning could be used to enhance performance. While unconscious, athletes could continue to practice; creative types could boost their imaginations.
The masterminds at Humanyze have grasped a fundamental truth about surveillance: a person watched is a person transformed. The man who invented the panopticon—a circular building with a central inspection tower that has a view of everything around it—gleaned this, too. But contrary to most discussions of the “all-seeing place,” the idea was conceived not for the prison, but for the factory.
Jeremy Bentham is usually credited with the idea of the panopticon, but it was his younger brother, Samuel Bentham, who saw the promise of panoptical observation in the 1780s while in the service of Grigory Potemkin, a Russian officer and statesman. Potemkin, mostly remembered for creating fake villages to fool his lover, Catherine the Great, was in a quandary: his factories, which churned out everything from brandy to sailcloth, were a hot managerial mess. He turned to Samuel, a naval engineer whose inventions for Potemkin also included the Imperial Vermicular, a wormlike, segmented 250-foot barge that could navigate sinuous rivers. Samuel summoned skilled craftsmen from Britain and set them to the hopeless task of overseeing a refractory mass of unskilled peasant laborers who cursed and fought in a babel of languages. Determined to win Potemkin’s favor, he hit on a plan for a workshop at a factory in Krichev that would allow a person, or persons, to view the entire operation from a central inspector’s lodge “in the twinkling of an eye,” as his brother Jeremy would later write in a letter. The inspector could at once evoke the omnipresence of God and the traditional Russian noble surrounded by his peasants. Laborers who felt themselves to be under the constant eye of the inspector would give up their drunken brawls and wife-swapping in favor of work.
War thwarted Samuel’s plans for the Krichev factory, eventually forcing him to return home to Britain, where, in 1797, he drew up a second panoptical scheme, a workhouse for paupers. Six years earlier, in 1791, Jeremy had borrowed Samuel’s idea to publish a work on the panoptical prison, built so that guards could see all of the inmates while the latter could only presume they were being watched, fostering “the sentiment of a sort of omnipresence” and “a new mode of obtaining power of mind over mind.” In America, the Society for Alleviating the Miseries of Public Prisons adopted panoptical elements for the Eastern State Penitentiary in Philadelphia, adding solitary confinement with the idea of delivering the maximum opportunity for prisoner repentance and rehabilitation. Visiting the prison in 1842, Charles Dickens noted that its chief effect on inmates was to drive them insane.
Before the days of industrialization, employers had little use for surveillance schemes. The master craftsman lived in his workshop, and his five to ten apprentices, journeymen, and hirelings occupied the same building or adjacent cottages, taking their behavioral cues from his patriarchal authority. The blacksmith or master builder or shoemaker interacted with his underlings in a sociable and informal atmosphere, taking meals with them, playing cards, even tippling rum and cider. Large-scale manufacturing swept this all away. Workmen left the homes of their employers; by the early decades of the nineteenth century, the family-centered workplace—where employers provided models of behavior, food, and lodging—was becoming a thing of memory.
Proto-industrialists discovered that their new employees, an ever-shifting mass of migrants and dislocated farm boys, found ample opportunities for on-the-job drunkenness, inattention, and fractious behavior. In his classic work A Shopkeeper’s Millennium, historian Paul E. Johnson observes that in America an answer to this problem was found in the Protestant temperance movement just then blowing righteous winds across the Northeast. Managers found that the revival and the Sunday school could foster strict internal values that made constant supervision less important. Workers, if properly evangelized, would turn willingly from the bottle to the grueling business of tending power-driven machines. God would do the monitoring as He does it best—from the inside.
Unfortunately, God’s providential eye tended to blink in the absence of regular churchgoing. So in the 1880s and 1890s, mechanical engineer Frederick Winslow Taylor displaced God with scientific management systems, devising precise methods of judging and measuring workers to ensure uniformity of behavior and enhanced efficiency. Taylor’s zeal to scrutinize every aspect of work in the factory led to such inventions as a keystroke monitor that could measure the speed of a typist’s fingers. His methods of identifying underperforming cogs in the industrial machine became so popular that Joseph Wharton, owner of Bethlehem Steel, incorporated Taylor’s theories into the bachelor’s degree program in business he had founded at the University of Pennsylvania. Harvard University soon created a new master’s degree in business administration, the MBA, that focused on studying Taylorism.
Workplace surveillance didn’t evolve much beyond Taylor’s ideas until closed-circuit television brought prying to heights unimagined by the brothers Bentham. In 1990 the Olivetti Research Laboratory, in partnership with the University of Cambridge Computer Laboratory, announced an exciting new workplace-spying project aptly named Pandora. The Pandora’s Box processor handled video streams and controlled real-time data paths that allowed supervisors to peek in on remote workstations. An improved system launched in 1995 was named Medusa, after the Greek monster who turned victims to stone with her gaze.
By the early twenty-first century, electronic monitoring in the workplace had become the de facto norm, with bosses peering into emails, computer files, and browser histories. From the lowest-rung laborers to the top of the ivory tower, no employee was safe. In 2013 Harvard University was found to have snooped in the email accounts of sixteen faculty deans for the source of a media leak during a cheating scandal. Global positioning systems using satellite technology, which came to maturity by 1994 and grew popular for tracking delivery trucks, opened new methods of watching. In 2013, Dennis Gray, owner of Accurid Pest Solutions, decided to test a hunch that workers were straying from their tasks. He quietly installed GPS tracking software on the company-issued smartphones of five of his drivers; one indeed was found to be meeting up with a woman during work hours. In 2015 Myrna Arias, a sales executive for money-transfer service Intermex, objected to her employer monitoring her personal time and turned off the GPS device that tracked her around the clock. She was fired.
Surveillance technology stirs up profound questions as to who may observe whom, under what conditions, for how long, and for what purpose. The argument for monitoring the vital signs of an airline pilot, whose job routinely holds lives at stake, may seem compelling, but less so for a part-time grocery store clerk. In a 1986 executive order President Ronald Reagan, expressing concern about the “serious adverse effects” of drug use on the workforce, which resulted in “billions of dollars of lost productivity each year,” instituted mandatory drug testing for all safety-sensitive executive-level and civil-service federal employees. A noble mission, perhaps, but prone to expand like kudzu: by 2006 it entangled up to three out of four jobseekers, from would-be Walmart greeters to washroom attendants, who were forced to submit to such degradations as peeing in a plastic jar, sometimes under the watchful eye of a lab employee. Thirty-nine percent could expect random tests after they were hired, as well as dismissal for using substances on or off the job, regardless of whether their use impaired performance. Applicants accordingly changed their behavior; one scheme involved ordering dog urine through the mail to fool the bladder inspectors.
At the 2014 Conference on Human Factors in Computing Systems, held in Toronto, participants noticed an unusual sign affixed to restroom doors: “Behavior at these toilets is being recorded for analysis.” It had been placed there by Quantified Toilets (slogan: “every day. every time.”), whose mission, posted on its website, states: “We analyze the biological waste process of buildings to make better spaces and happier people.” At the conference, Quantified Toilets provided a real-time feed of analytical results. These piss prophets of the new millennium could tell if participants were pregnant, whether they had a sexually transmitted disease, or when they had drugs or alcohol in their system. (One man had shown up with a blood-alcohol level of 0.0072 percent and a case of gonorrhea.)
Quantified Toilets, it turned out, was not a real company, but a thought experiment for the conference, designed to provoke discussion about issues of privacy in a world where every facial expression, utterance, heartbeat, and trip to the bathroom can be captured to generate a biometric profile. Workplace surveillance, after all, is a regulatory Wild West; employees have few rights to privacy on the job. A court order may be necessary for police to track a criminal suspect, but no such niceties prevent an employer from exploring the boundaries of new technologies. History suggests that abuses will be irresistible: in 1988 the Washington, DC, police department admitted using urine tests to screen female employees for pregnancy without their knowledge.
In the film Gattaca, set in the not-too-distant future, biometric surveillance is deployed to distinguish between genetically engineered superior humans and genetically natural inferior humans, who are forced to do menial jobs. We are quickly approaching such a world: employers who are able to identify—and create—workers with superior biological profiles are already turning the science fiction into reality.
Humanyze assures in its corporate materials that privacy is a top priority. The names of employees are stored separately from behavioral information, and individual conversations aren’t recorded, just the metadata—a distinction familiar to those following the story of the widespread phone-tapping program of the NSA. Still, it requires little imagination to see how employers could put such data to more extensive and rigorous surveillance of individual workers. A benign boss in the present may use the data to decide the arrangements of break rooms and cubicles to enhance worker satisfaction and, in so doing, improve productivity. But in the future the same data may be retrieved and analyzed for unimagined possibilities. Observation is versatile in its application. In the face of capitalist demands for high performance and efficiency, abstract ideas like privacy and freedom can come to sound quaint and sentimental.
As optic and electronic watching give way to biosurveillance, the architecture of the Bentham brothers’ panopticon melts away and becomes internalized. The self-watching employee, under her own unwavering gaze, pre-adjusts her behavior according to a boss’s desires. Biosurveillance is sold as a tool for boosting happiness, but it also promotes a particular idea of what happiness is—one that probably looks a lot more like workers who don’t make trouble than like squeaky wheels, or even like the champions of disruption touted in Silicon Valley. The power to make you happy is also the power to define your happiness.
With his mantra “the medium is the message,” Marshall McLuhan stressed that the changes wrought upon us by technology may be more significant than the information revealed by it. Devices that monitor our minds and movements become part of who we are. Back in the Cold War, the Western press routinely derided Communist-bloc news clips of happy workers toiling away, singing songs in the mills and fields. One anti-communist propaganda animation from 1949, Meet King Joe, depicts a Chinese peasant smiling only because he is unaware of the paltriness and restrictions of his conditions. Such promos, perhaps, were just ahead of their time. Modern capitalism is poised to do them one better.
Despite its name, a company like Humanyze—which brings forth the next frontier of biometric, device-driven surveillance—can make us less ourselves, more like who we’re supposed to be according to the objectives of those who track our metrics. When we can feel, even on a cellular level, the gaze of the inspector, the invisible hand becomes the invisible eye, guiding as it does best, from within. Perhaps we will find true what we once feared: that contented workers are all alike. But so long as we are happy, who cares?