Resisting the “sharing” economy

Under the guise of “innovation,” capitalism creeps into our personal relationships, networks and community.

A friend asks you for a ride to the airport. He’s helped you a lot in the past, and you don’t think twice about saying yes.

When the day comes, you pick him up in your car and drive together, alternating between chatting and singing along, badly, to the radio. You drop him off at the gate, give him a hug and wish him well on his trip. He offers to pay for gas, but you shake your head and say he can cook you dinner when he gets back instead. He smiles and takes his bag into the terminal. You wave and get back into your car.

You come to that dinner a few months later. The smell of food fills his apartment. As you wait for the dish to finish in the oven, he talks about his trip: all the places he went and the people he met. He says that a friend of someone he met there has been backpacking in the area and will be staying on his couch for a week or two. It’s the least he can do, he says, after they treated him so well when he was there. A timer goes off and your friend goes to the oven to remove dinner. About an hour later, you’re both stuffed and, looking at what’s left, realize that he probably made way too much food. A conversation about food waste bubbles up and soon your friend gets an idea.

Your friend knocks on his neighbor’s door while you hold the tin of way-too-many leftovers. The neighbor opens up, and your friend explains that he made more food than he could ever eat before it spoils, so he was wondering if she wanted some. She smiles and fetches a Tupperware container, which your friend fills, then asks the two of you in for some wine, which you both eagerly accept. It’s tart and strong and refreshing. You stay for about 15 minutes and talk about cooking. After leaving, you and your friend repeat this with more of his neighbors until the leftovers are all gone, though you’re not exactly empty-handed: you have a small pie from one neighbor, a loaned book from another, two bottles of beer from a third, and a bunch of fresh basil from the fourth, all given without any prompting or expectations, and accepted not as payment or exchange but as an expression of goodwill mirroring that which your friend had shown them.

What you witnessed that night is technically called “community,” but it’s something so fundamental to the human experience and so foundational to human well-being that even those without the word would recognize it for what it is: social relations for the sake of social relations, their benefits coming not from some market mechanism but from simple human connection, the very thing that allowed humans to survive without the teeth and claws that other creatures enjoyed. It sustained us long before the capitalist economic system was even conceived of.

Because of this, it doesn’t follow the logic of the market, the ruthlessness and greed that give meaning, and horror, to the capitalist system. It follows, instead, the logic of solidarity and friendship – it cannot be turned into a stock, it cannot be sold in stores, and it cannot be hawked on an infomercial. Indeed, that is the point. And it is because of this that the capitalist system finds it so threatening and works so hard to dismantle it.

While capitalism has always produced alienation, the rise of the so-called “sharing” economy, facilitated through smartphone apps and fueled by mountains of venture capital, is the apotheosis of the system’s war against the non-economic sphere. You can share cars, apartments, even meals with the touch of a button. It promises to take power away from the large corporations and put it into the hands of the individual, turning a top-down command economy into a peer-to-peer networked one. In reality, however, it is nothing more than capitalism rebranding itself. Having studied complaints about it with all the seriousness of a market researcher, it has launched the same old product in a bright, shiny new package, the New Coke of economic systems. Don’t believe it. The end goal is the same as it always was: profit.

The rhetoric surrounding these “services” is nothing more than a cover for capitalism’s direct colonization of our social interactions, our personal relationships becoming one more means of production for some far-off executive congratulating himself on a job well done. No longer content with monopolizing our physical world, capitalism has now turned to our social relations as well, seeking to reduce something fundamental to who we are into a line item on a balance sheet.

Under this system, getting a ride to the airport, staying at someone’s house when traveling, cooking meals and sharing leftovers are actions undertaken not in the name of friendship and camaraderie but as impersonal economic transactions. The “sharing” economy is nothing of the sort – it is a way for companies to get people to do their work without having to deal with things like wages or benefits. It’s a way to build a hotel empire without building any actual hotels; it’s how you make money selling food without making, or even buying, any yourself; it’s a fleet of taxis without fuel costs, liability insurance or licensing (not to mention ornery unions). At best, it should be called a renting economy. The participants take on all the work and all the risk. All the companies do is provide the connections, something that can easily be done for free, and has been for centuries. And yet, for some reason, the people who create these services are praised as innovators. It is a parasitic relationship that masquerades as symbiosis.

The tragedy of all this is that it has turned an idea with revolutionary potential into one more manifestation of the dominant economic paradigm, a top-down structure where anything outside the bottom line is, at best, a secondary concern best dealt with after the quarterly earnings report comes out, so as not to spook the investors. It’s as if someone invented the steam engine and the only thing people used it for was to get wrinkles out of shirts, for a hefty price. We shouldn’t really be surprised, though. This is what capitalism does: it expands and absorbs anything it touches. It has to grow, or it will die. It constantly needs new things to monetize, to commercialize, to turn into products it can feed its captive global market, and when it begins running out of other things to make money from, why not turn to our social relations? At this rate, nothing and no one will be free of its influence or rise above the status of a commodity.

There is still a chance to preserve this last bulwark against the hungry market, however. While the “sharing” economy is growing, it has yet to surpass the size of the real sharing economy: the old connections we share and the new ones we make every day. We must discard parasitism disguised as sharing and promote mutual aid and solidarity – networks of people that can sustain themselves and each other outside the ruthless logic of market relations. We must share food, not because we can make some money, but because we care about each other. We must share rooms, not because we aspire to become mini-entrepreneurs, but because we value our connections. We must open up to new relationships, not because they present more opportunities for monetization, but because we want to reverse the alienation and isolation foisted on us by a cruel and uncaring economic system. We must not allow the last refuge from rapacious market relations to fall to capitalism, turning even our most intimate relationships into something with a calculable dollars-and-cents value that can be bought and sold like a used car.

This battle presents unique opportunities for resistance, because it is one that is largely decoupled from the physical world. They are fighting us on the ground of our personal relationships and it is here that we, not they, have the home field advantage. We can fight and we can win, as long as we have our friends.

— Chris C_g is the founder of the blog We Are the 99 Percent.

https://www.adbusters.org/magazine/120/resisting-so-called-sharing-economy.html

HAPPY 4TH OF JULY!!

[Photo: Brentwood 4th of July parade]

For me the 4th is always associated with happy memories of the holiday in my hometown of Brentwood, PA.  As my disenchantment with San Francisco’s technotopia grows, I find myself reaching back to the community of small-town America and, on this special day, to its iconic Independence Day celebrations.  Those celebrations always began with the community parade, and the core of that event was the appearance of the volunteer fire trucks.
The volunteer fire department is emblematic of the difference between small-town America and the big cities.  Kurt Vonnegut, himself a volunteer fireman, called volunteer firefighters “… the only examples of enthusiastic unselfishness to be seen in this land.”  Imagine.  Citizens putting themselves in harm’s way, protecting their communities, for free.  Yes.
My father was a member of Brentwood’s Volunteer Fire Department, and some of my earliest memories involve riding on the big Mack fire trucks in the 4th of July parade as a small boy.  Norman Rockwell moments and, indeed, America was a different place back then.  Especially small-town, tight communities.  I’m really feeling the lack of caring community these days.  My neighborhood has been destroyed and I’m surrounded by cold, uncaring tech bros who come and go.  I suspect they use Ocean Beach as a kind of holding area while they look for accommodation in the Golden Mission.  La Playa has become “gasoline alley.”  Ugh.

“Steve Jobs,” portrait of the artist as tech guru: What we lose when we worship at the altar of commerce

When we abandon the arts, this is what’s left 

"Steve Jobs," portrait of the artist as tech guru: What we lose when we worship at the altar of commerce
Michael Fassbender in “Steve Jobs” (Credit: Universal Pictures)

The trailer for the new Steve Jobs biopic has just been released, and it looks like the movie could be formidable, maybe one of the films of the year. Despite changes in cast and director, the matching of director Danny Boyle with actor Michael Fassbender (along with screenwriter Aaron Sorkin) could summon serious dramatic firepower.

The movie seems to make explicit something that’s been swirling for a while now: That engineers, software jockeys, and product designers are the capital-A Artists of our age. They are what painters and sculptors were to the Renaissance, what composers and poets were to the 19th century, what novelists and, later, auteur film directors, were to the 20th.

The likening of tech savants to artists goes back at least as far as Richard Florida’s books about the creative class, but it picked up energy with the 2011 death of Jobs, who was hailed as a job creator by Republican politicians and a mystic genius by many others. You see this same impulse in the opening of Jonah Lehrer’s now-discredited book “Imagine,” which compared the inventor of the Swiffer (which “continues to dominate the post-mop market”) with William James and Bob Dylan.

The metaphor becomes quite clear in “Steve Jobs,” which is based on Walter Isaacson’s bestselling biography. In the trailer, Fassbender’s Jobs announces that he is not a musician – he is the conductor. “Musicians play their instruments,” he says. “I play the orchestra.” Stirring orchestral music – with stabbing violins – plays through the trailer. “Artists lead,” the Jobs character rants to a meeting at a particularly fraught time, and “hacks ask for a show of hands.”

But how many Americans – including those who can tell you the difference between every generation of iPhone – can name a single living conductor? What about a real visual artist? (That is, someone besides Lady Gaga.) As a recent CNN article asks, what about a famous living poet? (“No, not Maya Angelou. She died last year.”)

So how did we get here, where technology designers claiming the mantle of the Artist have replaced – in both the media and the public’s esteem – the actual working, living, breathing artist?

The reason is not just the weird technological fetishism that has gripped American culture since the ‘80s. It also comes from how we as a society have spent our resources, and it goes way back.

While Americans, on the whole, didn’t worship culture with the same dedication as Europeans, the whole West saw the arts as something central, even a replacement for religion: After Nietzsche told us God was dead, theaters and concert halls that looked like churches sprouted up not just in Britain and on the continent, but in the wealthier and more settled cities in the States as well. Conductors like Toscanini became cultural heroes. Nations and plutocrats alike spent money to spread the gospel.

Cold War funding supported culture even more directly – Eisenhower sent Louis Armstrong overseas – and television stations and magazines considered the dissemination of the arts part of what they did. Maria Callas, Thelonious Monk, and Leonard Bernstein showed up not just in small-circulation specialty publications but on the cover of Time magazine.

For all the differences in their politics, generations, and backgrounds, the presidents who followed Eisenhower did not abandon the religion of culture: Kennedy had Robert Frost read at his inauguration. JFK spoke often, publicly and privately, about the importance of culture, writing that “There is a connection, hard to explain logically but easy to feel, between achievement in public life and progress in the arts.” Lyndon Johnson followed him by founding the National Endowment for the Arts. Nixon made war on a lot of the previous administration’s achievements, but not this.

Even more important, public schools offered music and arts education that gave at least some students a sense that this stuff mattered and was a basic part of being an educated, informed citizen.

How did this whole edifice collapse, so that music, poetry, theater, painting and everything else would become just another part of the mix of commerce and “content”? That’s hard to make sense of, but let’s just say that the culture wars of the Bush I years, the demonization of artists and other subversives as a “cultural elite,” and the attacks on the canon by the academic left didn’t help. Nor did the conquest of neoliberalism, waged by Reagan and Thatcher and their respective brain trusts, which told us that markets are supreme and more important than musty old ideas like society or culture. And the globalization that came after gave narrow-minded utilitarians reason to slice and dice arts education. It’s still happening.

In the simplest sense: When you use state funding to help develop computer technology and what would become the Internet, and cut support for arts and culture, what do you think is gonna happen?

So what’s wrong with making Steve Jobs and others who came up with cool gadgets and efficient apps for getting pizza to people in San Francisco into the artists of our age? Doesn’t culture change over the decades and centuries?

Well, sort of, but here’s the key difference. The whole idea of poetry or a symphony or a novel is to get past daily life. It’s not just about cool or efficiency or even entertainment but an aspect of – to mangle the title of Geoff Dyer’s excellent essay collection – what was previously known as the human condition. We used to see culture as something that could be deeper than a really fast computer or a cordless mouse.

The literary essayist Richard Rodriguez has said that we live in “the age of the engineer.” If so, something really has died inside us. The Jobs movie looks great, but if this guy is our John Lennon or Nina Simone or Bernstein or Beethoven, we really are cooked.

Scott Timberg is a staff writer for Salon, focusing on culture. A longtime arts reporter in Los Angeles who has contributed to the New York Times, he runs the blog Culture Crash. He’s the author of the new book, “Culture Crash: The Killing of the Creative Class.”

The Tech Industry Bubble Is About To Burst

Euphoric reaction to superstar tech businesses is rampant — so much so that the tech industry is in denial about looming threats. The industry is in a bubble, and there are plenty of indicators for those willing to open their eyes. Rearing unicorns, however, is a distracting fascination.

The Perfect Storm

Raising funding for tech startups has never been so easy. Much of this flood of money has come from mutual funds and hedge funds, including Fidelity, T. Rowe Price and Tiger Global Management. This is altering not only the funding landscape for tech startups, but also valuation expectations.

There is widespread concern that valuations are defying rationality. Entrepreneurs and their investors are abandoning traditional valuation and performance metrics for more unconventional ones. Another cited cause of rising valuations is the trend of downside protections for late-stage investors, which inflate headline valuations further. The combination of these factors has left the sector in a state of artificial valuations.

Meanwhile, the companies themselves are burning through cash like there is no tomorrow. Throwing money at marketing, overheads and, in particular, remuneration has become the accepted investment strategy for startup growth. All this does is perpetuate the vicious cycle of raising more money and spending more money. For the amounts that some of these businesses have raised, the jury is still out on actual profitability.

Unicorn Season

CB Insights publishes information on unicorns (companies with a valuation above $1 billion), which shows that access to the club has become less and less exclusive in the last couple of years. The chart below shows that the number of companies newly valued at $1 billion or above in 2014 exceeded previous years by quite some margin (47 unicorns joined the club in 2014 vs. 7 and 8 in 2012 and 2013, respectively). And in the first five months of 2015, the trend showed no signs of abating (32 new unicorns as of June 1, 2015).

[Chart: number of new unicorns per year (source: CB Insights)]

Different Experts, Same Conclusion

In the face of these trends, a small group of well-respected and influential individuals are voicing their concern. They are reflecting on what happened in the last dot-com bust and identifying fallacies in the current unsustainable modus operandi. These relatively lonely voices are difficult to ignore. They include established successful entrepreneurs, respected VC and hedge fund investors, economists and CEOs who are riding their very own unicorns.

Mark Cuban is scathing in his personal blog, arguing that this tech bubble is worse than that of 2000 because, unlike in 2000, this time the “bubble comes from private investors,” including angel investors and crowd funders. The problem for these investors is that there is no liquidity in their investments, and we’re currently in a market with “no valuations and no liquidity.” Cuban was one of the fortunate ones, exiting his company, Broadcast.com, just before the 2000 crash and netting $5 billion. But he saw others around him who were not so lucky then, and he fears the same this time around.

A number of high-profile investors have come out and said what their peers all secretly must know. Responding to concerns raised by Bill Gurley (Benchmark) and Fred Wilson (Union Square Ventures), Marc Andreessen of Andreessen Horowitz expressed his thoughts in an 18-tweet tirade. Andreessen agrees with Gurley and Wilson that high cash burn in startups is the cause of spiraling valuations and underperformance; the availability of capital is hampering common sense.


As Wilson emphasizes, “At some point you have to build a real business, generate real profits, sustain the company without the largess of investor’s capital, and start producing value the old fashioned way.” Gurley, a stalwart investor, puts the discussion into context by saying “We’re in a risk bubble … we’re taking on … a level of risk that we’ve never taken on before in the history of Silicon Valley startups.”

The tech bubble has drawn unconventional investors, such as hedge funds, into privately owned startups. David Einhorn of Greenlight Capital Inc. stated that although he is bullish on the tech sector, he has identified a number of momentum technology stocks that have reached prices beyond any normal sense of valuation, and his firm has shorted many of them in what it calls the “bubble basket.”

Meanwhile, Nobel Prize-winning economist Robert Shiller, who previously warned about both the dot-com and housing bubbles, suspects the recent equity valuation increases stem more from fear than exuberance. Shiller believes that “compared with history, US stocks are overvalued.” He says, “one way to assess this is by looking at the CAPE (cyclically adjusted P/E) ratio … defined as the real stock price (using the S&P Composite Stock Price Index deflated by CPI) divided by the ten-year average of real earnings per share.”

Shiller says this has been a “good predictor of subsequent stock market returns, especially over the long run. The CAPE ratio has recently been around 27, which is quite high by US historical standards. The only other times it has been that high or higher were in 1929, 2000, and 2007 — all moments before market crashes.”
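To make Shiller’s definition concrete, here is a minimal sketch of the CAPE arithmetic in Python; the index level, earnings-per-share and CPI figures below are illustrative placeholders, not Shiller’s actual S&P or CPI data:

    # Minimal sketch of the CAPE (cyclically adjusted P/E) ratio as defined
    # in the quote above: real price divided by the ten-year average of
    # real (CPI-deflated) earnings per share. All inputs here are made up.
    def cape(price_today, cpi_today, eps_history, cpi_history):
        # Deflate each year's nominal EPS into today's dollars, then average.
        real_eps = [e * cpi_today / c for e, c in zip(eps_history, cpi_history)]
        return price_today / (sum(real_eps) / len(real_eps))

    # Hypothetical inputs: an index level of 2100 at a CPI of 238, with ten
    # years of nominal earnings per share and the matching CPI readings.
    eps_history = [85, 66, 61, 77, 87, 87, 90, 100, 103, 106]
    cpi_history = [210, 215, 218, 224, 229, 233, 237, 237, 237, 238]
    print(round(cape(2100, 238, eps_history, cpi_history), 1))  # prints 23.4

With these made-up numbers the ratio lands in the low 20s; Shiller’s point is that the actual reading of about 27 has historically been seen only just before major crashes.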

Perhaps the most surprising contributor to the debate on a looming tech bubble is Evan Spiegel, CEO of Snapchat. Founded in 2011, Spiegel’s company is a certified “unicorn,” with a valuation in excess of $15 billion. Spiegel believes that years of near-zero interest rates have created an asset bubble that has led people to make “riskier investments” than they otherwise would. He added that a correction was inevitable.

What Does A Bubble Look Like?

To shed light on how close we may be to the tech bubble bursting, it is worth understanding what constitutes a bubble. Typically, the term refers to a situation where the price of an asset exceeds its fundamental value by a large margin.

Economist Hyman Minsky’s theory of financial instability, set out in his 1986 book Stabilizing an Unstable Economy, attracted a great deal of attention and gathered an increasing number of adherents following the crisis of 2008-09. Minsky identified five stages that culminate in a bubble, as described in this Forbes article: displacement, boom, euphoria, profit taking, and panic.

Uber is an enviable company for much of what it has achieved, and the team is to be commended for how it has grown the business, as well as for its previous successes. However, it serves as a good example with which to illustrate the dynamics of the tech bubble.

Displacement: Investors become excited by a new paradigm, such as advances in technology or historically low interest rates. The explosion of the “sharing economy” has seen companies such as Uber, Lyft and Airbnb grow exponentially in recent years by taking advantage of this new mode of operation.

Boom: Prices rise slowly at first, but then gain momentum as more participants enter the market. Fear of missing out (FOMO) attracts even more participants, and publicity for the asset class increases. Uber’s investment rounds since its 2010 seed round show a wide variety of investors wanting a piece of the action, perhaps partly out of fear of missing out on the golden goose. Hedge funds and investment banks have also begun funding the business, underlining the facelift happening in this sector.

Euphoria: Asset prices increase exponentially; little rationale is evident in decision making. During this phase, new valuation measures and metrics are touted to justify the unrelenting rise of asset prices. Uber’s increased valuation between funding rounds symbolizes the euphoria around the business. The chart below shows the evolution of Uber’s pre-money valuation over its last several funding rounds.

[Chart: Uber pre-money valuation by funding round. Source: CB Insights; data analyzed by Funding Your Tech Startup]

Although the pace of revenue growth at Uber is astounding (currently doubling approximately every 12 months), profitability is less certain. Margins should improve over time as recognition and saturation are achieved in newer markets, but it is difficult to ignore the regulatory burdens and lawsuits the business is facing, which could steer it off course.

Profit taking: The few who have identified what’s going on take their profit by selling their positions. This is the right time to exit, but the majority does not see it. This is the next indicator on the horizon that will confirm we are in a tech bubble, and that it is about to burst. The catalyst for profit taking could be regulatory strains or cash consumption that outpaces profitability gains in startups. Savvy investors will take the opportunity to exit while valuations are still high. The exits may well come too late for investors further behind on the FOMO curve, or for new types of investors who don’t appreciate that the market has moved.

Panic: By now it’s too late, and asset prices collapse as rapidly as they once rose. As everyone realizes the situation and tries to cash out, supply outstrips demand and many face big losses. Watch this space for the unfortunately impending examples.

Conclusion

That we are in a tech bubble is not in doubt. That the bubble is about to burst, however, is not something the sector wants to wake up to. The good times the sector is enjoying are becoming increasingly artificial. The tech startup space at the moment resembles the story of the emperor with no clothes. It falls to a few established, reasoned voices to persist with their concerns until the majority finally listens.

http://techcrunch.com/2015/06/26/the-tech-industry-is-in-denial-but-the-bubble-is-about-to-burst/?ncid=rss&cps=gravity_1462_-7218853287940442458

Banning games isn’t the answer: Why Apple’s response to Charleston is so stupid

When it comes to disowning the Confederate battle flag, there’s a right way and a wrong way. Apple chose the latter

Tim Cook (Credit: AP/Richard Drew)

Earlier this week, in response to South Carolina Gov. Nikki Haley’s call for the Confederate battle flag to be removed from the state’s Capitol grounds, I wrote a post commending the governor for doing what was obviously right.

But I also expressed concern that she was doing the right thing for the wrong reasons, and that, by defending her move with the language of feelings, she risked perpetuating a misunderstanding of why so many find the Confederate battle flag objectionable. It’s not about politeness or manners, I argued; it’s about fighting white supremacy.

In the mere two days since that post went live, it has become clear that Haley’s break with the flag, that once-unimpeachable shibboleth of Southern politics, wasn’t an act of bravery so much as good professional instincts. Not only has Haley been followed by fellow Southern Republicans in the South Carolina legislature, as well as in Mississippi and Alabama, but private sector behemoths like Walmart, Amazon, Sears and eBay have decided to ditch the flag, too. And now comes word that mighty Apple has hopped on the bandwagon, kicking multiple rebel-flagged games from its App Store.

Unfortunately, however, it appears that Haley was ahead of the curve in more ways than one. Because just as Apple has joined her in no longer wanting to legitimize the trademark of the Army of Northern Virginia, it seems to have done so without actually understanding why. The company claims it’s only zapping apps that feature the flag “in offensive or mean-spirited ways.” But when you look at some of its targets, including many games about the Civil War itself, that doesn’t hold up. A different, stupider explanation — that the company is treating the flag as if it were no less dangerous than the eyes of Medusa — makes more sense.

Take the “Civil War” game series by HexWar Games, for example, which saw at least four of its editions banned by Apple. To state the obvious: these are war games, and they’re war games about the Civil War. When it comes to the mission of the Confederate army, I’ll agree with Ulysses S. Grant’s famous description of it as “one of the worst for which a people ever fought, and one for which there was the least excuse.” But that hardly makes the use of the flag in a game about the war “mean-spirited” or “offensive.” Apple says it won’t touch apps that “display the Confederate flag for educational or historical uses,” and though I doubt these war games are educational, they are historical, at least.

Now, if these games soon get their App Store privileges back and once again find themselves on Apple’s virtual shelves, I won’t be surprised. According to Kotaku, this isn’t the first time the people running the App Store have shown signs of being either confused or incompetent. And despite what the angry young men of #GamerGate may argue in dozens upon dozens of ever-so-angry tweets, this is not exactly the greatest infringement on liberty the world has ever seen. Stipulating all of that, though, I still think Apple’s decision was ominous; and I still believe it should especially concern sincere anti-racists.
Because if we adopt a zero-tolerance policy regarding the Confederate flag, you can guess, looking at the present, where it’s likely to lead. The idea that white supremacy is a distinctly Southern affliction would likely be reinforced, even though it has always been a fantasy. The mistaken view of racism as an artifact of history would likely get strengthened, too. People would likely treat the flag like we treat the word “nigger,” hoping that if they ban it from their consciousness, they can make racism disappear. And the myth that we can wall ourselves off from the nasty parts of our heritage, which is one of American society’s more distinctive neuroses, would become even harder to shake.

A superior course, I’d argue, would be to deepen our understanding of our past, so that we can see the connections between the injustices of history and the iniquities of the present. Instead of exiling the flag from the culture as if it never existed, we’d acknowledge how the legacy of the Confederacy is still with us today. We’d recognize white supremacy as an inextricable part of the country’s founding, but not one we can’t defeat so long as we have purpose and conviction. And the cherry on top? Nobody would have to give up their video games.

Elias Isquith is a staff writer at Salon, focusing on politics. Follow him on Twitter at @eliasisquith.

L0pht’s warnings about the Internet drew notice but little action

NET OF INSECURITY

A disaster foretold — and ignored

Published on June 22, 2015

The seven young men sitting before some of Capitol Hill’s most powerful lawmakers weren’t graduate students or junior analysts from some think tank. No, Space Rogue, Kingpin, Mudge and the others were hackers who had come from the mysterious environs of cyberspace to deliver a terrifying warning to the world.

The making of a vulnerable Internet: This story is the third of a multi-part project on the Internet’s inherent vulnerabilities and why they may never be fixed.

Part 1: The story of how the Internet became so vulnerable
Part 2: The long life of a ‘quick fix’

Your computers, they told the panel of senators in May 1998, are not safe — not the software, not the hardware, not the networks that link them together. The companies that build these things don’t care, the hackers continued, and they have no reason to care because failure costs them nothing. And the federal government has neither the skill nor the will to do anything about it.

“If you’re looking for computer security, then the Internet is not the place to be,” said Mudge, then 27 and looking like a biblical prophet with long brown hair flowing past his shoulders. The Internet itself, he added, could be taken down “by any of the seven individuals seated before you” with 30 minutes of well-choreographed keystrokes.

The senators — a bipartisan group including John Glenn, Joseph I. Lieberman and Fred D. Thompson — nodded gravely, making clear that they understood the seriousness of the situation. “We’re going to have to do something about it,” Thompson said.

What happened instead was a tragedy of missed opportunity, and 17 years later the world is still paying the price in rampant insecurity.

The testimony from L0pht, as the hacker group called itself, was among the most audacious of a rising chorus of warnings delivered in the 1990s as the Internet was exploding in popularity, well on its way to becoming a potent global force for communication, commerce and criminality.

Hackers and other computer experts sounded alarms as the World Wide Web brought the transformative power of computer networking to the masses. This created a universe of risks for users and the critical real-world systems, such as power plants, rapidly going online as well.

Officials in Washington and throughout the world failed to forcefully address these problems as trouble spread across cyberspace, a vast new frontier of opportunity and lawlessness. Even today, many serious online intrusions exploit flaws in software first built in that era, such as Adobe Flash, Oracle’s Java and Microsoft’s Internet Explorer.

“We have the same security problems,” said Space Rogue, whose real name is Cris Thomas. “There’s a lot more money involved. There’s a lot more awareness. But the same problems are still there.”

L0pht, born of the bustling hacker scene in the Boston area, rose to prominence as a flood of new software was introducing such wonders as sound, animation and interactive games to the Web. This software, which required access to the core functions of each user’s computer, also gave hackers new opportunities to manipulate machines from afar.

Breaking into networked computers became so easy that the Internet, long the realm of idealistic scientists and hobbyists, gradually grew infested with the most pragmatic of professionals: crooks, scam artists, spies and cyberwarriors. They exploited computer bugs for profit or other gain while continually looking for new vulnerabilities.

Tech companies sometimes scrambled to fix problems — often after hackers or academic researchers revealed them publicly — but few companies were willing to undertake the costly overhauls necessary to make their systems significantly more secure against future attacks. Their profits depended on other factors, such as providing consumers new features, not warding off hackers.

“In the real world, people only invest money to solve real problems, as opposed to hypothetical ones,” said Dan S. Wallach, a Rice University computer science professor who has been studying online threats since the 1990s. “The thing that you’re selling is not security. The thing that you’re selling is something else.”

The result was a culture within the tech industry often derided as “patch and pray.” In other words, keep building, keep selling and send out fixes as necessary. If a system failed — causing lost data, stolen credit card numbers or time-consuming computer crashes — the burden fell not on giant, rich tech companies but on their customers.

The members of L0pht say they often experienced this cavalier attitude in their day jobs, where some toiled as humble programmers or salesmen at computer stores. When they reported bugs to software makers, company officials often asked: Does anybody else know about this?

CONTINUED:

http://www.washingtonpost.com/sf/business/2015/06/22/net-of-insecurity-part-3/

Jurassic World, summer blockbuster

By Christine Schofelt
23 June 2015

Directed by Colin Trevorrow; written by Trevorrow, Rick Jaffa, Amanda Silver and Derek Connolly

Twenty-two years after the events in the original Jurassic Park (1993), the dreams of that film’s dinosaur-resurrecting scientist John Hammond (the late Richard Attenborough) have been fulfilled with the establishment of Jurassic World in the new film of the same name.


The island (fictionally located off the Pacific coast of Costa Rica) on which the original, failed park was built is now the home of the wildly popular dinosaur theme park, laboratory, hotels and shopping complex. In order to keep customers returning, increase profits and thereby satisfy corporate backers, new attractions in the form of different—and bigger—dinosaurs have to be constantly introduced.

This leads to the splicing of genes from various extinct specimens and the introduction of elements of present-day reptiles. In typical Hollywood fashion, despite the most advanced laboratories and equipment, the scientists fail to look far enough ahead, predictably “unpredictable” side effects take hold, making the new creatures smarter and more deadly than their component parts … and the chase is on.

Though largely formulaic, Jurassic World is not without its charms and does touch on some interesting questions.

The film centers on two brothers, Zach (Nick Robinson) and Gray (Ty Simpkins), as well as on the relationship between their aunt, a driven businesswoman, Claire (Bryce Dallas Howard), and Owen (Chris Pratt), an expert on Velociraptors.

The latter pair have dated, fought and parted company, deciding they were “too different.” Owen, an ex-Navy war veteran, has been training some of the raptors, becoming in essence their “alpha.” His acquaintance and nominal boss, Hoskins (Vincent D’Onofrio), sees a military application for the raptors, and indeed for all the dinosaurs (“Imagine if we’d had them in Tora Bora”). He is determined to find a way to adapt them to this end. Owen disagrees with the plan.


When the inevitable escape of a new, smart, enormous dinosaur occurs, Hoskins’ company, InGen Security, sends in its private troops alongside Owen’s raptors. The classified genetic “contents” of the rogue lizard, Indominus rex, are revealed to include some raptor DNA, which poses problems. Questions of loyalty on the part of Owen’s raptors come into the picture, and the struggle between nature and nurture/training plays out. The troops are largely killed off, and the saving of the island and the 20,000 park guests is then down to Owen, Claire, and the “good” dinosaurs.

The machinations of Hoskins, presented in a very straightforward—one might say simplistic—manner as the villain here, include working with the top scientist to develop dinosaurs especially for use in warfare. More time could have been spent on this, to be sure, but the fact that this element is even presented in a negative light in a blockbuster summer release bears noting.

One would like to consider this a let-up in the relentless drumbeat for war that Hollywood has been only too glad to take part in. That might be premature, though the failure of the mercenaries and their firepower to contain (or survive) their fight against the rogue Indominus, who succumbs to the mighty bites of other resurrected/created creatures instead, seems a step in the right direction.


Co-produced by Steven Spielberg (who directed the first two Jurassic films), Jurassic World seems to want to make some metaphorical points about the dog-eat-dog character of present-day social and corporate life. Director Colin Trevorrow, for example, told Entertainment Weekly: “The Indominus was meant to embody our worst tendencies. We’re surrounded by wonder and yet we want more. And we want it bigger, faster, louder, better. And in the world of the movie the animal is designed based on a series of corporate focus groups.”

And Trevorrow commented to news.com.au, “There’s something in the film about our greed and our desire for profit … The Indominus Rex, to me, is very much that desire, that need to be satisfied.” Bryce Dallas Howard, the daughter of Ron Howard, told the same news outlet about her character: “The quest for profit has compromised her own humanity.”

Of course, all of this, as sincere as it may be, has to be taken with a large grain of salt. The mild criticisms occur in a film that is very much an integral part of the Hollywood blockbuster phenomenon, which largely obstructs reflecting seriously on anything.

Throughout the film, which is well on its way to raking in a billion dollars in its first two weeks, one is struck simultaneously by the gratuitous, near-constant product placement (everything from Starbucks to Coca-Cola) and by the questions raised directly about the ethics of putting science in the service of the “shareholders.” Formulaic as the subplots may be, to its credit the film does come down against the practice. However, unlike recent films such as Chappie or Ex Machina, humanity’s scientific abilities themselves are less of a focus, and so the ethical questions are not terribly developed — instead the emphasis is on the chase, the escape and the happy ending.

All in all, unfortunately, Jurassic World does what it was designed to do: entertain without demanding too much of the audience.

 

http://www.wsws.org/en/articles/2015/06/23/jura-j23.html