We need to talk about death

Why ignoring our darkest fears only makes them worse

It’s a universal human experience. So why do we act like we need to confront it alone?


“I don’t want to die. It’s so permanent.”

So said my terminally ill grandmother, a kick-ass woman who made life-size oil paintings and drank vermouth on the rocks every afternoon.

This isn’t an anecdote I’d be likely to mention in regular conversation with friends. Talk about ruining everyone’s good time. (“Ick, that’s so morbid,” everyone would think.) But earlier this month, the New York Times released its 100 Notable Books of 2014, and among the notables were not one but two – two! – nonfiction titles about death. This seemingly unremarkable milestone is actually one that we should celebrate with a glass of champagne. Or, better yet, with vermouth.

Right now our approach to death, as a culture, is utterly insane: We just pretend it doesn’t exist. Any mention of mortality in casual conversation is greeted with awkwardness and a subject change. That same taboo even translates into situations where the concept of death is unavoidable: After losing a loved one, the bereaved are granted a few moments of mourning, after which the world around them kicks back into motion, as if nothing at all had changed. For those not personally affected by it, the reality of death stays hidden and ignored.

For me this isn’t an abstract topic. There’s been a lot of death in my life. There was my grandmother’s recent death, which sent my whole crazy family into a tailspin; but also my dad’s sudden death when I was 20. Under such circumstances (that is, the unexpected sort), you quickly discover that no one has any clue whatsoever how to deal with human mortality.

“Get through this and we’ll get through the worst of it,” someone said to me at my dad’s funeral, as if the funeral itself was death’s greatest burden, and not the permanent absence of the only dad I’ll ever have.

Gaffes like that are common. But insensitivity is just a symptom of much deeper issues, first of which is our underlying fear of death, a fear that might only boil to the surface when we’re directly confronted by it, but stays with us even as we try our best to ignore it. It’s a fear that my grandmother summed up perfectly when she was dying — the terror of our own, permanent nonexistence. Which makes sense. After all, it’s our basic biological imperative to survive. But on top of that natural fear of death, there’s another, separate issue: our unwillingness, as a culture, to shine a light on that fear, and talk about it. And as a result, we keep this whole huge part of the human experience cloistered away.



“We’re literally lacking a vocabulary to talk about [death],” said Lennon Flowers, a co-founder of an organization called the Dinner Party, which brings together 20- and 30-somethings who have lost a loved one to discuss “the ways in which it continues to affect our lives.”

That lack of vocabulary is a big problem, and not just for people who directly experience loss. It’s a problem for all of us, because it means we each grapple alone with the natural fear of our own expiry. We deny the fear, we bury it under an endless stream of distractions. And so it festers, making us all the more invested in keeping it buried, for how painful it would be to take it out and look at it after letting it rot for so long.

But why all the self-enforced agony? Maybe it’s because a more honest relationship with death would mean a more honest reckoning with our lives, calling into question the choices we’ve made and the ways we’ve chosen to live. And damn if that isn’t uncomfortable.

Of course, if there’s one thing our culture is great at, it’s giving instruction on how to live. There are the clichés — “live each day to the fullest” and “dance like no one’s watching” — and beyond them an endless stream of messages telling us how to look better, feel better, lose weight, have better sex, get promoted, flip houses, and make a delicious nutritious dinner in 30 minutes flat. But all of it is predicated on the notion that life is long and death is some shadowy thing that comes along when we hit 100. (And definitely not one minute before then!)

To get a sense of how self-defeating each of these goals can be, consider this chestnut given to us by a Native American sage by the name of Crazy Horse:

“Today is a good day to die, for all the things of my life are present.”

No, today is not a good day to die, because most of us feel we haven’t lived our lives yet. We run around from one thing to the next. We have plans to buy a house or a new car or, someday, to pursue our wildest dreams. We rush through the day to get to the evening, and through the week to get to the weekend, but once the weekend comes, we’re already thinking ahead to Monday morning. Our lives are one deferral after another.

Naturally, then, today isn’t a good day to die. How about tomorrow? Probably not. What number of days would we need to be comfortable saying what Crazy Horse said? Probably too big a number to count. We preserve the idea of death as an abstract thing that comes in very old age, rather than a constant possibility for us as fragile humans, because we build our whole lives atop that foundation.

What would we gain from finally opening up about death? How about the golden opportunity to consider what’s really important, not to mention the chance to be less lonely as we grapple with our own mortality, and the promise of being a real friend when someone we love loses someone they love. Plus it would all come back to us tenfold when we’re the ones going through a loss or reeling from a terminal diagnosis.

Sounds like a worthy undertaking, doesn’t it?

And that’s where there’s good news. Coming to grips with death is, as we’ve already established, really hard. But we at least have a model for doing so. Let’s consider, for example, the Times notable books I mentioned earlier. One of them, the graphic memoir “Can’t We Talk About Something More Pleasant?” provides an especially honest — and genuinely funny — account of author Roz Chast’s experience watching her parents grow old and die. The other book, Atul Gawande’s “Being Mortal,” reveals just how much even our medical establishment struggles with the end of life. Doctors are trained to treat sickness, of course, but often have little or no training in what to do when sickness is no longer treatable.

What both of these books do especially well is provide a vocabulary for articulating just how difficult a subject death can be for everyone — even the strongest and brightest among us. As a universal human experience, it isn’t something we should have to deal with alone. Struggling openly with death doesn’t make a person weak or maladjusted. And what Chast and Gawande both demonstrate is that talking about it doesn’t have to be awkward or uncomfortable, because these are anxieties that all of us have in common.

It’s a common refrain that what distinguishes humans from other animals is that humans can understand, on a rational level, the full magnitude of our mortality. But what also distinguishes humans is the richness of our relationships and the depths of our empathy — the ability we have to communicate our experiences and support those around us. Death is a deeply unsettling prospect, no matter who you are. But it doesn’t need to be a burden you face alone.

The following is a list of resources for those looking for an organized platform to discuss the topic of death:

  • Atul Gawande serves as an advisor to the Conversation Project, a site that encourages families to talk openly about end-of-life care — and to choose, in advance, whether they want to be at home or in a hospital bed, on life support or not — in short, to say in unequivocal terms what matters most when the end is near.
  • Vivian Nunez is the 22-year-old founder of a brand-new site called Too Damn Young. Nunez lost her mom when she was 10 and her grandmother – her second mother – 11 years later. “Losing someone you love is an extraordinarily isolating experience,” she said. “This is especially significant when you’re talking about teenagers, or a young adult, who loses someone at a young age, and is forced to face how real mortality is, and then not encouraged to talk about it.” She founded Too Damn Young so that bereaved teenagers will know they’re not alone and so they’ll have a public space to talk about it.
  • The Recollectors is a groundbreaking project by writer Alysia Abbott that tells the stories of people who lost a parent to AIDS. She’s exploding two big taboos – death and AIDS – in one clean shot.
  • Get Your Shit Together is another great one, a site launched by a young widow who learned the hard way that everyone should take some key steps to get their financial matters in order in case of an untimely death. “I (mostly) have my shit together,” the site’s founder says. “Now it’s your turn.”
  • There’s also Death Cafe, dedicated to “increasing awareness of death with a view to helping people make the most of their (finite) lives.” And Modern Loss, a site that’s received coverage from the New York Times and the Washington Post, shies away from nothing in its quest to tell stories about end of life and living with loss. “Death Cafe and Modern Loss have attracted a loyal following,” said Nicole Bélanger, author of “Grief in the Rearview: Three Motherless Years.” “They offer the safe space we crave to show up as we are, without worrying about having to polish up our grief and make it fit for public consumption.”

Perhaps these communities will start to influence the mainstream, as their emboldened members teach the rest of us that it’s OK, it’s really OK, to talk about death. If that happens, it will be a slow process – culture change always is. “Race and gender and myriad other subjects were forever taboo, but now we’re able to speak truth,” said Flowers of the Dinner Party. “And now we’re seeing that around death and dying.”

If she’s right, it’s the difference between the excruciating loneliness of hiding away our vulnerabilities and, instead, allowing them to connect us and bind us together.

THE DIVINE COMEDY

Dante Illuminating Florence with his Poem, by Domenico di Michelino

Midway upon the journey of our life
I found myself within a forest dark,
For the straightforward pathway had been lost.

Ah me! how hard a thing it is to say
What was this forest savage, rough, and stern,
Which in the very thought renews the fear.

Dante: The Divine Comedy
Inferno: Canto I

While plumbing the depths of my Kindle seeking World War II research materials, I discovered Dan Brown’s “Inferno” (2013) amongst the German books. I honestly don’t know how it got there. But I began reading “Inferno” as a break from all the war material, and Brown brought me back to the magical time I spent in Florence, Italy, and my days as a scholar of Medieval and Renaissance literature and iconography. I went to Italy to research artistic symbols and images as reflected in literature, and Florence was, of course, a major source of material. I even painstakingly translated parts of Dante’s “Divine Comedy” from the Medieval Italian. I miss the person I was back then. My plan was to stay in Florence, but life intruded.

The meme-ification of Ayn Rand

How the grumpy author became an Internet superstar

“Feminist” T-shirts are her latest viral sensation. Why the objectivist’s writings lend themselves to the Web


This article originally appeared on The Daily Dot.

Ayn Rand is not a feminist icon, but it speaks volumes about the Internet that some are implicitly characterizing her that way, so much so that she’s even become a ubiquitous force on the meme circuit.

Last week, Maureen O’Connor of The Cut wrote a piece about a popular shirt called the Unstoppable Muscle Tee, which features the quote: “The question isn’t who is going to let me, it’s who is going to stop me.”

As The Quote Investigator determined, this was actually a distortion of a passage from one of Rand’s best-known novels, The Fountainhead:

“Do you mean to tell me that you’re thinking seriously of building that way, when and if you are an architect?”

“Yes.”

“My dear fellow, who will let you?”

“That’s not the point. The point is, who will stop me?”

Ironically, Rand not only isn’t responsible for this trendy girl power mantra, but was actually an avowed enemy of feminism. As The Atlas Society explains in their article about feminism in the philosophy of Objectivism (Rand’s main ideological legacy), Randians may have supported certain political and social freedoms for women—the right to have an abortion, the ability to rise to the head of business based on individual merit—but they subscribed fiercely to cultural gender biases. Referring to herself as a “male chauvinist,” Rand argued that sexually healthy women should feel a sense of “hero worship” for the men in their life, expressed disgust at the idea that any woman would want to be president, and deplored progressive identity-based activist movements as inherently collectivist in nature.



How did Rand get so big on the Internet, which has become a popular place for progressive memes? A Pew Research study from 2005 discovered that “the percentage of both men and women who go online increases with the amount of household income,” and while both genders are equally likely to engage in heavy Internet use, white men statistically outnumber white women. This is important because Rand, despite iconoclastically eschewing ideological labels herself, is especially popular among libertarians, who are attracted to her pro-business, anti-government, and avowedly individualistic ideology. Self-identified libertarians and libertarian-minded conservatives, in turn, were found by a Pew Research study from 2011 to be disproportionately white, male, and affluent. Indeed, the sub-sect of the conservative movement that Pew determined was most likely to identify with the libertarian label was the so-called “Business Conservatives,” who are “the only group in which a majority (67 percent) believes the economic system is fair to most Americans rather than unfairly tilted in favor of the powerful.” They are also very favorably inclined toward the potential presidential candidacy of Rep. Paul Ryan (79 percent), who is well-known within the Beltway as an admirer of Rand’s work (once telling The Weekly Standard that “I give out Atlas Shrugged [by Ayn Rand] as Christmas presents, and I make all my interns read it”).

Rand’s fans, in other words, are one of the most visible forces on the Internet, and ideally situated to distribute her ideology. Rand’s online popularity is the result of this fortuitous intersection of power and interests among frequent Internet users. If one date can be established as the turning point for the flourishing of Internet libertarianism, it would most likely be May 16, 2007, when footage of former Rep. Ron Paul’s sharp non-interventionist rebuttal to Rudy Giuliani in that night’s Republican presidential debate became a viral hit. Ron Paul’s place in the ideological/cultural milieu that encompasses Randism is undeniable, as evidenced by exposés on their joint influence on college campuses and Paul’s upcoming cameo in the movie Atlas Shrugged: Part 3. During his 2008 and 2012 presidential campaigns, Paul attracted considerable attention for his remarkable ability to raise money through the Internet, and to this day he continues to root his cause in cyberspace through an eponymous online political opinion channel—while his son, Sen. Rand Paul, has made no secret of his hope to tap into his father’s base for his own likely presidential campaign in 2016. Even though the Pauls don’t share Rand’s views on many issues, the self-identified libertarians who infused energy and cash into their national campaigns are part of the same Internet phenomenon as the growth of Randism.

As the Unstoppable Muscle Tee hiccup makes clear, however, Rand’s Internet fashionability isn’t always tied to libertarianism or Objectivism (the name she gave her own ideology). It also has a great deal to do with the psychology of meme culture. In the words of Annalee Newitz, a writer who frequently comments on the cultural effects of science and technology:

To share a story is in part to take ownership of it, especially because you are often able to comment on a story that you are sharing on social media. If you can share a piece of information that’s an absolute truth—whether that’s how to uninstall apps on your phone, or what the NSA is really doing—you too become a truth teller. And that feels good. Just as good as it does to be the person who has the cutest cat picture on the Internet.

If there is one quality in Rand’s writing that was evident even to her early critics, it was the tone of absolute certainty that dripped from her prose, which manifests itself in the quotes appearing in memes such as “I swear by my life and my love of it that I will never live for the sake of another man, nor ask another man to live for mine,” “A creative man is motivated by the desire to achieve, not by the desire to beat others,” and “The ladder of success is best climbed by stepping on the rungs of opportunity.” Another Rand meme revolves around the popular quote: “Individual rights are not subject to a public vote; a majority has no right to vote away the rights of a minority; the political function of rights is precisely to protect minorities from oppression by majorities (and the smallest minority on Earth is the individual).”

What’s particularly noteworthy about these observations, aside from their definitiveness, is the fact that virtually no one adhering to a mainstream Western political ideology would disagree with them. Could you conceive of anyone on the left, right, or middle arguing that they’d accept being forced to live for another’s sake or want another to live solely for their own? Or that their ambitions are not driven by a desire to beat others? Or that they don’t think success comes from seizing on opportunities? Or that they think majorities should be able to vote away the rights of minorities?

These statements are platitudes, compellingly worded rhetorical catch-alls with inspiring messages that are unlikely to be contested when taken solely at face value. Like the erroneously attributed “The question isn’t who is going to let me, it’s who is going to stop me,” they can mean whatever the user wishes for them to mean. Conservatives can and will be found who claim that only they adhere to those values while liberals do not, many liberals will say the same thing about conservatives, and, of course, Rand wrote each of these statements with her own distinctly Objectivist contexts in mind. Because each one contains a generally accepted “absolute truth” (at least insofar as the strict text itself is concerned), they are perfect fodder for those who spread memes through pictures, GIFs, and online merchandise—people who wish to be “truth tellers.”

Future historians may marvel at the perfect storm of cultural conditions that allowed this Rand boom to take place. After all, there is nothing about the rise of Internet libertarianism that automatically guarantees the rise of meming as a trend, or vice versa. In retrospect, however, the fact that both libertarianism and meming are distinct products of the Internet age—one for demographic reasons, the other for psychological ones—made the explosion of Randisms virtually inevitable. Even if they’re destined to be used by movements with which she’d want no part, Ayn Rand isn’t going to fade away from cyberspace anytime soon.

http://www.salon.com/2014/11/18/how_ayn_rand_became_an_internet_superstar_partner/?source=newsletter

William Gibson: I never imagined Facebook

The brilliant science-fiction novelist who imagined the Web tells Salon how writers missed social media’s rise


Even if you’ve never heard of William Gibson, you’re probably familiar with his work. Arguably the most important sci-fi writer of his generation, Gibson has a cyber-noir imagination that has shaped everything from the Matrix aesthetic to geek culture to the way we conceptualize virtual reality. In a 1982 short story, Gibson coined the term “cyberspace.” Two years later, his first and most famous novel, “Neuromancer,” helped launch the cyberpunk genre. By the 1990s, Gibson was writing about big data, imagining Silk Road-esque Internet enclaves, and putting his characters on reality TV shows — a full four years before the first episode of “Big Brother.”

Prescience is flashy, but Gibson is less an oracle than a kind of speculative sociologist. A very contemporary flavor of dislocation seems to be his specialty. Gibson’s heroes shuttle between wildly discordant worlds: virtual paradises and physical squalor; digital landscapes and crumbling cities; extravagant wealth and poverty.

In his latest novel, “The Peripheral,” which came out on Tuesday, Gibson takes this dislocation to new extremes. Set in mid-21st century Appalachia and far-in-the-future London, “The Peripheral” is partly a murder mystery, and partly a time-travel mind-bender. Gibson’s characters aren’t just dislocated in space, now. They’ve become unhinged from history.

Born in South Carolina, Gibson has lived in Vancouver since the 1960s. Over the phone, we spoke about surveillance, celebrity and the concept of the eternal now.

You’re famous for writing about hackers, outlaws and marginal communities. But one of the heroes of “The Peripheral” is a near-omniscient intelligence agent. She has surveillance powers that the NSA could only dream of. Should I be surprised to see you portray that kind of character so positively?

Well, I don’t know. She’s complicated, because she is this kind of terrifying secret police person in the service of a ruthless global kleptocracy. At the same time, she seems to be slightly insane and rather nice. It’s not that I don’t have my serious purposes with her, but at the same time she’s something of a comic turn.

Her official role is supposed to be completely terrifying, but at the same time her role is not a surprise. It’s not like, “Wow, I never even knew that that existed.”



Most of the characters in “The Peripheral” assume that they’re being monitored at all times. That assumption is usually correct. As a reader, I was disconcerted by how natural this state of constant surveillance felt to me.

I don’t know if it would have been possible 30 years ago to convey that sense to the reader effectively, without the reader already having some sort of cultural module in place that can respond to that. If we had somehow been able to read this text 30 years ago, I don’t know how we would even register that. It would be a big thing for a reader to get their head around without a lot of explaining. It’s a scary thing, the extent to which I don’t have to explain why [the characters] take that surveillance for granted. Everybody just gets it.

You’re considered a founder of the cyberpunk genre, which tends to feature digital cowboys — independent operators working on the frontiers of technology. Is the counterculture ethos of cyberpunk still relevant in an era when the best hackers seem to be working for the Chinese and U.S. governments, and our most famous digital outlaw, Edward Snowden, is under the protection of Vladimir Putin?

It’s seemed to me for quite a while now that the most viable use for the term “cyberpunk” is in describing artifacts of popular culture. You can say, “Did you see this movie? No? Well, it’s really cyberpunk.” Or, “Did you see the cyberpunk pants she was wearing last night?”

People know what you’re talking about, but it doesn’t work so well describing human roles in the world today. We’re more complicated. I think one of the things I did in my early fiction, more or less for effect, was to depict worlds where there didn’t really seem to be much government. In “Neuromancer,” for example, there’s no government really on the case of these rogue AI experiments that are being done by billionaires in orbit. If I had been depicting a world in which there were governments and law enforcement, I would have depicted hackers on both sides of the fence.

In “Neuromancer,” I don’t think there’s any evidence of anybody who has any parents. It’s kind of a very adolescent book that way.

In “The Peripheral,” governments are involved on both sides of the book’s central conflict. Is that a sign that you’ve matured as a writer? Or are you reflecting changes in how governments operate?

I hope it’s both. This book probably has, for whatever reason, more of my own, I guess I could now call it adult, understanding of how things work. Which, I suspect, is as it should be. People in this book live under governments, for better or worse, and have parents, for better or worse.

In 1993, you wrote an influential article about Singapore for Wired magazine, in which you wondered whether the arrival of new information technology would make the country more free, or whether Singapore would prove that “it is possible to flourish through the active repression of free expression.” With two decades of perspective, do you feel like this question has been answered?

Well, I don’t know, actually. The question was, when I asked it, naive. I may have posed innocently a false dichotomy, because some days when you’re looking out at the Internet both things are possible simultaneously, in the same place.

So what do you think is a better way to phrase that question today? Or what would have been a better way to phrase it in 1993?

I think you would end with something like “or is this just the new normal?”

Is there anything about “the new normal” in particular that surprises you? What about the Internet today would you have been least likely to foresee?

It’s incredible, the ubiquity. I definitely didn’t foresee the extent to which we would all be connected almost all of the time without needing to be plugged in.

That makes me think of “Neuromancer,” in which the characters are always having to track down a physical jack, which they then use to plug themselves into this hyper-futuristic Internet.

Yes. It’s funny, when the book was first published, when it was just out — and it was not a big deal the first little while it was out, it was just another paperback original — I went to a science fiction convention. There were guys there who were, by the standards of 1984, far more computer-literate than I was. And they very cheerfully told me that I got it completely wrong, and I knew nothing. They kept saying over and over, “There’s never going to be enough bandwidth, you don’t understand. This could never happen.”

So, you know, here I am, this many years later with this little tiny flat thing in my hand that’s got more bandwidth than those guys thought was possible for a personal device to ever have, and the book is still resonant for at least some new readers, even though it’s increasingly hung with the inevitable obsolescence of having been first published in 1984. Now it’s not really in the detail, but in the broader outline.

You wrote “Neuromancer” on a 1927 Hermes typewriter. In an essay of yours from the mid-1990s, you specifically mention choosing not to use email. Does being a bit removed from digital culture help you critique it better? Or do you feel that you’re immersed in that culture, now?

I no longer have the luxury of being as removed from it as I was then. I was waiting for it to come to me. When I wrote [about staying off email], there was a learning curve involved in using email, a few years prior to the Web.

As soon as the Web arrived, I was there, because there was no learning curve. The interface had been civilized, and I’ve basically been there ever since. But I think I actually have a funny kind of advantage, in that I’m not generationally of [the Web]. Just being able to remember the world before it, some of the perspectives are quite interesting.

Drones and 3-D printing play major roles in “The Peripheral,” but social networks, for the most part, are obsolete in the book’s fictional future. How do you choose which technological trends to amplify in your writing, and which to ignore?

It’s mostly a matter of which ones I find most interesting at the time of writing. And the absence of social media in both those futures probably has more to do with my own lack of interest in that. It would mean a relatively enormous amount of work to incorporate social media into both those worlds, because it would all have to be invented and extrapolated.

Your three most recent novels, before “The Peripheral,” take place in some version of the present. You’re now returning to the future, which is where you started out as a writer in the 1980s. Futuristic sci-fi often feels more like cultural criticism of the present than an exercise in prediction. What is it about the future that helps us reflect on the contemporary world?

When I began to write science fiction, I already assumed that science fiction about the future is only ostensibly written about the future, that it’s really made of the present. Science fiction has wound up with a really good cultural toolkit — an unexpectedly good cultural toolkit — for taking apart the present and theorizing on how it works, in the guise of presenting an imagined future.

The three previous books were basically written to find out whether or not I could use the toolkit that I’d acquired writing fictions about imaginary futures on the present, but use it for more overtly naturalistic purposes. I have no idea at this point whether my next book will be set in an imaginary future or the contemporary present or the past.

Do you feel as if sci-fi has actually helped dictate the future? I was speaking with a friend earlier about this, and he phrased the question well: Did a book like “Neuromancer” predict the future, or did it establish a dress code for it? In other words, did it describe a future that people then tried to live out?

I think that the two halves of that are in some kind of symbiotic relationship with one another. Science fiction ostensibly tries to predict the future. And the people who wind up making the future sometimes did what they did because they read a piece of science fiction. “Dress code” is an interesting way to put it. It’s more like … it’s more like attitude, really. What will our attitude be toward the future when the future is the present? And that’s actually much more difficult to correctly predict than what sort of personal devices people will be carrying.

How do you think that attitude has changed since you started writing? Could you describe the attitude of our current moment?

The day the Apple Watch was launched, late in the day someone on Twitter announced that it was already over. They cited some subject, they linked to something, indicating that our moment of giddy future shock was now over. There’s just some sort of endless now, now.

Could you go into that a little bit more, what you mean by an “endless now”?

Fifty years ago, I think now was longer. I think that the cultural and individual concept of the present moment was a year, or two, or six months. It wasn’t measured in clicks. Concepts of the world and of the self couldn’t change as instantly or in some cases as constantly. And I think that has resulted in there being a now that’s so short that in a sense it’s as though it’s eternal. We’re just always in the moment.

And it takes something really horrible, like some terrible, gripping disaster, to lift us out of that, or some kind of extra-strong sense of outrage, which we know that we share with millions of other people. Unfortunately, those are the things that really perk us up. This is where we get perked up, perked up for longer than over a new iPhone, say.

The worlds that you imagine are enchanting, but they also tend to be pretty grim. Is it possible to write good sci-fi that doesn’t have some sort of dystopian edge?

I don’t know. It wouldn’t occur to me to try. The world today, considered in its totality, has a considerable dystopian edge. Perhaps that’s always been true.

I often work in a form of literature that is inherently fantastic. But at the same time that I’m doing that, I’ve always shared concerns with more naturalistic forms of writing. I generally try to make my characters emotionally realistic. I do now, at least; I can’t say I always have done that. And I want the imaginary world they live in and the imaginary problems that they have to reflect the real world, and to some extent real problems that real people are having.

It’s difficult for me to imagine a character in a work of contemporary fiction who wouldn’t have any concerns with the more dystopian elements of contemporary reality. I can imagine one, but she’d be a weird … she’d be a strange character. Maybe some kind of monster. Totally narcissistic.

What makes this character monstrous? The narcissism?

Well, yeah, someone sufficiently self-involved. It doesn’t require anything like the more clinical forms of narcissism. But someone who’s sufficiently self-involved as to just not be bothered with the big bad things that are happening in the world, or the bad things — regular-size bad things — that are happening to one’s neighbors. There certainly are people like that out there. The Internet is full of them. I see them every day.

You were raised in the South, and you live in Vancouver, but, like Philip K. Dick, you’ve set some of your most famous work in San Francisco. What is the appeal of the city for technological dreamers? And how does the Silicon Valley of today fit into that Bay Area ethos?

I’m very curious to go back to San Francisco while on tour for this book, because it’s been a few years since I’ve been there, and it was quite a few years before that when I wrote about San Francisco in my second series of books.

I think one of the reasons I chose it was that it was a place that I would get to fairly frequently, so it would stay fresh in memory, but it also seemed kind of out of the loop. It was kind of an easy canvas for me, an easier canvas to set a future in than Los Angeles. It seemed to have fewer moving parts. And that’s obviously no longer the case, but I really know contemporary San Francisco now more by word of mouth than I do from first-person experience. I really think it sounds like a genuinely new iteration of San Francisco.

Do you think that Google and Facebook and this Silicon Valley culture are the heirs to the Internet that you so presciently imagined in the 1980s? Or do they feel like they’ve taken the Web in different directions than what you expected?

Generally it went in directions that didn’t occur to me. It seems to me now that if I had been a very different kind of novelist, I would have been more likely to foresee something like Facebook. But you know, if you try to imagine that somebody in 1982 writes this novel that totally and accurately predicted what it would be like to be on Facebook, and then tried to get it published? I don’t know if you would be able to get it published. Because how exciting is that, or what kind of crime story could you set there?

Without even knowing it, I was limited by the kind of fiction of the imaginary future that I was trying to write. I could use detective gangster stories, and there is a real world of the Internet that’s like that, you know? Very much like that. Although the crimes are so different. The ace Russian hacker mobs are not necessarily crashing into the global corporations. They’re stealing your Home Depot information. If I’d put that as an exploit in “Neuromancer,” nobody would have gotten it. Although it would have made me seem very, very prescient.

You’ve written often and eloquently about cults of celebrity and the surrealness of fame. By this point you’re pretty famous yourself. Has writing about fame changed the way you experience it? Does experiencing fame change the way you write about it?

Writers in our society, even today, have a fairly homeopathic level of celebrity compared to actors and really popular musicians, or Kardashians. I think in [my 1993 novel] “Virtual Light,” I sort of predicted Kardashian. Or there’s an implied celebrity industry in that book that’s very much like that. You become famous just for being famous. And you can keep it rolling.

But writers, not so much. Writers get just a little bit of it on a day-to-day basis. Writers are in an interesting place in our society to observe how that works, because we can be sort of famous, but not really famous. Partly I’d written about fame because I’d seen little bits of it, but the bigger reason is the extent to which it seems that celebrity is the essential postmodern product, and the essential post-industrial product. The so-called developed world pioneered it. So it’s sort of inherently in my ballpark. It would be weird if it wasn’t there.

You have this reputation of being something of a Cassandra. I don’t want to put you on the spot and ask for predictions. But I’m curious: For people who are trying to understand technological trends, and social trends, where do you recommend they look? What should they be observing?

I think the best advice I’ve ever heard on that was from Samuel R. Delany, the great American writer. He said, “If you want to know how something works, look at one that’s broken.” I encountered that remark of his before I began writing, and it’s one of my fridge magnets for writing.

Anything I make, and anything I’m describing in terms of its workings — even if I were a non-literary futuristic writer of some kind — I think that statement would be very resonant for me. Looking at the broken ones will tell you more about what the thing actually does than looking at one that’s perfectly functioning, because then you’re only seeing the surface, and you’re only seeing what its makers want you to see. If you want to understand social media, look at troubled social media. Or maybe failed social media, things like that.

Do you think that’s partly why so much science fiction is crime fiction, too?

Yeah, it might be. Crime fiction gives the author the excuse to have a protagonist who gets her nose into everything and goes where she’s not supposed to go and asks questions that will generate answers that the author wants the reader to see. It’s a handy combination. Detective fiction is in large part related to literary naturalism, and literary naturalism was quite a radical concept that posited that you could use the novel to explore existing elements of society which had previously been forbidden, like the distribution of capital and class, and what sex really was. Those were all naturalistic concerns. They also yielded to detective fiction. Detective fiction and science fiction are an ideal cocktail, in my opinion.

 

http://www.salon.com/2014/11/09/william_gibson_i_never_imagined_facebook/?source=newsletter

Why “Psychological Androgyny” Is Essential for Creativity


“Creative individuals are more likely to have not only the strengths of their own gender but those of the other one, too.”

Despite the immense canon of research on creativity — including its four stages, the cognitive science of the ideal creative routine, the role of memory, and the relationship between creativity and mental illness — very little has focused on one of life’s few givens that equally few of us can escape: gender and the genderedness of the mind.

In Creativity: The Psychology of Discovery and Invention (public library) — one of the most important, insightful, and influential books on creativity ever written — pioneering psychologist Mihaly Csikszentmihalyi examines a curious, under-appreciated yet crucial aspect of the creative mindset: a predisposition to psychological androgyny.

In all cultures, men are brought up to be “masculine” and to disregard and repress those aspects of their temperament that the culture regards as “feminine,” whereas women are expected to do the opposite. Creative individuals to a certain extent escape this rigid gender role stereotyping. When tests of masculinity/femininity are given to young people, over and over one finds that creative and talented girls are more dominant and tough than other girls, and creative boys are more sensitive and less aggressive than their male peers.

Illustration by Yang Liu from ‘Man Meets Woman,’ a pictogram critique of gender stereotypes.

Csikszentmihalyi points out that this psychological tendency toward androgyny shouldn’t be confused with homosexuality — it deals not with sexual constitution but with a set of psychoemotional capacities:

Psychological androgyny is a much wider concept, referring to a person’s ability to be at the same time aggressive and nurturant, sensitive and rigid, dominant and submissive, regardless of gender. A psychologically androgynous person in effect doubles his or her repertoire of responses and can interact with the world in terms of a much richer and varied spectrum of opportunities. It is not surprising that creative individuals are more likely to have not only the strengths of their own gender but those of the other one, too.

Citing his team’s extensive interviews with 91 individuals who scored high on creativity in various fields — including pioneering astronomer Vera Rubin, legendary sociobiologist E.O. Wilson, philosopher and marginalia champion Mortimer Adler, universe-disturber Madeleine L’Engle, social science titan John Gardner, poet extraordinaire Denise Levertov, and MacArthur genius Stephen Jay Gould — Csikszentmihalyi writes:

It was obvious that the women artists and scientists tended to be much more assertive, self-confident, and openly aggressive than women are generally brought up to be in our society. Perhaps the most noticeable evidence for the “femininity” of the men in the sample was their great preoccupation with their family and their sensitivity to subtle aspects of the environment that other men are inclined to dismiss as unimportant. But despite having these traits that are not usual to their gender, they retained the usual gender-specific traits as well.

Illustration from the 1970 satirical book ‘I’m Glad I’m a Boy! I’m Glad I’m a Girl!’

Creativity: The Psychology of Discovery and Invention is a revelatory read in its entirety, featuring insights on the ideal conditions for the creative process, the key characteristics of the innovative mindset, how aging influences creativity, and invaluable advice to the young from Csikszentmihalyi’s roster of 91 creative luminaries. Complement this particular excerpt with Ursula K. Le Guin on being a man — arguably the most brilliant meditation on gender ever written, by one of the most exuberantly creative minds of our time.

http://www.brainpickings.org/2014/11/07/psychological-androginy-creativity-csikszentmihalyi/