Dante Illuminating Florence with his Poem, by Domenico di Michelino

Midway upon the journey of our life
I found myself within a forest dark,
For the straightforward pathway had been lost.

Ah me! how hard a thing it is to say
What was this forest savage, rough, and stern,
Which in the very thought renews the fear.

Dante: The Divine Comedy
Inferno: Canto I

While plumbing the depths of my Kindle seeking World War 2 research materials, I discovered Dan Brown’s “Inferno” (2013) amongst the German books. I honestly don’t know how it got there. But I began reading “Inferno” as a break from all the war material, and Brown brought me back to the magical time I spent in Florence, Italy, and my days as a scholar of Medieval and Renaissance literature and iconography. I went to Italy to research artistic symbols and images as reflected in literature, and Florence was, of course, a major source of material. I even painstakingly translated parts of Dante’s “Divine Comedy” from the medieval Italian. I miss the person I was back then. My plan was to stay in Florence, but life intruded.

The meme-ification of Ayn Rand

How the grumpy author became an Internet superstar

“Feminist” T-shirts are her latest viral sensation. Why the objectivist’s writings lend themselves to the Web

Ayn Rand (Credit: Wikimedia)
This article originally appeared on The Daily Dot.

Ayn Rand is not a feminist icon, but it speaks volumes about the Internet that some are implicitly characterizing her that way, so much so that she’s even become a ubiquitous force on the meme circuit.

Last week, Maureen O’Connor of The Cut wrote a piece about a popular shirt called the Unstoppable Muscle Tee, which features the quote: “The question isn’t who is going to let me, it’s who is going to stop me.”

As The Quote Investigator determined, this was actually a distortion of a well-known passage from one of Rand’s best-known novels, The Fountainhead:

“Do you mean to tell me that you’re thinking seriously of building that way, when and if you are an architect?”


“My dear fellow, who will let you?”

“That’s not the point. The point is, who will stop me?”

Ironically, Rand not only isn’t responsible for this trendy girl power mantra, but was actually an avowed enemy of feminism. As The Atlas Society explains in their article about feminism in the philosophy of Objectivism (Rand’s main ideological legacy), Randians may have supported certain political and social freedoms for women—the right to have an abortion, the ability to rise to the head of business based on individual merit—but they subscribed fiercely to cultural gender biases. Referring to herself as a “male chauvinist,” Rand argued that sexually healthy women should feel a sense of “hero worship” for the men in their life, expressed disgust at the idea that any woman would want to be president, and deplored progressive identity-based activist movements as inherently collectivist in nature.

How did Rand get so big on the Internet, which has become a popular place for progressive meme culture? A Pew Research study from 2005 discovered that “the percentage of both men and women who go online increases with the amount of household income,” and while both genders are equally likely to engage in heavy Internet use, white men statistically outnumber white women. This is important because Rand, despite iconoclastically eschewing ideological labels herself, is especially popular among libertarians, who are attracted to her pro-business, anti-government, and avowedly individualistic ideology. Self-identified libertarians and libertarian-minded conservatives, in turn, were found by a Pew Research study from 2011 to be disproportionately white, male, and affluent. Indeed, the sub-sect of the conservative movement that Pew determined was most likely to identify with the libertarian label was the so-called “Business Conservatives,” who are “the only group in which a majority (67 percent) believes the economic system is fair to most Americans rather than unfairly tilted in favor of the powerful.” They are also very favorably inclined toward the potential presidential candidacy of Rep. Paul Ryan (79 percent), who is well-known within the Beltway as an admirer of Rand’s work (once telling The Weekly Standard that “I give out Atlas Shrugged [by Ayn Rand] as Christmas presents, and I make all my interns read it”).

Rand’s fans, in other words, are one of the most visible forces on the Internet, and ideally situated to distribute her ideology. Rand’s online popularity is the result of this fortuitous intersection of power and interests among frequent Internet users. If one date can be established as the turning point for the flourishing of Internet libertarianism, it would most likely be May 16, 2007, when footage of former Rep. Ron Paul’s sharp non-interventionist rebuttal to Rudy Giuliani in that night’s Republican presidential debate became a viral hit. Ron Paul’s place in the ideological/cultural milieu that encompasses Randism is undeniable, as evidenced by exposés on their joint influence on college campuses and Paul’s upcoming cameo in the movie Atlas Shrugged: Part 3. During his 2008 and 2012 presidential campaigns, Paul attracted considerable attention for his remarkable ability to raise money through the Internet, and to this day he continues to root his cause in cyberspace through an eponymous online political opinion channel—while his son, Sen. Rand Paul, has made no secret of his hope to tap into his father’s base for his own likely presidential campaign in 2016. Even though the Pauls don’t share Rand’s views on many issues, the self-identified libertarians who infused energy and cash into their national campaigns are part of the same Internet phenomenon as the growth of Randism.

As the Unstoppable Muscle Tee hiccup makes clear, however, Rand’s Internet fashionability isn’t always tied to libertarianism or Objectivism (the name she gave her own ideology). It also has a great deal to do with the psychology of meme culture. In the words of Annalee Newitz, a writer who frequently comments on the cultural effects of science and technology:

To share a story is in part to take ownership of it, especially because you are often able to comment on a story that you are sharing on social media. If you can share a piece of information that’s an absolute truth—whether that’s how to uninstall apps on your phone, or what the NSA is really doing—you too become a truth teller. And that feels good. Just as good as it does to be the person who has the cutest cat picture on the Internet.

If there is one quality in Rand’s writing that was evident even to her early critics, it was the tone of absolute certainty that dripped from her prose, which manifests itself in the quotes appearing in memes such as “I swear by my life and my love of it that I will never live for the sake of another man, nor ask another man to live for mine,” “A creative man is motivated by the desire to achieve, not by the desire to beat others,” and “The ladder of success is best climbed by stepping on the rungs of opportunity.” Another Rand meme revolves around the popular quote: “Individual rights are not subject to a public vote; a majority has no right to vote away the rights of a minority; the political function of rights is precisely to protect minorities from oppression by majorities (and the smallest minority on Earth is the individual).”

What’s particularly noteworthy about these observations, aside from their definitiveness, is the fact that virtually no one adhering to a mainstream Western political ideology would disagree with them. Could you conceive of anyone on the left, right, or middle arguing that they’d accept being forced to live for another’s sake or want another to live solely for their own? Or that their ambitions are not driven by a desire to beat others? Or that they don’t think success comes from seizing on opportunities? Or that they think majorities should be able to vote away the rights of minorities?

These statements are platitudes, compellingly worded rhetorical catch-alls with inspiring messages that are unlikely to be contested when taken solely at face value. Like the erroneously attributed “The question isn’t who is going to let me, it’s who is going to stop me,” they can mean whatever the user wishes for them to mean. Conservatives can and will be found who claim that only they adhere to those values while liberals do not, many liberals will say the same thing about conservatives, and, of course, Rand wrote each of these statements with her own distinctly Objectivist contexts in mind. Because each one contains a generally accepted “absolute truth” (at least insofar as the strict text itself is concerned), they are perfect fodder for those who spread memes through pictures, GIFs, and online merchandise—people who wish to be “truth tellers.”

Future historians may marvel at the perfect storm of cultural conditions that allowed this Rand boom to take place. After all, there is nothing about the rise of Internet libertarianism that automatically guarantees the rise of meming as a trend, or vice versa. In retrospect, however, the fact that both libertarianism and meming are distinct products of the Internet age—one for demographic reasons, the other for psychological ones—made the explosion of Randisms virtually inevitable. Even if they’re destined to be used by movements with which she’d want no part, Ayn Rand isn’t going to fade away from cyberspace anytime soon.

William Gibson: I never imagined Facebook

The brilliant science-fiction novelist who imagined the Web tells Salon how writers missed social media’s rise

William Gibson (Credit: Putnam/Michael O’Shea)

Even if you’ve never heard of William Gibson, you’re probably familiar with his work. Arguably the most important sci-fi writer of his generation, Gibson has a cyber-noir imagination that has shaped everything from the Matrix aesthetic to geek culture to the way we conceptualize virtual reality. In a 1982 short story, Gibson coined the term “cyberspace.” Two years later, his first and most famous novel, “Neuromancer,” helped launch the cyberpunk genre. By the 1990s, Gibson was writing about big data, imagining Silk Road-esque Internet enclaves, and putting his characters on reality TV shows — a full four years before the first episode of “Big Brother.”

Prescience is flashy, but Gibson is less an oracle than a kind of speculative sociologist. A very contemporary flavor of dislocation seems to be his specialty. Gibson’s heroes shuttle between wildly discordant worlds: virtual paradises and physical squalor; digital landscapes and crumbling cities; extravagant wealth and poverty.

In his latest novel, “The Peripheral,” which came out on Tuesday, Gibson takes this dislocation to new extremes. Set in mid-21st century Appalachia and far-in-the-future London, “The Peripheral” is partly a murder mystery, and partly a time-travel mind-bender. Gibson’s characters aren’t just dislocated in space, now. They’ve become unhinged from history.

Born in South Carolina, Gibson has lived in Vancouver since the 1960s. Over the phone, we spoke about surveillance, celebrity and the concept of the eternal now.

You’re famous for writing about hackers, outlaws and marginal communities. But one of the heroes of “The Peripheral” is a near-omniscient intelligence agent. She has surveillance powers that the NSA could only dream of. Should I be surprised to see you portray that kind of character so positively?

Well, I don’t know. She’s complicated, because she is this kind of terrifying secret police person in the service of a ruthless global kleptocracy. At the same time, she seems to be slightly insane and rather nice. It’s not that I don’t have my serious purposes with her, but at the same time she’s something of a comic turn.

Her official role is supposed to be completely terrifying, but at the same time her role is not a surprise. It’s not like, “Wow, I never even knew that that existed.”

Most of the characters in “The Peripheral” assume that they’re being monitored at all times. That assumption is usually correct. As a reader, I was disconcerted by how natural this state of constant surveillance felt to me.

I don’t know if it would have been possible 30 years ago to convey that sense to the reader effectively, without the reader already having some sort of cultural module in place that can respond to that. If we had somehow been able to read this text 30 years ago, I don’t know how we would even register that. It would be a big thing for a reader to get their head around without a lot of explaining. It’s a scary thing, the extent to which I don’t have to explain why [the characters] take that surveillance for granted. Everybody just gets it.

You’re considered a founder of the cyberpunk genre, which tends to feature digital cowboys — independent operators working on the frontiers of technology. Is the counterculture ethos of cyberpunk still relevant in an era when the best hackers seem to be working for the Chinese and U.S. governments, and our most famous digital outlaw, Edward Snowden, is under the protection of Vladimir Putin?

It’s seemed to me for quite a while now that the most viable use for the term “cyberpunk” is in describing artifacts of popular culture. You can say, “Did you see this movie? No? Well, it’s really cyberpunk.” Or, “Did you see the cyberpunk pants she was wearing last night?”

People know what you’re talking about, but it doesn’t work so well describing human roles in the world today. We’re more complicated. I think one of the things I did in my early fiction, more or less for effect, was to depict worlds where there didn’t really seem to be much government. In “Neuromancer,” for example, there’s no government really on the case of these rogue AI experiments that are being done by billionaires in orbit. If I had been depicting a world in which there were governments and law enforcement, I would have depicted hackers on both sides of the fence.

In “Neuromancer,” I don’t think there’s any evidence of anybody who has any parents. It’s kind of a very adolescent book that way.

In “The Peripheral,” governments are involved on both sides of the book’s central conflict. Is that a sign that you’ve matured as a writer? Or are you reflecting changes in how governments operate?

I hope it’s both. This book probably has, for whatever reason, more of my own, I guess I could now call it adult, understanding of how things work. Which, I suspect, is as it should be. People in this book live under governments, for better or worse, and have parents, for better or worse.

In 1993, you wrote an influential article about Singapore for Wired magazine, in which you wondered whether the arrival of new information technology would make the country more free, or whether Singapore would prove that “it is possible to flourish through the active repression of free expression.” With two decades of perspective, do you feel like this question has been answered?

Well, I don’t know, actually. The question was, when I asked it, naive. I may have posed innocently a false dichotomy, because some days when you’re looking out at the Internet both things are possible simultaneously, in the same place.

So what do you think is a better way to phrase that question today? Or what would have been a better way to phrase it in 1993?

I think you would end with something like “or is this just the new normal?”

Is there anything about “the new normal” in particular that surprises you? What about the Internet today would you have been least likely to foresee?

It’s incredible, the ubiquity. I definitely didn’t foresee the extent to which we would all be connected almost all of the time without needing to be plugged in.

That makes me think of “Neuromancer,” in which the characters are always having to track down a physical jack, which they then use to plug themselves into this hyper-futuristic Internet.

Yes. It’s funny, when the book was first published, when it was just out — and it was not a big deal the first little while it was out, it was just another paperback original — I went to a science fiction convention. There were guys there who were, by the standards of 1984, far more computer-literate than I was. And they very cheerfully told me that I got it completely wrong, and I knew nothing. They kept saying over and over, “There’s never going to be enough bandwidth, you don’t understand. This could never happen.”

So, you know, here I am, this many years later with this little tiny flat thing in my hand that’s got more bandwidth than those guys thought was possible for a personal device to ever have, and the book is still resonant for at least some new readers, even though it’s increasingly hung with the inevitable obsolescence of having been first published in 1984. Now it’s not really in the details, but in the broader outline.

You wrote “Neuromancer” on a 1927 Hermes typewriter. In an essay of yours from the mid-1990s, you specifically mention choosing not to use email. Does being a bit removed from digital culture help you critique it better? Or do you feel that you’re immersed in that culture, now?

I no longer have the luxury of being as removed from it as I was then. I was waiting for it to come to me. When I wrote [about staying off email], there was a learning curve involved in using email, a few years prior to the Web.

As soon as the Web arrived, I was there, because there was no learning curve. The interface had been civilized, and I’ve basically been there ever since. But I think I actually have a funny kind of advantage, in that I’m not generationally of [the Web]. Just being able to remember the world before it, some of the perspectives are quite interesting.

Drones and 3-D printing play major roles in “The Peripheral,” but social networks, for the most part, are obsolete in the book’s fictional future. How do you choose which technological trends to amplify in your writing, and which to ignore?

It’s mostly a matter of which ones I find most interesting at the time of writing. And the absence of social media in both those futures probably has more to do with my own lack of interest in that. It would mean a relatively enormous amount of work to incorporate social media into both those worlds, because it would all have to be invented and extrapolated.

Your three most recent novels, before “The Peripheral,” take place in some version of the present. You’re now returning to the future, which is where you started out as a writer in the 1980s. Futuristic sci-fi often feels more like cultural criticism of the present than an exercise in prediction. What is it about the future that helps us reflect on the contemporary world?

When I began to write science fiction, I already assumed that science fiction about the future is only ostensibly written about the future, that it’s really made of the present. Science fiction has wound up with a really good cultural toolkit — an unexpectedly good cultural toolkit — for taking apart the present and theorizing on how it works, in the guise of presenting an imagined future.

The three previous books were basically written to find out whether or not I could use the toolkit that I’d acquired writing fictions about imaginary futures on the present, but use it for more overtly naturalistic purposes. I have no idea at this point whether my next book will be set in an imaginary future or the contemporary present or the past.

Do you feel as if sci-fi has actually helped dictate the future? I was speaking with a friend earlier about this, and he phrased the question well: Did a book like “Neuromancer” predict the future, or did it establish a dress code for it? In other words, did it describe a future that people then tried to live out?

I think that the two halves of that are in some kind of symbiotic relationship with one another. Science fiction ostensibly tries to predict the future. And the people who wind up making the future sometimes did what they did because they read a piece of science fiction. “Dress code” is an interesting way to put it. It’s more like … it’s more like attitude, really. What will our attitude be toward the future when the future is the present? And that’s actually much more difficult to correctly predict than what sort of personal devices people will be carrying.

How do you think that attitude has changed since you started writing? Could you describe the attitude of our current moment?

The day the Apple Watch was launched, late in the day someone on Twitter announced that it was already over. They cited some subject, they linked to something, indicating that our moment of giddy future shock was now over. There’s just some sort of endless now, now.

Could you go into that a little bit more, what you mean by an “endless now”?

Fifty years ago, I think now was longer. I think that the cultural and individual concept of the present moment was a year, or two, or six months. It wasn’t measured in clicks. Concepts of the world and of the self couldn’t change as instantly or in some cases as constantly. And I think that has resulted in there being a now that’s so short that in a sense it’s as though it’s eternal. We’re just always in the moment.

And it takes something really horrible, like some terrible, gripping disaster, to lift us out of that, or some kind of extra-strong sense of outrage, which we know that we share with millions of other people. Unfortunately, those are the things that really perk us up. This is where we get perked up, perked up for longer than over a new iPhone, say.

The worlds that you imagine are enchanting, but they also tend to be pretty grim. Is it possible to write good sci-fi that doesn’t have some sort of dystopian edge?

I don’t know. It wouldn’t occur to me to try. The world today, considered in its totality, has a considerable dystopian edge. Perhaps that’s always been true.

I often work in a form of literature that is inherently fantastic. But at the same time that I’m doing that, I’ve always shared concerns with more naturalistic forms of writing. I generally try to make my characters emotionally realistic. I do now, at least; I can’t say I always have done that. And I want the imaginary world they live in and the imaginary problems that they have to reflect the real world, and to some extent real problems that real people are having.

It’s difficult for me to imagine a character in a work of contemporary fiction who wouldn’t have any concerns with the more dystopian elements of contemporary reality. I can imagine one, but she’d be a weird … she’d be a strange character. Maybe some kind of monster. Totally narcissistic.

What makes this character monstrous? The narcissism?

Well, yeah, someone sufficiently self-involved. It doesn’t require anything like the more clinical forms of narcissism. But someone who’s sufficiently self-involved as to just not be bothered with the big bad things that are happening in the world, or the bad things — regular-size bad things — that are happening to one’s neighbors. There certainly are people like that out there. The Internet is full of them. I see them every day.

You were raised in the South, and you live in Vancouver, but, like Philip K. Dick, you’ve set some of your most famous work in San Francisco. What is the appeal of the city for technological dreamers? And how does the Silicon Valley of today fit into that Bay Area ethos?

I’m very curious to go back to San Francisco while on tour for this book, because it’s been a few years since I’ve been there, and it was quite a few years before that when I wrote about San Francisco in my second series of books.

I think one of the reasons I chose it was that it was a place that I would get to fairly frequently, so it would stay fresh in memory, but it also seemed kind of out of the loop. It was kind of an easy canvas for me, an easier canvas to set a future in than Los Angeles. It seemed to have fewer moving parts. And that’s obviously no longer the case, but I really know contemporary San Francisco now more by word of mouth than I do from first-person experience. I really think it sounds like a genuinely new iteration of San Francisco.

Do you think that Google and Facebook and this Silicon Valley culture are the heirs to the Internet that you so presciently imagined in the 1980s? Or do they feel like they’ve taken the Web in different directions than what you expected?

Generally it went in directions that didn’t occur to me. It seems to me now that if I had been a very different kind of novelist, I would have been more likely to foresee something like Facebook. But you know, if you try to imagine that somebody in 1982 wrote this novel that totally and accurately predicted what it would be like to be on Facebook, and then tried to get it published? I don’t know if you would be able to get it published. Because how exciting is that, or what kind of crime story could you set there?

Without even knowing it, I was limited by the kind of fiction of the imaginary future that I was trying to write. I could use detective gangster stories, and there is a real world of the Internet that’s like that, you know? Very much like that. Although the crimes are so different. The ace Russian hacker mobs are not necessarily crashing into the global corporations. They’re stealing your Home Depot information. If I’d put that as an exploit in “Neuromancer,” nobody would have gotten it. Although it would have made me seem very, very prescient.

You’ve written often and eloquently about cults of celebrity and the surrealness of fame. By this point you’re pretty famous yourself. Has writing about fame changed the way you experience it? Does experiencing fame change the way you write about it?

Writers in our society, even today, have a fairly homeopathic level of celebrity compared to actors and really popular musicians, or Kardashians. I think in [my 1993 novel] “Virtual Light,” I sort of predicted Kardashian. Or there’s an implied celebrity industry in that book that’s very much like that. You become famous just for being famous. And you can keep it rolling.

But writers, not so much. Writers get just a little bit of it on a day-to-day basis. Writers are in an interesting place in our society to observe how that works, because we can be sort of famous, but not really famous. Partly I’d written about fame because I’d seen little bits of it, but the bigger reason is the extent to which it seems that celebrity is the essential postmodern product, and the essential post-industrial product. The so-called developed world pioneered it. So it’s sort of inherently in my ballpark. It would be weird if it wasn’t there.

You have this reputation of being something of a Cassandra. I don’t want to put you on the spot and ask for predictions. But I’m curious: For people who are trying to understand technological trends, and social trends, where do you recommend they look? What should they be observing?

I think the best advice I’ve ever heard on that was from Samuel R. Delany, the great American writer. He said, “If you want to know how something works, look at one that’s broken.” I encountered that remark of his before I began writing, and it’s one of my fridge magnets for writing.

Anything I make, and anything I’m describing in terms of its workings — even if I were a non-literary futuristic writer of some kind — I think that statement would be very resonant for me. Looking at the broken ones will tell you more about what the thing actually does than looking at one that’s perfectly functioning, because then you’re only seeing the surface, and you’re only seeing what its makers want you to see. If you want to understand social media, look at troubled social media. Or maybe failed social media, things like that.

Do you think that’s partly why so much science fiction is crime fiction, too?

Yeah, it might be. Crime fiction gives the author the excuse to have a protagonist who gets her nose into everything and goes where she’s not supposed to go and asks questions that will generate answers that the author wants the reader to see. It’s a handy combination. Detective fiction is in large part related to literary naturalism, and literary naturalism was quite a radical concept that posed that you could use the novel to explore existing elements of society which had previously been forbidden, like the distribution of capital and class, and what sex really was. Those were all naturalistic concerns. They also yielded to detective fiction. Detective fiction and science fiction are an ideal cocktail, in my opinion.

Why “Psychological Androgyny” Is Essential for Creativity


“Creative individuals are more likely to have not only the strengths of their own gender but those of the other one, too.”

Despite the immense canon of research on creativity — including its four stages, the cognitive science of the ideal creative routine, the role of memory, and the relationship between creativity and mental illness — very little has focused on one of life’s few givens that equally few of us can escape: gender and the genderedness of the mind.

In Creativity: The Psychology of Discovery and Invention (public library) — one of the most important, insightful, and influential books on creativity ever written — pioneering psychologist Mihaly Csikszentmihalyi examines a curious, under-appreciated yet crucial aspect of the creative mindset: a predisposition to psychological androgyny.

In all cultures, men are brought up to be “masculine” and to disregard and repress those aspects of their temperament that the culture regards as “feminine,” whereas women are expected to do the opposite. Creative individuals to a certain extent escape this rigid gender role stereotyping. When tests of masculinity/femininity are given to young people, over and over one finds that creative and talented girls are more dominant and tough than other girls, and creative boys are more sensitive and less aggressive than their male peers.

Illustration by Yang Liu from ‘Man Meets Woman,’ a pictogram critique of gender stereotypes.

Csikszentmihalyi points out that this psychological tendency toward androgyny shouldn’t be confused with homosexuality — it deals not with sexual constitution but with a set of psychoemotional capacities:

Psychological androgyny is a much wider concept, referring to a person’s ability to be at the same time aggressive and nurturant, sensitive and rigid, dominant and submissive, regardless of gender. A psychologically androgynous person in effect doubles his or her repertoire of responses and can interact with the world in terms of a much richer and varied spectrum of opportunities. It is not surprising that creative individuals are more likely to have not only the strengths of their own gender but those of the other one, too.

Citing his team’s extensive interviews with 91 individuals who scored high on creativity in various fields — including pioneering astronomer Vera Rubin, legendary sociobiologist E.O. Wilson, philosopher and marginalia champion Mortimer Adler, universe-disturber Madeleine L’Engle, social science titan John Gardner, poet extraordinaire Denise Levertov, and MacArthur genius Stephen Jay Gould — Csikszentmihalyi writes:

It was obvious that the women artists and scientists tended to be much more assertive, self-confident, and openly aggressive than women are generally brought up to be in our society. Perhaps the most noticeable evidence for the “femininity” of the men in the sample was their great preoccupation with their family and their sensitivity to subtle aspects of the environment that other men are inclined to dismiss as unimportant. But despite having these traits that are not usual to their gender, they retained the usual gender-specific traits as well.

Illustration from the 1970 satirical book ‘I’m Glad I’m a Boy! I’m Glad I’m a Girl!’ Click image for more.

Creativity: The Psychology of Discovery and Invention is a revelatory read in its entirety, featuring insights on the ideal conditions for the creative process, the key characteristics of the innovative mindset, how aging influences creativity, and invaluable advice to the young from Csikszentmihalyi’s roster of 91 creative luminaries. Complement this particular excerpt with Ursula K. Le Guin on being a man — arguably the most brilliant meditation on gender ever written, by one of the most exuberantly creative minds of our time.

Aldous Huxley on Drugs, Democracy, and Religion


“Generalized intelligence and mental alertness are the most powerful enemies of dictatorship and at the same time the basic conditions of effective democracy.”

In 1958, five years after his transcendent experience induced by taking four-tenths of a gram of mescalin, Aldous Huxley — legendary author of Brave New World, lesser-known but no less compelling writer of children’s books, modern prophet — penned an essay titled “Drugs That Shape Men’s Minds.” It was originally published in the Saturday Evening Post and eventually included in Moksha: Aldous Huxley’s Classic Writings on Psychedelics and the Visionary Experience (public library) — a selection of Huxley’s fiction, essays, and letters titled after the Sanskrit word for “liberation.” In the essay, Huxley considers the gifts and limitations of our wakeful consciousness, our universal quest for transcendence, and the interplay of drugs and democracy.

Huxley begins by considering religion as nothing more nor less than an attempt to codify through symbolism our longing for what Jack Kerouac called “the golden eternity” and what Alan Lightman described in his encounter with the ospreys — a sense of intimate connection with the universe, with something larger than ourselves:

Every fully developed religion exists simultaneously on several different levels. It exists as a set of abstract concepts about the world and its governance. It exists as a set of rites and sacraments, as a traditional method for manipulating the symbols, by means of which beliefs about the cosmic order are expressed. It exists as the feelings of love, fear and devotion evoked by this manipulation of symbols.

And finally it exists as a special kind of feeling or intuition — a sense of the oneness of all things in their divine principle, a realization (to use the language of Hindu theology) that “thou art That,” a mystical experience of what seems self-evidently to be union with God.

The ordinary waking consciousness is a very useful and, on most occasions, an indispensable state of mind; but it is by no means the only form of consciousness, nor in all circumstances the best. Insofar as he transcends his ordinary self and his ordinary mode of awareness, the mystic is able to enlarge his vision, to look more deeply into the unfathomable miracle of existence.

The mystical experience is doubly valuable; it is valuable because it gives the experiencer a better understanding of himself and the world and because it may help him to lead a less self-centered and more creative life.

Echoing Mark Twain’s lament about religion and human egotism, Huxley examines the self-consciousness at the heart of worship:

We love ourselves to the point of idolatry; but we also intensely dislike ourselves — we find ourselves unutterably boring. Correlated with this distaste for the idolatrously worshipped self, there is in all of us a desire, sometimes latent, sometimes conscious and passionately expressed, to escape from the prison of our individuality, an urge to self-transcendence. It is to this urge that we owe mystical theology, spiritual exercises and yoga — to this, too, that we owe alcoholism and drug addiction.

Huxley then turns to how drugs have attempted to address this human urge and the interplay of those attempts with religion:

Modern pharmacology has given us a host of new synthetics, but in the field of the naturally occurring mind changers it has made no radical discoveries. All the botanical sedatives, stimulants, vision revealers, happiness promoters and cosmic-consciousness arousers were found out thousands of years ago, before the dawn of history.

In many societies at many levels of civilization attempts have been made to fuse drug intoxication with God-intoxication. In ancient Greece, for example, ethyl alcohol had its place in the established religion. Dionysus, or Bacchus, as he was often called, was a true divinity. His worshipers addressed him as Lusios, “Liberator,” or as Theoinos, “Godwine.” The latter name telescopes fermented grape juice and the supernatural into a single pentecostal experience. . . . Unfortunately they also receive harm. The blissful experience of self-transcendence which alcohol makes possible has to be paid for, and the price is exorbitantly high.

Huxley argues that while the intuitive solution seems to be to enforce complete prohibition of mind-altering substances, this tends to backfire and “create more evils than it cures,” while also warning against the diametric opposite of this black-and-white approach, the “complete toleration and unrestricted availability” of drugs. Peering into the future of biochemistry and pharmacology, he foresees the development of “powerful but nearly harmless drugs,” but also notes that even if these were invented, they’d raise important questions about use and abuse, about whether their availability would make human beings ultimately happier or more miserable. He finds reason for concern in medicine’s history of overprescription of new drugs and writes:

The history of medical fashions, it may be remarked, is at least as grotesque as the history of fashions in women’s hats — at least as grotesque and, since human lives are at stake, considerably more tragic. In the present case, millions of patients who had no real need of the tranquilizers have been given the pills by their doctors and have learned to resort to them in every predicament, however triflingly uncomfortable. This is very bad medicine and, from the pill taker’s point of view, dubious morality and poor sense.

He then turns to how these psychopharmacological tendencies might be exploited in a political context:

The dictatorships of tomorrow will deprive men of their freedom, but will give them in exchange a happiness none the less real, as a subjective experience, for being chemically induced. The pursuit of happiness is one of the traditional rights of man; unfortunately, the achievement of happiness may turn out to be incompatible with another of man’s rights — namely, liberty.

Wondering whether it would even be possible to “produce superior individuals by biochemical means,” Huxley points to an experiment Soviet researchers embarked upon in 1956, a five-year plan to develop “pharmacological substances that normalize higher nervous activity and heighten human capacity for work” — in other words, psychic energizers. Rather ironically, given the subsequent geopolitical history of despots from Putin to Yanukovych, Huxley considers the fruits of these experiments an assurance against despotism:

Let us all fervently wish the Russians every success in their current pharmacological venture. The discovery of a drug capable of increasing the average individual’s psychic energy, and its wide distribution throughout the U.S.S.R., would probably mean the end of Russia’s present form of government. Generalized intelligence and mental alertness are the most powerful enemies of dictatorship and at the same time the basic conditions of effective democracy. Even in the democratic West we could do with a bit of psychic energizing. Between them, education and pharmacology may do something to offset the effects of that deterioration of our biological material to which geneticists have frequently called attention.

Huxley ties this back to religion and the parallel artificiality of the transcendent experience:

Those who are offended by the idea that the swallowing of a pill may contribute to a genuinely religious experience should remember that all the standard mortifications — fasting, voluntary sleeplessness and self-torture — inflicted upon themselves by the ascetics of every religion for the purpose of acquiring merit, are also, like the mind-changing drugs, powerful devices for altering the chemistry of the body in general and the nervous system in particular. Or consider the procedures generally known as spiritual exercises. The breathing techniques taught by the yogi of India result in prolonged suspensions of respiration. These in turn result in an increased concentration of carbon dioxide in the blood; and the psychological consequence of this is a change in the quality of consciousness. Again, meditations involving long, intense concentration upon a single idea or image may also result — for neurological reasons which I do not profess to understand — in a slowing down of respiration and even in prolonged suspensions of breathing.

(Coincidentally, scientists have just begun to shed light on why meditators hallucinate — Huxley was, once more, ahead of his time.)

He concludes by reminding us of the deeper spiritual and psychoemotional roots of both drug-induced and religious experiences:

That men and women can, by physical and chemical means, transcend themselves in a genuinely spiritual way is something which, to the squeamish idealist, seems rather shocking. But, after all, the drug or the physical exercise is not the cause of the spiritual experience; it is only its occasion.

He once again peers into the future:

For most people, religion has always been a matter of traditional symbols and of their own emotional, intellectual and ethical response to those symbols. To men and women who have had direct experience of self-transcendence into the mind’s Other World of vision and union with the nature of things, a religion of mere symbols is not likely to be very satisfying. The perusal of a page from even the most beautifully written cookbook is no substitute for the eating of dinner. We are exhorted to “taste and see that the Lord is good.”


My own belief is that, though they may start by being something of an embarrassment, these new mind changers will tend in the long run to deepen the spiritual life of the communities in which they are available. . . . From being an activity mainly concerned with symbols, religion will be transformed into an activity concerned mainly with experience and intuition — an everyday mysticism underlying and giving significance to everyday rationality, everyday tasks and duties, everyday human relationships.

Whether one considers Huxley a madman or a prophet-genius, Moksha is a fascinating read and an unusual, dimensional lens on the human longing for transcendence. For a wholly different side of Huxley, see his only children’s book.

Cosmigraphics: Picturing Space Through Time in 4,000 Years of Mapping the Universe


A visual catalog of our quintessential quest to understand the cosmos and our place in it.

Long before Galileo pioneered the telescope, antagonizing the church and unleashing a “hummingbird effect” of innovation, humanity had been busy cataloging the heavens through millennia of imaginative speculative maps of the cosmos. We have always sought to make visible the invisible forces we long to understand, the mercy and miracle of existence, and nothing beckons to us with more intense allure than the majesty and mystery of the universe.

Four millennia of that mesmerism-made-visible is what journalist, photographer, and astrovisualization scholar Michael Benson explores with great dedication and discernment in Cosmigraphics: Picturing Space Through Time (public library) — a pictorial catalog of our quest to order the cosmos and grasp our place in it, a sensemaking process defined by what Benson aptly calls our “gradually dawning, forever incomplete situational awareness.” From glorious paintings of the creation myth predating William Blake’s work by centuries to the pioneering galaxy drawing that inspired Van Gogh’s Starry Night to NASA’s maps of the Apollo 11 landing site, the images remind us that the cosmos — like Whitman, like ourselves — is vast and contains multitudes. This masterwork of scholarship also attests, ever so gently, ever so powerfully, to the value of the “ungoogleable” — a considerable portion of Benson’s bewitching images comes from the vaults of the world’s great science libraries and archives, bringing to light a wealth of previously unseen treasures.

Illustration from Henry Russell’s 1892 treatise ‘Observations of the Transit of Venus.’ Courtesy of The Royal Society

The book’s title is an allusion to Italo Calvino’s beloved Cosmicomics, a passage from which Benson deploys as the epigraph:

In the universe now there was no longer a container and a thing contained, but only a general thickness of signs, superimposed and coagulated, occupying the whole volume of space; it was constantly being dotted, minutely, a network of lines and scratches and reliefs and engravings; the universe was scrawled over on all sides, along all its dimensions. There was no longer any way to establish a point of reference; the Galaxy went on turning but I could no longer count the revolutions, any point could be the point of departure, any sign heaped up with the others could be mine, but discovering it would have served no purpose, because it was clear that, independent of signs, space didn’t exist and perhaps had never existed.

Long before the notion of vacuum existed in cosmology, English physician and cosmologist Robert Fludd captured the concept of non-space in his 1617 creation series, which depicts multiple chaotic fires subsiding until a central starlike structure becomes visible amid concentric rings of smoke and debris. Even though Fludd believed in a geocentric cosmology, this image comes strikingly close to current theories of solar system formation. Courtesy of U. of Oklahoma History of Science collections

Paintings of Saturn by German astronomer-artist Maria Clara Eimmart, a pioneering woman in science, from 1693–1698. Eimmart’s depictions are based on a 1659 engraving by Dutch astronomer Christiaan Huygens, the first to confirm that Saturn’s mysterious appendages, which had confounded astronomers since Galileo, were in fact ‘a thin flat ring, nowhere touching.’ What makes Eimmart’s painting unique is that it combines the observations of more than ten astronomers into a depiction of superior accuracy. Dipartimento di Fisica e Astronomia, Universita di Bologna

In 1845, Anglo-Irish astronomer William Parsons, the 3rd Earl of Rosse, equipped his castle with a giant six-ton telescope, soon nicknamed the ‘Leviathan,’ which remained the largest telescope in the world until 1918. Despite the cloudy Irish skies, Lord Rosse managed to glimpse and draw the spellbinding spiral structures of what were thought to be nebulae within the Milky Way. This print, based on Lord Rosse’s drawing of one such nebula — M51, known today as the Whirlpool Galaxy — became a sensation throughout Europe and inspired Van Gogh’s iconic ‘The Starry Night.’ Courtesy of the Wolbach Library, Harvard

The project, which does for space what Cartographies of Time did for the invisible dimension, also celebrates the natural marriage of art and science. These early astronomers were often spectacular draughtsmen as well — take, for instance, Johannes Hevelius and his groundbreaking catalog of stars. As Benson points out, art and science were “essentially fused” until about the 17th century and many of the creators of the images in the book were also well-versed in optics, anatomy, and the natural sciences.

A 1573 painting by Portuguese artist, historian, and philosopher Francisco de Holanda, a student of Michelangelo’s, envisions the creation of the Ptolemaic universe by an omnipotent creator. Courtesy of Biblioteca Nacional de España

De Holanda was fascinated by the geometry of the cosmos, particularly the triangular form and its interplay with the circle. Courtesy of Biblioteca Nacional de España

This cryptic and unsettling ‘Fool’s Cap Map of the World’ (1580–1590), made by an unknown artist, appropriates French mathematician and cartographer Oronce Finé’s cordiform, or heart-shaped, projection of the Earth; the world in this iconic image is dressed in a jester’s belled cap, beneath which a Latin inscription from Ecclesiastes reads: ‘The number of fools is infinite.’ Public domain via Wikimedia

The book is, above all, a kind of conceptual fossil record of how our understanding of the universe evolved, visualizing through breathtaking art the “fits and starts of ignorance” by which science progresses — many of the astronomers behind these enchanting images weren’t “scientists” in the modern sense but instead dabbled in alchemy, astrology, and various rites driven by religion and superstition. (For instance, Isaac Newton, often celebrated as the greatest scientist of all time, spent a considerable amount of his youth self-flagellating over his sins, and trying to discover “The Philosopher’s Stone,” a mythic substance believed to transmute ordinary metals into gold. And one of the gorgeous images in Benson’s catalog comes from a 1907 children’s astronomy book I happen to own, titled The Book of Stars for Young People, the final pages of which have always struck me with their counterblast: “Far out in space lies this island of a system, and beyond the gulfs of space are other suns, with other systems: some may be akin to ours and some quite different… The whole implies design, creation, and the working of a mighty intelligence; and yet there are small, weak creatures here on this little globe who refuse to believe in God.”)

A 1493 woodcut by German physician and cartographer Hartmann Schedel, depicting the seventh day, or Sabbath, when God rested. Courtesy of the Huntington Library

The Nebra Sky Disc (2000–1600 B.C.), excavated illegally in Germany in 1999, is considered to be both humanity’s first-known portable astronomical instrument and the oldest-known visual depiction of celestial objects. Public domain via Wikimedia

One of the phases of the moon from Selenographia, the world’s first lunar atlas, completed by German-Polish astronomer Johannes Hevelius in 1647 after years of obsessive observations. Hevelius also created history’s first true moon map. Courtesy of the Wolbach Library, Harvard

Beginning in 1870, French-born artist and astronomer Étienne Trouvelot spent a decade producing a series of spectacular illustrations of celestial bodies and cosmic phenomena. In 1872, he joined the Harvard College Observatory and began using its powerful telescopes in perfecting his drawings. His pastel illustrations, including this chromolithograph of Mare Humorum, a vast impact basin on the southwest side of the Earth-facing hemisphere of the moon, were among the first serious attempts to enlist art in popularizing the results of observations using technology developed for scientific research. Courtesy of the U. of Michigan Library

Étienne Trouvelot’s 1873 engravings of solar phenomena, produced during his first year at the Harvard College Observatory for the institution’s journal. The legend at the bottom reveals that the distance between the two prominences in the lower part of the engraving is one hundred thousand miles, more than 12 times the diameter of Earth. Despite the journal’s modest circulation, such engravings were soon co-opted by more mainstream publications and became trailblazing tools of science communication that greatly influenced public understanding of the universe’s scale. Courtesy of the Wolbach Library, Harvard

What makes Benson’s project especially enchanting is the strange duality it straddles: On the one hand, the longing to make tangible and visible the complex forces that rule our existence is a deeply human one; on the other, the notion of simplifying such expansive complexities into static images seems paradoxical to a dangerous degree — something best captured by pioneering astronomer Maria Mitchell when she marveled: “The world of learning is so broad, and the human soul is so limited in power! We reach forth and strain every nerve, but we seize only a bit of the curtain that hides the infinite from us.”

Unable to seize the infinite, are we fooling ourselves by trying to reduce it into a seizable visual representation? At what point do we, like Calvino’s protagonist, begin to mistake the presence or absence of “signs” for the presence or absence of space itself? It calls to mind Susan Sontag’s concern about how photography’s “aesthetic consumerism” endangers the real experience of life, which the great physicist Werner Heisenberg channeled decades earlier in a remark that exposes the dark side of visualizing the universe:

Contemporary thought is endangered by the picture of nature drawn by science. This danger lies in the fact that the picture is now regarded as an exhaustive account of nature itself so that science forgets that in its study of nature it is studying its own picture.

Plate from Thomas Wright’s 1750 treatise ‘An Original Theory,’ depicting Wright’s trailblazing notion that the universe is composed of multiple galaxies. Courtesy of the Wolbach Library, Harvard

And yet awe, the only appropriate response to the cosmos, is a visceral feeling by nature and thus has no choice but to engage our “aesthetic consumerism” — which is why the yearning at the heart of Benson’s project is a profoundly human one. He turns to the words of the pioneering English astronomer and mathematician Thomas Wright, whose 1750 book An Original Theory or New Hypothesis of the Universe Benson considers “one of the best case studies of scientific reasoning through image.” Wright marvels:

What inconceivable vastness and magnificence of power does such a frame unfold! Suns crowding upon Suns, to our weak sense, indefinitely distant from each other; and myriads of myriads of mansions, like our own, peopling infinity, all subject to the same Creator’s will; a universe of worlds, all decked with mountains, lakes, and seas, herbs, animals, and rivers, rocks, caves, and trees… Now, thanks to the sciences, the scene begins to open to us on all sides, and truths scarce to have been dreamt of before persons of observation had proved them possible, invade our senses with a subject too deep for the human understanding, and where our very reason is lost in infinite wonders.

Illuminated solar eclipse prediction tables by German miniaturist Joachinus de Gigantibus, from the 1478 scientific treatise ‘Astronomia’ by Tuscan-Neapolitan humanist Christianus Prolianus. Courtesy of Rylands Medieval Collection, U. of Manchester

NASA’s 1979 geological map of the south polar region of the moon, part of the U.S. Geological Survey. Courtesy of USGS/NASA

Illustration from G. E. Mitton’s ‘The Book of Stars for Young People,’ 1907. Courtesy of AAVSO

Artist-astronomer Étienne Trouvelot’s drawing of the total solar eclipse of July 29, 1878, in Wyoming. Courtesy of the Public Library of Cincinnati and Hamilton County

Cosmigraphics is a treasure trove in its entirety. Complement it with a tour of parallel facets of humanity’s visual imagination, Umberto Eco’s atlas of legendary lands and Manuel Lima’s visual history of tree-like diagrams, then revisit the little-known story of how Galileo influenced Shakespeare and this lovely children’s book about space exploration.

Mark Twain’s democratic ideal: How truth, laughter defeat “sweet-smelling, sugar-coated lies”

“A man at play with freedoms of his mind, believing allegiance to the truth and not the flag rescues democracy”

Mark Twain

Mark Twain undertook the project of an autobiography in 1870 at the age of thirty-five, still a young man but already established as the famous author of “Innocents Abroad” and confident that he could navigate the current of his life by drawing upon the lessons learned thirteen years earlier as a steamboat pilot on the Mississippi River. As he noted in “Life on the Mississippi,”

There is one faculty which a pilot must incessantly cultivate until he has brought it to absolute perfection. . . . That faculty is memory. He cannot stop with merely thinking a thing is so-and-so, he must know it. . . . One cannot easily realize what a tremendous thing it is to know every trivial detail of twelve hundred miles of river and know it with absolute exactness.

Twain expected the going to be as easy as a straight stretch of deep water under the light of a noonday sun. It wasn’t. His memory was too close to absolute perfection, and he soon ran across snags and shoals unlike the ones to which he was accustomed south of Memphis and north of Vicksburg, an embarrassment he admitted in 1899 to a reporter from the London Times: “You cannot lay bare your private soul and look at it. You are too much ashamed of yourself. It is too disgusting.”

But Twain doesn’t abandon his attempt at autobiography, because the longer he stays the course—for thirty-four years and through as many drafts of misbegotten manuscript while also writing nine other books, among them Huckleberry Finn—the more clearly he comes to see that what he intends is not the examination of an inner child or the confessions of a cloistered id. His topic is of a match with that of the volume here in hand—America and the Americans making their nineteenth-century passage from an agrarian democracy to an industrial oligarchy, to Twain’s mind a great and tragic tale, and one that no other writer of his generation was better positioned to tell because none had seen the country at so many of its compass points or become as acquainted with so many of its oddly assorted inhabitants.

Born in Missouri in 1835 on the frontier of what was still Indian territory, Twain as a boy of ten had seen the flogging and lynching of Negro slaves, had been present in his twenties not only at the wheel of the steamboats Pennsylvania and Alonzo Child but also at the pithead of the Comstock Lode when in 1861 he joined the going westward to the Nevada silver mines and the California goldfields, there to keep company with underage murderers and overage whores. In San Francisco he writes newspaper sketches and satires, becomes known as “The Wild Humorist of the Pacific Slope” who tells funny stories to the dancing girls and gamblers in the city’s waterfront saloons.

Back east in the 1870s, Twain settles in Hartford, Connecticut, an eminent man of letters and property, and for the next thirty years, oracle for all occasions and star attraction on the national and international lecture stage, his wit and wisdom everywhere a wonder to behold—at banquet tables with presidents Ulysses S. Grant and Theodore Roosevelt, in New York City’s Tammany Hall with swindling politicians and thieving financiers, on the program at the Boston Lyceum with Oliver Wendell Holmes, Horace Greeley, Petroleum V. Nasby, and Ralph Waldo Emerson. He also traveled forty-nine times across the Atlantic, and once across the Indian Ocean, as a dutiful tourist surveying the sights in Rome, Paris, and the Holy Land; as an itinerant sage entertaining crowds in Australia and Ceylon.

Laughter was Twain’s stock-in-trade, and he produced it in sufficient quantity to make bearable the acquaintance with grief he knew to be generously distributed among all present in a Newport drawing room or a Nevada brothel. Whether the audience was drunk or sober, swaddled in fur or armed with pistols, Twain recognized it as likely in need of comic relief. “The hard and sordid things in life,” he once said, “are too hard and too sordid and too cruel for us to know and touch them year after year without some mitigating influence.” He bottled the influence under whatever label drummed up a crowd—burlesque, satire, parody, sarcasm, ridicule—any or all of it guaranteed to fortify the blood and restore the spirit.

Twain coined the phrase the “Gilded Age” as a pejorative, to mark the calamity that was the collision of the democratic ideal with the democratic reality, the promise of a free, forbearing, and tolerant society run aground on the reef of destruction formed by the accruals of vanity and greed that Twain understood to be not a society at all but a state of war. The ostrich feathers and the mirrored glass, he associated with the epithet citified, “suggesting the absence of all spirituality, and the presence of all kinds of paltry materialisms and mean ideals and mean vanities and silly cynicisms.” His struggling with his own paltry materialisms further delayed the composition of the autobiography. For thirty-four years he couldn’t get out of his own way, kept trying to find a language worthy of a monument, to dress up the many manuscripts in literary velvets and brocades.

Eventually faced with the approaching sandbar of his death, he puts aside his pen and ink and elects to dictate, not write, what he construes as his “bequest to posterity.” He begins the experiment in 1904 in Florence, where he has rented a handsome villa in which to care for his cherished, dying wife. To William Dean Howells, close friend and trusted editor, he writes to say, “I’ve struck it!” a method that removes all traces of a style that is “too prim, too nice,” too slow and artificial in its movement for the telling of a story.

“Narrative,” he had said at the outset of his labors,

should flow as flows the brook down through the hills and the leafy woodlands . . . a brook that never goes straight for a minute, but goes, and goes briskly, and sometimes ungrammatically, and sometimes fetching a horseshoe three-quarters of a mile around and at the end of the circuit flowing within the yard of the path it traversed an hour before; but always going, and always following at least one law, always loyal to that law, the law of narrative, which has no law. Nothing to do but make the trip; the how of it is not important so that the trip is made. 

Twain’s wife does not survive her season in the Italian sun, and at the age of seventy-one soon after his return to America, he casts himself adrift on the flood tide of his memory, dictating at discursive length to a series of stenographers while “propped up against great snowy white pillows” in a Fifth Avenue town house three blocks north of Washington Square. He delivers the deposition over a period of nearly four years, from the winter of 1906 until a few months before his death in the spring of 1910, here and there introducing into the record miscellaneous exhibits—previously published speeches, anecdotes and sketches, newspaper clippings, brief biographies, letters, philosophical digressions, and theatrical asides.

The autobiography he offers as an omnium-gatherum, its author reserving the right to digress at will, talk only about whatever interests him at the moment, “drop it at the moment its interest threatens to pale.” He leaves the reader free to adopt the same approach, to come across Twain at a meeting of the Hartford Monday Evening Club in 1884 (at which the subject of discussion is the price of cigars and the befriending of cats) and to skip over as many pages as necessary to find Twain in Honolulu in 1866 with the survivors of forty-three days at sea in an open boat, or discover him in Calcutta in 1896 in the company of Mary Wilson, “old and gray-haired, but . . . very handsome,” a woman whom he had much admired in her prior incarnation as a young woman in 1849 in Hannibal, Missouri:

We sat down and talked. We steeped our thirsty souls in the reviving wine of the past, the pathetic past, the beautiful past, the dear and lamented past; we uttered the names that had been silent upon our lips for fifty years, and it was as if they were made of music; with reverent hands we unburied our dead, the mates of our youth, and caressed them with our speech; we searched the dusty chambers of our memories and dragged forth incident after incident, episode after episode, folly after folly, and laughed such good laughs over them, with the tears running down. 

The turn of Twain’s mind is democratic. He holds his fellow citizens in generous and affectionate regard not because they are rich or beautiful or famous but because they are his fellow citizens. His dictations he employs as “a form and method whereby the past and the present are constantly brought face-to-face resulting in contrasts which newly fire up the interest all along like contact of flint with steel.” Something seen in Paris in 1894 reminds him of something else seen in Virginia City in 1864; an impression of the first time he saw Florence in 1892 sends him back to St. Louis in 1845.

The intelligence that is wide-wandering, intuitive, and sympathetic is also, in the parsing of it by Bernard De Voto, the historian and editor of Twain’s papers, “undeluded, merciless and final.” His comedy drifts toward the darker shore of tragedy as he grows older and loses much of his liking for what he comes to regard as “the damned human race,” his judgment rendered “in the absence of respect-worthy evidence that the human being has morals.”

Twain doesn’t absent himself from the company of the damned. He knows himself made, like all other men, as “a poor, cheap, wormy thing . . . a sarcasm, the Creator’s prime miscarriage in invention, the moral inferior of all the animals . . . the superior to them all in one gift only, and that one not up to his estimation of it—intellect.” The steamboat pilot’s delight in that one gift holds fast only to the end of his trick at the wheel of his life. Mankind as a species he writes off as a miscarriage in invention, but he makes exceptions—a very great many exceptions—for the men, women, and children (usually together with any and all of their uncles, nieces, grandmothers, and cousins) whom he has come to know and hold dear over the course of his travels. The autobiography is crowded with their portraits sketched in an always loving few sentences or a handsome turn of phrase. Humor is still “the great thing, the saving thing after all,” but as the gilded spirit of the age becomes everywhere more oppressive under the late-nineteenth-century chandeliers, Twain pits the force of his merciless and undeluded wit against “the peacock shams” of the world’s “colossal humbug.” He doesn’t traffic in the mockery of a cynic or the bitterness of the misanthrope. Nor does he expect his ridicule to correct the conduct of Boss Tweed, improve the morals of Commodore Vanderbilt, or stop the same-day deliveries of the politicians to the banks.

His purpose is therapeutic. A man at play with the freedoms of his mind, believing that it is allegiance to the truth and not the flag that rescues the citizens of a democracy from the prisons of their selfishness and greed, Twain aims to blow away with a blast of laughter the pestilent hospitality tents of a society making itself sick with its consumption of “sweet-smelling, sugar-coated lies.” He offers in their stead the reviving wine of the dear lamented past, and his autobiography stands, as does his presence in this book, as the story of an observant pilgrim heaving the weighted lead of his comprehensive and comprehending memory into the flow and stream of time.

Excerpted from “Mark Twain’s America” by Harry Katz and the Library of Congress. Published in October 2014 by Little, Brown and Company. Copyright © 2014 by Harry Katz and the Library of Congress. All rights reserved.


Lewis H. Lapham is editor of Lapham’s Quarterly. Formerly editor of Harper’s Magazine, he is the author of numerous books, including Money and Class in America, Theater of War, Gag Rule, and, most recently, Pretensions to Empire. The New York Times has likened him to H.L. Mencken; Vanity Fair has suggested a strong resemblance to Mark Twain; and Tom Wolfe has compared him to Montaigne. This essay, shortened for TomDispatch, introduces “Magic Shows,” the Summer 2012 issue of Lapham’s Quarterly.