A life-altering adjective

It was your typical July afternoon in Kentucky, 2003 edition. My cohort at the Governor’s Scholars program – which at the time was held at three different colleges across the state for rising high school seniors – had met in the lobby of one of the dormitories at Centre College in Danville, and now we were off to the basement. I was barely an hour’s drive from home, but it felt like being in Princeton, New Jersey (where I had spent the previous summer) again, such was the strangeness of being alone with a bunch of other teenagers away from my family. This was a time when the program could prohibit students from bringing “computers” (read: desktop PCs) and have that be an effective way of isolating them from the outside. I had a cellphone that looked like a candy bar.

Anyway, there was an ice-breaking exercise – it was about alliteration (see what I did there?). As a way of telling everyone else our name, we had to pair it with an alliterative adjective (e.g., Sagely Susan, or Miraculous Mary). I seemed to settle out of nowhere on “avant-garde” – questioning it briefly since it was compound-hyphenated, but speaking it all the same through the basement air. I think someone smiled.  This choice of adjective has made an unfathomable difference in my life.

Later the same day or maybe the next (this was 12 years ago), I sat by myself at a table in the dining commons. Another student came over to me and introduced himself with "avant-garde?" It was memorable, I'll grant – the longest and Frenchiest of all the adjectives. Maybe he was the one who had smiled. Anyway, the word was in this way a double ice-breaker, and we got on to fuller conversation, which for a pair of almost-17-year-olds at a summer camp involved where we intended to go to college.

I hadn't thought of it much before that meeting. Maybe I would have just gone somewhere in state as the de facto option, had this meeting not happened. I told him I had spent a summer at Princeton, which he wrote off as a place that "didn't know how to have fun" (how, as a 16-year-old, he knew this I still don't know). Then he began talking more positively about "Brown," which I knew at the time mainly as one of the losingest college basketball teams ever, as per an ESPN infographic. My interest was piqued.

A few nights after that, I was using that aforementioned cellphone to talk to my parents, and I brought up some of the colleges this guy had mentioned. Maybe his perspective had been shaped by his experiences as a football player being recruited – he played for Owensboro Catholic – but the academic allure of the Ivy League schools, in a time before the Internet was really as pervasive as it is now, was having its own effect on me. I remember pushing so hard for it despite suggestions about just staying in state.

The fall of 2003 was accordingly a hectic one, with drawn-out application processes and interviews. I mentioned my friend from the dining commons during my Brown interview and the interviewer was surprised we knew each other and seemed to insinuate that the guy was maybe not a good fit for the school. He seemed to think otherwise about me, and I got my acceptance letter about 9 months after that first “avant-garde” utterance, around the time of the 2004 NCAA Final Four.

I sometimes think about what would have happened if I had picked a different word. It would have been a different world.

These sorts of almost accidents – or maybe they're just actions that come from some place we don't understand, if the universe really is deterministic after all – have scary power. Something similar, though less consequential, happened to me in 2006 when I was searching for music on a website called emusic. One of my friends from college was into some indie band – I think they were called The Delays – and I searched for them, and due to some mislabeling or weird search error on the site, the only result I got was a progressive trance compilation from the record label Renaissance UK, mixed by the DJ Dave Seaman.

This 2xCD collection ended up being by far the most influential album on my own tastes. It introduced me to acts like Luke Chable and more importantly Gabriel & Dresden. From there, I discovered more of Chable’s work and eventually found Gabriel & Dresden’s massive “Bloom” mix album, which introduced me to Above & Beyond and the entire Anjunabeats/Anjunadeep universe. “Bloom” was fittingly released on the first ever day of college for me, a day that also featured my first meeting with a professor who went on to become a co-worker and one of my best friends even to this day.

As for Above & Beyond, I have written about them several times here, saw them play Madison Square Garden, and listen to their podcast all the time. My favorite memory, though, is from early 2008 when I came back to my dorm after a weekend at a friend’s place. I think it was in March or April. The sun was just rising and it was foggy and I was looking out onto Bowen Street in Providence, Rhode Island as his car pulled away. In the background I was playing the first disc of some other compilation I had, which kicked off with a remix of the peerless Above & Beyond track “Good For Me.” It felt like I was in a trance (hah) as I reviewed some Latin grammar, too, of all things. Avant-garde studying, indeed.

Not Being Afraid Of Writing

I didn't always like to write. When I was a 6th grader, I remember typing nervously on an old Windows 95 PC after school one day, trying to finish an intro-body-conclusion essay about a topic so important it was probably on a dreaded standardized test. My first ever "short story" was a heavily plagiarized handwritten knockoff of the plot of the computer game "Laura Bow 2: The Dagger of Amon Ra" (man, I wish I had that around – there's something about handwriting in particular that I think invites so many possibilities). The Rubicon I ended up crossing was reading the book "The Haunted Mask," part of R.L. Stine's massively popular (and iconically 90s) "Goosebumps" series.

Writing is unique among the creative arts, I think, because the inputs that go into being great at it are so predictable: The best writers are almost invariably the best readers. Moreover, there really aren't writing prodigies in the same way that there are music or visual art prodigies. Many of the world's greatest authors – Shakespeare, Sophocles, Shaw, to name but three playwrights with S-surnames – were late starters and/or late bloomers.

Shakespeare didn't publish any play till he was in his late 20s and arguably didn't hit his stride till he was in his mid 30s. Consider that Hamlet was likely completed when the Bard was 36 or 37, and all of his great tragicomedies ("All's Well That Ends Well," "Coriolanus," "The Two Noble Kinsmen," etc.) came even later, meaning that he was still climbing to his artistic peak at the same age at which Mozart was, more than a century later, deceased (the Austrian composer died obscure and poor a month before his 36th birthday). Sophocles finished "Oedipus at Colonus" when he was almost 85.

The explanation is straightforward enough: Age brings opportunities to not only read more, but to read differently, to add new histories, correspondences, novels, poems, blog posts, newspaper columns, etc. to the brain’s vast, subconsciously indexed repertoire. The base is never forgotten, will never crumble, even as new columns and ornaments are added to it. I remember coming across certain turns of phrase and vocabulary words for the first time, but these discoveries fuel relatively minor bouts of growth. The most lasting learning comes from soaking up writers who are unafraid of using language, because language is for them almost like a surgical tool, the only one they have, for relieving that frenzied, mildly anxious condition known as inspiration.

"Inspiration" may be too mystical a word for it, invoking images of Muses speaking sentences directly to some grizzled Hemingway hunched over a typewriter. For me it's more like this: some sentence hits the brain the way rain hits a fully spread-out umbrella, formed from the vapors of different overheard sentences or signs read on the subway. Sometimes, there is a phrase that just has to be turned into its own piece, forming the body and then requiring a title as a final ribbon on things, and at other times the title comes first and the body follows.

It's sort of like cooking: The motions and the measurements vary each time, but the recipe – the things you've read, looked at, thought about – stays the same and provides most of the final character. I probably would have never thought about all the different ways to compose – starting in medias res, throwing paragraphs around the page, writing the intro last, lifting seemingly unrelated anecdotes to provide segues – had I not taken my current job two years ago and been forced to write at such tremendous volume for such a sustained period of time.

Having hard quotas is a way of dispelling concern about perfection, sure, but it is also a spigot for creativity. "I don't have forever" is sort of my mentality with writing, rather than the less moving "it doesn't have to be perfect." I have a limited time to get my thoughts on the page and I don't know who, if anyone, is going to read them – why be afraid? If nothing else, writing anything, writing it quickly, and then reading it back more slowly later (I always cringe at reading my own stuff in the moment) has a way of fueling the reading-writing cycle that allows for growth.

Voyages to the Moon

Sometime back in the early 1990s, I saw the moon through a telescope my uncle had set up in a shed, out in an open expanse of land on our family's Kentucky farm. In a place so isolated and far away from large cities, I was able to see the same body that had inspired H.G. Wells and Georges Méliès around the turn of the 20th century. Yellowy and cheese-like through the lens, the close-up view of the moon's surface made me feel like I was not only looking ahead at the final frontier of space but also staring at one of the most reliable muses of the past.

The First Men in the Moon
Late last year, I read Wells' "The First Men in the Moon," published in 1901. Two men reach the moon after creating a new compound that can counteract the effects of gravity. Eventually, they are separated, but not before they discover some of the oddities of the lunar world: abundant gold ("I resumed my destruction of the fungi. Then suddenly I saw something that struck me even then. 'Cavor,' I said, 'these chains are of gold!'"), mooncalves ("They seemed monsters of mere fatness, clumsy and overwhelmed to a degree that would make a Smithfield ox seem a model of agility"), and a much better view of the stars ("The stars we see on earth are the mere scattered survivors that penetrate our misty atmosphere. But now at last I could realise the meaning of the hosts of heaven!").

Wells' story is by turns funny and horrifying, and always page-turningly irresistible. What struck me was how the moon seemed in a way like the opposite of the exploitative, imperialistic capitalist societies, from the British Empire to the United States, that were coming to a head during Wells' time. The Selenites (moon people) are amazed at the warlike tendencies of earthlings and seemingly blessed by an abundance of natural resources that are not the subject of a hyper-competitive land grab, a la what was occurring on earth in the 19th century with the scramble for Africa. There are signs, though, of creeping earth-ism, if you will, on the moon, such as in the industrial mooncalf processing plant beneath the moon's surface, echoing the coal mines and factories of the age.


The Moon's otherness ultimately leads Bedford (one of the main characters) to question his entire perspective, in a sublime chapter called "Mr. Bedford in Infinite Space." Whereas treatments of aliens and their cultures – especially in sci-fi – are often just reimaginings of human affairs (the beings are vaguely humanoid and given whatever qualities the artists think are quintessentially human), Wells' character meditation synthesizes all that's different about life on the moon – the gravity, the atmosphere, the lack of hoopla around material commodities – and wonders if perhaps it is his world, his baselines for 'normal,' that are up for grabs and subject to artistic moulding:

“At last I felt my moonward start was sufficient. I shut out the sight of the moon from my eyes, and in a state of mind that was, I now recall, incredibly free from anxiety or any distressful quality, I sat down to begin a vigil in that little speck of matter in infinite space that would last until I should strike the earth … But in that direction no light was forthcoming, though the strangest fancies came drifting into my brain, queer remote suspicions, like shadows seen from away. Do you know, I had a sort of idea that really I was something quite outside not only the world, but all worlds, and out of space and time, and that this poor Bedford was just a peephole through which I looked at life?”

Le Voyage dans la Lune
A year after Wells published "The First Men in the Moon," French filmmaker Georges Méliès made his silent film, "Le Voyage dans la Lune" ("A Voyage to the Moon"). Though a 113-year-old silent film, it has both black-and-white and color versions and runs between 9 and 18 minutes depending on the restoration and the frame rate. The color version was only discovered in the early 1990s, around the time I was peering through that telescope, and was in terrible disrepair.

By the 2010s, though, it had been restored, with occasional reliance on re-colored portions of the black-and-white frames as needed. A new soundtrack was also added, by the French electronica band Air, whose first and best album was fittingly entitled “Moon Safari” and contained catchy tracks like “Sexy Boy.” The colorized version recently made its way to Netflix, at about a 16 minute running time, making it the polar opposite of the other silent film I once discussed in depth here, D.W. Griffith’s “Intolerance.”

The imagination in “A Voyage to the Moon” is tremendous and all the more impressive for its speed, economy, and maximization of what we now regard as limited technical resources. The astronomy club at the beginning is arranged almost like a choir or an orchestra, with their telescopes resembling instruments and their conversations – unheard – like mini-songs against the Air score. Then there’s the loading of the bullet-like spaceship and its iconic impact on the moon’s anthropomorphized face – one of the earliest and most memorable usages of special effects in film.


While the ship and its landing make the earthlings seem as militant as they are described in Wells' novel, the humans are much less aggressive on the lunar surface than Cavor and Bedford were in "The First Men in the Moon." They look up at the stars, which then segue into the planets in a nice piece of visual artistry. Eventually, they are brought before the militant Selenites (the film likely used a French translation of Wells as a major source of inspiration) but are able to escape because of their physical superiority – the Selenites explode when tossed.

By the end of the film, there's a parade near a statue emblazoned with the Latin phrase for "work conquers all" and the word "science," both nice correctives to the idea that people from long ago were ignorant individuals beholden only to religious faith or superstition. I like how both Méliès and Wells were fascinated not just with reaching the moon, but also with returning to earth from it – which, if you remember, was a key part of John F. Kennedy's proposal for sending a man to the moon and getting him safely back to earth. The roots of the Apollo missions, which required tremendous scientific investment as well as cultural capital built upon centuries of fascination with the moon, are here in these early 20th century pieces of art.

Highly recommended, both of them. "A Voyage to the Moon" can be watched in its entirety on a lunch break, and "The First Men in the Moon" is a slim volume that you could probably get through in a day or two.

Triple Crown

Before this June, the last time a thoroughbred (race horse) won a Triple Crown – i.e., swept the Kentucky Derby, the Preakness Stakes, and the Belmont Stakes – my father was 24 years old. Jimmy Carter was President of the United States. The World Wide Web was at least 11 years away and Bob Metcalfe had only invented Ethernet a few years prior. The original World Trade Center had been standing for less than 10 years. Manhattan was synonymous with crime and urban decay. Downtown Brooklyn was a community in which a family of teachers could afford a house. Michael Jordan was 15 years old. I wasn’t born.

37 years doesn't seem like a long time. Sports, though, has a way of stretching out the years. Part of the thrill of watching sports is anticipating a famous, long-standing record being broken, but when said record resists being broken, it casts a long shadow that feels like it will never go away. The first time I remember watching a horse trying to become the 12th Triple Crown winner and first since Affirmed in 1978 was in 1997, at which point the drought was already a respectable 19 years. Silver Charm couldn't deliver, beaten by Touch Gold (you can't make this stuff up) in the stretch while I watched from a TV in Florida.

The next year, Real Quiet suffered one of the worst defeats in sporting history (right up there with Ghana's World Cup loss to Uruguay) on an unlucky head bob at the end against Victory Gallop, as my grandfather and I watched from his Kentucky living room. Charismatic broke down past the wire after fading against Lemon Drop Kid. War Emblem stumbled out of the gate. Smarty Jones was run down by Birdstone as we all watched while on vacation in Vermont. Big Brown had a shoe loosened and eased up while I sat in the June heat one summer in Rhode Island right after I had graduated college. I'll Have Another scratched the day before the race as I checked ESPN from my Chicago apartment. California Chrome was stepped on by another horse and couldn't pull out the victory.

The Triple Crown drought seemed so long perhaps because it followed me through adolescence, high school, college, and the beginning of my career. I watched horses fail from TV sets in 5 different states, from ages 10 to 27. When California Chrome owner Steve Coburn huffed that he would never see a Triple Crown winner in his life after Tonalist won the Belmont Stakes, I kinda believed him.

On June 6, 2015, I was prepared to make it a staggering 6 states from which I had seen a Triple Crown chance get dashed. I had watched American Pharoah's victory in the Kentucky Derby from Chicago, his triumph in the Preakness from a Manhattan bar, and now I was off with my brother and his friend to Belmont Park itself to see the horse attempt what so many had deemed impossible for the modern thoroughbred.

Going to Belmont Park on Belmont Stakes day, with a Triple Crown on the line, is surreal. I expected the worst – a breakdown, a devastating loss in the stretch, an unruly crowd – but the entire place seemed sun-soaked in optimism. Women were wearing hats and men were wearing blazers and smoking cigars. The general admission area was crowded but respectful, and perfect strangers stood up for the privileges of others. Tickets were not expensive. It made me feel great about "human nature," whatever that is.

Prior to Race 11 (that’s the Belmont Stakes) I had the anxiety of someone having to deliver a speech on the first day of class – and I was only watching! It’s hard to explain how intense the feeling is. Triple Crown opportunities don’t happen every year and the race itself is so short after so much build-up – hell, the entire Triple Crown usually takes less than 6 minutes to run each year. When the race started, I was swimming in the sunshine and the heat and the barely audible sound of the track announcer shouting over the crowd of 90,000+.

I couldn’t even see them as they rounded into the backstretch. My brother was confident, as I had been so many times in the past watching Real Quiet and Smarty Jones jump out to their seemingly decisive leads. My hands were sweaty. Finally, the 8 horses came into the stretch, with American Pharoah well ahead and extending his lead. He blazed past us; I would have been worried about him being caught had I been watching on TV (weird angle on the replay) but in person there was no doubt. Here was a horse that was dominating without even seeming to try.

He coasted past the line. My relief was met with a tingling in my hands that I couldn't recall ever feeling before, and, yes, a few tears. I jumped and hugged at least two people around me. I high-fived a guy who had been arguing with a Red Sox fan who had been smoking near his friend prior to the race. The noise felt like the woozy din from a video bar after you are two drinks too drunk – faraway yet immersive.

I kept waiting for something to go wrong, for there to be an inquiry or a post-race injury, something that would snap me out of it. Nothing happened. We were inside history in the weird bubble of Belmont Park, which on this day had the multicultural sweep of New York City compressed into a rustic, tree-fringed throwback of a suburban race track. It's fitting that the track literally straddles Queens and Nassau counties, bringing together the buzz of NYC proper with the quietude of the outside world.


They put up a panel for American Pharoah next to the other 11 Triple Crown winners in the infield. It was done quickly, as if to punctuate the action with an exasperated “Finally!” Within the hour, we were on the Long Island Railroad back to the rest of Queens, outrunning the crowds just as the horse had outrun 7 competitors and innumerable ghosts of failure.

There is no such thing as “human nature”

Not long ago, I finished an astonishing book called “The Western Illusion of Human Nature” by Marshall Sahlins, an anthropologist and professor emeritus at the University of Chicago (where I spent a formative year from 2008 to 2009). Ever since 2004, when I took an introductory class on Shakespeare at Brown University, I have been immensely skeptical of the notion of “human nature,” mostly because, as Mark Twain once quipped, “generalizations aren’t worth a damn” (think about it). My dislike of the term has amplified over the years as I came to see “human nature” as not just a hasty generalization but also a profoundly negative and deterministic outlook on life.

Genetic determinism: The descendant of “original sin”
Why are we the way we are? A simple glance at nearly anything in any inhabited city should give you a preliminary answer: When you consider all the buildings, roads, vehicles, elaborately clothed tourists, and mobile phones that can connect to wireless data, you take in – from every angle – so many things that humans "weren't intended to do." We weren't "intended" (by whom? by what?) to extract crude oil from the ground so that it could be refined into gasoline that would power trucks that would deliver a Wi-Fi- and Bluetooth-enabled watch to someone's doorstep. We weren't "intended" to eat dairy, gluten, or whatever else the peddlers of dietary fads have deemed the mythical source of all our ills. We weren't "intended" to turn California into an agricultural superpower capable of producing almonds, artichokes, strawberries, and so many other foodstuffs at unprecedented scale.

What I’m saying: Our world exists because of culture, with all of its “unnatural” or “unintended” things being cultural evolutions peculiar to peoples and periods of time. Moreover, all of its perks and drawbacks are the results of cultural choices about what is acceptable and important. The fact that we don’t have universal health care in the U.S. but pay programmers hundreds of thousands of dollars a year? That’s a cultural attitude, not a predetermined genetic outcome.

Human cultures predate the Homo sapiens sapiens species by thousands of years, and our evolution – with peculiarities like helpless infants – has been guided by our complete immersion in culture from cradle to grave. Consider this: Even throwback diets like paleo, which aim to escape contemporary culinary habits by going back to what "cave men" ate, are still completely beholden to the artificial and culturally determined selection of the best-tasting and best-looking fruits and vegetables (there were no perfectly golden, creamy bananas "in nature" before humans began cross-breeding cultivars) and the most consistently bred livestock (raising even a single cow requires immense amounts of food, grown with "unintended" methodologies, and supported by "unnatural" antibiotics and drugs) available. Aesthetics and prejudices (e.g., the unwillingness to eat insects or dogs in the West) play a huge role in simply influencing what is even on the table. We are no more "designed" to eat only preagricultural foodstuffs than we are to speak only pre-Indo-European languages.

The idea of a savage man in the wilderness, eager to kill someone just to get his own immensely healthy and faultless food and all the while in desperate need of an iron-fisted enforcer (i.e., Hobbes’ Leviathan) is a myth that owes more to the pessimism of the English Civil War and the legacy of the vile Abrahamic religions than to any actual evidence of how humans organize their lives and groups. It assumes the worst about us and becomes a self-fulfilling prophecy, marketed by a culture eager to exercise control mechanisms and tell us what is and isn’t “natural” and “intended.”

The eagerness to attribute everything to genes, to a built-in "human nature" that is most usually associated with greed, sexual wantonness, and violence, is a remarkably flimsy idea that is nevertheless mentioned with such gravitas whenever someone has to condescendingly explain a "harsh truth" to someone else. I mean: Would you listen to someone babble on about the Christian notion of "original sin" to you in explaining why bad things happen? "Human nature" is "original sin" in another guise: St. Augustine hypothesized that original sin was transferred from one person to his children via semen, a ridiculous argument that has nevertheless been reincarnated in the notion that there's a certain "human nature" genetically copied across the generations that accounts for everything from men cheating on their spouses to office managers being ruthless political animals who must be corralled.

We ascribe to our own species a level of savagery that we wouldn't assign to the worst of our relatives in the animal kingdom, and that doesn't even exist in species like wolves that are often associated with evil despite their camaraderie. We paper over the obvious cultural destruction wrought by specific religions by instead saying that all this evil was inevitable because it had everything to do with "human nature" and nothing to do with fundamentalism.

Enough! What if, instead of fixating on the negative traits we think are passed from one human to the next, we stepped back and considered how immensely in debt we still are to a Western cultural tradition that, for thousands of years, has put the "holy books" of the Abrahamic religions and the thoroughly pessimistic secular texts of Thucydides and Hobbes on a pedestal? We seem to see newborns and children as creatures that must be inoculated against some kind of savage nature, and then as adults held in check by paternalistic bureaucracies that prevent any lapse into a mythical "state of nature" that, due to culture's pervasiveness, could never even have existed in the first place.

Almost any behavior that one would try to explain by appeal to "human nature" could be explained another way. Jacques Fresco once remarked that he had visited a Pacific island where everyone was naked all the time, and yet there was no evidence of constant sexual leering or abuse. If nothing else, his remarks are a good jumping-off point for thinking about how, say, Western uptightness about sexuality and a long legacy of optional sexism, rather than some inevitable "human nature" imprinted on the genome, have enabled things like catcalling and microaggressions.

Yes, yes, I know – we don't know much about genetics yet, and eventually we will be able to explain everything via super-intelligent machines that can easily sequence DNA and analyze decisions. Bear in mind that the prioritization of such creations, as well as the ways in which they measure things (who decides how and what to measure, and how to interpret the results?), are all cultural realities, too, and could be reversed.

My sense is that we greatly, greatly oversubscribe to the notion of “human nature” because of the historical circumstances we live under, in which there is a dominant single power (the U.S.) with global reach sufficient to create a de facto common language (English) that in turn makes everything seem homogeneous, at least on the surface. If and when this state of affairs changes, I think we will eventually see the world’s disparate cultures and natures finally move out of the shadow of “human nature.”
