It was your typical July afternoon in Kentucky, 2003 edition. My cohort at the Governor’s Scholars program – which at the time was held at three different colleges across the state for rising high school seniors – had met in the lobby of one of the dormitories at Centre College in Danville, and now we were off to the basement. I was barely an hour’s drive from home, but it felt like being in Princeton, New Jersey (where I had spent the previous summer) again, such was the strangeness of being alone with a bunch of other teenagers away from my family. This was a time when the program could prohibit students from bringing “computers” (read: desktop PCs) and have that be an effective way of isolating them from the outside. I had a cellphone that looked like a candy bar.
Anyway, there was an ice-breaking exercise – it was about alliteration (see what I did there?). As a way of telling everyone else our name, we had to pair it with an alliterative adjective (e.g., Sagely Susan, or Miraculous Mary). I seemed to settle out of nowhere on “avant-garde” – questioning it briefly since it was compound-hyphenated, but speaking it all the same through the basement air. I think someone smiled. This choice of adjective has made an unfathomable difference in my life.
Later the same day or maybe the next (this was 12 years ago), I sat by myself at a table in the dining commons. Another student came over to me and introduced himself with “avant-garde?” It was memorable, I’ll grant – the longest and Frenchiest of all the adjectives. Maybe he was the one who had smiled. Anyway, the word was in this way a double ice-breaker, and we got on to fuller conversation, which for a pair of almost-17-year-olds at a summer camp involved where we intended to go to college.
I hadn’t thought of it much before that meeting. Maybe I would have just gone somewhere in state as the de facto option, had this meeting not happened. I told him I had spent a summer at Princeton, which I wrote off as a place that “didn’t know how to have fun” (how I knew this as a 16-year-old, I still don’t know). Then he began talking more positively about “Brown,” which I knew at the time mainly as one of the losingest college basketball teams ever, as per an ESPN infographic. My interest was piqued.
A few nights after that, I was using that aforementioned cellphone to talk to my parents and I began to talk about some of these colleges this guy had been talking about. Maybe his perspective had been shaped by his experiences as a football player being recruited – he played for Owensboro Catholic – but the academic allure of the Ivy League schools, in a time before the Internet was really as pervasive as it is now, was having its own effect on me. I remember pushing so hard for it despite suggestions about just staying in state.
The fall of 2003 was accordingly a hectic one, with drawn-out application processes and interviews. I mentioned my friend from the dining commons during my Brown interview and the interviewer was surprised we knew each other and seemed to insinuate that the guy was maybe not a good fit for the school. He seemed to think otherwise about me, and I got my acceptance letter about 9 months after that first “avant-garde” utterance, around the time of the 2004 NCAA Final Four.
I sometimes think about what would have happened if I had picked a different word. It would have been a different world.
These sorts of almost-accidents – or maybe they’re just actions that come from some place we don’t understand, if the universe really is deterministic after all – have scary power. Something similar, though less consequential, happened to me in 2006 when I was searching for music on a website called eMusic. One of my friends from college was into some indie band – I think they were called The Delays – and I searched for them, and due to some mislabeling or weird search error on the site, the only result I got was a progressive trance compilation from the record label Renaissance UK, mixed by the DJ Dave Seaman.
This 2xCD collection ended up being by far the most influential album on my own tastes. It introduced me to acts like Luke Chable and more importantly Gabriel & Dresden. From there, I discovered more of Chable’s work and eventually found Gabriel & Dresden’s massive “Bloom” mix album, which introduced me to Above & Beyond and the entire Anjunabeats/Anjunadeep universe. “Bloom” was fittingly released on the first ever day of college for me, a day that also featured my first meeting with a professor who went on to become a co-worker and one of my best friends even to this day.
As for Above & Beyond, I have written about them several times here, saw them play Madison Square Garden, and listen to their podcast all the time. My favorite memory, though, is from early 2008 when I came back to my dorm after a weekend at a friend’s place. I think it was in March or April. The sun was just rising and it was foggy and I was looking out onto Bowen Street in Providence, Rhode Island as his car pulled away. In the background I was playing the first disc of some other compilation I had, which kicked off with a remix of the peerless Above & Beyond track “Good For Me.” It felt like I was in a trance (hah) as I reviewed some Latin grammar, too, of all things. Avant-garde studying, indeed.
I didn’t always like to write. When I was a 6th grader, I remember typing nervously on an old Windows 95 PC after school one day, trying to finish an intro-body-conclusion essay about a topic so important it was probably on a dreaded standardized test. My first ever “short story” was a heavily plagiarized handwritten knockoff of the plot of the computer game “Laura Bow 2: The Dagger of Amon Ra” (man, I wish I had that around – there’s something about handwriting in particular that I think invites so many possibilities). The Rubicon I ended up crossing was reading the book “The Haunted Mask,” part of R.L. Stine’s massively popular (and iconically 90s) “Goosebumps” series.
Writing is unique among the creative arts, I think, because the inputs that go into being great at it are so predictable: The best writers are almost invariably the best readers. Moreover, there really aren’t writing prodigies in the same way that there are music or visual art prodigies. Many of the world’s greatest authors – Shakespeare, Sophocles, Shaw, to name but three playwrights with S-surnames – were late starters and/or late bloomers.
Shakespeare didn’t publish any play till he was in his late 20s and arguably didn’t hit his stride till he was in his mid 30s. Consider that “Hamlet” was likely completed when the Bard was 36 or 37, with all of his great tragicomedies (“All’s Well That Ends Well,” “Coriolanus,” “The Two Noble Kinsmen,” etc.) still to come, meaning that he was still climbing to his artistic peak at the same age at which Mozart was, more than a century later, deceased (the Austrian composer died obscure and poor a month before his 36th birthday). Sophocles finished “Oedipus at Colonus” when he was almost 85.
The explanation is straightforward enough: Age brings opportunities to not only read more, but to read differently, to add new histories, correspondences, novels, poems, blog posts, newspaper columns, etc. to the brain’s vast, subconsciously indexed repertoire. The base is never forgotten, will never crumble, even as new columns and ornaments are added to it. I remember coming across certain turns of phrase and vocabulary words for the first time, but these discoveries fuel relatively minor bouts of growth. The most lasting learning comes from soaking up writers who are unafraid of using language, because language is for them almost like a surgical tool, the only one they have, for relieving that frenzied, mildly anxious condition known as inspiration.
“Inspiration” may be too mystical a word for it, invoking images of Muses speaking sentences directly to some grizzled Hemingway hunched over a typewriter. For me it’s more like a sentence that hits the brain the way rain hits a fully spread-out umbrella, condensed from the vapors of overheard conversations or signs read on the subway. Sometimes a phrase just has to be turned into its own piece, forming the body and then requiring a title as a final ribbon on things; at other times the title comes first and the body follows.
It’s sort of like cooking: The motions and the measurements vary each time, but the recipe – the things you’ve read, looked at, thought about – stays the same and provides most of the final character. I probably would have never thought about all the different ways to compose – starting in medias res, throwing paragraphs around the page, writing the intro last, lifting seemingly unrelated anecdotes to provide segues – had I not taken my current job two years ago and been forced to write at such tremendous volume for such a sustained period of time.
Having hard quotas is a way of dispelling concern about perfection, sure, but it is also a spigot for creativity. “I don’t have forever” is sort of my mentality with writing, rather than the less moving “it doesn’t have to be perfect.” I have a limited time to get my thoughts on the page and I don’t know who, if anyone, is going to read them – why be afraid? If nothing else, writing anything, writing it quickly, and then reading it back more slowly later (I always cringe at reading my own stuff in the moment) has a way of fueling the reading-writing cycle that allows for growth.
Facebook excels at making me occasionally hate people I have known for years. Maybe they liked some homophobic retailer, shared a widely debunked story unironically, or generally just kept posting skinny mirror selfies to show How Awesome their lives were. Whatever. But Facebook’s corrosive powers don’t stop there; it’s the absolute fucking best at stirring up contempt for complete strangers. It goes where Reddit and the comment section could never go, because it creates a link between life-destroying nonsense and someone’s face/real identity.
These missives often come in the form of comments on a friend’s post, from someone I don’t know. Anyway, there were two that really got me recently, so I’ll dissect them – not so much because they made me mad out of nowhere, but because they triggered some thoughts I have had about the subjects in question for some time.
First, this sage on dietary advice and social progress:
“I’ve read several nutrition books from low carb to full vegan with many contradictory findings. The only absolute between them all is the undeniable harm refined carbohydrates and added sugars have on the body and society. It is definitively linked to the number one killer of Americans, more than lung cancer, more than drunk driving: heart disease.
The greatest health mechanism of our century wouldn’t be a cure for cancer, but a tax on added sugar and refined flour.”
Let’s start with the “contradictory findings” he mentions in the “nutrition books” he read. Resorting to confirmation bias and especially arguing that humanity has strayed from some idyllic dietary past are not bugs in nutritional literature (mmm) but features of it. Consider the long-held wisdom that saturated fat causes heart disease (I picked this ailment due to the content of the above Facebook post). The American Heart Association has been largely responsible for peddling this notion, yet a 2013 study in the Annals of Internal Medicine found that:
“Current evidence does not clearly support cardiovascular guidelines that encourage high consumption of polyunsaturated fatty acids and low consumption of total saturated fats.”
The reasons that so many diet books are filled with contradictions are: 1) the authors are trying to sell the reader something, rather than educate him; 2) the effects of many foods on the body are still not well understood and merit further study.
The poster above of course won’t have any of this, as seen in his use of “undeniable” and “definitively,” despite the doubts that can be cast on his claims. His passing mention of lung cancer and drunk driving is notable, since he is trying to point to a cause of heart disease as obvious as cigarette smoking is for lung cancer or excessive alcohol consumption is for drunk driving. It doesn’t exist, though.
Demonization of sugar in particular has much more to do with moralistic ranting about how “if it tastes good, it must be bad for you,” confusion about the differences between “natural” sweeteners like honey and their “chemical” clones like high-fructose corn syrup (the same fucking thing), and fears that kids were being “poisoned” by candy, than it does with any solid science (sugar may cause weight gain, which is worse than death for much of the current elite; but even being fat has no clear effect on mortality). Ditto for carbohydrates, albeit with an even more sordid history of junk low-carb and gluten-free diets that arose from one doctor’s accidental success in treating a celiac with a banana and skim-milk diet.
The last bit is bad in a different way, since it displays such limited imagination in improving health – a tax (and not just any old tax, but a regressive one borne by the poor as they try to buy food)! Denmark actually tried this “mechanism,” as he calls it, before, except with saturated fat. Once it became clear that the tax motivated Danes to cross the border to get fatty foods and that the entire premise of the measure rested on shaky science, the tax was rolled back.
How about instead of using neoliberalist bullshit from self-help and diet books to control public institutions (i.e., governments that can enact taxes) we instead focus on actually figuring out how foods affect the body? The whole post comes off as someone afraid of being “fat” or “out of shape” trying to lecture the entire world on what they should eat.
Moving on now, to this luminary on the subject of the minimum wage:
“I don’t understand why the US thinks minimum wage should be $15/hr. If you are worth more than $7.25/hr to a business then there is no conceivable reason for you to be stuck working a minimum wage job. It doesn’t take skill to operate a register, clean a bathroom, or serve a meal, it’s basic labor and it’s not physically demanding. If a job is any more than that and still paying minimum wage then you’re working for the wrong company and should move on.
Minimum wage isn’t meant to support a family, purchase a new car, home, or even pay student debt. Minimum wage is meant for introductory roles or part-time/basic labor. Is it abused? Obviously. Will raising it fix the problem? No. It will just cause a loss of jobs and harder work for those making $15/hr. It will also cause pay cuts for those above the $15 mark who have busted their ass to get somewhere in life.”
Blah blah blah, look at me, I work in IT to ensure that people get vitamins and loofahs delivered to their front doors. First off, it’s curious that we start with figuring out what a person is “worth,” which in this case is determined by a business rather than by the person himself. Businesses do not have anyone’s real best interests – in terms of remuneration, health, you name it – at heart and exist mostly as outmoded institutions that are preserved to prop up the neoliberal state.
“No conceivable reason,” eh? This statement assumes that the employment market is rational and not beset by randomness, injustice and events far beyond a jobseeker’s control, such as the world-gambling going on every day on Wall Street. The poster has decided that all reasons for someone being stuck in minimum wage while deserving more can be ruled out. We can probably even do away with the nominal “$7.25/hr” bit, since the writer seems to think that whatever a business deems a worker is worth is what he is actually worth! I guess that includes $0.
“Skill” is an infuriating word in the context of employment discussions. There’s persistent talk about the nonexistent “skill gaps,” which is mostly code for businesses trying to squeeze workers’ wages by not hiring them and creating the artificial scarcity of unemployment, which drives desperation and willingness to take anything. “Skill” also imparts a sort of fictional objectivity to a chaotic market, through its associations with culturally important icons like athletes (who have “skills” in narrow areas) or card/video game players.
“It’s basic labor” – hah! Try cleaning a bathroom every day of the week. Better yet, try being a caregiver working for near minimum wage for 60+ hours a week and see just how un-demanding such a job is. Again, we have the assumption that high pay correlates with “real” work and low pay with “basic” work, when of course there are so many counterexamples that I could fill up the rest of this entry with them. A caregiver puts in much more body- and mind-numbing work – work that can be a matter of life and death for the person involved – than any software developer working on some Web app for a consulting firm can ever aspire to.
Saying minimum wage “isn’t meant to support a family, purchase a new car, home, or even pay student debt” reimagines many of the transient ideals of our age – home ownership, car purchases, exorbitantly expensive college – as universals that can serve as bases for judging what someone should get out of their work. It is a great question these days to ask exactly why anyone works in the first place, when so many occupations are completely removed from social welfare and basic human survival and automation could play a bigger role. The poster has an idea of “why,” though, and his answers are all goals from the postwar era, when today’s suffocating, precarious work environment was still decades away and society didn’t fetishize every last word out of some CEO’s streamlining, cost-cutting, union-busting mouth.
“Who have busted their asses to get somewhere in life” – it’s statements like this one that make me really despair over the U.S. ever finding a way past its relentlessly classist and racist system. Instead of trying to imagine that we’re all in this together and deserve dignity as members of the same nation, this poster draws the line between the vast masses of those earning minimum wage and the truly deserving who had the fortune to enter a lucrative field within our deterministic universe. This attitude is responsible for so much social ill, from the ridiculous costs of American healthcare (cue remarks about how long it takes to become a doctor) to the gentrification of working class neighborhoods by workaholic “entrepreneurs” making digital baubles for the 1 percent.
I won’t do this type of post again for a while, most likely. Again, I had planned to write on these issues at some point, and Facebook simply provided me with the raw material I finally needed to get started.
Sometime back in the early 1990s, I saw the moon through a telescope my uncle had set up in a shed, out in an open expanse of land on our family’s Kentucky farm. In a place so isolated and far away from large cities, I was able to see the same body that had inspired H.G. Wells and Georges Méliès around the turn of the 20th century. Yellowy and cheese-like through the lens, the close-up view of the moon’s surface made me feel like I was not only looking ahead at the final frontier of space but also staring at one of the most reliable muses of the past.
The First Men in the Moon
Late last year, I read Wells’ “The First Men in the Moon,” published in 1901. Two men reach the moon after creating a new compound that can counteract the effects of gravity. Eventually, they are separated, but not before they discover some of the oddities of the lunar world: abundant gold (“I resumed my destruction of the fungi. Then suddenly I saw something that struck me even then. ‘Cavor,’ I said, ‘these chains are of gold!’”), mooncalves (“They seemed monsters of mere fatness, clumsy and overwhelmed to a degree that would make a Smithfield ox seem a model of agility”), and a much better view of the stars (“The stars we see on earth are the mere scattered survivors that penetrate our misty atmosphere. But now at last I could realise the meaning of the hosts of heaven!”).
Wells’ story is by turns funny and horrifying, and always page-turningly irresistible. What struck me was how the moon seemed in a way like the opposite of the imperialistic industrial societies, from the British Empire to the United States, that were coming to a head during Wells’ time. The Selenites (moon people) are amazed at the warlike tendencies of earthlings and seemingly blessed by an abundance of natural resources that are not the subject of a hyper-competitive land grab, a la what was occurring in the 19th century on earth with the Scramble for Africa. There are signs, though, of creeping earth-ism, if you will, on the moon, such as in the industrial mooncalf processing plant beneath the moon’s surface, echoing the coal mines and factories of the age.
The Moon’s otherness ultimately leads Bedford (one of the main characters) to question his entire perspective, in a sublime chapter called “Mr. Bedford in Infinite Space.” Whereas treatments of aliens and their cultures – especially in sci-fi – are often just reimaginings of human affairs (the beings are vaguely humanoid and given whatever qualities the artists think are quintessentially human), Wells’ character meditation synthesizes all that’s different about life on the moon – the gravity, the atmosphere, the lack of hoopla around material commodities – and wonders if perhaps it is his world, his baselines for “normal,” that are up for grabs and subject to artistic moulding:
“At last I felt my moonward start was sufficient. I shut out the sight of the moon from my eyes, and in a state of mind that was, I now recall, incredibly free from anxiety or any distressful quality, I sat down to begin a vigil in that little speck of matter in infinite space that would last until I should strike the earth … But in that direction no light was forthcoming, though the strangest fancies came drifting into my brain, queer remote suspicions, like shadows seen from away. Do you know, I had a sort of idea that really I was something quite outside not only the world, but all worlds, and out of space and time, and that this poor Bedford was just a peephole through which I looked at life?”
Le Voyage dans la Lune
A year after Wells published “The First Men in the Moon,” French filmmaker Georges Méliès made his silent film, “Le Voyage dans la Lune” (“A Voyage to the Moon”). Though a 113-year-old silent film, it has both black-and-white and color versions and runs between 9 and 18 minutes depending on the restoration and the frame rate. The color version was only discovered in the early 1990s, around the time I was peering through that telescope, and was in terrible disrepair.
By the 2010s, though, it had been restored, with occasional reliance on re-colored portions of the black-and-white frames as needed. A new soundtrack was also added, by the French electronica band Air, whose first and best album was fittingly entitled “Moon Safari” and contained catchy tracks like “Sexy Boy.” The colorized version recently made its way to Netflix, at about a 16-minute running time, making it the polar opposite of the other silent film I once discussed in depth here, D.W. Griffith’s “Intolerance.”
The imagination in “A Voyage to the Moon” is tremendous and all the more impressive for its speed, economy, and maximization of what we now regard as limited technical resources. The astronomy club at the beginning is arranged almost like a choir or an orchestra, with their telescopes resembling instruments and their conversations – unheard – like mini-songs against the Air score. Then there’s the loading of the bullet-like spaceship and its iconic impact on the moon’s anthropomorphized face – one of the earliest and most memorable usages of special effects in film.
While the ship and its landing make the earthlings seem as militant as they are described in Wells’ novel, the humans are much less aggressive on the lunar surface than Cavor and Bedford were in “The First Men in the Moon.” They look up at the stars, which then segue into the planets in a nice piece of visual artistry. Eventually, they are brought before the militant Selenites (the film likely used a French translation of Wells as a major source of inspiration) but are able to escape because of their physical superiority – the Selenites explode when tossed.
By the end of the film, there’s a parade near a statue emblazoned with the Latin phrase for “work conquers all” and the word “science,” both nice correctives to the idea that people from long ago were ignorant individuals beholden only to religious faith or superstition. I like how both Méliès and Wells were fascinated not just with reaching the moon, but with returning to earth from it – which, if you remember, was a key part of John F. Kennedy’s proposal for sending a man to the moon and getting him safely back to earth. The roots of the Apollo missions, which required tremendous scientific investment as well as cultural capital built upon centuries of fascination with the moon, are here in these early 20th century pieces of art.
Highly recommended, both of them. “A Voyage to the Moon” can be watched in its entirety on a lunch break, and “The First Men in the Moon” is a slim volume that you could probably make it through in a day or two.
Before this June, the last time a thoroughbred (race horse) won a Triple Crown – i.e., swept the Kentucky Derby, the Preakness Stakes, and the Belmont Stakes – my father was 24 years old. Jimmy Carter was President of the United States. The World Wide Web was at least 11 years away and Bob Metcalfe had only invented Ethernet a few years prior. The original World Trade Center had been standing for less than 10 years. Manhattan was synonymous with crime and urban decay. Downtown Brooklyn was a community in which a family of teachers could afford a house. Michael Jordan was 15 years old. I wasn’t born.
37 years doesn’t seem like a long time. Sports, though, has a way of stretching out the years. Part of the thrill of watching sports is anticipating a famous, long-standing record being broken, but when said record resists being broken, it casts a long shadow that feels like it will never go away. The first time I remember watching a horse trying to become the 12th Triple Crown winner and the first since Affirmed in 1978 was in 1997, at which point the drought was already a respectable 19 years. Silver Charm couldn’t deliver, beaten by Touch Gold (you can’t make this stuff up) in the stretch while I watched from a TV in Florida.
The next year, Real Quiet suffered one of the worst defeats in sporting history (right up there with Ghana’s World Cup loss to Uruguay) on an unlucky head bob at the end against Victory Gallop, as my grandfather and I watched from his Kentucky living room. Charismatic broke down past the wire after fading against Lemon Drop Kid. War Emblem stumbled out of the gate. Smarty Jones was run down by Birdstone while we all watched while on vacation in Vermont. Big Brown had a shoe loosened and eased up while I sat in the June heat one summer in Rhode Island right after I had graduated college. I’ll Have Another scratched the day before the race as I checked ESPN from my Chicago apartment. California Chrome was stepped on by another horse and couldn’t pull out the victory.
The Triple Crown drought seemed so long perhaps because it followed me through adolescence, high school, college, and the beginning of my career. I watched horses fail from TV sets in 5 different states, from ages 10 to 27. When California Chrome owner Steve Coburn huffed that he would never see a Triple Crown winner in his life after Tonalist won the Belmont Stakes, I kinda believed him.
On June 6, 2015, I was prepared to make it a staggering 6 states from which I had viewed a Triple Crown chance get dashed. I had watched American Pharoah’s victory in the Kentucky Derby from Chicago, his triumph in the Preakness from a Manhattan bar, and now I was off with my brother and his friend to Belmont Park itself to see the horse attempt what so many had deemed impossible for the modern thoroughbred.
Going to Belmont Park on Belmont Stakes day, with a Triple Crown on the line, is surreal. I expected the worst – a breakdown, a devastating loss in the stretch, an unruly crowd – but the entire place seemed sun-soaked in optimism. Women were wearing hats and men were wearing blazers and smoking cigars. The general admission area was crowded but respectful, and perfect strangers stood up for the privileges of others. Tickets were not expensive. It made me feel great about “human nature,” whatever that is.
Prior to Race 11 (that’s the Belmont Stakes) I had the anxiety of someone having to deliver a speech on the first day of class – and I was only watching! It’s hard to explain how intense the feeling is. Triple Crown opportunities don’t happen every year and the race itself is so short after so much build-up – hell, the entire Triple Crown usually takes less than 6 minutes to run each year. When the race started, I was swimming in the sunshine and the heat and the barely audible sound of the track announcer shouting over the crowd of 90,000+.
I couldn’t even see them as they rounded into the backstretch. My brother was confident, as I had been so many times in the past watching Real Quiet and Smarty Jones jump out to their seemingly decisive leads. My hands were sweaty. Finally, the 8 horses came into the stretch, with American Pharoah well ahead and extending his lead. He blazed past us; I would have been worried about him being caught had I been watching on TV (weird angle on the replay) but in person there was no doubt. Here was a horse that was dominating without even seeming to try.
He coasted past the line. My relief was met with a tingling feeling in my hands that I couldn’t recall ever feeling, and, yes, a few tears. I jumped and hugged at least two people around me. I high-fived a guy who had been arguing with a Red Sox fan who had been smoking near his friend prior to the race. The noise felt like the woozy din from a video bar after you are two drinks too drunk – faraway yet immersive.
I kept waiting for something to go wrong, for there to be an inquiry or a post-race injury, something that would snap me out of it. Nothing happened. We were inside history in the weird bubble of Belmont Park, which on this day had the multicultural sweep of New York City compressed into a rustic, tree-fringed throwback of a suburban race track. It’s fitting that the track literally straddles Queens and Nassau counties, bringing together the buzz of NYC proper with the quietude of the outside world.
They put up a panel for American Pharoah next to the other 11 Triple Crown winners in the infield. It was done quickly, as if to punctuate the action with an exasperated “Finally!” Within the hour, we were on the Long Island Railroad back to the rest of Queens, outrunning the crowds just as the horse had outrun 7 competitors and innumerable ghosts of failure.