
The Chicago Cubs and the Leviathan

Doppler radar-like, I could hear it coming and I could feel it passing just by. Two Massachusetts kids raced through the third floor corridor of our dorm, wordless but louder than silence. Stationary, I stared at the red and green mess on the TV. David Ortiz had just delivered his second walkoff hit in as many nights to force a Game 6 in the 2004 ALCS between the Boston Red Sox and the New York Yankees.

A week later: My then-Intro to Greek instructor (and now friend, for 12 years running) wondered aloud to our class if everyone had gone crazy because of the lunar eclipse that October. Nope. The Red Sox had just sent all of Rhode Island into a frenzy by sweeping the St. Louis Cardinals to win the 2004 World Series, their first title since the year World War I ended. My English professor remarked that she knew then-Red Sox general manager Theo Epstein’s mother, who was an instructor at Boston University.

“An ancient one”
Baseball is an old sport. The Chicago Cubs began operation in 1876 as the Chicago White Stockings. Before 2016, they had last appeared in the World Series in 1945 – a year before the NBA was founded. Their 1908 title predates both the NHL and the NFL.

Professional baseball’s 19th-century origins have meant that there have been some epic championship droughts. The Red Sox did not win between 1918 and 2004, the White Sox between 1917 and 2005, and the Cubs between 1908 and 2016. Even the longest current drought – belonging to the Cleveland Indians, this year’s runners-up – dates only to the Harry Truman administration.

During my years in New England in the mid-2000s, and especially during the fall of 2004, the anxiety expended on the Red Sox was heavy enough to send the university campus into frenzies of relief after each victory. Sometimes I thought of this seemingly throwaway quote from Moby-Dick:

“Almost universally, a lone whale proves an ancient one.”

Why did we – even me, an 18-year-old from Kentucky who grew up rooting for the Indians – think so much about this baseball team? Because they were basically alone in their futility, and it was some truly ancient futility, dating to a time when my oldest grandparents hadn’t hit double digits yet. The entire point of being a Red Sox fan was that you almost certainly had never seen a championship in your lifetime. Every game was life and (will we win before my) death. Generations of fans came and went, but that ancient whale – the Curse of the Bambino, traced back to the fateful day after the 1919 season when Babe Ruth was sold by the Red Sox to the Yankees – was very much alive, however immaterial.

“This grey-headed, ungodly old man, chasing with curses a Job’s whale”
But curses are ultimately just stories. The Red Sox curse broken during my first year in college was “only” the third longest at the time. Why was it so much more prominent than the longer White Sox and Cubs curses? I mean, Boston is a fraction of the size of Chicago, and both the Cubs and White Sox fanbases are substantially larger than the Red Sox’s.

The answer: The marketing around the Curse of the Bambino was flawless. It combined specific superstitions – the Ruth backstory, the epic collapse against the New York Mets in the 1986 World Series, clips of which were shown endlessly on ESPN in the school cafeteria during the 2004 run – with Boston’s longstanding inferiority complex toward New York City. Being cursed, doomed to root for this always second-best team, was emblematic of being a New England sports fan.

I don’t know when the Red Sox drought in particular took on the momentum of a “curse,” but 1986 seems like a good candidate. Up 3 games to 2 on the Mets, the Red Sox were at one point just one strike away from a title. Instead, the Mets rallied for three runs in the bottom of the 10th inning to completely turn the series around. Infamously, with the game tied, a ground ball from Mookie Wilson slipped between the legs of Boston first baseman Bill Buckner and rolled into the outfield, sending the game-winning run home. Buckner was for years the face of Boston’s baseball failures.

It wasn’t his fault, though. Let’s say he grabs that ball. The game doesn’t end – it was tied. The Red Sox would have batted again in the next inning, but they also would have had to hold off the Mets in the bottom half, since the game was at Shea Stadium. And guess what: Even with the loss, the Red Sox still had Game 7!

I wasn’t old enough to remember the Red Sox-Mets incident, but I did witness the team’s loss to the Yankees in the 2003 ALCS on Aaron Boone’s walk-off home run in extra innings. The Red Sox had led by 3 runs as late as the 8th inning, when Boston manager Grady Little – in Melville’s terminology, that “grey-headed, ungodly old man, chasing with curses a Job’s whale” – inexplicably allowed an exhausted Pedro Martinez to keep pitching to the Yankees, allowing New York to rally. After the game ended, I thought that maybe Boston was just always going to be second-best to New York, home of the two baseball teams (the Mets and the Yankees) that had prolonged all those years of New England sports misery.

“Saturn’s grey chaos rolls over me”
The next year, Boston broke through and then won again in 2007 and 2013. I was finishing college in 2007 and I don’t remember much excitement about that title relative to 2004. The Red Sox were just another team now. The White Sox also won during my undergraduate years.

Still, the Cubs drought persisted, that unrivaled Leviathan of sports curses. No titles since 1908. No World Series appearances since 1945. I moved to Chicago in 2008 and the Cubs won the division that fall. They were swept in the first round and a championship seemed further away than ever, with the drought guaranteed to surpass 100 years.

Because I began my time in Chicago living on the South Side, I started as a White Sox fan and never had many feelings about the Cubs. There was little doubt in my mind, though, that the Cubs were the dominant baseball team in the city in terms of fandom. When I moved to Irving Park in 2009, I became accustomed to the trains full of Cubs fans arriving at the nearby Metra station from the suburbs to take the bus to Wrigley Field. The losing persisted.

Many times, I wondered why Cubs fans bothered, not yet having reached my realization that, as for Red Sox and White Sox fans before them, the losing perversely made it fun, or at least unique, to be a Cubs fan. Like the Curse of the Bambino, the Billy Goat Curse was a triumph of marketing. Following that loss to the Detroit Tigers in the 1945 World Series, the Cubs for decades played second fiddle to the much more popular White Sox teams of the 1950s and 1960s (the 1959 World Series between the White Sox and the Dodgers remains the best-attended World Series of all time). No one was particularly aware of the Cubs’ title drought even as it passed 70 years in 1978.

Everything started changing in the 1980s. WGN launched its superstation programming, bringing Cubs games into living rooms around the country. Longtime Cardinals and White Sox announcer Harry Caray became the face of the Cubs, bringing his tradition of singing “Take Me Out to the Ballgame” during the 7th inning stretch to Wrigley Field. Steve Goodman wrote “Go, Cubs, Go.”

In the 1990s, the legend was cemented by Caray’s famous “someday the Chicago Cubs will be in the World Series, and it might be a lot sooner than we think” remark in 1991, and the team’s dismal 0-14 start to the 1997 season, which would prove to be his final one. In 1998, the franchise was also at the heart of the race to break Roger Maris’s home run record, with Cubs outfielder Sammy Sosa hitting 66 home runs that year.

Even then, though, the Cubs’ drought, unlike the Red Sox’s, was not especially known for near-misses and heartbreak. The team had appeared in the NLCS only twice since the LCS round was instituted in 1969. They blew a 2-0 series lead to the San Diego Padres in the 1984 NLCS, which was then a best-of-5 format – a surprising, but hardly unheard of, collapse. They were easily dispatched by the San Francisco Giants in the 1989 NLCS.

It’s true that in 2003 they were snakebitten. With a 3-games-to-1 NLCS lead over the Florida Marlins – a team that had at one point that season been 10 games below .500 and was managed by the eccentric 72-year-old Jack McKeon – they were shut out in Game 5, then blew a 3-0 lead in the 8th inning of Game 6 after a controversial incident with a fan reaching for a foul ball. The Marlins won Game 7 and then their second World Series title by defeating the Yankees the next week.

Like the Buckner incident, the “Bartman game” (Game 6) has had many of its vital details airbrushed. The foul ball was probably not catchable. Cubs starting pitcher Mark Prior had thrown over 100 pitches by the 8th inning and unsurprisingly lost his control, walking that same batter as ball four skipped past the catcher. Shortstop Alex Gonzalez then botched a likely inning-ending double play. The Marlins scored an astonishing 8 runs in that inning alone.

Like other “cursed” teams, the Cubs were ultimately victims of two contradictory trends, more so than these crazy one-off incidents:

  • Until 1969, only one team from each league made the playoffs (and until 1995, only two). This limited a team’s chances unless it had the best record in its division or league. Some Red Sox teams of the late 1970s were shut out of the playoffs despite winning close to 100 games, because the Yankees were just a bit better.
  • But baseball was also expanding rapidly, and more teams make it harder to win a title in any given year. The Marlins only joined in 1993, for example. Expansion has meant that there are many teams (8, to be exact) that have never won a title and likely won’t for years. Already, the Rangers and Astros have existed for 50+ years without a World Series title. The Mariners and the Nationals have never even won a pennant. (A back-of-the-envelope sketch of this effect follows the list.)
  • What’s the difference between a 50-year-old pre-2016 Cubs fan and, say, a 50-year-old Milwaukee Brewers fan? Neither had seen a title in a lifetime (the Brewers have never won the World Series). The Cubs “curse,” compounded by lack of opportunity as well as expansion, lasted so long that it became impersonal. Only about 100 people on earth alive as of Nov. 4, 2016 were confirmed to have been born on Jan. 13, 1906 or earlier, which is probably the minimum for having been sentient the last time the Cubs won in 1908. It was as if they had never won at all.
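To make the expansion point concrete, here is a deliberately crude sketch – my own toy model, not anything from the sources above – that assumes every team has an equal 1-in-N chance of winning the title each season. Under that pure-parity assumption, long droughts stop looking like curses and start looking like arithmetic: the more teams there are, the more likely it is that somebody goes half a century without a ring.

```python
# Toy parity model: every one of n_teams wins the title with probability 1/n
# each season, independently. This ignores payrolls, playoff formats, and
# everything else real; it only illustrates how expansion stretches droughts.

def drought_probability(n_teams: int, years: int) -> float:
    """Chance that a given team wins zero titles over `years` seasons."""
    return (1 - 1 / n_teams) ** years

for n_teams in (16, 30):  # 16 MLB teams before the 1961 expansion; 30 today
    p = drought_probability(n_teams, years=50)
    print(f"{n_teams} teams: {p:.0%} chance a club goes 50 years without a title")

# Expected number of title-less clubs after 50 seasons of a 30-team league
print(f"expected 50-year droughts in a 30-team league: {30 * drought_probability(30, 50):.1f}")
```

By this admittedly crude reckoning, a 50-year drought goes from a roughly 4 percent fluke in a 16-team league to nearly a 1-in-5 outcome in a 30-team league, with five or six clubs expected to be mid-drought at any given time – the same ballpark as the eight title-less franchises mentioned above.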

Melville has another good quote for this too, one that I think of even more so than the others I have cited here:

“When I stand among these mighty Leviathan skeletons, skulls, tusks, jaws, ribs, and vertebrae, all characterized by partial resemblances to the existing breeds of sea-monsters; but at the same time bearing on the other hand similar affinities to the annihilated antichronical Leviathans, their incalculable seniors; I am, by a flood, borne back to that wondrous period, ere time itself can be said to have begun; for time began with man. Here Saturn’s grey chaos rolls over me, and I obtain dim, shuddering glimpses into those Polar eternities; when wedged bastions of ice pressed hard upon what are now the Tropics; and in all the 25,000 miles of this world’s circumference, not an inhabitable hand’s breadth of land was visible. Then the whole world was the whale’s; and, king of creation, he left his wake along the present lines of the Andes and the Himmalehs.”

To be a Cubs fan was to stand constantly amid the “mighty Leviathan skeletons” of their two titles (1907 and 1908) from the Theodore Roosevelt administration, seeing the “partial resemblances” of the dead-ball-era game to today’s multimillion-dollar MLB juggernaut, thinking about their “incalculable seniors,” many of them long since perished waiting for a Cubs title, letting your thoughts bear you back to “that wondrous period” before Wrigley Field (the second-oldest park in the majors, finished in 1914) was even built, indeed before time itself for anyone currently living, obtaining only “dim, shuddering glimpses” into what it must feel like to celebrate a Cubs title, and imagining an entire world that was yours for a day as you basked in your post-championship euphoria.

Those two kids running through the third-floor corridor were probably heading for the quadrangle. I didn’t follow them. But they were also running into the past, letting “Saturn’s grey chaos” roll them back to a reconstructed past they never lived through, a virtually ancient New England where the Red Sox were somehow the world champs. Would it feel that good this time, in 2004?

I had no rooting interest in the Cubs-Indians World Series this year. But once the game pushed into extra innings, I remembered 1997. That year, the Indians lost Game 7 in extra innings to the Marlins – exactly the situation in 2016, except against the Cubs. I had been rooting for the Indians all postseason that year, watching the games with my grandfather at his house. When the Marlins got the Series-winning hit, it felt like a gut punch; I’ve never really cared about any sports outcome as I did that one, when I was still an impressionable 11-year-old. This time came close, since the circumstances were so similar, at least on the TV screen. I kind of miss getting so wrapped up in somewhat meaningless things like sports fandom now. I also missed him, and wondered what it would have been like for him to live to see all the curses – even the great Leviathan itself, the Cubs drought – finally end, with me 30 years old and sitting next to my dad on the couch in our North Side Chicago house.


DDoS

Sameness
Some writers write the same piece their entire lives. Sometimes, the repetition is fun for the audience. The abstract novelist Will Self has recycled the same characters and inimitable style across many of his books, yet the effect is never less than bizarre and original (admittedly, Self is abstruse and not everyone will make it past a single page, but there’s no accounting for taste, etc.).

But then there’s the work of people like Joel Kotkin, who has for years written about how the dense urban areas of the U.S. (e.g., New York, Chicago, D.C., San Francisco) are in decline because so many people are moving to sunbelt cities, which are cheaper because the urban cities are so expensive because everyone wants to live there… you can probably see how this argument refutes itself.

Similarly, many tech bloggers like Ben Thompson of Stratechery keep trotting out the same arguments about “the Internet” in what feels like an interminable series of posts stretching all the way back to the advent of the World Wide Web (in reality, he’s somehow only been blogging since 2012). His pet argument is about how the Internet has ruined “distribution” as a business model, citing the decline of the newspaper industry in particular, which could not keep up once its printing presses, local advertising networks, and distribution trucks became enormous liabilities compared to the instantaneous delivery of Google and Facebook.

This reasoning ignores how reliant even companies such as Amazon – which Thompson cites as a key beneficiary of the Internet’s “free” distribution – are on physical logistics and on massive, expensive, environmentally corrosive data centers. What if those buildings packed with servers some day become as obsolete as the newspaper infrastructure that he regards as passé?

Attack
On Oct. 21, 2016, many major websites, including Spotify, Reddit, and Twitter, were down for hours as a massive distributed denial-of-service (DDoS) cyberattack overwhelmed these very same infrastructures. DDoS is a dense concept that is beyond the scope of this blog, but to explain it as simply as I can: It involves machines (PCs, servers, anything with an Internet connection) sending tons of useless requests to websites. This flood of traffic makes it impossible for the targeted websites to process legitimate requests. For the layman, this means that you try to go to “twitter.com” and instead you get an error and the page never finishes loading.

It is hard to imagine how a similar attack would play out on “legacy” communication networks like the postal system or the plain old telephone grid. I mean, imagine if the post office got so much junk mail each day that it couldn’t even deliver any of your mail, or anyone else’s, and you’re close to grasping the insanity of a DDoS attack. The Internet is uniquely exposed to danger in this way.

A key enabler of the Oct. 21 attack was a botnet, which is simply an interconnected set of machines that have been hijacked and programmed to do harm, typically by flooding websites with bogus traffic. As more and more devices become connected – home appliances, vehicles, and the like – the potential pool of enslavable botnet machines grows, making ever-more devastating DDoS attacks possible.
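To make the scale problem concrete, here is a minimal sketch of the most naive defense – counting requests per source address over a sliding window – and why a botnet sidesteps it. This is my own illustrative toy (the class name, limits, and addresses are all invented; nothing here describes how Dyn or any of the affected sites actually works): each hijacked device stays politely under the per-source limit, yet the aggregate flood still buries the server.

```python
# Toy sketch of naive per-source rate limiting (all names and limits invented
# for illustration; not how any real service is implemented).
import time
from collections import defaultdict, deque
from typing import Optional

MAX_REQUESTS = 100   # requests allowed per source IP per window
WINDOW_SECONDS = 60  # sliding-window length

class RateLimiter:
    def __init__(self) -> None:
        # per-source timestamps of recent requests
        self._history: dict[str, deque] = defaultdict(deque)

    def allow(self, source_ip: str, now: Optional[float] = None) -> bool:
        now = time.time() if now is None else now
        window = self._history[source_ip]
        # discard timestamps that have aged out of the window
        while window and now - window[0] > WINDOW_SECONDS:
            window.popleft()
        if len(window) >= MAX_REQUESTS:
            return False  # this single source is flooding us; reject it
        window.append(now)
        return True

if __name__ == "__main__":
    # A toy "botnet" of 1,000 distinct addresses, each sending a modest 50
    # requests: every request passes the per-source check, but the server
    # still has to absorb all 50,000 of them. Scale the device count up to
    # the millions of hijacked gadgets available in 2016 and the per-source
    # limiter never even notices the attack.
    limiter = RateLimiter()
    botnet = [f"10.0.{i // 256}.{i % 256}" for i in range(1000)]
    accepted = sum(limiter.allow(ip, now=0.0) for ip in botnet for _ in range(50))
    print(f"accepted {accepted} of {1000 * 50} bogus requests")
```

Real mitigations – scrubbing services, anycast, massive over-provisioning – exist precisely because traffic that looks individually legitimate cannot be filtered this cheaply, which is part of why combating these attacks gets so expensive.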

I only veer into the DDoS case to emphasize that “the Internet” is A) not new and B) not necessarily permanent. Commentators such as Thompson still speak of the Internet in terms of “revolution,” with prose treating it as something new, when it has existed for decades. The World Wide Web predates NAFTA and the Super Nintendo Entertainment System. Wi-Fi was approved by the IEEE the same year Bill Clinton was sworn in for a second term. Ethernet was first commercialized while the Summer Olympics were being held in the Soviet Union. Someone who joined Facebook on its first day of availability would be at least 30 years old now. The Internet is old.

Permanence
As for permanence, I’m talking not so much about how websites can go down or be deleted forever, but about how the Internet itself, as a global, homogeneous system with Americentric features, may not be long for this world. The Oct. 21 attack was aimed at U.S. services, and with a vast, mature pool of devices now out there to enlist into botnets – again, the result of decades of Internet existence – more events like this one, with entire days of major websites being unavailable, are almost inevitable. Combating them could become costly to the point of making routine website visits onerous. Enjoy the Internet, because like anything else it won’t last.

Olympic medal tables and “decline” narratives

In 1988, the Soviet Union and East Germany dominated the podium at the Summer Olympics, leaving the U.S. and every other country in their wake. They won a combined 234 medals, or 32 percent of all medals awarded in Seoul that summer. A little more than 3 years later, neither country existed.

Twenty years after that, China won an astonishing 51 gold medals in Beijing, 15 more than the second-place U.S., which had easily topped that category in 1996, 2000, and 2004. The surge was a perfect complement to books in vogue at that moment, including Martin Jacques’s hyperbolic “When China Rules the World.” Yet this year, China managed a mere 26 – not bad, and good enough for third, but behind the 27 for Great Britain and 46 for the U.S., two countries whose combined populations are only about a third of China’s – and which together are often described as being in terminal geopolitical decline.

Medals and geopolitics

Olympic medal tables have indeed often been read as geopolitical commentary. For instance, the mid-20th-century tables seemed to reflect the dominance of the U.S. and the U.S.S.R., as expected during the Cold War – the latter in fact remains 2nd all-time in Summer Olympics golds, despite not having competed since 1988. Nazi Germany was the clear winner of the 1936 haul right as it was becoming an expansionist power, and Britain – perhaps owing to its long imperial history and diverse sporting culture – is the only country to have won a gold at every Summer Olympics.

In Rio de Janeiro, the medal table was topped by the U.S. and Great Britain (Northern Irish athletes may compete for the Republic of Ireland at the Olympics, one reason “U.K.” is not used as the team identifier), two countries that, as I noted earlier, have been pegged as in “decline” for decades. In Britain’s case, decline has been recognized since at least the end of WWII, then given a finer point with the handover of Hong Kong in 1997, and finally turned into humiliation with the isolationist Brexit referendum. For the U.S., decline has been a constant concern, whether in the context of burgeoning Soviet strength in the 1950s and 1960s or the economic “malaise” of the 1970s and early 1980s.

Does the recent sporting dominance of these two English-speaking countries say anything about their geopolitical staying power? The U.S. and GBR are the first and fifth largest economies by nominal GDP, respectively, so one might expect them to at least have ample economic resources to pour into their sporting programs. But neither country is particularly distinguished at soccer, the only game with an international event (the World Cup) that can rival the Olympics’ prominence. They can’t even keep up with the likes of Argentina and the Netherlands there, both much smaller countries and economies.

It’s possible that the U.S. and GBR in 2016 could be like the U.S.S.R. and East Germany in 1988, with their exploits on the medal table largely independent of their “declining” status as great powers. Alternatively, perhaps their success hints at underrated strengths.

Decline or not?

The long-term narrative of “globalization” is often cited to explain both the decline of the British Empire’s once massive reach and the short-by-comparison postwar geopolitical dominance of the U.S. But as the sociologist Pierre Bourdieu noted, globalization is not so much homogenization as a proliferation of the power of a handful of already-powerful nations, especially in terms of their financial clout. In 2016, New York City and London remain as dominant as ever as financial centers, having been strengthened by decades of deregulation, policies favorable to capital mobility (but crucially not to labor mobility), and the spread of high-speed IP networking (e.g., the Internet).

Meanwhile, scholars such as Michael Beckley have made the contrarian argument that in areas such as military capabilities, the gap between the U.S. and everyone else is actually getting wider, not narrower, and that the perceived transfer of power to the developing world because of offshored manufacturing is mostly an illusion. That is, many of the goods produced in China and Southeast Asia are overseen by foreign firms, which specify the designs in question.

The issue with assessing any decline narrative, whether informed by Olympic medal-table reading or not, is that it has often been difficult to figure out just how far a country has actually declined (or not). The Soviet collapse of 1991 was wholly unexpected, even by the CIA. Japan’s 60-year transformation from a WWI ally to a WWII Axis power to a “Westernized” industrial power could scarcely have been imagined in 1910.

Maybe the U.S. and GBR really are on the verge of late-capitalist collapse, an echo of the crumbling planned economies that loomed over the Eastern Bloc amid the glories of those Seoul Olympics. Or perhaps they’re just their same old selves from 1908, before any of the turmoil of the 20th century, when they finished 1-2 with a combined 193 medals at London.

 

Uber, Lyft, and “legacy” business models 

Years ago, the Tumblr of someone named Justin Singer offered some of the most sophisticated criticism to date of ride-sharing in general and of Uber in particular. He deconstructed the short-lived Uber talking point about UberX drivers making $90,000 per year and contextualized the service’s rise as part of the growing commodification of the taxi industry:

“The story of the for-hire vehicle industry has been one long march toward commoditization, with drivers always getting the short end of an increasingly smaller stick.”

One question to ask is why the “stick” here is getting “shorter” to begin with, despite the enormous pool of money filling up in Silicon Valley. Uber is an incredibly well-capitalized firm, having raised an astonishing $15 billion in equity and debt since 2009. That money is not trickling down to drivers, though, and Uber itself, even with all of that cash, is essentially a middleman between ride-seekers and independent contractors. Many of its drivers may be making minimum wage or, worse, running at a loss. Uber is a confidence game in which drivers collectively overlook the costs that they must shoulder to participate.

Anyway, that $15 billion is even more astonishing when considered against the recent size of tech funding as a whole. From 2012 to 2015, total private funding in tech was $138 billion. Meanwhile, Apple paid out over $160 billion in dividends and buybacks over that same period. Uber is both a huge chunk of all tech-related funding and, like Apple, an extremely efficient redistributor of wealth upward (i.e., toward its investors) – a model of shareholderism.

So in the midst of so much jargon about “entrepreneurs” and “innovators,” vast sums of money are going toward 1) extracting money from the existing taxi and limousine infrastructure and 2) paying shareholders (explicitly in Apple’s case, preemptively in Uber’s).

But the banality of the ridesharing economy is perhaps best seen in the fact that it is trying to reengineer public transit to be less efficient (tons of private cars instead of buses) and more expensive (a public good turned into a private rent inevitably becomes this). “Innovation” is apparently mostly about privatizing BART, or as Anil Dash has put it, “converting publicly-planned metropolitan transportation networks into privately-controlled automated dispatch systems.”

The reason I often put these buzzwords in quotes is that they now seem emptied of any clear #content. John Pat Leary’s seminal series Keywords for the Age of Austerity has examined why, for instance, “entrepreneur” has become ubiquitous to the point of meaninglessness in business jargon. Similarly, the scraping-by wages of the gig economy supposedly represent “flexibility” and “autonomy,” while across the board, whether at Uber or Theranos, aggressive privatization and neoliberalism are instead just “technology” working out inevitable change that, as it happens, exacerbates inequality along predictable lines (college education vs. none, coastal cities vs. “flyover” country, etc.).

A major beneficiary of the Silicon Valley lingo, though, is the cottage industry of satirists who have taken it to heart. Good satire requires a predictable target, because A) the pattern of behavior provides clear material for ridicule and B) that predictability means future events are likely to only strengthen the satire’s long-term resonance. This is why ironic internet accounts such as Carl Diggler (a fictional character who writes columns at cafe.com and has his own podcast) and @ProfJeffJarviss work so well.

Diggler, for example, set out to make fun of the “centrist” Both Sides Do It “beltway insiders” who think the fundamental goals of American politics are to cut Social Security and demonize Russia. His brand of satire has succeeded as political pundits have driven themselves crazy looking for ties between the Trump campaign and Vladimir Putin; consider the old piece he wrote about being held captive in Russia alongside this Josh Barro tweet about the country.

Meanwhile, @ProfJeffJarviss has spent years lampooning Silicon Valley VCs and CEOs with a variety of impressive rhetorical frameworks and tools: “Remember [name of a tech service that probably just launched yesterday],” as if to signal his ennui at even brand-new services that, to him, the ultimate tech snob, have already become passé; “Naive [a quoted tweet from someone making a common-sense point],” to wave away even rational arguments against “innovation”; and “Very innovative of [company name] to do [trivial thing that is framed as a game-changer],” to elevate the prosaic to the plateau of “tech.”

But his real genius reveals itself in the normal, everyday actions of his targets, most notably “journalism professor” Jeff Jarvis, who is seemingly predestined to have Twitter meltdowns about why Hillary Clinton is “smart” to avoid press conferences, or to take a quote out of context and proclaim Sarah Silverman’s DNC speech the “best political speech ever” (instead of “the best political speech ever given by a celebrity,” which is how it was actually described – quite a difference, yeah?). He makes a fool of himself without even needing the @ProfJeffJarviss foil, and so the parody only reads better and better over time.

In any case, @ProfJeffJarviss recently showed how satire is serious business when he tweeted the following about Uber and Lyft:

 

[Screenshot: @ProfJeffJarviss tweets about Uber and Lyft, including the “legacy pricing” and “freemium” tweets discussed below]

There is a lot to unpack here. In the “legacy pricing” tweet, his use of that epithet for Uber’s current model of pricing rides according to an algorithm is deft, since it frames something so often touted as uber-cutting-edge – opaque “algorithms” – as laughably outdated next to just giving something away for free, which is often what Lyft and Uber do anyway when they run aggressive promos and “first ride free” deals. It’s possible that the enormous price cuts both services provide, bankrolled by their massive capitalizations, are more important to their success than any “algorithm” cooked up by a programmer-genius.

The “freemium” tweet is more complex. The “rudeness” of asking for money that he alludes to is central to the modern economy, in which it is considered impolite to frame your search for a job as being about getting the money you so obviously need to survive in a capitalist society. Instead, “passion” and “dedication” have to be at least feigned, if not converted into a sort of secular religion of individualism. Tips are nice under this ideology, but what really matters is “#creating” “#value,” the hashtags both markers of the empty jargon of so much social media terminology, which prioritizes vague concepts – “engagement,” “thought leadership” – over the concrete notions of money that are supposed to be so central to the economy!

Given how little most Uber and Lyft drivers earn, @ProfJeffJarviss isn’t wrong to say that what they are really doing is performing an elaborate routine to awkwardly signal their inclusion in the nebulous “tech” world. They’re not earning $90k a year, but they are #engaging passengers and challenging “legacy” industries such as taxis, apparently. Still, there is something extremely old-fashioned and “legacy” about even these ridesharing startups, which subsist mostly on the laissez-faire brand of capitalism and sheer force of investment capital that were so instrumental to the business monopolies of the early 20th century. “Legacy pricing” – that’s what we get with each $5 Uber ride, underwritten by the old-school investment power of Google, Goldman Sachs, et al.

On the brightside…

One of the genuinely unpredictable oddities of this U.S. presidential election cycle has been nostalgia for the Cold War – from the ostensible “left” of the American political spectrum, of all places. Usually, spinning fever dreams of a renewed rivalry between the U.S. and Russia is something voters associate with the “right,” e.g., Mitt Romney calling Russia “our number one geopolitical foe” in 2012. But this time around, it has been Democrats battering Republicans, and Donald Trump in particular, for their ties to Vladimir Putin. It’s a mix of Manchurian Candidate-style conspiracy theory and what I can only guess is the decades of anti-Russian indoctrination drilled into the Baby Boomer and Gen X generations, who grew up when the U.S.S.R. still existed, coming back to life like some capitalistic vampire out for new blood.

Why would anyone not insane have fond memories of the Cold War? It brought civilization to the brink of destruction in 1962 (and likely at other points we don’t even know about). But it still seems to give 40-something bespectacled GOP pundits as well as PR-firm shills hard-ons (and make no mistake, these are predominantly if not almost entirely male subgroups we’re talking about here) thinking about Washington and Moscow rattling sabers in Ukraine or Venezuela. There will have to be a reckoning at some point in U.S. foreign policy, which for so long has coasted on endless spending against imaginary foes, including the hollowed-out shell that is 21st-century Russia.

So if even the Cold War, with all of its apocalyptic overtones, can be rehabilitated by our collective dark nostalgia, what can’t be given a favorable coat of paint? Anything? There are moments from my own past that I can recognize as “inferior” to the present, and whose lived experience I recall as just awful – the endless afternoons of 2010 and 2011 despairing in a studio apartment, the painful ends of relationships – but for which, looking back, a certain fondness can be conjured up. In trying to come to terms with this, uh, non sequitur, I often think back to this Margaret Atwood line from Cat’s Eye:

“I can no longer control these paintings, or tell them what to mean. Whatever energy they have came out of me. I’m what’s left over.”

I think the past is not just what is being remembered but also what is being re-experienced; your brain is not a computer, retrieving memories, but something that needs to be conditioned, like an athlete, to perform in certain ways under specific sets of circumstances. In this sense, the past is similar to an artwork, which regardless of its particular character can be approached from different pathways – a new area of visual focus in a drawing, or a dedication to listening more carefully to the bassline on a song that you never really listened to closely before – each time. But the visceral feeling toward it, the “meaning” it has in the mind as a series of associations with the minor and major details of the moment, is to some degree uncontrollable.

I mean, I was thinking about Daft Punk’s much-maligned third album Human After All the other day. It was released at a pivotal time in my own life, March 2005, when it felt like the stresses of adult life were first starting to register in my previously invincible teenage consciousness. I am certain that if I were to go back and replay March 14, 2005, I would be exhausted, confused about whether I had made the right college choice, on the verge of taking a nap at any given moment, and lusting for the release of Ambien in the evening. I wouldn’t be reveling in the sounds of “Human After All” or “Technologic.”

Yet in my memory it seems sunny, a set of effervescent scenes through which my younger self strides to a soundtrack of “Robot Rock” (from that album) and my roommate’s obsession with The Killers song “Mr. Brightside,” which I remember only sort of liking then but which I adore now, if only because of its proximity to my current favorite moments from that era. Likewise, 2010 – when I started my first teaching job, for nearly no pay, and began to listen to the album in depth – was a similarly trying time that I would never want to relive. Still, in my memory it’s the formative period when I got into that album, began to rethink my previous faith in music criticism, and, oh yeah, finally dug myself out of my post-college pit (though subsequent things had to go right as well).

And just as these at-the-time terrible circumstances have been turned around years later, owing to myriad details I could not totally control, there are plenty of “perfect moments” of relaxation and sunshine that I passed through, thought I would cherish forever, and now cannot even remember.

This is where I think the “I’m what’s left over” part of Atwood’s quote comes in. Ironically, maybe it’s because the problematic moments took such a pound of flesh out of me that I’ve become inexplicably nostalgic for them. They made me, in some way, to a greater degree than the idyllic ones, and I can realize their effects by seeing my current self – with its ever-shifting and reconstructive attitude toward the past, which will remake even it in time, perhaps leading it to re-darken the same moments it had previously let the light in on, who knows – as what’s left over, a sculpture in conversation with all of the pieces that were knocked off along the way.

I can try to give what happened meaning, but the effort seems in vain: I’m as likely to think that the miserable snowy winter of 2005-06 was a golden age of youth as I am to forget almost all of the details of the then-glorious, life-affirming college graduation and decide it was just a gateway to adult misery. Nostalgia kind of works this way across the board, I feel: For example, even though I can now get almost any song ever recorded streamed to my phone, I sometimes pine for the days in my dorm when everyone’s iTunes libraries were shared over the LAN and I could peek to see if Bright Eyes, or even “Mr. Brightside,” was in there.

Look, I have no idea if this is the labyrinthine memory-struggle reckoning that has driven 40- and 50-something political commentators to wish that the world were every day on the cusp of nuclear war. Maybe it’s a way of retrospectively looking on the bright side – sort of “swimming through sick lullabies,” a lyric from “Mr. Brightside” that I always thought was a good description of revisiting quaint subject matter with a newly darkened outlook – to play self-defense against one’s own past…