The famous William Faulkner quote about “the past not even being past” has staying power not only because it contests the idea that time is a one-dimensional line that moves “forward,” but also because it reveals how ancient decisions shape our lives even to the current second. Most people alive today weren’t even born when Margaret Thatcher became Prime Minister of the United Kingdom in the 1979 general election. She died years ago. But her ideas are very much alive.
Thatcher ended decades of postwar consensus that had seen the rise of social welfare systems across Europe and North America. Her zeal for high defense spending, low taxes, and less regulation kept Labour out of power for a generation while providing a blueprint for the ascendance of Ronald Reagan – who would take power less than two years later – across the Atlantic.
Many of us have no recollection of a time when it wasn’t assumed that everything had to be run like a business, in a “competitive” environment in which everyone is on her own and “the government” is some dark entity that must be reduced, instead of the people and institutions that make life bearable. Sure, these ideas had long gestated among the economists of the morally bankrupt Chicago School (mainly Milton Friedman), but Thatcher turned their academic papers into reality, crushing the miners’ unions and setting off a prolonged run of privatization and deregulation.
Even the distinctive brand of military adventurism that has fascinated Western governments and cable news channels since the Gulf War is derived from Thatcher’s decision to fight with Argentina over the Falklands. Almost all military campaigns since then – from Grenada to the Iraq War – have followed the same lead of confronting a clearly outmatched foe, to achieve morally and/or strategically dubious aims.
Although both the U.K. and the U.S. have had small-‘l’ liberal governments post-1979 – Blair and Brown in Britain, Clinton and Obama in America – the truth is that the Thatcher consensus has gone largely unchallenged. The centrism of Blair and the rebranding of Clinton’s party as “New Democrats” were signals of how they operated as much within the Thatcher/Reagan mold as Eisenhower had within the constraints of the then-dominant New Deal regime. Blair’s affinity for military adventures in the Balkans and Iraq and Clinton’s willingness to pursue “welfare” “reform” were both ripped straight from the small-‘c’ conservative playbook. It’s no accident that Thatcher herself identified New Labour – with its scrubbed mentions of national ownership of industry in its party constitution – as her greatest achievement.
The two countries have followed similar paths for the last 40 years. Both Thatcher and Reagan decisively won all their general elections and then handed the reins to their competent but less charismatic successors, John Major and George H.W. Bush, respectively. Those two continued in a similar but slightly more moderate vein, only to lose in landslides in the 1990s to candidates (Blair and Clinton, respectively) from revamped center-left parties (i.e., Labour and the Democrats), institutions that would have been unrecognizable to party rank-and-file in the 1970s.
While both Blair and Clinton were electoral juggernauts, they both had much less success than either Thatcher or Reagan in laying the groundwork for their successors. Blair resigned and helped the unpopular Gordon Brown become prime minister; Brown lasted not even three years, losing power to a Tory-LibDem coalition in 2010. Clinton’s efforts could not get his VP Al Gore or his wife Hillary Clinton over the finish line. Like Brown, they both lacked the political acumen or popularity to stop the reactionaries who narrowly defeated them (Cameron in the U.K., Bush 43 and Trump in the U.S.). The only point at which the two histories diverge is with Obama, but he largely governed within the standard Reagan model, with sprinkles of Clintonism, including many of Clinton’s own personnel.
At a glance, Thatcherism and its numerous derivatives seem to be in strong health. Both the Conservatives in the U.K. and the GOP in the U.S. control the government. Both continue to pursue the same right-wing policies of the 1980s, arguably with even more aggression than their predecessors – just look at Theresa May’s fixation on a “hard Brexit” (that is, with maximal breaks from EU immigration rules and economic integration) and Trump’s almost comically plutocratic commitment to taking away people’s health insurance to finance tax cuts for billionaires.
“Comical” – there is something weirdly humorous about what the right-wing parties of the West have become, though, isn’t there?
The Conservatives campaigned on a platform of “Strong and Stable” leadership, but their last two PMs – Cameron and May – have taken monumental gambles (the Brexit referendum and the 2017 snap election) that spectacularly backfired. Having lost their parliamentary majority to a Labour surge led by one of the furthest-left MPs in Britain – Jeremy Corbyn, whom they labeled a “terrorist sympathizer” – they must now form a coalition with the Democratic Unionist Party of Northern Ireland, a hard-right creationist party with deep ties to loyalist paramilitaries (a fancy term for white terrorists who kill Catholics).
Meanwhile, the GOP, the home of the heirs to Jerry Falwell’s Moral Majority and nominally the party of strong national defense, is led by a former reality TV host who once went bankrupt running a casino (“you mean people just come in and give you their money in exchange for nothing? Sorry, I’m going to need a better business model” – no one ever, except possibly Donald Trump) and has been caught on tape confessing to routine sexual assault. Plus, party membership from top to bottom is deeply enmeshed with Russian spies and businesses.
And both parties have lost control of the issue of “terrorism,” once easily controlled by right-wing leaders like George W. Bush, to the point that May is literally negotiating with loyalist militias and the GOP has cheered an ISIS attack against Iranian civilians.
Whether these flaws matter to core right-wing partisans is debatable, but it is clear that the ubiquity of conservative policies and their demonstrable failure – visible not only in May being forced to align with terrorists and Trump with Russian autocrats, but also in the collapse of the global banking system after decades of Thatcherite deregulation – has energized the left in a way that had nearly passed out of living memory.
Bernie Sanders is the most popular politician in America and ended up a few hundred delegates short of snagging the Democratic nomination and likely becoming president. Corbyn went even further and humiliated May, turning predictions of a massive Tory majority heading into Brexit negotiations into a hung parliament. Given the tenuous Tory-DUP coalition, it is probable and perhaps inevitable that Corbyn will eventually be PM.
Both of these men are senior citizens who for most of their careers were dismissed as “unserious” leftists who would never enter the mainstream. Instead, they have a golden opportunity in their twilight years to finally eradicate Thatcherism root and branch by unseating two truly awful politicians. May and Trump are almost like the store-brand versions of Thatcher and Reagan, far less compelling despite their surface resemblance (May as another “Iron Lady,” Trump as another celebrity GOP president and “Great Communicator”; in reality, they are the Tin Lady and the Great Covfefer).
The level of media anger and disbelief at both Corbyn and Sanders deserves its own entry, but it is ultimately indicative of how much of both British and American society remains captured by Thatcherism and “centrism,” neither of which has been seriously challenged until now. New Labour, New Democrats: it’s all eroding and exposing the decrepit foundations of Thatcherism and Reaganism. The fact that the “terrorist” attacks in Britain did not hurt Labour but instead exposed the incompetence of Tory security policy (as Home Secretary, May literally cleared one of the London Bridge attackers to go fight in Libya!) was a turning point in how I viewed the staying power of Western conservatism. It seems weak, its unpopular ideas barely propped up by cynical appeals and a dying electoral coalition. After almost 40 years, we’re finally seeing what Winston Churchill – that old Conservative – might be comfortable labeling “the beginning of the end” of Thatcherism.
“Predictions are hard, especially about the future.” – Yogi Berra, but possibly apocryphal
Imagine living in Europe circa 1900. Someone asks you to predict the state of the world in 1950. Are you going to be able to tell them confidently that the continent at that time will be divided into two spheres of influence: One dominated by the United States of America and the other by a successor state to Tsarist Russia modeled on a militarized version of Karl Marx’s philosophy, all of this having taken shape after the second of two catastrophic wars, the most recent one having ended with the U.S.A. dropping a pair of radioactive bombs on Japan that killed hundreds of thousands of civilians?
If your prediction was way off in 1900, you would have been in good company. Conventional wisdom at the time maintained that the economies of Europe were too integrated to ever lead to war, much less a conflict that would first be deemed The Great War and then renamed after its successor was even worse. But there was one realm in which the catastrophe of World War I was foreseen with startling clarity: literature. H.G. Wells’ serialized 1907 novel “The War in the Air” contemplated the immense resources being poured into then-unprecedented war machines (emphasis added; note the prophecy of a decaying Russia and a militant Germany at the end, and the hints of the eventual end of the British Empire throughout):
“It is impossible now to estimate how much of the intellectual and physical energy of the world was wasted in military preparation and equipment, but it was an enormous proportion. Great Britain spent upon army and navy money and capacity, that directed into the channels of physical culture and education would have made the British the aristocracy of the world. Her rulers could have kept the whole population learning and exercising up to the age of eighteen and made a broad-chested and intelligent man of every Bert Smallways in the islands, had they given the resources they spent in war material to the making of men. Instead of which they waggled flags at him until he was fourteen, incited him to cheer, and then turned him out of school to begin that career of private enterprise we have compactly recorded. France achieved similar imbecilities; Germany was, if possible worse; Russia under the waste and stresses of militarism festered towards bankruptcy and decay. All Europe was producing big guns and countless swarms of little Smallways.”
Why did Wells predict the carnage of World War I so accurately – and in a work of fiction, no less – while his peers were distracted by what they wrongly deemed a dawning golden era of global cooperation?
The question brings me back to an old hobbyhorse of mine: Google’s obsession with science fiction, a genre Wells was instrumental in modernizing. The company’s ambitious “moonshots” division once required that new projects have some sort of basis in or resemblance to sci-fi. Efforts such as flying cars, robots, you name it: all of it was a computer science exercise in catching up to the fantasies of pulp writers from decades ago. Hell, the dummy-piloted taxi cab from “Blade Runner” (a movie released in 1982) is still far out ahead of the billions upon billions of dollars being spent on self-driving cars today by Google and its peers.
Google is not alone; the tech industry often comes off as highly certain of what the future will look like. Predictions about the dominance of automated vehicles, “the rise of the robots,” and so much more are collectively the fuel upon which a thousand “influencer” conferences run. Such events and the companies that participate in them are at the same time highly dismissive of the value of humanistic education, instead prizing “technical” knowledge above all else. Yet the irony of them fervently chasing ideas from storybooks persists.
At some level, we all seem to trust in the power of fiction to tell us what the future is, whether we trust the explicitly “futurist” visions of sci-fi, or the eschatology of books such as the Bible and the Koran. In regard to tech in particular, I was startled a few months ago to read Rana Dasgupta’s “Tokyo Cancelled,” a 2005 novel that sort of retells the Arabian Nights – as well as various fairy tales, such as Bluebeard – for the 21st century.
In one of its discrete stories, a man accepts a new job as an editor of people’s memories. He curates thoughts that they have (which have been captured via surveillance) and puts together a retrospective to present to them on individualized CDs. However, he has to be careful to edit out bad memories:
“We have short-listed around a hundred thousand memories that you can work from. They’ve been selected on the basis of a number of parameters – facial grimacing, high decibel levels, obscene language – that are likely to be correlated with traumatic memories….Apply the logic of common sense: would someone want to remember this? Think of yourself like a film censor; if the family can’t sit together and watch it, it’s out.”
Now here’s a Facebook employee, in 2015, announcing the introduction of filters into its On This Day service, which sends you a notification each day linking you to your photos and status updates from past years:
“We know that people share a range of meaningful moments on Facebook. As a result, everyone has various kinds of memories that can be surfaced — good, bad, and everything in between. So for the millions of people who use ‘On This Day,’ we’ve added these filters to give them more control over the memories they see.”
So while Dasgupta was essentially predicting an advanced Facebook service at a time when Facebook itself didn’t even exist yet (“Tokyo Cancelled” was written well before 2005, and Facebook itself was launched in 2004), what were the leading lights of tech predicting? Um…
-Steve Jobs in 2003: music streaming services are terrible and will never work
-Reality: in 2016, streaming drove an 8.1 percent increase in music industry revenue, and virtually everyone has heard of or used Spotify and Apple Music
The gulf between Dasgupta’s futurism and these now-laughable predictions brings me back to the vitality of the often-maligned cultural studies fields. I am reminded again and again of how we have to think about culture as a whole – not just scientific advances, which are undoubtedly important to human improvement, but also the flow of literature, social mores, art, etc. – to sense where we are and where we are going. For example: Max Weber once positioned the Protestant work ethic – a totally incidental characteristic associated with adherence to a specific religion – as a central cog in the growing success of capitalism, which was reshaping Europe in his time. Yes, the Industrial Revolution and the creation of the steam engine, electricity, coal-fired ships, etc. were all vital to the creation of global capitalism, but would it have coalesced into a coherent social system without the cultural glue of Protestantism?
Just as Weber saw religion as an essential way to make sense of and corral new modes of industrial production, Dasgupta saw, by writing speculatively about it, the struggle to deal with information at vast scale (imagine all the CDs needed to contain the memories of the characters in “Tokyo Cancelled”) as a defining issue of the busy yet personally isolating environment of the modern international airport, in which the book takes place. When we give up on studying the humanities (and all “the channels of physical culture” whose underinvestment Wells bemoaned in the passage above), we create huge blind spots for ourselves and miss futures like these that should have been apparent to us all along, whether they sprouted from an Edwardian sci-fi novel or a 21st century fairy tale.
I haven’t published all year. That’s going to change: I have a few topics I’ll be looking at in the coming weeks to get back into things:
-How fiction is often the best predictor of the future, with a focus on Rana Dasgupta’s 2005 novel “Tokyo Cancelled”
-A new translation of the Aristophanes play “Wealth,” which I am producing with my former Greek language instructor. My focus on Aristophanes will also be a good chance to revisit one of my older posts about his play “The Frogs” and its treatment of literary criticism.
It seems absurd to think about, doesn’t it? After all, John McCain lost in a landslide to Barack Obama in 2008, winning a mere 46% of the vote while losing the entire Midwest and Eastern Seaboard with the exceptions of South Carolina and Georgia, in which he held on by single digits. Obama even won electoral votes in three states – Virginia, Indiana, and Nebraska – in which Democratic presidential candidates had been shut out since LBJ wiped out Goldwater in 1964.
The Obama victory in 2008 had two important causes: 1) the incompetence of the Bush 43 administration, which culminated in the late 2000s financial crash and 2) the charisma and focus of Obama’s messaging. Obama knew how to work specific issues, such as opposition to Big Ag in Iowa and NAFTA in Ohio, better than any Democratic candidate since LBJ.
With these two drivers in mind, it’s actually not hard to imagine a situation in which McCain could have prevailed. I see three changes that could have enabled a McCain victory:
- The Democratic superdelegates, much like they did in 2016 with the Hillary Clinton and Bernie Sanders race, decide to heavily rig the primaries by suppressing media coverage of Obama’s insurgent candidacy, arranging odd debate schedules, and disproportionately pre-aligning themselves with one candidate. Clinton wins the primary, but fails to capture the “Hope and Change” spirit of 2008 and instead trots out something of similar dubious value to “America is Already Great.”
- Meanwhile, McCain stays on message and distances himself from the Bush administration, reminding everyone of his 2000 primary challenge to Bush and his disdain for conservative institutions such as the Christian right. He picks a relatively low-profile swing-state GOPer like John Kasich as VP instead of Sarah Palin, who alienated millions. Aligned against both Bushism and Clintonism, he manages to become the “outsider” despite being a member of the incumbent presidential party.
- The collapse of Lehman Brothers, which really propelled Obama’s candidacy over the top, doesn’t happen until December 2008, by which time the election is already settled. This is the hardest of all the changes to imagine, but bear with me.
So McCain defeats Clinton and enters office in January 2009. What next?
Many policies such as the stimulus bill would have still gone through on his watch, with the help of a moderate Democratic majority in both houses of Congress. Healthcare reform probably would not have happened, though.
The biggest mystery, though, is what would have become of the mortgage crisis he inherited from Bush. The wide-reaching economic despair that the financial meltdown wrought on the entire country would likely have continued for years, as it did under Obama, assuming an even quasi-typical GOP response of tax cuts and bailouts for banks. It would have, in other words, become fertile ground for various dissent movements.
Indeed, this situation could have profoundly reshaped the 2010 midterms, which in reality turned out to be a landslide for the newly formed Tea Party. Would the Tea Party have even emerged without the monolithic target of the Obama administration and the Democratic Congress of 2009-2010, both of them overseeing the reeling economy? Would a Tea Party of the Left have sprouted up instead, perhaps spearheaded by Bernie Sanders (who toyed with the idea of running for president in 2012)? Would a 76-year-old McCain have been able to win re-election with a rickety economy and potentially gridlocked Congress in 2012?
Considering the political situation in the U.S. after 2016, it’s tempting to imagine that the fallout from a McCain administration – with the GOP owning the tumultuous early 2010s – might actually have forestalled the party’s descent into madness and left the country on sounder institutional footing. But the price would have been paid in many people’s lives and rights, especially those of vulnerable populations such as the poor and the LGBTQ community, who might not have seen the particular advances of the Obama administration.
I plan to map out a few of these counterfactual scenarios about politics in future posts. This one, about 2008, is the one nearest to my heart, though, since it’s the first time I was ever excited about a presidential race, and it all happened at a pivotal moment in my life, when I was moving to Chicago for the first time. I cast an absentee ballot for Obama in Kentucky. However, his first term coincided with the hardest years of my life, when I struggled to find work. I don’t think my life would have been easier under a McCain presidency, but sometimes I wonder about the implications.
Doppler radar-like, I could hear it coming and I could feel it passing just by. Two Massachusetts kids raced through the third floor corridor of our dorm, wordless but louder than silence. Stationary, I stared at the red and green mess on the TV. David Ortiz had just delivered his second walkoff hit in as many nights to force a Game 6 in the 2004 ALCS between the Boston Red Sox and the New York Yankees.
A week later: My then-Intro to Greek instructor (and now friend, for 12 years running) wondered to our class if everyone had gone crazy because of the lunar eclipse that October. Nope. The Red Sox had just sent all of Rhode Island into a frenzy by sweeping the St. Louis Cardinals to win the 2004 World Series, their first title since the year World War I ended. My English professor remarked that she knew the mother of then-Red Sox General Manager Theo Epstein, who was an instructor at Boston University.
“An ancient one”
Baseball is an old sport. The Chicago Cubs began operation in 1876 as the Chicago White Stockings. Before 2016, they had last appeared in the World Series in 1945 – a year before the NBA was founded. Their 1908 title predates both the NHL and the NFL.
Professional baseball’s 19th century origins have meant that there have been some epic championship droughts. The Red Sox did not win between 1918 and 2004, the White Sox from 1917 to 2005, and the Cubs from 1908 to 2016. Even the longest current drought – belonging to the Cleveland Indians, this year’s runners-up – dates to the Harry Truman administration.
During my years in New England in the mid-2000s, and especially during the fall of 2004, the anxiety expended on the Red Sox was heavy enough to send the university campus into frenzies of relief after each victory. Sometimes I thought of this seemingly throwaway quote from Moby-Dick:
“Almost universally, a lone whale proves an ancient one.”
Why did we – even me, an 18-year-old from Kentucky who grew up rooting for the Indians – think so much about this baseball team? Because they were basically alone in their futility, and it was some truly ancient futility, dating to a time when my oldest grandparents hadn’t hit double digits yet. The entire point of being a Red Sox fan was that you almost certainly had never seen a championship in your lifetime. Every game was life and (will we win before my) death. Generations of fans came and went, but that ancient whale – the Curse of the Bambino, traced back to the fateful sale of Babe Ruth from the Red Sox to the Yankees after the 1919 season – was very much alive, however immaterial.
“This grey-headed, ungodly old man, chasing with curses a Job’s whale”
But curses are ultimately just stories. The Red Sox curse broken during my first year in college was “only” the third longest at the time. Why was it so much more prominent than the longer White Sox and Cubs curses? I mean, Boston is one-sixth the size of Chicago. Both the Cubs and White Sox fanbases are substantially larger than Boston’s.
The answer: The marketing around the Curse of the Bambino was flawless. It combined specific superstitions – the Ruth backstory, the epic collapse to the New York Mets in the 1986 World Series, clips of which had been shown endlessly on ESPN in the school cafeteria during that year’s Red Sox run – with Boston’s longstanding inferiority complex compared to New York City. Being cursed, doomed to root for this always second-best team, was emblematic of being a New England sports fan.
I don’t know when the Red Sox drought in particular took on the momentum of a “curse,” but 1986 seems like a good candidate. Up 3 games to 2 on the Mets, the Red Sox were at one point just one strike away from a title. Instead, the Mets rallied to completely turn the series around. Infamously, with the game tied, a ground ball from Mookie Wilson slipped between the legs of Boston first baseman Bill Buckner and rolled into the outfield, sending the game-winning run home. Buckner was for years the face of Boston’s baseball failures.
It wasn’t his fault, though. Say he fields that ball cleanly: the inning ends with the game still tied, and the Red Sox bat again in the top of the 11th – though the Mets would have had the last chance to answer in the bottom half, since the game was at Shea Stadium. And guess what: Even with the loss, the Red Sox still had Game 7!
I wasn’t old enough to remember the Red Sox-Mets incident, but I did witness the team’s loss to the Yankees in the 2003 ALCS on Aaron Boone’s walk-off home run in extra innings. The Red Sox had led by 3 runs as late as the 8th inning, when Boston manager Grady Little – in Melville’s terminology, that “grey-headed, ungodly old man, chasing with curses a Job’s whale” – inexplicably allowed an exhausted Pedro Martinez to keep pitching to the Yankees, allowing New York to rally. After the game ended, I thought that maybe Boston was just always going to be second-best to New York, home of the two baseball teams (the Mets and the Yankees) that had prolonged years of New England sports misery.
“Saturn’s grey chaos rolls over me”
The next year, Boston broke through and then won again in 2007 and 2013. I was finishing college in 2007 and I don’t remember much excitement about that title relative to 2004. The Red Sox were just another team now. The White Sox also won during my undergraduate years.
Still, the Cubs drought persisted, that unrivaled Leviathan of sports curses. No titles since 1908. No World Series appearances since 1945. I moved to Chicago in 2008 and the Cubs won the division that fall. They were swept in the first round and a championship seemed further away than ever, with the drought guaranteed to surpass 100 years.
Since I began my time in Chicago living on the South Side, I started as a White Sox fan and never had many feelings about the Cubs. There was little doubt in my mind, though, that the Cubs were the dominant baseball team in the city in terms of fandom. When I moved to Irving Park in 2009, I became accustomed to the train full of Cubs fans arriving at the nearby Metra station from the suburbs, to take the bus to Wrigley Field. The losing persisted.
Many times, I wondered why Cubs fans bothered, not yet having reached my realization that, as for Red Sox and White Sox fans before them, the losing perversely made it fun, or at least unique, to be a Cubs fan. Like the Curse of the Bambino, the Billy Goat Curse was a triumph of marketing. Following that loss to the Detroit Tigers in the 1945 World Series, the Cubs played second fiddle for decades to the much more popular White Sox teams of the 1950s and 1960s (the 1959 World Series between the White Sox and the Dodgers was the best-attended World Series of all time). No one was particularly aware of the Cubs’ title drought even as it passed 70 years in 1978.
Everything started changing in the 1980s. WGN launched its superstation programming, bringing Cubs games into living rooms around the country. Longtime Cardinals and White Sox announcer Harry Caray became the face of the Cubs, bringing his tradition of singing “Take Me Out to the Ballgame” during the 7th inning stretch to Wrigley Field. Steve Goodman wrote “Go, Cubs, Go.”
In the 1990s, the legend was cemented by Caray’s famous “someday the Chicago Cubs will be in the World Series, and it might be a lot sooner than we think” remark in 1991, and the team’s dismal 0-14 start to the 1997 season, which would prove to be his final one. In 1998, the franchise was also at the heart of the race to break Roger Maris’s home run record, with Cubs outfielder Sammy Sosa hitting 66 home runs that year.
Even then, though, the Cubs’ drought, unlike the Red Sox’s, was not one well-known for near-misses and heartbreak. The team had appeared twice in the NLCS since LCSes were first instituted in 1969. They blew a 2-0 series lead to the San Diego Padres in the 1984 NLCS, which was then a best-of-5 format – a surprising, but hardly unheard of, feat. They were easily dispatched by the San Francisco Giants in the 1989 NLCS.
It’s true that in 2003 they were snakebitten. With a 3 games to 1 lead over the Florida Marlins – a team that had at one point that season been 10 games below .500 and was managed by the eccentric 72-year old Jack McKeon – in the NLCS, they were shut out in Game 5, then blew a 3-0 lead in the 8th inning of Game 6 after a controversial incident with a fan trying to catch a foul ball. The Marlins won Game 7 and then their second World Series title by defeating the Yankees the next week.
Like the Buckner incident, the “Bartman game” (Game 6) has had many of its vital details airbrushed. The foul ball was probably not catchable. Cubs starting pitcher Mark Prior had thrown over 100 pitches by the 8th inning and unsurprisingly lost his control, walking that same batter with ball four coming on a wild pitch. Shortstop Alex Gonzalez botched a surefire inning-ending double play. The Marlins scored an astonishing 8 runs in just that inning.
Like other “cursed” teams, the Cubs were ultimately victims of two contradictory trends, more so than these crazy one-off incidents:
- Until 1969, only one team from each league made the playoffs (and until 1995, only two). This limited a team’s chances unless it had the best record in its division or league. Many Red Sox teams were in fact shut out of the playoffs in the 1970s despite winning close to 100 games, since the Yankees were often better.
- But baseball was also expanding rapidly, with more teams making it harder to win a title in any given year. The Marlins only joined in 1993, for example. Expansion has meant that there are many teams (8, to be exact) that have never won a title and likely won’t for years. Already, the Rangers and Astros have existed for 50+ years without a World Series title. The Mariners and the Nationals have never even won a pennant.
- What’s the difference between a 50-year-old pre-2016 Cubs fan and, say, a 50-year-old Milwaukee Brewers fan? Neither had seen a title in a lifetime (the Brewers have never won the World Series). The Cubs “curse,” compounded by lack of opportunity as well as expansion, lasted so long that it became impersonal. Only 100 people on earth alive as of Nov. 4, 2016 were confirmed to have been born on Jan. 13, 1906 or earlier, which is likely the minimum for having been sentient the last time the Cubs won in 1908. It was as if they had never won at all.
Melville has another good quote for this too, one that I think of even more so than the others I have cited here:
“When I stand among these mighty Leviathan skeletons, skulls, tusks, jaws, ribs, and vertebrae, all characterized by partial resemblances to the existing breeds of sea-monsters; but at the same time bearing on the other hand similar affinities to the annihilated antichronical Leviathans, their incalculable seniors; I am, by a flood, borne back to that wondrous period, ere time itself can be said to have begun; for time began with man. Here Saturn’s grey chaos rolls over me, and I obtain dim, shuddering glimpses into those Polar eternities; when wedged bastions of ice pressed hard upon what are now the Tropics; and in all the 25,000 miles of this world’s circumference, not an inhabitable hand’s breadth of land was visible. Then the whole world was the whale’s; and, king of creation, he left his wake along the present lines of the Andes and the Himmalehs.”
To be a Cubs fan was to stand constantly amid the “mighty Leviathan skeletons” of their two titles (1907 and 1908) from the Theodore Roosevelt administration, seeing the “partial resemblances” of the dead ball era game to today’s multimillion dollar MLB juggernaut, thinking about their “incalculable seniors,” many of them long since perished waiting for a Cubs title, letting your thoughts bear you back to “that wondrous period” before Wrigley Field (the second oldest park in the majors, having been finished in 1914) was even built, indeed before time itself for anyone who is currently living, obtaining only “dim, shuddering glimpses” into what it must feel like to celebrate a Cubs title, and imagining an entire world that was yours for a day as you basked in your post-championship euphoria.
Those two kids running through the 3rd floor corridor were probably heading for the quadrangle. I didn’t follow them. But they were also running into the past, letting “Saturn’s grey chaos” roll them back to a reconstructed past they never lived through, a virtually ancient New England where the Red Sox were somehow the world champs. Would it feel that good this time, in 2004?
I had no rooting interest in the Cubs-Indians World Series this year. But once the game pushed into extra innings, I remembered 1997. That year, the Indians lost Game 7 in extra innings to the Marlins – exactly the situation in 2016, except against the Cubs. I had been rooting for the Indians all postseason that year, watching the games with my grandfather at his house. When the Marlins got the Series-winning single, it felt like a gut punch; I’ve never really cared about any sports outcome as I did that one, when I was still an impressionable 11-year-old. This time came close since the circumstances were so similar, at least on the TV screen. I kind of miss getting so wrapped up in somewhat meaningless things like sports fandom now. I also missed him, and wondered what it would have been like for him to live to see all the curses – even the great Leviathan itself, the Cubs drought – finally end, with me 30 years old and sitting next to my dad on the couch in our North Side Chicago house.