Olympic medal tables and “decline” narratives

In 1988, the Soviet Union and East Germany dominated the podium at the Summer Olympics, leaving the U.S. and every other country in their wake. They won a combined 234 medals, or 32 percent of all medals awarded in Seoul that summer. A little more than 3 years later, neither country existed.

Twenty years after that, China won an astonishing 51 gold medals in Beijing, 15 more than the second-place U.S., which had easily topped that category in 1996, 2000, and 2004. The surge was a perfect complement to books in vogue at that moment, including Martin Jacques's hyperbolic "When China Rules the World." Yet this year, China managed a mere 26 – not bad, and good enough for third, but behind the 27 for Great Britain and 46 for the U.S., two countries whose combined population is less than a third of China's – and which together are often described as being in terminal geopolitical decline.

Medals and geopolitics

Olympic medal tables have indeed often been read as geopolitical commentary. For instance, the mid-20th-century tables seemed to reflect the dominance of the U.S. and the U.S.S.R., as expected during the Cold War – the latter in fact remains 2nd all-time in Summer Olympics golds, despite not having competed since 1988. Nazi Germany was the clear winner of the 1936 haul right as it was becoming an expansionist power, and Britain – perhaps owing to its long imperial history and diverse sporting culture – is the only country to have won a gold at every Summer Games.

In Rio de Janeiro, the medal table was topped by the U.S. and Great Britain (Northern Irish athletes may compete for the Republic of Ireland at the Olympics, which precludes the use of "U.K." as a team identifier), two countries that, as I noted earlier, have been pegged as in "decline" for decades. In Britain's case, decline has been recognized since at least the end of WWII, then given a fine point with the handover of Hong Kong in 1997, and finally turned into humiliation with the isolationist Brexit referendum. For the U.S., decline has been a constant concern, whether in the context of burgeoning Soviet strength in the 1950s and 1960s or the economic "malaise" of the 1970s and early 1980s.

Does the recent sporting dominance of these two English-speaking countries say anything about their geopolitical staying power? The U.S. and GBR are the first and fifth largest economies by nominal GDP, respectively, so one might expect them to at least have ample economic resources to pour into their sporting programs. Yet neither country is particularly distinguished at soccer, the only game with an international event (the World Cup) that can rival the Olympics' prominence. They can't even keep up with the likes of Argentina and the Netherlands there, both much smaller countries and economies.

It’s possible that the U.S. and GBR in 2016 could be like the U.S.S.R. and East Germany in 1988, with their exploits on the medal table largely independent of their “declining” status as great powers. Alternatively, perhaps their success hints at underrated strengths.

Decline or not?

The long-term narrative of "globalization" is often cited to explain both the decline of the British Empire's once massive reach and the short-by-comparison postwar geopolitical dominance of the U.S. But as the anthropologist Pierre Bourdieu noted, globalization is not so much homogenization as it is proliferation of the power of a handful of already-powerful nations, especially in terms of their financial clout. In 2016, New York City and London remain as dominant as ever as financial centers, having been strengthened by decades of deregulation, policies favorable to capital mobility (but crucially not to the same for labor), and the spread of high-speed IP networking (e.g., the Internet).

Meanwhile, scholars such as Michael Beckley have made the contrarian argument that in areas such as military capabilities, the gap between the U.S. and everyone else is actually getting wider, not narrower, and that the perceived transfer of power to the developing world because of offshored manufacturing is mostly an illusion. That is, many of the goods produced in China and Southeast Asia are overseen by foreign firms, which specify the designs in question.

The issue with assessing any decline narrative, whether informed by Olympic medal table reading or not, is that it has often been difficult to figure out just how far declined (or not) a country actually is. The Soviet collapse of 1991 was wholly unexpected, even by the CIA. Japan's 60-year transformation from WWI Ally to WWII Axis power to "Westernized" industrial power could scarcely have been imagined in 1910.

Maybe the U.S. and GBR really are on the verge of late capitalist collapse, an echo of the crumbling planned economies that loomed over the Eastern Bloc even amid the glories of those Seoul Olympics. Or perhaps they're just their same old selves from 1908, before any of the turmoil of the 20th century, when they finished 1-2 with a combined 193 medals at London.

 

Uber, Lyft, and “legacy” business models 

Years ago, the Tumblr of someone named Justin Singer expressed some of the most sophisticated criticism to date of ride-sharing in general and of Uber in particular. He deconstructed the short-lived Uber talking point about UberX drivers making $90,000 per year and contextualized the service's rise as part of the growing commodification of the taxi industry:

“The story of the for-hire vehicle industry has been one long march toward commoditization, with drivers always getting the short end of an increasingly smaller stick.”

One question to ask is why the "stick" here is getting "shorter" to begin with, despite the enormous pool of money filling up in Silicon Valley. Uber is an incredibly well-capitalized firm, having raised an astonishing $15 billion in equity and debt since 2009. That money is not trickling down to drivers, though, and Uber itself, even with all of that cash, is essentially a middleman between ride-seekers and independent contractors. Many of its drivers may be making minimum wage or, worse, running at a loss. Uber is a confidence game in which drivers collectively overlook the costs that they must shoulder to participate.

Anyway, that $15 billion is even more astonishing when considered against the recent size of tech funding as a whole. From 2012 to 2015, total private funding in tech was $138 billion. Meanwhile, Apple paid out over $160 billion in dividends and buybacks over that same time. Uber is both a huge chunk of all tech-related funding and, like Apple, an extremely efficient re-distributor of wealth upward (i.e., for its investors) – a model of shareholderism.

So in the midst of so much jargon about "entrepreneurs" and "innovators," vast sums of money are going toward 1) extracting money from the existing taxi and limousine infrastructure and 2) paying shareholders (explicitly in Apple's case, preemptively in Uber's).

But the banality of the ridesharing economy is perhaps best seen in the fact that it is trying to reengineer public transit to be less efficient (tons of private cars instead of buses) and more expensive (a public good turned into a private rent inevitably becomes this). "Innovation" is apparently mostly about privatizing BART, or as Anil Dash has put it, "converting publicly-planned metropolitan transportation networks into privately-controlled automated dispatch systems."

The reason I often put these buzzwords in quotes is that they now seem emptied of any clear #content. John Patrick Leary's seminal series Keywords for the Age of Austerity has examined why, for instance, "entrepreneur" has become ubiquitous to the point of meaninglessness in business jargon. Similarly, the scraping-by wages of the gig economy actually represent "flexibility" and "autonomy," while across the board, whether at Uber or Theranos, aggressive privatization and neoliberalism are instead just "technology" simply working out inevitable change that, as it happens, exacerbates inequality along predictable lines (college education vs. none, coastal cities vs. "flyover" country, etc.).

A major beneficiary of the Silicon Valley lingo, though, is the cottage industry of satirists who have taken it to heart. Good satire requires a predictable target, because A) the pattern of behavior provides clear material for ridicule and B) such predictability means that future events are likely only to strengthen the satire's long-term resonance. This is why ironic internet accounts such as Carl Diggler (a fictional character who writes columns at cafe.com and has his own podcast) and @ProfJeffJarviss work so well.

Diggler, for example, set out to make fun of the "centrist" Both Sides Do It "beltway insiders" who think the fundamental goals of American politics are to cut Social Security and demonize Russia. His brand of satire has succeeded as political pundits have driven themselves crazy looking for ties between the Trump campaign and Vladimir Putin; compare this old piece he wrote about being a captive in Russia with this Josh Barro tweet about the country.

Meanwhile, @ProfJeffJarviss has spent years lampooning Silicon Valley VCs and CEOs with a variety of impressive rhetorical frameworks and tools, ranging from "Remember [name of a tech service that probably just launched yesterday]," as if to signal his ennui at even brand-new services that, to him, the ultimate tech snob, have already become passé; to "Naive [a quoted tweet from someone making a common-sense point]," to play inside baseball against even rational critiques of "innovation"; to "Very innovative of [company name] to do [trivial thing that is framed as a game-changer]," to elevate the prosaic to the plateau of "tech."

But his real genius unfolds in the normal, everyday actions of his targets, most notably "journalism professor" Jeff Jarvis, who is seemingly predestined to have Twitter meltdowns about why Hillary Clinton is "smart" to avoid press conferences or to take a quote out of context and proclaim Sarah Silverman's DNC speech the "best political speech ever" (instead of "the best political speech ever given by a celebrity," which is how it was described – quite a difference, yeah?). He makes a fool of himself without even needing the @ProfJeffJarviss foil, and so the parody only reads better and better over time.

In any case, @ProfJeffJarviss recently showed that satire is serious business when he tweeted the following about Uber and Lyft:

 

[Screenshot: @ProfJeffJarviss tweets on Uber and Lyft, including the "legacy pricing" and "freemium" tweets discussed below]

There is a lot to unpack here. In the "legacy pricing" tweet, his use of that epithet for Uber's current model of pricing rides according to an algorithm is deft, since it frames something so often touted as uber cutting-edge – opaque "algorithms" – as laughably outdated in the face of just giving something away for free, which is often what Lyft and Uber do anyway when they run aggressive promos and "first ride free" deals. It's possible that the enormous price cuts that both services provide, as a result of their massive capitalizations, are more important to their success than any "algorithm" cooked up by a programmer-genius.

The "freemium" tweet is more complex. The "rudeness" of asking for money that he alludes to is central to the modern economy, in which it is considered impolite to frame your search for a job as being about getting the money that you so obviously need in order to survive in a capitalist society. Instead, "passion" and "dedication" have to be at least feigned, if not converted into a sort of secular religion of individualism. Tips are nice under this ideology, but what really matters is "#creating" "#value," the hashtags both markers of the empty jargon of so much social media terminology that prioritizes vague concepts – "engagement," "thought leadership" – over the concrete notions of money etc. that are supposed to be so central to the economy!

Given how little most Uber and Lyft drivers earn, @ProfJeffJarviss isn’t wrong to say that what they are really doing is just performing an elaborate routine to awkwardly signal their inclusion in the nebulous “tech” world. They’re not earning $90k a year, but they are #engaging passengers and challenging “legacy” industries such as taxis, apparently. Still, there is something extremely old-fashioned and “legacy” about even these ridesharing startups, which subsist mostly on the laissez-faire brand of capitalism and sheer force of investment capital that were so instrumental to the business monopolies of the early 20th century. “Legacy pricing” – that’s what we get with each $5 Uber ride, underwritten by the old school investment power of Google, Goldman Sachs, et al.

On the brightside…

One of the all-out unpredictable oddities of this U.S. presidential election cycle has been nostalgia for the Cold War – from the ostensible "left" of the American political spectrum, of all places. Usually, spinning fever dreams of a renewed rivalry between the U.S. and Russia is something voters associate with the "right," e.g., Mitt Romney in 2012 when he called Russia "our number one geopolitical foe." But this time around, it has been Democrats battering Republicans, and Donald Trump in particular, for their ties to Vladimir Putin. It's a mix of Manchurian Candidate-style conspiracy theory and what I can only guess is the decades of anti-Russian indoctrination drilled into the Baby Boomer and Gen X generations, who grew up when the U.S.S.R. still existed, coming back to life like some capitalistic vampire out for new blood.

Why would anyone not insane have fond memories of the Cold War? It brought civilization to the brink of destruction in 1962 (and likely at many other points that we don't even know about). But it still seems to give 40-something bespectacled GOP pundits as well as PR firm shills hard-ons (and make no mistake, these are predominantly if not almost entirely male subgroups we're talking about here) thinking about Washington and Moscow rattling sabers in Ukraine or Venezuela. At some point there will have to be a reckoning in U.S. foreign policy, which for so long has coasted on endless spending fighting imaginary foes, including the hollowed-out shell that is 21st-century Russia.

So if even the Cold War, with all of its apocalyptic overtones, can be rehabilitated by our collective dark nostalgia, what can't be given a favorable coat of paint? Anything? There are moments from my own past that I can recognize as "inferior" to the present, and whose lived experience I recall being just awful – the endless afternoons of 2010 and 2011 despairing in a studio apartment, the painful ends of relationships – but for which, looking back, a certain fondness can still be conjured up. In trying to come to terms with this, uh, non-sequitur, I often think back to this Margaret Atwood line from Cat's Eye:

“I can no longer control these paintings, or tell them what to mean. Whatever energy they have came out of me. I’m what’s left over.”

I think the past is not just what is being remembered but also what is being re-experienced; your brain is not a computer, retrieving memories, but something that needs to be conditioned, like an athlete, to perform in certain ways under specific sets of circumstances. In this sense, the past is similar to an artwork, which regardless of its particular character can be approached from different pathways – a new area of visual focus in a drawing, or a dedication to listening more carefully to the bassline on a song that you never really listened to closely before – each time. But the visceral feeling toward it, the “meaning” it has in the mind as a series of associations with the minor and major details of the moment, is to some degree uncontrollable.

I mean, I was thinking about Daft Punk's much-maligned third album Human After All the other day. It was released at a pivotal time in my own life, March 2005, when it felt like the stresses of adult life were first starting to register in my previously invincible teenage consciousness. I am certain that if I were to go back and replay March 14, 2005, I would probably be exhausted, confused about whether I had made the right college choice, and on the verge of taking a nap at any given moment, lusting for the release of Ambien in the evening. I wouldn't be reveling in the sounds of "Human After All" or "Technologic."

Yet in my memory it seems sunny, a set of effervescent scenes through which my younger self strides to a soundtrack of "Robot Rock" (from that album) and my roommate's obsession with The Killers song "Mr. Brightside," which I remember only sort of liking then but which I adore now, if only because of its proximity to my current favorite moments from that era. Likewise, 2010 – when I started my first teaching job, for nearly no pay, and began to listen to the album in depth – was a similarly trying time that I would never want to relive. Still, in my memory it's the formative time when I got into that album, began to rethink my previous faith in music criticism, and, oh yeah, finally dug myself out of my post-college pit (though subsequent things had to go right as well).

And just as these at-the-time terrible circumstances have been turned around years later, due to myriad details that I could not totally control, there are plenty of "perfect moments" of relaxation and sunshine that I passed through, thought I would cherish forever, and now cannot even remember.

This is where I think the “I’m what’s left over” part of Atwood’s quote comes in. Ironically, maybe it’s because the problematic moments took such a pound of flesh out of me that I become inexplicably nostalgic for them. They made me, in some way, to a greater degree than the idyllic ones, and I can realize their effects by seeing my current self – with its ever-shifting and reconstructive attitude toward the past, which will remake even it in time, perhaps leading it to re-darken the same moments it had previously let the light in on, who knows – as what’s left over, a sculpture in conversation with all of its pieces that were knocked off along the way.

I can try to give what happened meaning, but the effort seems in vain: I'm as likely to think that the miserable snowy winter of 2005/6 was a golden age of youth as I am to forget almost all of the details of the then-glorious, life-affirming college graduation and think it was just a gateway to adult misery. Nostalgia kind of works this way across the board, I feel: for example, even though I can get almost any song ever recorded streamed to my phone now, I sometimes pine for the days in my dorm when everyone's iTunes libraries were shared over the LAN and I could peek to see if Bright Eyes or even "Mr. Brightside" were in there.

Look, I have no idea if this is the labyrinthine memory-struggle reckoning that has driven 40- and 50-something political commentators to wish that the world were every day on the cusp of nuclear war. Maybe it's a way of retrospectively looking on the bright side – sort of "swimming through sick lullabies," a lyric from "Mr. Brightside" that I always thought was a good description of revisiting quaint subject matter with a newly darkened outlook – to play self-defense against one's own past…

Drugs and anti-aging

When I was a teenager, I had severe nodular acne. Using Clearasil didn't help; washing my face for what felt like hours each morning and evening with a hot rag (to open the pores) and various cleansers didn't even dent the oil, dirt, and painful boils and whiteheads that were seemingly permanent fixtures of my face. It was a point of extreme anxiety during the endless expanse of middle school; 1997-2000 felt like a fucking decade, since on top of all the usual changes of adolescence there was my red and often swollen visage.

Sometime in 1999, we went to a dermatologist to assess treatment options. After cycling through a few medications including the antibiotic tetracycline, the doctor prescribed Accutane, a synthetic Vitamin A* derivative that first hit the market in 1982 and was still under patent to the pharmaceutical giant Hoffmann-La Roche at the time. Ovoid little yellow pills in a branded blister pack, Accutane was serious business; it carried black box warnings about pregnancy risks, which in the coming years (along with the drug's alleged propensity to cause inflammatory bowel disease) would prompt Roche to pull it from the U.S. market not long after the patent expired and lawsuits began mounting.

Accutane's side effects are considerable. Some of them have a frequency of more than 1 in every 10 patients! For comparison: the potent prostate drugs Proscar and Avodart (more on both of these medications later) have been the subjects of enormous online controversy for precipitating erectile dysfunction in literally fewer than 1 in 100 men.

I took my first dose sometime in early 2000 – I'm not exactly sure of the date. It was quickly apparent that something had changed. My lips began to dry out, and would almost blacken in the following months. I got random nosebleeds. My acne got much worse for the first couple of weeks, the pimples painful to the touch. A picture of me in May 2000, at my grandfather's 90th birthday celebration in Kentucky, shows my early-phase red face with my then-platinum blonde hair.

And then, nothing. Within months my acne was eradicated, I finished the prescribed pills, and that was that. You would never know without asking me that I had previously struggled so much with it. Without Accutane, I would likely have continued to battle acne into adulthood and been left with significant scarring. I have never taken any drug that was so effective.

My experience with Accutane changed how I perceive drugs as both medical and recreational substances. It made my expectations just absurdly high – unmeetable.

I had high hopes for Prozac when I started taking it in 2004, making a long walk from my dorm to a psychiatrist's office above a Myopic Books store, bumping into someone I still talk to today along the way, listening to Joy Division's "Shadowplay" on my Discman for probably the last time ever, and camping out at a Starbucks since I was too early for my appointment. It might have stabilized my mood at best, if it did anything other than make it hard to get hard (in both men and women, Prozac causes sexual dysfunction in a staggering three-quarters of patients). Wellbutrin, the alternative I began taking in 2005, was better but hardly life-changing.

Recreationally, marijuana reminded me of alcohol, albeit without the worst aspects of the latter, like dehydration and hangovers. But like anything else of that ilk, the high is temporary. Being without it makes you want it more, and yet each experience somehow feels diminished from the first one. Accutane doesn't make one "high" per se, but its benefits are permanent, even years after cessation. There's no moment to recapture, since it's always with you.

As a cosmetic and anti-aging drug, with the ability to prevent the nearly inevitable scarring that comes with nodular acne, it has few peers, especially considering its immediate efficacy and lasting effects. I mentioned Proscar and Avodart earlier. Both drugs are also somewhat effective as acne cures since they dramatically reduce the serum levels of DHT in the body, which has the indicated effects of reducing benign prostatic hyperplasia and halting male pattern baldness. A doctor I spoke to once confessed that every male in his office – everyone from the guy with a solid Norwood V hair pattern to the full-looking yet follicularly thin physician himself – was on Propecia (which is 1/5th a dose of Proscar, taken daily), which made me wonder whether there were any reason not to take the drug if one were a healthy male with a predisposition to MPB. Its anti-aging effects extend to reducing prostate swelling and hormonal acne.

As an Avodart off-label user, I admit that my reasons are cosmetic and superficial, with health only a secondary concern. The need to constantly take the medicine, rather than finish a designated, shorter window of treatment a la Accutane, is admittedly a drawback, but also a reminder of the difficulty of halting some of the most obvious harbingers of aging, including MPB, BPH, and harder skin. It seems like there ought to be a more convenient form of treatment, and undoubtedly many new tries at anti-aging, from infusing the blood of younger people into older ones to the development of more sophisticated biologics, are coming. But the Accutane difference is what sticks with me.

That difference has remained with me whenever I think of what's really satisfying, like creating something, or being moved by a certain visit or personal interaction. I mean, I still glow thinking of a life-sized self-portrait I scrawled out in a dormitory basement in 2005. The drive to "go back for more," to redo something in a similar way, doesn't necessarily wash over me, since I know the exact circumstances can't be recreated and that I'll always enjoy it in a particular way. Realizing that frees up mental space to pursue fresh thoughts and new adventures – and somehow, Accutane shares in the credit.

*: My father, whose acne was if anything far worse than mine, told me a story about once consuming cod liver oil as a treatment; cod liver oil is one of the richest natural sources of Vitamin A, indicating that speculation (this would have been in the 1960s) about a cure for acne was already on the right track.

“Hurt” and cover versions

A few days ago, my father and I were discussing cover versions. He was thinking of putting together audio playlists for bar trivia at a newly opened restaurant – a staple of NYC trivia that he was exporting to our small town in Kentucky – and he floated the idea of a list made entirely of songs that are more famous as covers than originals. We both immediately thought of the same song: “Hurt,” as rendered by Johnny Cash.

The history of “Hurt”
When Trent Reznor (aka Nine Inch Nails) recorded "Hurt," he was in his late 20s. Over 6 minutes long, mostly quiet, but packed with dynamic shifts, "Hurt" is atypical of its parent album, The Downward Spiral. Anyone who lived through the 1990s likely associates that LP with the creepy video for "Closer," which features both an infamous "scene deleted" card (in a video full of undeleted disturbing images) and a bleep-out on the lyric "I want to fuck you like an animal/I want to feel you from the inside."

After an hour of grunting and quasi-rocking out (NIN always had a certain shambling quality to them, with their drum machines and gloomy 80s synths sapping a bit of rocking vitality out of even their loudest songs), "Hurt" is the big Dylanesque finish – both a brief respite from the preceding violence and a bleak prelude to what happens when the music is over (to borrow a Doors lyric). It begins with a windy swirl and ends in a grind of listless noise that fades to black.

In between, Reznor spins a tale of impending suicide. "Everyone I know goes away in the end," he says, with the pseudo-profundity of a 20-something who knows death exists, but doesn't realize yet – for lack of years – how close it is to you at any given moment. "You could have it all, my empire of dirt. I will let you down. I will make you hurt." There's grandiosity here, from someone who has already achieved a lot before turning 30 ("my empire"), but also resignation ("you could have it all"), a subtle change in voice that shows the speaker finally acceding to the power of what he once controlled. That is, the "you" seems to be heroin, which elsewhere in the song Reznor memorably addresses as "my sweetest friend"; the "I" seems to be the drug personified, promising to let him down and make him hurt.

Cash’s version
The striking thing about “Hurt” overall is how the cover version illuminates the strengths of the original, not weakening it or showing it up in any way, while also introducing an entirely new reading of its lyrics’ meaning. Somehow, there is a perspective that only a 70-something could bring to these words; we just didn’t know it until Cash brought it into the open.

No cover version has been so thoroughly changed simply by dint of the covering artist's age difference from the original performer. Cash was old enough to have been Reznor's father; by the time he recorded his version of "Hurt," he was already in his 70s.

"What have I become?" is perhaps the most touching lyric from "Hurt," and it has a markedly different meaning coming from the mouth of a 71-year-old compared to that of a 28-year-old. For the latter, "What have I become?" seems like generalized Gen-X angst about not having changed the world by age 30, a subtle prelude to wishes of suicide driven not just by loneliness (Reznor's version of "Hurt" is notable for how it seems to unfold in a vast room in which he is the only occupant; he was the sole performer on many of The Downward Spiral's songs, albeit not "Hurt," on which he relied on an outside human drummer) but by Julius Caesar-grade inadequacy. Caesar apparently wept at a bust of Alexander the Great, despairing at what the conqueror had achieved at an age the would-be Roman emperor had already passed.

But for someone in their 70s, and with the backstory of Cash, "What have I become?" is not a preemptive justification of suicide. It's a confessional, and one with an unmistakable physicality: Cash's voice, always gruff, was shredded by this stage of his life, with the natural smokiness and grit that everyone from Nick Cave to Death Grips has tried to achieve instead by affectation. In uttering "what have I become?" in that voice, he answers his own question.

For Cash reading "Hurt," it is too late for suicide, but too early for death. He truly has seen "everyone I know" go away in the end, unlike Reznor, who in 1994 could only hypothesize about it while rationalizing his hypothetical drug-induced death. Cash, ravaged by years of his own drug use, has already let his demons "have it all," the "empire of dirt" that indeed looks increasingly indistinct as his natural death approaches.

Cash shortened "Hurt" to barely more than 3 minutes, stripping out the lead-in and noisy outro and rearranging it with only guitars, his voice (he also replaced "crown of shit" with "crown of thorns," which presents him as a Christ-like figure bleeding out from the years of needle sticks and painkiller highs), and a lone piano that thuds in and out like an insistent church bell. The original dynamics still shine through, though, especially as Cash gives a distorted "If I could start again…" near the end that never fails to raise the hairs on my neck as I imagine what it must be like to start all over again despite the shackles of advanced age.

There are plenty of startling cover versions out there, but no one else has so dramatically seized the opportunity the way Cash did in turning "Hurt" into his deathbed autobiography.

 
