Last week, I went to Austin Music Hall in Texas for a party hosted by the cybersecurity firm Trend Micro. The affair was “Star Wars”-themed, from the paper-based quiz of trivia from the films (somewhat dulled by everyone in the building not only having a smartphone, but being in some way connected to “IT”) to the cocktail selection, which included a “Tatooine Sunset” that was regrettably mostly pineapple juice in a glow-up glass.
With a cover band whirling through slightly dated pop hits from R. Kelly’s “Ignition (Remix)” to Metallica’s “Enter Sandman” (a large costumed Jabba the Hutt even danced to the rocky rendition of “Wagon Wheel”), the whole scene confirmed what I had felt for years: that the previously “oppressed” geek minority with its fetishistic, once unusual culture – of sci-fi, comic books, computer programming, etc. – had become exceedingly normal, while still maintaining its bygone outsider status as a sort of protective moat around its particular tastes.
The endless sequel trains of Marvel movies and the reboots of seemingly every TV show, video game, and other pop culture franchise are the best evidence of this shift – how long till a “reboot” of “Howard the Duck” smashes the opening weekend box office record? The sequelitis coming to “Star Wars” itself is already depressing enough, especially if you consider that something as offbeat as the first run of “Star Wars” movies (at a time, the 1970s, when blockbuster sci-fi was truly a fiction, a pipe dream) would struggle to even emerge in today’s homogenous cultural stream. But you can get a more visceral sense of how deep-seated and defensive the new geek-driven pop culture has become by discussing music.
A culture dominated by geeks (or nerds or whatever now-empty term, no longer effective as an epithet, you want to throw around) is one that is not easily given to experimentation or the abstract. You see this dissonance between geekdom and the avant-garde in the vast gulf between, say, comic books and non-representational visual art, between the predictable effects of an algorithm and the never-settled questions of a text like William Gaddis’ “JR.” Moreover, the likes of “Game of Thrones,” “Iron Man,” and “World of Warcraft” – I mean, even the stereotypical startup, with its sights set on helping you share photos in yet another way or exploiting some precarious worker’s need for “independent contracting” – are resoundingly concrete, to the point of humorlessness and managerialism.
They are then defended with equally dour moralizing from pop culture critics, who more than anything seem to want to reassure their audiences that what they already like – regardless of whether it is ever challenged – is perfect, as Freddie deBoer has pointed out. But how does any of this affect music?
Since at least the rise of the Backstreet Boys in the late 1990s, pop music has stuck to a largely uniform sound that melds arena rock and R&B. A handful of middle-aged Scandinavian men have penned the bulk of the hits in this mold for the past decade-plus, from Britney Spears’s first album to Taylor Swift’s “masterpiece,” “1989.” When the latter came up in one of our conversations in Austin, several people rushed to assure everyone that they didn’t like her – or previous pop scions like Blink-182 – “ironically,” but genuinely.
Why would anyone like music ironically? Is, or was that ever, a thing? As Betty White once said of Facebook, “it seems like a huge waste of time.” Still, I can sketch out the roots of this attitude in my head. There seems to be an assumption among people roughly my age (I was born in the mid-1980s) that there exists some sort of cultural ivory tower, filled with people who spend their days listening to opera, reading Tennyson, and brushing up on their Latin and Greek. These mostly make-believe people, who at the very least do not constitute a coherent cultural body, are then assumed to frown upon things such as pop music, geek culture, and unrestricted technology-enabled capitalism (which makes the former two items so pervasive in the first place).
For me, the avatar of this fiction was the website Pitchfork, which I began reading long enough ago to remember its days as pitchforkmedia.com. I found it one day in 2002 while searching for a review of a Queens of the Stone Age album, which the reviewer liked but only halfheartedly (meanwhile, it was getting raves from Rolling Stone and other print magazines). Pitchfork, despite a relatively modest readership, introduced many people of my ilk – teenagers and twentysomething East Coast college kids – to “indie rock.” At the same time, its carefully constructed ignorance of most pop music of the early and mid 2000s – its halcyon days – was commentary enough on its low opinion of the “mainstream.”
Pitchfork and many other e-zines (whoa, outdated term!), some of them now defunct (RIP, Stylus Magazine and Dusted Magazine), created their own parallel canons of lo-fi rock, techno, and folk musics, which didn’t intersect much with the rock critic-driven Great Albums lists put out all the time by print media during those days. All the same, Pitchfork was hardly on the cutting-edge of pop taste or the charts. Its influence felt all-encompassing to a daily reader, but didn’t even exist for billions of other music fans. Yet somehow the myth of the Indie Snob as not only a real archetype for music fandom but the default one lived on.
Accordingly, even now, in 2015, when Pitchfork itself goes out of its way to review a Ryan Adams cover album of “1989,” people feel that they have to let the idea of “ironic” love of a song or artist float in the air for a second when discussing taste, just to put out feelers to the mythical High Culture Warriors among us and cultivate momentary solidarity with them. Then, they pop the irony balloon and admit the true default position: liking pop music unironically. It’s the classic iron fist in a velvet glove maneuver, except much more arrogant.
I don’t love pop music, and yet even I will not begrudge someone their affinity for Rihanna, The Weeknd, or anyone else, in public or private. The taste doesn’t bother me, but the idea of an all-powerful and nonexistent cultural elite (god?) frowning upon the actual elite does. It reminds me of the 2012 U.S. presidential election, when the likes of Jack Welch, Rupert Murdoch, and other GOP donors bemoaned the “Chicago-style politics” and omniscient “paymasters” behind the Obama campaign, when the candidate they supported – Mitt Romney – was a centimillionaire with the backing of Wall Street and virtually every non-governmental establishment institution in the country!
A similar thread of fake oppression also comes up in geek-dominated fields such as computer science and electrical engineering. The recent episode with Ahmed Mohamed, a 14-year-old student who “built” a “clock” (it seems he took the circuit board out of a commercial one and then put it in a briefcase), is instructive. The left wing – which I consider myself a part of, though at times it feels oddly unfamiliar on some issues – rushed to the kid’s defense after he was arrested for bringing his concoction to school, citing discrimination against him as a Muslim (perhaps) and as a STEM-oriented “maker” (ugh).
The idea that most Americans (or anyone else) actively dislike science geeks because they “create” things or fetishize technology is hilarious. We live in a world in which Facebook has more than 1 billion monthly users (although I guess one could dispute that Facebook itself actually “makes” anything other than a container for other people’s unpaid labor; maybe some day it will – in the spirit of Margaret Thatcher’s quip about socialism – run out of other people’s time and effort), in which conversations about college are dominated by talk of (fake) STEM major/skills shortages, and in which tech firms dedicate enormous amounts of PR time and energy to a gender gap while simultaneously pushing the hyper-masculine idea of maker culture. To put “makers” on a pedestal is to demean the care-taking work that holds so much of the world together. Not even the NYC government could quash Uber’s environment-polluting, downward-wage-pressuring ways, lest it be seen as an enemy of “innovation,” which is mysteriously in short supply everywhere but the geek-led tech community.
Why do dominant cultural forces feel the need to create a vast fake counterculture that is (impossibly) more elite than they are? I think it is to consciously reinforce their own tastes (and ensure the capitalistic reproduction of those tastes for years to come) and subconsciously to hide some degree of embarrassment. Just as Google’s rebranding as a subsidiary of Alphabet may have been driven in large part by the search company’s bashfulness at being essentially a giant ad firm (ads = so 20th century), love of pop music may be felt at some level by a fan as guilt. This necessitates the identification of something obscure and/or vaguely hip, liked by an élite person who in theory would sneer at pop music (I find indifference is the much more common reaction) or go even further and pretend to enjoy it ironically (if the sneer is too much effort, then the full-on ironic facade is a true Herculean labor).
Never mind that the richest, most “successful” members of the music industry aren’t Japanese doom metal peddlers, Finnish techno connoisseurs, or Swahili-speaking folk singers. I mean, just look around: One might have to make a conscious effort to avoid hearing Taylor Swift or Rihanna, but could live her entire life without ever having to lift a finger to avoid hearing a Pan Sonic tune. Unironic love for this music is by far the norm. Meanwhile, irony itself is often opposed to the capitalism that has made pop music possible. Irony lends itself to satire, humor, and various other attitudes that clash with the managerial, data- and formula-driven ethos of pop (and of neoliberal economics).
So when someone says she likes a ubiquitous artist unironically, realize that she is in some way fighting a battle on behalf of an overwhelmingly dominant cultural force. She sees the possibility of liking something ironically – with all the “unproductive” hours, at the office or at home, that cultivating such an attitude would entail – flirts for a moment with that impractical (under capitalism) position, and then moves on to the idea of there being some vast, sneering high-cultural elite that loves free jazz and Goa trance. Finding that group unlikeable, she uses the term “unironically” to reinforce her tastes while shutting off alternatives to the mainstream. It’s too bad, since while the High Culture elite is a fiction, there is life beyond the Top 40 – in other musics that, yes, are sometimes challenging or not a good fit for everyone, but which are not defended with anywhere near the nose-turning so characteristic of mainstream culture whenever its sacred cows are targeted for slaughter.
Years ago, I walked past a dumpster in one of Chicago’s distinctive Northwest Side alleys and saw a painting in it. I had previously found a good coffee table in this same receptacle, so I was accustomed to seeing and looking for salvageable stuff in there. It was a giant canvas panel with the album cover of the Rolling Stones’ 1994 album “Voodoo Lounge.” Various smart-aleck remarks ran through my head, most memorably the “Garbage – indeed” critique of the band Garbage’s debut album in some publication I can’t remember. “Voodoo Lounge” is not one of the Stones’ most loved albums, but it has always been special to me.
Into the Voodoo Lounge…
For starters, the whimsical cover, with a leopard-like figure on the front and the tongue logo on the back, is probably the second-best in their catalog after the Warhol-designed print for 1971’s classic “Sticky Fingers” (the soundtrack of my 2003 summer at the Governor’s Scholars Program in Danville, KY, and the first recording to use the iconic tongue – in this case, as the image below a functional zipper on the original LP sleeve). Second, my introduction to it was through the song “Thru and Thru,” which I must have heard 1,000 times while listening to the soundtrack of “The Sopranos” while in the car with my mom during our drives to Elizabethtown, KY in the early 2000s.
“Thru and Thru” is the rare Keith Richards lead vocal, with cheeky lyrics (“you know that we do take away/we deliver too”) as well as a tantalizingly slow pace punctuated by Charlie Watts’s occasional fills. On a soundtrack that featured an extremely Stones-y song by the Lost Boys, the inclusion of this obscure, uncharacteristic number – the penultimate track on a mostly forgotten studio album – seemed almost like an in-joke, but I loved it. A few years later, one of my college roommates and I sat around listening to “Love is Strong,” the opening song from “Voodoo Lounge,” and marveled at how such a memorable, forceful performance was delivered by a band that at the time of production already had 30 years of recording and touring under its belt.
Still, I never listened to the entire double LP (it’s one of the band’s longest works, at 63 minutes) until I got an Apple Music subscription a few months back. I played all 15 tracks over my Jawbone Jambox while sweeping the floor one day, and it was fun. Whether you are new to the Stones or a longtime fan who never got around to their twilight output, “Voodoo Lounge” is shocking. It has some of their most explicit lyrics – these 50-somethings (in the ’90s!) were clearly on the prowl all the time – and rocks with a youthful energy on songs like “You Got Me Rockin'” and “Suck on the Jugular.”
I would have just let the album pass as another daily listen (I’ve been trying to listen to a new album from start to finish every day) if not for the fact that I finished Shakespeare’s much-maligned “Titus Andronicus” later the same day. Like Shakespeare, the Stones are a British cultural institution with a vast catalogue that by turns is considered “classic” and utter dreck. I have always struggled to reconcile my own views with this critique.
…and onto Titus revisionism
My listening experience with my friend in 2005, i.e., our time soaking in “Love is Strong,” was a much-needed corrective to the critical vacuum I had been occupying for years, in which the Stones allegedly hadn’t produced anything good since 1972. Similarly, my Introduction to Shakespeare in 2004 was an eye-opener because it focused so heavily on less-regarded plays such as “The Two Gentlemen of Verona” and “Richard II.” The former is memorably quoted in “Shakespeare in Love,” and the way it plays with gender and disguise may as well be a roadmap for how the Bard tackled these concepts throughout his 25-year career. We didn’t read “King Lear” or “Hamlet,” and yet it didn’t feel like a loss.
Shakespeare’s generally agreed-upon chronology is the opposite of the Stones’ discography: Whereas the latter is considered to tail off as its artists get older, the former is thought to improve, at least to a point, with a peak sometime in the early 1600s with Lear etc. and a well-respected turn into romance and “problem plays” in old age (which at that time meant one’s late 40s). While many of Shakespeare’s early works are written off as strange collaborations (e.g., “1 Henry VI”) or immature verse (“Two Gentlemen…”), none receives the scorn foisted upon “Titus Andronicus,” Shakespeare’s almost Marlovian revenge tragedy about a Roman general who suffers tremendously from passing up a chance to become emperor.
Like “Voodoo Lounge,” “Titus” is shocking for its anti-Victorian sentiments (it feels weird to describe it this way, in terms of an era that came centuries later, but I feel that 19th-century sensibilities have so deeply affected readings of this tragedy). That Stones record talks about anal sex, the smell of vaginas, and “fucking all night” (made funny through its use in a call-and-response song structure), while Shakespeare’s play – written when he was not even 30 – ups the ante with gang rape, bodily mutilation, and cannibalism.
Looking back at the year – 2010 – I spent teaching at a Chicago community college just off the CTA Red Line, I almost regret not teaching this play, because it feels so modern in the way that it is all spectacle, and so racial in the way that it frames its violence. Consider this passage, delivered by a Goth upon discovering the biracial lovechild of Aaron the Moor and the (white) Empress Tamora (spouse of Saturninus, who became emperor when Titus balked), as the assembled soldiers consider what to do with the baby:
I heard a child cry underneath a wall.
I made unto the noise, when soon I heard
The crying babe controlled with this discourse:
‘Peace, tawny slave, half me and half thy dam!
Did not thy hue bewray whose brat thou art,
Had nature lent thee but thy mother’s look,
Villain, thou mightst have been an emperor.
But where the bull and the cow are both milk-white
They never do beget a coal-black calf.’
I thought of so many possible contemporary issues that could influence a reading of these remarkable words. Birtherism. The Trump campaign’s scorn for Mexicans. White privilege. Ferguson, Missouri. “Anchor baby” predicted in the usage of “villain” to address an infant. A (totally different) world in which Barack Obama looked more like his white mother than his black father (and yet he became an “emperor,” in a sense, despite taking after his father). Ad infinitum.
My Introduction to Shakespeare instructor affectionately called “Titus” a “real potboiler.” Years later, I see what she meant – it is never dull, frequently violent, and occasionally hilarious (someone with no arms at one point carries someone’s lopped-off hand in her mouth…). It is both “Kill Bill” and “Naked Gun” in a 5-act structure. It pre-empts parody.
Moreover, like “Voodoo Lounge,” it is so often filed away as a second-tier work, yet its viciousness and viscerality are instructive reminders that the artists are humans, too, and not just names on pages or busts in libraries. “Voodoo Lounge” has an earthy smokiness that makes it sound like it could have been recorded yesterday; “Titus,” a cultural despair that can easily be reconstructed as commentary on the stratified, violence-obsessed, racially defined America of the 21st century.
Opinions are hardly in short supply. But rarely is there a better opportunity to realize their low value than when it comes to reading the “lesser” works of profoundly influential and talented artists and realizing that they might in fact be a little more than little works. The grit of “Titus” and the hedonism of “Voodoo Lounge” make me feel closer to their respective creators than any of the “greater” works in their vast canons. Too bad I never picked that painting up out of the dumpster; maybe I’ll just do my own version someday.
It is an enviable feeling when you find a word that encapsulates a complex experience in just a handful of characters. Examples for me include “Schadenfreude” (German; taking pleasure in someone else’s misfortune) and “Sehnsucht” (also German; way too complex to describe here). Even “nostalgia,” although a relatively well-known English word, is much more evocative in its Greek form – its roots come from verbs that mean “to go home” and “to struggle,” meaning that nostalgia is literally a “struggle to go home,” which paints a brighter picture in the mind than simply pining for the past ever could (I love the notion of the past being “home”).
Then of course, there is the longest ever Greek word at the end of Aristophanes’ Ecclesiazusae. This amalgamation takes advantage of the unique characteristics of the language (I always thought of Greek as a language of addition, which is hard to explain – it’s like it’s a bunch of puzzle pieces waiting to be fitted together, especially its nouns) to invent a new term for stew that includes all of said stew’s ingredients (an English translation is impossible; the lone accent mark at the end, added because of the language’s rules, is hilarious in this context):
Sigh – nostalgia for when I first read that play in 2006. I have always felt that Greek was a superior language to English, since its freedom from relying exclusively on syntax for meaning gives it extra resources for creatively arranging its words. The gap between Aristophanes and Plato in the originals and in English is a testament to this.
Anyway, I came across a word today that gave me the rush I was talking about, although it is not an exotic word and is cobbled together from common components (I mean, even a delicious stew can be made from cheap ingredients, right?). Writing for Time, Siva Vaidhyanathan unleashed “technonarcissism,” a term that pops up here and there but is far from mainstream. He explained it this way:
“There’s a widespread and erroneous assumption that new technologies radically change how everyone lives. In reality, such change is slow, stunted, complex, and uneven. The wealthy and educated who tend to read and write about new technology obsessively also tend to exaggerate the cultural and economic influence of technological change because they embrace it.”
Indeed (this would be a Greek way to start a sentence, with a particle!); for years I was in my own Twitter echo chamber because I followed mostly venture capitalists and virtuoso technonarcissists like John Gruber of Daring Fireball and Ben Thompson of Stratechery (this was when I worked for a startup in Chicago). My world became one in which the release of the iPhone in 2007 was a momentous, earth-shaking event (well, it definitely was for Apple’s shareholders), Twitter was supposedly a platform for the masses, and institutions from taxi drivers to makers of Adobe Flash were just purveyors of “legacy” crafts primed to be crushed under the wheel of “progress.” The nadir (peak?) of this thinking can be seen in empty pronouncements such as this one from Thompson (couldn’t get the embed to work, so I’m just quoting) mocking concerns about the current bubble in “tech” companies:
“‘This time will be worse because the real world is affected.’ Or this time is different because tech is actually affecting the real world.”
What is “tech” and what is “the real world”? These are the broadest of descriptive strokes. “Technology” as a category is curious, as Leo Marx has argued in a great paper. It is essentially the rebranding of blue-collar activities – working with machines – into white-collar ones so as to achieve a degree of class separation in which the already well-off can be generously construed as agents of change (“leaders,” in the anti-democratic parlance of our times). From this shift – made possible mostly/only by appeal to a scientific-sounding Greek word; and yet we are constantly lectured about how non-STEM fields don’t matter! – we get a culture obsessed with “innovation,” an activity that is distinctly unavailable to the underclasses.
The vagueness of “tech” also makes us see mundane advertising firms like Google and Facebook as world-changing companies in their own category, as Peter Strempel has explained:
“Google is no more a technology company than auto manufacturers, pharmaceutical corporations, or food conglomerates. The latter all use and develop technology, too, but we name them according to their products and services, not the tools they use to develop and sell them.”
I mean, Bank of America makes more software than Microsoft does, but you would never see any self-regarding tech writer call BofA a “tech” company. Evgeny Morozov was onto something when he hypothesized that Google’s reorganization into Alphabet was driven in part by embarrassment – that despite all the bluster about solving “big problems” and all of its exclusive perks, the search giant realized that it was just an advertising firm, with a business model like that of a long-written-off medium, free broadcast TV, which continues to be an important media stream for the less privileged (who, as a bonus, don’t have to put up with excessive data collection and tracking while in front of the boob tube).
The business about the “real world” in Thompson’s tweet is even more revealing about the technonarcissist outlook. How, exactly, can “tech” (whatever it is) ever not affect the “real world”? “Tech” here is configured as something that exists in its own plane – an extension of the digital dualist conception of reality – almost god-like and not subject to the same experiences as everything and everyone else. Religion may be declining in the West, but these sorts of myths – about the saving power of technology in particular and of progress in general – continue, as John Gray has explained in his excellent book “The Silence of Animals.”
Accordingly, we get countless “real world” people – taxi drivers, hoteliers, booksellers, et al., as described in a Nick Bilton article excerpted by Thompson in another tweet and presented only with “gotcha” commentary rather than any real compassion for these individuals – displaced by (“imaginary world”?) actors – smartphones, tablets, “the Internet,” startups – that are presented as vanguards of an unstoppable, historic, and not-of-this-real-world force (innovation?!). But the latter group owes its success to mundane things – carrier subsidies, ads, government research, rich VCs with a decades-long windfall from Reaganism – that bankroll its illusions of grandeur.
In other words, they are part of the real world (everything is), but the class separation afforded by terms like “technology,” along with access to vast amounts of capital – such that “failure” and not making any money (Amazon is a great example) are not the catastrophic events they would be for, say, a mom-and-pop business – makes them come off as special. And so we get narratives about how technology is “changing everything” and doing so “faster than ever,” in large part because the members of the technonarcissist class spend all day moving from one gadget to the next, calling out confirmation bias even as they labor to disprove threatening narratives (such as Jill Lepore’s takedown of “disruption”), and ruminating on the meaning of Google’s logo change.
All of this involves the “real world,” so why the insistence that this time is different, regarding the aforementioned bubble? Some of it is probably the shame Morozov hinted at. As Paul Krugman has explained many times, for all the hype about “tech” writ large, its effects on productivity and wages have been meh-ish. It’s hard to know if Facebook, Microsoft Excel, or Uber have made the world a better place for most people. Indeed, the implication of tech in affecting the “real world” in the current wave of funding – as opposed to the apparently meaningless bubble of the late 1990s and early 2000s – is one that relies heavily on negative signaling, such as the protests of taxi drivers over Uber and the frustration of booksellers with Amazon. So while the technonarcissist tech press basks in the convenience of services built upon huge stores of exploited labor, many others suffer and let everyone know about their fresh wounds.
The realization that this has happened could be taken as evidence that “tech” is indeed affecting the “real world” (it could not be otherwise, after all), sure. Mostly, though, it is proof that, far from the “progress” narrative assigned to so much commentary from the non-reading tech commentariat, age-old forces of capitalism, from advertising to automation, are now more than ever succeeding at separating the haves from the have-nots, just as “technology” itself appropriated and consolidated the wares of “manufacturing” and “machining” into a new, all-conquering term.