Reclaiming the past

The first podcast I ever listened to was “The Talk Show” by John Gruber, the proprietor of Daring Fireball – a minimalist site, mostly about Apple, with roots in the pre-Facebook Internet. It was about the then-upcoming changes in iOS 7: More Photoshop than Xcode, I think Gruber said, noting that the much-hyped overhaul of Apple’s most popular platform would be more cosmetic than functional. He was right. The big underlying changes didn’t come until iOS 8 and 9.

It seems like small talk in retrospect, but I remember it well since: 1) it was my first podcast; 2) it highlights how superficial the art/science distinction is. The switch from the puffy, “likable” icons of earlier iOS releases to the “flatness” of iOS 7 is probably the single biggest change that most longtime users of iOS can remember. Is there similar sentimentality for the introduction of share sheets in iOS 8? The implementation of FaceTime in iOS 4?

The flat redesign of iOS made the OS normal. No longer did you need encouragement via textured icons and graphics based on items like bookshelves, felt-lined drawers, and reel-to-reel tape decks to let you know it was ok to touch the screen. The “normcore” design of iOS 7 freed up Apple to then be much more ambitious with its technical implementations: Features like inter-app sharing, widgets, and superior interactivity with other devices (like Apple Watch) would have been strangers in the strange land of the bolted-down plush toy display of iOS 6 and earlier.

The evolution of iOS is a testament to how art often drives science, rather than the other way around (the latter has been endlessly discussed in considering the effects of increasingly sophisticated drawing and photo-editing tools, social media appropriation/reblogging, etc. on “art”). For all the hype about the need for STEM skills and “technical” employees, demand for anything – from Instagram to superior accounting tools – is in some way a demand for a better life filled with art, whether said art is time to read Nietzsche, watch another “Star Wars” sequel, or go out for an enjoyable meal.

This realization didn’t crystallize for me until at least 2014. I wish it had been in my mind over the preceding decade, though, when I saw so many students with “useless” majors – myself included, as a Classics concentrator (my college didn’t use the “major” terminology) – derided for not fitting into a world obsessed with “innovation” built largely on Excel spreadsheets and 8th-grade math. I remember 8th grade: Algebra 1, taught to me by a Mississippian on the verge of retirement. Maybe she should have invented Uber. She probably could have.

My new views on education and its individual subjects have liberated the past for me. I feel bad now that I felt bad during those years when I thought nothing was useful except the STEM subjects, which I cursed myself many a night for not studying. More sunnily, I can now call back my walks through Chicago’s South Side, on the UChicago campus, when I was desperate for work and wondering if anyone cared about my thoughts on Washington Irving’s “Salmagundi.” Now I know that what I learned about “stew-like” composition from that book was invaluable to my development of a distinctive “professional” and personal writing style that has helped provide for my family.

I heard William Faulkner’s “The past is never dead. It’s not even past” quip in the 1990s, but didn’t give it any consideration until at least 2006, when I was writing about “Light in August” for an English course, late at night while listening to Explosions in the Sky. Everything I had consumed up to that point – from TV shows to conversational throwaways about “moving on” since “that’s in the past!” – made it feel like the past was a train stop receding into the distance as the locomotive pressed ever forward. But a train stop is never disposable or one-way; other trains will pass through it, perhaps even the same train going back over the same tracks in the other direction.

Railroads became my favorite metaphor for understanding Faulkner’s concept of time. Music also helped stretch out my feel for the past. During the last few months of college, I became a huge fan of the Anjunabeats record label. That sounds like bullshit: Who can even name a record label, much less like it? The Anjunabeats affinity was an accident. In late 2006, I had searched for “Delays” (some British pop group that my roommate was into) on eMusic, finding nothing by them. There was a result, though: a compilation of tracks by another label, Renaissance UK.

Listening to the lengthy set immersed me in electronica (or dance or trance or EDM or whatever term was then popular to distinguish non-guitar, non-rap popular music) and led to a CD-buying binge so that I could fill up some of the vast unused space on my iPod. By late 2007 I had branched out into other labels, but the 2-disc mixed compilation was still my favorite medium: varied, yet consistent in its churn, like the day-job writer I would eventually become, using templates to write about VoIP and help desks in slightly different tones.

Anjunabeats was actually a group before it was a label. Once it expanded from a duo to a trio, it took the name Above & Beyond, still the most famous artist on the imprint. The first Anjunabeats track I heard was by Above & Beyond and it was called “Good for Me.”

The track was actually a remix. It faded in with a sudden pulse and a descent into gentleness that felt just like waking up – which I was doing that spring day in 2008, as the fog lifted outside and my friend who had just dropped me off after the drive from Cranston – I had slept the whole way – sped away.

“Good for Me” and its parent album, “Tri-State,” had been released in 2006. When I listened to them in the months and years afterward I would call back to mind that white-hued morning in the dorm, but also my own once-ignorant (of Above & Beyond) experiences in 2006.

Something like the trip I took with my brother and cousins to a Yu-Gi-Oh! tournament in Louisville in the summer of ’06 – when I had yet to hear a single second of the Anjunabeats catalog – was now soundtracked in the blue waves of my memory with “Good for Me” or any of the other fantastic cuts from “Tri-State.” My recollections of life before the iPhone (released in 2007) would also become not just clichédly sepia-toned, but soundtracked by A&B, or Smith & Pledger, or any of the vast Anjunabeats stable that released singles throughout the mid-2000s.

This sort of reclamation of the past hurt my head at first – what should I call it? Eventually “retrofitting” felt right. It was as if the past were a manufactured thing – this chair, this table, this turntable: it’s all “the past,” since each bears some old design into the present, and then there’s the idea of the past, which often cannot be shut out of thought – with new things being added to it.

The past as railroad; the past as commodity; maybe the past as building – in multiple senses of the word – captured what I felt most closely. The old dormitory where I smoked from my red-and-black bong, or the wood-paneled bar room in which I conferenced happily with a TA in 2008 but now remember sadly because I saw some guy who dismissed my interest with business-like “best of luck in your search!” bullshitese using it as the backdrop for his Facebook profile – these are the retrofitted places of the brain.

The lyrics of OceanLab’s “Breaking Ties” were helpful: “Though I may return/To empty places on my own.” The locations – buildings, mostly – of the past are indeed emptied of the clutter and the ambience – like the “genuine 60s dust” that Lee Mavers of The La’s lamented was missing from the band’s recording gear and studios – but they’re still receptive. They can be filled, retrofitted with other songs, other places overlain, other times.

Liking pop music “ironically”

Last week, I went to Austin Music Hall in Texas for a party hosted by the cybersecurity firm Trend Micro. The affair was “Star Wars”-themed, from the paper-based quiz of trivia from the films (somewhat dulled by everyone in the building not only having a smartphone, but being in some way connected to “IT”) to the cocktail selection, which included a “Tatooine Sunset” that was regrettably mostly pineapple juice in a light-up glass.

With a cover band whirling through slightly dated pop hits from R. Kelly’s “Ignition (Remix)” to Metallica’s “Enter Sandman” (a large costumed Jabba the Hutt even danced to the rock-y rendition of “Wagon Wheel”), the whole scene confirmed what I had felt for years: That the previously “oppressed” geek minority with its fetishistic, once unusual culture – of sci-fi, comic books, computer programming, etc. – had become exceedingly normal, while still maintaining its bygone outsider status as a sort of protective moat around its particular tastes.

The endless sequel trains of Marvel movies and the reboots of seemingly every TV show, video game, and other pop culture franchise are the best evidence of this shift – how long till a “reboot” of “Howard the Duck” smashes the opening-weekend box office record? The sequelitis coming to “Star Wars” itself is already depressing enough, especially if you consider that something as offbeat as the first run of “Star Wars” movies (at a time, the 1970s, when blockbuster sci-fi was truly a fiction, a pipe dream) would struggle to even emerge in today’s homogeneous cultural stream. But you can get a more visceral sense of how deep-seated and defensive the new geek-driven pop culture has become by discussing music.

A culture dominated by geeks (or nerds or whatever now-empty term, no longer effective as an epithet, you want to throw around) is one that is not easily given to experimentation or the abstract. You see this dissonance between geekdom and the avant-garde in the vast gulf between, say, comic books and non-representational visual art, between the predictable effects of an algorithm and the never-settled questions of a text like William Gaddis’ “JR.” Moreover, the likes of “Game of Thrones,” “Iron Man,” and “World of Warcraft” – I mean, even the stereotypical startup, with its sights set on helping you share photos in yet another way or exploiting some precarious worker’s need for “independent contracting” – are resoundingly concrete, to the point of humorlessness and managerialism.

They are then defended with equally dour moralizing from pop culture critics, who more than anything seem to want to reassure their audiences that what they already like – regardless of whether it is ever challenged – is perfect, as Freddie deBoer has pointed out. But how does any of this affect music?

Since at least the rise of the Backstreet Boys in the late 1990s, pop music has stuck to a largely uniform sound that melds arena rock and R&B. A handful of middle-aged Scandinavian men have penned the bulk of all hits in this mold for the past decade plus, from Britney Spears’s first album to Taylor Swift’s “masterpiece,” “1989.” When the latter came up in one of our conversations in Austin, several people rushed to assure everyone that they didn’t like her – or previous pop scions like Blink-182 – “ironically,” but genuinely.

Why would anyone like music ironically? Is that, or was that ever, a thing? As Betty White once said of Facebook, “it seems like a huge waste of time.” Still, I can sketch out the roots of this attitude in my head. There seems to be an assumption among people roughly my age (I was born in the mid-1980s) that there exists some sort of cultural ivory tower, filled with people who spend their days listening to opera, reading Tennyson, and brushing up on their Latin and Greek. These mostly make-believe people, who at the very least do not constitute a coherent cultural body, are then assumed to frown upon things such as pop music, geek culture, and unrestricted technology-enabled capitalism (which makes the former two items so pervasive in the first place).

For me, the avatar of this fiction was the website Pitchfork, which I began reading long enough ago to remember its early days: I found it in 2002 while searching for a review of a Queens of the Stone Age album, which the reviewer liked but only halfheartedly (meanwhile, it was getting raves from Rolling Stone and other print magazines). Pitchfork, despite a relatively modest readership, introduced many people of my ilk – teenagers and twentysomething East Coast college kids – to “indie rock.” At the same time, its carefully constructed ignorance of most pop music of the early and mid 2000s – its halcyon days – was commentary enough on its low opinion of the “mainstream.”

Pitchfork and many other e-zines (whoa, outdated term!), some of them now defunct (RIP, Stylus Magazine and Dusted Magazine), created their own parallel canons of lo-fi rock, techno, and folk musics, which didn’t intersect much with the rock critic-driven Great Albums lists put out all the time by print media during those days. All the same, Pitchfork was hardly on the cutting-edge of pop taste or the charts. Its influence felt all-encompassing to a daily reader, but didn’t even exist for billions of other music fans. Yet somehow the myth of the Indie Snob as not only a real archetype for music fandom but the default one lived on.

Accordingly, even now, in 2015, when Pitchfork itself goes out of its way to review a Ryan Adams cover album of “1989,” people feel that they have to let the idea of “ironic” love of a song or artist float in the air for a second when discussing taste, just to put out feelers to the mythical High Culture Warriors among us and cultivate momentary solidarity with them. Then they pop the irony balloon and admit the true default position: liking pop music unironically. It’s the classic iron-fist-in-a-velvet-glove maneuver, except much more arrogant.

I don’t love pop music, and yet even I will not begrudge someone their affinity for Rihanna, The Weeknd, or anyone else, in public or private. The taste doesn’t bother me, but the idea of an all-powerful and nonexistent cultural elite (god?) frowning upon the actual elite does. It reminds me of the 2012 U.S. presidential election, when the likes of Jack Welch, Rupert Murdoch, and other GOP donors bemoaned the “Chicago-style politics” and omniscient “paymasters” behind the Obama campaign, when the candidate they supported – Mitt Romney – was a centimillionaire with the backing of Wall Street and virtually every non-governmental establishment institution in the country!

A similar thread of fake oppression also comes up in geek-dominated fields such as computer science and electrical engineering. The recent episode with Ahmed Mohamed, a 14-year-old student who “built” a “clock” (it seems he took the circuit board out of a commercial one and then put it in a briefcase), is instructive. The left – which I consider myself a part of, though at times it feels oddly unfamiliar on some issues – rushed to the kid’s defense after he was arrested for bringing his concoction to school, citing discrimination against him as a Muslim (perhaps) and as a STEM-oriented “maker” (ugh).

The idea that most Americans (or anyone else) actively dislike science geeks because they “create” things or fetishize technology is hilarious. We live in a world in which Facebook has more than 1 billion monthly users (although I guess one could dispute that Facebook actually “makes” anything other than a container for other people’s unpaid labor; maybe someday it will – in the spirit of Margaret Thatcher’s quip about socialism – run out of other people’s time and effort), in which conversations about college are dominated by talk of (fake) STEM major/skills shortages, and in which tech firms dedicate enormous amounts of PR time and energy to the gender gap while simultaneously pushing the hyper-masculine idea of maker culture. To put “makers” on a pedestal is to demean the care-taking work that holds so much of the world together. Not even the NYC government could quash Uber’s environment-polluting, downward-wage-pressuring ways, lest it be seen as an enemy of the “innovation” that is mysteriously in short supply everywhere but the geek-led tech community.

Why do dominant cultural forces feel the need to create a vast fake counterculture that is (impossibly) more elite than they are? I think it is to consciously reinforce their own tastes (and ensure the capitalistic reproduction of those tastes for years to come) and subconsciously to hide some degree of embarrassment. Just as Google’s rebranding as a subsidiary of Alphabet may have been driven in large part by the search company’s bashfulness at being essentially a giant ad firm (ads = so 20th century), love of pop music may be felt at some level by a fan as guilt. This necessitates the identification of something obscure and/or vaguely hip, liked by an élite person who in theory would sneer at pop music (I find indifference is the much more common reaction) or go even further and pretend to enjoy it ironically (if the sneer is too much effort, then the full-on ironic facade is a true Herculean labor).

Never mind that the richest, most “successful” members of the music industry aren’t Japanese doom metal peddlers, Finnish techno connoisseurs, or Swahili-speaking folk singers. I mean, just look around: One might have to make a conscious effort to avoid hearing Taylor Swift or Rihanna, but could live her entire life without ever having to lift a finger to avoid hearing a Pan Sonic tune. Unironic love for this music is by far the norm. Meanwhile, irony itself is often opposed to the capitalism that has made pop music possible. Irony lends itself to satire, humor, and various other attitudes that clash with the managerial, data- and formula-driven ethos of pop (and of neoliberal economics).

So when someone says she likes a ubiquitous artist unironically, realize that she is in some way fighting a battle against an overwhelmingly dominant cultural force. She sees the possibility of liking something ironically – with all the “unproductive” hours spent cultivating that attitude at the office or at home that it would entail – and flirts for a moment with that impractical (under capitalism) position, then moves on to the idea of there being some vast, sneering high-cultural elite that loves free jazz and Goa trance. Finding that group unlikeable, she uses the term “unironically” to reinforce her tastes while shutting off alternatives to the mainstream. It’s too bad, since while the High Culture elite is a fiction, there is life beyond the Top 40, in other musics that, yes, are sometimes challenging or not a good fit for everyone, but which are not defended with anywhere near the nose-turning so characteristic of mainstream culture whenever its sacred cows are targeted for slaughter.

Titus Andronicus in the Voodoo Lounge

Years ago, I walked past a dumpster in one of Chicago’s distinctive Northwest Side alleys and saw a painting in it. I had previously found a good coffee table in this same receptacle, so I was accustomed to looking for salvageable stuff in there. The painting was a giant canvas panel reproducing the cover of the Rolling Stones’ 1994 album “Voodoo Lounge.” Various smart-aleck remarks ran through my head, most memorably the “Garbage – indeed” critique of the band Garbage’s debut album in some publication I can’t remember. “Voodoo Lounge” is not one of the Stones’ most loved albums, but it has always been special to me.

Into the Voodoo Lounge…
For starters, the whimsical cover, with a leopard-like figure on the front and the tongue logo on the back, is probably the second-best in their catalog after the Warhol-designed print for 1971’s classic “Sticky Fingers” (the soundtrack of my 2003 summer at the Governor’s Scholars Program in Danville, KY, and the first recording to use the iconic tongue – in this case, as the image below a functional zipper on the original LP sleeve). Second, my introduction to it was through the song “Thru and Thru,” which I must have heard 1,000 times while listening to the soundtrack of “The Sopranos” in the car with my mom during our drives to Elizabethtown, KY in the early 2000s.

“Thru and Thru” is the rare Keith Richards lead vocal, with cheeky lyrics (“you know that we do take away/we deliver too”) as well as a tantalizingly slow pace punctuated by Charlie Watts’s occasional fills. On a soundtrack that featured an extremely Stones-y song by the Lost Boys, the inclusion of this obscure, uncharacteristic number – the penultimate track on a mostly forgotten studio album – seemed almost like an in-joke, but I loved it. A few years later, one of my college roommates and I sat around listening to “Love is Strong,” the opening song from “Voodoo Lounge,” and marveled at how such a memorable, forceful performance was delivered by a band that at the time of production already had 30 years of recording and touring under its belt.

Still, I never listened to the entire double LP (it’s one of the band’s longest works, at 63 minutes) until I got an Apple Music subscription a few months back. I played all 15 tracks over my Jawbone Jambox while sweeping the floor one day, and it was fun. Whether you are new to the Stones or a longtime fan who never got around to their twilight output, “Voodoo Lounge” is shocking. It has some of their most explicit lyrics – these 50-somethings (in the ’90s!) were clearly on the prowl all the time – and it rocks with a youthful energy on songs like “You Got Me Rockin’” and “Suck on the Jugular.”

I would have just let the album pass as another daily listen (I’ve been trying to listen to a new album from start to finish every day) if not for the fact that I finished Shakespeare’s much-maligned “Titus Andronicus” later the same day. Like Shakespeare, the Stones are a British cultural institution with a vast catalogue that by turns is considered “classic” and utter dreck. I have always struggled to reconcile my own views with this critique.

…and onto Titus revisionism
My listening experience with my friend in 2005, i.e., our time soaking in “Love is Strong,” was a much-needed corrective to the critical vacuum I had been occupying for years, in which the Stones allegedly hadn’t produced anything good since 1972. Similarly, my Introduction to Shakespeare in 2004 was an eye-opener because it focused so heavily on less-regarded plays such as “The Two Gentlemen of Verona” and “Richard II.” The former is memorably quoted in “Shakespeare in Love,” and the way it plays with gender and disguise may as well be a roadmap for how the Bard tackled these concepts throughout his 25-year career. We didn’t read “King Lear” or “Hamlet,” and yet it didn’t feel like a loss.

Shakespeare’s generally agreed-upon chronology is the opposite of the Stones’ discography: Whereas the latter is considered to tail off as its artists get older, the former is thought to improve, at least to a point, with a peak sometime in the early 1600s with “Lear” et al. and a well-respected turn into romance and “problem plays” in old age (his late 40s, which counted as old age at that time). While many of Shakespeare’s early works are written off as strange collaborations (i.e., “1 Henry VI”) or immature verse (“Two Gentlemen…”), none receives the scorn foisted upon “Titus Andronicus,” Shakespeare’s almost Marlovian revenge tragedy about a Roman general who suffers tremendously for passing up a chance to become emperor.

Like “Voodoo Lounge,” “Titus” is shocking for its anti-Victorian sentiments (it feels weird to describe it this way, in terms of an era that came centuries later, but I feel that 19th-century sensibilities have deeply affected readings of this tragedy). That Stones record talks about anal sex, the smell of vaginas, and “fucking all night” (made funny through its use in a call-and-response song structure), while Shakespeare’s play – written when he was not even 30 – ups the ante with gang rape, bodily mutilation, and cannibalism.

Looking back at the year – 2010 – I spent teaching at a Chicago community college just off the CTA Red Line, I almost regret not teaching this play, because it feels so modern in the way it is all spectacle and so racial in the way it frames its violence. Consider this passage, delivered by a Goth upon discovering the biracial lovechild of Aaron the Moor and the (white) Empress Tamora (spouse of Saturninus, who became emperor when Titus balked), as the assembled soldiers consider what to do with the baby:

I heard a child cry underneath a wall.
I made unto the noise, when soon I heard
The crying babe controlled with this discourse:
‘Peace, tawny slave, half me and half thy dam!
Did not thy hue bewray whose brat thou art,
Had nature lent thee but thy mother’s look,
Villain, thou mightst have been an emperor.
But where the bull and the cow are both milk-white
They never do beget a coal-black calf.’

I thought of so many possible contemporary issues that could influence a reading of these remarkable words. Birtherism. The Trump campaign’s scorn for Mexicans. White privilege. Ferguson, Missouri. “Anchor baby” predicted in the usage of “villain” to address an infant. A (totally different) world in which Barack Obama looked more like his white mother than his black father (and yet he became an “emperor,” in a sense, despite taking after his father). Ad infinitum.

My Introduction to Shakespeare instructor affectionately called “Titus” a “real potboiler.” Years later, I see what she meant – it is never dull, frequently violent, and occasionally hilarious (someone with no hands at one point carries someone’s lopped-off hand in her mouth…). It is both “Kill Bill” and “Naked Gun” in a five-act structure. It pre-empts parody.

Moreover, like “Voodoo Lounge,” it is so often filed away as a second-tier work, yet its viciousness and viscerality are instructive reminders that the artists are humans, too, and not just names on pages or busts in libraries. “Voodoo Lounge” has an earthy smokiness that makes it sound like it could have been recorded yesterday; “Titus,” a cultural despair that can easily be reconstructed as commentary on the stratified, violence-obsessed, racially defined America of the 21st century.

Opinions are hardly in short supply. But rarely is there a better opportunity to realize their low value than when reading the “lesser” works of profoundly influential and talented artists and realizing that they might in fact be rather more than “little” works. The grit of “Titus” and the hedonism of “Voodoo Lounge” make me feel closer to their respective creators than any of the “greater” works in their vast canons. Too bad I never picked that painting up out of the dumpster; maybe I’ll just do my own version someday.


Technonarcissism

It is an enviable feeling when you find a word that encapsulates a complex experience in just a handful of characters. Examples for me include “schadenfreude” (German; taking pleasure in someone else’s misfortune) and “Sehnsucht” (also German; way too complex to describe here). Even “nostalgia,” although a relatively well-known English word, is much more evocative in its Greek roots – nostos (“homecoming”) and algos (“pain”). Nostalgia is literally the ache of struggling to get home, which paints a brighter picture in the mind than simply pining for the past ever could (I love the notion of the past being “home”).

Then, of course, there is the longest-ever Greek word, at the end of Aristophanes’ Ecclesiazusae. This amalgamation takes advantage of the unique characteristics of the language (I always thought of Greek as a language of addition, which is hard to explain – it’s like a bunch of puzzle pieces waiting to be fitted together, especially its nouns) to invent a new term for a stew that includes all of said stew’s ingredients (an English translation is impossible; the lone accent mark at the end, added because of the language’s rules, is hilarious in this context):

λοπαδοτεμαχοσελαχογαλεοκρανιολειψανοδριμυποτριμματοσιλφιοκαραβομελιτοκατακεχυμενοκιχλεπικοσσυφοφαττοπεριστεραλεκτρυονοπτοκεφαλλιοκιγκλοπελειολαγῳοσιραιοβαφητραγανοπτερύγων
Sigh – nostalgia for when I first read that play in 2006. I have always felt that Greek was a language superior to English, since its freedom from relying exclusively on syntax for meaning gives it extra resources for creatively arranging its words. The gap between Aristophanes and Plato in the originals and in English is a testament to this.

Anyway, I came across a word today that gave me the rush I was talking about, although it is not an exotic word and is cobbled together from common components (I mean, even a delicious stew can be made from cheap ingredients, right?). Writing for Time, Siva Vaidhyanathan unleashed “technonarcissism,” a term that pops up here and there but is far from mainstream. He explained it this way:

“There’s a widespread and erroneous assumption that new technologies radically change how everyone lives. In reality, such change is slow, stunted, complex, and uneven. The wealthy and educated who tend to read and write about new technology obsessively also tend to exaggerate the cultural and economic influence of technological change because they embrace it.”

Indeed (this would be a Greek way to start a sentence, with a particle!); for years I was in my own Twitter echo chamber because I followed mostly venture capitalists and virtuoso technonarcissists like John Gruber of Daring Fireball and Ben Thompson of Stratechery (this was when I worked for a startup in Chicago). My world became one in which the release of the iPhone in 2007 was a momentous, earth-shaking event (well, it definitely was for Apple’s shareholders), Twitter was supposedly a platform for the masses, and institutions from taxi drivers to makers of Adobe Flash were just purveyors of “legacy” crafts primed to be crushed under the wheel of “progress.” The nadir (peak?) of this thinking can be seen in empty pronouncements such as this one from Thompson (couldn’t get the embed to work, so I’m just quoting) mocking concerns about the current bubble in “tech” companies:

“‘This time will be worse because the real world is affected.’ Or this time is because tech is actually affecting the real world.”

What is “tech” and what is “the real world”? These are the broadest of descriptive strokes. “Technology” as a category is curious, as Leo Marx has argued in a great paper. It is essentially the rebranding of blue-collar activities – working with machines – into white-collar ones so as to achieve a degree of class separation in which the already well-off can be generously construed as agents of change (“leaders,” in the anti-democratic parlance of our times). From this shift – made possible mostly/only by appeal to a scientific-sounding Greek word; and yet we are constantly lectured about how non-STEM fields don’t matter! – we get a culture obsessed with “innovation,” an activity that is distinctly unavailable to the underclasses.

The vagueness of “tech” also makes us see mundane advertising firms like Google and Facebook as world-changing companies in their own category, as Peter Strempel has explained:

“Google is no more a technology company than auto manufacturers, pharmaceutical corporations, or food conglomerates. The latter all use and develop technology, too, but we name them according to their products and services, not the tools they use to develop and sell them.”

I mean, Bank of America makes more software than Microsoft does, but you would never see any self-regarding tech writer call BofA a “tech” company. Evgeny Morozov was onto something when he hypothesized that Google’s reorganization into Alphabet was driven in part by embarrassment – that despite all the bluster about solving “big problems” and all of its exclusive perks, the search giant realized that it was just an advertising firm, with a business model like that of a long-written-off medium, free broadcast TV, which continues to be an important media stream for the less privileged (who, as a bonus, don’t have to put up with excessive data collection and tracking while in front of the boob tube).

The business about the “real world” in Thompson’s tweet is even more revealing about the technonarcissist outlook. How, exactly, can “tech” (whatever it is) ever not affect the “real world”? “Tech” here is configured as something that exists in its own plane – an extension of the digital dualist conception of reality – almost god-like and not subject to the same experiences as everything and everyone else. Religion may be declining in the West, but these sorts of myths – about the saving power of technology in particular and of progress in general – continue, as John Gray has explained in his excellent book “The Silence of Animals.”

Accordingly, we get countless “real world” people – taxi drivers, hoteliers, booksellers, et al., as described in a Nick Bilton article excerpted by Thompson in another tweet and presented only with “gotcha” commentary rather than any real compassion for these individuals – displaced by (“imaginary world”?) actors – smartphones, tablets, “the Internet,” startups – that are presented as vanguards of an unstoppable, historic, and not-of-this-real-world force (innovation?!). But the latter group owes its success to mundane things – carrier subsidies, ads, government research, rich VCs with a decades-long windfall from Reaganism – that bankroll its delusions of grandeur.

In other words, they are part of the real world (everything is), but the class separation afforded by terms like “technology” – plus access to vast amounts of capital, to the degree that “failure” and not making any money (Amazon is a great example) are not the catastrophic events they would be for, say, a mom-and-pop business – makes them come off as special. And so we get narratives about how technology is “changing everything” and doing so “faster than ever,” in large part because the members of the technonarcissist class spend all day moving from one gadget to the next, calling out confirmation bias even as they labor to disprove threatening narratives (such as Jill Lepore’s takedown of “disruption”), and ruminating on the meaning of Google’s logo change.

All of this involves the “real world,” so why the insistence that this time is different, regarding the aforementioned bubble? Some of it is probably the shame Morozov hinted at. As Paul Krugman has explained many times, for all the hype about “tech” writ large, its effects on productivity and wages have been meh-ish. It’s hard to know if Facebook, Microsoft Excel, or Uber have made the world a better place for most people. Indeed, the implication of tech in affecting the “real world” in the current wave of funding – as opposed to the apparently meaningless bubble of the late 1990s and early 2000s – is one that relies heavily on negative signaling, such as the protests of taxi drivers over Uber and the frustration of booksellers with Amazon. So while the technonarcissist tech press basks in the convenience of services built upon huge stores of exploited labor, many others suffer and let everyone know about their fresh wounds.

The realization that this has happened could be taken as evidence that “tech” is indeed affecting the “real world” (it could not be otherwise, after all), sure. Mostly, though, it is proof that, far from the “progress” narrative assigned to so much commentary from the non-reading tech commentariat, age-old forces of capitalism, from advertising to automation, are now more than ever succeeding at separating the haves from the have-nots, just as “technology” itself appropriated and consolidated the wares of “manufacturing” and “machining” into a new, all-conquering term.

Amazon shock

The New York Times has published a deep look at Amazon’s brutal white-collar workplace norms, which is somehow both unsurprising and shocking. Unsurprising in that this type of 80-hour-a-week bullshit (despite the evidence of diminishing returns and the huge incentive to just pretend that idle hours were spent “getting things done”) is everywhere in the high-tech economy beloved by the upper classes, and shocking in how completely dehumanizing Amazon’s entire system has become. It has turned its relentlessly efficient supply-chain management on itself. So a company most famous for shipping books, diapers, and anything else to you at lower cost and in less time than anyone else (it seems kinda trivial to put it like that) has made these traditional retail stakes – I’m sorry, “innovations” – the ambitious ends justifying a bleak set of means.

I am not going to dive into everything that is deplorable about Amazon’s approach; the problem is rooted in the history of American capitalism, techno-utopianism, and sexism (Amazon has no female top-tier executives), and the scope of this entry is much smaller. Instead, I am more interested in how Amazon has created white-collar equivalents of many of the indecencies of increasingly precarious blue-collar work. Whereas the latter has obviously suffered for decades under the dissolution of unions, the offshoring of labor, and the overall eclipse of labor by capital, the (really) upper-middle-class white-collar world has all the while maintained a facade of control and direction in the current economy, captured in the obsessive use of terms like “flexibility” and “leadership.”

But it’s just that: a front. Being “flexible” in the context of the workplace often means having to yield – in one’s time or in how one uses one’s body (this is why “flexible” is such a telling term) – to undetermined forces, whether writ small (“efficiencies”) or large (“the market”), terms that disguise the fact that they’re just policies approved by upper-class, real human beings, subject to passing ideologies like neoliberal economics. In Amazon’s case, the perennial capitalist crisis of low profits, which is currently roiling the Chinese monetary system, has been the norm since day one, which helps explain why the retailer – which despite having made virtually no money in its entire existence is valued at hundreds of billions of dollars – is perhaps even more notably inhumane than its sea of peers from Uber to Apple. It’s had plenty of time to bide its time and sharpen its claws.

While Amazon’s highly compensated mid-to-upper management employees are likely not what most people think of as “the middle class,” I think that their subjection to crying fits and ridiculously petty harassment over email is the inevitable upward migration of manipulative techniques first tried out on the underclasses (say, Amazon’s warehouse workers and the billions of their ilk all over the world). This is not the orthodox position, though. On Twitter, where fascination with Amazon has shaped the low-stakes, non-ideological commentary of writers like Farhad Manjoo and Ben Thompson, the NYT’s revelations have instead spurred a sort of foggy “choice” narrative that is typical when trying to discuss neoliberal economic institutions without getting into politics.

In since-deleted tweets, for example, Thompson talked about how companies like Amazon “don’t happen by magic” and noted that many people still choose to work there. I think this viewpoint oversells how much any individual “chooses” to subject herself to the workplace wringer that Amazon et al. are continually refining (and not in a good way; iteration is not necessarily positive). When the paths to the upper class are so narrowed by inequality that traversing them requires extraordinary measures, such as the will to take on student loans and later be subjected to 24/7 interference (and all of this as an introduction), the romanticized idea of “choice” loses its luster. Today’s optionless (in all senses) 1099 warehouse stocker is tomorrow’s product manager with a caveat-laden contract and no downtime.

Moreover, the narratives of “leadership” and “impact” are part of the anti-democratic, hyper-competitive worldview of the business elite, one that doesn’t really square with how the global economy operates (as Paul Krugman has explained) but nevertheless retains regrettable sway over individuals eager to “rally the troops” and do whatever it takes – bust unions, subject even the lowest-paid workers to humiliating screenings each night – to stay ahead in some fictional race. The idolized “leaders” of the new business world, including Amazon CEO Jeff Bezos and the secularly sainted Apple founder Steve Jobs, are seen as following straight lines from fanatical hard work to riches, which is ridiculous: they are also successful because they occupied particular historical moments with particular opportunities, and because capital accrued to them from sources other than their labor. Yet the hardworking, deserving-CEO myth lives on.

Fortunately, I think there has finally been some pushback to the Great Companies coverage that has been de rigueur across the Web ever since Google IPO’d and made it seemingly cool to not be “evil,” even if an organization was just another profit-seeking outlet like G.E. or Standard Oil before it. Thompson deleted many of his Amazon tweets following frustration over others who “assume malice” in his remarks about how “choice” factored into the Amazon wringer, how these practices were necessary to make a company “like Amazon” possible, and how the NYT was somehow the real villain in all of this because of its style of coverage (?).

I believe him that he was not consciously defending Amazon in the deleted tweets. Still, the style of commentary he exemplifies – i.e., one that argues that a given product/service is “the future” despite its unclear or possibly negative social value, that money from VCs and Wall Street is somehow mostly (and miraculously, considering the rent-seekers we are talking about here) directed at tech projects that improve people’s lives, and that the Internet and all of its derivatives, such as the click-based economy, are just immutable facts of business that we cannot change – will almost inevitably be read that way, given the deteriorating economic environment for so many in countries like the U.S., Amazon’s home. The true price of Amazon-style convenience is finally becoming clear to many of us, and that’s a good thing.

