The Internet Anti-Revolution

“The Internet” is often lionized for its effects on what are, to the well-off people who can even use them, trivialities. Parsing the praise for services from Uber to Airbnb, the depoliticized reader can just imagine the sheer horrors of the bygone dystopia in which she had to dial taxi services (on a phone, no less!) or put up with the indignities of hotel reservations. Few have popped this Internet hype balloon with more aplomb than Ha-Joon Chang, who in a sublime chapter in his book “23 Things They Don’t Tell You About Capitalism” convincingly argued that the washing machine was a more important and socially progressive invention than “the Internet,” since the latter has mostly benefited our leisure lives. Has “the Internet” really sparked a “revolution” because of its ability to ease the discovery of nearby tapas joints?

I have put “the Internet” as well as “revolution” in quotes for a reason:

  • First, although it is by default discussed as a non-political, reified force, “the Internet” is a social relation, built and managed by humans in accordance with the politics and class systems of their societies. “The Internet” is not a thing; it does not exist in nature and there is nothing inevitable about its character. As such, discussions of its ability to influence human relations (this phrasing alone shows just how reified it has become) cannot simply trace its history of technological updates – e.g., the creation of Ethernet, the introduction of TCP/IP, the advent of Wi-Fi, etc. – but must also include the circumstances that attended these changes.
  • Second, “revolution” is an odd choice by the technology commentariat as a descriptor of “the Internet”’s impact, considering that revolutions are political affairs and often – as in the case of the ones that occurred in Russia and China last century – anticapitalist. Nevertheless, bloggers such as Ben Thompson of Stratechery conceive of an “Internet Revolution” that will rival the Industrial Revolution in scope and lasting power. I wonder if he and others realize that the Industrial Revolution brought about the polished bourgeois world of “tech” only through centuries of class domination. As Lenin once said, “Advances in the spheres of technology and science in capitalist society are but advances in the extortion of sweat.”

It is empirically the case that the world’s Ubers and Airbnbs, its Googles and Facebooks, and its iPhones and Fitbits are only possible via a vast, often unseen (or ignored) store of labor – Marx’s “hidden abode of production” – that cannot possibly compete with the “noisy sphere of consumption.” Whether 1099 contractors (your Uber driver), unpaid “content” contributors (everyone on every social network ever), or literal slaves (the children and poor who extract the metals that go into many consumer electronics), it’s safe to say that the actual grunt work of the so-called “Internet Revolution” is not being put in by programmers logging “80” hours a week (see Peter Fleming’s breezy takedown of the overwork culture from a few months ago) but instead by the massive global underclasses.

Revolution and “normal people”
I just finished Astra Taylor’s excellent book “The People’s Platform,” which I discovered from a review on Fredrik deBoer’s website (a site I essentially binge-read this past week). I could not recommend it more highly – it is a sober, well-researched, impeccably written corrective to the idea that “the Internet” will inevitably enable an egalitarian makeover of society simply because its users now have “open” access to so much information, each one with a smartphone in her pocket to become her own filmmaker or reporter.

John Pat Leary expressed similar sentiments in a brief piece for Salon recently, going after the buzzword of all buzzwords, “innovation”:

“[I]nnovation transforms processes and leaves structures intact. Thus, instead of reinventing housing or transit, “innovators” mostly develop new processes to monetize the dysfunctional housing and transit we already have, via companies like Airbnb and Uber.  It’s one thing, therefore, to celebrate novelty indiscriminately — as if meth labs and credit-default swaps are not innovative — but what if the new isn’t even very new at all?”

Uber introduced a smartphone-accessible CRM on top of the existing taxi and limo infrastructure. Airbnb has done something similar with tons of rental units. But against these technically trivial changes to the topmost layers of huge social systems of labor (i.e., transportation and housing), Thompson et al still hold out hope for a techno-utopia:

“At the risk of painting too broad a stroke, it seems to me that much of the opposition to changes wrought by the Internet undervalue the positive impact said changes have on normal people. For example, people despair over newspapers closing without appreciating the explosion in quality content freely available to anyone anywhere in the world, the net result of which means those who choose to be can be far more informed about far more things than just a few years ago. Others gripe about Facebook’s frivolity or it and Google’s collection of data without acknowledging that both have fundamentally changed how we relate to both those we know as well as anything we wish to know.”

I’m not sure who these “normal people” are – perhaps they are inserted as a semantic complement to the “revolution” terminology often thrown around in these contexts, to exploit the notion of a pleased proletariat, albeit one diminished by being reduced to a set of passive consumers (of services like Uber) rather than active citizens (who would have the political legs to bargain against more powerful interests). I think Thompson overestimates the demographic variety of “Internet” users, many of whom are well-off males using services designed by others like them. Anyway, the rest of the paragraph is chock-full of the types of arguments that Taylor spends her entire book debunking:

  • Many newspapers have closed, but the likes of BuzzFeed (a Thompson favorite) and HuffPo (to name but two) have hardly replaced the investigative journalism and noncommercial writing (i.e., research and reporting that knows nothing of a world of “sponsored content” or other euphemisms for payola) that those papers performed. There has been a change in incentives, to say the least.
  • “Freely available” is a misnomer. It’s free to the person who navigates to the site, in that she doesn’t have to pay for access. But she pays a huge price in attention-drain (ads), privacy (tracking from ads), and social safety nets and programs (from the downward pressure on wages of writers, created by giving away so much stuff for “free”).
  • It is naive to think that “the explosion in quality content” (there’s that word again) means that everyone is going to be an adept hunter-gatherer of the precise items needed to be informed in a democratic society. Instead, there’s the echo chamber facilitated by Google and Facebook, which show us mostly what we have already seen (what a depressing lack of imagination), and the lingering effects of churnalism – tons of articles cranked out by writers toiling for fractions of a cent per word – meant to play to the profit-driven “platforms” of the major Web companies rather than the public interest.
  • If this seems abstract, imagine if, say, the police department in your community were dissolved tomorrow and replaced by an “explosion” of “law enforcement content.” So instead of a publicly funded, equal-opportunity service offered to the community for the price of their taxes, the replacement would be an abundance of private-sector, for-profit law enforcement agencies that would have no motive other than to make as much money as possible. It would soon become evident that this incentive does not align with basic principles of health, safety, or community.
  • Similarly, “the normal people” who ostensibly benefit from the convenience of a service like Uber end up paying a price in the privatization of basic services such as transportation. Capitalist organizations, unlike public agencies, are incentivized to accumulate capital but not necessarily to act in the public interest or even be fair to all customers.
  • To continue with Leary’s observation, “innovative” services indeed carry over and magnify the flaws of the systems that they purport to replace, to the extent that for every convenience they unlock they seemingly burden someone else with a new injustice (more extortion of sweat indeed). That could take the form of increasingly long, low-paying hours or new wrinkles such as disregard for disability laws in the case of Uber, despite the latter’s much-ballyhooed elimination of the old-school discrimination of cabbies passing up fares whose appearance they didn’t like.
  • Anecdotes decrying the workings of Uber, Facebook, et al are often dismissed out of hand by the technology commentariat – Thompson himself talks of “countless anecdotal stories about how a company valued at tens of billions of dollars is taking advantage of drivers earning tens of dollars per hour at best” and sets them aside by dismissively asking “what drivers ought to do otherwise.” This sort of preemptive exasperation is common in tech-Twitter and on like-minded blogs – this idea that unions, labor bargaining, and more equal distribution for workers cannot be part of any Serious Conversation about the issues (I’m using “Serious” in the delightfully derisive Paul Krugman sense here).
  • And yet, these writers’ same anecdotes – Thompson’s piece has one about his own stay in an Airbnb and how using a hotel would have been prohibitively expensive for him – are often used to show how great the Internet-enabled services in question actually are!

What I am ultimately getting at is that there is a clear structure to all the towering talk of how “the Internet” is “revolutionizing” every field from medicine to education to journalism. That is, there is a surface layer consisting of examples such as Uber, Airbnb, Facebook, etc. that have tackled recreational concerns – these are the trivialities I discussed earlier. Beneath that, there is the layer of the much more substantial changes of uprooted labor, unpaid toil, and erosion of the power of public institutions, all of which are being effectively obscured by the bourgeois “problems” that the most prominent Internet services solve.

We extol Uber’s “disruption” (I hate this word – only a hyper-capitalist would be fascinated by how pushing down labor costs can make products more competitive!) as it replaces a telephone- and thumb-enabled good with an Internet-enabled one, speaking of a minor change in ordering/billing infrastructure as if it were world-altering. We do this when the big change is really the degradation of the cabbie profession, the privatization of transportation options, and the continued dominance of capital over labor. We are paying a huge price for what amounts to trivial advances for the upper classes…

Asking hard questions of Google’s neural nets

What is the point of tech journalism? Actually, what is the point of journalism writ large in 2015?

Let’s start with the latter – as Matt Bruenig helpfully noted on his blog a while back, many journalists insist that they “work for the public” by covering hard stories like the rationale behind ISIS or, to cite a classic example, the secrecy of the Pentagon Papers. Journalists seem to imagine a wall between themselves and the ‘lesser’ folks who work in fields such as content marketing and especially public relations. Those people might write merely to please other businesses or spin new products, but we journalists – the journalists seem to say – we are fulfilling a higher calling.

Whatever. Jennifer Pan penned a great article for Jacobin about the PR/journalism row, looking at the extraordinary, thankless emotional labor expended by non-“journalists,” mostly women. This work is every bit as “real” as Wesley Lowery going to Ferguson so that he could then endlessly brag about it on Twitter, and it is increasingly the norm for white-collar workers everywhere who, overwhelmed with tasks, are under the gun to channel their emotions into deadline-driven writing.

The plight is shared. Trying to draw these particular lines is just another form of capitalism’s shameful culture of work-prestige, in which we humiliate workers doing thankless jobs to survive, and especially the unemployed who, even if they don’t want or need to work, face unbelievable cultural scorn simply because their efforts are not remunerated! Some day this attitude will be seen for the antisocial embarrassment that it is.

In this context, “journalist” becomes a moral and normative, rather than a descriptive, term. “Journalist” is to the language-intensive fields what “entrepreneur” is to the tech sector: a loaded word for dressing up ultimately mundane labor, which is everywhere. The entrepreneur runs businesses, and the journalist writes words that target a specific demographic favorable to advertisers. Doesn’t sound so extraordinary, does it?

Are Vox, BuzzFeed, and even The New York Times now anything more than vehicles for precisely paired ads and text? Look, here’s a mansplainer about coal power plants, alongside a tasteful banner ad for Hulu. Journalists imagine the traditional wall between their work and the business itself, but even this separation has its limits. I doubt The Verge is going to do a deep-dive series on the merits of Leninism or on the unremarkable hypercapitalism of Google, Facebook et al.

Which brings me back to the original question – what do tech journalists do? 

The likes of The New York Times’ Farhad Manjoo or even one-man operations like Stratechery’s Ben Thompson and others from the Web’s most trafficked sites often create work that is indistinguishable from press releases. I mean, Manjoo once talked about how Spotify encourages “a musical culture that more closely resembles a cocktail buffet than a sit-down dinner,” without asking if this is a good thing, why it happened, or what it means for the people producing the music. Thompson’s posts, while typically more nuanced, rarely challenge what firms from Uber to Apple do, instead viewing these companies as essentially apolitical entities that move with all the impersonal inexorability of something like the weather or time itself. A typical paragraph reads:

“When you think about it that way — that Netflix isn’t so much a network as they are a type of marketplace in which consumers can give their attention to creators — it becomes apparent that Netflix isn’t that far off from Uber or Airbnb or any of the other market-makers that are transforming industry-after-industry.”

Netflix isn’t just some company that put videos on its servers, no – it’s one of the new “market-makers,” an enviable godlike position in a world in which “the market” is still treated as a fact of nature rather than, as John Pat Leary has so elegantly noted, “a political instrument and a historical notion.” Companies are really just people – laborers, beholden to capitalists – but tech journalism makes them seem like persons – i.e., autonomous, cohesive entities with their own histories and personalities. The U.S. Supreme Court would be proud.

To get a real sense of just how completely tech journalists and reporters have capitulated to the propaganda, though, you need only read their recent hysteria about Google’s neural nets. Basically, Google fed random images to its custom cloud infrastructure and got some “weird” results that the tech press thought were cool. Here’s Business Insider’s hot take:

“Google’s artificial neural networks are supposed to be used for image recognition of specific objects like cars or dogs. Recently, Google’s engineers turned the networks upside down and fed them random images and static in a process they called “inceptionism.”

In return, they discovered their algorithms can turn almost anything into trippy images of knights with dog heads and pig-snails.”
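For what it’s worth, the underlying trick is not especially mysterious: the researchers’ own write-up describes nudging an input image via gradient ascent so that it more strongly excites whatever features a chosen layer of a trained image-recognition network has already learned, then repeating. Here is a minimal sketch of that idea (the specific model, layer, and step size below are my own illustrative assumptions, not Google’s actual pipeline):

```python
# Rough sketch of the "inceptionism"/DeepDream idea: gradient ascent on the
# input image to amplify whatever a chosen layer of a pretrained image
# classifier responds to. Model, layer, and step size are illustrative guesses.
import torch
from PIL import Image
from torchvision import models, transforms

model = models.googlenet(pretrained=True).eval()

# Capture activations from one intermediate layer with a forward hook.
activations = {}
model.inception4c.register_forward_hook(
    lambda module, inputs, output: activations.update(feat=output)
)

preprocess = transforms.Compose([transforms.Resize((224, 224)), transforms.ToTensor()])
img = preprocess(Image.open("input.jpg")).unsqueeze(0).requires_grad_(True)

for _ in range(20):                        # a handful of ascent steps
    model(img)
    loss = activations["feat"].norm()      # "how excited is this layer?"
    loss.backward()
    with torch.no_grad():
        img += 0.05 * img.grad / (img.grad.abs().mean() + 1e-8)
        img.grad.zero_()

result = transforms.ToPILImage()(img.detach().squeeze(0).clamp(0, 1))
result.save("dream.jpg")                   # the "trippy" output
```

That is roughly the whole trick – an image classifier run as a feedback loop on its own input – which makes the breathlessness of the coverage all the stranger.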

Bleh. Even the grouchy Gizmodo couldn’t muster much more than some linked tweets and images and an invitation to try it out for yourself. So Freddie de Boer was right when he noted:

“I still have not seen a single piece on this “Google’s dreams” thing that asks a single hard question or brings an ounce of skepticism.”

So in that spirit, I’m going to ask some hard questions of my own (I figure my skepticism is more than implied throughout this entire piece):

  • What does this mean for facial recognition technology? Is Google preparing its neural nets to catalog all human faces with identities? What if someone doesn’t want this?
  • Google is an advertising company. Is it planning to run psychology experiments on its users by showing them subtly “modified” images that are just click-traps?
  • We hear a lot about how Google, Microsoft, and others face a “skills gap” that prevents them from hiring U.S. workers. This is bullshit, but how much do these companies actually benefit from using this vulnerable labor from India and other places to fuel the development of questionable, antisocial projects, like these neural nets?
  • We’re often told that superhuman AI is not only inevitable, but necessary. If these ‘dreams’ are the basis for what we’re creating – super fast image recognition – is there really so much promise in this technology, though? Better ads, quicker load times – that’s it?

I could keep going, but I realize all of this is tedious. Just as journalists are increasingly at the mercy of advertisers – to the point that even great outfits like iMore are inseparable from the annoying ad engines that keep their lights on – tech journalists are more subtly ruled over by the motives of reified firms that seemingly none of them dare question.

A life-altering adjective

It was your typical July afternoon in Kentucky, 2003 edition. My cohort at the Governor’s Scholars program – which at the time was held at three different colleges across the state for rising high school seniors – had met in the lobby of one of the dormitories at Centre College in Danville, and now we were off to the basement. I was barely an hour’s drive from home, but it felt like being in Princeton, New Jersey (where I had spent the previous summer) again, such was the strangeness of being alone with a bunch of other teenagers away from my family. This was a time when the program could prohibit students from bringing “computers” (read: desktop PCs) and have that be an effective way of isolating them from the outside. I had a cellphone that looked like a candy bar.

Anyway, there was an ice-breaking exercise – it was about alliteration (see what I did there?). As a way of telling everyone else our name, we had to pair it with an alliterative adjective (e.g., Sagely Susan, or Miraculous Mary). I seemed to settle out of nowhere on “avant-garde” – questioning it briefly since it was compound-hyphenated, but speaking it all the same through the basement air. I think someone smiled.  This choice of adjective has made an unfathomable difference in my life.

Later the same day or maybe the next (this was 12 years ago), I sat by myself at a table in the dining commons. Another student came over to me and introduced himself with “avant-garde?” It was memorable, I’ll grant – the longest and Frenchiest of all the adjectives. Maybe he was the one who had smiled. Anyway, the word was in this way a double ice-breaker, and we got on to fuller conversation, which for a pair of almost-17-year-olds at a summer camp involved where we intended to go to college.

I hadn’t thought much about it before that meeting. Maybe I would have just gone somewhere in state as the de facto option, had this meeting not happened. I told him I had spent a summer at Princeton, which he wrote off as a place that “didn’t know how to have fun” (how, as a 16-year-old, he knew this I still don’t know). Then he began talking more positively about “Brown,” which I knew at the time mainly as one of the losingest college basketball teams ever, as per an ESPN infographic. My interest was piqued.

A few nights after that, I was using that aforementioned cellphone to talk to my parents, and I brought up some of the colleges this guy had mentioned. Maybe his perspective had been shaped by his experiences as a football player being recruited – he played for Owensboro Catholic – but the academic allure of the Ivy League schools, in a time before the Internet was really as pervasive as it is now, was having its own effect on me. I remember pushing so hard for it despite suggestions about just staying in state.

The fall of 2003 was accordingly a hectic one, with drawn-out application processes and interviews. I mentioned my friend from the dining commons during my Brown interview and the interviewer was surprised we knew each other and seemed to insinuate that the guy was maybe not a good fit for the school. He seemed to think otherwise about me, and I got my acceptance letter about 9 months after that first “avant-garde” utterance, around the time of the 2004 NCAA Final Four.

I sometimes think about what would have happened if I had picked a different word. It would have been a different world.

These sorts of almost-accidents – or maybe they’re just actions that come from some place we don’t understand, if the universe really is deterministic after all – have scary power. Something similar, though less consequential, happened to me in 2006 when I was searching for music on a website called emusic. One of my friends from college was into some indie band – I think they were called The Delays – and I searched for them, and due to some mislabeling or weird search error on the site, the only result I got was a progressive trance compilation from the record label Renaissance UK, mixed by the DJ David Seaman.

This 2xCD collection ended up being by far the most influential album on my own tastes. It introduced me to acts like Luke Chable and more importantly Gabriel & Dresden. From there, I discovered more of Chable’s work and eventually found Gabriel & Dresden’s massive “Bloom” mix album, which introduced me to Above & Beyond and the entire Anjunabeats/Anjunadeep universe. “Bloom” was fittingly released on the first ever day of college for me, a day that also featured my first meeting with a professor who went on to become a co-worker and one of my best friends even to this day.

As for Above & Beyond, I have written about them several times here, saw them play Madison Square Garden, and listen to their podcast all the time. My favorite memory, though, is from early 2008 when I came back to my dorm after a weekend at a friend’s place. I think it was in March or April. The sun was just rising and it was foggy and I was looking out onto Bowen Street in Providence, Rhode Island as his car pulled away. In the background I was playing the first disc of some other compilation I had, which kicked off with a remix of the peerless Above & Beyond track “Good For Me.” It felt like I was in a trance (hah) as I reviewed some Latin grammar, too, of all things. Avant-garde studying, indeed.

Not Being Afraid Of Writing

I didn’t always like to write. When I was a 6th grader, I remember typing nervously on an old Windows 95 PC after school one day, trying to finish an intro-body-conclusion essay about a topic so important it was probably on a dreaded standardized test. My first ever “short story” was a heavily plagiarized handwritten knockoff of the plot of the computer game “Laura Bow 2: The Dagger of Amon Ra” (man, I wish I had that around – there’s something about handwriting in particular that I think invites so many possibilities). The Rubicon I ended up crossing was reading the book “The Haunted Mask,” part of R.L. Stine’s massively popular (and iconically 90s) “Goosebumps” series.

Writing is unique among the creative arts, I think, because the inputs that go into being great at it are so predictable: The best writers are almost invariably the best readers. Moreover, there really aren’t writing prodigies in the same way that there are music or visual art prodigies. Many of the world’s greatest authors – Shakespeare, Sophocles, Shaw, to name but three playwrights with S-surnames – were late starters and/or late bloomers.

Shakespeare didn’t publish any play till he was in his late 20s and arguably didn’t hit his stride till he was in his mid 30s. Consider that Hamlet was likely completed when the Bard was 36 or 37, and that all of his great tragicomedies (“All’s Well That Ends Well,” “Coriolanus,” “The Two Noble Kinsmen,” etc.) came later still – meaning that he was still climbing toward his artistic peak at the same age at which Mozart was, more than a century later, already dead (the Austrian composer died obscure and poor a month before his 36th birthday). Sophocles finished “Oedipus at Colonus” when he was almost 85.

The explanation is straightforward enough: Age brings opportunities to not only read more, but to read differently, to add new histories, correspondences, novels, poems, blog posts, newspaper columns, etc. to the brain’s vast, subconsciously indexed repertoire. The base is never forgotten, will never crumble, even as new columns and ornaments are added to it. I remember coming across certain turns of phrase and vocabulary words for the first time, but these discoveries fuel relatively minor bouts of growth. The most lasting learning comes from soaking up writers who are unafraid of using language, because language is for them almost like a surgical tool, the only one they have, for relieving that frenzied, mildly anxious condition known as inspiration.

“Inspiration” may be too mystical a word for it, invoking images of Muses speaking sentences directly to some grizzled Hemingway hunched over a typewriter. For me it’s more like a sentence that hits the brain the way rain hits a fully spread-out umbrella, condensed from the vapors of different overheard sentences or signs read on the subway. Sometimes there is a phrase that just has to be turned into its own piece, forming the body and then requiring a title as a final ribbon on things; at other times the title comes first and the body follows.

It’s sort of like cooking: The motions and the measurements vary each time, but the recipe – the things you’ve read, looked at, thought about – stays the same and provides most of the final character. I probably would have never thought about all the different ways to compose – starting in medias res, throwing paragraphs around the page, writing the intro last, lifting seemingly unrelated anecdotes to provide segues – had I not taken my current job two years ago and been forced to write at such tremendous volume for such a sustained period of time.

Having hard quotas is a way of dispelling concern about perfection, sure, but it is also a spigot for creativity. “I don’t have forever” is sort of my mentality with writing, rather than the less moving “it doesn’t have to be perfect.” I have a limited time to get my thoughts on the page and I don’t know who, if anyone, is going to read them – why be afraid? If nothing else, writing anything, writing it quickly, and then reading it back more slowly later (I always cringe at reading my own stuff in the moment) has a way of fueling the reading-writing cycle that allows for growth.

Junk from Facebook

Facebook excels at making me occasionally hate people I have known for years. Maybe they liked some homophobic retailer, shared a widely debunked story unironically, or generally just kept posting skinny mirror selfies to show How Awesome their lives were. Whatever. But Facebook’s corrosive powers don’t stop there; it’s the absolute fucking best at stirring up contempt for complete strangers. It goes where Reddit and the comment section could never go, because it creates a link between life-destroying nonsense and someone’s face/real identity.

These missives often come in the form of comments on a friend’s post, from someone I don’t know. Anyway there were two that really got me recently, so I’ll dissect them, not so much because they made me mad out of nowhere but because they triggered some thoughts I have had about the subjects in question for some time.

First, this sage on dietary advice and social progress:

“I’ve read several nutrition books from low carb to full vegan with many contradictory findings. The only absolute between them all is the undeniable harm refined carbohydrates and added sugars have on the body and society. It is definitively linked to the number one killer of Americans, more than lung cancer, more than drunk driving: heart disease.

The greatest health mechanism of our century wouldn’t be a cure for cancer, but a tax on added sugar and refined flour.”

LOL.

Let’s start with the “contradictory findings” he mentions in the “nutrition books” he read. Resorting to confirmation bias and especially arguing that humanity has strayed from some idyllic dietary past are not bugs in nutritional literature (mmm) but features of it. Consider the long-held wisdom that saturated fat causes heart disease (I picked this ailment due to the content of the above Facebook post). The American Heart Association has been largely responsible for peddling this notion, yet a 2013 study in the Annals of Internal Medicine found:

“Current evidence does not clearly support cardiovascular guidelines that encourage high consumption of polyunsaturated fatty acids and low consumption of total saturated fats.”

The reasons that so many diet books are filled with contradictions are: 1) the authors are trying to sell the reader something rather than educate him; and 2) the effects of many foods on the body are still not well understood and merit further study.

The poster above of course won’t have any of this, as seen in his use of “undeniable” and “definitively,” despite the doubts that can be cast on his claims. His invocation of lung cancer and drunk driving in passing is notable, since he is trying to point to an obvious cause of heart disease on par with the link between cigarette smoking and lung cancer, or between excessive alcohol consumption and drunk-driving deaths. No such clear-cut cause exists, though.

Demonization of sugar in particular has much more to do with moralistic ranting about how “if it tastes good, it must be bad for you,” confusion about the differences between “natural” sweeteners like honey and their “chemical” clones like high-fructose corn syrup (the same fucking thing), and fears that kids were being “poisoned” by candy than it does with any solid science (sugar may cause weight gain, which is worse than death for much of the current elite; but even being fat has no clear effect on mortality). Ditto for carbohydrates, albeit with an even more sordid history of junk low-carb and gluten-free diets that arose from one doctor’s accidental success in treating a celiac patient with a banana and skim-milk diet.

The last bit is bad in a different way, since it displays such limited imagination in improving health – a tax (and not just any old tax, but a regressive one borne by the poor as they try to buy food)! Denmark actually tried this “mechanism,” as he calls it, before, except with saturated fat. Once it became clear that the tax motivated Danes to cross the border to get fatty foods and that the entire premise of the measure rested on shaky science, it was rolled back.

How about, instead of using neoliberal bullshit from self-help and diet books to steer public institutions (i.e., governments that can enact taxes), we focus on actually figuring out how foods affect the body? The whole post comes off as someone afraid of being “fat” or “out of shape” trying to lecture the entire world on what they should eat.

Moving on now, to this luminary on the subject of the minimum wage:

“I don’t understand why the US thinks minimum wage should be $15/hr. If you are worth more than $7.25/hr to a business then there is no conceivable reason for you to be stuck working a minimum wage job. It doesn’t take skill to operate a register, clean a bathroom, or serve a meal, it’s basic labor and it’s not physically demanding. If a job is any more than that and still paying minimum wage then you’re working for the wrong company and should move on.

Minimum wage isn’t meant to support a family, purchase a new car, home, or even pay student debt. Minimum wage is meant for introductory roles or part-time/basic labor. Is it abused? Obviously. Will raising it fix the problem? No. It will just cause a loss of jobs and harder work for those making $15/hr. It will also cause pay cuts for those above the $15 mark who have busted their ass to get somewhere in life.”

Blah blah blah, look at me, I work in IT to ensure that people get vitamins and loofahs delivered to their front doors. First off, it’s curious that we start with figuring out what a person is “worth,” which in this case is determined by a business rather than by the person himself. Businesses do not have anyone’s real best interests – in terms of remuneration, health, you name it – at heart and exist mostly as outmoded institutions that are preserved to prop up the neoliberal state.

“No conceivable reason,” eh? This statement assumes that the employment market is rational and not beset by randomness, injustice, and events far beyond a jobseeker’s control, such as the world-scale gambling going on every day on Wall Street. The poster has decided that every reason someone might be stuck at minimum wage while deserving more can be ruled out. We can probably even do away with the nominal “$7.25/hr” bit, since the writer seems to think that whatever a business deems a worker is worth is what he is actually worth! I guess that includes $0.

“Skill” is an infuriating word in the context of employment discussions. There is persistent talk about nonexistent “skills gaps,” which is mostly code for businesses trying to squeeze workers’ wages by not hiring them, creating the artificial scarcity of unemployment that drives desperation and a willingness to take anything. “Skill” also imparts a sort of fictional objectivity to a chaotic market, through its associations with culturally important icons like athletes (who have “skills” in narrow areas) or card/video game players.

“It’s basic labor” – hah! Try cleaning a bathroom every day of the week. Better yet, try being a caregiver working for near minimum wage for 60+ hours a week and see just how un-demanding such a job is. Again, we have the assumption that high pay correlates with “real” work and low pay with “basic” work, when of course there are so many counterexamples that I could fill up the rest of this entry with them. A caregiver puts in much more body- and mind-numbing work – work that can be a matter of life and death for the person involved – than any software developer working on some Web app for a consulting firm can ever aspire to.

Saying minimum wage “isn’t meant to support a family, purchase a new car, home, or even pay student debt” reimagines many of the transient ideals of our age – home ownership, car purchases, exorbitantly expensive college – as universals that can serve as bases for judging what someone should get out of their work. It is a great question these days to ask exactly why anyone works in the first place, when so many occupations are completely removed from social welfare and basic human survival and automation could play a bigger role. The poster has an idea of “why,” though, and his reasons are all goals from the postwar era, when today’s suffocating, precarious work environment was still decades away and society didn’t fetishize every last word out of some CEO’s streamlining, cost-cutting, union-busting mouth.

“Who have busted their asses to get somewhere in life” – it’s statements like this one that make me really despair over the U.S. ever finding a way past its relentlessly classist and racist system. Instead of trying to imagine that we’re all in this together and deserve dignity as members of the same nation, this poster draws the line between the vast masses of those earning minimum wage and the truly deserving who had the fortune to enter a lucrative field within our deterministic universe. This attitude is responsible for so much social ill, from the ridiculous costs of American healthcare (cue remarks about how long it takes to become a doctor) to the gentrification of working class neighborhoods by workaholic “entrepreneurs” making digital baubles for the 1 percent.

I won’t do this type of post again for a while, most likely. Again, I had planned to write on these issues at some point, and Facebook simply provided me with the raw material I finally needed to get started.
