It is an enviable feeling when you find a word that encapsulates a complex experience in just a handful of characters. Examples for me include “schadenfreude” (German; taking pleasure in someone else’s misfortune) and “sehnsucht” (also German; way too complex to describe here). Even “nostalgia,” although a relatively well-known English word, is much more evocative in its Greek form – its roots come from verbs that mean “to go home” and “to struggle,” meaning that nostalgia is literally a “struggle to go home,” which paints a brighter picture in the mind than simply pining for the past ever could (I love the notion of the past being “home”).
Then of course, there is the longest ever Greek word at the end of Aristophanes’ Ecclesiazusae. This amalgamation takes advantage of the unique characteristics of the language (I always thought of Greek as a language of addition, which is hard to explain – it’s like it’s a bunch of puzzle pieces waiting to be fitted together, especially its nouns) to invent a new term for stew that includes all of said stew’s ingredients (an English translation is impossible; the lone accent mark at the end, added because of the language’s rules, is hilarious in this context):
Sigh – nostalgia for when I first read that play in 2006. I have always felt that Greek was a superior language to English, since its freedom from relying exclusively on syntax for meaning gives it extra resources for creatively arranging its words. The gap between Aristophanes and Plato in the originals and in English is a testament to this.
Anyway, I came across a word today that gave me the rush I was talking about, although it is not an exotic word and is cobbled together from common components (I mean, even a delicious stew can be made from cheap ingredients, right?). Writing for Time, Siva Vaidhyanathan unleashed “technonarcissism,” a term that pops up here and there but is far from mainstream. He explained it this way:
“There’s a widespread and erroneous assumption that new technologies radically change how everyone lives. In reality, such change is slow, stunted, complex, and uneven. The wealthy and educated who tend to read and write about new technology obsessively also tend to exaggerate the cultural and economic influence of technological change because they embrace it.”
Indeed (this would be a Greek way to start a sentence, with a particle!); for years I was in my own Twitter echo chamber because I followed mostly venture capitalists and virtuoso technonarcissists like John Gruber of Daring Fireball and Ben Thompson of Stratechery (this was when I worked for a startup in Chicago). My world became one in which the release of the iPhone in 2007 was a momentous, earth-shaking event (well, it definitely was for Apple’s shareholders), Twitter was supposedly a platform for the masses, and institutions from taxi drivers to makers of Adobe Flash were just purveyors of “legacy” crafts primed to be crushed under the wheel of “progress.” The nadir (peak?) of this thinking can be seen in empty pronouncements such as this one from Thompson (couldn’t get the embed to work, so I’m just quoting) mocking concerns about the current bubble in “tech” companies:
“‘This time will be worse because the real world is affected.’ Or this time is different because tech is actually affecting the real world.”
What is “tech” and what is “the real world”? These are the broadest of descriptive strokes. “Technology” as a category is curious, as Leo Marx has argued in a great paper. It is essentially the rebranding of blue-collar activities – working with machines – into white-collar ones so as to achieve a degree of class separation in which the already well-off can be generously construed as agents of change (“leaders,” in the anti-democratic parlance of our times). From this shift – made possible mostly/only by appeal to a scientific-sounding Greek word; and yet we are constantly lectured about how non-STEM fields don’t matter! – we get a culture obsessed with “innovation,” an activity that is distinctly unavailable to the underclasses.
The vagueness of “tech” also makes us see mundane advertising firms like Google and Facebook as world-changing companies in their own category, as Peter Strempel has explained:
“Google is no more a technology company than auto manufacturers, pharmaceutical corporations, or food conglomerates. The latter all use and develop technology, too, but we name them according to their products and services, not the tools they use to develop and sell them.”
I mean, Bank of America makes more software than Microsoft does, but you would never see any self-regarding tech writer call BofA a “tech” company. Evgeny Morozov was onto something when he hypothesized that Google’s reorganization into Alphabet was driven in part by embarrassment – that despite all the bluster about solving “big problems” and all of its exclusive perks, the search giant realized that it was just an advertising firm, with a business model like that long-written-off medium, free broadcast TV, which continues to be an important media stream for the less privileged (who, as a bonus, don’t have to put up with excessive data collection and tracking while in front of the boob tube).
The business about the “real world” in Thompson’s tweet is even more revealing about the technonarcissist outlook. How, exactly, can “tech” (whatever it is) ever not affect the “real world”? “Tech” here is configured as something that exists in its own plane – an extension of the digital dualist conception of reality – almost god-like and not subject to the same experiences as everything and everyone else. Religion may be declining in the West, but these sorts of myths – about the saving power of technology in particular and of progress in general – continue, as John Gray has explained in his excellent book “The Silence of Animals.”
Accordingly, we get countless “real world” people – taxi drivers, hoteliers, booksellers, et al., as described in a Nick Bilton article excerpted by Thompson in another tweet and presented only with “gotcha” commentary rather than any real compassion for these individuals – displaced by (“imaginary world”?) actors – smartphones, tablets, “the Internet,” startups – that are presented as vanguards of an unstoppable, historic, and not-of-this-real-world force (innovation?!). But the latter group owes its success to mundane things – carrier subsidies, ads, government research, rich VCs with a decades-long windfall from Reaganism – that bankroll its delusions of grandeur.
In other words, they are part of the real world (everything is), but the class separation afforded by terms like “technology,” along with access to vast amounts of capital – to the degree that “failure” and not making any money (Amazon is a great example) are not the catastrophic events they would be for, say, a mom-and-pop business – makes them come off as special. And so we get narratives about how technology is “changing everything” and doing so “faster than ever” in large part because the members of the technonarcissist class spend all day moving from one gadget to the next, calling out confirmation bias even as they labor to disprove threatening narratives (such as Jill Lepore’s takedown of “disruption”), and ruminating on the meaning of Google’s logo change.
All of this involves the “real world,” so why the insistence that this time is different, regarding the aforementioned bubble? Some of it is probably the shame Morozov hinted at. As Paul Krugman has explained many times, for all the hype about “tech” writ large, its effects on productivity and wages have been meh-ish. It’s hard to know if Facebook, Microsoft Excel, or Uber have made the world a better place for most people. Indeed, the implication of tech in affecting the “real world” in the current wave of funding – as opposed to the apparently meaningless bubble of the late 1990s and early 2000s – is one that relies heavily on negative signaling, such as the protests of taxi drivers over Uber and the frustration of booksellers with Amazon. So while the technonarcissist tech press basks in the convenience of services built upon huge stores of exploited labor, many others suffer and let everyone know about their fresh wounds.
The realization that this has happened could be taken as evidence that “tech” is indeed affecting the “real world” (it could not be otherwise, after all), sure. Mostly, though, it is proof that, far from the “progress” narrative assigned to so much commentary from the non-reading tech commentariat, age-old forces of capitalism, from advertising to automation, are now more than ever succeeding at separating the haves from the have-nots, just as “technology” itself appropriated and consolidated the wares of “manufacturing” and “machining” into a new, all-conquering term.
The New York Times has published a deep look at Amazon’s brutal white collar workplace norms, which is somehow both unsurprising and shocking. Unsurprising in that this type of 80-hour-a-week bullshit (despite the evidence of diminishing returns and the huge incentive to just pretend that idle hours were spent “getting things done”) is everywhere in the high-tech economy beloved by the upper classes, and shocking in how completely dehumanizing Amazon’s entire system has become. It has turned its relentlessly efficient supply chain management on itself. So a company most famous for shipping books, diapers, and anything else to you at lower cost and in less time than anyone else (it seems kinda trivial to put it like that) has made these traditional retail stakes – I’m sorry, “innovations” – the ambitious ends justifying a bleak set of means.
I am not going to dive into everything that is deplorable about Amazon’s approach; the problem here is rooted in the history of American capitalism, techno-utopianism, and sexism (Amazon has no female top-tier executives) and the scope of this entry is much smaller. Instead, I am more interested in how Amazon has created white-collar equivalents of many of the indecencies of increasingly precarious blue-collar work. Whereas the latter has obviously suffered for decades under the dissolution of unions, the offshoring of labor, and the overall eclipse of labor by capital, the (really) upper middle class white-collar world has all the while maintained a facade of control and direction in the current economy, captured in the obsessive use of terms like “flexibility” and “leadership.”
But it’s just that: a front. Being “flexible” in the context of the workplace often means having to yield – in one’s time or in how one uses one’s body (this is why “flexible” is such a telling term) – to undetermined forces, whether writ small (“efficiencies”) or large (“the market”), a framing that disguises the fact that they’re just policies approved by upper-class, real human beings, subject to passing ideologies like neoliberal economics. In Amazon’s case, the perennial capitalist crisis of low profits, which is currently roiling the Chinese monetary system, has been the norm since day one, which helps explain why the retailer – which despite having made virtually no money in its entire existence is valued at hundreds of billions of dollars – is perhaps even more notably inhumane than its sea of peers from Uber to Apple. It’s had plenty of time to bide its time and sharpen its claws.
While Amazon’s highly compensated mid-to-upper management employees are likely not what most people think of as “the middle class,” I think that their subjection to crying fits and ridiculously petty harassment over email is the inevitable upward migration of manipulative techniques first tried out on the underclasses (say, Amazon’s warehouse workers and the billions of their ilk all over the world). This is not the orthodox position, though. On Twitter, where fascination with Amazon has shaped the low-stakes non-ideological commentary of writers like Farhad Manjoo and Ben Thompson, the NYT’s revelations have instead spurred a sort of foggy “choice” narrative that is typical when trying to discuss neoliberal economic institutions without getting into politics.
In since-deleted tweets, for example, Thompson talked about how companies like Amazon “don’t happen by magic” and how many people still choose to work there. I think this viewpoint oversells how much any individual “chooses” to subject herself to the workplace wringer that Amazon et al. are continually refining (and not in a good way; iteration is not necessarily positive). When the paths to the upper class are so narrowed by inequality that traversing them requires extraordinary measures such as the will to take on student loans and later be subjected to 24/7 interference (and all of this as an introduction), the romanticized idea of “choice” loses its luster. Today’s optionless (in all senses) 1099 warehouse stocker is tomorrow’s product manager with a caveat-laden contract and no down time.
Moreover, the narratives of “leadership” and “impact” are expressions of the anti-democratic, hyper-competitive worldview of the business elite, one that doesn’t really square with how the global economy operates (as Paul Krugman has explained) but nevertheless retains regrettable sway over individuals eager to “rally the troops” and do whatever it takes – bust unions, subject even the lowest-paid workers to humiliating screenings each night – to stay ahead in some fictional race. The idolized “leaders” of the new business world, including Amazon CEO Jeff Bezos and secularly sainted Apple founder Steve Jobs, are seen as following straight lines from fanatical hard work to riches, which is ridiculous: they are also successful because they occupied certain historical moments with particular opportunities, and because capital accrued to them from sources other than their labor. Yet the hardworking, deserving CEO myth lives on.
Fortunately, I think there has finally been some push back to the Great Companies coverage that has been de rigueur across the Web ever since Google IPO’d and made it seemingly cool to not be “evil,” even if an organization was just another profit-seeking outlet like G.E. or Standard Oil before it. Thompson deleted many of his Amazon tweets following frustration over others who “assume malice” in his remarks about how “choice” factored into the Amazon wringer, how these practices were necessary to make a company “like Amazon” possible, and how the NYT was somehow the real villain in all of this because of its style of coverage (?).
I believe him that he was not defending Amazon consciously in the deleted tweets. Still, the style of commentary he exemplifies – i.e., one that argues that a given product/service is “the future” despite its unclear/possibly negative social value, that money from VCs and Wall Street is somehow mostly (and miraculously, considering the rent-seekers we are talking about here) directed at tech projects that improve the lives of people, and that the Internet and all of its derivatives such as the click-based economy are just immutable facts of business that we cannot change – will almost inevitably be read that way given the deteriorating economic environment for so many in countries like the U.S., Amazon’s home. The true price of Amazon-style convenience is finally becoming clear to many of us, and that’s a good thing.
Google is now a subsidiary. A holding company called Alphabet, with former Google CEO Larry Page as its CEO, was formed today, with a diminished Google (now led by Sundar Pichai) as one part of the conglomerate. Parsing the press release, I could not really understand the move, though I thought the ever-reliable iOS indie developer Marco Arment (you should really read his blog) cut through the jargon perfectly:
“Google is not a conventional company. We do not intend to become one,” said the world’s largest advertising corporation.
— Marco Arment (@marcoarment) August 10, 2015
This sentiment gets at the heart of what seems so lacking in all of Silicon Valley’s self-congratulatory talk about “innovation” in general and Google’s self-promotion as an “unconventional” firm in particular. Despite all the bluster about solving big problems and “disrupting” this or that, the essential business model of the ex-Google, Facebook, Snapchat, etc. is advertising, something with which a living room viewer with her rabbit-ears TV set in the 1950s would be readily familiar.
Moreover, how long can advertising last? It seems like an absurd question, but consider this argument from Astra Taylor’s excellent “The People’s Platform”:
“Advertising is, in essence, a private tax. Because promotional budgets are factored into the price we pay for goods, customers end up footing the bill. That means that, all together, we spend more than $700 billion a year on advertising, a tremendous waste of money on something that has virtually no social value and that most of us despise.”
This hatred of advertising has not reached critical mass yet. But it seems to be bubbling up in the increasing usage of ad blockers and services such as Disconnect and Ghostery. It also may be contributing to the operational and product changes pursued in recent months by ex-Google and Facebook, which split up and introduced a tightly controlled non-Web platform (Facebook Instant Articles) for publishing stories, respectively. Serving ads on the Web is a miserable business.
Early last year, I wondered what it would take for Google to decline, at a time when Google was seen as “catching up” to its chief rival, Apple (the monetary gap between the two has only widened since then). I compared Google to a church, with few or no direct payments from its “users” and heavy subsidies from everyone from big-box retailers shilling their new deals to Apple itself, which keeps Google as the default search engine in iOS and OS X:
“Google is like a church or a cathedral. That is, it is frequently visited, assumed to be a mainstay of the cultural fabric regardless of external economic conditions and – most importantly – it collects little to no money from any of the end-users who interact with it. Sure, parishioners may make a slight donation to the local church, but the real funding comes from other sources; likewise, Joe Web Surfer doesn’t directly pay Google for anything, with the possible exception of a buck or two for extra Google Drive space or Google Play Music All Access. Hence, the actual business of Google is abstracted from consumers, who end up spending little or no time contemplating how or why it could go belly up – it’s not like they can point to reduced foot traffic or ridiculous clearance sales as harbingers of decline.”
Google was a trailblazer not just in search engines but also in the $0-per-use business model that is now pervasive across the Web. A writer like me can churn out reams of “content” for the insatiable Web yet receive a fraction of the compensation of someone who builds the technical infrastructure that supports ad bidding and placement. Readers by and large pay nothing, existing only as “clicks” and “pageviews” to be monetized by irritating ads served by Google. And yet without us or the shrinking number of sites that still try to do original reporting instead of running on the homogenous Game of Thrones/Silicon Valley/Hillary Clinton hamster wheel created by Google, Facebook et al, what would the Web (or anything else that relies on “content”) be?
Alphabet is being spun as an exciting new phase in ex-Google’s history – no surprise there – but it feels like decline. It comes off as an inevitable running-aground, at a time when ads are resisted with rising fervor (blocking is coming in iOS 9) and all the low-hanging technological fruit – i.e., satellites, microprocessors, IP networks – spawned from the 20th century’s public sector investments has been picked, leaving only relatively mild “innovations” like Material Design or streaming music services to suck up the media oxygen and leave the sycophantic tech press breathless. The “big problems” that ex-Google is now structurally committed to solving are more of the same, fantasies spawned from direct obsessions with sci-fi and now-dominant geek culture. Google is normal, and Alphabet only confirms it.
The first time I ever used an iPhone was in 2008, not long after I had moved into my first studio apartment in Chicago, equipped with a circa 2006 Motorola Razr and no home Internet connection for the entire first month. Someone in the required core class of my M.A. program let me look at her iPhone 3G, and I was impressed at how much more capable it was than my flip phone. Easily scrollable contacts, maps, games, a real Web browser: 2008-me was bowled over. That feeling of novelty continued for a while, through at least the (leaked) introduction of the iPhone 4, after which most of the major smartphones coalesced around hi-res displays, fast cellular networking and Wi-Fi, and a comprehensive selection of apps.
That same feeling of “this is so obviously the future” did not accompany my unboxing and initial try-on of the Apple Watch Sport I ordered in late April and received in early May 2015. That’s not a slight at the Watch, though. Almost no device in the history of consumer electronics was so well-positioned for dominance as the first few iPhones, which were right at the intersection of a device class that many consumers were already invested in (cell phones), functionality that they already used on other discrete devices (e.g., MP3 playing on iPods; Web browsing on PCs and Macs), and a global set of business and technology standards (carriers and the GSM specifications) that could be tapped for growth.
The Watch, in contrast, is entering into the loose, mostly unproven “smartwatch” category. It isn’t replacing networked devices the way the iPhone succeeded the Razrs and Samsung Blackjacks of the world. Its predecessors for many consumers seem to be either inexpensive wristwatches – with or without digital displays – or nothing at all. For my part, I never regularly wore a watch prior to trying out the Apple Watch Sport. In this sense, the Watch is a lot more like the original iPod than the iPhone: a new approach to activities that were largely un-networked or mostly “offline” in the past (music listening in the case of the iPod; health tracking, timing, etc. with the Watch).
I tried on both the Apple Watch and Apple Watch Sport at the 5th Avenue Apple Store prior to ordering. The Sport was significantly lighter, a big advantage for me in particular since I was planning to use it mostly as a companion during exercise, not as a piece of jewelry. My review will focus only on the Sport, 42mm model.
The Sport comes with short and long bands, both made of fluoroelastomer. Putting on the Watch took me a few tries to get right: The notch is lined up with a series of holes – like fastening a belt – and then tucked into an opening to complete the loop. The bands connect to the Watch body via two slots that can be controlled with the small buttons on the back of the Watch. You may need a pencil or long fingernail to press them precisely enough to release the band from the Watch itself and swap it out for one of another size and/or color.
All versions of the Sport have a light gray exterior, except for the black band version, which comes with a space gray casing. The screen is OLED Retina. Its lower resolution vis-a-vis the iPhone 6 Plus (with which I paired mine) is apparent when viewing screenshots on the iOS/OS X Photos apps. On the Watch itself, however, almost everything looks great, especially Apple’s own apps. Craig Hockenberry has broken down the relevant differences between LCD and OLED display technology.
There are four exterior hardware buttons on the Watch in total. The Digital Crown is the main one. It serves as the toggle between the watch faces and the app “honeycomb” (the equivalent of the home screen grid on iOS), as a back button (if you press it once while in an app, it goes back to the honeycomb; if you press it twice from the watch face, it goes back to the most recently used app), as a scroll control, as a Siri activator (long press), and as the way to perform “regular” watch functionality such as setting a timer. The other button on the side does much less. It brings up your contacts and serves as a power button if held down.
The underside of the Watch houses the lights that are used to measure your heart rate, as well as two more buttons that release the band clasps. The lights enable a common medical technique in which green light is shined onto your skin to be absorbed by the blood flowing just beneath the surface. Since blood is red, it absorbs green light, and the Watch sensors can track the amount of absorption to approximate your heart rate. The Watch also includes a haptic feedback engine that gives you a tap on the wrist for certain notifications.
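The green-light technique described above is known as photoplethysmography (PPG): reflected brightness dips once per heartbeat, so counting the dips over time yields a pulse rate. As a rough illustration of the principle only – Apple’s actual signal-processing pipeline is not public, and everything below (the sample rate, the synthetic signal, the helper names) is a made-up sketch – here is how a heart rate could be recovered from such a signal by counting upward crossings of its mean:

```python
import math

# Hypothetical sketch of photoplethysmography (PPG). Blood absorbs green
# light, so the reflected brightness dips with each pulse; one upward
# crossing of the signal's mean corresponds to one heartbeat.
SAMPLE_RATE_HZ = 25   # assumed sensor sampling rate (illustrative)
TRUE_BPM = 72         # heart rate baked into the synthetic signal


def synthetic_ppg(seconds=10):
    """Fake reflected-light samples: a pulse wave plus slow baseline drift."""
    samples = []
    for i in range(seconds * SAMPLE_RATE_HZ):
        t = i / SAMPLE_RATE_HZ
        pulse = math.sin(2 * math.pi * (TRUE_BPM / 60) * t + 0.5)
        drift = 0.3 * math.sin(2 * math.pi * 0.1 * t)  # motion/ambient noise
        samples.append(pulse + drift)
    return samples


def estimate_bpm(samples, rate_hz=SAMPLE_RATE_HZ):
    """Count upward crossings of the mean (one per beat) and scale to BPM."""
    mean = sum(samples) / len(samples)
    beats = sum(1 for a, b in zip(samples, samples[1:]) if a < mean <= b)
    duration_s = len(samples) / rate_hz
    return beats * 60 / duration_s


bpm = estimate_bpm(synthetic_ppg())
```

A real sensor would have to contend with motion artifacts, varying skin tones, and loose fits, which is presumably why the Watch pairs multiple LEDs with adaptive sampling; the crossing-counting above is just the simplest possible version of the idea.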
Apple introduced another OS with the Watch, stylishly known as watchOS. I don’t think it’s possible to write a Siracusa-style review of watchOS since it seems designed to be as minimalistic as possible. It basically has two faces (no pun intended).
The first is what I would call the watch side, which consists of the face you have chosen as well as the glances below it and notifications above it. The face displays the time as well as any “complications” – weather, physical activity, Calendar events – that are clickable and can take you to the respective Watch applications.
As on iOS, the notification center slides down from the top and shows all notifications from both Watch applications and from your iPhone. For example, you may get pings from Activity on your Watch (an app that doesn’t notify you on your iPhone), as well as from iPhone-only apps such as iTunes Store.
Glances are pulled up from the bottom, much like Control Center on iOS, and they are navigable from left to right. Glances, more so than fullscreen app immersion, seem to be the de facto way of interacting with most software on the Watch, especially since they can convey a lot of information before the Watch screen automatically powers itself off after 7 seconds of inactivity (to save battery). You can pull up a glance to see where you stand on your Activity ring, what the current weather is, your most recent heart rate reading, etc.
You can even control any audio playback (from any app) on your iPhone using the Music glance – this has been an immensely useful feature, allowing me to avoid fishing out my phone while on the subway or walking. There’s also a Watch equivalent of Control Center, with familiar options such as Do Not Disturb alongside new features such as pinging your iPhone if you can’t find it (surprisingly common!).
The second is the app “honeycomb,” buried behind a click of the Digital Crown. Using it is straightforward if you have ever used iOS. Tap to enter an app. You can touch and drag to rearrange the apps. This screen is the only way to get to useful utilities like the stopwatch, workout tracker, and certain third-party apps.
Many of these – Mail, Music, Maps – I never enter. The glances are usually enough. The standout (“killer” is so cliche) app is Activity, which gives you a look at your moving calories, minutes exercised, and hours during which you stood and moved around for at least one minute. It has a useful glance, and by default it proactively notifies you when to stand (if you haven’t done so during the first 50 minutes of the hour) and gives you updates on your progress.
So what non-Apple apps are good? I agree with John Moltz that Yummly helps enhance the Watch’s utility in the kitchen. It’s great to be able to dismiss notifications or respond to texts via voice or canned replies while you are in the middle of cooking and don’t want to get your phone even more dirty than it already is. Uber provides one-tap car requests. Day One makes it super-easy to make a diary update from your wrist.
Beyond those three, my favorite so far is Acorns, the micro-investment app. You can view your portfolio value from the glance, and use Force Touch (i.e., a harder-than-normal tap on the screen) to invest more money. It really takes advantage of the Watch’s software and hardware.
For now, Watch apps are not downloaded on their own. Instead, they are installed when you download the “parent” iPhone app to your iPhone. So if you install Fly Delta on your iPhone, its Watch app is sent to your Watch. Deleting the iPhone app will delete the Watch app.
The Watch has its own Settings app, which controls functionality like Siri and Do Not Disturb. The more nuanced controls, though, are found in the Apple Watch app now installed on all iPhones running the most recent version of iOS. From there you can do everything from rearrange the honeycomb to toggle which apps do and do not appear on the Watch. System updates to watchOS are also handled from this iPhone app.
Third-party app performance is spotty. The apps are essentially running on your iPhone and then transmitting data to your Watch over Bluetooth and Wi-Fi (both must be turned on all the time to keep the Watch paired).
Swapping in additional bands is easier than I thought. Apple sells them for around $50 each for the Sport. You’ll need to click the two mini-buttons on the back of the Watch to release the clasps so you can insert the new band.
Don’t expect another iPhone – and that’s a good thing! The Watch is meant for now as a companion to your iPhone, not a rival to it. It shines most during workouts – especially the Sport, with its lightweight aluminum and durable bands – in the kitchen, and as an interactive terminal for dealing with notifications. Both its software (mostly glance-oriented) and its hardware (no cellular connectivity) speak to its limitations. It takes the pressure off of feeling compelled to look at your phone all the time – if you get a call or message, it’s shown right there on the Watch, so you won’t miss it.
It feels and looks good on the wrist and is nicely unobtrusive. I worried beforehand about it being a mini-iPhone attention black hole, sending a deluge of notifications, but those can be toggled as needed, and I don’t get that many to begin with. The mythical “steep learning curve” is just that – I didn’t find the Watch hard to get used to at all, and I expect I am hardly alone judging by the user satisfaction numbers so far. Definitely worth a try if you already have an iPhone 5 or newer.
“The Internet” is often lionized for its effects on what are, to the well-off people who can even use them, trivialities. Parsing the praise for services from Uber to Airbnb, the depoliticized reader can just imagine the sheer horrors of the bygone dystopia in which she had to dial taxi services (on a phone, no less!) or put up with the indignities of hotel reservations. Few have popped this Internet hype balloon with more aplomb than Ha-Joon Chang, who in a sublime chapter in his book “23 Things They Don’t Tell You About Capitalism,” convincingly argued that the washing machine was a more important and socially progressive invention than “the Internet,” since the latter has mostly benefited our leisure lives. Has “the Internet” really sparked a “revolution” because of its ability to ease the discovery of nearby tapas joints?
I have put “the Internet” as well as “revolution” in quotes for a reason:
- First, although it is by default discussed as a non-political, reified force, “the Internet” is a social relation, built and managed by humans in accordance with the politics and class systems of their societies. “The Internet” is not a thing; it does not exist in nature and there is nothing inevitable about its character. As such, discussions of its abilities to influence human relations (this phrasing alone shows just how reified it has become) cannot simply trace its history of technological updates – e.g., the creation of Ethernet, the introduction of TCP/IP, the advent of Wi-Fi, etc. – but must also include the circumstances that attended these changes.
- Second, “revolution” is an odd choice by the technology commentariat for a descriptor of “the Internet”‘s impact, considering that revolutions are political affairs and often – as in the case of the ones that occurred in Russia and China last century – anticapitalist. Nevertheless, bloggers such as Ben Thompson of Stratechery conceive of an “Internet Revolution” that will rival the Industrial Revolution in scope and lasting power. I wonder if he and others realize that the Industrial Revolution only brought about the polished bourgeois world of “tech” through centuries of class domination. As Lenin once said, “Advances in the spheres of technology and science in capitalist society are but advances in the extortion of sweat.”
It is empirically the case that the world’s Ubers and Airbnbs, its Googles and Facebooks, and its iPhones and Fitbits, are only possible via a vast, often unseen (or ignored) store of labor – Marx’s “hidden abode of production” – that cannot possibly compete with the “noisy sphere of consumption.” Whether 1099 contractors (your Uber driver), unpaid “content” contributors (everyone on every social network ever), or literal slaves (the children and poor who extract the metals that go into many consumer electronics), it’s safe to say that the actual grunt-work of the so-called “Internet Revolution” is not being put in by programmers logging “80” hours a week (see Peter Fleming’s breezy takedown of the overwork culture from a few months ago) but instead by the massive global underclasses.
Revolution and “normal people”
I just finished Astra Taylor’s excellent book “The People’s Platform,” which I discovered from a review on Fredrik deBoer’s website (which I essentially binge-read this past week). I could not recommend it more highly – it is a sober, well-researched, impeccably written corrective to the idea that “the Internet” will inevitably enable an egalitarian makeover of society because of how its users now have “open” access to so much information, each one with a smartphone in her pocket to become her own filmmaker or reporter.
John Pat Leary expressed similar sentiments in a brief piece for Salon recently, going after the buzzword of all buzzwords, “innovation”:
“[I]nnovation transforms processes and leaves structures intact. Thus, instead of reinventing housing or transit, “innovators” mostly develop new processes to monetize the dysfunctional housing and transit we already have, via companies like Airbnb and Uber. It’s one thing, therefore, to celebrate novelty indiscriminately — as if meth labs and credit-default swaps are not innovative — but what if the new isn’t even very new at all?”
Uber introduced a smartphone-accessible CRM on top of the existing taxi and limo infrastructure. Airbnb has done something similar with tons of rental units. But against these technically trivial changes to the topmost layers of huge social systems of labor (i.e., transportation and housing), Thompson et al still hold out hope for a techno-utopia:
“At the risk of painting too broad a stroke, it seems to me that much of the opposition to changes wrought by the Internet undervalue the positive impact said changes have on normal people. For example, people despair over newspapers closing without appreciating the explosion in quality content freely available to anyone anywhere in the world, the net result of which means those who choose to be can be far more informed about far more things than just a few years ago. Others gripe about Facebook’s frivolity or it and Google’s collection of data without acknowledging that both have fundamentally changed how we relate to both those we know as well as anything we wish to know.”
I’m not sure who these “normal people” are – perhaps they are inserted as a semantic complement to the “revolution” terminology often thrown around in these contexts, to exploit the notion of a pleased proletariat, albeit one diminished by being reduced to a set of passive consumers (of services like Uber) rather than active citizens (who would have the political legs to bargain against more powerful interests). I think Thompson overestimates the demographic variety of “Internet” users, many of whom are well-off males using services designed by others like them. Anyway, the rest of the paragraph is chock-full of the types of arguments that Taylor spends her entire book debunking:
- Many newspapers have closed, but the likes of BuzzFeed (a Thompson favorite) and HuffPo (to name but two) have hardly replaced the investigative journalism and noncommercial writing (i.e., research and reporting that knows nothing of a world of “sponsored content” or other euphemisms for payola) that they performed. There has been a change in incentives, to say the least.
- “Freely available” is a misnomer. It’s free to the person who navigates to the site, in that she doesn’t have to pay for access. But she pays a huge price in attention-drain (ads), privacy (tracking from ads), and social safety nets and programs (from the downward pressure on wages of writers, created by giving away so much stuff for “free”).
- It is naive to think that “the explosion in quality content” (there’s that word again) means that everyone is going to be an adept hunter-gatherer of the precise items needed to be informed in a democratic society. Instead, there’s the echo chamber facilitated by Google and Facebook, which show us mostly what we have already seen (what a depressing lack of imagination), and the lingering effects of churnalism – tons of articles cranked out by writers toiling for fractions of a cent per word – meant to play to the profit-driven “platforms” of the major Web companies rather than the public interest.
- If this seems abstract, imagine if, say, the police department in your community were dissolved tomorrow and replaced by an “explosion” of “law enforcement content.” So instead of a publicly-funded, equal opportunity service offered to the community for the price of their taxes, the replacement would be an abundance of private sector for-profit law enforcement agencies that would have no motive other than to make as much money as possible. It would soon become evident that this incentive does not align with basic principles of health, safety, or community.
- Similarly, “the normal people” who ostensibly benefit from the convenience of a service like Uber end up paying a price in the privatization of basic services such as transportation. Capitalist organizations, unlike public agencies, are incentivized to accumulate capital but not necessarily to act in the public interest or even be fair to all customers.
- To continue with Leary’s observation, “innovative” services indeed carry over and magnify the flaws of the systems that they purport to replace, to the extent that for every convenience they unlock they seemingly burden someone else with a new injustice (more extortion of sweat indeed). That could take the form of increasingly long, low-paying hours or new wrinkles such as disregard for disability laws in the case of Uber, despite the latter’s much-ballyhooed elimination of the old-school discrimination of cabbies passing up fares whose appearance they didn’t like.
- Anecdotes decrying the workings of Uber, Facebook, et al are often dismissed out of hand by the technology commentariat – Thompson himself talks of “countless anecdotal stories about how a company valued at tens of billions of dollars is taking advantage of drivers earning tens of dollars per hour at best” and sets it aside by dismissively asking “what drivers ought to do otherwise.” This sort of preemptive exasperation is common in tech-Twitter and on like-minded blogs – this idea that unions, labor bargaining, and more equal distribution for workers cannot be part of any Serious Conversation about the issues (I’m using “Serious” in the delightfully derisive Paul Krugman sense here).
- And yet, these writers’ own anecdotes – Thompson’s piece has one about his stay in an Airbnb and how using a hotel would have been personally prohibitive – are often used to show how great the Internet-enabled services in question actually are!
What I am ultimately getting at is that there is a clear structure to all the towering talk of how “the Internet” is “revolutionizing” every field from medicine to education to journalism. That is, there is a surface layer consisting of examples such as Uber, Airbnb, Facebook, etc. that have tackled recreational concerns – these are the trivialities I discussed earlier. Beneath that, there is the layer of the much more substantial changes of uprooted labor, unpaid toil, and erosion of the power of public institutions, all of which are being effectively obscured by the bourgeois “problems” that the most prominent Internet services solve.
We extol Uber’s “disruption” (I hate this word – only a hyper-capitalist would be fascinated by how pushing down labor costs can make products more competitive!) – its replacement of a telephone- and thumb-enabled good with an Internet-enabled one – speaking of a minor change in ordering/billing infrastructure as if it were world-altering. We do this when the big change is really the degradation of the cabbie profession, the privatization of transportation options, and the continued dominance of capital over labor. We are paying a huge price for what amounts to trivial advances for the upper classes…