One of the fascinating habits of the current far left (note: I consider myself a “liberal”) is an enormous propensity to protest small offenses while staying silent on large ones. Hence we get seemingly daily unrest on college campuses over “trigger warnings” for Ovid, while these same easily affected individuals as well as the masses of “moderate Muslims” that supposedly exist around the world barely raise a finger against ISIS. Professor Peter Boghossian has brought this disparity in priorities up many times on his Twitter feed.
Anyway, let me take a slight tangent here and say that I think the presidency of George W. Bush was one of the worst things that has ever happened to the U.S., but not for the reasons that you might think. His policies were awful and ensured massive economic inequality, environmental degradation, and discrimination, but alone they weren’t enough to make him any more historically terrible than the likes of James Buchanan (did nothing in the face of the Confederacy) or Warren Harding (enormously corrupt). Instead, what really put Bush over the top is that he ruined an entire generation of liberal thinkers.
Events such as the Iraq War and Bush’s own outspoken Christianity filled the liberal consciousness and even now continue to ferment there, making it impossible for the left to talk seriously about pressing issues. Here’s what I mean: Seemingly no one in the liberal spectrum in the U.S. (or even in other parts of “the West”) can, say, call out ISIS for being barbarians because their first instinct is instead to babble on about how the Iraq War was a mistake, how if we weren’t so greedy for oil none of this (this meaning ISIS and the implosion of the Middle East status quo) would have happened, and how we aren’t any better because after all much of our country is religious too.
Just look at this recent Jezebel thread, which leads off with a revealing story about how ISIS is terrorizing women and institutionalizing exactly the kind of misogynistic social system that one would expect to see in a culture that used the Koran as its moral guide. It takes less than 10 comments (as of the time of this writing) for someone to start talking about how the best way to help would be to time travel back to 2003 and not invade Iraq:
What a waste of intellectual energy (and a time machine, if we had one! so many other things I would change instead of that). Then we get a photo of Saddam Hussein, one of the worst people who ever lived, seemingly with the subtext that leaving this madman in charge was OK and somehow a solution to another group (ISIS) simply taking the Koran at face value.
Then we get plenty of tediousness about racism and “Islamophobia,” a made-up condition originally peddled by conservative Muslim clerics:
In case you weren’t aware, “Muslim” is not a race, and many of the people in the “Middle East” are not even the same race, nor do they technically have even the same religion (indeed, these schisms are at the heart of the long-running tension between Iran and the Gulf States). The world’s most populous Muslim countries are as follows (top 7):
That is enormous ethnic diversity, spread out over the whole of Asia and Africa. If criticizing Islam is indeed “racism,” then what race, exactly, is the target of the barbs? Maybe Ben Affleck and “sowhatiswhat2″ can pick one that they think represents the “true” Islam, which brings me to another point, about the first post in the above screen grab: Who even decides what does or does not have anything to do with a given religion?
We always hear that ISIS has “nothing to do with Islam,” despite the group’s incredible knowledge of the Koran and its adherence to the book’s literal meaning. In fact, based on textual interpretation alone, it makes more sense to call ISIS “fundamentalists” rather than “extremists” – the latter word more accurately describes deviation from the basic meaning rather than sticking to it. I suspect that the commenter is less familiar than ISIS itself is with the Koran’s passages about women and the pains of hell for infidels (and the fact that it says that most of the people in hell are women!).
There is no authoritative body, nor any rigorous, scientific test, for determining whether something is “truly” Islamic or Christian or not; the idea is laughable. 400 years ago, it was very Christian to burn people at the stake for being witches, but now it isn’t. Nothing has changed in the actual texts or rituals of Christianity; there was simply a cultural shift. Who knows, maybe someday beheading infidels, crashing planes into buildings to be transported to a magical sky-bordello, and enslaving women will be culturally unacceptable as “Islamic” actions, but for now there are clearly millions who think that they are quite “Islamic.”
The thread only goes further off the rails from there, bringing up irrelevant matters like the Holocaust:
The comment about Muhammad being judged by “modern standards” is curiously weak and contradictory. The “standards” implied would seem to be ones that say murder and pedophilia are bad, and yet they would seem to originate with the same modern “fetid culture” that allowed the Holocaust to happen and can’t judge itself?
Moreover, try to imagine ISIS without the historical example of Muhammad and without the edicts of the Koran to fall back on. It’s impossible. Sure, violence would exist without religion, but it’s not a given that this particular type of violence – with its peculiar dimensions of misogyny, sharia, and the awaiting of the return of Isa to earth (that’s Jesus, by the way) – would. And don’t even think about appealing to “human nature” as the cause of this discord, since that concept is itself heavily indebted to centuries of religious influence and concepts like original sin and the roots of humankind in dust, blood, and other “dirty” materials.
Of course, I made the mistake of reading the comments. Guess I learned my lesson.
Twinks. Bears. Leathermen. The gay universe has more labels than Dr. Tim Whatley’s label maker could ever churn out. I guess you could throw “bi” in there, too, while we’re at it. Bi the way, back in 2005, I remember describing myself as “bi” to someone in college and he told me right away that there was no such thing as “bi,” while taking the chance to talk about how “coming out” as gay made as little sense as “coming out” as black (obvious visual differences aside, this wonky argument has stuck with me as a potential line of investigation into how race is constructed – like, why is Barack Obama “black” despite having at least 50 percent “white” DNA?).
Anyway, imagine that you have taken the stereotypical coming-of-age route of being gay in a country like the U.S. or China. Let’s say you’re “closeted,” that you cannot indulge an affinity for the same sex (to put it pretentiously) without real effort and a great deal of contrarianism. You grow up against the grain, whether said grain is religious bullshit (like in many parts of the U.S.) or anxiety about losing an only child to non-reproduction (like in China). Then once you mature, maybe you escape to some place like San Francisco or Chicago or New York where being “gay” draws about as much attention as a single blade of grass in Central Park.
Good job, you’ve seemingly won a battle against The System and gained the freedom to break out on your own. Except not. Subcultures beckon, and moreover, Gay Culture itself, with all of its norms and stereotypes, looms large.
Let’s start at the physical layer. Much of the gay world is obsessed with going to the gym to obtain some ideal Greco-Roman physique, all the while fueling the timeless chant “no fems fats or asians” when looking for potential hookups/partners on the Internet. In just this one habit, we can see the long shadow of the privileged men of antiquity, who had no concept of what “gay” would even mean but definitely prized bodily beauty, were indifferent to women, and had sex with other men.
So think about it. After a life spent in the closet, the reaction is to rush into some of the most rigid conformity possible. “Straight” men face nowhere near the pressure to be skinny, nor are they seemingly subject to stereotypes about musical tastes, voice intonations, blah blah blah. Maybe it’s a protective measure – to band together to survive as a “minority” (even though I have expressed here before that I think orientation is enormously malleable and non-deterministic) – but it is, in all of its Pride Parade/Boystown/Rainbow Flag splendor, conservative (I mean this without political connotation of any sort) at the core, much in the way that wearing black is a conservative way to seem “cool.”
Even subcultures opposed to the macro Gay Culture enamored of skinniness are no better when it comes to being different. They are exclusionary, too, just on their own sets of principles, such as hairiness and bulk in the case of bears. Plus, the sheer ease of finding man-to-man encounters (it’s on a whole different level than woman-to-man ones) means that enforcing one’s own prejudices – not “fit” enough, too much of a “jock” (though this one might be a little hopeful in its contrarianism) – is easy because raw supply is so vast (pun not intended).
Perhaps the lesson here is that gay and straight people are barely even different. The paradox is that in its attempt to break so much with the dominant heteronormativity of certain countries, the gay world has shown that it, too, is just as prone to cliquedom, bullshit rules, and groupthink as everyone else. It just seems depressing that the reaction to The Closet of heteronormative culture is basically to find another one that’s even more exacting in its dimensions.
A while ago I wrote a post about Talking Points Memo (TPM), the left-leaning media property that publishes news as well as trenchant and often witty editorial commentary on those items. I noted in my older post how I thought the site had gone off the rails on issues such as the mythical concept of “Islamophobia” and given up on protecting liberalism from the destruction wished upon it by the hard right, especially the religious (and even more specifically, the Islamic) segment of it. It felt like a dark day when even one of the Web’s best liberal sites could muster little more in the face of the Charlie Hebdo murders in France than to point out that “atheists” could be violent and that we should be careful not to be racist toward Muslims (this is a weird sentiment since, well, “Muslim” is not a race).
I soon forgot about TPM’s middling response to Charlie Hebdo, especially as I became immersed in the reactions of Sam Harris and Peter Boghossian, to name but a few. While the far left tip-toed around the fundamental issue of a violent, 7th century moral philosophy holding the civilized world hostage, these thinkers – suddenly and wrongly pitted as “far-right” by the leftists who would deny that Islam as a religion had anything to do with the murders of people who had drawn cartoons of Muhammad – these thinkers offered incredible clarity:
“When will the far left realize that holding particular political views doesn’t make you a good human? Treating other humans well does.” – Boghossian
“People have been killed over cartoons. End of moral analysis.” – Harris
OK, so these are simple sentiments. All the same, they’re ones that too often get overlooked in the drive toward some dystopian new extreme variant of 1990s political correctness. Now I consider myself a liberal, and could never subscribe to the vile conservative ideologies currently in vogue in the U.S., where disdain for the poor, minorities, and women is rampant. But the left’s refusal to defend liberalism (and freedom of speech in particular) makes me feel like I almost don’t have a political home anymore.
Let me backtrack for a moment, all the way to 9/11. I remember the narratives that came up not long after, from the likes of the almost parodic Noam Chomsky (a favorite author of Osama Bin Laden’s) and the seemingly self-hating Glenn Greenwald (a gay Jew who will defend Islamic extremists to no end) and the notorious Ward Churchill (who compared the occupants of the original World Trade Center to “little Eichmanns”), about how 9/11 and other like events were a predictable reaction to American evildoing abroad and a legacy of empire, much in the vein of the 19th and early 20th century European colonial powers. In a way, this is a satisfying outlook because it seems to apply the crystal-clear axioms of Newtonian physics (“For every action there is an equal and opposite reaction”) to the muddy waters of geopolitics and culture.
There are too many ethical and anthropological objections to be made to the far-left’s critique of 9/11, though, as well as its response to ISIS (“created by George W. Bush,” according to one naive college student) and the recent attempted shootings in Garland, Texas, at a “Draw Muhammad” cartoon contest. Which brings me back to TPM. In the wake of this shocking event – two heavily armed men trying to kill someone over drawing a cartoon – the site’s editors thought that the best response was to look into the history of the event’s organizers. The headline read:
“What You Need to Know About the Anti-Muslim Extremists Attacked in Texas”
Who are the “extremists” here? Not the attackers – described plainly as “two gunmen” – but the organizers, who had courted disaster by merely inviting others to draw a 7th century warlord. It’s almost too absurd to be sad. Are we really a society in which it needs to be spelled out that threatening murder over ink drawings and paintings is barbaric?
Thinking about this sort of victim-blaming from 9/11 to Garland, I am struck by how racist and colonial it is. No, you didn’t read that wrong. Think about it:
1. Our actions, whether they involve saving the Muslim population of Kosovo in 1999 from certain genocide or making an ill-advised incursion into Iraq’s Sunni/Shiite divide in 2003, are imperialist and wrong – basically, we should know better, the logic runs, since we’re an advanced society.
2. On the other hand, the people who blow themselves up thinking they’ll be transported to a magical brothel in the sky, they’re fine because, well, we can’t expect anything better from them – just violence. It’s the noble savage myth – i.e., that rustic peoples had a better grasp of life and health – with a terrible twist, that twist being that in order to preserve the nobility of a “savage” culture, extreme violence is needed. Moreover, when these people clearly state that they did something to avenge Muhammad or Allah, we don’t take them at their word – instead, we probe for some “deeper” (read: non-religious; yet religion is at the same time defended from proper critique, which makes no sense) cause that somehow involves the U.S. making arms deals with the Saudis or saving the two holy mosques from Saddam Hussein 25 years ago.
This is all so much condescension toward other cultures, believing that they have no way to even be brought into liberalism because, well, religion (in this case Islam) is all they can ever have. It extends so far as to exempt religiously motivated attackers from blame at all, while the likes of Pam Geller – however “extreme” in their free expression – are indicted for imaginary crimes like “hate speech” and “blasphemy.” Entire concepts like “Islamophobia” are also invented to try to put true extremists on the same level as the world’s Jews, who have suffered from millennia of anti-Semitism.
So much of what is enjoyable and good in the liberal world – from being able to walk down the street without worrying about receiving a death sentence for wearing mixed fabrics a la Leviticus, to being able to eat bacon and shrimp – is only possible, sadly, because generations of Christians and Jews have given up on defending their “sacred” texts to the (literal) death. I understand the left’s desire to defend the underdog here, but I also despise its low expectations for the Muslim world, plus I despair over its inability to see that terrorism and violent reaction to cartoons are not some deeply engrained part of “human nature” but in fact particular to a specific religious and cultural tradition. We need to stop neurotically looking for what’s wrong in ourselves and realize that we are increasingly at the mercy of people who believe some of the most absurd things about the universe.
Not long ago, I finished an astonishing book called “The Western Illusion of Human Nature” by Marshall Sahlins, an anthropologist and professor emeritus at the University of Chicago (where I spent a formative year from 2008 to 2009). Ever since 2004, when I took an introductory class on Shakespeare at Brown University, I have been immensely skeptical of the notion of “human nature,” mostly because, as Mark Twain once quipped, “generalizations aren’t worth a damn” (think about it). My dislike of the term has amplified over the years as I came to see “human nature” as not just a hasty generalization but also a profoundly negative and deterministic outlook on life.
Genetic determinism: The descendant of “original sin”
Why are we the way we are? A simple glance at nearly anything in any inhabited city should give you a preliminary answer: When you consider all the buildings, roads, vehicles, elaborately clothed tourists, mobile phones that can connect to wireless data, you take in – from every angle – so many things that humans “weren’t intended to do.” We weren’t “intended” (by whom? by what?) to extract crude oil from the ground so that it could be refined into gasoline that would power trucks that would deliver a Wi-Fi- and Bluetooth-enabled watch to someone’s doorstep. We weren’t “intended” to eat dairy, gluten, or whatever else the peddlers of dietary fads have deemed the mythical source of all our ills. We weren’t “intended” to turn California into an agricultural superpower capable of producing almonds, artichokes, strawberries, and so many other foodstuffs at unprecedented scale.
What I’m saying: Our world exists because of culture, with all of its “unnatural” or “unintended” things being cultural evolutions peculiar to peoples and periods of time. Moreover, all of its perks and drawbacks are the results of cultural choices about what is acceptable and important. The fact that we don’t have universal health care in the U.S. but pay programmers hundreds of thousands of dollars a year? That’s a cultural attitude, not a predetermined genetic outcome.
Human cultures predate the homo sapiens sapiens species by thousands of years, and our evolution – with peculiarities like helpless infants – has been guided by our complete immersion in culture from cradle to grave. Consider this: Even throwback diets like paleo, which aim to escape contemporary culinary habits by going back to what “cave men” ate, are still completely beholden to the artificial and culturally determined selection of the best-tasting and best-looking fruits and vegetables (there were no perfectly golden, creamy bananas “in nature” before humans began cross-breeding cultivars) and the most-consistently bred livestock (raising even a single cow requires immense amounts of food, grown with “unintended” methodologies, and supported by “unnatural” antibiotics and drugs) available. Aesthetics and prejudices (e.g., the unwillingness to eat insects or dogs in the West) play a huge role in simply influencing what is even on the table. We are no more “designed” to eat only preagricultural foodstuffs than we are to speak only pre-Indo-European languages.
The idea of a savage man in the wilderness, eager to kill someone just to get his own immensely healthy and faultless food and all the while in desperate need of an iron-fisted enforcer (i.e., Hobbes’ Leviathan) is a myth that owes more to the pessimism of the English Civil War and the legacy of the vile Abrahamic religions than to any actual evidence of how humans organize their lives and groups. It assumes the worst about us and becomes a self-fulfilling prophecy, marketed by a culture eager to exercise control mechanisms and tell us what is and isn’t “natural” and “intended.”
The eagerness to attribute everything to genes, to a built-in “human nature” that is most usually associated with greed, sexual wantonness, and violence, is a remarkably flimsy idea that is nevertheless mentioned with such gravitas whenever someone has to condescendingly explain a “harsh truth” to someone else. I mean: Would you listen to someone babble on about the Christian notion of “original sin” in explaining why bad things happen? “Human nature” is “original sin” in another guise: St. Augustine hypothesized that original sin was transferred from one person to his children via semen, a ridiculous argument that nevertheless has been reincarnated in the notion that there’s a certain “human nature” genetically copied across the generations that accounts for everything from men cheating on their spouses to office managers being ruthless political animals who must be corralled.
We ascribe to our own species a level of savagery that we wouldn’t assign to the worst of our relatives in the animal kingdom, and that doesn’t even exist in species like wolves that are often associated with evil despite their camaraderie. We paper over the obvious cultural destruction wrought by specific religions by instead saying that all this evil was inevitable because it had everything to do with “human nature” and nothing to do with fundamentalism.
Enough! What if instead of fixating on the negative traits we think are passed from one human to the other, we stepped back and considered how immensely in debt we still are to a Western cultural tradition that, for thousands of years, has put the “holy books” of the Abrahamic religions and the thoroughly pessimistic secular texts of Thucydides and Hobbes on a pedestal? We seem to see newborns and children as creatures that must be inoculated against some kind of savage nature, and then as adults held in check by paternalistic bureaucracies that prevent any lapse into a mythical “state of nature” that, due to culture’s pervasiveness, could never even have existed in the first place.
Almost any behavior that one would try to explain by appeal to “human nature” could be explained another way. Jacque Fresco once remarked about how he visited a Pacific island where everyone was naked all the time, and yet there was no evidence of constant sexual leering or abuse. If nothing else, his remarks are a good jumping-off point for thinking about how, say, Western uptightness about sexuality and a long legacy of optional sexism, rather than some inevitable “human nature” imprinted on the genome, has enabled things like catcalling and microaggressions.
Yes, yes, I know – we don’t know much about genetics yet, and eventually we will be able to explain everything via super-intelligent machines that can easily sequence DNA and analyze decisions. Bear in mind that the prioritization of such creations, as well as the ways in which they measure things (who decides how and what to measure, and how to interpret the results?), are all cultural realities, too, and could be reversed.
My sense is that we greatly, greatly oversubscribe to the notion of “human nature” because of the historical circumstances we live under, in which there is a dominant single power (the U.S.) with global reach sufficient to create a de facto common language (English) that in turn makes everything seem homogeneous, at least on the surface. If and when this state of affairs changes, I think we will eventually see the world’s disparate cultures and natures finally move out of the shadow of “human nature.”
Tim Chevalier recently penned an excellent essay on leaving “tech culture” behind, talking about how he came to see his work as a programmer as both exhausting and pointless:
“I have no love left for my job or career, although I do have it for many of my friends and colleagues in software. And that’s because I don’t see how my work helps people I care about or even people on whom I don’t wish any specific harm. Moreover, what I have to put up with in order to do my work is in danger of keeping me in a state of emotional and moral stagnation forever.”
I found this post via Twitter and then I read an optimistic response – i.e., said culture can be fixed if only the “right people” were in charge – from functional programmer and ex-Googler Michael O. Church on his WordPress site. As companies like Google and Facebook have become central to the neoliberal surveillance state, as Silicon Valley has become this weird locus for both entertainment (today’s smartphone addicts are just yesterday’s nuclear family sitting rapt in front of the rabbit-ears TV) and austerity lingo (“innovation,” “entrepreneur,” and “failure” are particularly odious), there has been mounting attention on the cultural problems in tech. These issues usually include:
- Rampant sexism, with the recent Ellen Pao suit against venture capital firm KPCB bringing it into play in a high-stakes trial (Pao lost). Software and hardware remain overwhelmingly male fields, and not by accident either (more on this in a bit). Facebook and Google, as well as much older firms like Apple, wield enormous powers comparable to those that have, at various times in history, been vested in states (e.g., surveillance in the case of the Web giants), while being driven by a laughably unrepresentative slice of humanity (the upper middle-class to the superrich, with strong biases in areas of race and gender).
- Workaholism, with 80-100 hour weeks described with violent euphemisms like “killing it” or “crushing it.” Just as Uma Thurman had to trek those steps in “Kill Bill,” it seems that everyone in tech has to live out the nonstop bullshit about being, or at least serving, a businessman/entrepreneur who works around the clock to build something that has questionable social utility as well as dark political potential capable of quelling outrage over inequality, since criticizing workaholics is surprisingly difficult in the U.S.
- Individualism, which may seem surprising in workplace cultures increasingly dominated by jargon about “teams” and “families,” but the strain is readily apparent in the archetypes of the cowboy coder, the “visionary” CEO, and the notion that a singular force like “tech” is the solution to political, anthropological, and social issues as well as technical ones. Just as so much anthropology has dissolved into discussions of “power” at the expense of all other motivations, popular social and political science have taken an acid bath in “disruption,” “innovation,” and “technical skills” and the priorities of a narrow group of individuals.
All of these trends are awful, and together they have created a “tech culture” that is enormously unfriendly even to many of the upper middle-class people whose equal love of social stratification and devices (for instance, Uber is not just an app, but a way around egalitarian public transport) would seem to make them perfect cogs in its machine. It’s gotten to the point of self-defeat. But why is “tech culture” this way? What is it about “tech” that makes it so much worse for women, specific minorities (minorities in the U.S., I mean; they could be and are “majorities” elsewhere) and no-nonsense introverts than, say, “trance music culture” or “content marketing culture”?
I don’t think it is attributable solely or even largely to there being so much money in the industry right now. The facile argument would be to just say that large amounts of money and competition drive people crazy, bringing out the dreaded base “human nature” for antisocial/Hobbesian violence and self-interest (the same “human nature” used to explain contrary concepts like sociability) and running roughshod over cultural penchants for equality.
But such outlooks are necessarily pessimistic about mankind, with roots in the idea of a dismal state of nature (an idea with a very specific, hardly universal, lineage from Thucydides to Hobbes, the latter being the former’s English translator). Plus, to make this argument while fretting over how to fix “tech’s broken culture” is to fuel the same fire that is in need of snuffing: Assumptions about the worst in people coming out in lieu of a paternalistic security-surveillance state are probably major causes of said “brokenness.”
So here goes, three reasons why “tech culture” as conceived can’t be fixed. My intent in writing this is not to paint “tech’s” problems as intractable so much as it is to highlight how the very terms of the conversation – problematic words like “technology” and “technical,” for instance – necessarily constrain any worthwhile corrective action, which would almost certainly require access to political and cultural channels that “tech culture” has willfully isolated itself from.
1) The term “technology” is a problem in and of itself
Like many words with Latin or Greek roots, “technology” gives off an air of sophistication to English speakers whose routine conversations are dominated by shorter words of Anglo-Saxon origin. Leo Marx has written extensively on how “technology” is a “hazardous category,” one that emerged as a way of appropriating the work and culture of blue-collar individuals – machinists, artisans, et al – for their white-collar counterparts, or at least the most visible artifacts thereof, particularly industrious men in lab coats (more indicative of wider “technical culture” than “tech culture,” but the gap between “technology” and “technical” is narrow and it’s sometimes not even clear what “tech” is short for anymore), serious politicians paying lip-service to “innovation,” and MBAs and executives of all stripes who have seen “tech” as the next road to riches.
“Technology” as a word is in this respect a lot like “innovation”: Its usage automatically draws a line between classes. Just as “innovation” is off-limits to the activities of individuals like bus drivers or shoe repairmen, “technology” is off-limits to the social underclasses and the demographically disadvantaged. Yet, the issues of “inequality in technology” and the “technology gender gap” keep popping up and confusing otherwise sensible people. Remember that the entire technology category was made to rebrand once-workmanlike activities like repairing a machine or reviewing specifically coded systems into something acceptable for the upper, mostly male, classes.
Technology is no longer even about labor, or the “hard work” that so many entrepreneurs pay homage to in order to subtly justify inequality, but about the production of artifacts, as Debbie Chachra so memorably put it in a piece for The Atlantic entitled “Why I Am Not a Maker.” This shift is neatly encapsulated in the Maker movement, which makes it clear that tangible (commercial) items, not behind-the-scenes care-taking or educational work, are the best representations of the value of one’s effort and class. The obsession with creating objects and turning yesterday’s castoffs – factory machine labor, geek culture – into tomorrow’s winning business formulas brings me to…
2) Geek/nerd culture may be at fault too
Saying the geeks won the battle over their one-time high school bullies has become cliche. Silicon Valley, Makers, DIY lifestyles: They all nominally benefit geeks who were “unpopular” (I don’t even know what this means anymore, in an age when anyone can get thousands of likes for a Facebook status or reddit post) in high school and invested themselves in “uncool” (not anymore) activities like reading comic books or programming.
There is a large overlap here between the rise of the artifact-obsessed technology culture of Silicon Valley, the newfound top dog status of geeks, and the enormous monetization in the West of properties like DC and Marvel Comics, Game of Thrones, and Star Wars, to name but a few. Disney (owner of both Marvel and Lucasfilm somehow) now turns to comics rather than Broadway (the inspiration for many of its old-school animated musicals) for its cash cows.
The point of this overlap, though, is that there is tremendous reinforcement from all over of the geek mindset, which of course prioritizes things – fictional characters, universes, continuities, costumes, etc. – over real people. Think of the enormous mental energy channeled into deciding between an Autobot and a Decepticon costume for a convention, for instance. There are finite resources and time in any day, and geek culture expends them on fantasy worlds in which factors like race, class, and gender are secondary because real people aren’t involved and the universe might be on the verge of being destroyed or something.
There is a striking parallel between what tech culture is producing – software, mostly, for limited audiences and use cases – and what the increasingly commoditized geek culture is churning out in the form of mass market film, comics, and games. The cultural issues that beset tech certainly have analogues elsewhere – sexism, racism, and crass individualism are in pretty much every office – but the industry faces the unique pressure of having to absorb tremendous amounts of money (from VCs and consumers alike) while sustaining its long tradition of lionizing artifacts and abstractions at the expense of nearly everything else (that’s why we get so much speculation about how much a “great programmer” is worth when we should be asking what an average one is worth).
3) The separation of “technical” and “non-technical” persons is more needless stratification
This picks up the end of the last point. Go to any story about the mythical “skills gap” in the U.S. or the bio pages of a tech firm’s website and you’re likely to run head-on into the word “technical.” Here again we have something that maybe once had tangible meaning – someone’s skill with a carpentry tool or machining process – being used as a preemptive moat against anyone with “soft” skills like communications, writing, PR, etc.
Of course it’s weird that politics – the “softest” skill of all since it is so hard to evaluate, with a “bad” grasp perhaps benefiting one as much as a “good” one would, depending on the context – is still what determines how organizations function. But if tech, and more specifically technical skills, is the answer to everything, politics goes out of sight (it doesn’t go away, but ends up sort of like the man behind the curtain in “The Wizard of Oz,” if you will) and the often random fortunes of a company and its culture are instead put under the illusion of a meritocracy.
“Technical” is a handy word in this setting because it implies easily quantifiable results and, let’s admit it, a certain look and behavior that informs all of the stereotypes and extreme cases, like the aforementioned cowboy coder. A technical person can produce artifacts, which are what tech is known for. The fact that artifact-making is traditionally a male habit and one with a long tradition of excluding all but a narrow cross-section of society becomes secondary, and the problems multiply.
This entry ended up being way longer than I imagined, and I have likely gone off onto confusing parenthetical tangents here. Ultimately, it feels like the tech industry’s hand-wringing over its culture problems is like the coal industry fretting about environmental damage: Admirable, but intrinsically linked to how it does business and why it has succeeded in the first place. We’ll probably have to rethink the entire category of “technology” to ever get onto better footing.