One of the weirdest pieces of advice anyone can dispense is that old chestnut, “follow your passion.” Its proponents have included billionaires Richard Branson and Steve Jobs, which gives a good sense of its social origins. Both Branson and Jobs were in positions to follow any passion, whether multinational business or recreational basket-weaving. Whatever value was assigned to these passions was secondary in light of overwhelming wealth.
But “follow your passion” has a certain allure, doesn’t it? What’s more, it has many sugarcoated variants, including “Do what you love,” which Miya Tokumitsu dissected for Slate:
“‘Do what you love’ disguises the fact that being able to choose a career primarily for personal reward is a privilege, a sign of socioeconomic class. Even if a self-employed graphic designer had parents who could pay for art school and co-sign a lease for a slick Brooklyn apartment, she can bestow DWYL as career advice upon those covetous of her success.”
Put it that way and being told to follow your passion is akin to being told to just sleep when suffering grievous physical injuries. Or perhaps it’s like being told to believe in God, basically, or to pray your way out of a situation. That is, whatever the benefits, they don’t materialize out of the ether and instead require plenty of legwork and some degree of fortune. There may be a goal in sight (a great job, being healed), but getting there is not a linear matter of just praying the prayer or setting the heart on a passion.
What’s more, the emphasis on passion (a word, by the way, which originally meant “suffering”), like almost any religious rhetoric about “god,” is a way to dehumanize the issue at hand. Finding fulfilling employment suddenly becomes a matter of pinpointing an immaterial “passion” rather than interacting with other humans and dealing with their strengths and caprices. Worse, fixating on “passion” devalues important work that could hardly be construed as anyone’s passion – caregiving, washing clothes, changing diapers – but is essential to the functioning of society. I mean, imagine if all garbage men went on strike – society would grind to a halt because individuals weren’t following their passion for picking up refuse.
The passion/love rhetoric basically reinforces the class stratification that brought such terminology to the fore in the first place. It demonstrates the degree to which privileged classes dictate terms to everyone else, instructing them that, yes, there are choices you can make to improve your life if you’d just find something you’d like to do and appreciate what you’re already doing (the second half of “Do what you love…” has a conformist angle unmatched by the blander “Follow your passion.”).
Look, I’m no determinist – it’s way too depressing. Skills can be learned, and, yep, new passions can be kindled. I learned how to use a UNIX/OS X command line interface last year. But so many things had to go right for me to get into a position of even wanting to dedicate enough effort to learning it:
- My dad was an early PC adopter and taught me to use DOS before I was 6
- A friend recommended I take programming as an elective in high school
- At the last minute in 2004, I decided to take Greek instead of Italian in first semester of college, in what was basically a coin flip decision. This was unfathomably important as it introduced me to someone who later helped me get a job at a software company, which brought me full circle to programming after a detour into humanities.
Others would have been fighting a tougher battle. Maybe they could have ended up in a similar position, but just think of the literally billions of people who are in no shape to follow passions of any sort. They need help surviving. They need material aid and shelter and fairer economic policies, not rhetoric about passion that is little more than faith-based snake oil.
For everyone who is more fortunate, consider: what has following these passions produced? Tokumitsu identified the unpaid intern and adjunct professor as byproducts of the passion-obsessed world, and I agree. My passion for literature led me into adjuncting in 2010. I loved talking about literature, even in that capacity, but it was work all the same, and unsustainable work at that. I moved on to things that I had never considered – and sometimes still don’t consider – ‘passions’, but I’m immensely happier. Writing tons of support emails, for example, is probably no one’s idea of a “passion,” but I did it for years after I left academia and it enabled me to live a more fulfilling life than I ever had while passionately ranting about Waiting for Godot.
The issue is perhaps the idea that passion has to be folded into work, a notion that implies that there really isn’t time for much activity outside of work. Another way to say it is perhaps that your passion has to be your work, or else you have no opportunity for passion. That’s insulting and depressing, plus it’s bullshit. We really deserve a society in which identity is not synonymous with occupation. That would open more doors for culture, equality, and well, passion.
Data data data. Big data. Granular data. You won’t need gasoline, or a Tesla – you are data-driven.
It’s hard to get away from “data” (or its cousin, “information”). What does the data say? Does the information tell us anything?
I don’t know; I didn’t realize that these things were sentient beings that can “say” phrases and “tell” us about ideas. And their vagueness – what IS data? Does it have important physical characteristics or is it part of the world’s growing layers of abstraction, under which manual work and tangible items are obscured and devalued?
The issue with the world’s data obsession is not that it necessarily produces bad commentary, bad writing, or bad sociological analysis. Still, it does do that. The data told us that, with no Entente or American troops on German soil in 1918, Germany maybe shouldn’t have surrendered. Who knows what its odds of victory were?
It’s not all bad news. Being data-driven leads to big, nice-looking, slow-loading webpages such as Vox, The Verge, and FiveThirtyEight, and their newsrooms full of tone-deaf white guys.
But trusting data over all else is to shirk social responsibility. It’s to wring one’s hands in faux-seriousness at some intangible ideal – data, numeracy, whatever you want to label it – that really plays a role analogous to god (but that would be frowned upon by data-driven crowds, natch). It becomes the agent, something unstoppable and immutable, while the writer becomes less accountable – after all, she’s just letting the data talk, as if channeling the burning bush.
What a naive viewpoint. Using data is a matter of interpretation, and many writers don’t have the chops for it. It’s no coincidence that two good pieces (one from The New Republic, another from Quartz) bemoaning the rise of data journalism were published on the same day (today). They point out this gap, plus they get at a bigger issue – that even data-driven writing is opinion, with research structured to favor a particular point. Problematic literature is sidebarred or ignored altogether – it basically has to be, whether the writer intends to do so or not, given the sheer volume of material out there.
It’s easy to see why data-driven writing has cachet on the Internet, with its somewhat technologist demographics. And maybe if FiveThirtyEight gets this year’s NCAA brackets right, it’ll have done a good deed – I’m not saying data journalists aren’t making valuable contributions. But their relentless drive toward “the future” of journalism or “journalism for the digital era” (how long have these tropes been around? They’re fucking exhausting), like all progressivism, is often overzealous and blinkered.
I mean, look at education. What have decades of data-driven teaching, testing, and planning done? Billions in profits for private corporations, miserable students with their mouths taped shut, unions busted up, laughable Common Core standards…administrators have spent the past 30 years trying to close an “achievement gap” that doesn’t exist, ignoring poverty in their drive to throw data-driven strategies at things such as SAT testing that have little redeeming social value.
What does that show? Well, data itself isn’t important. It’s what people do with it, how they interpret it. How could one not see the broader U.S. trend toward privatization and inequality in the data-driven education craze, except with actions sanctioned by the authority of “the data says ___”? I worry that writing is going to become like this too, with terrible data-driven drivel designed for machines and condoned by the godlike vehicle of “technology.”
What’s the appeal of pixelated 8-bit graphics and linear gameplay? Well, maybe they’re an escape from Internet-only dystopian shooters (seriously, how many of these can the average gaming bro play through?). A respite from “free-to-play.” A break from “Read Phone Status” permissions. They’re decisive proof that progress isn’t something that just moves forward. It goes backward all the time (see also: the move away from albums and toward standalone singles and streaming music).
I mean, this says it all. And I would be remiss not to mention that I am so looking forward to Shovel Knight for Wii U/3DS at the end of this month.
Until then, I’ve been tiding myself over with Mutant Mudds Deluxe for Wii U. “8-bit” is a misnomer here, though, as the game draws inspiration from the SNES’s color palette (plus the blond haircut and glasses of protagonist Max are more than a little reminiscent of Jeff from EarthBound).
Mutant Mudds Deluxe sets out to do just a few things and it does them all as well as Scrooge McDuck bounces on a cane. Max has a jetpack and a water cannon. His jump never feels quite high enough, weirdly – maybe it’s the sheer necessity of having to jetpack-blast your way up through all the CGA-Lands (cute IBM reference) that makes the normal jump seem unimportant to the game. In this way, the game resembles 8-bit classic Bionic Commando, with its deemphasis (well, downright obviation) of jumping in favor of claw grappling.
There’s unlimited ammo, as you would expect from a golden/silver age Nintendo platformer. Difficulty is sufficient – tricky moving/disappearing platforms, weirdly positioned enemies – but not back-breaking like Castlevania III or Defender (or as latter-day gamers call it, Flappy Bird).
You can tell that this game began on the 3DS (sans the “Deluxe” moniker). Its use of depth-of-field effects is clever, but feels awkward on Wii U, where there’s tons of real estate that feels wasted by shrinking Max into the background. But the widescreen format does bring some major improvements over the handheld version, which often didn’t show enough of the screen to keep you from making blind jumps.
At only $10, Mutant Mudds Deluxe isn’t cheap compared to the F2P garbage out there. But like the astonishing Out There, it feels like a bargain for how much craftsmanship is crammed into it.
An honest Android game is hard to find. Most are “free,” except with in-app purchases. It’s like buying an apple “for free” at a supermarket and then paying $0.85 to eat it – what’s “free” about that? Free-to-play, free-to-eat, whatever – the mobile gaming world is full of cutthroat pirates obsessed with the word candy and unconcerned with your experience. Every now and then you get lucky with something like Plants vs. Zombies 2, only to see its makers experiment with pay-to-win lawn mowers.
What a weird feeling it is, then, when you find a game that doesn’t have any IAP – especially when it so easily could have implemented them to squeeze $50 out of you here or there. The cross-platform Out There is at once a throwback to a different type of gaming business model and, one hopes, a foreshadowing of what’s possible for high-quality mobile games. It only costs $3.99, and despite its labeling in Google Play, there aren’t any IAP.
Out There is exquisitely made. The graphics resemble a comic book, with lushly colored sci-fi landscapes. The soundtrack is creepy and beautiful, or basically what you would expect for a deep-space survival adventure. It’s the 22nd century and your character has awoken from cryogenic slumber (having fared better than Ted Williams, apparently) and has to make his way from one galaxy to the next.
Right from the start, Out There has that feeling of there being a long quest ahead, which I don’t always get from mobile games that seem not to look beyond what you’re going to do 5 minutes from now when you run out of rubies/coins/donuts. There’s a dot way across the galaxy and you’ve got to get there, overcoming all sorts of hazards and misfortune along the way.
Your ship has several main resources – hull strength, fuel, and oxygen. Each one of these depletes as you travel from star system to star system. See what I mean about there being a golden opportunity for IAP here? But Out There splendidly doesn’t take it. Instead, you can only acquire each element (H/He for fuel; O for oxygen; and Fe for hull and equipment repair) by harvesting them from stars and planets. How novel.
There’s a lot of risk/reward calculus in Out There. For example, you can drill into a planet’s surface to get iron and other metals, but doing so uses some fuel and carries the risk of breaking your drill, in which case you’ll have to use iron (what you were likely trying to acquire in the first place) to repair it. Your cargo hold is limited, with only a few slots and a cap of 20 units on each of the essential elements. It’s possible to dismantle equipment to make room and harvest elements, but doing so could leave you missing a module you’ll wish you had later on.
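The drilling trade-off described above can be sketched in a few lines of code. This is just a toy model for illustration – the fuel cost, break chance, repair cost, and yields are all hypothetical numbers I made up, not the game’s actual values; only the 20-unit cargo cap comes from the review itself.

```python
import random

CARGO_CAP = 20  # per-element cargo cap, per the review

def drill(state, fuel_cost=1, break_chance=0.2, repair_cost=3,
          yield_min=2, yield_max=6):
    """One drilling attempt: spend fuel, then either gain iron or
    break the drill and pay iron to repair it. All parameters are
    hypothetical, for illustration only."""
    if state["fuel"] < fuel_cost:
        return state  # can't afford to drill at all
    state["fuel"] -= fuel_cost
    if random.random() < break_chance:
        # Broken drill: spend the very iron you were trying to acquire.
        state["iron"] = max(0, state["iron"] - repair_cost)
    else:
        haul = random.randint(yield_min, yield_max)
        state["iron"] = min(CARGO_CAP, state["iron"] + haul)
    return state

ship = {"fuel": 10, "iron": 5}
for _ in range(4):
    drill(ship)
print(ship)  # fuel is now 6; iron depends on luck
```

The point the toy model makes concrete: every attempt burns fuel no matter what, and a bad roll costs you the resource you came for, which is exactly the kind of calculus that makes each stop in the game feel consequential.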
Traveling through the lonely cosmos of Out There is dangerous. In other words, prepare for a lot of game overs. You might spin off course and take a bunch of hull damage, or your light speed warp between worlds may fail, leaving you short 20+ fuel and no further along in your quest. The game also has a choose-your-own-adventure element to it, in which you pick one branch on a path and never really know if a choice will net you a nice resource bonus or end your game prematurely.
Out There is exceedingly difficult and unpredictable, and you’ll need a lot of luck to get through it safely. But this isn’t Candy Crush luck – you won’t make it all the way to your destination without putting in some dedicated planning.
It reminds me of all the hours I logged as a kid playing Space Quest V: The Next Mutation, another tough trek (heh) that owed a lot to classic sci-fi and burnished its loopy puzzles with gorgeous artwork. Out There isn’t an adventure game per se, but its long-form, challenging characteristics make it feel like an adventure gamer’s take on Faster Than Light or Mass Effect.
What’s in a job application, really?
Resumes are full of bullshit, stuffed with elaborate, painful explanations that make a chef gig sound like the Vice Presidency of Demand Solutions. They are easily gamed – just think of all the LinkedIn recruiters who barely read CVs and go after jobseekers anyway.
Resumes aren’t even meant to be read, a fact that trips up jobseekers with humanistic backgrounds – maybe skimmed, but not read. Hell, even cover letters are frequently ignored, with anything more than two paragraphs getting the TL;DR glaze-over – welcome to the modern job hunt.
To compose literary masterpieces for either of these requirements is to produce something that maybe 2 or 3 people will read, probably cursorily and maybe angrily. In other words, it’s like churning out academic writing (almost – there’s good writing in the academy, but there’s also a lot of gobbledygook).
The expectations around the jobseeking process and its actual characteristics couldn’t be more different. I know they were for me – I once naively thought that a simple resume submission would net at least a response, favorable or otherwise, and that writing mattered for getting a job. Maybe it does at select companies such as Basecamp (“When in doubt, always hire the better writer”) but by and large writing is undervalued.
It’s something that everyone can do, and as such everyone erroneously thinks he can do well (in contrast, not everyone writes computer code or flies an airplane). Sure, some writing does pass for good with certain audiences – pepper anything with enough instances of “operational,” “solutions,” and “efficiencies” (ugh, that word shouldn’t really be plural, ever) and it gives off the noxious air of sophistication, but it’s just more of the same me-too dreck.
Form letters alone are evidence that writing too often doesn’t matter when seeking jobs. Think of how the classic TBNT letter is both long, vague, AND meaningless – an impressive trifecta.
How did jobseekers, especially on the humanities side, get in the situation of fighting an uphill struggle against hiring committees? Part of it is of course “the economy,” a euphemistic term for humans and what they value – tomes could be, and have been, written on how the cult of shareholder value, for example, has destroyed interest in things such as public investment and progressive causes.
But another part is, I think, the ongoing degradation of respect for labor at large. Giving a job to someone is treated almost as a gift or conferring of privilege, rather than as reward for merit.
How else to explain “entry-level” jobs that demand multiple years of experience? This disconnect grinds my gears because it shows that organizations don’t value language. You can tell so much about the ethics and values of a company by how it uses words, how it frames job searches, and how it treats candidates. Saying “entry-level” means “5 years of experience” is dishonest and likely a sign that someone is trying to be cheap – pigeonholing an experienced person as just another rung on the totem pole. It’s a lot like how we’re told (I’m speaking primarily to a U.S. audience here – apologies to international readers) that there’s a persistent “skills gap” in hiring, or an “achievement gap” in education, both of which, unsurprisingly, are as spuriously and manipulatively motivated as the “missile gap” from the Cold War.
Similar aversion is kindled by the use of the word “technical” to mean all sorts of random shit, from “jargony” (in relation to writing) to “math-like” (in regard to skills). Clear, expressive writing is a sign that you’re thinking and trying to solve a real problem or fulfill an urgent need, and most job postings fail miserably. They’re overloaded with cliches that deaden the language, they’re not about real things (yes, I said “things,” not “entities,” “stakeholders,” “solutions,” “interests,” or “competencies”) and are essentially giant walls meant to keep applicants out rather than invite them to take a shot.
There is of course the angle here that HR departments are overwhelmed with resumes and should be pitied, especially as they try to deal with the “skills gap” phantom. That would be a plausible argument if these same sorts of practices hadn’t contributed to the problem in the first place. Dishonest postings give applicants, especially inexperienced ones, false hope – not everyone is going to be able to parse through the bullshit and see that what lies before them is not a doorway but a giant wall. Many will try to scale it nonetheless because they have no choice, so excuse me if I pity the ones making that effort rather than their counterparts who laid the brickwork.
How can job postings and processes be improved? There’s no catch-all answer, especially since hiring is not a meritocratic process much of the time and is fundamentally political. Organizations can and should hire applicants they like; but if possible they shouldn’t give off the air of objectivity, which pollutes the entire process and comes off as the worst possible lip service to fairness (and yet the same people who came up with these unfair constructs would probably resort to the ‘life’s not fair’ cliche if pressed). Maybe they could start with some of these changes:
Write the job ad in a voice and style similar to what you expect back from the applicant
Don’t write a huge wall of text about how someone needs “excellent written and verbal communications skills” for a writing job. It’s a writing job; you should be asking for samples, and good writing is so hard to fake, especially if requested on demand or on short notice (it’s not a resume). No one says “excellent written and verbal communications skills” in real life – it’s robot-talk. It undermines the concept it’s trying to communicate.
And don’t do this.
Basically, know your audience. Many job ads are written in a style so obtuse that it’s like trying to describe agile infrastructure automation, when the topic at hand is, um, “entry level data entry specialist” or something similarly euphemistic. Get real.
Come prepared to the interview that you’re conducting
Applicants are always told to do their research and be ready for the interview, and many of them do – they have to. The problem is, sometimes even the best preparation gets foiled by the interviewer’s lackadaisical approach to proceedings.
Too many interviews begin with long, awkward pauses. Ok, you say, this is the cue for me to just start spewing about what I do, who I am, where I see myself in…
This is an awful way to start things off. Maybe it separates the good from the bad, but it also puts the applicant in a bad position, as if she’s playing a guessing game.
Sometimes this awkward start gets followed up by some throwaways about “would [x] be ok?” and “are you ok doing [y]?” No one is going to say ‘no’ to these questions, and they’re a waste of everyone’s time. If the responsibilities aren’t clear by the interview stage, something is wrong.
Keep things short
I once interviewed with Uber. In the process, I learned an enormous amount about how the company really operated. No, I wasn’t made privy to any of its earnings. And I didn’t get wind of UberX before it rolled out. Rather, I saw how Uber was a company set up to cross all sorts of ethical boundaries and step on people’s toes, based on how they treated job applicants.
The entire process lasted something like 50 days. That’s insane. An initial generic application was supplemented by a scenario exercise that in retrospect seems like unpaid labor on behalf of the company’s idea-starved managers. The on-site interview was a carousel of “fit” questions about how comfortable I would be meeting their demand schedule, which afaik was just tweeting generic replies to customers. I was given credits to take a few rides and evaluate them. I attended a holiday party full of awkwardness. I had to write emails to them with my feedback on the party. There was interminable radio silence, followed by a form letter. The same job was posted again and again, and plenty of people had the same experience as me – just check out Glassdoor (sign-in may be required).
Yes, I was disappointed for some reason when I was rejected, but looking back it feels like I dodged a bullet. Any company willing to drag things out that long and play cat-and-mouse with applicants is prone to, oh I dunno, spam-call users in a fake grassroots effort, or interfere just a little bit with the “free market” principles it espouses.
Treat applicants with respect. They’re already respecting you by even sending an application.
Grow a thicker skin and have some guts
Maybe that’s a bit harsh, and I admit that it feels like something that applicants, rather than companies, should be told. But just think about how spineless something like a form letter is.
These insults to the English language exist so that bad news doesn’t have to be expressed directly and humanely. The standard TBNT letter twists itself into such knots that its writers make fools of themselves – have you ever seen the phrase “while our response cannot be more positive”? What does that even mean? It seems like the writers are going for “we want to be more positive, but gosh darn it we can’t” (itself an oddly helpless, defeatist position to take), but they leave themselves open to an alternate reading that would interpret the sentiment as “we literally couldn’t be happier with what we saw.” Would you want to work for a company that greenlit that letter?
Of course, “want” is a luxury for many jobseekers, and the sad part is that they often have to put up with this as they struggle to find anything. Still, many of them have probably grown thicker skins as a result of all of their rejection, thicker than hiring committees that can be set off by the smallest possible thing or red flag. There are so many listicles out there, such as “Top 10 Things I See on an Application That Let Me Know I Should Immediately Reject That Person” or “5 Things to Never Ever Do on a Job Application.” When did things become so hair-splittingly insane? What happened to taking a cover letter seriously and trying to have a real conversation with an applicant? If something such as the phraseology of a question, or the usage of a common term, is enough to set you off, then maybe you’re too sensitive. Of course, many applicants will never know what participle, mannerism, or gesture stopped them from getting a job, and some of them will probably stay up at night thinking about how disturbingly capricious and opaque the whole process is.
A job interview isn’t an ambush
On the opposite extreme from the unprepared interviewer is the interviewer who comes armed to the teeth with gotcha questions. Blame Google. At least Google eventually revealed that asking someone how to get out of a blender after being shrunk down to the size of a nickel isn’t really a good indicator of how good a candidate is.
These tactics are sort of like “parlor tricks” that seem like good ideas in theory but are mostly a waste of everyone’s time. Like all the other affronts here, they’re a layer of abstraction on top of what should be a straightforward, humane process. The fact that many organizations can’t even have a civilized conversation with applicants is a pretty damning indictment of the culture at large. It’s the corporate equivalent of not making eye contact or ignoring someone’s text. Something needs to change.