Rahm Emanuel’s Runoff Victory Is the English Language’s Loss

Last time, I wrote about a great series I came across, called Keywords for the Age of Austerity. Tired of hearing about “innovation”? Exhausted by being called a “stakeholder”? The U.S. is a country whose academic and business institutions are increasingly overrun by buzzwords that reflect the nation’s growing gap between haves and have-nots. The city I lived in for seven years, Chicago, nicely encapsulates this intersection of neoliberal economics and fanciful language.

Despite having far fewer people than either Los Angeles or New York City, Chicago is the murder capital of America. Its North and South Sides are segregated almost perfectly along racial lines. It is run by a mayor, Rahm Emanuel, who grew up in the suburbs, made his career in D.C., then returned to the Midwest to take advantage of Chicago’s enormous cultural amenities. As Illinois’ largest city continues to struggle with unemployment, racial divides, and inequality, Emanuel has played the part of a latter-day Nero.

Instead of playing the Roman emperor’s fiddle, though, he has tapped away at the “engagement” of the people who live there. He has, for example, sunk his energies into institutions such as the Department of Innovation and Technology, producing remarkably empty statements such as (emphasis mine):

“An open and transparent administration makes it easier for residents to hold their government accountable, but it also serves as a platform for innovative tools that improve the lives of all residents,” said Mayor Rahm Emanuel. “By making this data available to residents and developers, we are better able to promote civic engagement, while continuing to strengthen transportation options and pedestrian activity in Chicago’s neighborhoods.”

The bolded words are either ones that J. P. Leary has identified as part of his series, or ones that I think fit the mold:

Open and platform: “Open” and “platform,” as employed by today’s elite, both stem from the enormous influence of tech thinkfluencer (I use that term derisively) Tim O’Reilly and his promotion of concepts like “open source” software and “government-as-a-platform.” These words have an effect similar to the current overuse of “stakeholders” (students and copywriters love that they now have a synonym for “people”) in that they create a false sense of shared opportunity. Being able to check (some) government activities for malfeasance, like being able to vet software for bugs, is marketed as tantamount to having an active role in policy creation. “Platform” has the ring of empowerment (think Neil Kinnock’s speech that Joe Biden later plagiarized), but it also recalls operating systems. Windows represents quasi-authoritarian, private-club-style design (what O’Reilly et al. would call “closed source”), while Linux distributions offer “openness” only to the limited number of mostly male individuals positioned not just to review the project but to add to it (“open source”). It’s autocracy and oligarchy, respectively.

Innovative: “Innovation” is not an egalitarian concept. It usually applies to white collar professions and activities that involve “technology” (itself a keyword worthy of a future entry from Leary). Rahm’s Chicago is no exception to the rule. The above initiative with DOIT was about the release of “technology data sets,” naturally. Dumping information on the public is a good way to pay lip service to “openness,” but a poor way of actually cultivating democracy. The same inequality that “innovation” hints at in its preference for white collar activities over blue collar ones (“innovative” would never be applied to something like finding a way to effectively deal with Chicago’s potholes) means that many citizens simply don’t have time to comb through massive data sets. I mean, would you read a 10,000-word email I sent to you out of nowhere? Data pollution, as I’ll start calling this technique, is also remarkably cynical: It assumes that the best way to approach a city’s core issues (like transportation) is by theoretically giving every man, woman and child the ability to be a DIY hero, finding discrepancies or points of interest in data as if they were reviewing the Linux kernel for bugs. Individualism – like the “Yankee ingenuity” myth that underpins “innovation” as a concept – is prioritized over civic involvement.

Accountable: Schools, teachers, non-executive workers, and sometimes soldiers (well, David Petraeus at least) are held “accountable” for what they do. Education really is the best example here, since for the past 30 years it has been run into the ground by profit-seeking test companies, non-expert consultants (many of whom have never even been teachers), and neoliberals of all stripes anxious about losing out to Japan, China, whoever. “Teaching to the test” (i.e., focusing largely or exclusively on subject matter that will be assessed on a standardized test) has ruined the freedom of teaching in many public schools in the U.S. However, “accountability” and its variants have very different meanings for the working class and the elite. While the former are under never-ending pressure to measure up to some arbitrary standard, the latter are free to use “accountability” as a guard against regulation or real scrutiny. The implication is basically: You don’t have to regulate us or question our motives or vote us out in a recall election – we’re holding ourselves accountable, by doing things like releasing this pile of transportation data!

Engagement: “Engagement” is something of a paradox, since it aspires to the community-building of a town hall or community meeting while pushing highly individualized, stratified conversations that take place largely over channels like Twitter or through meaningless formalities like letters to the mayor, in which one side holds all the power. Moreover, the word is now synonymous with simply informing the public of what is going on or allowing them a level of token participation. It’s hard to know what real “engagement” would look like for Chicago: Maybe the citizenry banding together to get more cars off the road and push for 24/7 CTA service, rather than being saddled with “transportation data sets” that are the equivalent of an over-ambitious homework assignment.

It’s remarkable how powerful language is. At a time when liberal arts programs are frowned upon for their lack of utility in a humorless job market, it seems that only English majors or people who are broadly read can actually cut through the fluffy words peddled by elites, who think of themselves as hardheaded realists steering us toward endless innovation.

Keywords for the Age of Austerity

Via cultural critic Evgeny Morozov, I recently came across a fantastic Tumblr by John Patrick Leary, a professor at Wayne State University. Leary has done a series on the “keywords for the age of austerity,” dissecting how terms like “innovation,” “entrepreneurship,” and “conversation” have been co-opted by businessmen to reinforce narrow ideas about hierarchy, market-based everything, and virtual technologies (a creaky term, but one that basically encompasses everything from social networks in particular to the overall distract-a-thon of phones in general).

One of my favorite critiques in the series is this gem about how businessmen really have to stretch their use of language:

“One of the things that surprised me when I began this project was how imaginative, even fanciful, was the language of MBAs and economists, whose prestige derives from their disciplines’ pretensions to science and hard-headedness.  Consider the metaphor of ‘business confidence,’ in which abstractions like ‘business’ and ‘the market’ are personified with the fragile psyches of a lily-livered moper who must be either brow-beaten or deceived back into cheerfulness.”

Yep. And this same reification shows up with many of the keywords Leary analyzes. If the “market” requires “confidence” at all times, then “innovation” needs a seemingly infinite supply of encouragement to offset the “discouragement” it is constantly receiving from courts that rule against startups like Aereo, from fair use laws, and from “short-term returns.”

The latter comes from a NYT column by Cecilia Conrad of the MacArthur Foundation, entitled “Our Society Discourages Innovation,” wherein the author bemoans how educational assessment, the demise of academic tenure, etc., have chipped away at the ability to innovate. It is a strange argument to make, considering that the acceptance and use of terms like “innovation” have helped cause these changes.

All of these profound social upheavals have sprung from an increasingly financialized U.S. economy, in which, for instance, students are constantly monitored for “failure” and pitted pointlessly against their peers overseas, despite the meaninglessness of the tests purported to compare one country to another. Meanwhile, “innovation” is often used as a wedge to divide the constantly innovating, forward-thinking white collar class from the blue collar world – when’s the last time a central heating installer “innovated” something in the current popular corporate classist conception of the word?

There’s also something very self-destructive about the constant hand-wringing about educational achievement gaps, losing out to China, not having enough STEM graduates/visas, and falling behind in the global “marketplace.” First, business rhetoric is used to frame and even create problems that then require solutions conceived, developed, and marketed using the “austerity keywords” lexicon that Leary has chronicled throughout his blog. I mean, how many times has a huge project been marketed with the promise that it will save tons of money? Money for whom? You can see a specific, highly contextual corporate goal rebranded and repackaged as something that is good for society at large.

But then, since business is a small cross-section of the possible human experience, this approach often ends up entailing actions such as laying off teachers or cutting funding for “soft” subjects like art history or, well, English. “Expensive” employees and areas of study that don’t directly involve considerations of money or something proven to reliably create it – stop for a moment to consider how un-“innovative” it is to look for “innovation” opportunities only or mostly in specific fields such as computer science or biology, in which there are, vitally, track records of financial success and none of the uncertainty that so upsets the aforementioned “confidence” – cannot be tolerated if all policy is made in deference to an impersonal but somehow godlike “market.”

In addition to the real damage done to livelihoods and social welfare, this approach also dumbs down the language. Instead of stopping to learn how to criticize or think about a cultural work, or how to understand its history and uses of language, students and workers alike consume an endless torrent of buzzwords and saturated terms, peddled by parties with no credentials or distinctions in English.

Sadly, one of the most marketable (there’s that word again) arguments for making everyone take advanced English or even comparative literature classes would be to help beat back the tide of a language increasingly hijacked by a handful of overused words that non-expert (with respect to language) businessmen like and, let’s be honest, desperately depend upon for their own security. Even the business and political classes so eager to cut humanities budgets realize, on some level, that the words make the world, or else they wouldn’t be so set on framing every issue with almost the same set of terms.

A left-field entry about diets, hair loss, and transformations

Almost all diets fail. The force of will of most human beings is no match for bodily urges and the peculiar design of the brain, which, wired as it is to favor its older, instinct-driven circuitry over the frontal lobe, is much better at giving in to instincts than at foreseeing consequences ahead of time. Diets also, by and large, aren’t even meant to make their would-be adherents thinner: They’re meant to separate them from money as well as from free time that could be devoted to so many other projects. The opportunity cost of a diet is immense, as Melissa Fabello once argued in her polemic against diets as tools of capitalism.

My own experience: Diets
Dieting never occurred to me until I was 25, at which point I had gained almost 40 pounds over the baseline weight I had maintained for almost a decade. My gain started sometime in 2009, when I was 22, not long after I had resumed taking Prozac and finished my degree. I am not sure whether any of these events were related, but I do remember eating much more regularly than I had before, when I routinely slept until noon and didn’t eat anything until dinner.

2009 was also the first time I noticed a difference in my hair pattern, namely thinning near the temples – “noticed,” because it’s possible that these characteristics had emerged much earlier and I had simply not noticed due to the length of my hair between about 2003 and 2008. I am blond and sometimes let my hair grow out a lot because it looks shorter than it really is. Looking back through Facebook albums, I can see that distinctive receding temples + prominent widow’s peak pattern throughout 2009.

In mid-2011, at my sister’s college graduation, both my weight and my hair had further deviated from the norms of my college and high school years. I neither noticed nor cared, though, which seems so strange in retrospect. I remember getting comments about my face being fuller one Christmas, but overall it felt like there wasn’t much changing except that I wore different (larger) pants and kept my hair shorter. Much of my time then was consumed with finding work.

A year later, I finally decided to change course because I was feeling sluggish a lot, perhaps from being sedentary at my job at the time and from spending too much time on the Internet at home. I started a VERY simple at-home exercise routine that consisted of:

  1. Elevated pushups (with my feet in a large box full of clothes, so that my entire body was at an angle)
  2. Squats, being sure to go below parallel (agonizing at the time, but seems like second nature now)
  3. A sort of modified sit-up, with one leg arched and the other flat, using the chest to lift the body partially up toward the ceiling

That was it for the first few months. Eventually I made some modifications:

  • I got a pair of pushup handles for $3 from a store in Philadelphia one Christmas.
  • I obtained a pull-up bar from Marshall’s and purchased some Nike gloves so that I could avoid calluses.
  • Finally, I picked up a 15-pound kettlebell, also from Marshall’s.
  • With this new gear, I added pull-ups and one-legged bar pushups (replacing the elevated pushups) to my workout.

I did all of the exercises at home and took a few walks. The entire regimen cost me maybe $40 between 2012 and today – about $1 for every pound I lost.

I changed what I ate too, although I did not do anything radical. I basically just:

  • Stopped eating cereal for breakfast every day
  • Replaced chips, pretzels and cookies with grapefruits and cucumbers as snacks
  • Switched to stevia from sugar in coffee, tea, etc.

For me (and perhaps for men in general), it seemed like curtailing sugar intake was the single most substantial dietary change. Everything else was secondary. I still ate lots of high-fat foods like hamburgers and fries but continued dropping weight.

I am not sure if my advice or case has relevance for the general population. I had always been thin and perhaps had changed body type during an anomalous and tumultuous period in my life (post-college, on antidepressants, etc.) that eventually subsided and allowed me to regress to my personal mean.

My own experience: Hair
Now back to hair. If there’s anything dieting and hair loss mitigation have in common it’s that they feel like – and often are – impossible battles. How many billions (trillions?) have been sunk into dieting and hair loss “solutions” over the years?

My hair seemed to thin out some from 2011 to 2013, perhaps from stress and male-pattern baldness (MPB). There’s some of the latter on one side of my family, but my case, if indeed I have it, seems mild so far. I noticed some temple thinness in 2009 and what seemed like a larger forehead sometime in 2013, when I was at a casino in Iowa on the night of my wedding.

The pattern for me is around the temples but doesn’t seem to affect the crown. The first particular action I took to see if I could re-thicken my hair was to use a Bosley regimen of shampoo + conditioner + thickening treatment. It seemed so-so: Most of its power, I think, came from the sheen it gave the hair after step 3 (the thickener).

In late 2014, I began taking saw palmetto supplements as well as biotin. Saw palmetto is a small palm that grows in the southeastern U.S., including Georgia and Florida, and produces a fatty fruit. It was apparently widely used in Native American medicine and has been compared to Propecia because of its apparent usefulness in treating prostate enlargement. Biotin, or vitamin H, is a B vitamin that is supposed to support keratin, the protein that makes up hair and nails.

Disclaimer: neither saw palmetto nor biotin is clinically proven to have any effect, positive or negative, on hair. Maybe it’s a placebo effect, but I do feel that 6 months of taking the saw palmetto has made my hair…fluffier? It’s hard to describe. I had a lot of hair in the front and center of my head, so maybe the active mechanisms had something to work with.

I also used a shampoo made with Dead Sea minerals, called Premier. Apparently it oxygenates the hair follicles by opening up the pores on the scalp. I cannot speak to its power yet since I haven’t used it for that long, but I like the texture it is creating so far. Overall I am happy with my hair in 2015. It looks a lot like it did in 2009, for what that’s worth.

Transformations
All of this is needlessly vain, I realize. The “advantages” of being thin and full-haired are often touted by industries staffed by people who are neither. Much of the allure of both traits is just that: A temptation to spend a lot of money on dubious “solutions” that cannot deliver on their promises.

I once read a great Quora post about how the past was the scarcest resource in the world. Nostalgia alone fuels the high prices of everything from New York real estate (“[famous person] lived here a long time ago!”) to hair transplants (“I can look like I did when I was 25 again!”). It’s so true. Dieting and hair restoration are both, more often than not, presented as tickets to a glorious past, a youth that was actually the product of nothing more than metabolism and genetics.

However, I do think it’s possible to go on a journey by fighting the uphill battles of dieting and hair loss mitigation. I see no inherent virtue in being thin or having a full head of hair (these are such shallow, stupid criteria on which to judge anyone), but sometimes the experience of trying to reverse one’s current state – to simply snap out of whatever the daily norm and routine has become – can be a powerful learning experience and the fuel for personal transformation, no matter what the results.

The writer’s block myth

Writer’s block is a myth. The term itself is instructive: it could be construed as either a literal block encasing the writer, or a sort of solid mental state, not given to “fluid” thought. It’s a bit of both: I feel that “writer’s block” is typically too much thinking about thinking (hence the mental “block” aspect), and that physical action is the way to break it (insofar as it is like a physical object that can be broken).

The “right” state of mind will probably never come
At the beginning of this year, when I was blogging every single day, I was penning 2,000-word essays in the evening after having written thousands of words for work during the day. Essentially, I was writing the equivalent of a short novel every week and a half. Moreover, since starting at my current job in the summer of 2013, I have written, I would estimate, almost twice the volume of the King James Bible between job-related and personal projects.
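To put rough numbers on those claims – treating my day-job output (call it another 2,000 words), the length of a short novel (roughly 40,000 words), and the commonly cited word count of the King James Bible (about 780,000 words) as ballpark assumptions rather than exact figures:

\[
(2{,}000 + 2{,}000)\ \tfrac{\text{words}}{\text{day}} \times 10.5\ \text{days} \approx 42{,}000\ \text{words} \approx \text{one short novel}
\]

\[
2 \times 780{,}000 \approx 1{,}560{,}000\ \text{words} \;\div\; \sim\!650\ \text{days since mid-2013} \approx 2{,}400\ \tfrac{\text{words}}{\text{day}}
\]

The two estimates land in the same neighborhood, which is the point: the volume only adds up if writing is treated as a daily, mechanical habit rather than a wait for inspiration.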

Am I proud of all the material I’ve written? No. But my ability to “force” that many words through, perhaps in rough form early on but with plenty of refining along the way, has taught me that writing is ultimately a mechanical act; it’s not really about inspiration, at least not in the sense that is so often romanticized. It’s about mechanics.

The mechanics of writing – typing, scrawling, whatever – form an unending act that presents many possible plot twists along the way. Names, constructions, entire questions that I would never have come up with while thinking idly spring to life when I have to actually commit to something, and I know I’m not alone. The “inspired” state is constructed and willed; it doesn’t descend from on high.

Act to think, don’t think to act
It sounds like bad advice, but hear me out. Sometimes I will compose lines or scenes in my head (I’m working on a collection that I will make into an e-book soon). But a-ha-type moments are rare. Most of the exciting possibilities only unfold when I sit down to write – that’s part of what’s so exciting about the endeavor. The act is the catalyst for the thinking, not the other way around.

I am aiming to get my e-book done before my birthday in August. Then it’ll be on to the next collection or project. It feels like continually producing, with occasional stops along the way to restructure and let ideas simmer, is preferable to trying to plan out everything from the get-go.

There are some techniques I have tried on that front, though, such as using Excel/Numbers to plan a plot. It can be a good technique for thinking sequentially, but I feel that it can lock one in before she even gets started with the actual narration and description. Right now I am trying to approach each story as its own island, with a different style, so I am forgoing this method. “Writer’s block” hasn’t emerged yet, and I feel that it won’t as long as I just keep doing the writing “exercise,” no matter how painful it feels for the first couple of minutes when I’m running through ideas.

Silent Film and “the Internet”

In late 2013, we watched our first full-length silent film, “The Thief of Bagdad” from 1924, starring Douglas Fairbanks. Fast-moving, with an endlessly engaging score (a loop of Rimsky-Korsakov’s “Scheherazade”), it’s a good “break-in” film for anyone unfamiliar with the silent era. Fairbanks excelled at swashbuckling roles, and “The Thief of Bagdad” is one of the swashbucklingest movies ever made. He dances around with his scimitar and dives into the sea to fight off monsters, too.

Since that time, we have explored a few other silent-era films, including the corpus of Kentucky director and Hollywood godfather D.W. Griffith. I recently finished his “Intolerance,” from 1916, the follow-up to 1915’s blockbuster “The Birth of a Nation.” The latter rewrote the rules for feature-length films as one of the first true features, with a continuous narrative structure documenting the before, during, and after of the American Civil War. “Intolerance,” though less famous, may be Griffith’s best work.

A screen grab from “Intolerance.”

Split
I have always liked the idea of split stories and parallel action; “Intolerance” provides nothing but, for its epic three-plus-hour duration. There are four stories, each documenting a moment in history when intolerance of other belief systems or moral codes was the preamble to violence: an ancient Babylonian story about the attack on the city by Cyrus the Great, a Judean story about Jesus, a French story about the St. Bartholomew’s Day Massacre, and a modern American story about a mill strike and a group of, well, intolerable moralists.

The variety of “Intolerance” makes its epic running time go by swiftly. Griffith employs many different color tints, a melange of musical samples, and some strange interstitial techniques, like a woman rocking a baby in a cradle (representing the passage of time between the film’s chosen eras) and a background shot that includes what looks like the script/screenplay for “Intolerance” itself – how meta. Textual snippets are also given period-specific cards, such as a tablet for the Babylonian story.

Relevance
“Intolerance” is 99 years old this year, but perhaps because of its cosmopolitan subject matter it seems less dated than “The Birth of a Nation,” which represented and embraced the retrograde racial attitudes of its period. Another thing that makes “Intolerance” seem so modern is its ambition. The budget ran well into the millions of dollars – in 1916! The sets, such as the Babylonian city that Cyrus besieges, are sprawling and look great almost a century on – behind the color-tinted shots and film crackles, they now seem as old as the times they tried to depict.

Some of the film’s imagery and topics, especially in the Babylonian scenes, remain relevant for 21st-century viewers. The question of whose god is mightier – Bel-Marduk or Ishtar – and the shots of people falling to their deaths as huge siege towers topple have an uncomfortable symmetry with 9/11, for instance.

Part of what is so striking to me now about “Intolerance” and silent films in general, though, is how “Internet”-like the entire experience is. There’s the variable pacing of moving from one card to the next and reading the text, just as one would do with a webpage (with the important and obvious difference of not being in control of the direction – although one could say that people addicted to Facebook or forum arguments are hardly free from inertia in this regard…). There’s also the card-by-card, shot-by-shot attention to design and layout (“Intolerance” even has footnotes for some of its textual snippets!).

History
Earlier this year, I wrote about how “the Internet” is a term applied retroactively to a bunch of actually separate histories – networking, software, hardware, etc. – with the added current connotation of a medium through which its users receive information. It used to be called by different names – “cyberspace” is perhaps the best example of this class of outmoded labels, as it conceives of connectivity as a space rather than a medium – and, if one wants to get technical, the vague principles of “the Internet” go all the way back to the telegraph, which was a much bigger break with what came before it than, say, TCP/IP was with its predecessors.

Before watching “Intolerance,” I hadn’t thought of silent film as a part of “Internet history.” But the design tropes of silent film are, if anything, becoming more, not less, prevalent in media. Pushing cards or snippets of content – say, Snapchat Discover, Twitter’s “While You Were Away” feature, or the stream of matches on an app like Tinder – is an essential mechanism for many of today’s mobile services in particular. The integration of video through services like Meerkat (which lets one stream live video to her Twitter followers) only makes the lineage from silent film to “the Internet” more apparent.

In a way, “the Internet” hasn’t even caught up to the immersive experience of silent films, which often not only pushed discrete cards and pieces of narration at viewers (ironically, in support of a continuous narrative) but also featured live orchestras in grand settings. Videoconferencing (FaceTime, Skype) and the likes of Snapchat and Meerkat strive for the same immediacy that Griffith et al. captured in the 1910s.

One more intersection: For someone used to talking movies, watching a silent film can feel really lonely, because no one is talking. For me, this exact sort of silence and proneness to becoming lost in thought – for better or worse – is endemic to using “the Internet.” It’s strange, really, that in an extroverted society like the U.S., in which silence is barely tolerated in meetings and the like, so much mental energy is channeled into the inaudible acts of responding to emails or skimming BuzzFeed. I would much rather wordlessly watch “Intolerance” again.
