Feeling jet lagged after a great trip to London for the weekend. While riding business class on the way back, I finally finished up the last of three essays of Nietzsche’s “On the Genealogy of Morals.” It started slow but the last half was an excellent argument about how Christian morality has so embedded itself in the West that even being an “atheist” is in some way just another stage in a long ascetic tradition – in this case, denying oneself the possibility of God’s existence.
There was also a passage about the strong versus the weak that resonated with me because of its arguments about herd mentality and meetings. I have always felt that meeting culture – “touching base,” having “a quick chat,” spewing 30- and 60-minute calendar blocks that probably merit only 5 minutes of time at most, etc. – was one of the most regrettable aspects of the workplace in the U.S. It’s like the corporate equivalent of church. So imagine my delight at this segment:
“[I]t should not be overlooked: the strong are as naturally inclined to strive to be apart as the weak are to strive to be together, when the former unite, this takes place only with a view to an aggressive collective action and collective satisfaction of their will to power, with much resistance from the individual consciences; the latter, on the contrary, gather together with pleasure at this very gathering, – their instinct is just as satisfied in doing this as the instinct of the born ‘masters’ is basically irritated and unsettled by organization.”
Meetings and gatherings of any kind, especially ones that involve, say, at least 3 people, are usually a waste of time for individuals who do their best work on their own. Being a cynic, I have often thought that the purpose of most corporate meetings is exhaustion – bringing people together in an ‘all-hands’ environment in which attention spans are tested and things are agreed to when no one is paying full attention. The opinions of people who don’t feel comfortable in the superficial environment of meetings – where the ‘best’ argument doesn’t always win and is often overcome by the best-sounding one – are also crowded out.
I will write more about the last part of “On the Genealogy of Morals” later since it is really a treasure-trove of useful contrarian arguments against 21st century attitudes toward work. For now, though, I’ll note that Nietzsche talks about how the appeal of religion and the act of congregation – which these days has shifted in the U.S. from churches to workplaces – is the product of poor physical well-being (which needs some kind of relief) and wanting to be someone else. I can agree with that.
We had a good day in London this Valentine’s Day weekend. We saw Kensington Gardens, the Tower of London, and much of Westminster.
I may have some good material to start the fifth short story to cap off my initial collection. Maybe something really strange like a horror story inspired by the video game ZombiU (set in post-apocalyptic London) and my own brief London travels.
I don’t have a long entry in me today since I spent most of my evening touching up a creative project that I ended up entitling “The Graduate.” I had originally posted a preview of it here as “The Gambler,” but adjusted the title so as to better reflect some of the themes of the story.
The story is about two individuals reflecting on a college graduation ceremony that they both attended. One has good memories of it, and more generally of her overall carefree attitude toward her college work, reflected in the fact that she never took any notes in her courses. The other person, a man, is more pensive about that same time, thinking about how little had happened between his own graduation (further in the past) and the current graduation at hand. It’s sort of a fictional version of the non-fictional history I fleshed out in an entry on here back in January.
I used some of the cut-and-paste and randomization that I have so far used in my other short stories. There are some snippets of poetry, plus an entire midsection that is a lecture that one person is listening to and not taking notes on. The mini sections near the photos, with the text “The notes:”, are perhaps the notes that the more pensive of the two speakers (James) would have taken if he had been there (maybe he was there).
About the photos: I took them of my actual college notebooks from 2004 (you can see the date on one of them as 9/21/04 – my first month in college). I’ve always liked using handwriting in my creative projects, such as for this poem I once wrote in iA Writer on a Mac and then transcribed by hand and filtered through several apps:
I like the idea of having a dialogue between typed text and written/pictorial text. I had a lot of fun when I used photos of a printed version of the middle section of my first ever short story, “The Loop,” as the actual middle section of the piece (i.e., I didn’t even type it into the Tumblr entry). It lets me put visual art and writing side by side.
With four stories now written, I’m going to do a fifth and then work on self-publishing them as a collection. So far I have enjoyed the no-pressure atmosphere of Tumblr, but I am in love with the idea of collecting them all in a physical volume that I can distribute or even sell.
Stylistically, I feel that I’m still feeling out my limits and preferences. I like the concept of recycling old text, notes, lyrics, and other textual scraps into something that sort of moves like a narrative. Perhaps I’ll settle into more linear narratives eventually but my love of poetry seems like it’ll always pull me back toward making some sort of hybrid.
“Technology” is a strange word. Its Greek root, techné, means “art” or “craft,” and its usage in English is scarce until at least the 20th century. Its rise in popular discourse during the second Industrial Revolution, the movement that produced inventions such as the phonograph, makes sense. However, what’s usually glossed over is that “technology,” as a word, is filler, distracting us from the reshaping of society from above.
What does it even mean to say that “technology changed everything” or to assign so much agency to vague, well, technological concepts such as “big data” or “the Internet of Things?” The vast discourse on technology is the best possible example of what Georg Lukacs called “reification,” the act of instilling human activities with the characteristics of things, creating what Lukacs himself called “a ‘phantom-objectivity,’ an autonomy that seems so strictly rational and all-embracing as to conceal every trace of its fundamental nature: the relationship between people.”
When I see “technology” in a sentence, I move pretty quickly past it and don’t think much about it. If I do, though, it’s like I rounded a corner and saw a forked road leading into three turnabouts – the generality is crushing. Are we talking strictly about the actions of hardware, software, and networks? Are these actions autonomous? What if we just assigned all of these machinations to the category of “machinery and artisanal crafts” and spoke of the great, world-changing, liberating power of “powerful industrial machinery”? It doesn’t have the same ring to it, does it?
Words and classes
The history of words to talk about all of the basic concepts that undergird “tech writing” – the category that would seemingly include everyone from TechCrunch to PC World to Daring Fireball to this blog – is the history of taking words that belonged to the blue-collar working classes and reassigning them to the white-collar management classes. Take “software,” for instance. It derives from “hardware,” which once referred primarily to small metal goods. As early as the 18th century, one could talk about a “hardware store” as a place to buy metals.
Something similar, on a much broader scale, has gone on with the term “Internet.” As I explained in my entry on “Space Quest 6: The Spinal Frontier,” the entire discourse about “the Internet” is a retroactive reorganization of many separate traditions, spanning hardware, software, and networking, that once went by disparate names. Even the act of using “the Internet” was once similarly variable: it could be called “going into cyberspace” or “using virtual reality” well through the 1990s. Grouping everything under the banner of the “Internet” has had the desired effect of making changes affecting fields as diverse as education (via online learning) and transportation (via services like Lyft and Uber) seem inevitable.
It is reification writ large, a tight origin story compiled after the fact to create that very “phantom-objectivity” that Lukacs talked about. Likewise, “technology” itself, as a word, is a mini history of how mundane physical activities – building computers, setting up assembly lines – were reimagined to be on par with the high arts of antiquity. Leo Marx wrote, in his paper “Technology: The Emergence of a Hazardous Concept”:
“Whereas the term mechanic (or industrial, or practical) arts calls to mind men with soiled hands tinkering at workbenches, technology conjures clean, well-educated, white male technicians in control booths watching dials, instrument panels, or computer monitors. Whereas the mechanic arts belong to the mundane world of work, physicality, and practicality – of humdrum handicrafts and artisanal skills – technology belongs on the higher social and intellectual plane of book learning, scientific research, and the university. This dispassionate word, with its synthetic patina, its lack of a physical or sensory referent, its aura of sanitized, bloodless – indeed, disembodied – cerebration and precision, has eased the induction of what had been the mechanic arts – now practiced by engineers – into the precincts of the finer arts and higher learning.”
Making it, writing it
I love this passage since it captures so much of how the rise of technology firms has been about word games and the institution of engineers and venture capitalists as, crucially, creators, and heirs to the traditions of straight male-dominated industry. Debbie Chachra did a great job of outlining the real shape of the Maker movement in a piece for “The Atlantic,” arguing that “artifacts” – anything physical that could be sold for gain or accrue some sort of monetary value, seemingly on its own – were more important than people in today’s economic systems, especially people who performed traditionally female tasks like educating or caregiving.
Tech writing, vague as it is, exists in this uncomfortable context in which anything not associated with coding or anything “technical” is deemed less important – to businesses, to shareholders, to whoever is important for now but may be forgotten tomorrow – than what is more easily viewed (I mean this literally) as work that came from a predictable process (software from coding is the best example). Writers in this field have to continually prop up a huge concept – technology – that carries the baggage of decades of trying to be elevated to the status of fine arts like… good writing.
Talking about the agency of concepts is common, and tech writers – or anyone dabbling in writing about technology – have to play so many ridiculous games to cater to readers who long ago became lost in the reification of “technology” as an unstoppable force. Take this sentence, which I recently found via Justin Singer’s Tumblr:
“Big Dating unbundles monogamy and sex. It offers to maximize episodes of intimacy while minimizing the risk of rejection or FOMO [fear of missing out].”
Bleh. This passage is easy to make fun of, but its structure is so indicative of tech writing at large. There’s the capitalized concept (“Big Dating”) that is acting, via a buzzwordy verb (“unbundling” – what was the “bundle” in the first place? but “disrupt” is still the all-time champion in this vein) on The World As A Whole. Then there’s the shareholder language (“maximize”/”minimize”/”risk”) that speaks to the neoliberal economic ideas – most of them questionable – that have been the intellectual lifeblood of the tech industry as well as the governments that feebly regulate it (the weakening of political will is one reason Marx saw technology as a “hazardous” concept).
Aristotle and wrap-up
When I dipped my toes into Aristotle’s “On Interpretation” earlier, I talked about how he defined nouns as “sounds.” I then wondered if so much bad writing was the result of trying to write things that would sound absurd in speech (i.e., as sounds).
Tech writing in particular has this sort of not-real quality to it that makes it sound so silly when read aloud. It’s always trying to reify and create vast, unstoppable forces that aren’t even physically perceptible. Writing about “the Internet of Things” or “Big Dating” is to basically dress up everyday, unremarkable concepts like networked devices and dating services in dramatic language.
You may as well have someone try to describe a sandstorm or flood to you as if it were the result of a phantom-objective, all-powerful godlike force. Wait, that’s, like, 99 percent of religion right there. Well, when writing about “technology,” you’re always writing someone else’s scriptures, with all the opacity and word-gaming that that entails – who wants to read most of that?
Nouns and Greek texts
Looking back at elementary school, the earliest thing I remember learning was what a noun was. “A person, place, or thing” – that seems to cover all the bases. It’s the type of knowledge that quickly becomes second nature, only coming to mind in cases like interpreting a sentence that contains a gerund, an English noun that looks like a verb (e.g., “the happening is up ahead”).
Sixteen years after I learned what a noun was, I started reading Aristotle in Greek. Although Aristotle exerts tremendous influence on all of Western civilization – in every field from biology (which he started with his examinations of specimens brought to him by Alexander the Great) to theater criticism – I have never loved his ideas or stylistic flourishes as much as those of his teacher, Plato.
Some of his Greek texts seemed rough to me, requiring a lot of insertion of English words in the translation, whereas Plato’s writing was full of plays on words and syntactical arrangements that made it enjoyable in ways that English couldn’t reproduce. When translating, I felt like sometimes English was an upgrade for Aristotle, while it never was for Plato.
Nouns and sounds: Nounds?
I began reading Aristotle’s “On Interpretation” today, my first real brush with his work since 2007, when I was working with the “Nicomachean Ethics.” It won’t take me too long to finish, which is exciting after recently reading almost nothing but long philosophical tracts and novels.
Early on, Aristotle, like an elementary school teacher, sets the ground rules by defining what he means by a noun. He says:
“By a noun we mean a sound significant by convention, which has no reference to time, and of which no part is significant apart from the rest.”
I don’t have the Greek text with me (I’ll try to find an image of it later) but isn’t it strange that a noun is defined as a sound? Obviously, nouns are also written, soundlessly, on paper and word processors, but, as Aristotle notes, “written words are the symbols of spoken words.” It all comes back to speech.
Sounds and good and bad writing
This makes sense when you start to think about bad writing, more so than good writing. So much bad writing and so many bad ideas emerge because they have no predecessors in speech and would sound close to nonsense if spoken aloud. I’m thinking of all that business writing about “full-service solutions providers.” Jason Fried tore into it several years ago for Inc.:
“One of my favorite phrases in the business world is full-service solutions provider. A quick search on Google finds at least 47,000 companies using that one. That’s full-service generic. There’s more. Cost effective end-to-end solutions brings you about 95,000 results. Provider of value-added services nets you more than 600,000 matches. Exactly which services are sold as not adding value?”
All of these phrases sound horrible in conversation – even the people who write them wouldn’t utter them aloud in relaxed company. It’s like there’s nothing there; encountering the word “solutions” in text makes me instantly skip like 2 or 3 lines ahead to see if things get better. There may as well be no nouns on the page.
Aristotle is helpful here, too, in a strange way:
“[N]othing is by nature a noun or name – it is only so when it becomes a symbol; inarticulate sounds, such as those which brutes produce, are significant, yet none of these constitutes a noun.”
It’s a weird image that comes to mind for me here, as I equate brutes raving inarticulately with business writers ranting about best-of-breed management structures in ghostwritten columns or ‘touching base’ in their emails. What counts as “inarticulate,” though? A liberal interpretation, I suspect, could capture so much that is bad and nebulous about writing, particularly writing about technology.
Some terms, like “the Internet,” have become so vast as to be meaningless without first trying to figure out what they’re not – what is the Internet not, when it comes to technology? As I noted a few posts ago, the term has come to bind together software, hardware, networks, and many other disparate technologies into a homogenous term.
If it’s not everything, then it’s trying to become so by incorporating every device possible, through the “Internet of Things.” Sensors, “analytics,” and, yep, value-added services all pile into conversations about this term. All I know is that trying to write about “the Internet of Things” makes me sound like an inarticulate brute.