What is the difference between being good and being great? The question is a black hole. For starters, the word “good” is a powerful vehicle for imposing different kinds of moral philosophy, with a long history of being situated opposite both “bad” and “evil” (not the same concept, and born of two different cultures), as we looked at yesterday with Nietzsche’s The Genealogy of Morals.
“Great” isn’t as good as “good,” at least as words go. It has always seemed like a distraction; having established what “good” is – a vast undertaking beyond the strength of any individual, and one that, even accomplished, only yields an institution that can be challenged, like the Roman one that the Judaeo-Christian world fought against – a culture uses “great” as a layer on top of “good,” to draw attention away from the question of “what is good?” and instead transfix everyone with the notion of things that seem to be outside of the good-bad or good-evil dichotomy – i.e., things that are “great.”
But really, “great” does work on behalf of “good.” It is used to soften up our interpretative minds by directing us to things that, the bestower argues, are worthy of something akin to worship, which is a passive activity since the worshipper is not capable of constantly scrutinizing the worshipped and instead settles for assumptions – e.g., Jesus is the son of God; OK Computer is the best album of the 1990s – that invisibly take on the evaluative work once done by comparing “good” to its opposite du jour. It makes sense that a famous hymn is entitled “How Great Thou Art.”
Numbers, averages, and ease
Even if one believes that using “great” rather than “good” is just a matter of degree, there’s still the issue of drawing a line between the two. The difficulty of doing so – as opposed to the knee-jerk ease of calling something “bad” or “evil” and seeing that it is, in the mind, discrete from “good” – speaks to the odd place of “great” in the cultural lexicon. So naturally, numbers are often invoked to make it appear like there’s a clearer distinction.
A recent essay from Paul Graham attempted to show the difference in the context of computer programmers. He wrote:
“it’s easy to imagine cases where a great programmer might invent things worth 100x or even 1000x an average programmer’s salary.”
Now, for context, we’re possibly talking about someone who is “good” and merely keeps a codebase functional versus someone who is “great” and invents something of uncertain cultural value like Snapchat. More generally, it seems to be a difference between workaday existence and the freedom conferred by turning code into a vast commercial enterprise run by a “great” programmer who, the mythology insists, had the wherewithal to create value on his own.
Accordingly, we can also see the bones of the old nobility-vs-everyone else morality grafted onto “great-vs-good,” or rather in this case “average,” a word with an interesting etymology. “Average” comes from the French “avarie,” meaning “damage to a ship” (or to its cargo). To be average is literally to suffer financial loss, so “great” makes sense as a synonym, in this particular context only, for financial gain – a nice happy accident, if words are your game.
He also uses “ordinary,” which has roots meaning “orderly” or “customary.” This usage subtly, but only in passing, flips the script by making “great” non-conformist, despite it being invested with the most clichéd value system possible – that the best idea is the one that makes the most money.
It’s notable that “bad” is missing in action here. The main foil is “average,” speaking to growing economic inequality and the current fixation on assessment, discussed in the book Average is Over: Powering America Beyond the Age of the Great Stagnation, with a title implying that a “great” solution is needed to overcome a “great” problem. But the irony is that Graham’s claim of 100x or 1000x the value is arbitrary and unverified, and it seems hard to imagine a company hiring one “great” engineer over 1,000 “average” ones.
Roger Ebert had his own ideas on “great,” in this case situated against “almost-great” rather than “good”:
“[T]rue geniuses rarely take their own work seriously, because it comes so easily for them. Great writers (Nabokov, Dickens, Wodehouse) make it look like play. Almost-great writers (Mann, Galsworthy, Wolfe) make it look like Herculean triumph. It is as true in every field; compare Shakespeare to Shaw, Jordan to Barkley, Picasso to Rothko, Kennedy to Nixon. Salieri could strain and moan and bring forth tinkling jingles; Mozart could compose so joyously that he seemed, Salieri complained, to be ‘taking dictation from God.'”
Again, the foil isn’t “good” per se, though Ebert’s “almost-great” is quite different from Graham’s “average.” The line here isn’t financial value, but ease of creation, which is even harder to parse (unless, I guess, you’re truly “great” and know nothing but ease in your chosen endeavor). For me, painstaking work is sometimes terrible and sometimes good, while writing that comes easily also has wide variance in quality. I have noticed that some of my creative writing that came easily was better than what came hard, but it also felt like the former owed a lot to the latter.
Next time, I’ll look at some passages to probe this dichotomy of “great” and “almost-great.” I doubt it will yield a 1000-to-1 difference as with Graham’s great-vs-average, or even anything concrete or generalizable, but it will give me a chance to talk about why anyone thinks public artistic and aesthetic critiques matter in the first place. Ebert’s point about taking work “seriously” may also have a role in this explanation.