

Music vs book vs film criticism

In 2000, Brent DiCrescenzo of Pitchfork wrote one of the most infamous album reviews that still has a live URL on the internet. Assessing Radiohead’s “Kid A,” he straddled a line between the faux-literary (“The butterscotch lamps along the walls of the tight city square bled upward into the cobalt sky, which seemed as strikingly artificial and perfect as a wizard’s cap.”) and the musically incoherent (“Comparing this to other albums is like comparing an aquarium to blue construction paper.”), while tossing in some vague ethnic stereotypes (“The Italians surrounding me held their breath in communion (save for the drunken few shouting “Criep!”)”) and useless similes (“The primal, brooding guitar attack of “Optimistic” stomps like mating Tyrannosaurs.”), too. It’s a textbook example of the limits of popular music criticism.

Why is it so limited? For starters, it is heavily reliant on adjectives. Narrating what actually happens on any song – e.g., “The song opens in G Major and uses the following chords and key changes: [lists them all]” – would lose a lot of readers and likely stretch the review to a length that, along with the technical subject matter, would tank the page view stats of a site like Pitchfork (or Resident Advisor, or Tiny Mix Tapes, or any other music-centric website). So instead we get lots of adjectives; guitar solos are described as “fluid,” electronic instrumentation as “soundscapes,” and songs themselves as “airy” or “breathless” or “chugging.” These words usually make sense to me in context – like, I can see how a guitar solo might progress such that it seemed “fluid” – but they are somewhat removed from what’s actually going on. Modern music criticism would confound expectations if it began talking about what musicians were doing on their records – playing X, singing Y – and so it often resorts to elaborate descriptions, as well as the protracted narrative frames and cross-references to other pop culture that DiCrescenzo couldn’t avoid.

In contrast, book criticism cannot usually afford such ornate digressions. Any review of a book will naturally grapple with plot details and the author’s particular style, making it oddly both bread-and-butter and academic in comparison, without any of the criticism-as-art-itself that many music reviews turn into (indeed, it’s hard now to read DiCrescenzo’s review outside the context of Pitchfork’s larger culture of “artsy” music reviews that were only minimally concerned with the records in question, and instead focused on building Pitchfork’s distinctive brand during the early days of the web, when other music criticism sites were extremely barebones and newspaper-like).

Somewhere in between the extremes of music criticism and book criticism is the muddled middle of film criticism, which I’ll define as criticism of both movies and TV. Film critics inevitably must recite what happens on screen, similar to how a book critic can’t escape divulging some plot details; but they also frequently fall into the same rabbit hole that troubled DiCrescenzo, leaning on nebulous adjectives such as “languid” or “swoony” to describe a film’s appearance, or resorting to cliches about self-evident choices, such as the plot being “fast-paced.”

My theory is that the easier a medium is to consume, the more given it is to adjective-centric criticism:

  • Books cannot be multitasked and can take days, months, or even years to finish.
  • A movie can be watched in a single sitting, but will usually take at least an hour to finish; a TV show requires even less exertion, and is often a second screen to the viewer’s phone/laptop.
  • An album can be listened to in under an hour, plus it can be consumed “out of order” in a way that a book or film cannot; it is almost meant for multitasking, as the soundtrack for nearly anything.

Book critics have to focus on the plot because they cannot assume that anyone has read the book. Meanwhile, music critics can be flashy since they are often speaking to people who have already listened to what they’re reviewing (and thus know the “plot,” as it were, of the album or song). The music critic’s task becomes not so much to provide guidance on whether the album or song should be consumed at all (as is the case with book criticism) but instead to tell the reader what cultural pigeonhole it fits into and whether it is OK to like it at all.

In this respect, music criticism is highly identitarian. DiCrescenzo’s review was a forerunner of the endless paeans to Beyonce that barely engage with the songs at hand but instead try to situate the subject as something beyond the possibility of different viewpoints: Liking it is Right, disliking it is Wrong. His “Kid A” tract labeled all other albums in music history as blue construction paper. Similarly, The Guardian’s review of “Lemonade” incoherently described the songs (“The songs, though, are not just prurient catnip, but actual dynamite”) and railed against an illusory set of doubters or would-be competitors (“Cynics will cry foul, that Beyoncé remains an entitled superstar, raging at a paper tiger. Those cynics will be ignoring one of this year’s finest albums.”), recalling DiCrescenzo’s weird aquarium quip.

Meanwhile, film critics act like they are dealing with a medium as elitist and as private as the book, but in reality they are critiquing works that are more akin to music in terms of their publicity and ease of consumption. At the same time, they have to work within the significant accumulated institutional cruft – the Oscars, “prestige TV,” the “golden era of TV,” the Cannes Film Festival (and its many derivatives), HBO (and especially “Game of Thrones”), Netflix originals, the insane desire for critical validation of once-scorned superhero movies – that is really like some of the worst vestiges of the book criticism realm, for example the notion of a definable “Western canon” that must be defended by critics like Harold Bloom.

But film is not like print. Here’s what I mean: an obscure film is more approachable than a well-known book; for example, anyone could see even a marginal piece of queer cinema with less effort than it would take to plow through either the widely known Infinite Jest or The Decline and Fall of the Roman Empire. To really feel the relative difficulty of consuming any book, consider the case of The Satanic Verses by Salman Rushdie.

It is likely the most controversial book of the last century, earning a death threat for its author from the leader of Iran, visibly straining relations between Iran and the United Kingdom, and leading to the murder of its Japanese translator and attempts on the lives of others involved in publishing it. But how many people have ever actually discussed the content of the book? The fact that it is written in a dense, Joycean style that makes even the first pages hard to get through? How its most controversial material occurs in a dream sequence?

The original New York Times review of it is instructive, for both its clear descriptions of plot and its acknowledgement of the divide between the book’s vast reputation and its meager readership:

“The book moves with Gibreel and Chamcha from their past lives in Bombay to London, and back to Bombay again. For Gibreel, there is many an imaginary journey on the way – most notably to a city of sand called Jahilia (for ignorance), where a very decent, embattled businessman-turned-prophet by the name of Mahound is rising to prominence…

[M]uch of the outrage has been fueled by hearsay. Some of the noisiest objections have been raised by people who have never read the book and have no intention of ever reading it…

It is Mr. Rushdie’s wide-ranging power of assimilation and imaginative boldness that make his work so different from that of other well-known Indian novelists, such as R. K. Narayan, and the exuberance of his comic gift that distinguishes his writing from that of V. S. Naipaul.”

The Satanic Verses is virtually “hot take”-proof, since even the effort required to blow through it and write a quick blog about “Here’s What Salman Rushdie Doesn’t Get About Islam” or “Why Bernie Bros Have Been Praising ‘The Satanic Verses’ This Week” is too much for most writers. But if “The Satanic Verses” were a film, everyone would have seen it, given its reputation, and the takes would be endless.

To get a sense of how limited the scope of book criticism is within pop culture, consider the common Twitter joke of responding to anyone comparing anything to “Harry Potter” by simply saying “read another book.” There is no work of fiction that has such a tight hold on the imagination, but there are numerous films – “Star Wars,” “Jurassic Park,” “The Godfather,” etc. – that serve similar roles for understanding events.

It is precisely this ubiquity of major films that makes film criticism much more reactionary than either book or music criticism. A scathing book review is only mildly rewarding for its writer because the audience for any book is so relatively small, plus the intensely private experience of reading – setting your own pace, especially – means that each person’s opinion of a book is better insulated against contrarianism and reassessment than a similar opinion of a movie or TV show. Music reviews are after-the-fact and must contend with the strong identitarianism of music taste (e.g., “am I still in good standing with [x] community if I like [y artist’s music]?”).

But film is often consumed in public (at a theater) or socially (in a living room), and so there is more incentive for viewers to signal to others that they hold the Right or Wrong opinions about it. The massive coverage of the Oscars (and the myriad issues about the backgrounds of who got nominated) and the enormous budgets of film studios and streaming services also mean that film critics have unique incentives to engage intensely with the conventional wisdom on any work. Inevitably, a lot of this engagement ends up reading like an angrier version of DiCrescenzo’s “Kid A” novella.

Take this Slate piece on “Lion,” which is only intermittently about the movie, but mixes in lots of personal backstory as well as a milder aquarium/blue construction paper contrast. It goes from an odd concept of how gentrification actually happens (“If I had a nickel for every time someone asked me where my real parents were or if I intended to go back home, I could gentrify the Chinese province I was born in.”) to a riff on the “putting unrealistic words in a kid’s mouth” trope (“Even at that age [6], people would ask me if I knew my “real” family, and, if not, when I planned on meeting up with them at Starbucks.”), to undergraduate term paper-ese (“Our collective and shared understanding of identity continues to grow more and more complex, nuanced, and perhaps less grounded in traditional notions of what our “self” is. 2016 feels like one of the most crucial years for art in the context of artists from marginalized backgrounds asserting their voices—not asking to be “understood” per se but to be respected for the nuances of and intricacies of their identities.”).

The same author also posted an interaction with someone else (whom I’ll refer to in the transcription below as “B” to his “A”) to his Twitter feed excoriating the same movie:

B: “I guess the last decent film I saw was…Lion? The one with Dev Patel. I thought it was good…could have been a little more emotionally powerful but they did a good job.”

A: “I hate that movie. I think it’s garbage and deeply reductive and offensive. A maudlin, little tale for white tears. Lion is like deeply terrible.”

B: “It was sort of based off a true story though which was cool, but yeah they could’ve done a better job.”

A: “Something based on a true story doesn’t change the manipulative techniques the story uses. It perpetuates a really annoying, very white narrative that the families of adopted children don’t count as real, that the core identity of adoptees is based on a biological imperative. It is across the board garbage.”

B: “Right, stories are pure manipulation though. The dude made sure his mother knew that she was his mom, the one that took care of him all those years. Wouldn’t one want to understand where he or she biologically came from? Whether that be an adopted child or the child of immigrants that met in America. There’s so much underlying psychology that comes with your blood. It would be advantageous to know your nature as well as how you were nurtured.”

A: “That’s such a drearily lazy argument. Anyone can say that, anyone can make a straw man argument and deflect actual engagement with a cultural text. Of course art is manipulative, if your base understanding of manipulation (in art) is make the audience do anything. But art can engender and invoke feelings in an audience and exist in complexity. ‘Make you cry’ is not necessarily emotional complexity, and while not all films may necessarily call for that, Lion specifically mires itself in low-key racist tropes and has a fundamental disinterest in the nuances of adopted identity. It reduces the identity of an adopted person, and what constitutes family, to a one-dimensional thing, without bothering to explore the political and personal implications of trans-racial/cross-cultural adoption. It offensively relies on adoptee and racialized identities that are superficial, that are without depth. Patel isn’t a character so much as he is a MacGuffin, moving the plot along from point A to point B, unconcerned with the ambiguities. Lion says that in order to be, as an adopted person, a person, you need to find your ‘real’ family, that only your biological family counts as who you are, completely ignoring the way that environment and upbringing and socialization within whiteness has/has not shaped him as a person.”

B: “Right, I get what you’re saying. But I’m saying that at its base, ‘Lion’ is about a boy who got lost, accidentally got adopted, and eventually tried to find his way back home. With epic cinematography along the way.”

To me, the telling part of this exchange is how “A” (the critic), after opening weakly with declarations about “Lion” being “garbage,” completely loses his footing after “B” says “There’s so much underlying psychology that comes with your blood,” capturing the nature/nurture divide in what amounts to a latter-day hippie-like aphorism. Everything in “A”’s response from “That’s such a drearily lazy…” to “tropes” is word salad, although he regains his composure a bit with his critique of identity.

His scorn for Patel being a plot device (if you’re unfamiliar with the term, a MacGuffin is a goal or object that moves a plot along with or without any accompanying narrative exposition) and his mocking (in one of the actual tweets; not in the above transcript) of the “epic cinematography” remark are also revealing. The mechanics of the movie – how its plot works, what it looks like – are entirely subordinated to riffing about its identity politics, which is somewhat incoherent, since the critic wants race to be non-determinative for adoptees, but not for “white” people (the word “white” is doing a vast amount of unexplained work in that exchange; it is not so much a word as a MacGuffin, moving the screed from point A to point B).

The entire rant reminded me of a seemingly endless stretch back in the mid-2000s when I was in college, when a friend would go on each Saturday morning to our brunch group about “Little Miss Sunshine,” bemoaning its prestige at the Oscars. There is really no equivalent to this behavior among book or music critics, since both fields are so atomized compared to film, which continues to have a far more centralized academy of critics, producers, directors, etc. What book would an angry book reviewer rail against in casual conversation (other than “Harry Potter,” which has almost exhausted the possibilities on this front, especially with the backlash to J.K. Rowling’s politics providing a delicious new reading of the series)? What album could attract such intense diatribes in a public forum?

Film critics, from Roger Ebert to Pauline Kael to our writer above, are reactionaries because the specter of the “wrong” type of art gaining prestige and adoration is so much more prominent than it is in the book or music spheres. A movie that a critic dislikes getting feted at the Oscars, or receiving an ovation at the end of a screening (a la “Star Wars”), must engender a feeling similar to that of a Republican voter seeing “Hollywood celebs” on TV or thinking about a “liberal” enjoying same-sex relations or marijuana: derision, motivating a desire to correct the record. This tendency even seeps into the work of coherent writers like NYT film critic Wesley Morris, who used to complain about ill-defined “elitists” (a central term of conservative discourse) who didn’t appreciate popular film. The unbearableness of so much film criticism is why I agree with Noah Smith that cinema is a dying art with diminishing public relevance, in part because its critical institutions are such a mess.


The slow death of Thatcherism

The famous William Faulkner quote about “the past not even being past” has staying power not only because it contests the idea that time is a one-dimensional line that moves “forward,” but also because it reveals how ancient decisions shape our lives even to the current second. Most people alive today weren’t even born when Margaret Thatcher became Prime Minister of the United Kingdom in the 1979 general election. She died years ago. But her ideas are very much alive.

Thatcher ended decades of postwar consensus that had seen the rise of social welfare systems across Europe and North America. Her zeal for high defense spending, low taxes, and less regulation kept Labour out of power for a generation while providing a blueprint for the ascendance of Ronald Reagan – who would take power less than two years later – across the Atlantic. 

Many of us have no recollection of a time when it wasn’t assumed that everything had to be run like a business, in a “competitive” environment in which everyone is on her own and “the government” is some dark entity that must be reduced, instead of the people and institutions that make life bearable. Sure, these ideas had long gestated among the economists of the morally bankrupt Chicago School (mainly Milton Friedman), but Thatcher turned their academic papers into reality, crushing the miners’ unions and setting off a prolonged run of privatization and deregulation.

Even the distinctive brand of military adventurism that has fascinated Western governments and cable news channels since the Gulf War is derived from Thatcher’s decision to fight with Argentina over the Falklands. Almost all military campaigns since then – from Grenada to the Iraq War – have followed the same lead of confronting a clearly outmatched foe, to achieve morally and/or strategically dubious aims.

Although both the U.K. and the U.S. have had small-‘l’ liberal governments post-1979 – Blair and Brown in Britain, Clinton and Obama in America – the truth is that the Thatcher consensus has gone largely unchallenged. The centrism of Blair and the rebranding of Clinton’s party as “New Democrats” were signals of how they operated as much within the Thatcher/Reagan mold as Eisenhower had within the constraints of the then-dominant New Deal regime. Blair’s affinity for military adventures in the Balkans and Iraq and Clinton’s willingness to pursue “welfare” “reform” were both ripped straight from the small-‘c’ conservative playbook. It’s no accident that Thatcher herself identified New Labour – with its scrubbed mentions of national ownership of industry in its party constitution – as her greatest achievement.

The two countries have followed similar paths for the last 40 years. Both Thatcher and Reagan decisively won all their general elections and then handed the reins to their competent but less charismatic successors, John Major and George H.W. Bush, respectively. Those two continued in a similar but slightly more moderate vein, only to lose in landslides in the 1990s to candidates (Blair and Clinton, respectively) from revamped center-left parties (i.e., Labour and the Democrats), institutions that would have been unrecognizable to party rank-and-file in the 1970s. 

While both Blair and Clinton were electoral juggernauts, they both had much less success than either Thatcher or Reagan in laying the groundwork for their successors. Blair resigned and helped the unpopular Gordon Brown become prime minister; he lasted less than three years, losing power to a Tory-LibDem coalition in 2010. Clinton’s efforts could not get his VP Al Gore or his wife Hillary Clinton over the finish line. Like Brown, they both lacked the political acumen or popularity to stop the reactionaries who narrowly defeated them (Cameron in the U.K., Bush 43 and Trump in the U.S.). The only point at which the two histories diverge is with Obama, but he largely governed within the standard Reagan model, with sprinkles of Clintonism, including many of Clinton’s own personnel.

At a glance, Thatcherism and its numerous derivatives seem to be in strong health. Both the Conservatives in the U.K. and the GOP in the U.S. control the government. Both continue to pursue the same right-wing policies of the 1980s, arguably with even more aggression than their predecessors – just look at Theresa May’s fixation on a “hard Brexit” (that is, with maximal breaks from EU immigration rules and economic integration) and Trump’s almost comically plutocratic commitment to taking away people’s health insurance to finance tax cuts for billionaires.

“Comical” – there is something weirdly humorous about what the right-wing parties of the West have become, though, isn’t there? 

The Conservatives campaigned on a platform of “Strong and Stable” leadership, but their last two PMs – Cameron and May – have taken monumental gambles (the Brexit referendum and the 2017 snap election) that spectacularly backfired. Having lost their parliamentary majority to a Labour surge led by one of the furthest-left MPs in Britain – Jeremy Corbyn, whom they labeled a “terrorist sympathizer” – they must now form a coalition with the Democratic Unionist Party of Northern Ireland, a hard-right creationist party with deep ties to loyalist paramilitaries (a fancy term for white terrorists who kill Catholics).

Meanwhile, the GOP, the home of the heirs to Jerry Falwell’s Moral Majority and nominally the party of strong national defense, is led by a former reality TV host who once went bankrupt running a casino (“you mean people just come in and give you their money in exchange for nothing? Sorry, I’m going to need a better business model” – no one ever, except possibly Donald Trump) and has been caught on tape confessing to routine sexual assault. Plus, party membership from top to bottom is deeply enmeshed with Russian spies and businesses.

And both parties have lost control of the issue of “terrorism,” once easily wielded by right-wing leaders like George W. Bush, to the point that May is literally negotiating with loyalist militias and the GOP has cheered an ISIS attack against Iranian civilians.

Whether these flaws matter to core right-wing partisans is debatable, but it is clear that the ubiquity of conservative policies and their demonstrable failure – visible not only in May being forced to align with terrorists and Trump with Russian autocrats, but also in the collapse of the global banking system after decades of Thatcherite deregulation – have energized the left in a way that had nearly passed out of living memory.

Bernie Sanders is the most popular politician in America and ended up a few hundred delegates short of snagging the Democratic nomination and likely becoming president. Corbyn went even further and humiliated May, turning predictions of a massive Tory majority heading into Brexit negotiations into a hung parliament. Given the tenuous Tory-DUP coalition, it is probable and perhaps inevitable that Corbyn will eventually be PM. 

Both of these men are senior citizens who for most of their careers were dismissed as “unserious” leftists who would never enter the mainstream. Instead, they have a golden opportunity in their twilight years to finally eradicate Thatcherism root and branch by unseating two truly awful politicians. May and Trump are almost like the store brand versions of Thatcher and Reagan, far less compelling despite their surface resemblance (May as another “Iron Lady,” Trump as another celebrity GOP president and “Great Communicator”; in reality, they are the Tin Lady and the Great Convfefer).

The level of media anger and disbelief at both Corbyn and Sanders deserves its own entry, but it is ultimately indicative of how much of both British and American society remains captured by Thatcherism and “centrism,” neither of which has been seriously challenged until now. New Labour, New Democrats, it’s all eroding and exposing the decrepit foundations of Thatcherism and Reaganism. The fact that the “terrorist” attacks in Britain did not hurt Labour but instead exposed the incompetence of Tory security policy (as Home Secretary, May literally cleared one of the London Bridge attackers to go fight in Libya!) was a turning point in how I viewed the staying power of Western conservatism. It seems weak, its unpopular ideas barely propped up by cynical appeals and a dying electoral coalition. After almost 40 years, we’re finally seeing what Winston Churchill – that old Conservative – might be comfortable labeling “the beginning of the end” of Thatcherism.