A left-field entry about diets, hair loss, and transformations

Almost all diets fail. The force of will of most human beings is no match for bodily urges and the peculiar design of the brain, which, because of the relative sizes of the adrenal glands and frontal lobe, is much better at giving in to instincts than at foreseeing consequences. Diets also, by and large, aren’t even meant to make their would-be adherents thinner: They’re meant to separate them from money, as well as from free time that could be devoted to so many other projects. The opportunity cost of a diet is immense, as Melissa Fabello once argued in her polemic against diets as tools of capitalism.

My own experience: Diets
Dieting never occurred to me until I was 25, at which point I had gained almost 40 pounds from the baseline weight I had maintained for almost a decade. My gain started sometime in 2009, when I was 22, not long after I had resumed taking Prozac and finished my degree. I am not sure if any of these events were related, but I do remember eating much more regularly than I had before, when I routinely slept until noon and didn’t eat anything until dinner.

2009 was also the first time I noticed a difference in my hair pattern, namely thinning near the temples – “noticed,” because it’s possible that these characteristics had emerged much earlier and I had simply missed them due to the length of my hair between about 2003 and 2008. I am blonde and sometimes let my hair grow out a lot because it looks shorter than it really is. Looking back through Facebook albums, I can see that distinctive receding temples + prominent widow’s peak pattern throughout 2009.

In mid-2011, at my sister’s college graduation, both my weight and my hair had further deviated from the norms of my college and high school years. I neither noticed nor cared, though, which seems so strange in retrospect. I remember getting comments about my face being fuller one Christmas, but overall it felt like not much was changing, except that I wore different (larger) pants and kept my hair shorter. Much of my time then was consumed with finding work.

A year later, I finally decided to change course because I was feeling sluggish a lot, perhaps from being sedentary at my job at the time and spending too much time on the Internet at home. I started a VERY simple at-home exercise routine that consisted of:

  1. Elevated pushups (with my feet in a large box full of clothes, so that my entire body was at an angle)
  2. Squats, being sure to go below parallel (agonizing at the time, but seems like second nature now)
  3. A sort of modified sit-up, with one leg arched and the other flat, using the chest to lift the body partially up toward the ceiling

That was it for the first few months. Eventually I made some modifications:

  • I got a pair of pushup handles for $3 from a store in Philadelphia one Christmas.
  • I obtained a pull-up bar from Marshall’s and purchased some gloves from Nike so that I could avoid calluses.
  • Finally, I picked up a 15-pound kettlebell, also from Marshall’s.
  • With this new gear, I added pull-ups and one-legged bar pushups (replacing elevated pushups) to my workout.

I did all the exercises at home and went on a few walks. The entire regimen cost me maybe $40 between 2012 and today, about $1 for every pound I lost.

I changed what I ate too, although I did not do anything radical. I basically just:

  • Stopped eating cereal for breakfast every day
  • Replaced chips, pretzels and cookies with grapefruits and cucumbers as snacks
  • Switched to stevia from sugar in coffee, tea, etc.

For me (and perhaps for men in general), it seemed like curtailing sugar intake was the single most substantial dietary change. Everything else was secondary. I still ate lots of high-fat foods like hamburgers and fries but continued dropping weight.

I am not sure if my advice or case has relevance for the general population. I had always been thin and perhaps had changed body type during an anomalous and tumultuous period in my life (post-college, on antidepressants, etc.) that eventually subsided and allowed me to regress to my personal mean.

My own experience: Hair
Now back to hair. If there’s anything dieting and hair loss mitigation have in common, it’s that they feel like – and often are – impossible battles. How many billions (trillions?) have been sunk into dieting and hair loss “solutions” over the years?

My hair seemed to thin out some from 2011-2013, perhaps from stress and male pattern baldness (MPB). There’s some of the latter on one side of my family, but my case, if indeed I have it, seems mild so far. I noticed some temple thinness in 2009 and what seemed like a larger forehead sometime in 2013, when I was in Iowa at a casino on the night of my wedding.

The pattern for me is around the temples but doesn’t seem to affect the crown. The first concrete action I took to see if I could re-thicken my hair was to use a Bosley regimen of shampoo + conditioner + thickening treatment. It seemed so-so: Most of its power, I think, came from the sheen it gave the hair after step 3 (the thickener).

In late 2014, I began taking saw palmetto supplements as well as biotin. Saw palmetto is a palm that grows in Georgia and Florida and produces a fatty fruit. It was apparently widely used in Native American medicine and has been compared to Propecia because of its supposed usefulness in treating prostate inflammation. Biotin, also known as vitamin H, is a B vitamin that is supposed to help with the protein structure of hair and nails.

Disclaimer: neither saw palmetto nor biotin is clinically proven to have any effect, positive or negative, on hair. Maybe it’s a placebo effect, but I do feel that 6 months of taking the saw palmetto has made my hair…fluffier? It’s hard to describe. I had a lot of hair in the front and center of my head, so maybe the active mechanisms had something to work with.

I also used a shampoo made with Dead Sea minerals, called Premier. Apparently it oxygenates the hair follicles by opening up the pores on the scalp. I cannot speak to its power yet since I haven’t used it for that long, but I like the texture it is creating so far. Overall I am happy with my hair in 2015. It looks a lot like it did in 2009, for what that’s worth.

Transformations
All of this is needlessly vain, I realize. The “advantages” of being thin and full-haired are often touted by industries staffed by people who are neither. Much of the allure of both traits is just that: A temptation to spend a lot of money on dubious “solutions” that cannot deliver on their promises.

I once read a great Quora post about how the past was the scarcest resource in the world. Nostalgia alone fuels the high prices of everything from New York real estate (“[famous person] lived here a long time ago!”) to hair transplants (“I can look like I did when I was 25 again!”). It’s so true. Dieting and hair restoration are both, more often than not, presented as tickets to a glorious past, a youth that was actually the product of nothing more than metabolism and genetics.

However, I do think it’s possible to go on a journey by fighting the uphill battles of dieting and hair loss mitigation. I see no inherent virtue in being thin or having a full head of hair (these are such shallow, stupid criteria on which to judge anyone), but sometimes the experience of trying to reverse one’s current state – to simply snap out of whatever the daily norm and routine has become – can be a powerful learning experience and the fuel for personal transformation, no matter what the results.

The writer’s block myth

Writer’s block is a myth. The term itself is instructive: it could be construed as either a literal block encasing the writer, or a sort of solid mental state, not given to “fluid” thought. It’s a bit of both: I feel that “writer’s block” is typically too much thinking about thinking (hence the mental “block” aspect), and that physical action is the way to break it (insofar as it is like a physical object that can be broken).

The “right” state of mind will probably never come
At the beginning of this year, when I was blogging every single day, I was penning 2000-word essays in the evening after having written thousands of words for work during the day. Essentially, I was writing the equivalent of a short novel every 1.5 weeks. Moreover, since starting at my current job in the summer of 2013, I have written, I would estimate, almost 2x the volume of the King James Bible between job-related and personal projects.

Am I proud of all the material I’ve written? No. But my ability to “force” this amount of words through, perhaps in rough form early on but with plenty of refining along the way, has taught me that writing is ultimately a mechanical act; it’s not really about inspiration, at least not in the sense that is so often romanticized. It’s about mechanics.

The mechanics of writing – typing, scrawling, whatever – are an unending act that presents many possible plot twists along the way. Names, constructions, entire questions that I never would have come up with while thinking idly spring to life when I have to actually commit to something, and I know I’m not alone. The “inspired” state is constructed and willed; it doesn’t descend from on high.

Act to think, don’t think to act
It sounds like bad advice, but hear me out. Sometimes I will compose lines or scenes in my head (I’m working on a collection that I will make into an ebook soon). But a-ha type moments are rare. Most of the exciting possibilities only unfold when I sit down to write – that’s part of what’s so exciting about the endeavor. The act is the catalyst for the thinking, not the other way around.

I am aiming to get my e-book done before my birthday in August. Then it’ll be on to the next collection or project. It feels like continually producing, with occasional stops along the way to restructure and let ideas simmer, is preferable to trying to plan out everything from the get-go.

There are some techniques I have tried on that front, though, such as using Excel/Numbers to plan a plot. It can be a good technique for thinking sequentially, but I feel that it can lock one in before she even gets started with the actual narration and description. Right now I am trying to approach each story as its own island, with a different style, so I am forgoing this method. “Writer’s block” hasn’t emerged yet, and I feel that it won’t as long as I just keep doing the writing “exercise,” no matter how painful it feels for the first couple of minutes when I’m running through ideas.

Silent Film and “the Internet”

In late 2013, we watched our first full-length silent film, “The Thief of Bagdad” from 1924, starring Douglas Fairbanks. Fast-moving, with an endlessly engaging score (a loop of “Scheherazade” by Rimsky-Korsakov), it’s a good “break-in” film for anyone unfamiliar with the silent era. Fairbanks excelled at swashbuckling roles, and “The Thief of Bagdad” is one of the swashbucklingest movies ever made. He dances around with his scimitar and dives into the sea to fight off monsters, too.

Since that time, we have explored a few other silent-era films, including the corpus of Kentucky director and Hollywood godfather D.W. Griffith. I recently finished his “Intolerance,” from 1916, the follow-up to 1915’s blockbuster “The Birth of a Nation.” The latter rewrote the rules by being, essentially, the first feature-length blockbuster, with a continuous narrative covering the years before, during, and after the American Civil War. “Intolerance,” though less famous, may be Griffith’s best work.

A screen grab from “Intolerance.”

Split
I have always liked the idea of split stories and parallel action; “Intolerance” provides little else for its epic 3+ hour duration. There are four stories, each documenting a moment in history when intolerance of other belief systems or moral codes was the preamble to violence: an ancient Babylonian story about the attack on the city by Cyrus the Great, a Judean story about Jesus, a French story about the St. Bartholomew’s Day Massacre, and a modern American story about a mill strike and a group of, well, intolerable moralists.

The variety of “Intolerance” makes its epic running time go by swiftly. Griffith employs many different color tints, a melange of musical samples, and some strange interstitial techniques, like a woman rocking a baby in a cradle (representing the passage of time between the film’s chosen eras) and a background shot that includes what looks like the script/screenplay for “Intolerance” itself – how meta. Textual snippets are also given period-specific cards, such as a tablet for the Babylonian story.

Relevance
“Intolerance” is 99 years old this year, but perhaps because of its cosmopolitan subject matter it seems less dated than “The Birth of a Nation,” which represented and embraced the retrograde racial attitudes of its period. Another thing that makes “Intolerance” seem so modern is its ambition. The budget ran well into the millions of dollars – in 1916! The sets, such as the Babylonian city that Cyrus besieges, are sprawling and look great almost a century on – behind the color-tinted shots and film crackles, they now seem as old as the times they tried to depict.

Some of the film’s imagery and topics, especially in the Babylonian story, remain relevant for 21st-century viewers. The question of whose god is mightier – Bel-Marduk or Ishtar – and the shots of people falling to their deaths as huge siege towers topple have an uncomfortable symmetry with 9/11, for instance.

Part of what is so striking to me now, though, about “Intolerance” and silent films in general, is how “Internet”-like the entire experience is. There’s the variable pacing of moving from one card to the next and reading the text, just like one would do with a webpage (with the important and obvious difference of not being in control of the direction – although one could say that people addicted to Facebook or forum arguments are hardly free from inertia in this regard…). There is the card-by-card, shot-by-shot attention to design and layout (“Intolerance” even has footnotes for some of its textual snippets!) as well.

History
Earlier this year, I wrote about how “the Internet” is a term applied retroactively to a bunch of actually separate histories – networking, software, hardware, etc. – with the added current connotation of a medium through which its users receive information. It used to be called by different names – “cyberspace” is perhaps the best example of this class of outmoded labels, as it conceives of connectivity as a space rather than a medium – and, if one wants to get technical, the vague principles of “the Internet” go all the way back to the telegraph, which was a much bigger break with what came before it than, say, TCP/IP was with its predecessors.

Before watching “Intolerance,” I hadn’t thought of silent film as a part of “Internet history.” But the design tropes of silent film are, if anything, becoming more, not less, prevalent in media. Pushing cards or snippets of content – say, Snapchat Discover, Twitter’s “While You Were Away” feature, or the stream of matches on an app like Tinder – is an essential mechanism for many of today’s mobile services in particular. The integration of video through services like Meerkat (which lets one stream live video to her Twitter followers) only makes the lineage from silent film to “the Internet” more apparent.

In a way, “the Internet” hasn’t even caught up to the immersive experience of silent films, which often not only pushed discrete cards and pieces of narration at viewers (ironically, to support a continuous narrative) but also featured live orchestras in grand settings. Videoconferencing (FaceTime, Skype) and the likes of Snapchat and Meerkat strive for that same immediacy that Griffith et al. captured in the 1910s.

One more intersection: For someone used to talking movies, watching a silent film can feel really lonely, because no one is talking. For me, this exact sort of silence and proneness to becoming lost in thought – for better or worse – is endemic to using “the Internet.” It’s strange, really, that in an extroverted society like the U.S., in which silence is barely tolerated in meetings and the like, so much mental energy is channeled into the inaudible actions of responding to emails or skimming BuzzFeed. I would much rather wordlessly watch “Intolerance” again.

The loss of freedom

“Freedom”
Coming of age in early-to-mid-2000s America, my generation had unprecedented opportunity to interpret the word “freedom.” The word was mercilessly flung around by the entire political spectrum, most often positioned as something in dire need of preservation following 9/11.

Soldiers in Afghanistan and Iraq were said to fight for “freedom,” and French fries were renamed “freedom fries” so as to, for some reason, slight the country that helped America win its Revolution in the first place. With time, a word that might once have denoted independence from tyranny instead connoted everything from billionaires having the “right” to decisively influence elections to the nation’s working-poor gun owners having the “freedom” to own firearms that pose a much greater threat to them than any intruder. That really runs the gamut.

Even as a 16- and 17-year-old, I didn’t feel that “freedom” was at stake for Americans in particular in regard to the Iraq War and other conflicts. Granted, it was very much at stake for the inhabitants of those lands, who, good or evil, had to abandon their previous lives in either fleeing or fighting back. There’s little freedom to be had when one is running for her life.

The dour American political climate of the 2000s was really just one symptom of the decay of “freedom” (and “free”) as a word free of partisan connotations. It has become doublespeak, after 70+ years of serving as the go-to rallying cry against inflated threats like communism and terrorism. Its decline as a meaningful descriptor, I think, correlates perfectly with the growing absence of any existential threat to America.

“Freedom”: Not at stake
Not a single American alive today has fought or taken any political action that meaningfully preserved her nation’s “freedom” from foreign threats – the last person who did so was probably a Union soldier who helped the U.S.A. defeat the C.S.A. in the Civil War and eradicate slavery. Had the Confederates won, then, yes, the prospect of freedom would have been lost for millions who would instead have lived under a white supremacist republic.

What about the World Wars? Let’s imagine the German Empire winning World War I – Niall Ferguson did so in his book “The Pity of War,” and the results aren’t that much different from what we see today with the European Union – a German-dominated trading bloc and currency union. Germany had neither the international reach nor the materiel nor the will to take over the U.S., despite its overtures to Mexico in the Zimmermann Telegram.

German victory in World War I likely would have prevented World War II from happening, but let’s imagine it happened anyway in this alternate history. The “freedom” that Americans fought for in this conflict was the freedom of the people of other nations from the fascism (more on this word in a bit) of Nazi Germany and the Empire of Japan.

But the war’s decisive blow was struck by the U.S.S.R. at Stalingrad, and the nations that would have likely lived under the Third Reich instead lived behind the Iron Curtain of Stalin and his successors, who were no paragons of personal liberty. A victorious Axis, as imagined in books like “Fatherland,” would have likely resulted in a Cold War (between the Anglosphere and continental Europe) with slightly different contours.

While the postwar years were full of dangerous incidents, peaking with the Cuban Missile Crisis in 1962, the 70 years since the end of World War II have overwhelmingly been characterized by the steady decline of warfare, and even of vague hostilities, between great powers, as well as the gradual redistribution of wealth from a handful of countries (the U.S. accounted for roughly 50 percent of global GDP in 1945 but has declined to the low 20s since then) to everyone else. Fascist regimes eager to stamp out “freedom” are few and far between – even when a candidate like ISIS arises, its appeal is so limited and its power so slight within the capitalist world that it just doesn’t make sense to argue that any American’s “freedom” is riding on the outcome of a war in the Middle East or Central Asia.

Interesting times
But still, “freedom” is often trotted out as something in need of costly preservation, perhaps in deference to Thomas Jefferson’s worrying assertion that “the tree of liberty must be refreshed from time to time with the blood of patriots and tyrants.” I don’t agree with this line of thinking, but it is bound to have its adherents thanks to the cult of personality that has emerged in America around the Founding Fathers.

The 2001 invasion of Afghanistan was initially called “Operation Enduring Freedom,” positioning the struggle against al-Qaeda as a monumental civilizational battle on par with the only real reference points that the developed world had: the Cold War and World War II. Perhaps the conception of the Afghanistan War as a battle in which “freedom” itself was at stake was inevitable since, in the wake of the dissolution of the U.S.S.R., existential threats to the notion of democratic government and free markets simply didn’t exist. One had to be invented.

Only in the last few years has terrorism finally and rightly been deflated as an existential threat when discussed in public by U.S. government officials. That hasn’t been enough to stop the misuse of the word “freedom,” though, or the broader grandstanding about the severity of the threats that civilization faces. Benjamin Netanyahu’s speech about Iran to the U.S. Congress, in which he compared the Islamic Republic to Nazi Germany, is a great example of how the far right of the political spectrum continues to create bugbears for this notion of “freedom,” at a time when I’m not even sure they know what “freedom” means, the word having endured so much abuse over the last century or so.

Josh Marshall got at what I’m trying to get at in a post for Talking Points Memo last month, about how we can’t all “live in interesting times.” He went after Paul Berman’s “Terror and Liberalism” and the hysteria of the Iraq War, with its trumped-up language about fascism, liberalism, and freedom:

“Berman’s book was something like the summa of that intellectual’s penchant for over-thinking things, that desire to have the times you live in match the headiest, most consequential and perhaps most idea-driven times of the past. If you’re a writer, an intellectual of a certain sort, who wouldn’t want to be Orwell in the late 30s and 40s or Hannah Arendt a bit later?…World War II ended 70 years ago. Outside of far-left pubs and Tea Party circulars, ‘fascism’ has not existed in any coherent form for a very long time. But some people can’t resist the hifalutin equivalent of dressing up as cowboys and Indians or cops and robbers. Play-acting. Fantasy. At least the Civil War and World War II re-enactors know they’re reenacting.”

Moreover, language about “freedom,” in the absence of a true foil, has been redirected at any number of absurd targets. Take Phil Gramm’s “alternative” to the Affordable Care Act, in the event that the U.S. Supreme Court invalidates that legislation’s premium tax credits. He thinks that another similar plan could be instituted as a replacement, albeit with the insurance markets deregulated. What did he call it? The “freedom plan” – apparently the freedom to prey on the poor and make exorbitant profits on essential care.

“Freedom” as a tool of the elites
The enlistment of the seemingly uncontroversial, universal concept of “freedom” into propaganda about unnecessary wars and draconian insurance plans, I think, indicates that “freedom” is now doing the exact opposite of the verbal work it may once have performed for insurgencies or rebellions. “Freedom” has, basically, become a code word for the status quo: the interests of the elite in preserving an increasingly unequal economic system paired with a lack of politically accountable institutions.

Against big business? You’re against “freedom.” Ditto for unchecked political campaign contributions (protected under “freedom of speech,” stretched to an almost absurd meaning). The irony of the seemingly constant campaign to save “freedom” is that freedom (you know, the version that doesn’t need scare quotes) is being subtly lost, in that living a life in much of the developed world, outside some degree of surveillance and financial pressure, is becoming difficult. There is a loss of freedom, for sure, even while “freedom” still goes strong.

Turning a Facebook comment into a post

I wrote this on someone’s Facebook wall and decided I would clean it up and make it into a blog post. By the way, I know that I lapsed and missed a few days after my huge streak to start the year. There was a death in my family and I also had a hectic trip from NYC to Las Vegas. Anyway:

“Professionalism” is about conformity to a *very* narrow idea of success. It assumes the worst about people – that what we look like is somehow indicative of our worth and that snap judgments (“baggy clothes, no good”) are merited. It’s ironic that so many “professional” organizations take extensive measures to ensure that they don’t discriminate (Internet job applications have been a godsend for HR departments in this regard, since they can facelessly turn someone away without having to worry about allegations that appearance was an issue), yet they constantly discriminate based on nonsense like whether you’re wearing a suit and, more subtly, whether you even have enough money and status to really be a “professional.” If you want proof that the suit-dominated world has its roots in patriarchy, look at the suit’s history as something men wore while hunting in centuries past.

I find that the word “professional,” when used as a self-descriptor, is filler – there are so many other terms that could be substituted that would tell me more about you. But its usage makes sense: It’s a keyword for a certain type of hierarchy, a differentiator meant to cut off all those “non-professionals” who have to wear company-supplied uniforms (think fast food or retail) or lope around in jeans and a hoodie.

Speaking of which, this is one of the few areas in which I think Silicon Valley has actually been an improvement over older corporatism. Say what you want about Mark Zuckerberg, but his wearing a hoodie to his meeting with bankers before Facebook’s IPO was a strong symbol of the gap between freedom (to wear whatever one wants) and conformity (having to wear a suit all the time). It’s weird how only the rich and the poor (especially in the service sector), by and large, can escape the professionalism trap without any consequences.

Obsession with clothing in the workplace, enforced from above by management, is, I think, a symptom of what Paul Krugman has called, in his economics columns, “Very Serious People”: people who present airs of seriousness – Solving Big Problems, Having No Time For Nonsense – that belie their actual non-serious positions, which can run the gamut from fretting about Medicare funding during the Great Recession (Krugman’s classic example) to, similarly, worrying about *attire* in organizations that have the material resources to – if they wanted to – drop the patriarchal politics and enact enormous change for the better. That was a long sentence.
