“Predictions are hard, especially about the future.” – Yogi Berra, but possibly apocryphal
Imagine living in Europe circa 1900. Someone asks you to predict the state of the world in 1950. Are you going to be able to tell them confidently that the continent at that time will be divided into two spheres of influence: One dominated by the United States of America and the other by a successor state to Tsarist Russia modeled on a militarized version of Karl Marx’s philosophy, all of this having taken shape after the second of two catastrophic wars, the most recent one having ended with the U.S.A. dropping a pair of radioactive bombs on Japan that killed hundreds of thousands of civilians?
If your prediction was way off in 1900, you would have been in good company. Conventional wisdom at the time maintained that the economies of Europe were too integrated to ever lead to war, much less a conflict that would first be deemed The Great War and then renamed after its successor was even worse. But there was one realm in which the catastrophe of World War I was foreseen with startling clarity: literature. H.G. Wells’ serialized 1907 novel “The War in the Air” contemplated the immense resources being poured into then-unprecedented war machines (emphasis added; note the prophecy of a decaying Russia and a militant Germany at the end, and the hints of the eventual end of the British Empire throughout):
“It is impossible now to estimate how much of the intellectual and physical energy of the world was wasted in military preparation and equipment, but it was an enormous proportion. Great Britain spent upon army and navy money and capacity, that directed into the channels of physical culture and education would have made the British the aristocracy of the world. Her rulers could have kept the whole population learning and exercising up to the age of eighteen and made a broad-chested and intelligent man of every Bert Smallways in the islands, had they given the resources they spent in war material to the making of men. Instead of which they waggled flags at him until he was fourteen, incited him to cheer, and then turned him out of school to begin that career of private enterprise we have compactly recorded. France achieved similar imbecilities; Germany was, if possible worse; Russia under the waste and stresses of militarism festered towards bankruptcy and decay. All Europe was producing big guns and countless swarms of little Smallways.”
Why did Wells predict the carnage of World War I so accurately – and in a work of fiction, no less – while his peers were distracted by what they wrongly deemed a dawning golden era of global cooperation?
The question brings me back to an old saw of mine: Google’s obsession with science fiction, a genre Wells was instrumental in modernizing. The company’s ambitious “moonshots” division once required that new projects have some sort of basis in or resemblance to sci-fi. Efforts such as flying cars, robots, you name it: all of it was a computer science exercise in catching up to the fantasies of pulp writers from decades ago. Hell, the dummy-piloted taxi cab from “Blade Runner” (a movie released in 1982) is still far out ahead of the billions upon billions of dollars being spent on self-driving cars today by Google and its peers.
Google is not alone; the tech industry often comes off as highly certain of what the future will look like. Predictions about the dominance of automated vehicles, “the rise of the robots,” and so much more are collectively the fuel upon which a thousand “influencer” conferences run. Such events and the companies that participate in them are at the same time highly dismissive of the value of humanistic education, instead prizing “technical” knowledge above all else. Yet the irony of them fervently chasing ideas from storybooks persists.
At some level, we all seem to trust in the power of fiction to tell us what the future is, whether we trust the explicitly “futurist” visions of sci-fi, or the eschatology of books such as the Bible and the Koran. In regard to tech in particular, I was startled a few months ago to read Rana Dasgupta’s “Tokyo Cancelled,” a 2005 novel that sort of retells the Arabian Nights – as well as various fairy tales, such as Bluebeard – for the 21st century.
In one of its discrete stories, a man accepts a new job as an editor of people’s memories. He curates thoughts that they have (which have been captured via surveillance) and puts together a retrospective to present to them on individualized CDs. However, he has to be careful to edit out bad memories:
“We have short-listed around a hundred thousand memories that you can work from. They’ve been selected on the basis of a number of parameters – facial grimacing, high decibel levels, obscene language – that are likely to be correlated with traumatic memories….Apply the logic of common sense: would someone want to remember this? Think of yourself like a film censor; if the family can’t sit together and watch it, it’s out.”
Now here’s a Facebook employee, in 2015, announcing the introduction of filters into its On This Day service, which sends you a notification each day linking you to your photos and status updates from past years:
“We know that people share a range of meaningful moments on Facebook. As a result, everyone has various kinds of memories that can be surfaced — good, bad, and everything in between. So for the millions of people who use ‘On This Day,’ we’ve added these filters to give them more control over the memories they see.”
So while Dasgupta was essentially predicting an advanced Facebook service at a time when Facebook itself didn’t even exist yet (“Tokyo Cancelled” was written well before 2005, and Facebook itself was launched in 2004), what were the leading lights of tech predicting? Um…
-Steve Jobs in 2003: music streaming services are terrible and will never work
-Reality: in 2016, streaming drove an 8.1 percent increase in music industry revenue, and virtually everyone has heard of or used Spotify and Apple Music
The gulf between Dasgupta’s futurism and these now-laughable predictions brings me back to the vitality of the often-maligned cultural studies fields. I am reminded again and again of how we have to think about culture as a whole – not just scientific advances, which are undoubtedly important to human improvement, but also the flow of literatures, social mores, art, etc. – to sense where we are going and how we will get there. For example: Max Weber once positioned the Protestant work ethic – a totally incidental characteristic associated with adherence to a specific religion – as a central cog in the growing success of capitalism, which was reshaping Europe in his time. Yes, the Industrial Revolution and the creation of the steam engine, electricity, coal-fired ships, etc. were all vital to the creation of global capitalism, but would it have coalesced into a coherent social system without the cultural glue of Protestantism?
Just as Weber saw religion as an essential way to make sense of and corral new modes of industrial production, Dasgupta saw, by writing speculatively about it, the struggle to deal with information at vast scale (imagine all the CDs needed to contain the memories of the characters in “Tokyo Cancelled”) as a defining issue of the busy yet personally isolating environment of the modern international airport, in which the book takes place. When we give up on studying the humanities (and all “the channels of physical culture” whose underinvestment Wells bemoaned in the passage above), we create huge blind spots for ourselves and miss futures like these that should have been apparent to us all along, whether they sprouted from an Edwardian sci-fi novel or a 21st century fairy tale.
I haven’t published all year. That’s going to change: I have a few topics I’ll be looking at in the coming weeks to get back into things:
-How fiction is often the best predictor of the future, with a focus on Rana Dasgupta’s 2005 novel “Tokyo Cancelled”
-A new translation of the Aristophanes play “Wealth,” which I am producing with my former Greek language instructor. My focus on Aristophanes will also be a good chance to revisit one of my older posts about his play “The Frogs” and its treatment of literary criticism.
It seems absurd to think about, doesn’t it? After all, John McCain lost in a landslide to Barack Obama in 2008, winning a mere 46% of the vote while losing the Great Lakes Midwest and nearly the entire Eastern Seaboard, holding on to South Carolina and Georgia only by single digits. Obama even won electoral votes in three states – Virginia, Indiana, and Nebraska – in which Democratic presidential candidates had been shut out since LBJ wiped out Goldwater in 1964.
The Obama victory in 2008 had two important causes: 1) the incompetence of the Bush 43 administration, which culminated in the late 2000s financial crash and 2) the charisma and focus of Obama’s messaging. Obama knew how to work specific issues, such as opposition to Big Ag in Iowa and NAFTA in Ohio, better than any Democratic candidate since LBJ.
With these two drivers in mind, it’s actually not hard to imagine a situation in which McCain could have prevailed. I see three changes that could have enabled a McCain victory:
- The Democratic superdelegates, much like they did in 2016 with the Hillary Clinton and Bernie Sanders race, decide to heavily rig the primaries by suppressing media coverage of Obama’s insurgent candidacy, arranging odd debate schedules, and disproportionately pre-aligning themselves with one candidate. Clinton wins the primary, but fails to capture the “Hope and Change” spirit of 2008 and instead trots out something of similarly dubious value to “America is Already Great.”
- Meanwhile, McCain stays on message and distances himself from the Bush administration, reminding everyone of his primary challenge to the president in 2000 and his disdain for conservative institutions such as the Christian right. He picks a relatively low-profile swing state GOPer like John Kasich as VP instead of Sarah Palin, who alienated millions. Aligned against both Bushism and Clintonism, he manages to become the “outsider” despite being a member of the incumbent presidential party.
- The collapse of Lehman Brothers, which really propelled Obama’s candidacy over the top, doesn’t happen until December 2008, by which time the election is already settled. This is the hardest of all the changes to imagine, but bear with me.
So McCain defeats Clinton and enters office in January 2009. What next?
Many policies such as the stimulus bill would have still gone through on his watch, with the help of a moderate Democratic majority in both houses of Congress. Healthcare reform probably would not have happened, though.
The biggest mystery, though, is what would have become of the mortgage crisis he inherited from Bush. The wide-reaching economic despair that the financial meltdown wrought on the entire country would likely have continued for years as it did under Obama, assuming an even quasi-typical GOP response of tax cuts and bailouts for banks. It would have, in other words, become fertile ground for various dissent movements.
Indeed, this situation could have profoundly reshaped the 2010 midterms, which in reality turned out to be a landslide for the newly formed Tea Party. Would the Tea Party have even emerged without the monolithic target of the Obama administration and the Democratic Congress of 2009-2010, both of them overseeing the reeling economy? Would a Tea Party of the Left have sprouted up instead, perhaps spearheaded by Bernie Sanders (who toyed with the idea of running for president in 2012)? Would a 76-year-old McCain have been able to win re-election with a rickety economy and potentially gridlocked Congress in 2012?
Considering the political situation in the U.S. after 2016, it’s tempting to imagine that maybe the fallout from a McCain administration – with the GOP owning the tumultuous early 2010s – might actually have forestalled the party’s descent into madness and left the country on sounder institutional footing. But the price would have come at the expense of many people’s lives and rights, especially vulnerable populations such as the poor and the LGBTQ community, who might not have seen the particular advances of the Obama administration.
I plan to map out a few of these counterfactual scenarios about politics in future posts. This one, about 2008, is the one nearest to my heart, though, since it’s the first time I was ever excited about a presidential race, and it all happened at a pivotal moment in my life, when I was moving to Chicago for the first time. I cast an absentee vote for Obama in Kentucky. However, his first term coincided with the hardest years of my life, when I struggled to find work. I don’t think my life would have been easier under a McCain presidency, but sometimes I wonder about the implications.
Doppler radar-like, I could hear it coming and I could feel it passing just by. Two Massachusetts kids raced through the third floor corridor of our dorm, wordless but louder than silence. Stationary, I stared at the red and green mess on the TV. David Ortiz had just delivered his second walkoff hit in as many nights to force a Game 6 in the 2004 ALCS between the Boston Red Sox and the New York Yankees.
A week later: My then-Intro to Greek instructor (and now friend, for 12 years running) wondered to our class if everyone had gone crazy because of the lunar eclipse that October. Nope. The Red Sox had just sent all of Rhode Island into a frenzy by sweeping the St. Louis Cardinals to win the 2004 World Series, their first title since the year World War I ended. My English professor remarked that she knew then-Red Sox General Manager Theo Epstein’s mother, who was an instructor at Boston University.
“An ancient one”
Baseball is an old sport. The Chicago Cubs began operation in 1876 as the Chicago White Stockings. Before 2016, they had last appeared in the World Series in 1945 – a year before the NBA was founded. Their 1908 title predates both the NHL and the NFL.
Professional baseball’s 19th century origins have meant that there have been some epic championship droughts. The Red Sox did not win between 1918 and 2004, the White Sox from 1917 to 2005, and the Cubs from 1908 to 2016. Even the longest current drought – belonging to the Cleveland Indians, this year’s runners-up – dates to the Harry Truman administration.
During my years in New England in the mid-2000s, and especially during the fall of 2004, the anxiety expended on the Red Sox was heavy enough to send the university campus into frenzies of relief after each victory. Sometimes I thought of this seemingly throwaway quote from Moby-Dick:
“Almost universally, a lone whale proves an ancient one.”
Why did we – even me, an 18-year-old from Kentucky who grew up rooting for the Indians – think so much about this baseball team? Because they were basically alone in their futility, and it was some truly ancient futility, dating to a time when my oldest grandparents hadn’t hit double digits yet. The entire point of being a Red Sox fan was that you almost certainly had never seen a championship in your lifetime. Every game was life and (will we win before my) death. Generations of fans came and went, but that ancient whale – the Curse of the Bambino, traced back to the fateful sale of Babe Ruth from the Red Sox to the Yankees after the 1919 season – was very much alive, however immaterial.
“This grey-headed, ungodly old man, chasing with curses a Job’s whale”
But curses are ultimately just stories. The Red Sox curse broken during my first year in college was “only” the third longest at the time. Why was it so much more prominent than the longer White Sox and Cubs curses? I mean, Boston is a fraction of the size of Chicago, and both the Cubs and White Sox fanbases are substantially larger than the Red Sox’s.
The answer: The marketing around the Curse of the Bambino was flawless. It combined specific superstitions – the Ruth backstory, the epic collapse to the New York Mets in the 1986 World Series, clips of which had been shown endlessly on ESPN in the school cafeteria during that year’s Red Sox run – with Boston’s longstanding inferiority complex toward New York City. Being cursed, doomed to root for this always second-best team, was emblematic of being a New England sports fan.
I don’t know when the Red Sox drought in particular took on the momentum of a “curse,” but 1986 seems like a good candidate. Up 3 games to 2 on the Mets, the Red Sox were at one point just one strike away from a title. Instead, the Mets rallied for several runs to completely turn the series around. Infamously, with the game tied, a ground ball from Mookie Wilson slipped between the legs of Boston first baseman Bill Buckner, rolling into the outfield and sending the game-winning run home. Buckner was for years the face of Boston’s baseball failures.
It wasn’t his fault, though. Let’s say he grabs that ball. The game doesn’t end – only the inning does, still tied, and play moves to the 11th. The Red Sox would have batted again, but they also would have had to hold off the Mets in the bottom of the next inning, since the game was at Shea Stadium. And guess what: even with the loss, the Red Sox still had Game 7!
I wasn’t old enough to remember the Red Sox-Mets incident, but I did witness the team’s loss to the Yankees in the 2003 ALCS on Aaron Boone’s walk-off home run in extra innings. The Red Sox had led by 3 runs as late as the 8th inning, when Boston manager Grady Little – in Melville’s terminology, that “grey-headed, ungodly old man, chasing with curses a Job’s whale” – inexplicably allowed an exhausted Pedro Martinez to keep pitching to the Yankees, allowing New York to rally. After the game ended, I thought that maybe Boston was just always going to be second-best to New York, home of the two baseball teams (Mets and Yankees) that had prolonged years of New England sports misery.
“Saturn’s grey chaos rolls over me”
The next year, Boston broke through and then won again in 2007 and 2013. I was finishing college in 2007 and I don’t remember much excitement about that title relative to 2004. The Red Sox were just another team now. The White Sox also won during my undergraduate years.
Still, the Cubs drought persisted, that unrivaled Leviathan of sports curses. No titles since 1908. No World Series appearances since 1945. I moved to Chicago in 2008 and the Cubs won the division that fall. They were swept in the first round and a championship seemed further away than ever, with the drought guaranteed to surpass 100 years.
Since I began my time in Chicago on the South Side, I started out as a White Sox fan and never had many feelings about the Cubs. There was little doubt in my mind, though, that the Cubs were the dominant baseball team in the city in terms of fandom. When I moved to Irving Park in 2009, I became accustomed to the train full of Cubs fans arriving at the nearby Metra station from the suburbs to take the bus to Wrigley Field. The losing persisted.
Many times, I wondered why Cubs fans bothered, not yet having reached my realization that, like Red Sox and White Sox fans before them, the losing perversely made it fun, or at least unique, to be a Cubs fan. Like the Curse of the Bambino, the Billy Goat Curse was a triumph of marketing. Following that loss to the Detroit Tigers in the 1945 World Series, the Cubs were for decades second fiddle to the much more popular White Sox teams of the 1950s and 1960s (the 1959 World Series between the White Sox and the Dodgers was the best-attended World Series of all time). No one was particularly aware of the Cubs’ title drought even as it passed 70 years in 1978.
Everything started changing in the 1980s. WGN launched its superstation programming, bringing Cubs games into living rooms around the country. Longtime Cardinals and White Sox announcer Harry Caray became the face of the Cubs, bringing his tradition of singing “Take Me Out to the Ballgame” during the 7th inning stretch to Wrigley Field. Steve Goodman wrote “Go, Cubs, Go.”
In the 1990s, the legend was cemented by Caray’s famous “someday the Chicago Cubs will be in the World Series, and it might be a lot sooner than we think” remark in 1991, and the team’s dismal 0-14 start to the 1997 season, which would prove to be his final one. In 1998, the franchise was also at the heart of the race to break Roger Maris’s home run record, with Cubs outfielder Sammy Sosa hitting 66 home runs that year.
Even then, though, the Cubs’ drought, unlike the Red Sox’s, was not one well-known for near-misses and heartbreak. The team had appeared twice in the NLCS since LCSes were first instituted in 1969. They blew a 2-0 series lead to the San Diego Padres in the 1984 NLCS, which was then a best-of-5 format – a surprising, but hardly unheard of, feat. They were easily dispatched by the San Francisco Giants in the 1989 NLCS.
It’s true that in 2003 they were snakebitten. With a 3 games to 1 lead over the Florida Marlins – a team that had at one point that season been 10 games below .500 and was managed by the eccentric 72-year-old Jack McKeon – in the NLCS, they were shut out in Game 5, then blew a 3-0 lead in the 8th inning of Game 6 after a controversial incident with a fan trying to catch a foul ball. The Marlins won Game 7 and then their second World Series title by defeating the Yankees the next week.
Like the Buckner incident, the “Bartman game” (Game 6) has had many of its vital details airbrushed. The foul ball was probably not catchable. Cubs starting pitcher Mark Prior had thrown over 100 pitches by the 8th inning and unsurprisingly lost his control, walking that same batter with ball four skipping past the catcher for a wild pitch. Shortstop Alex Gonzalez botched a surefire inning-ending double play. The Marlins scored an astonishing 8 runs in that inning alone.
Like other “cursed” teams, the Cubs were ultimately victims of two contradictory trends, more so than these crazy one-off incidents:
- Until 1969, only one team from each league made the playoffs (and until 1995, only two). This limited a team’s chances unless it had the best record in its division or league. Many Red Sox teams were in fact shut out of the playoffs in the 1970s despite winning close to 100 games, since the Yankees were often better.
- But baseball was also expanding rapidly, with more teams making it harder to win a title in any given year. The Marlins only joined in 1993, for example. Expansion has meant that there are many teams (8, to be exact) that have never won a title and likely won’t for years. Already, the Rangers and Astros have existed for 50+ years with no World Series. The Mariners and the Nationals have never even won the pennant.
- What’s the difference between a 50-year-old pre-2016 Cubs fan and, say, a 50-year-old Milwaukee Brewers fan? Neither had seen a title in a lifetime (the Brewers have never won the World Series). The Cubs “curse,” compounded by lack of opportunity as well as expansion, lasted so long that it became impersonal. Only about 100 people on earth alive as of Nov. 4, 2016 were confirmed to have been born on Jan. 13, 1906 or earlier, which is likely the minimum for having been sentient the last time the Cubs won in 1908. It was as if they had never won at all.
Melville has another good quote for this too, one that I think of even more so than the others I have cited here:
“When I stand among these mighty Leviathan skeletons, skulls, tusks, jaws, ribs, and vertebrae, all characterized by partial resemblances to the existing breeds of sea-monsters; but at the same time bearing on the other hand similar affinities to the annihilated antichronical Leviathans, their incalculable seniors; I am, by a flood, borne back to that wondrous period, ere time itself can be said to have begun; for time began with man. Here Saturn’s grey chaos rolls over me, and I obtain dim, shuddering glimpses into those Polar eternities; when wedged bastions of ice pressed hard upon what are now the Tropics; and in all the 25,000 miles of this world’s circumference, not an inhabitable hand’s breadth of land was visible. Then the whole world was the whale’s; and, king of creation, he left his wake along the present lines of the Andes and the Himmalehs.”
To be a Cubs fan was to stand constantly amid the “mighty Leviathan skeletons” of their two titles (1907 and 1908) from the Theodore Roosevelt administration, seeing the “partial resemblances” of the dead ball era game to today’s multimillion dollar MLB juggernaut, thinking about their “incalculable seniors,” many of them long since perished waiting for a Cubs title, letting your thoughts bear you back to “that wondrous period” before Wrigley Field (the second oldest park in the majors, having been finished in 1914) was even built, indeed before time itself for anyone who is currently living, obtaining only “dim, shuddering glimpses” into what it must feel like to celebrate a Cubs title, and imagining an entire world that was yours for a day as you basked in your post-championship euphoria.
Those two kids running through the third-floor corridor were probably heading for the quadrangle. I didn’t follow them. But they were also running into the past, letting “Saturn’s grey chaos” roll them back to a reconstructed past they never lived through, a virtually ancient New England where the Red Sox were somehow the world champs. Would it feel that good this time, in 2004?
I had no rooting interest in the Cubs-Indians World Series this year. But once the game pushed into extra innings, I remembered 1997. That year, the Indians lost Game 7 in extra innings to the Marlins – exactly the situation in 2016, except against the Cubs. I had been rooting for the Indians all postseason that year, watching the games with my grandfather at his house. When the Marlins got the Series-winning hit, it felt like a gut punch; I’ve never really cared about any sports outcome as I did that one, when I was still an impressionable 11-year-old. This time came close since the circumstances were so similar, at least on the TV screen. I kind of miss getting so wrapped up in somewhat meaningless things like sports fandom now. I also missed him, and wondered what it would have been like for him to live to see all the curses – even the great Leviathan itself, the Cubs drought – finally end, with me 30 years old and sitting next to my dad on the couch in our North Side Chicago house.
Some writers write the same piece their entire lives. Sometimes, the repetition is fun for the audience. The abstract novelist Will Self has recycled the same characters and inimitable style across many of his books, yet the effect is never less than bizarre and original (admittedly, Self’s prose is abstruse and not everyone will be able to make it past a single page, but there’s no accounting for taste, etc.).
But then there’s the work of people like Joel Kotkin, who has for years written about how the dense urban areas of the U.S. (i.e., New York, Chicago, D.C., San Francisco) are in decline because so many people are moving to the sunbelt cities, which are cheaper because the urban cities are so expensive because everyone wants to live there…you can probably see how this argument is self-refuting.
Similarly, many tech bloggers like Ben Thompson of Stratechery keep trotting out the same arguments about “the Internet” in what feels like an interminable series of posts stretching all the way back to the advent of the World Wide Web (in reality, he’s somehow only been blogging since 2012). His pet argument is about how the Internet has ruined “distribution” as a business model, citing the decline of the newspaper industry in particular, which could not keep up once its printing presses, local advertising networks, and distribution trucks became enormous liabilities compared to the instantaneous delivery of Google and Facebook.
This reasoning ignores how reliant even companies such as Amazon – which Thompson cites as one of the key companies that took advantage of the “free” distribution of the Internet – are on logistics (in the case of Amazon in particular) and on massive, expensive, and environmentally corrosive data centers. What if those buildings packed with servers some day become as obsolete as the newspaper infrastructure that he regards as passé?
On Oct. 21, 2016, many major websites, including Spotify, Reddit, and Twitter, were down for hours as a massive distributed denial-of-service (DDoS) cyberattack overwhelmed these very same infrastructures. DDoS is a dense concept that is beyond the scope of this blog, but to explain it as simply as I can: It involves machines (PCs, servers, anything with an Internet connection) sending tons of useless requests to websites. This flood of traffic makes it impossible for the targeted websites to process legitimate requests. For the layman, this means that you try to go to “twitter.com” and instead you get an error and the page never finishes loading.
It is hard to imagine how a similar attack would play out on “legacy” communication networks like the postal system or the plain old telephone grid. I mean, imagine if the post office got so much junk mail each day that it couldn’t even deliver any of your mail, or anyone else’s, and you’re close to grasping the insanity of a DDoS attack. The Internet is uniquely exposed to danger in this way.
A key enabler of the Oct. 21 attack was a botnet, which is simply an interconnected set of machines that have been hijacked and programmed to do harm, typically in the form of flooding websites with bogus traffic. As more and more devices become connected – e.g., home appliances, vehicles, etc. – the potential pool of enslavable botnet machines grows, making ever-more devastating DDoS attacks possible.
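The crowding-out effect of a flood attack can be sketched with a toy model. To be clear, this is not how any real attack or defense works – the function and the numbers below are my own inventions, purely for illustration – but it captures the core arithmetic: once junk requests dwarf a server’s capacity, legitimate users are squeezed out almost entirely.

```python
def simulate_server(capacity, legit_requests, bot_requests):
    """Toy model of one tick of server time.

    The server can answer `capacity` requests per tick; anything beyond
    that is dropped. Served requests come out of the mixed queue, so
    legitimate traffic gets through only in proportion to its share of
    the flood. Returns the fraction of legitimate requests served.
    """
    total = legit_requests + bot_requests
    if total <= capacity:
        return 1.0  # no congestion: every real user gets through
    return capacity / total  # congestion: service degrades for everyone

# A normal day: 900 real users plus a trickle of stray bots.
print(simulate_server(capacity=1000, legit_requests=900, bot_requests=50))

# A botnet flood: the same 900 users buried under a million junk requests.
# Now barely one real request in a thousand gets answered.
print(simulate_server(capacity=1000, legit_requests=900, bot_requests=1_000_000))
```

The defender’s dilemma falls out of the same arithmetic: you can raise the capacity, but a botnet of hijacked appliances scales its flood far more cheaply than a website can scale its servers.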
I only veer into the DDoS case to emphasize that “the Internet” is A) not new and B) not necessarily permanent. Commentators such as Thompson still speak of the Internet in terms of “revolution,” with prose treating it as something new, when it has existed for decades. The World Wide Web predates NAFTA and the Super Nintendo Entertainment System. Wi-Fi was approved by the IEEE the same year Bill Clinton was sworn in for a second term. Ethernet was first commercialized while the Summer Olympics were being held in the Soviet Union. Someone who joined Facebook on its first day of availability would be at least 30 years old now. The Internet is old.
As for permanence, I’m talking not so much about how websites can go down or be deleted forever, but about how the Internet itself – as a global, homogeneous system with Americentric features – may not be long for this world. The Oct. 21 DDoS attack targeted U.S. services, and with a vast, mature pool of devices now out there to enlist into botnets – again, the result of decades of Internet existence – more events like it, resulting in entire days of major websites being unavailable, are almost inevitable. Combating them could be costly to the point of making routine website visits onerous. Enjoy the Internet, because like anything else it won’t last.