Great post up by Lukas Mathis, responding to John Gruber, about the 3DS and the temptation of pigeonholing it as a mobile device:
“I don’t think most people buy portable gaming systems with the intention of regularly carrying them in their pockets. I don’t think they ever did. I don’t remember knowing even a single person who routinely carried a portable gaming device in his or her pocket.”
I got my first Game Boy in August 1998. It was a Game Boy Pocket – apparently, Nintendo was giving all of its Tetris-playing, Link’s Awakening-loving gamers the green light to start carrying their Game Boys everywhere. That was feasible, as long as the gamer also wanted to pack some batteries, game cases, and maybe a Game Boy Printer, too. The Game Boy ecosystem was huge, crowded with peripherals and palm-sized cartridges; it did not lend itself to mobility as well as even pre-iPod CD players and disc-carrying books, and in retrospect very little about it foreshadows what the breathless press now calls mobility, i.e., carrying a consolidated networked device.
The “Pocket” moniker was no declaration of revolutionary mobility – it simply signaled that the gargantuan first-gen Game Boy had been succeeded by something smaller but no less capable. Nintendo is not a company given to consolidation for its own sake, or even for the sake of forcing new technologies on its users (unlike Apple) – the slim Pocket and its upgraded partner, the Game Boy Color, gave way to the stockier widescreen Game Boy Advance, whose backward compatibility meant that there was now even more to carry around. The DS similarly introduced an easily losable stylus and backward compatibility with the Advance. These devices were not even trying to be smaller or more amenable to “on-the-go” players with limited attention spans, or to IT execs who think mobility will solve world poverty.
The 3DS, released in 2011, is often compared to smartphones and tablets. The narrative goes: do-everything touchscreen devices have obviated the need for dedicated devices, and so the 3DS (and presumably the PS Vita) is doomed. This line of reasoning betrays ignorance of the dedicated handheld market, a unique space that only Nintendo has ever really dominated. To see how the (3)DS is different from a smartphone or tablet, it’s necessary to look at one of its quintessential offerings, the Zero Escape series of adventure games/visual novels.
Desktop adventure gaming declined long ago, but the genre has gotten new life in the last decade thanks to studios like Quantic Dream and to third-party DS developers. Games like Last Window have demonstrated the DS’s unique ability to create an immersive, almost book-like experience – that game in particular required that you hold the DS upright, featured lots of text to read, and one of its most stunning puzzles could only be solved by closing the DS’s clamshell. However, Cing (the studio that made Last Window and its prequel, Hotel Dusk) closed its doors several years ago. In contrast, Aksys Games, the makers of Zero Escape, scored one of the original DS’s most unlikely hits with Zero Escape: Nine Hours, Nine Persons, Nine Doors, and had similar success with the beautiful 3DS/Vita sequel, Zero Escape: Virtue’s Last Reward. Buoyed by strong sales, a third game is apparently in the works.
Zero Escape is like little else in mainstream gaming, either on desktop or mobile. Most of your time will be spent reading; every now and then, you might solve an escape-the-room puzzle. Despite having little action and appearing on a traditionally family-friendly platform, it is also incredibly violent and nihilistic. Without spoken dialogue (at least in much of the first game), it’s like a creepy, interactive silent movie. Or, as I alluded to earlier, a book – and here we may see exactly where the (3)DS resides in the device landscape.
The Zero Escape games, like the best ones on the platform, are games most easily played at home, where players are not killing a set amount of time – waiting to reach a certain subway stop, say, or to finally be called in by the doctor’s receptionist. Those scenarios are perfect for smartphone/tablet games that can be suspended and resumed at any time, but the 3DS usually works better at home or with time to spare.
As Mathis points out, the home is an environment in which consumers typically favor dedicated devices, rather than the convenience of consolidation. If they didn’t, then PCs would have long ago cannibalized TVs, music players, game consoles, streaming boxes and much more. Non-consolidation also means that devices like the Kindle Paperwhite, which in theory should be under tremendous pressure from hi-res tablets, remain favorites even of Nintendo pessimists like MG Siegler.
With its sophisticated reading capabilities and false front as a “mobile” device, the Paperwhite, rather than Android and iOS hardware, may be the best comparison for the (3)DS. I’ve been skeptical about how long Amazon would continue selling reading-first/reading-mostly devices, but like the DS, they appear to serve a sizable, loyal audience that likes dedicated functionality. That is easy to overlook when one’s perspective is mostly limited to the rapid iteration and refinement of phones and tablets, which follow different lines of logic and occupy a largely separate market, at least for now.
Farhad Manjoo figures that the same-day delivery services AmazonFresh and Google Shopping Express are the future of shopping, a future in which people won’t have to leave their houses for produce and other groceries:
“After using it for a few weeks, it’s hard to escape the notion that a service like Shopping Express represents the future of shopping. (Also the past of shopping—the return of profitless late-1990s’ services like Kozmo and WebVan, though presumably with some way of making money this time.) It’s not just Google: Yesterday, Reuters reported that Amazon is expanding AmazonFresh, its grocery delivery service, to big cities beyond Seattle, where it has been running for several years. Amazon’s move confirms the theory I floated a year ago, that the e-commerce giant’s long-term goal is to make same-day shipping the norm for most of its customers.”
WebVan was certainly one of the greatest disappointments of its time and emblematic of the dot-com mindset. It wanted to do same-day delivery, but it couldn’t make the economics or the customer service work. But I wonder whether Manjoo knows what the past of shopping really looked like.
Milkmen and icemen once delivered consumable goods right to Americans’ front doors. Paperboys (their declining profession once immortalized and coincidentally eulogized by Atari) delivered newspapers. 1920s-era American apartment buildings like mine took this economic model so much for granted that they built special delivery doors into their walls, so that milkmen in particular could leave deliveries inside them. This type of service delivery, combined with the often-proposed “Internet of things” (a physical, tangible network of networked appliances and devices), is not so much something truly novel as a revival of old economic models.
Manjoo is better than most in at least realizing that the future often looks a lot like the past, even if he does cordon off his perspective to tech ventures. But these instances are good reminders that knowing at least some history lets you see the future much more clearly.
There’s been a recent surge in attention given to a relatively obscure British journalist’s thoughts on headline writing. “Betteridge’s Law” is the informal term for the argument that any (usually technology-related) headline that ends in a question mark can be answered “no.” Betteridge made his original argument in response to a TechCrunch article entitled “Did Last.fm Just Hand Over User Listening Data to the RIAA?”
The reason that so many of these rhetorical questions can be answered “no” is their shared reliance on flimsy evidence and/or rumor. The TechCrunch piece in question ignited controversy and resulted in a slew of vehement denials from Last.fm, none of which TechCrunch was able to rebut with actual evidence. John Gruber also recently snagged a prime example in The Verge’s review of Fanhattan’s new set-top TV box, entitled “Fan TV revealed: is this the set-top box we’ve been waiting for?”
So we know what Betteridge’s Law cases look like in terms of their headlines, which feature overzealous rhetorical questions. But what sorts of stylistic traits unite the bodies of these articles? Moreover, why do journalists use this cheap trick (other than to garner page-views and lengthen their comments sections), and what types of arguments and rhetoric do they employ in following up their questions? I am guilty of writing a Betteridge headline in my own “Mailbox for Android: Will Anyone Care?,” which isn’t my strongest piece, so I’ll try to synthesize my own motivations in writing that article with trends I’ve noticed in another recent article that used a Betteridge headline, entitled “With Big Bucks Chasing Big Data, Will Consumers Get a Cut?”
Most visibly, Betteridge’s Law cases employ numerous hedges, qualifiers, and ill-defined terms, some of which are set off by italics or scare quotes. By their nature, these articles are almost invariably concerned with the future, which explains the feigned confusion inherent in the questions they pose. That is, they act unsure, but they have an argument (and maybe even a prediction) to make. Nevertheless, they have to hedge because the future has not happened yet (the “predictions are hard, especially about the future” syndrome) or, similarly, resort to conditional statements.
I did this near the end of my Mailbox article, saying “This isn’t a critical problem yet, or at least for as long as Google makes quality apps and services that it doesn’t kill-off abruptly, but it will make life hard for the likes of Mailbox and Dropbox.” My “yet” is a hedge, and my “it will” is the prediction I’m trying to use to establish more credibility. In The Verge article linked to by Gruber, the authors say “IPTV — live television delivered over the internet — is in its infancy,” strengthen that with “Meanwhile, competition for the living room is as fierce as it has ever been,” and then feebly try to make sense of it all by saying “At the same time, if it matches the experience shown in today’s demos, Fan TV could win plenty of converts.”
Delving into the aforementioned article about “big data,” we find similarly representative text:
- “You probably won’t get rich, but it’s possible”
- “But there’s a long road ahead before that’s settled”
- “Others aren’t so sure a new market for personal data will catch on everywhere”
- “not as much is known about these consumers”
- “That’s a big change from the way things have worked so far in the Internet economy, particularly in the First World.”
- “big data”
This headline is really a grand slam for Betteridge’s Law. Simply answering “no” means that you believe that corporations specializing in data-collection won’t be all that generous in compensating their subjects for data that they’ve possibly given up without even realizing that they’ve done so. After all, lucid arguments have been made about how Google in particular could be subtly abetting authoritarianism via its data collection, which if true would constitute a reality directly opposed to the fairer, more democratic world proposed by advocates of data-related payments. To the latter point, Jaron Lanier has argued for “micropayments” to preserve both middle-class society and democracy in the West.
The article examines mostly nascent data-collection and technology companies and ideas whose success or failure is so far hard to quantify and whose prospects remain unclear. Accordingly, the author must use filler about the weak possibility of becoming rich, the cliché of a “long road ahead,” and the admission that many consumer habits are a black box and that maybe not all consumers are the same. Even the broad “consumers” term is flimsy, to say nothing of the nebulous term – “big data” – that the article must presuppose as well-defined (I have argued that it is not so well-defined) to even have a workable article premise.
For additional seasoning, the article resorts to the outmoded term “First World” (a leftover from the Cold War) and the ill-defined “Internet economy.” I think I know what he means by the latter: the targeted-ad model of Google, Amazon, and Facebook. But the vacuity of the term “internet” leaves the door open: would Apple’s sale of devices that require the internet for most functions count as part of the “internet economy,” too, despite having a different structure in which users pay with money rather than data?
Like many Betteridge-compliant headlines, the accompanying article isn’t a contribution to any sophisticated discussion of the issues it pretends to care about. Hence the teaselike question-headline; Betteridge’s Law cases pretend that they’re engaging in high discourse, perhaps in the same way that the valley girl accent – riddled with unusual intonations and cadences that throw off the rhythm of a speaker’s sentences and draw attention away from content – pretends it is partaking in real conversation. Perhaps we really should bring back the punctus percontativus so we can see these rhetorical questions for what they really are.