

Facebook’s mobile Web app is better than its Android app

Have you ever opened up your browser and gone to gmail.com…on your iPhone? Ever felt the urge to load the Instagram website in Chrome, on your Nexus 5?

Web protocols – hello, HTTP – may be central to how we experience mobile software, but mobile browsers have lost the race to hybrid and native apps. Whereas one could conceivably spend an entire workday on a Mac or PC inside Safari or Chrome, the same workflow would be unbearable on an iOS or Android device.

Mobile browsers seem like library stacks – you probably wouldn’t wander into one unless you had to retrieve something, whether a book or a URL. There’s a reason for all those annoying “download our app” banners when you stumble upon some TV station’s mobile website. Apps are better.

If a service offers both a mobile website/Web app and an app that can be downloaded from an app store, I just about always go for the latter. Its hybrid/native app – i.e., the one from the App Store and Google Play that has its own springboard icon and takes up the full screen when launched – is almost certainly faster and definitely more immersive.

Everything from Yelp to the various Reddit clients is much better than the corresponding mobile Web experience. On Android, the incentive to “go native” is even greater, since some links can be opened in apps rather than Chrome (e.g., click a URL that goes to Google+ and opt to open the Google+ Android app rather than the website).

But there seems to be at least one big blue exception to the normal Web versus hybrid/native app rule.

Facebook’s mobile apps, especially for Android, have always been a mess. The early versions were just Facebook.com repackaged with multiple hamburger buttons. While the visual style has gotten flatter with the years and the app has been rebuilt with native code, it’s still got a lot of flaws:

  1. It’s a big battery drain. Deleting Facebook alone is probably the best generalist advice for saving battery on Android other than just keeping the screen off all the time.
  2. It’s inefficient. Photos can take forever to load on even modest network connections, with performance that lags behind even bandwidth-intensive apps such as Netflix.
  3. It’s inconvenient. The recent separation of Facebook Messenger means that Facebook is not one but increasingly many apps taking up space and battery on your phone.

Facebook’s Android app is what happens when the immovable object of a massively popular PC-era app meets the unstoppable force of mobile app usage. It manages to hold onto all the legacy features – post filtering, sponsored posts, tons of menus – while taking on chic 2010s aesthetic cues. The result is something that is at best mercifully unobtrusive and at worst crashy and unusable.

There’s plenty of irony in Facebook’s operations, including its devotion of massive computer science expertise to the art of ad-clicking. With app design in particular, it’s amazing that the lowest-risk way to use Facebook on Android in 2014 is via Chrome.

The Facebook Web app, especially on a big screen Android phone, has several advantages over its cousin:

  1. It leaves you alone. There are no push notifications, though you can still check your notifications and your messages through the familiar row of icons at the top. Once you leave the site, you’re basically out of Facebook’s world.
  2. Facebook Messenger? It’s right there in that same row and it actively updates with new messages without requiring a page refresh.
  3. On middling Internet connections, it seems to load media better than its Android counterpart. I’ve seen big rows of gray boxes in the place of photos on the Android app, whereas the website on the same network is fine.

Granted, going to Facebook.com on an LTE-equipped phone feels like bringing modern hardware back to 2006. But in every way it feels like the healthiest choice for the phone and for my own sanity.


Don’t Step Out of Your Comfort Zone

Where did the “comfort zone” come from? It seems of recent vintage, although the combination of words may have been influenced by thermostat marketing. From the 2004 New Yorker story “The Comfort Zone,” chronicling an episode from 1970s America:

“My father came home on cool nights to complain about the house’s ‘chill.’ My mother countered that the house wasn’t cold if you were doing housework all day. My father marched into the dining room to adjust the thermostat and dramatically point to its ‘Comfort Zone,’ a pale-blue arc between 72 and 78 degrees.”

The story traces a back-and-forth between the couple. What’s striking is how both the father and the mother have good arguments for setting the “right” temperature. A current reading would probably scold the father for being intransigent and praise the mother for her support of hard work.

Should he have stepped outside (the most common phrase used with “comfort zone”) his comfort zone? The house was cold to someone who had been in the heat, but normal for someone who had been working in it all day. Instead of dogma about “your comfort zone,” perhaps we should see that what’s comfortable depends on the person’s situation, and that having comfort – a temperate house, a relaxing chair from which you can reset your brain by staring off into space – is not always bad or “safe” (regrettable that this word has negative connotations now).

I thought of this New Yorker vignette when I recently visited Mammoth Cave in Brownsville, Kentucky. It was in the high 80s F outside, but once we neared the cave’s entrance, cool air wafted over us. The cave itself was a constant 54 F. After working up a sweat from walking and ducking through the passages, we barely noticed the temperature that just minutes ago had seemed cold.

Maybe we had stepped outside our etc. But that 54 F became comfortable too, and it felt good to go back to the 80 F temps from before. We stepped back inside the comfort zone, one could say, and it felt good. Despite having trekked through a 54 F cave, I wouldn’t say that it is now within my “comfort zone,” though – it would still feel weird at first, and I wouldn’t want to stay forever.

These days, the “comfort zone” is a dark place to peddlers of corporate management theory and lifestyle coaching. It’s not hard to see why – you need to get out of it before buying into their programs, which presuppose that:

  1. everyone benefits from the new stimuli of doing “uncomfortable” tasks
  2. doing “uncomfortable” things is voluntary – a matter of “want to” not “have to”
  3. it’s possible to fundamentally change someone’s attitude, usually from a “negative” to a “positive” outlook.

No. 2 sticks with me. I think of how many people involuntarily venture beyond what a life coach would euphemistically call their “comfort zones” just to survive – working a job for which they’re overqualified, having to apply for unemployment insurance, raising a child for the first time. They’re constantly having new experiences that aren’t comfortable and that they may not want to repeat, yet through the lens of “comfort zone” behavioral analysis or Internet discussion they would likely be dismissed as “ordinary” people, “trapped” in 9-5 work and unwilling to “step outside the comfort zone” by skydiving, doing improv comedy, or being put in charge of some intramural game.

Indeed, the stakes of voluntarily ditching the vaunted “comfort zone” through such activities are often low. I can remember doing many such extroverted activities to feel more comfortable, only to see the same feelings of anxiety crop up again over the years in similar situations. The experiences were not transferable, and I came to question their value as anything other than lingo. Plus, it’s ironic that proponents of expansion beyond the “comfort zone” express their views through a cliche. The phrase is so meaningless that even uncomfortable actions that are familiar – living with pain or sadness – qualify.

For the father coming home in the 1970s, having a literal Comfort Zone on his thermostat was probably a relief from the pressures and annoyances of work. It’s ok to rest, to slip back into what we’re comfortable doing sometimes; otherwise we’re tasked with reinventing ourselves around the clock, “stepping it up” to meet some unfulfilling ideal, and heading toward burnout.

 

Deadmau5 and the reimagining of the album

The album: From LP to SoundCloud
The album as an art form has been under escalating artistic, economic, and political pressures for decades. With the decline of vinyl LPs in the 1980s, creative possibilities such as themed sides and run-out grooves were lost, swept away by digital audio. Bonus tracks, remixes, live versions – the whole lot – were appended to already exhausting CD run times, producing an experience increasingly at odds with the ideal of the album as a digestible, coherent statement. It was the musical equivalent of every novel suddenly becoming Infinite Jest (that is not a compliment).

The CD was overtaken by the MP3, a simple file with no close association with any larger artistic system, at least not in the way of a vinyl groove or a Red Book audio track. The MP3 could go it alone, be shoved into a playlist with anything else, mislabeled (the early days of Napster took one Pitchfork writer for a ride by passing off old Pavement material as Weezer’s then-unreleased Green Album), or shuffled off onto an iPod or smartphone.

Now even the MP3 is bowing to streaming services such as Spotify and SoundCloud. Music has become something to experience, not own. In this sense, it has come full circle, returning to its millennia-old state as something that individuals and groups absorb in a continuous stream, without the discrete packaging of an album or single. The key difference, though, is that the user has more curatorial power than ever – it took the decline of the album to make everyone her own album producer and sequencer.

As someone who listens predominantly to albums, I have found the music industry’s direction over the past three decades dispiriting, but also liberating. What’s telling about the shift to streaming is that it appears to have affected EDM more acutely, and earlier, than rock or even hip-hop.

The idea of an artist album in trance, house, techno, or any EDM field was always a lot different than in other genres – an artist might go years, producing tons of remixes, mixtapes, and podcasts, without putting together a “proper” album of original, deliberately sequenced music. Look at Sasha and Michael Cassette for but two examples. EDM artists, it seems, were just waiting for the breakthroughs in consumption and technology that would turn their habits into the freeform yet stamped listening experiences enabled by the likes of SoundCloud and Pandora.

Deadmau5: At the frontier of the album’s evolution
No artist in EDM has been as publicly and repeatedly conscious of the genre’s complex relationship with form as Deadmau5. His albums, if you can call them that, have all borne cheeky, inscrutable titles, from Random Album Title to <album title goes here> to For Lack of a Better Name. None of them were what a rockist might think of as an album, often recycling previously released material and using segues to disguise an absence of cohesion. Deadmau5 himself has also been at the center of recent debates about authenticity in EDM, a blanket genre going up against decades of rock-centric critical skepticism of electronic music’s value.

Leave it to Deadmau5 to expose one of the core contradictions of EDM: while mixtapes and similar media are often continuous, with one song fading into the other, this seamlessness does not play the same role as it does in rock, a genre in which the segue (think The Beatles’ Sgt. Pepper) is often a way of making a Big Artistic Statement. In EDM, it’s just mechanics – an experience might run through all sorts of disparate songs, but still keep the listener gripped with nice transitions. Mat Zo’s “Mat Zo Mixes” on SoundCloud, which span drum ‘n’ bass and Anjunadeep releases, exemplify this exact ethos.

There are plenty of EDM artists still dedicated to the album experience, though. Above & Beyond’s recent Acoustic release is an example of a trance artist taking up the classic rock trope of an unplugged set to confer seriousness and artistic depth. Now, Deadmau5 himself is on the eve of releasing a 25-track double album with a cute C programming-inspired title, which he claims represents the first work he’s made that can “even be called an album.”

Is Deadmau5 injecting traditional album design into the anti-album EDM world? Earlier this year, he purged his massively popular SoundCloud feed. His albums have been getting progressively more immersive and deliberate, with 4×4=12 and <album title goes here> both showing traces of long-player logic despite their castoff titles.

While(1 < 2): Deadmau5’s Biggest Statement So Far
Deadmau5’s latest album, While(1 < 2), is both his most forward-looking and old school effort. It has more genre exercises than ever before – minor-key piano interludes, contemplative acoustic guitar, vocoder experiments, and 90s/early 00s alt-rock angst – to go alongside some of the most distinctive hooks (“Phantoms Can’t Hang,” “Avaritia”) of his career.
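For the non-programmers, the title’s joke is worth unpacking: in C, the expression 1 < 2 is always true, so a while loop guarded by it never exits. A minimal sketch (the print line is my own illustrative filler, not anything from Deadmau5):

    #include <stdio.h>

    int main(void) {
        /* 1 < 2 always evaluates to 1 (true) in C, so this
           loop never terminates – an infinite loop hiding
           behind an innocent comparison. */
        while (1 < 2) {
            puts("next track...");
        }
        return 0; /* never reached */
    }

An infinite loop is a fitting emblem for a 25-track double album.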

Its unmixed version, clocking in at an astonishing 139 minutes, resists flow and momentum, almost deliberately. There’s a remix of the ancient NIN track “Survivalism” right next to the piano balladry of “Silent Picture.” Hook-drenched opener “Avaritia” segues into the barely-there “Coelacanth I,” which yields to a remix of How to Destroy Angels’ “Ice Age.” While the influence of Trent Reznor and Atticus Ross is undeniable – both in Deadmau5’s apparent love of “The Social Network” soundtrack and in the two Reznor-related remixes that sit next to the 20+ originals – While(1 < 2) has even more in common with Aphex Twin’s 2001 oddity Drukqs, another double album chock-full of discrete genre exercises from drum ‘n’ bass to classical (the unforgettable “Avril 14th” became the basis of Kanye West’s “Blame Game”).

Strangely, While(1 < 2) becomes an album through this resistance to the easy segue and undifferentiated experience of the mixtape – and, one could argue, of latter-day rock and pop albums, which have taken the coherence mandate of Sgt. Pepper and its successors to the extreme by making everything sound the same (uniformly loud, vaguely dance-y, consistently exhausting). The tracks on While(1 < 2) each call out for individual attention – why else put the title-says-it-all “A Moment to Myself” as a prelude to the upbeat, hookier “Pets”? Yet its epic length, willfully tempting short attention spans, begs for the album to be put on in the background as something that doesn’t have to be touched for more than two hours. It can demand careful attention or mere acquiescence, depending on the listener’s situation. Time to have another go at it.

Delta Airlines interview vs tech and education interviews

Job interviews are increasingly weird. Striking up an in-person, phone, or Hangout conversation with someone at the company could lead to a question about climbing out of a blender, or an enervating haze of “culture fit” queries. The interviewee is still likely to get the classics – ‘tell me about yourself’, ‘what’s a challenge you overcame?’, ‘any questions for me?’ – but she also can’t expect that this particular interview will be the final or even penultimate stop on her journey to getting the position.

Today’s interviews are logical extensions of job postings written in epic jargon. Doubtless many candidates are discouraged from applying after seeing a wall of bullet points about “multitasking” and being “detail-oriented,” or the occasional all-caps phrase. That’s probably the aim of whoever is offering the position. For the candidates who do make it past the submission phase, a potentially weeks-long interviewing escapade awaits.

Zappos stands out in this context for bypassing this byzantine norm in favor of months of (unpaid) interaction with current Zappos employees via social media. It seems extreme, but the process is just a more codified version of what candidates already had to do: message, chat, and visit with potential employers until the latter call time.

Most of my interviews have been at tech firms and at schools (all of my jobs have, naturally, been at one such institution). On a few occasions I have been on the other side of the table. By and large these interviews have been “normal” in that they’ve been protracted.

But to get a real feel for what interviewing today is like, look at airlines. Flight attendant is obviously a desirable position, with regular travel around the world not a luxury but a part of the job. With thousands of applicants to deal with, airlines like Delta have come up with a process that makes becoming a flight attendant more difficult than being admitted to Stanford.

Here’s how interviewing at Delta Airlines compares to interviewing at Uber and a few other employers. The details about Delta are from someone else; everything else is from my own experience.

Delta Airlines
The job: Flight attendant
The process: online application; 2x phone interview; video interview; face-to-face interview; background check

Good luck. A standard online application is followed by at least one – and, in the case I studied, two – phone interviews. The first is essentially a screening to make sure the applicant has the baseline educational and professional background. The second, if it occurs, is full of situational questions. These are not the theoretical (and useless) “Google questions” often used as gotchas during phone screenings, but questions with right answers – no gold stars meted out for “thinking outside the box.”

If you make it past the application and the phone calls, a video interview awaits. Last summer I interviewed at and was admitted to Dev Bootcamp (I later withdrew; it wasn’t the right fit), and the video interview was by far the most stressful part of the process. This is worse. Whereas the DBC video was just one component of many in the initial application, this one has a lot more riding on it – you’ve already made it this far, and getting to Atlanta just requires recording a video of yourself answering some questions.

It must be treated like a real face-to-face interview (with the camera as the other ‘face’) and you must dress formally and record your video in an aesthetically appealing setting. My source did the interview wearing a shirt and tie, in a living room with lots of books. The video is time-constrained; all questions should be answered within just a few minutes.

Anyone industrious enough to make it this far must clear two more hurdles: the on-site Atlanta interview (called a face-to-face or F2F) and a background check. The F2F is undoubtedly the epic process’s toughest stage, its equivalent of the “test of the champion.” Arriving on site can instantly make a candidate feel like all is lost: just look at all those smiling, professionally dressed other people! Overcoming that anxiety is probably the toughest step. The actual interview – a two-on-one setup with lots of situational questions – can be prepared for with adequate Internet research and rehearsal.

Uber
The job: Community manager
The process: Online submission; creative exercise; phone interview; on-site interview; written exercise; party attendance

This process wasn’t difficult so much as it was inscrutable. No part of it made me feel like I was getting closer to the prize. The initial submission is standard (resume, cover letter) but nothing else is. Note that my experience is from late 2012/early 2013, and they’ve probably tightened up the process since then.

The creative exercise requires developing various marketing and promotional initiatives for Uber and takes hours to complete. It’s basically unpaid consulting. If it passes muster, then it’s on to a somewhat straightforward phone screening – no gotchas or case questions.

The on-site interview is extremely informal. It’s the dreaded “culture fit” round, and there’s not much you can do to control your fate here. They either like you or think that, for whatever reason, you won’t fit in.

Somehow, that ends up not being the end of the road. I had to do a written analysis of a ride I took, as well as attend a holiday party. Nothing materialized, and I ended up waiting weeks to get the TBNT (“thanks, but no thanks”) letter.

Teaching
Job: Teacher
Process: online application, profile creation, assessment, on-site networking, video interview

Teaching runs the gamut. I once got a part-time professorship basically via email, making it the easiest job interview process ever (there wasn’t one, really). On the other hand, teaching positions, especially in English, can be very competitive.

Every school is different. But for high-performing ones, expect tests, video interviews and on-site workshops, all before even getting to a demo lesson or reference check. Like the other jobs here, it can be tough to know if you’re doing well, since much of the process – for all of its semblance of objectivity and metrics – is about fit and finish.

What would it take for Google to decline?

A recent thread in /r/AskReddit posed this very question. The comments were revelatory, with plenty of resigned jokes about the heat death of the universe, antitrust proceedings, and the (unlikely) rise of Bing as the only ways for Mountain View’s best to be bested:

  • “The first and most obvious way to cause a decline might be from some sort of anti-monopoly judgement being levied on them causing say for example the search engine portion of google, to be split from the part of google that manages android and chrome.” – /u/icantrecallaccnt
  • “The heat death of the universe. Though they’ll probably buy some quirky startup that’s figured out how to reverse entropy and remain in business forever.” – /u/SoresuMakashi
  • ‘The Big Bing’ – /u/tenillusions
  • “If Chinese mega-sites and portals decide to really take expansion outside of their borders seriously. Baidu, Tencent et al are well on their way.” – /u/Tuxedo_Superman

Granted, there were some thoughtful responses that probed Google’s complacence and ongoing alienation of its important demographics (advertisers, developers – note: not end-users). But I think the issue isn’t so much that Google has gotten fat and happy and turned into Microsoft 2.0 (riding Search, Maps and Gmail the same way Ballmer et al rode Windows XP and Office). Rather, the issue is that Google is desperate.

Odd word choice? Not really – Wired picked up on it recently, too, with the keen observation that the middling Google+ has left Google clinging to ever-declining per-click costs while trying to find something – anything – to help it keep pace with rivals such as Facebook, which, despite having nowhere near Google’s profits, has arguably staked out a better slice of smartphone attention spans. I have often made fun of Facebook for being essentially a channeling of some of the best talents in computer science toward the end of designing hamburger buttons and click-by-accident advertising, but I admit that its new mobile strategy – discrete offerings for messaging, news, etc. – amplifies the threats to Google’s Web-centric business model that have always resided in walled-garden apps.

Still, you’d be hard-pressed to find much appetite in the mainstream technology media for examining Google’s weaknesses. In contrast, Apple – the world’s most profitable company – is often construed as facing near-constant extinction if it doesn’t, say, release a smart watch in the next two months. The inimitable Horace Dediu succinctly broke down the double standard in his post, “Invulnerable” –

“I suspect the absence of scrutiny comes from Google being seen as an analogy of the Internet itself. We don’t question the survival of the Internet so we don’t question the survival of Google — its backbone, its index, and its pervasive ads which, somehow, keep the lights on. We believe Google is infrastructure. We don’t dwell on whether electric grids are vulnerable, or supplies of fuel, or the weather.”

I would go a step further and say that Google is like a church or a cathedral. That is, it is frequently visited, assumed to be a mainstay of the cultural fabric regardless of external economic conditions and – most importantly – it collects little to no money from any of the end users who interact with it. Sure, parishioners may make a slight donation to the local church, but the real funding comes from other sources; likewise, Joe Surfer doesn’t directly pay Google for anything, with the possible exception of a buck or two for extra Google Drive space or Google Play Music All Access. Hence, the actual business of Google is abstracted from consumers, who end up spending little or no time contemplating how or why it could go belly up – it’s not like they can point to reduced foot traffic or ridiculous clearance sales as harbingers of decline.

The signs are there, though:

-Let’s start with Android. Android was a defensive land grab to stop Microsoft and then Apple from shutting Google out of mobile. It has succeeded in terms of worldwide adoption, but it confers on Google nowhere near the profits that iOS has conferred on Apple. Maybe that’s not a fair comparison, but it’s symbolic of how Android was never designed from the ground up as a sustainable business, but rather as a vehicle for legacy Google services (there hasn’t been a really great new Google service since Maps in 2005).

As such, Google is always tinkering with Android to make it less like an open source project and more like its own Google service. Peter Bright’s article on forking Android understandably struck a nerve with Google, which is awkwardly trying to maintain Android’s chief competitive advantage (no licensing fees, tons of customization possibilities for OEMs and carriers) while bringing it further under Mountain View’s umbrella.

-One of the best revelations of the ongoing Samsung-Apple legal battle is that Samsung really would like to move on from Android. Samsung isn’t a great leader, but the fact that it would even consider something as nascent as Tizen to take the place of Android on its smartphone lines is telling.

-Google Glass reeks of desperation. Jay Yarow of Business Insider insisted that Google botched Glass’ launch, ensuring that it would never take its apparently rightful place as the successor to the iPad as the next big thing in consumer tech. It’s a computer for the face, with no obvious use case as yet, a crazy price tag, and understandable cultural stigma. Tech media were wrong to puff it up as the Next Big Thing, but consider also the absurdity of this situation: Google is trying to sell a terrible HUD in order to get out ahead of the competition, like Apple did to much better effect with the iPod and then the iPhone.

-It’s not just Glass, either. The Nest acquisition, the Boston Dynamics acquisition, the obsession with “sci-fi” projects at Google X – Google could be looked at as “shooting for the moon.” Or, it could be viewed instead as desperately trying to find any revenue stream alternative to mobile ads, which just don’t work like desktop ones do and, moreover, are subject to intense competition from social networks and messaging platforms.

-The sci-fi thing merits more attention. Forever ago, I wrote this about Google Glass and its ilk:

By “the future,” commentators usually mean “a reality corresponding to some writer or creative artist’s widely disseminated vision,” which shows the odd poverty of their own imagination as well as the degree to which they often underestimate the power of creative artists/humanities types to drive technological evolution. But can human ingenuity really aspire to nothing more than the realization of a particular flight of fancy? Should we congratulate ourselves for bringing to life the technology from a reality that doesn’t exist?

Trying to actualize the fantasies of sci-fi is not forward-looking; it is, by definition, backward-looking, with respect to someone’s text or vision about what was possible in the past. If someone created a real Death Star today, it would be impressive – as a testament to madness. Why would someone exert such enormous, concerted effort at recreating a technology conceived for recreational purposes in the 1970s, by individuals who had no idea that smartphones, MP3s, Bluetooth, Wi-Fi, and so on would be invented?

To analyze sci-fi is often to analyze what it doesn’t conceive of. I watched Gattaca recently, a 1997 movie set in “the not-too-distant future.” What was in this high-tech future? Big, hulking desktop PCs and keyboards. Sci-fi is the product of constrained imagination (“the future is hard to predict” – Captain Obvious), but imitating it is even more self-defeating. For this reason, I am immensely pessimistic about the prospects of any of Google’s top-secret projects being a breakthrough that would expand its business or appeal in meaningful ways. Sci-fi is a small porthole on the future.

-Google’s customers are advertisers and other businesses, not individuals. It reaches the latter through its presence on platforms that belong to others – think its default search engine deals for Firefox and Safari. There’s not any real competition on those fronts for now – Bing is good but has little mindshare, and Yahoo is still locked into its deal with Microsoft. But Marissa Mayer is driven to displace Google on iOS, and Apple and Yahoo have a good relationship (Yahoo provides the data for Weather on iOS, for example). As MG Siegler has pointed out, it seems implausible that Apple would go on subsidizing Google, enabling it to make so much money off of iOS, money that it can channel into Android.

-Once one gets into the “Google isn’t invulnerable” mindset, it’s easy to see everything as a weakness, sometimes without good reason. But think about its efforts to bring Chrome OS apps to mobile devices. Such a tack seems defensive – a way to halt the decline of the Web and keep matters squarely in the realm of JS, HTML and CSS. I’ve often argued that Chrome OS is more of a breakthrough than Android (it has the potential to disrupt both the business model of Windows PCs and the essential appeal of tablets), but it looks like it could turn into just a moat for Google’s existing (and, to be fair, highly profitable, at least for now) Web businesses.

-Google+ has become the DNA of Google services. Its profile system is a way of indexing Internet users. It has succeeded in helping Google collect more nuanced data, even if it hasn’t exactly done much to blunt the impact of Twitter, Facebook, and others. But now that Vic Gundotra is leaving, Google+ looks weirdly quaint – like nothing more than Gundotra’s messy senior project for getting hired by another firm. There are already rumors that the Google+ team will be split up and sent to other projects (in the same way that the Google Reader team was once chopped up to work on Google’s initial forays into social).

Look, Google isn’t going to turn into AOL or Yahoo. But it should be increasingly apparent that Google is not synonymous with the Internet at large, and is not guaranteed to constantly occupy so much mind share.