SEO Is Not Dead. Yet.
by Anvil on December 16, 2013

ICYMI, Hummingbird was the SEOsphere’s latest occasion to trot out the shopworn “SEO Is Dead” and “Is SEO Dead?” and “SEO Is So Not Dead, You Guys” articles that make their rounds once or twice a year. To bring you up to speed, the consensus that emerged when the smoke cleared this time—just like every other time—was that SEO Is Not Dead, which is a convenient consensus for a bunch of professional SEO specialists to reach, given how much we enjoy getting paid to do SEO in between getting paid to write articles about whether or not it’s dead. It’s a good line of work; I mean, I certainly enjoy it.
Of course, it’s easy to laugh at the apocalyptic pronouncements that touch off these fevered discussions, but as rash, alarmist, and clickbait-y as they typically are, I have to admit that my experience of reading this latest round was a little less amusing than usual. For one thing, I noticed something that I hadn’t before. The first wave of backlash that these doomsday articles usually provoke, which initiates the whole discursive process that ultimately leads us back to our self-soothing and resolving to carry on, typically takes the form of a few sage, seasoned voices admonishing the rest of us—in a tone always more exasperated than it was last time—to remember that SEO has weathered a million storms since the dawn of search and has always managed to remain relevant and needed. This is unassailably true, and a perfectly understandable place to look for a little comfort when something comes along and quakes the landscape like this, but the message carries an oddly confident subtext: that SEO will never die, that sites will always need help getting found, and that the discipline’s relevancy and necessity, therefore, are forever. For reasons I didn’t understand right away, the hubris of that subtext jumped out and gave me pause this time. Sure, it was clear that Hummingbird wasn’t transformative enough to murder our entire line of work outright in its first month of life, but can we really pretend that nothing ever could? Is there no possible future for search without some form of site-side optimization playing a role? Just the awareness of such a possibility was new and unsettling for me.
It only got more unsettling from there. The closer I looked at Hummingbird, the more clearly I could see that a future without SEO is more or less exactly what it promises. Remember, Hummingbird didn’t come out of nowhere; it’s merely the latest and largest step Google has taken yet on its path to advance the semantic web. But it’s a large enough step to indicate that Google is quite a bit more serious and deliberate about this transformation than I realized. They’re not waiting around to watch the semantic web slowly expand to take over search; they’re sweating to bring that day about as quickly as they can, and to make it happen on their terms.
Consider how essential a facet of SEO the geomodifier was until suddenly, one day, it wasn’t anymore. My first job in SEO was a copywriting gig in which I had to research the names of the largest suburbs of a major American metropolis and carefully shoehorn them into a bunch of content describing the services my employer provided in that metro area, so that those pages would be found by people searching for the service in conjunction with as many suburb names as possible. Making your content keyword-relevant to the geographic locations you served was once as unquestionably essential to proper content optimization as anything, because Google needed to be told these things. Then, one day, we woke up to discover that things had changed: with Google’s Location Services and the new local results pages with Maps integration, it was clear that Google already knew both where you were searching from and exactly which relevant service providers were nearby. Seemingly overnight, searchers no longer had to query location explicitly to get pertinent local results, and consequently, companies no longer had to state their location(s) in every piece of content they published. As Google got better at learning, there was less we needed to tell it.
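To make the contrast concrete: where we once repeated suburb names throughout the copy, a site today can state its location once, in structured markup, and let Google do the rest. Here’s a minimal sketch using schema.org’s LocalBusiness vocabulary in microdata; the business name and address are invented for illustration:

```html
<!-- Hypothetical example: declaring a business's location once in
     schema.org microdata instead of shoehorning suburb names into copy -->
<div itemscope itemtype="http://schema.org/LocalBusiness">
  <span itemprop="name">Acme Plumbing</span>
  <div itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
    <span itemprop="streetAddress">123 Example St</span>,
    <span itemprop="addressLocality">Portland</span>,
    <span itemprop="addressRegion">OR</span>
    <span itemprop="postalCode">97201</span>
  </div>
  <span itemprop="telephone">(503) 555-0100</span>
</div>
```

Of course, the trajectory I’m describing suggests that even this much explicit declaration may eventually be unnecessary.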
Remember when the Knowledge Graph burst onto the scene? That was arguably even more of a shake-up. Suddenly, there were all kinds of things that Google itself just knew and could tell you, no third-party site required. Granted, it was as often as not pulling its Knowledge Graph content from trusted sources (Wikipedia perhaps foremost), but that wasn’t the point: the point was that Google knew where to find something like a trustworthy and satisfying answer to a question, and it could supply that answer to users so efficiently that they didn’t even need to click anything to get it.
So, first, businesses no longer had to tell Google what locations they served in every paragraph they published about themselves, because Google could find that out on its own. That was kind of a relief and a luxury. But then a site no longer had to tell Google that it was an authoritative resource on, say, salamanders, and this time it wasn’t because Google already knew that about the site and was going to send scads of search traffic its way without the webmaster having to lift a finger; rather, it was, as we discovered, because Google already had a trusted resource for information on salamanders, one they felt pretty confident would deliver the basic information most users are looking for without demanding any click-through of them. That wasn’t so reassuring. If you were the SEO specialist for a salamander site, there were likely optimization efforts you could pursue to rank ahead of some of your competition in the organic results, but there wasn’t a thing you could do to displace or even influence the content of that Knowledge Graph box, and every user satisfied by that box was a user for whom the organic results were simply not needed. If you’re among the sites trying to appeal to that audience, a #1 ranking is still too low.
That’s the semantic web. Rather than asking Google to point you somewhere for information, you’re asking Google for the information directly. And rather than algorithmically choosing which third-party resource is most likely to have the information you seek in its most accurate form, Google is just up and answering your question. In transactions like these, Google is not a conduit to a searcher’s final destination; Google is the destination.
A future featuring a lot more of that kind of thing is where I think we’re headed, and Hummingbird—which, remember, was reported to affect over 90% of searches—is going to get us there sooner than we realize. The day SEO dies will be the day there is simply nothing a site can teach Google, through keyword placement and markup, that Google can’t learn by itself, and faster, simply by crawling the thing. What does that day look like? Informational searches answered in detail by a single, monolithic, Google-approved source (i.e., a Knowledge Graph that’s interactive and rich); local restaurant searches where the results have been ranked inviolably according to each place’s average Google+ rating; product queries that zip you instantaneously to a Google Shopping walled garden; and brand searches where Google simply redirects the searcher to the brand site in one go via the link on the brand’s verified Google+ page.
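To make “markup” concrete, here’s one more hedged sketch, again in schema.org microdata, with an invented product. Price and availability are exactly the kind of facts sites still spell out for Google by hand today, and exactly the kind a sufficiently smart crawler could one day extract from the page on its own:

```html
<!-- Hypothetical example: product facts a site currently declares
     explicitly so Google can surface them in results -->
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Salamander Terrarium Kit</span>
  <div itemprop="offers" itemscope itemtype="http://schema.org/Offer">
    <span itemprop="price" content="49.99">$49.99</span>
    <meta itemprop="priceCurrency" content="USD">
    <link itemprop="availability" href="http://schema.org/InStock">In stock
  </div>
</div>
```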
So, what do you think? Am I off my rocker? Is there anything to be done to keep SEO viable after Hummingbird has extended its reach to 100% of searches? And if not, and especially if the web will be a better place for the loss, what do we do next? How can we continue to participate in the web’s betterment? Will Google ever be able to do truly everything for itself? Talk to me in the comments. I would love to be wrong.