
An Unprecedented Week of Change at Google: 100% (not provided) and Hummingbird

Any news of a change in Google’s algorithm, or in Google’s policy toward webmasters and marketers, sends ripples through the search marketing community. This past week delivered one major news item in each category, both with seemingly colossal repercussions, making it a banner week in SEO/digital marketing history. Since it is every SEO professional’s duty, to himself and his clients, to remain forever adaptable to the shifting search landscape, let’s examine both stories and determine what kind of impact these changes will have on the way we do our jobs.

Early last week, word got out about Google’s decision to cease, any day now, passing any of its organic search keyword data through analytics, rendering it all invisible in perpetuity. Those of us in SEO are already well accustomed to the (not provided) phenomenon, which began in October 2011 as a response to ostensible user data security risks: searches performed by users logged in to Google were moved to encrypted search, which withholds the query from the referrer whenever the search ends in an organic click-through (Google never made any effort to hide queries that led to clicks on paid links). Over the following two years, the share of web searches performed by logged-in users grew, and at the same time Google formally expanded the reach of (not provided) by incrementally transitioning the search bars of the world’s most popular browsers to an encrypted-search default. The natural effect of these two processes was steady growth in the portion of search keywords hidden behind the (not provided) veil, forcing those of us in SEO to depend more and more on data sampling in order to deliver keyword reports of any value to our clients.

But over the month of September, the web-average (not provided) percentage began skyrocketing: its previously steady, relatively gentle rate of increase of half a percentage point per week jumped to an astonishing eight to ten percentage points per week, and over the course of four weeks the share of encrypted searches climbed from just under 50% to nearly 80% (today’s reported average is 79.79%). This shift inspired some of the world’s foremost SEO minds, most notably Danny Sullivan of Search Engine Land, to formally ask Google what was going on. The answer he got confirmed a fear we’ve alternately suppressed and nurtured over the last two years: Google is rolling out, seemingly as fast as it can, a new policy to encrypt all organic search keyword data, web-wide, which will eventually push the (not provided) share of organic keyword data to 100%.

Two days later, just as most of us in the SEO community were getting ready to take our heads out from between our knees, put down the paper bags we’d been breathing into, and start planning how to adapt to this colossal change, Google made another announcement, this one on its own terms. At a special 15th-anniversary event hosted in the garage where Google was founded, it was revealed that Google had completely retooled its search algorithm. The new algorithm, complete with a new name, Hummingbird, is designed first and foremost to improve the quality of the results Google serves on complex search queries: queries that read more like natural-language questions than simple keyword strings, and that consequently call for especially rich and complete answers rather than superficial direction. But for reasons that those of us outside the Google bubble will perhaps never understand completely, the change proved truly holistic; Google’s search chief Amit Singhal described it (in Sullivan’s Search Engine Land coverage) as the most dramatic rewriting of Google’s algorithm he’d seen since joining the company twelve years ago.

So. What does all this mean for SEO? Once the panic began to subside, I realized that the answer is: less than it seems.

The loss of keyword data is going to sting a bit, but we can arrive at an approximation of the data we’re losing: correlate a given site’s organic search landing-page data (very much still “provided”) with a snapshot of its rankings for non-branded keywords, noting both the rankings themselves and the landing pages doing the ranking; monitor the way the two influence one another over time; and check the findings against our knowledge of the site’s keyword placement. This amounts to carving out a tortuous dirt path to the sorts of insights that used to be accessible by freeway, if you’ll pardon the metaphor, but for now it seems the most sensible way forward for keyword reporting. Given how dependent on data sampling our keyword reports had already become over the course of (not provided)’s two-year ascendancy, perfect numerical accuracy was lost some time ago, so a shift to a more circuitous model is less radical than it might first appear.
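To make the mechanics concrete, here is a minimal sketch of that correlation step in Python, assuming hypothetical CSV exports: organic landing-page sessions from an analytics package, and keyword/rank/URL snapshots from a rank tracker. All file and column names here are stand-ins, and the click-through-rate curve is an illustrative assumption, not measured data:

```python
import pandas as pd

# Hypothetical exports: organic landing-page sessions from an analytics
# package, and (keyword, rank, ranking_url) snapshots from a rank tracker.
landing = pd.read_csv("organic_landing_pages.csv")  # columns: page, sessions
ranks = pd.read_csv("rank_snapshot.csv")            # columns: keyword, rank, ranking_url

# Join each landing page to the non-branded keywords it ranks for.
merged = ranks.merge(landing, left_on="ranking_url", right_on="page")

# Illustrative click-through-rate curve by position (an assumption, not
# measured data), used to split each page's sessions across its keywords.
ctr_by_rank = {1: 0.31, 2: 0.14, 3: 0.10, 4: 0.07, 5: 0.05}
merged["ctr"] = merged["rank"].map(ctr_by_rank).fillna(0.02)

# Weight each keyword's share of its page's organic sessions by estimated CTR.
merged["weight"] = merged.groupby("page")["ctr"].transform(lambda c: c / c.sum())
merged["est_sessions"] = merged["sessions"] * merged["weight"]

# The result is an estimate of keyword-level traffic, not the ground truth
# the referrer data used to give us.
report = merged[["keyword", "rank", "page", "est_sessions"]]
print(report.sort_values("est_sessions", ascending=False).head(20))
```

The point of the weighting step is simply to split a page’s known organic sessions across the keywords it is known to rank for; the output is an estimate to be sanity-checked against on-page keyword placement, not a replacement for the referrer data we’ve lost.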

What’s more, I feel this change actually has the potential to improve the state (and, with it, the reputation) of SEO. To perform this correlation of organic search landing-page data and rank tracking, and to derive from our findings any confident assertions about which keywords are driving search traffic to each page, we are going to need to assert far more control over the content of each page. The certain knowledge we once had of which keywords were working on a given page, and which ones weren’t, is now lost, so we’ll simply have to work harder to see that every keyword works to its optimal capacity on every page. Coasting on the strength of established, reliably successful traffic-driving keywords never counted as good SEO, but now it won’t even be possible, because we will have no way of knowing which keywords are the successful ones. This certainly makes for more work, but it also points to a web made better by that work. When the kind of SEO that Google has long implored us to focus on exclusively, namely the production of good content, becomes the only kind that works anymore, the long-standing, toxic reputation of SEO as cheap algorithm gamesmanship (a reputation that admittedly has fair basis in the years when black-hat tactics were truly viable) might perish once and for all.
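For the on-page side of that equation, an audit can be as simple as checking where each target keyword actually appears. Here is a hedged sketch of what such a check might look like; the URL and keyword list are purely illustrative, and requests and BeautifulSoup are one tooling choice among many:

```python
import requests
from bs4 import BeautifulSoup

def audit_keyword_placement(url, keywords):
    """Report where each target keyword appears on a page:
    title tag, meta description, h1 headings, and body copy."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    title = (soup.title.string or "").lower() if soup.title else ""
    meta = soup.find("meta", attrs={"name": "description"})
    description = (meta.get("content", "") if meta else "").lower()
    h1s = " ".join(h.get_text(" ", strip=True) for h in soup.find_all("h1")).lower()
    body = soup.get_text(" ", strip=True).lower()

    for kw in keywords:
        k = kw.lower()
        print(f"{kw!r}: title={k in title}, meta={k in description}, "
              f"h1={k in h1s}, body={k in body}")

# Hypothetical usage; the URL and keywords are stand-ins.
audit_keyword_placement("http://example.com/services/",
                        ["portland seo", "ppc management"])
```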

The direct consequences of Hummingbird for SEO are not as clear, but the same principle applies: content of substance, it seems, will now earn even greater favor in Google’s rankings than content that reads like boilerplate, marketing-speak, or a shallow pretense for a call to action, so we must emphasize even more than before the importance of making content that improves people’s lives. Now more than ever, that will be the steak; the best practices that remain pertinent to metadata, markup, server optimization, and so forth will simply be the sizzle. Incidentally, though it got far less press by comparison (and, I’d guess, quite a lot less than it would have if it’d been announced after Hummingbird rather than before), Google revealed another new feature about six weeks ago that seems to play into Hummingbird’s hands: support for so-called “In-Depth Article” search. Google announced that its research revealed roughly 10% of all search queries to be “general education” queries, i.e. queries suggesting that the searcher seeks a deep understanding of a particular subject rather than basic information about a product, celebrity, TV show, etc. In response, Google rolled out a specialized search feature for articles and essays of exceptional length and depth, and published a collection of best practices for helping content qualify for it (as well as for the possibility of rich snippets when such articles appear in general web search results). This doesn’t suggest a way to game Hummingbird; rather, it speaks to the same mission Hummingbird does, and further reinforces the idea that Matt Cutts’s eternal maxim about making good content is more than just a coy, cagey way of stonewalling algorithmic real talk; it’s the honest-to-God bedrock of the company’s philosophy. Google wants SEO to cease being about “gaming” anything, and, if I’m being perfectly honest here, so do I.
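Among the best practices Google published for In-Depth Article eligibility is schema.org Article markup. As a rough illustration (my own sketch, not Google’s tooling), a script along these lines could check whether a page carries the recommended microdata fields; the URL is a placeholder:

```python
import requests
from bs4 import BeautifulSoup

# Fields called out in Google's best-practices guidance for
# schema.org Article markup.
ARTICLE_FIELDS = ["headline", "alternativeHeadline", "image",
                  "description", "datePublished", "articleBody"]

def check_article_markup(url):
    """Check a page for schema.org Article microdata and report
    which of the recommended itemprop fields are present."""
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

    scope = soup.find(attrs={"itemtype": "http://schema.org/Article"})
    if scope is None:
        print("No schema.org Article itemscope found.")
        return

    present = {tag.get("itemprop") for tag in scope.find_all(attrs={"itemprop": True})}
    for field in ARTICLE_FIELDS:
        print(f"{field}: {'present' if field in present else 'MISSING'}")

# Hypothetical usage; the URL is illustrative only.
check_article_markup("http://example.com/long-form-essay/")
```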

Naturally, we won’t know exactly what either of these changes has in store for us until we’ve had some time to sit with them; for my part, I’ll be spending this month updating my keyword reporting process and drafting a template for its presentation, and I hope for a chance to compile a case study on the tangible effects of Hummingbird on written content very soon. But I’m truly comforted by reminding myself that both changes are going to train the SEO hive mind’s focus on producing quality content like nothing else that has come before, no matter how much more work this shift demands of us. After all, we stand only to gain from changes that ratchet up the quality of the content on the average webpage, because though we may be SEO practitioners, we’re searchers, and citizens of the web, first.
