Google algorithm updates

Links = Rank and Google updates

The old Google (pre-Panda) largely predates the era of major named algorithm updates, when links were the dominant ranking signal.

Once you had enough links pointing at a site, you could pour content into it like water and have the domain's aggregate link authority help everything on that site rank quickly.

As much as PageRank was publicized and important, so was having a diverse range of linking domains and keyword-rich anchor text.

Brand = Rank and Google updates

After Vince and then Panda, a site's brand strength (or, rather, the ranking signals that could best approximate it) became part of the ability to rank well.

Panda took factors beyond links into account, and in its first deployment it hit everything on an affected domain or subdomain. Some sites, like HubPages, responded by moving each user's content onto its own subdomain. And some aggressive spammers would rotate their entire site onto fresh subdomains each time a Panda update rolled out. That let them recover immediately from the first couple of Panda updates, but Google eventually closed the loophole.

Any signal that gets relied on ends up being abused, whether intentionally or not. And over time that leads to a "sameness" in the result set unless other signals are blended in:

Google is absolute garbage when it comes to searching for anything to do with a product. If I'm trying to learn something, I always have to route to another source like Reddit via Google. For example, I discovered the concept of weighted blankets and was intrigued. So I Googled "why use a weighted blanket" and "the benefits of a weighted blanket". Just by virtue of the fact that the phrase "weighted blanket" was in the search, I got pages and pages of nothing but ads trying to sell them, and no meaningful discussion of why I would use one.

More granularity

Over time, as Google refined Panda, broad general-interest sites outside the news vertical often fell on hard times unless they were dedicated to a specific media format or had strong user engagement metrics, like a powerful social network. This is largely why The New York Times sold About.com for less than it paid for it, and why the buyer later split About.com into niche vertical sites: The Balance (personal finance), Lifewire (technology), TripSavvy (travel) and ThoughtCo (education and personal development).

Penguin further clipped aggressive anchor text built on poor-quality links. When Penguin rolled out, Google also deployed an on-page spam classifier to further obscure the update. And Penguin was sandwiched by Panda updates on either side, making it hard to reverse-engineer any signal from the weekly winners-and-losers lists published by services that aggregate massive amounts of keyword rank tracking data.

Much of the link graph had been decimated, and Google reversed its stance on nofollow: as of March 1, 2020, it treats the attribute as a hint rather than a directive for ranking purposes. Many mainstream sites were overusing nofollow or not citing sources at all, so this extra layer of discretion lets Google find more signal in the noise.

Algo update

On May 4, 2020, Google rolled out another major core update.

Later today, we are releasing a broad core algorithm update, as we do several times per year. It is called the May 2020 Core Update. Our guidance about such updates remains as we've covered before. Please see this blog post for more: https://t.co/e5ZQUAlt0G
- Google SearchLiaison (@searchliaison) May 4, 2020

I’ve seen some sites whose rankings have been suppressed for years see a big jump. But a lot has changed all at once.

Corner cases

On some political search queries that historically ranked as news-related, Google is trying to limit political backlash by displaying official sites and data extracted from them instead of putting news stories front and center.

“Google has pretty much made it clear that it will not promote news sites on election-related queries. You scroll down and you get a giant election widget on your phone showing all the different data in the main results, and then you keep scrolling and find Wikipedia and other historical references before you ever get to a single news article. It's pretty crazy to see how Google has changed the layout of the SERPs.”

This shift reflects the permanent change in the news media ecosystem brought about by the Web.

The Internet has trivialized the dissemination of facts. The “news” media reacted by turning to opinion and entertainment. – Naval (@naval) May 26, 2016

YMYL and Google updates

A blog post by Lily Ray of Path Interactive used Sistrix data to show that many of the sites experiencing high volatility were in the health vertical and other Your Money or Your Life (YMYL) categories.

Aggressive monetization

One of the most interesting commentaries on the update came from Rank Ranger, which looked at specific pages that jumped or fell hard with the update. They noted that sites placing ads or ad-like content above the fold may have seen sharp declines on some of those aggressively monetized, high-revenue pages:

Seeing this only reinforces the idea (in my mind at least) that Google did not want content unrelated to the page's main purpose to sit above the fold to the exclusion of the page's main content! Now for the second wrinkle in my theory... Many of the pages that were displaced did not use the format shown above, where a series of "navigation boxes" dominated the page above the fold.

That kind of change has had a major impact on some sites worth a lot of money. Intuit paid over $7 billion to acquire Credit Karma, yet Credit Karma's credit card affiliate pages recently slid hard.

Credit Karma has lost 40% of its traffic since the May core update. It's insane: they run major TV advertising and probably pay millions in SEO spend. Think about it, people. Your site is not safe. Google radically changes what it wants with each update, without telling us! - SEOwner (@tehseowner) May 14, 2020

That type of change reflects Google becoming more granular with its algorithms. In the beginning, Panda was all or nothing. Then it began to have different degrees of impact across different parts of a site.

Brand was a sort of band-aid, or a rising tide that lifted all the (branded) boats. Now we see Google getting more granular with its algorithms, where even a strong brand may not be enough if Google considers the monetization excessive. The same focus on layout can hit small niche websites even harder.

One of my former clients had a site monetized primarily through the Amazon affiliate program. About a month ago, Amazon cut affiliate commissions roughly in half; then aggressive ad placement halved the site's search traffic as rankings slid with this update.

The site had been trending downward for the past two years, largely through neglect, as it was always a small side project. They improved some of the content about a month ago and things started moving in the right direction, but then this update arrived. As long as the ad layout stays unchanged, the downward trend is likely to continue.

They've since removed that ad block, but that meant yet another drop in revenue, because until another big algo update arrives they'll probably stay at around half of their former search traffic. So now they have half of half of half: three successive halvings leave roughly 12% of the original income, a drop of about 88%. It's a good thing the site had no full-time employees, or they'd be among the millions of newly unemployed. The experience really shows how websites can behave like debt-leveraged companies, able to go effectively bankrupt almost overnight. Who can watch their income drop about 88%, then reinvest in the property out of the remaining 12% while waiting a quarter of a year or more for the site to be re-scored?

“If you've been negatively impacted by a core update, you (usually) won't see recovery from it until another core update. In addition, you'll only see recovery if you significantly improve the site over the long term. If you haven't done enough to improve the site overall, you might have to wait several updates to see an increase as you keep improving the site. And since core updates are generally three to four months apart, that means you may have to wait a while.”

Hardly anyone can afford to do this unless the site is just a side project.

Google could choose to run major updates more frequently, allowing sites to recover more quickly, but it gains an economic advantage by defunding SEO investments and adding opportunity cost to aggressive SEO strategies: ranking drops from major updates are guaranteed to last a season or more.

Choose a strategy or let things come to you

They probably should have reduced their ad density when they made those other improvements. Had they done so, rankings would at worst have held flat, and might have risen as some competing sites fell. Instead, they're riding half of half of half on the revenue front. Glenn Gabe preaches the importance of fixing every problem you can find rather than fixing one or two things and hoping that's enough. If you have a borderline site, you have to weigh the trade-offs between different monetization approaches. You can:

monetize it lightly, and hope it fares better on future updates
monetize it moderately, using the extra income to further improve the site elsewhere and to make sure you can ride out the lean months
monetize it aggressively soon after a major ranking update if it was previously lightly monetized, then hope to sell the site a month or two later, before the next major algorithm update clips it again

Results will depend partly on timing and luck, but consciously choosing a strategy will probably yield better returns than doing a bit of mix-n-match with your head buried in the sand.

Reading algo updates

You can spend 50 or 100 hours reading blog posts about an update and learn precisely nothing in the process if you don't know which authors are bullshitting and which are writing about the actual signals.

But how do you know who knows what they’re talking about?

This is more than a little tricky, as the people who know the most often have no economic incentive to publish details about an update. If you primarily monetize your own websites, the broader market's ignorance is a big part of your competitive advantage.

To complicate things further, the less you know, the more likely Google is to trust you as a channel for its official messaging. If you syndicate their messages without questioning them, you get a treat: more exclusives. If you question their messaging in a way that undermines their objectives, you quickly become persona non grata, something CNET learned many years ago when it published personal details about Eric Schmidt that it had found via Google searches.

You would be unlikely to see the following type of Tweet from the likes of Blue Hat SEO or Fantomaster.

I asked Gary about EAT. He said it’s largely based on links and mentions on authoritative sites. i.e. if the Washington Post mentions you, that’s good.

He recommended reading the QRG sections on EAT as they describe things well. @methode #Pubcon
- Marie Haynes (@Marie_Haynes) February 21, 2018

To read the algorithms well, you need a few market sectors and keyword groups you know deeply. Passively collecting an archive of historical ranking data makes major changes stand out quickly.

Anyone who depends on SEO for a living should subscribe to an online rank tracking service or run something like Serposcope locally to track at least a dozen or two dozen keywords. If you track rankings locally, it makes sense to use a set of web proxies and run the queries slowly through each one so you don't get blocked.
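Here's a minimal sketch of that pattern in Python, assuming you already maintain a pool of HTTP proxies. Every proxy URL and keyword below is a made-up placeholder, and the SERP parsing is deliberately left out, since Google's markup changes often enough that any hard-coded selector would quickly rot:

```python
import random
import time
from itertools import cycle

import requests

# Hypothetical proxy pool and keyword list - replace with your own.
PROXIES = [
    "http://user:pass@proxy-a.example.com:8080",
    "http://user:pass@proxy-b.example.com:8080",
]
KEYWORDS = ["weighted blanket benefits", "credit card comparison"]

proxy_pool = cycle(PROXIES)  # rotate through proxies round-robin

def fetch_serp(keyword: str, proxy: str) -> str:
    """Fetch one results page for a keyword through a single proxy."""
    resp = requests.get(
        "https://www.google.com/search",
        params={"q": keyword, "num": 100},
        proxies={"http": proxy, "https": proxy},
        headers={"User-Agent": "Mozilla/5.0"},  # rotate user agents too
        timeout=30,
    )
    resp.raise_for_status()
    return resp.text

for keyword in KEYWORDS:
    html = fetch_serp(keyword, next(proxy_pool))
    # Parse positions out of `html` with whatever selector currently works,
    # then append them to your archive (see the CSV layout further down).
    print(keyword, len(html), "bytes fetched")
    time.sleep(random.uniform(30, 90))  # slow, jittered pacing per request
```

The jittered delay matters more than the size of the pool: even a couple of proxies hold up fine when each one only sends a query every minute or two.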

You need to track at least a diversified mix of the following to get a real read on algorithmic changes (a toy configuration is sketched after the list).

a few different industries
a few different geographic markets (or at least some terms of local versus national intent within a country)
some head, midtail and longtail keywords
sites of different size, age and brand awareness within a particular market
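As a rough illustration, a tracking set covering those axes might look like the following Python structure. Every market, keyword and domain here is a made-up placeholder meant to show the spread, not a recommendation:

```python
# Hypothetical tracking set: two industries, two geographic markets,
# head/midtail/longtail terms, and sites of different size and age.
TRACKING_SET = {
    ("health", "us-national"): {
        "head": ["vitamins"],
        "midtail": ["vitamin d deficiency symptoms"],
        "longtail": ["how much vitamin d per day for adults over 50"],
        "sites": ["bigbrandhealth.example", "small-niche-blog.example"],
    },
    ("travel", "uk-local"): {
        "head": ["hotels"],
        "midtail": ["hotels near manchester airport"],
        "longtail": ["cheap family hotel manchester airport with parking"],
        "sites": ["established-ota.example", "new-travel-site.example"],
    },
}
```

The local-versus-national split is carried in the market key, so the same keyword can be tracked separately for each intent.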

Some tools make it easy to graph anything that moved a lot within the top 50 or 100 results, which helps you spot outliers quickly, and some make it easy to compare rankings over time. As an update rolls out, you'll often see several sites make big moves at once, and if you know the keyword, the market and the sites well, you can form a good idea of which signals changed.
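Here's a crude version of that kind of outlier check, assuming you've archived daily snapshots as CSV rows of date, keyword, domain and position. The file names and the 20-position threshold are arbitrary choices for this sketch:

```python
import csv
from collections import defaultdict

MOVE_THRESHOLD = 20  # flag anything that moved 20+ positions

def load_snapshot(path: str) -> dict:
    """Map (keyword, domain) -> ranking position for one day's snapshot."""
    positions = {}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):  # columns: date,keyword,domain,position
            positions[(row["keyword"], row["domain"])] = int(row["position"])
    return positions

before = load_snapshot("ranks_2020-05-03.csv")
after = load_snapshot("ranks_2020-05-05.csv")

movers = defaultdict(list)
for key in before.keys() & after.keys():
    delta = before[key] - after[key]  # positive = moved up the rankings
    if abs(delta) >= MOVE_THRESHOLD:
        keyword, domain = key
        movers[domain].append((keyword, delta))

# Domains that moved sharply across many keywords at once are the
# outliers worth investigating first.
for domain, moves in sorted(movers.items(), key=lambda kv: -len(kv[1])):
    print(domain, moves)
```

Domains that jump on dozens of unrelated keywords at once usually say more about an update than any single keyword does.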

Once you see someone mention outliers that most people miss and that match what you see in your own data, your confidence in them rises and you can spend more time trying to untangle which signals changed.

I've read influential industry writers claim that links were heavily devalued in this update. I've also read Tweets like this one that could potentially indicate otherwise.

Check out https://t.co/1GhD2U01ch. Even more than Pinterest and ranking for real weird shit. – Paul Macnamara (@TheRealpmac) May 12, 2020

If I had little or no data, I wouldn’t be able to get a signal from this range of opinions. I’d be sort of stuck at “who knows”.

With my own data, I can quickly determine which message is most consistent with what I’ve seen in my subset of data, and form a stronger hypothesis.

No single smoking gun

As Glenn Gabe likes to say, sites that tanked usually have several major problems.

Google rolls out major updates infrequently enough that it can bundle several different changes into each one, which makes them harder to reverse-engineer. So it's useful to read widely with an open mind and imagine what signal changes could produce the ranking shifts you're seeing.

Sometimes site-level data is enough to understand what changed, but as the Credit Karma example above showed, sometimes you need to get much more granular and look at page-level data to form a solid hypothesis.

As the world changes, so does the Web

About 15 years ago, online dating was considered a strange niche for recluses who struggled to meet people in person. Now there are all kinds of specialized niche dating sites, including a variety of DTF-style apps. What was once strange and absurd has become normal over time.

Fear of COVID-19 will bring lasting changes in consumer behavior, accelerating the shift to e-commerce. In many markets, a decade of change is arriving in a year or two.

Telemedicine will grow rapidly. Facebook is adding storefronts directly to its platform through a partnership with Shopify. Spotify is spending heavily to buy exclusive distribution rights to widely followed podcasters like Joe Rogan. Uber recently proposed acquiring Grubhub. Google and Apple will keep adding payment and financing features to their mobile devices. Movie theaters have lost much of their appeal.

Tons of offline "value" businesses will end up worthless after months of lost revenue while large unpaid debts accrue interest. Some of those brands are thought to have latent brand value that will carry over online. But if they were weak even while offline stores, acting as interactive billboards, subsidized consumer awareness of their brands, then as those stores close, the awareness and loyalty built on face-to-face interactions will dry up too. A corporate shell rebuilt around the Toys "R" Us brand is unlikely to beat Amazon's parallel offering, or a competitor that still runs offline stores.

Big-box retailers like Target and Walmart are increasing their online sales by several hundred percent year-on-year.

There will be waves of bankruptcies, dramatic changes in commercial real estate prices (already reflected in falling REIT prices) and more people working remotely (shifting residential real estate demand from the urban core to the suburbs).

People who work remotely are easier to hire and easier to fire. Those who keep improving their skills will eventually be rewarded, while those who don't will be changing jobs every year or two. That lack of stability will increase demand for education, though much of the additional demand will target new technologies and specific job functions: certificates or informal training programs rather than degrees.

More and more activities will become normal online activities.

The University of California has around half a million students, and in the fall semester it will be trying to hold most of those courses online. How much usage data does Google gain as thousands of institutions move more of their infrastructure and services online?

Colleges have to spend the next year convincing students that remote learning is just as good as in-person learning, then walk that back before students actually start believing it.

It's like being forced to sell your competitor's product for a year. - Naval (@naval) May 6, 2020

Many B- and C-tier schools will disappear as side-by-side comparisons get easier. Back when I ran a membership site here, a college paid us so its students could access the site's member area. As online education becomes more normalized, many informal trade-focused sites will look more economically attractive on a relative basis.

As major mainstream institutions move most of their services online, other organizations are likely to follow suit. When big cities publish lists of crimes they won't respond to during economic downturns, they're effectively subsidizing more crime. That makes it logical to move somewhere a bit more rural and less expensive, especially when you no longer need to live near your employer.

One of the most important implications of this ongoing work-from-home (WFH) movement is state income tax.

Warm, sunny states with affordable housing and zero income tax will see an influx of wealthy, educated workers. Other states will have to cut taxes to keep up.

 

Thanks for reading, and see you in the next post!

If you have any questions or would like a quote, please contact us by e-mail at info@koanthic.com or at 418-455-2259.
