Key Information Google Revealed About Its Recent Major Algorithm Updates
In early September, search marketing expert Barry Schwartz interviewed Google’s Search Liaison, Danny Sullivan, and later published a lengthy summary of the interview.
Barry Schwartz is the founder of Search Engine Roundtable and the CEO and lead technologist of RustyBrick. Danny Sullivan, for his part, represents Google in clarifying and explaining updates to Google Search, algorithm changes, and other related matters, ensuring the public understands how Google Search works and how to adapt to these changes.
Given the professional and authoritative background of these two figures, this interview holds great value for those who are keen on understanding Google’s recent algorithm updates. I’ve read through the interview and have summarized the key points to help others gain a clearer understanding of the direction Google is heading with its updates, potentially offering insights for future traffic-driving strategies.
Below is my summary and personal thoughts.
(I want to emphasize that this summary was not generated by simply pasting the original content into GPT; it was organized and summarized manually, so you can read it with confidence.)
Key information is in bold; my personal thoughts are in italics.
On Update Frequency
Google Search undergoes around 5,000 updates a year, and periodically introduces “major updates” when necessary. These large-scale updates, which have a broad impact, are referred to as “core updates.”
This means that Google Search makes about 14 adjustments to its algorithm every day. This further reinforces my view that “you shouldn’t waste time trying to outsmart the algorithm.” It’s highly likely that any loophole shared online has already been fixed. (Focus your time on creating content that truly serves users.)
Rewarding Small Independent Sites
Google’s core updates from March to August 2024 focused on giving more rewards to small independent sites, and Google is far from finished with this effort. It’s expected that small independent sites with quality content will continue to benefit from future core updates.
Small independent sites often have less commercialization, are more niche, and are more personalized. I refer to them as “purer” than larger sites. It seems that Google is aiming for more diversity on the web and is rethinking how it defines “authority.” This has significant implications for those operating independent sites: you’ve chosen a very good path, so stick with it.
What to Do When Your Site’s Rankings Drop After an Algorithm Update
Google acknowledges that some sites experience significant traffic fluctuations following algorithm updates. Sullivan’s advice is to “just make sure you’re doing the right thing for users” and remain patient (as the updates are a gradual process). He also emphasized that chasing Google’s ranking system is an outdated approach.
No algorithm update is likely to be perfect, especially given the diversity of both websites and users. In online gaming there’s a saying that every version has its “OP characters,” and the same can be said of Google Search. Perhaps that’s why Danny’s advice is the more fundamental one: focus on your users, not on Google’s algorithm.
What Does a Ranking Change Mean?
If your site drops from first to second place, the impact on traffic can be significant. But this doesn’t necessarily mean Google’s algorithm dislikes your content; the content now ranking above you is likely also highly valuable to users. Google’s goal is to ensure that everything in the search results is useful, or at least that every user finds something that benefits them.
No page will occupy the top search spot forever. In the highly competitive world of top-tier search rankings, the key question is whether your content is universally more valuable to users searching for related keywords. Frankly, this isn’t something that technical optimizations or tricks can achieve.
Does Google Adjust Its Updates During the Rollout Process?
Some updates cause significant ranking and traffic fluctuations, but Google has made it clear that it won’t adjust the algorithm mid-update just because of these fluctuations. These updates go through multiple rounds of evaluation and testing before being rolled out, and the only thing Google will do is expand the update to a broader scope.
This feels a bit arrogant, given that Google isn’t always objectively correct. However, from a technical perspective, changing an update midway through its rollout could introduce untested issues and create even bigger problems…
Does Filling Out Google’s Feedback Form Affect Your Site’s Ranking?
Sometimes, during core updates, Google releases a feedback form to gather input from users or site owners about the impact of the update. The feedback form for the March 2024 update received 12,000 submissions related to 1,300 domains. Google has made it clear that whether or not you submit feedback will not affect your site’s ranking.
There’s a story of someone submitting the feedback form over 1,700 times for the same domain, but Google has debunked this kind of wishful thinking many times: there are no shortcuts in SEO.
What Kind of Content Does Google Reward?
Sullivan gave a specific example, paraphrased here:
A local plumbing website wasn’t ranking well for “local plumbing” topics in Google Search because its content looked like the usual generic fare, such as how to fix a sink. It was ordinary plumber content that likely didn’t drive conversions.
In contrast, if that plumber shared personal and professional stories about plumbing issues in their local area, that would be the type of content Google wants to reward.
This dialogue clearly illustrates that Google is looking to reward content with strong personal or unique qualities. Generic content often feels repetitive and doesn’t inspire users much. For example, the internet doesn’t need thousands of articles explaining “what coffee is” (nor will Google distribute traffic to them). However, if an article is about a unique experience at a seaside café, it’s likely one-of-a-kind and more worth reading.
Emphasizing Page Experience
If users have a satisfying experience from the first page they land on, Google’s ranking system will try to reward your site. As a site operator, you should pay close attention to your site’s “Core Web Vitals,” though these aren’t everything.
This brings up a concept worth explaining: Core Web Vitals.
Core Web Vitals are a set of key metrics introduced by Google to evaluate a webpage’s user experience, focusing on loading performance, interactivity, and visual stability. These metrics are designed to measure the actual experience users have when visiting a page, helping websites optimize these aspects to improve user satisfaction and search rankings. Specifically, Core Web Vitals include three core metrics:
- Largest Contentful Paint (LCP): Measures how quickly the main content of a page loads, referring to the time it takes for the largest visible element on the page to load. Ideally, this should happen within 2.5 seconds.
- First Input Delay (FID): Measures a page’s interactivity, referring to the delay between the user’s first interaction with the page (such as clicking a button) and the page’s response. Ideally, this delay should be less than 100 milliseconds. (Note: in March 2024, Google replaced FID with Interaction to Next Paint (INP) as the official responsiveness metric; INP measures the latency of all interactions on a page, with a “good” threshold of 200 milliseconds or less.)
- Cumulative Layout Shift (CLS): Measures visual stability, referring to unexpected shifts in page layout during the loading process. Smaller shifts (CLS score below 0.1) indicate a better user experience.
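If you want to see these numbers for your own site, you can measure them from real users in the browser. Below is a minimal TypeScript sketch using Google’s open-source web-vitals library; the /analytics/web-vitals endpoint is a hypothetical placeholder for whatever collector you use, and newer versions of the library replace onFID with onINP.

```typescript
// Field measurement of Core Web Vitals with Google's open-source
// "web-vitals" library (npm install web-vitals).
// The reporting endpoint below is a hypothetical placeholder.
import { onCLS, onFID, onLCP, type Metric } from 'web-vitals';

function reportMetric(metric: Metric): void {
  const body = JSON.stringify({
    name: metric.name,     // 'CLS' | 'FID' | 'LCP'
    value: metric.value,   // ms for FID/LCP; unitless score for CLS
    rating: metric.rating, // 'good' | 'needs-improvement' | 'poor'
  });
  // sendBeacon survives page unload; fall back to a keepalive fetch.
  if (!(navigator.sendBeacon && navigator.sendBeacon('/analytics/web-vitals', body))) {
    fetch('/analytics/web-vitals', { body, method: 'POST', keepalive: true });
  }
}

onLCP(reportMetric); // "good" threshold: <= 2.5 s
onFID(reportMetric); // "good" threshold: <= 100 ms (onINP in newer versions: <= 200 ms)
onCLS(reportMetric); // "good" threshold: <= 0.1
```

If you’d rather not instrument anything yourself, PageSpeed Insights and the Core Web Vitals report in Google Search Console surface the same field data.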
These metrics comprehensively assess the quality of the user experience on a website and are among Google’s ranking factors, so it’s well worth webmasters’ and developers’ time to optimize them. However, user experience can’t always be captured by a few simple metrics, so Google encourages site operators to think more about the human aspects of improving the experience.
I believe the key points are creativity, imagination, and attention to the details of how users feel.
Content from Forums Like Reddit in Google’s Rankings
Google is keen on showcasing content from social platforms and forums (such as Reddit) and will continue to do so, because such content is often more specific and niche, and can offer clearer help or guidance to searchers. Sometimes content shared on Reddit even ranks higher than the original source, possibly due to search keywords, content freshness, and other factors.
Think back: if you’ve ever searched Google for a very specific scenario, content from forums like Reddit was probably among the most helpful results, because it centers on one concrete topic and tends to be more timely.
This shows that the value of social media and forums lies in offering content fragments that address specific problems. When Google determines that this is the answer users are looking for, this content will be indexed at the top. This is a boon for UGC content marketing, but ensure that the content genuinely holds value.
On Hidden Gems
In Google’s search algorithm, “hidden gems” usually refers to pages that, despite not ranking highly or drawing much traffic, offer exceptionally high-quality and useful information. These pages may rank poorly for lack of SEO optimization or inbound links, but their content is invaluable for specific queries or user needs.
Google is continuously working to improve its algorithm so it can better identify and surface these hidden gems, ensuring that not only high-ranking pages get discovered, but also sites with genuinely valuable and unique content.
On Parasite SEO
In early 2024, Google began manually penalizing Parasite SEO behavior and has no plans to implement automated penalties in the near future.
In parasite SEO, attackers exploit the authority of legitimate websites (such as government, university, or large brand sites), typically by abusing open upload or comment features, or by exploiting outright vulnerabilities, to publish or embed their own content: links, spam, or other pages built purely for rankings. Google seems cautious about rolling out large-scale automated penalties.
On AI-Generated Content
Google made it clear that the key issue with low-quality content isn’t whether it’s AI-generated, but whether it’s mass-produced. If Google detects that content is mass-produced for ranking purposes (whether generated by AI or manually), it will be considered a violation of their policy on scaled content.
Google repeatedly emphasizes that it cares more about whether content is valuable to users, regardless of how it was created. AI can produce highly valuable content (for example, a blog generated with Blogcard; sorry for the plug, it’s my product), just as humans can churn out large amounts of low-quality content.
The answer is obvious: think about what your users want and write for your users, not just for ranking purposes.
This concludes the key information summary (and personal thoughts) based on Barry Schwartz’s interview with Danny Sullivan.
Sullivan’s insights give me the impression that he is much like the Head of Operations in the TV show Westworld. Just as you cannot fully control the behavior of the “hosts” in the show, you also can’t be certain about how Google’s indexing results are generated. They have set many rules, but the answers need to be found in the real world.
Interview original text: Search Engine Roundtable