Google's algorithm secrets leaked: Will SEO change from now on?

Post Reply
tongfkymm44
Posts: 214
Joined: Sun Dec 22, 2024 3:20 am


Post by tongfkymm44 »

The world of SEO and digital marketing has been shaken by the alleged leak of more than 2,500 internal documents from the Google Search API, detailing the workings and the parameters taken into account to "position" or rank a website in search results.

To date, Google has never directly declared or detailed the parameters its algorithms use to generate rankings and determine which content will rank above other content in the results.

Everything that was known came from the search engine's public guidelines, recommendations, and best practices, together with data from SEO professionals who, through the results of their projects and experiments, share conclusions about which factors can be considered relevant and how much weight they carry.

But these leaked documents, exposed by Rand Fishkin on the SparkToro blog on May 27, reveal information that in many cases contradicts what Google has been telling us through its guidelines.

Here's my post breaking down the leak's source, my efforts to authenticate it, and early findings from the document trove: https://t.co/nmOD0fd5mN pic.twitter.com/yMxMrSeeLa

— Rand Fishkin (follow @randderuiter on Threads) (@randfish) May 28, 2024

The leak provides a never-before-seen look at Google's internal systems, challenging several public claims the company has made over the years about which parameters or data are or are not used to rank search results.

This information could redefine SEO strategies by showing in full which factors really influence search rankings, and it levels the playing field somewhat by putting the rules of the game on the table, in writing and in detail. For those of us who have been in the SEO world for many years, though, these documents mostly validate the strategies we were already following: researching and understanding the user, identifying what they need, and creating the best possible content to meet those needs, along with the analysis and technical optimizations to support it. There is not much new information, but there is confirmation.

And although all the information in the leaked documents still needs to be analyzed and understood in detail, we can expect that in the coming days and weeks many reports will be released that make everything much clearer. Tables will surely soon emerge classifying all the parameters or attributes (more than 14,000) and their weight within the algorithm, based on the leaked Google documentation.

For now, here are the first conclusions being extracted from the documents:

Table of Contents
Use of User Clickstream Data
Existence of a Domain or Site Authority score (siteAuthority)
Chrome Browser Data Usage
Existence of secondary algorithms to promote or lower rankings
Content authors and E-E-A-T
Links remain an important parameter
Freshness of content
Other references to parameters
Use of User Clickstream Data
According to the documents, the first inconsistency with Google's public statements appears here: although Google has always said that user click data is not used in ranking, it appears to use click data to improve the accuracy of its search results.

This data comes from user behaviors, such as clicks on search results and subsequent navigation.

The leaked Google documentation references modules that use user click data, with classifications such as "goodClicks," "badClicks," and "lastLongestClicks," among others, which are used to decide what is displayed in the SERP.

This data is linked to systems such as NavBoost and Glue, which improve the accuracy of the results.

Google also apparently filters out unwanted clicks, measures click duration, and uses Chrome view data to calculate metrics and determine the most important URLs.
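To make the idea concrete, here is a minimal sketch of what bucketing SERP clicks by dwell time might look like. The attribute names (goodClicks, badClicks, lastLongestClicks) come from the leaked documentation, but the thresholds, data model, and logic below are purely illustrative assumptions, not Google's actual implementation.

```python
from dataclasses import dataclass

@dataclass
class Click:
    url: str
    dwell_seconds: float   # time on the page before returning to the SERP
    was_last_click: bool   # the user did not come back to the results afterwards

def classify_clicks(clicks, good_threshold=30.0):
    """Hypothetical bucketing of SERP clicks using the leaked field names.

    A long dwell counts as a "good" click, a quick bounce back to the
    results ("pogo-sticking") as a "bad" one, and the final, longest
    click of the session is tracked separately.
    """
    stats = {"goodClicks": 0, "badClicks": 0, "lastLongestClicks": 0}
    longest_last = None
    for c in clicks:
        if c.dwell_seconds >= good_threshold:
            stats["goodClicks"] += 1
        else:
            stats["badClicks"] += 1
        if c.was_last_click and (
            longest_last is None or c.dwell_seconds > longest_last.dwell_seconds
        ):
            longest_last = c
    if longest_last is not None:
        stats["lastLongestClicks"] = 1
    return stats
```

For example, a session with two long visits and one quick bounce would yield two goodClicks, one badClick, and one lastLongestClick. The 30-second threshold is an arbitrary stand-in; nothing in the leak specifies the actual cutoffs.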

In the SEO world this made a lot of sense and was considered a fairly clear way to gauge users' interest in content and assign quality metrics to it. It is even something we at SEOCOM Agency already do in our projects, using concrete data to measure which content is working well and is of interest to users.