ChatGPT, a chatbot launched by OpenAI, is a great tool for generating text that can be used in many ways. But does it fall under "Spam" according to Google's own policy?
Yes… Probably. At least we think so.
Let’s dig a little into what’s going on here.
ChatGPT is a chatbot launched by OpenAI in November 2022. It is built on top of OpenAI’s GPT-3 family of large language models and fine-tuned with supervised and reinforcement learning techniques.
How could this impact my website’s SEO? There are a few reasons. Let’s take a look at how SEO has evolved over the years.
Meta tags are HTML tags used to provide additional information about a page to search engines and other clients. Clients process the meta tags and ignore those they don't support. Meta tags are added to the `<head>` section of your HTML page. (Source: Google)
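As an illustration, a typical `<head>` might include meta tags like these (the content values here are hypothetical placeholders):

```html
<head>
  <!-- Character encoding for the document -->
  <meta charset="utf-8">
  <!-- Description that search engines may show as the result snippet -->
  <meta name="description" content="A short summary of this page's content.">
  <!-- Instructions for crawlers; here, index the page and follow its links -->
  <meta name="robots" content="index, follow">
</head>
```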
There are many meta tags that Google supports — `description`, `robots`, and `googlebot` are just a few examples.
If you've been in the website world for a while, you likely remember some other popular meta tags, such as "keywords". The keywords meta tag previously allowed website developers to pack a ton of "keywords" into a page to help it place in searches for terms that may or may not have had anything to do with the website and its content.
Google stopped using meta keywords as a ranking factor back in 2009 because they were overused and spammed by SEO professionals and marketers.
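For illustration, a keyword-stuffed tag of that era might have looked something like this (the keywords are hypothetical examples):

```html
<!-- Deprecated: Google has ignored this tag for ranking since 2009 -->
<meta name="keywords" content="cheap flights, best seo, free download, celebrity news, weight loss">
```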
You may be wondering what this particular meta tag has to do with ChatGPT. Let's take a look at the Spam policies for Google web search.
Spammy automatically-generated content
Spammy automatically generated (or “auto-generated”) content is content that’s been generated programmatically without producing anything original or adding sufficient value; instead, it’s been generated for the primary purpose of manipulating search rankings and not helping users. Examples of spammy auto-generated content include:
- Text that makes no sense to the reader but contains search keywords
- Text translated by an automated tool without human review or curation before publishing
- Text generated through automated processes without regard for quality or user experience
- Text generated using automated synonymizing, paraphrasing, or obfuscation techniques
- Text generated from scraping feeds or search results
- Stitching or combining content from different web pages without adding sufficient value
If you’re hosting such content on your site, you can use these methods to exclude them from Search.
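One of the methods Google refers to is the `robots` meta tag. A minimal sketch of how a page can be kept out of search results:

```html
<!-- Tells compliant crawlers not to include this page in search results -->
<meta name="robots" content="noindex">
```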
The reason is that auto-generated content produced by various kinds of scripts has historically been of very low quality, stuffed with keywords, and aimed at manipulating Google's search results.
That’s why Google has been trying to detect this type of content to remove it from the SERPs and keep the integrity of its search results.
The thinking was that by focusing on high-quality, human-generated content, Google could provide a better experience for its users and maintain the credibility of its search engine.
How Does Google Actually Rank Results?
E-A-T — or expertise, authoritativeness, and trustworthiness — is one of the most important parts of the Google algorithm but is not the final, definitive judge of content quality. Google now actually employs human search quality raters to verify the algorithm’s results.
For example, once a piece of content has been uploaded and has been indexed by Google, the algorithm will make an assessment based on its quality (we, of course, don’t know how exactly that assessment is made). However, a search quality rater will then review the content themselves, and make a decision on whether it contains “strong” E-A-T before it is ranked.
These decisions are made based on the Search Quality Evaluator Guidelines (SQEG), a detailed PDF document that is publicly available.
What Are the Key Ranking Factors in the Google Search Algorithm?
Meaning and Intent
Within Google’s search algorithm, understanding and clarifying the meaning and intent of the search query is the key first step. The mechanisms that enable this are, again, a secret, but we know that they allow the search engine to understand the scope of the query.
Once the algorithm has understood the meaning and intent of the query, it then looks at the Google index to identify which pages offer the most relevant solution to it. This is where on-page SEO is important, as one of the most basic signals of relevance is if your page contains the same keywords as the search query (especially if they are in your headings).
“We suggest focusing on ensuring you’re offering the best content you can. That’s what our algorithms seek to reward.” -Google, 2019
Google has not officially announced that they are detecting whether the content is written by AI vs a real person.
But in the discussions that have taken place around AI-generated content, even before ChatGPT was introduced, Google has stated that it is not against AI content per se, but rather focuses on whether it is helpful content or not.
So what does this mean for SEO? We're not completely sure yet, but history tends to repeat itself. Google's algorithm takes hundreds of factors into account, with a focus on relevant, quality content. Will the increased use of AI-generated content eventually cause your search rankings to drop? Much like the deprecation of the meta "keywords" tag, we won't be surprised if Google starts parsing content to identify AI-created text and rank it lower than what it considers more likely to be "human" generated.
Or will Google ramp up the use of its own AI content generator and allow content generated from that to be used, while penalizing content generated elsewhere? Only time will tell.