We mention them frequently, since they are the compass every SEO professional should consult to implement fair optimization tactics that are allowed in the ranking competition on the search engine. Perhaps, though, it is time for a more specific focus to avoid running into mistakes: so today we talk about the Google guidelines, the documentation that lists the SEO techniques and strategies in line with the search engine's expectations and that, therefore, do not expose us to the risk of penalties, manual actions or other sanctions for violations.

What are Google guidelines

There is not really such a thing as a single set of Google guidelines: the search engine publishes various documents with advice on how to build a site so that it displays properly on Google, as you can see from this overview.

In particular, we can rely on four broad categories of guidelines, which apply to all websites that Google has added to its index:

  • General guidelines

best practices that help a site appear on Google and gain visibility in SERPs.

  • Content-specific guidelines

additional suggestions for specific content types on the site, such as images, videos, AMP, AJAX and sites optimized for mobile devices.

  • Quality standards

which describe specific prohibited techniques that may lead to the omission of a page or site from the Search results: these practices fall under so-called Black Hat SEO, and implementing them may subject the site to a manual action. In this article we have covered the full list of violations that may trigger manual actions by Google.

  • Instructions for webmasters

basic guidelines for achieving sustainable SEO results, using techniques in line with Google's expectations so as to avoid an algorithmic penalty or a manual penalty, which in severe cases can lead to the complete de-indexing of the site from Google SERPs.

Fully understanding the guidelines (all of them) is the only way to avoid potential missteps and future damage to the site, so let's dive into the in-depth SEO guidance that comes directly from the search engine.

What are the Google guidelines for webmasters

We start from the last category listed, Google's instructions for webmasters: a collection of general best practices that can improve a site's visibility in Google Search, plus quality standards that, if not met, may cause the omission of the page or the entire site from Search.

These guidelines thus define the actions webmasters can take to improve the crawling and indexing of their websites, and at the same time list the practices that Google considers violations of these instructions, which may result in heavy penalties.

The Bing search engine also has its own webmaster guidelines, which are roughly based on the same principles as the Google webmaster guidelines.

What the webmaster instructions say

The guidelines rest on three central aspects: webmasters should support Google in crawling (finding and scanning websites); they should help Google classify and "recognize" content (indexing); and, third, they should take care of usability and user experience, supporting users in navigating and using their pages.

In essence, according to the Google webmaster guidelines, a website should help the search engine find pages, understand them (following all content-related rules) and simplify their use for visitors, ensuring a good user experience. Webmasters must first make sure to design pages for users, not for search engines; avoid tricks to improve search engine ranking; and, in general, make the site unique, valuable and engaging, offering quality content that is easy for people to understand and navigate.

These goals are easier said than done, because creating websites that fully adhere to the Google Webmaster Guidelines is a challenge, one that can only be undertaken by fully understanding Google's directions (and prohibitions).

In general, what we can and must do is comply with the General Guidelines so that Google can find, index and rank our site, and then pay attention to the Quality Standards, avoiding prohibited practices such as these black hat SEO techniques, which may lead to the complete removal of a site from the Google index or a manual or algorithmic antispam action against the site.

What is the goal of the guidelines

This document clarifies for all sites the rules of the game for ranking on the search engine, specifying the framework of rules for compliant search engine optimization: if you follow these established rules, you can expect your website not to be penalized and your content to consistently achieve good positioning.

The broader purpose of the guides, as noted by Searchmetrics, lies in the interest search engines have in linking to "good" sites in search results: the better the sites referenced in SERPs, the more satisfied users are; in turn, this satisfaction increases confidence in the search engine, which can thus strengthen its credibility in the eyes of advertisers as well.

Main indications of Google guidelines for SEO

Let's now look at the advice (or guidelines) that Google reserves for webmasters and SEOs, which helps us understand what it means to be a quality site in the eyes of the search engine.

The general guidelines ask for:

  • Simplify the structure of URLs as much as possible.
  • Avoid creating duplicate content.
  • Create crawlable links.
  • Qualify outbound links for Google (using the correct rel attributes), also the subject of the recent link spam update, which threatens sanctions against spammy links in guest posts and inadequately marked affiliate content.
  • Verify that Googlebot has not been blocked.
  • Flag sites or services intended for children for child-directed treatment.
  • Ensure browser compatibility.
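Two of these bullets can be checked mechanically. As a minimal sketch of the rel-qualification point, the snippet below scans a fragment of HTML for outbound links that carry none of the qualifying rel values (nofollow, sponsored, ugc); the hostnames and markup are invented for illustration, and a real audit would run this over rendered pages:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class OutboundLinkAuditor(HTMLParser):
    """Collect outbound <a> tags that lack a qualifying rel attribute."""

    QUALIFIERS = {"nofollow", "sponsored", "ugc"}

    def __init__(self, own_host):
        super().__init__()
        self.own_host = own_host
        self.unqualified = []  # outbound hrefs with no rel qualifier

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href", "")
        host = urlparse(href).netloc
        if not host or host == self.own_host:
            return  # internal or relative link: no qualifier needed
        rel = set((attrs.get("rel") or "").lower().split())
        if not rel & self.QUALIFIERS:
            self.unqualified.append(href)

# Hypothetical page fragment with one internal, one unqualified and one
# properly qualified outbound link.
html = (
    '<a href="/about">About</a>'
    '<a href="https://partner.example/offer">Ad</a>'
    '<a rel="sponsored" href="https://partner.example/deal">Deal</a>'
)
auditor = OutboundLinkAuditor("www.example.com")
auditor.feed(html)
print(auditor.unqualified)  # → ['https://partner.example/offer']
```

Flagged links can then be reviewed by hand: paid or affiliate links get rel="sponsored", user-generated ones rel="ugc".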

To these we can add another set of more specific indications, highlighted by Brian Harnish on Search Engine Land, which we will now explore further.

Making resources completely crawlable and indexable

To help Google fully understand the content of the site, the guidelines suggest allowing "the crawling of all site resources that would significantly affect the display of pages, such as CSS and JavaScript files that affect the understanding of the pages".

Google's indexing system renders web pages as users would see them, including CSS files, JavaScript and images.

Webmasters often block CSS and JavaScript via robots.txt, sometimes because of conflicts with other files on the site, other times because these resources cause problems when fully rendered; according to Google, however, we must not block these resources, because all of these elements are essential for the crawler to fully understand the context of the page.
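Whether a given robots.txt would keep Googlebot away from rendering resources can be checked with Python's standard urllib.robotparser; the rules and URLs below are made up to show the pattern:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that blocks rendering resources (a common mistake).
robots_txt = """\
User-agent: *
Disallow: /assets/js/
Disallow: /assets/css/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

resources = [
    "https://www.example.com/assets/css/main.css",
    "https://www.example.com/assets/js/app.js",
    "https://www.example.com/products/widget.html",
]
for url in resources:
    status = "ok" if rp.can_fetch("Googlebot", url) else "BLOCKED"
    print(f"{status:8} {url}")
```

Here the CSS and JavaScript files come back BLOCKED while the HTML page stays fetchable, which is exactly the situation the guidelines warn against.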

Having every page linked from another findable page

Google recommends that every page of our site receive at least one link from another page (so as to avoid orphan pages), whether through the navigation menu, breadcrumbs or contextual (internal) links.

These links should also be crawlable, since this ensures a good user experience and helps Google easily crawl and understand the site; you should also avoid generic anchor text for these links, preferring keyword phrases that describe the destination page.

The site architecture can help strengthen the topical relevance of pages, especially when organized in a hierarchical structure, which lets Google understand them better and improves its knowledge of the topics covered.
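Orphan pages are easy to spot once you have a map of internal links. This sketch assumes a hypothetical dict mapping each known page to the internal pages it links to; any page that never appears as a link target (apart from the home page) is an orphan:

```python
# site_links maps each known page to the internal pages it links to.
# The URLs are invented; a real map would come from a crawl plus the sitemap.
site_links = {
    "/": ["/blog/", "/services/"],
    "/blog/": ["/blog/seo-guide/", "/services/"],
    "/blog/seo-guide/": ["/"],
    "/services/": [],
    "/blog/old-post/": [],   # published but never linked → orphan
}

linked = {target for targets in site_links.values() for target in targets}
orphans = [page for page in site_links if page not in linked and page != "/"]
print(orphans)  # → ['/blog/old-post/']
```

Pages flagged this way should receive at least one contextual link, a breadcrumb or a menu entry.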

Using a reasonable quantity of links

There is no maximum number of links per page and, in general, it is not possible to define a specific amount to always follow: here too, you have to think about the quality of the links and their usefulness for users, without adversely affecting their experience.

Google's guidelines recommend using a "reasonable" number and note that a page can have up to a few thousand links at most (the old advice was not to exceed 100); "It is not unreasonable to assume that Google uses amounts of links as a spam signal," says Harnish.
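A quick per-page link count is enough to keep an eye on this. The sketch below counts anchor tags with an href in a synthetic page built for the example; in practice you would feed it the fetched HTML of each page:

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Count <a> tags that carry an href attribute."""

    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "a" and any(name == "href" for name, _ in attrs):
            self.count += 1

# Synthetic page with 120 internal links, purely for illustration.
page = "<nav>" + "".join(f'<a href="/p{i}">p{i}</a>' for i in range(120)) + "</nav>"
counter = LinkCounter()
counter.feed(page)
print(counter.count)  # → 120
```

A count in the low hundreds is normal for hub pages; several thousand is the point at which the "reasonable number" advice suggests taking a closer look.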

Creating a useful site, full of information

Google expressly says that in order to rank in SERPs you need to create "a site that is useful and rich in information, with pages that clearly and accurately describe the contents of the site".

This means we must avoid thin content: thin not in terms of mere word count, but in terms of the quality, depth and breadth of the topics covered on our pages, which are the factors that define their value.

Including keywords that users search for

Again, Google’s guidelines expressly require you to think “about the words that users might type to search for your pages and make sure they are included in the site”.

This is where effective keyword research comes into play, which must first consider the intent of the potential customer and their position within the funnel, in order to intercept them with appropriate content.

Once the keyword research phase is over, we must work on the usual on-page optimization process and make sure (at least) that each page of the site mentions its own target query.

We cannot do SEO without keyword research and targeting effective keywords, Harnish summarizes.
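That minimum check (does the page actually mention its target query?) is trivial to automate. A minimal sketch, using invented page text and queries:

```python
import re

def mentions_query(page_text, query):
    """True if every word of the target query appears in the page text."""
    words = set(re.findall(r"\w+", page_text.lower()))
    return all(term in words for term in query.lower().split())

# Hypothetical page copy and target queries, for illustration only.
page = "Our guide to technical SEO covers crawling, indexing and site audits."
print(mentions_query(page, "technical SEO"))   # → True
print(mentions_query(page, "link building"))   # → False
```

This is only a floor, not a target: the guidelines ask that the words users search for appear naturally in useful content, not that they be stuffed in.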

Designing the site with a very clear hierarchy of pages

As mentioned before, a hierarchical structure (an SEO silo) is the one that offers the best opportunities in SEO terms, because it lets you organize the site's topics into main topics, under which secondary ones sit, and so on.

There are two schools of thought on this point: the first advises siloing, that is, creating a clear hierarchy of conceptual pages that covers the topic in breadth and depth, so as to demonstrate specialization in the eyes of the search engine. According to the other, you should not stray from a flat architecture, meaning that no page should be more than three clicks away from the home page.

In general, the two theses can coexist, although SEO siloing offers a more cohesive organization of pages and topical discussions and is therefore often preferred.
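The flat-architecture rule of thumb can be verified mechanically: a breadth-first search over the internal link graph gives the click depth of every page. A minimal sketch, assuming the same kind of hypothetical page-to-links dict as before:

```python
from collections import deque

def click_depth(links, home="/"):
    """BFS from the home page: minimum clicks needed to reach each page."""
    depth = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depth:
                depth[target] = depth[page] + 1
                queue.append(target)
    return depth

# Invented internal link graph, for illustration only.
links = {
    "/": ["/topics/", "/contact/"],
    "/topics/": ["/topics/seo/"],
    "/topics/seo/": ["/topics/seo/guidelines/"],
    "/topics/seo/guidelines/": [],
    "/contact/": [],
}
depth = click_depth(links)
deep = [p for p, d in depth.items() if d > 3]
print(deep)  # pages more than three clicks from home → []
```

Pages missing from the result are unreachable from the home page, which points back to the orphan-page problem above.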

Making the important content of the site visible by default

Google also asks for help in simplifying the crawling of content: while it can crawl HTML content hidden within navigation elements such as tabs or expandable sections, the search engine "considers this content less accessible to users and therefore considers it appropriate that you make the most important information visible in the default page view".

In practice, we are advised not to use buttons, tabs and other navigation elements to hide this content. Tabbed content (content divided into tabs) is also among the less accessible kinds, because only the first tab is fully visible to users until they click to move to the next one, and for Google this is an example of limited accessibility.

The right approach to Google guidelines

These instructions describe what Google asks of site owners in its effort to guarantee search engine users the highest quality results in SERPs; the two big issues are spam (still quite strong, as the latest Webspam Report 2020 tells us) and poor content, but there are many other negative practices Google invites us to stay away from. The document also clarifies that other misleading behaviors not listed can have consequences for sites, and that "it is not advisable to assume that Google approves a page just because it has not used deceptive techniques".

For those who work in SEO, Google's guidelines for webmasters must be interpreted, precisely, as guidelines and not necessarily as rules to follow slavishly: it is clear, however, that any borderline behavior (or anything beyond the line) could result in a penalty or manual action, if Google notices the breach and considers it serious.

However, not all violations of the guidelines involve penalties: some errors cause crawling and indexing problems, which can then affect ranking. In critical situations of this kind, we have a number of techniques available to clean up the site, solve the problems and ask Google for reconsideration, which could restore the site's status.

And so, ultimately, trying to comply with Google's guidelines in our SEO strategy should produce positive results in terms of ranking, albeit more slowly than aggressive and reckless techniques: this fair approach, moreover, can help us build and maintain a more stable online presence in Google SERPs, without the sword of Damocles of a manual action or an algorithmic devaluation always hanging over our pages.