Google and Bing, the connections and false myths between the two search engines

A year after the first season, the “SEO Mythbusting” series is back on YouTube: in each episode Martin Splitt discusses SEO issues and search engine activities with a guest. In this first episode they talk about the relationship between Google and Bing, two of the world’s leading search engines, and address various issues, both technical and practical, related to optimizing websites for these two different, but not too different, tools.

Google and Bing, two increasingly close engines

Indeed, as the Googler says, these two engines “are not isolated”, and neither are the teams behind them, which are committed to improving search for all users and webmasters around the world.

As proof of this, the guest of the episode is Sandhya Guntreddy, Principal Program Manager at Microsoft, and the two quasi-colleagues exchange “personal experiences of work in this industry and in this community”.

How to improve rankings, the most frequently asked question

The first part of the video focuses on the most frequently asked SEO questions that each of them receives in the course of their work, from which some rather common “false myths” emerge.


The first concerns, inevitably, the role played by backlinks in ranking: on both Google and Bing, many think that simply receiving links is enough to improve rankings, but “this is just one of the factors, along with many others”, Sandhya Guntreddy reminds us. Also because, Splitt adds, if backlinks were the only signal used by search engines “the game would be simple: you buy a lot of links and get to the top positions in a moment”.

Ultimately, the goal of search engines is to satisfy users and what they really seek, trying to meet those needs with the content shown by sites and using hundreds of factors to provide the best answers. For this reason, the best advice both give is to “think of the users first” and only then optimize the technical aspects of the site.

The false myth of ads correlation

If “how do I reach the top positions of the search engine” is the question most often asked of the public faces of both search engines, the false myth they most frequently have to refute concerns the correlation between organic ranking and online advertising. Just as with Google Ads and Search, on Bing too “there is an organic ranking and an ad system that has different positions, which are two worlds neatly and strongly separated, that have no relationship”, explains Sandhya Guntreddy.

The evolution of the international SEO community

Splitt and Guntreddy also discuss broader topics, such as the relationship with the international SEO community (which pesters them through every type of channel, even Instagram, they joke), which they say is becoming increasingly technical and attentive to technical SEO issues.

However, according to Bing’s Principal Program Manager, webmasters and site owners often “forget the fundamentals and the reasons why they do this job”. A typical case is the use of blocking rules in robots.txt without fully understanding them, with the result of preventing search engines from crawling and indexing pages; just as frequently, the most common reaction is to accuse the search engines of penalizing the site or hiding it from the SERPs.
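As a rough illustration of how easily crawlers can be locked out by mistake, here is a minimal sketch in TypeScript (against a hypothetical example.com) that fetches a site’s robots.txt and warns when a blanket Disallow rule applies to every user agent; a real parser would also have to handle per-bot groups, Allow rules and wildcards.

```typescript
// Rough sketch: flag a blanket "Disallow: /" rule in robots.txt.
// This only catches the most common accidental block; it is not a full robots.txt parser.
async function warnIfBlocked(siteUrl: string): Promise<void> {
  const res = await fetch(new URL("/robots.txt", siteUrl));
  if (!res.ok) return; // no robots.txt found: nothing is blocked by it

  const lines = (await res.text()).split("\n").map((l) => l.trim().toLowerCase());
  let appliesToAll = false;
  for (const line of lines) {
    if (line.startsWith("user-agent:")) {
      appliesToAll = line.includes("*"); // entering a group; does it target all bots?
    } else if (appliesToAll && line === "disallow: /") {
      console.warn(`${siteUrl}: robots.txt blocks all crawlers from the whole site`);
      return;
    }
  }
}

// Hypothetical usage on a site you manage.
warnIfBlocked("https://www.example.com").catch(console.error);
```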

Bing’s tools for crawling

The discussion of blocking settings then introduces another topic, crawling with Bingbot: webmasters often complain about the load of crawl requests coming from search engine bots, so Bing has developed a series of solutions.

For example, with the URL Submission API (also available as a plugin), sites can notify the search engine when content is created or updated, allowing quick (and almost instantaneous) discovery, crawling and indexing of the site’s pages. It is a fast “push” (rather than pull) way to request crawling, and apparently it works really well.
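To give an idea of how such a push-based submission works, here is a minimal TypeScript sketch; the endpoint, method name and payload shape reflect Bing’s Webmaster API documentation as I recall it and should be treated as assumptions to verify against the current docs, while the API key and site URL are placeholders.

```typescript
// Minimal sketch: notify Bing of new or updated URLs via the URL Submission API.
// Endpoint and payload are assumptions; check Bing Webmaster Tools documentation before use.
const BING_API_KEY = process.env.BING_API_KEY ?? ""; // key generated in Bing Webmaster Tools
const SITE_URL = "https://www.example.com";           // hypothetical verified site

async function submitUrlsToBing(urls: string[]): Promise<void> {
  const endpoint = `https://ssl.bing.com/webmaster/api.svc/json/SubmitUrlBatch?apikey=${BING_API_KEY}`;
  const response = await fetch(endpoint, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ siteUrl: SITE_URL, urlList: urls }),
  });
  if (!response.ok) {
    throw new Error(`Bing URL submission failed: ${response.status} ${response.statusText}`);
  }
}

// Example: push a freshly published page as soon as it goes live.
submitUrlsToBing(["https://www.example.com/blog/new-post"]).catch(console.error);
```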

Moreover, the Microsoft engine has also developed a crawl control tool, with which a webmaster can control how frequently Bingbot requests pages and resources from their website. In practice, you can set the crawl rate for each of the 24 hours of the day, so as to limit Bingbot’s activity during peak hours (when visitors are on the site) and leave more bandwidth free for the crawler during quieter hours.

The similarities and false myths about Google and Bing

The collaboration between Google and Bing

Still in a spirit of proactive (and not competitive) collaboration, it is also possible to integrate Google Search Console with Bing Webmaster Tools, so as to simplify operations and, above all, avoid delays and wasted time with separate verification processes.

SEO and JavaScript, the Google and Bing approaches

Martin Splitt then introduces a theme dear to him, the relationship between SEO and JavaScript, which he defines as “the most expensive resource on sites, even more than images and videos”: those can be served more or less as they are provided, while with JavaScript that is not possible.

On the one hand, says the Googler, “people do really great things with Javascript”, but on the other he would not want them to “rely only on JS”; his Bing colleague shares similar concerns and explains her search engine’s approach.

That is, says Sandhya Guntreddy, the first step is to try to understand “the intent of the site and the reason why it uses Javascript”, because sites often use it in an excessive and unmotivated way, only risking a waste of resources that are limited. Bing therefore also invites webmasters to use JavaScript responsibly and to look for a point of balance.

The issues with JavaScript

Continuing the reasoning, Martin Splitt says that “a client-side-rendered application in Javascript is not problematic in itself, it is only more expensive. And the search engine does not have any suggestions in advance on the page, cannot understand its intention, does not see anything until the code is executed”. One possible solution is to use server-side rendering combined with hydration, which “sends a bunch of initial content to the browser, and then employs Javascript to improve it for the user, giving you a balance”.
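The episode does not prescribe a specific stack, but the “render first, hydrate later” pattern Splitt describes maps naturally onto something like the following sketch; React and Express here are illustrative assumptions, and App is a hypothetical component shared by server and client.

```typescript
// --- server.tsx: send meaningful HTML on the first request ---
import express from "express";
import { renderToString } from "react-dom/server";
import { App } from "./App"; // hypothetical shared component

const app = express();
app.get("*", (_req, res) => {
  const html = renderToString(<App />); // crawlers and users receive real content immediately
  res.send(`<!doctype html>
<html>
  <body>
    <div id="root">${html}</div>
    <script src="/client.js"></script>
  </body>
</html>`);
});
app.listen(3000);

// --- client.tsx: attach JavaScript behaviour to the HTML already on the page (hydration) ---
import { hydrateRoot } from "react-dom/client";
import { App } from "./App";

hydrateRoot(document.getElementById("root")!, <App />);
```

The initial HTML carries the content and the intent of the page, while the JavaScript bundle only enhances it afterwards, which is the balance the two engineers recommend.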

The problem with third-party scores and tools

The last part of the chat focuses on another problem common to both Google and Bing: very often users rely too much on third-party tools and invented scores to assess the quality of a site.

The reference is, once again, to Domain Authority and other “fantastic and delightful metrics” (as Splitt ironically calls them): the mistake is not in using these tools (webmasters need numbers and data, says Sandhya Guntreddy) but in reading those metrics without interpreting them, usually dwelling on a single aspect and a single ranking factor among the hundreds that exist.

The value of the search intent

More generally, according to Bing’s representative there is a basic misunderstanding: “I think many people don’t really understand that there is an intent behind the query and think only about what there is on the server side and what they have in the index”, forgetting and neglecting “the search intent behind the queries, which is what we have to answer”.

Ultimately, then, we need to change approach: instead of just looking at a number given by some tool, we must “try to understand what our users need, what the intention is and the best answer to that search intention”, because what counts on search engines, both Bing and Google, is “to serve users and provide them pages with very nice and clean intentions”.
