How can we best manage JavaScript on our site, and in particular handle it efficiently for Google and SEO? This remains one of the most debated topics in online work, and one with the largest grey areas, so Google's John Mueller and Martin Splitt decided to clarify a few issues by answering specific questions during the latest #AskGoogleWebmasters session on YouTube.

JavaScript and SEO, Google’s clarifications

The two Googlers collected the questions they received on the JavaScript theme and tried to provide guidance to webmasters and developers working to optimize their use of the programming language on their own sites, without adversely affecting their rankings on search engines.

How to manage old assets and cache

The first question takes us straight into practical territory: a user asks for advice on managing old assets when using the Rails Asset Pipeline for caching. In particular, he wonders which status code to return: “Googlebot crawls these old resources, for which we currently return a 404, but is it better to set up a 410, or to keep those assets alive for a few months? Is this something that can destroy our crawl budget?”

The answer comes from Martin Splitt, Developer Advocate at Google, who says the issue actually applies to every type of obsolete resource we want to update. The Rails Asset Pipeline, in particular, is the way “Rails processes and pre-processes your assets as you have them during development for loading on the site” – in other words, “an automatic pipeline that handles the assets of a Rails application”.

As for managing old resources – old JavaScript, earlier versions of CSS, images no longer loaded into pages – Splitt explains that the best approach is to keep these assets “alive” and “active” until Googlebot has recrawled the current HTML content and picked up the new assets. Deleting the old resources immediately may cause crawling or rendering problems due to interrupted caching, so the advice is to wait a while, check the server logs to see when the bot stops requesting those assets, and only then delete them completely.
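That log check can be automated. The sketch below is a hypothetical Node.js helper – the common-log-format lines and asset paths are fabricated examples – that finds the last time Googlebot requested each old asset, so you can tell when it is safe to delete them:

```javascript
// Hypothetical sketch: scan server access log lines to find when Googlebot
// last requested each retired asset. Log format and paths are assumptions.

function lastGooglebotHits(logLines, assetPaths) {
  const hits = {};
  for (const line of logLines) {
    if (!line.includes("Googlebot")) continue; // only bot traffic matters here
    for (const asset of assetPaths) {
      if (line.includes(asset)) {
        // Common log format puts the timestamp inside [ ... ]
        const match = line.match(/\[([^\]]+)\]/);
        if (match) hits[asset] = match[1]; // keep the most recent hit seen
      }
    }
  }
  return hits; // assets absent from the result were never requested by the bot
}

// Example with two fabricated log lines:
const logs = [
  '66.249.66.1 - - [10/Jan/2020:10:00:00 +0000] "GET /assets/app-old.js HTTP/1.1" 200 512 "-" "Googlebot/2.1"',
  '66.249.66.1 - - [15/Jan/2020:09:30:00 +0000] "GET /assets/app-new.js HTTP/1.1" 200 512 "-" "Googlebot/2.1"',
];
const lastSeen = lastGooglebotHits(logs, ["/assets/app-old.js"]);
console.log(lastSeen["/assets/app-old.js"]); // timestamp of the last Googlebot request
```

Once an old asset stops appearing in the result for a comfortable stretch of time, it can be removed for good.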

What to do with irrelevant elements in pre-rendering

The second question concerns pre-rendering: the user asks whether irrelevant elements, such as SVG graph bars generated by JavaScript, can be replaced or skipped. This time it is John Mueller who answers, saying that from his point of view it is best to “always include everything there is”, so that when Googlebot accesses the pages of the site it can see the whole content.

Splitt then takes over to go deeper into pre-rendering, stressing that it is “only” a matter of running JavaScript server-side when the content changes and then serving the resulting static resources to everyone; in this case, there would be no need to “skip” elements. The same answer (include everything) also applies to dynamic rendering, where users are essentially served different content from what crawlers receive.
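The routing decision at the heart of dynamic rendering can be sketched in a few lines. This is an illustrative example only – the bot list and function names are assumptions, not part of what the Googlers described – but it shows the key point that both variants must carry the same content:

```javascript
// Minimal sketch of dynamic rendering's routing decision: crawlers get the
// pre-rendered static HTML, regular users get the client-side app shell.
// Bot patterns and file names below are illustrative assumptions.

const BOT_PATTERNS = [/Googlebot/i, /Bingbot/i, /DuckDuckBot/i];

function isCrawler(userAgent) {
  return BOT_PATTERNS.some((re) => re.test(userAgent));
}

function selectResponse(userAgent) {
  // Both files should contain the same content ("include everything");
  // only the delivery mechanism differs between bots and users.
  return isCrawler(userAgent) ? "prerendered.html" : "spa-shell.html";
}

console.log(selectResponse("Mozilla/5.0 (compatible; Googlebot/2.1)")); // prerendered.html
console.log(selectResponse("Mozilla/5.0 (Windows NT 10.0) Chrome/80")); // spa-shell.html
```

The design point worth underlining is exactly Mueller's: the pre-rendered file should not strip “irrelevant” elements, or crawlers will see less than users do.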

Google's webmasters give advice on JavaScript

The title tag, chats and JavaScript

The third question under the spotlight is also very specific: if a site has a chat system that rewrites the title tag to show a notification to the visitor, how can we prevent Google from indexing the title version dynamically rewritten by JavaScript?

There is not much to do, according to the Googlers, other than avoiding this scenario entirely: when it renders the page, Googlebot takes whatever title it finds and may therefore pick up the unwanted one. The solution could be to hide or delay the chat behind a user interaction, because Googlebot does not interact with these elements: an ideal flow would therefore start with a click on a button, continue with the chat popup appearing, and only then change the title.
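That flow can be sketched as follows. This is a minimal, hypothetical example of gating the chat (and its title rewrite) behind a click; the element IDs and the notification format are invented, and the DOM is stubbed out so the logic is visible on its own:

```javascript
// Hypothetical sketch: the chat widget and the title rewrite only happen
// after a user click. Googlebot does not click, so it only ever sees the
// original title. `doc` is injected so the logic can run outside a browser.

function setupChat(doc, openChatWidget) {
  const originalTitle = doc.title;
  doc.querySelector("#chat-button").addEventListener("click", () => {
    openChatWidget();                    // popup appears only after the click
    doc.title = `(1) ${originalTitle}`;  // title changes only after the click
  });
}

// Tiny stub standing in for the browser DOM, purely for illustration:
function makeFakeDoc() {
  const handlers = {};
  return {
    title: "Product page",
    querySelector: () => ({ addEventListener: (ev, fn) => (handlers[ev] = fn) }),
    click: () => handlers.click(),
  };
}

const doc = makeFakeDoc();
let chatOpened = false;
setupChat(doc, () => { chatOpened = true; });
console.log(doc.title); // "Product page" — what Googlebot would index
doc.click();            // simulate a user interaction
console.log(doc.title); // "(1) Product page" — only real visitors see this
```

Nothing on the page rewrites the title until the handler fires, which is exactly why the crawled version stays clean.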

What to include in the pre-rendering?

The last question returns to the pre-rendering theme: a user asks whether pre-rendered pages can still contain JavaScript elements, such as those that make minimal changes to the layout of the content, as long as they make no AJAX requests.

Mueller reminds us that pre-rendering basically means generating those pages in advance and serving them to users, so “it’s okay to have some JavaScript elements in there too”. This also benefits users, because with pre-rendering you can deliver the content very quickly and then use JavaScript to add the interactive elements.

Therefore, we do not have to strip all JavaScript from the pre-rendered pages; the correct process, as Splitt adds, is rather to use so-called “hydration”: running JavaScript in the browser to enhance a page that was already rendered server-side. This allows you to deliver the content to the user as quickly as possible.
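A minimal sketch of that split, with invented function names and a stubbed DOM so it stands alone: the server ships complete HTML, and the client-side JavaScript only attaches behavior to the markup that is already there, rather than regenerating it.

```javascript
// Hydration sketch (illustrative names throughout): the server renders the
// full content, the browser later adds interactivity to the existing markup.

// Server side: render everything up front as plain HTML.
function renderOnServer(items) {
  const list = items.map((i) => `<li>${i}</li>`).join("");
  return `<ul id="menu">${list}</ul><button id="more">Load more</button>`;
}

// Client side: reuse the server-rendered DOM and only wire up behavior.
function hydrate(root, onMore) {
  // No markup is regenerated here — only an event handler is attached.
  root.querySelector("#more").addEventListener("click", onMore);
}

// Stub DOM for illustration outside a browser:
const handlers = {};
const fakeRoot = {
  querySelector: () => ({ addEventListener: (ev, fn) => (handlers[ev] = fn) }),
};

const html = renderOnServer(["Home", "Blog"]);
console.log(html.includes("<li>Home</li>")); // true — content exists before any client JS runs
hydrate(fakeRoot, () => console.log("loading more items"));
handlers.click(); // the click now works thanks to hydration
```

The content is visible (and crawlable) as soon as the HTML arrives; the JavaScript that follows only makes it interactive.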

Thoughts on JavaScript and SEO

Before concluding the video, the two Googlers briefly discuss the importance of JavaScript for SEO, and in particular the different forms of rendering: according to Splitt, apart from dynamic rendering (a workaround, a temporary solution that should eventually disappear), server-side rendering and pre-rendering are concepts that are meant to remain useful for a long time.

With these systems we can in fact deliver content faster to users and crawlers: plain HTML can be passed along as it is, while anything written in JavaScript must first be parsed and executed before it produces content. Googlebot therefore cannot stream content browser-side as it arrives, but it does not need to either, since the content is already available server-side.
