What is the right length for a piece of content? This is one of the "million dollar questions" of SEO, one of those topics that cyclically returns to the center of the debate and polarizes opinion between those who think word count is a crucial metric, to be observed almost religiously, and those who minimize its weight. This is the subject of the sixth episode of SEO Mythbusting season 2, dedicated to content and the many false myths that surround it.

False myths about content and word count

Keeping Google's Martin Splitt company is Lily Ray, SEO Director of Path Interactive; together the two experts dispel myths such as "more content equals better quality", discuss whether word count is a ranking factor, and offer advice on what to do with low-performing content, and more.

And it is Ray who opens the episode, asking what the best strategy is for a publisher who regularly publishes on the same topic every year, with only slightly different content: is it better to create new articles or update the old ones?

How to manage routine content

Lily Ray cites a concrete example to clarify the issue: a site that publishes an article describing "a certain type of beauty treatment, and covers it in 2017, 2018 and 2019". So what is the best practice recommended by Google? "Take that content and update it every year, or publish three different pages on the same topic?"

According to Splitt, the optimal answer is that in principle – unless the news has changed substantially – it is better to update the already published content and avoid publishing articles on very similar topics, or that essentially say the same thing, also because Google is likely to treat this as a case of duplicate content and perform canonicalization regardless of the settings indicated by the publisher.

To avoid this problem – which could leave only one article "visible" while the others are relegated to mere reproductions discarded by the search engine – it is advisable to update the existing page, revising the sentences to make them more current, and above all repositioning it to make it more visible on the site, perhaps moving it closer to the home page.

How much content do we need for SEO?

We then move on to another hot SEO topic: the minimum amount of content you need to perform well on Google. It is again Lily Ray who introduces the subject, relaying the point of view of many companies that think it is necessary to produce a lot of content because "this will help them rank for a lot of different keywords", and so decide to "publish a new blog post every single week, to the point that their site has thousands of blog posts that nonetheless do not perform well".

So the question is: "how much content should I have on the site, and to what extent does this really help my performance?"

The Google Developer Advocate responds by inviting everyone to step back and remember what is really essential: providing useful information to users. How much content is necessary for this purpose "depends a bit on what you're doing".

On the practical front, "if you're a news site, then yes, it's helpful to cover as many events as possible, but if your website is about a specific product, then there's not much you can say about it", and forcibly stretching things out, article after article, does not help much.

Regular production of a lot of content is particularly suitable for industry blogs, where new information is constantly coming out. What matters is not to publish just for the sake of publishing, or in the belief that this somehow helps performance, because in fact this way you only create thin content and disperse the crawl budget, wasting resources on pieces that do not deliver results.

Does having a blog and publishing content help with Google rankings?

The next question put to the Googler is also very direct: are the presence of a blog and the periodic publication of new content a sort of site-wide ranking factor, and can they help overall performance?

Martin Splitt first denies that these aspects can be site-wide ranking signals, but then explains that frequently updating the blog with items such as industry news (or other content relevant and useful to visitors) can improve the site's reputation among users, and thus bring about, more or less indirectly, better performance, even without acting as a direct ranking signal.

Tips for managing high-performing content

And what should you do with content that works? Lily Ray specifically asks how to manage quality articles over time to prevent them from becoming stale: is it better to make regular updates, or to touch the text only when there are significant changes?

The position expressed by the Googler is very practical: it is preferable to update this quality content that (still) performs well only when there is substantial news and – "if not much has changed" – to devote time and effort to writing new content that links to this resource, thus connecting the old articles to the new ones. This has no direct impact on ranking, but it is useful for site visitors, who can explore the topic they are interested in more deeply.

Too much under-performing content: how to intervene

The SEO expert then tries to find out whether Google's crawl statistics can indicate that a site publishes too much content or that its pages are under-performing. Splitt is first of all very clear in saying that for Google there is no quantity limit on publications (he literally states that "there is no such thing as too much content").


He then argues that these are the wrong metrics for gaining insights into those questions, because the frequency with which Googlebot crawls content does not indicate whether that content is good or bad, or whether there is too much of it.

More suitable for this purpose is the Google Search Console performance report: if there are many impressions but few clicks, intervene with improvements, trying first to assess the situation (and the value of the content) from the user's point of view and from what they intend to obtain from their visit to that page.
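The check described here can be automated on a CSV export of the performance report. A minimal sketch, assuming an export with `page`, `clicks` and `impressions` columns; the thresholds are illustrative choices, not official guidance:

```python
# Flag pages with many impressions but few clicks, from a CSV export of
# the Search Console performance report. Column names and thresholds are
# assumptions for illustration, not an official API.
import csv

def underperforming_pages(csv_path, min_impressions=1000, max_ctr=0.01):
    """Return (page, ctr) pairs with high visibility but a CTR below max_ctr."""
    flagged = []
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            impressions = int(row["impressions"])
            clicks = int(row["clicks"])
            if impressions >= min_impressions and clicks / impressions < max_ctr:
                flagged.append((row["page"], clicks / impressions))
    return sorted(flagged, key=lambda item: item[1])
```

The pages this returns are the candidates for the user-focused review Splitt recommends: the query is showing them, but searchers are not choosing them.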

The relationship between limited content and site authority

Still on the subject of non-performing content, the next "myth" to debunk is the relationship between a large number of poorly performing pages and the site's level of trust and authority in Google's eyes. According to Splitt, everything depends on why the content is under-performing: if, for example, the bad performance is due to the detection of spam or thin content, this can reflect negatively on the whole site, leading to penalties or manual actions.

So, regardless of why the content performs poorly, it is always a good idea to clean up and assess whether you need to update or remove it to safeguard the overall project.

Reinforcing and grouping content

One strategy in this sense could be to look for similar thin content by topic, to be grouped into a single, longer and more in-depth informative article, provided it makes sense to do so.

This is the advice Splitt offers – as a direct answer to a question from his host – to sites that have numerous pages of very short content, made up of "only a sentence or two" (for example, support or help pages that answer very specific questions), which therefore risk being considered "unprofitable" by Google and having a negative impact on search rankings.

So, explains the Mountain View Developer Advocate, you can try to group these pages by topic and structure them meaningfully: for example, gather all the questions about a specific product range, about troubleshooting or about ways of using the product on a "denser and more useful" page.

Consolidating multiple pieces of relevant information in one place is something that reflects positively in Google Search, also because it reduces the crawling load for the bot and offers users useful and relevant content on a single page.
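The grouping step Splitt describes can be sketched as a simple aggregation: collect the one-answer pages under a shared topic label, so that each group becomes the raw material for one denser page. The data structure and topic labels below are hypothetical examples, not a real site's inventory:

```python
# Sketch of consolidating thin Q&A pages: group them by topic so each
# group can be merged into one in-depth page. Input shape is assumed.
from collections import defaultdict

def group_by_topic(pages):
    """Map topic -> list of (question, answer) pairs to merge into one page."""
    groups = defaultdict(list)
    for page in pages:
        groups[page["topic"]].append((page["question"], page["answer"]))
    return dict(groups)

help_pages = [
    {"topic": "billing", "question": "How do I get an invoice?", "answer": "..."},
    {"topic": "billing", "question": "Can I change my plan?", "answer": "..."},
    {"topic": "setup", "question": "How do I install the app?", "answer": "..."},
]
```

In this sketch the two billing questions would end up on a single "billing" page, with the old URLs redirected to it.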

Word count, ranking and SEO: Google’s truth

And so we arrive at one of the central themes of the episode, the myth of word count: Splitt is categorical in stating that word count is not a ranking factor, because the only criterion to refer to is understanding the user's search intent. "You can say what the user needs to know in 50 words, 100 or 2000; they are all absolutely acceptable amounts depending on people's real needs," he says, and you should not force yourself to hit a precise quota, "always repeating the same concepts without any benefit to the reader".

But what should you do when "you have written 500 words on a specific topic or keyword you intend to rank for, and you see that all competitors rank with articles of four thousand words?", asks Lily Ray. While not technically a ranking factor, word count could in that case be a good indicator that users (and Google) expect longer content, she adds.

Splitt does not take a direct position on the question, simply commenting "it depends: if you see them at the top, it is not just because they have pages with a high word count", and then reiterating that, even in this case, you need to understand "what users need". So, if a person is searching for something that can be satisfied with a quick answer, shorter content should be the best match, and it still makes no sense to pad the text to reach a certain number of words.

Notes on self-generated content

Lily Ray has another issue to bring to Google's attention, regarding how self-generated content is treated – for example, multiple location pages with virtually the same content on each. Specifically, the expert refers to 50 location pages for the 50 American states describing a business that is essentially the same in every location, so the content varies only in small details (the name of the place changes and perhaps a few location-specific details are added); she wants to know whether Google is able to detect these "tricks" and how it evaluates them.

For Splitt, it is a 50/50 situation: "they either work or they don't". Specifically, these types of pages can work if there is at least some unique information relevant to each location, while conversely they might not work if the content is too similar.

In fact, by changing just "a handful of words and keeping the rest of the content identical, Google might consider one page a duplicate of the other and not include it in the index", and therefore the advice for those in this situation is to invest in producing pages that are truly unique, relevant and useful, even if similar.

Google and content duplication

The last topic addressed in this episode specifically concerns duplicate content and the ways in which Google determines which pages present content that is "too similar".

Splitt says he is not sure there is a real threshold, but explains that Google relies on content fingerprinting to determine possible duplication: each page, that is, has its own fingerprint that is used to determine how similar one piece of content is to another. The Googler also cites "similarity metrics" that Google uses for this task, but does not provide further details.
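Splitt does not describe the mechanism, but a common textbook approach to this kind of fingerprinting is shingling: break each page into overlapping word n-grams, hash them, and compare the resulting sets with a Jaccard similarity. A minimal sketch of the general technique, not of Google's actual system:

```python
# Illustrative shingle-based near-duplicate detection, a standard textbook
# technique; NOT a description of Google's actual similarity metrics.

def shingles(text, n=3):
    """Set of hashed overlapping word n-grams, acting as the page's 'fingerprint'."""
    words = text.lower().split()
    return {hash(" ".join(words[i:i + n])) for i in range(len(words) - n + 1)}

def jaccard_similarity(text_a, text_b, n=3):
    """Share of common shingles: 1.0 means identical, 0.0 means fully distinct."""
    a, b = shingles(text_a, n), shingles(text_b, n)
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)
```

Under this model, two location pages that differ by "a handful of words" share almost all their shingles and score close to 1.0, which is exactly the situation where a system of this kind would canonicalize one page and drop the other.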
