Google makes four changes to the index coverage report

The Google Search Console index coverage report is receiving four updates to keep site owners better informed about crawl issues.

The index coverage report is newer than many of the other reports Google offers, as it was first introduced when the redesigned version of Search Console launched in 2018.

Since the launch of the Index Coverage report, website owners have shared comments with Google about the improvements they would like to see in the future.

Changes to the index coverage report, released today, are based on feedback provided by the webmaster community.

“Based on the feedback we’ve received from the community, today we’re rolling out significant improvements to this report so that you’re better informed about issues that could prevent Google from crawling and indexing your pages. The change focuses on providing a more accurate state of existing problems, which should help you to resolve them more easily.”


Search Console index coverage report changes

The list of changes to the index coverage report in Search Console includes:

  • Removal of the generic “crawl anomaly” issue type – all crawl errors should now be mapped to an issue with a more precise resolution.
  • Pages that were submitted but blocked by robots.txt and got indexed are now reported as “indexed, though blocked” (warning) instead of “submitted, but blocked” (error)
  • Addition of a new issue: “indexed without content” (warning)
  • Soft 404 reporting is now more accurate

The overarching theme of these updates appears to be data accuracy.

There is no more guesswork involved when it comes to crawl errors, as the generic “crawl anomaly” issue is being replaced by specific issues with clearer resolutions.

Site owners will know for sure whether a page indexed by Google is blocked by robots.txt, because the report will say “indexed, though blocked” instead of “submitted, but blocked.” Submitting a URL is not the same as indexing it, and the report has been updated to reflect this.
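As a quick sanity check outside Search Console, site owners can test whether a given URL is disallowed for Googlebot using Python’s standard-library robots.txt parser. This is a minimal sketch; the robots.txt rules and URLs below are hypothetical examples, not taken from any real site:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules for illustration
robots_txt = """User-agent: Googlebot
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A blocked URL can still end up in the index if other pages link to it;
# Search Console now reports that state as "indexed, though blocked" (warning).
print(parser.can_fetch("Googlebot", "https://example.com/private/page"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/public/page"))   # True
```

Note that this only checks the robots.txt rules themselves; whether Google actually indexed the blocked URL can only be confirmed in Search Console.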


Soft 404 reporting is now more accurate, and a new issue called “indexed without content” has been added. Let’s take a closer look at this issue in case it appears in one of your reports.

Here’s what the Search Console help page says about indexed without content:

“This page appears in the Google index, but for some reason Google was unable to read the content. Possible reasons are that the page may be hidden from Google or may be in a format that Google is unable to index. This is not a case of robots.txt blocking.”

If the “indexed without content” issue appears, it means the URL is in Google’s index, but Google’s crawlers cannot view the content.

This could mean that you accidentally published a blank page or that there is an error on the page that is preventing Google from rendering the content.
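One way to spot an accidentally blank page before Google does is to strip the markup and check whether any visible text remains. Below is a rough sketch using Python’s standard-library HTML parser; it is illustrative only, and pages whose content is rendered with JavaScript would need a headless browser instead:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects visible text, ignoring script and style contents."""
    def __init__(self):
        super().__init__()
        self.chunks = []
        self._skip_depth = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip_depth += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        if not self._skip_depth and data.strip():
            self.chunks.append(data.strip())

def has_visible_content(html: str) -> bool:
    """Return True if the page contains any visible text."""
    parser = TextExtractor()
    parser.feed(html)
    return bool(parser.chunks)

# A page with only script content has nothing for Google to index
print(has_visible_content("<html><body><script>var x;</script></body></html>"))  # False
print(has_visible_content("<html><body><p>Hello</p></body></html>"))              # True
```

A check like this can be run across a sitemap to flag pages that serve empty or script-only HTML to crawlers.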

For more guidance on how to resolve an “indexed without content” error, website owners are advised to run the affected page through Google’s URL inspection tool.

The URL inspection tool will render the page as Google sees it, which can help to understand why the content cannot be viewed by Google’s web crawlers.

These changes are now reflected in the index coverage report. Website owners may see new issue types or changes in their issue counts.

For more information, see the official Google blog post.
