
Submitted URL has crawl issue: Error Analysed and Solved

Google Search Console shows the error SUBMITTED URL HAS CRAWL ISSUE when the search engine bot fails to crawl a webpage.

If you are reading this blog post, I can assume you have already connected your website to Google Search Console, which means you know how search engines work during crawling and indexing.

Step 1 ➡️ Connect your website with Google through sitemap submission

Step 2 ➡️ Googlebot collects website data from the sitemap

Step 3 ➡️ Google's algorithm ranks SEO-friendly websites on the search results page

As you can see from the roadmap above, Google reports crawl errors while collecting website data from the sitemap (step 2).
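If you want to double-check what Google actually receives in step 2, you can list the URLs declared in your sitemap yourself. Below is a minimal Python sketch using only the standard library; the sitemap address is a placeholder, so replace example.com with your own domain (a sitemap index that points to child sitemaps would need one extra pass).

```python
# List every URL declared in a sitemap so you can compare it with the
# "Submitted URL has crawl issue" report. Standard library only.
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://example.com/sitemap.xml"  # placeholder: use your own sitemap
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

with urllib.request.urlopen(SITEMAP_URL) as response:
    tree = ET.parse(response)

for loc in tree.getroot().findall("sm:url/sm:loc", NS):
    print(loc.text.strip())
```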

It is very important to eliminate crawl errors to maintain good website health. This is why Google has designed the Search Console tool to identify crawl errors and take the necessary actions.

In this tutorial, I am going to share how I analyse crawl errors and take appropriate action on them.

To make this more understandable, I want to share my "submitted URL has crawl issue" report from Google Search Console as a reference. Have a look at the screenshot below.

Google Search Console coverage report - submitted URL has crawl issue

Crawl issues can occur for several reasons. I will explain them using the different types of URLs and issues I have faced. When I clicked on the first URL (see the screenshot below), there were two options:

  1. INSPECT URL
  2. TEST ROBOTS.TXT BLOCKING
Coverage report

I will go with the TEST ROBOTS.TXT BLOCKING method first, where I want to check whether the webpage has any error or is blocked for Google.

Normally, I check for webpage errors first and then proceed to the INSPECT URL method, but you can follow whichever order suits you.

Now click on the TEST ROBOTS.TXT BLOCKING option to check for errors.

And here you go. The robots.txt tester indicates that the webpage is error-free and allowed for Googlebot.

Robots.txt tester
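If you prefer to verify the same thing outside Search Console, Python's standard urllib.robotparser can tell you whether a given path is allowed for Googlebot. This is just a sketch; the domain and page path are placeholders for your own site.

```python
# Check whether Googlebot is allowed to fetch a page, based on robots.txt.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser()
parser.set_url("https://example.com/robots.txt")  # placeholder domain
parser.read()

page = "https://example.com/sample-post/"  # placeholder URL from the report
print("Allowed for Googlebot:", parser.can_fetch("Googlebot", page))
```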

Now we will proceed to the next method using the INSPECT URL tool.

The INSPECT URL tool helps us understand whether the webpage was processed through the sitemap or not.

Normally, most of our web pages are automatically indexed through the sitemap if it is connected to Search Console. But I have seen, a couple of times, that some webpages somehow fail in this process.

As a result, we end up with a list of URLs in the "submitted URL has crawl issue" section.

URL inspection

As you can see from the screenshot above, the URL is not on Google, which is why I am getting crawl errors. If Google fails to index a page automatically, we need to complete the job manually.

Before using the manual method, we need to make sure the URL really deserves to be crawled.

Let me explain this with two scenarios.

Scenario 1: A crawl error for a URL that is available on the website

Scenario 2: A crawl error for a URL that is no longer available on the website

For the first scenario, we should request manual indexing. The second scenario is resolved using the URL removal tool. Both of these methods work directly within Google.
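A quick way to sort the URLs from the report into these two scenarios is to check their HTTP status codes: a 200 suggests scenario 1 (request indexing) and a 404 suggests scenario 2 (request removal). Here is a rough sketch using the third-party requests library (pip install requests); the URLs are placeholders for entries from your own report.

```python
# Sort URLs from the "Submitted URL has crawl issue" report into the two
# scenarios by checking their HTTP status codes.
import requests

urls = [
    "https://example.com/live-post/",     # placeholder
    "https://example.com/deleted-post/",  # placeholder
]

for url in urls:
    # HEAD keeps the check light; switch to GET if your server rejects HEAD
    status = requests.head(url, allow_redirects=True, timeout=10).status_code
    if status == 200:
        print(f"{url} -> {status}: page exists, request indexing (scenario 1)")
    elif status == 404:
        print(f"{url} -> {status}: page removed, use the Removals tool (scenario 2)")
    else:
        print(f"{url} -> {status}: check manually")
```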

Well, let's get into the tutorial.

As shown in the screenshot above, click on the arrow in the right corner to open the link in a new tab. In my case, the link opened an image file.

I have used this image in one of my blog posts. Unfortunately, Googlebot failed to crawl it, so I am going to notify Google about this image manually.

Going back to the earlier step, where I used the INSPECT URL tool and found URL IS NOT ON GOOGLE: INDEXING ERROR.

On this page, you will find the REQUEST INDEXING option. Just click on it and wait a couple of seconds. Google will confirm once the indexing request has been accepted.

Google indexing request

How to Remove 404 Pages from the Google Index:

Moving on to scenario 2: how to remove 404 pages from the Google index.

404 error

As per the standard process, Googlebot crawls a webpage automatically when it gets published. Once we remove a webpage, we need to inform Google; otherwise it will keep trying to crawl it on a regular basis, and we get the SUBMITTED URL HAS CRAWL ISSUE error.

Search Console offers a free tool called REMOVALS to inform Google about deleted webpages. To use it, log in to your Google Search Console account and click on REMOVALS in the left sidebar (refer to the screenshot above).

Now copy the URL from the SUBMITTED URL HAS CRAWL ISSUE list and open the REMOVALS tool.

➡️ Click on the NEW REQUEST button in REMOVALS

➡️ Paste the URL and click on NEXT

➡️ Click on SUBMIT REQUEST

And that's it. Google will show a pop-up to let you know that the request was submitted. Once this is done, Search Console will remove the URL from the crawl error list.
