If Google doesn’t index your website, you’re pretty much invisible. You won’t show up for any search queries.
Given that you’re here, I’m guessing this isn’t news to you. So let’s get straight down to business.
This article teaches you how to fix any of these three problems:
- Your entire website isn’t indexed.
- Some of your pages are indexed, but others aren’t.
- Your newly-published web pages aren’t getting indexed fast enough.
What is crawling and indexing?
Google discovers new web pages by crawling the web, and then they add those pages to their index. They do this using a web spider called Googlebot.
Confused? Let’s define a few key terms.
- Crawling: The process of following hyperlinks on the web to discover new content.
- Indexing: The process of storing every web page in a vast database.
- Web spider: A piece of software designed to carry out the crawling process at scale.
- Googlebot: Google’s web spider.
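The crawl-then-index loop described above can be sketched in a few lines of Python. This is a toy illustration, not how Googlebot actually works: the `fetch` callback, the example URLs, and the dictionary standing in for the index are all made up for this sketch.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url, fetch):
    """fetch(url) -> HTML string. Returns the 'index': a url -> page mapping."""
    index = {}
    frontier = [start_url]
    while frontier:
        url = frontier.pop()
        if url in index:
            continue                 # already crawled and indexed
        html = fetch(url)
        index[url] = html            # "indexing": store the page
        parser = LinkExtractor()
        parser.feed(html)
        for href in parser.links:    # "crawling": follow hyperlinks
            frontier.append(urljoin(url, href))
    return index

# A two-page fake "web" so the sketch runs without any network access.
site = {
    "https://example.com/": '<a href="/a">A</a>',
    "https://example.com/a": '<a href="/">home</a>',
}
index = crawl("https://example.com/", lambda url: site[url])
print(sorted(index))
```

A real spider does the same thing at enormous scale, with politeness rules, robots.txt checks, and persistent storage in place of the in-memory dictionary.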
Here’s a video from Google that explains the process in more detail.
When you Google something, you’re asking Google to return all relevant pages from their index. Because there are often millions of pages that fit the bill, Google’s ranking algorithm does its best to sort the pages so that you see the best and most relevant results first.
The critical point I’m making here is that indexing and ranking are two different things.
Indexing is showing up for the race; ranking is winning.
How to check if you’re indexed in Google
Go to Google, then search for site:yourdomain.com (swapping in your own domain).
The number of results shows roughly how many of your pages Google has indexed.
If you want to check the index status of a specific URL, use the same site: operator with the full URL (e.g., site:yourdomain.com/page).
No results will show up if the page isn’t indexed.
Now, it’s worth noting that if you’re a Google Search Console user, you can use the Coverage report to get a more accurate insight into the index status of your website. Just go to:
Google Search Console > Index > Coverage
Look at the number of valid pages (with and without warnings).
If these two numbers total anything but zero, then Google has indexed at least some of the pages on your website. If not, you have a severe problem: none of your web pages are indexed.
SIDENOTE. Not a Google Search Console user? Sign up. It’s free. Everyone who runs a website and cares about getting traffic from Google should use Google Search Console. It’s that important.
You can also use Search Console to check whether a specific page is indexed. To do that, paste the URL into the URL Inspection tool.
If that page is indexed, it’ll say “URL is on Google.”
If the page isn’t indexed, you’ll see the words “URL is not on Google.”
How to get indexed by Google
Found that your website or web page isn’t indexed in Google? Try this:
- Go to Google Search Console
- Navigate to the URL inspection tool
- Paste the URL you’d like Google to index into the search bar.
- Wait for Google to check the URL
- Click the “Request indexing” button
This process is good practice when you publish a new post or page. You’re effectively telling Google that you’ve added something new to your site and that they should take a look at it.
However, requesting indexing is unlikely to solve underlying problems preventing Google from indexing old pages. If that’s the case, follow the checklist below to diagnose and fix the problem.
Here are some quick links to each tactic—in case you’ve already tried some:
- Remove crawl blocks in your robots.txt file
- Remove rogue noindex tags
- Include the page in your sitemap
- Remove rogue canonical tags
- Check that the page isn’t orphaned
- Fix nofollow internal links
- Add “powerful” internal links
- Make sure the page is valuable and unique
- Remove low-quality pages (to optimize “crawl budget”)
- Build high-quality backlinks
1) Remove crawl blocks in your robots.txt file
Is Google not indexing your entire website? It could be due to a crawl block in something called a robots.txt file.
To check for this issue, go to yourdomain.com/robots.txt.
Look for either of these two snippets of code:
User-agent: Googlebot
Disallow: /

User-agent: *
Disallow: /
Both of these tell Googlebot that it’s not allowed to crawl any pages on your site. To fix the issue, remove them. It’s that simple.
A crawl block in robots.txt could also be the culprit if Google isn’t indexing a single web page. To check if this is the case, paste the URL into the URL inspection tool in Google Search Console. Click on the Coverage block to reveal more details, then look for the “Crawl allowed? No: blocked by robots.txt” error.
This indicates that the page is blocked in robots.txt.
If that’s the case, recheck your robots.txt file for any “disallow” rules relating to the page or related subsection.
Remove where necessary.
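If you’d rather check your robots.txt rules programmatically, Python’s standard-library robot parser can tell you whether a given user agent is allowed to crawl a URL. A minimal sketch, assuming hypothetical rules and URLs:

```python
from urllib.robotparser import RobotFileParser

def crawl_allowed(robots_txt, user_agent, url):
    """Return True if the given robots.txt text permits user_agent to crawl url."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch(user_agent, url)

# A sitewide block like the snippets above denies everything to Googlebot.
blocking_rules = "User-agent: Googlebot\nDisallow: /"
print(crawl_allowed(blocking_rules, "Googlebot", "https://example.com/page"))

# An empty Disallow rule blocks nothing.
open_rules = "User-agent: *\nDisallow:"
print(crawl_allowed(open_rules, "Googlebot", "https://example.com/page"))
```

This is handy for spot-checking individual URLs against a long robots.txt file before you edit it.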
2) Remove rogue noindex tags
Google won’t index pages if you tell them not to. This is useful for keeping some web pages private. There are two ways to do it:
Method 1: meta tag
Pages with either of these meta tags in their <head> section won’t be indexed by Google:

<meta name="robots" content="noindex">
<meta name="googlebot" content="noindex">
This is a meta robots tag, and it tells search engines whether they can or can’t index the page.
SIDENOTE. The key part is the “noindex” value. If you see that, then the page is set to noindex.
To find all pages with a noindex meta tag on your site, run a crawl with Shen e-Services Site Audit. Go to the Indexability report. Look for “Noindex page” warnings.
Click through to see all affected pages. Remove the noindex meta tag from any pages where it doesn’t belong.
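If you want to audit pages yourself without a crawling tool, you can scan each page’s HTML for a noindex meta robots tag. A rough sketch using only Python’s standard library (the `has_noindex` helper is invented for this example):

```python
from html.parser import HTMLParser

class NoindexChecker(HTMLParser):
    """Flags pages carrying a noindex directive in a robots/googlebot meta tag."""
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attr = dict(attrs)
            if attr.get("name", "").lower() in ("robots", "googlebot"):
                if "noindex" in attr.get("content", "").lower():
                    self.noindex = True

def has_noindex(html):
    checker = NoindexChecker()
    checker.feed(html)
    return checker.noindex

print(has_noindex('<head><meta name="robots" content="noindex, follow"></head>'))
print(has_noindex('<head><meta name="robots" content="index, follow"></head>'))
```

Feed it the HTML of any page you fetch and it tells you whether a rogue noindex tag is present.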
Method 2: X‑Robots-Tag
Crawlers also respect the X‑Robots-Tag HTTP response header. You can implement this using a server-side scripting language like PHP, or in your .htaccess file, or by changing your server configuration.
The URL inspection tool in Search Console tells you whether Google is blocked from crawling a page because of this header. Just enter your URL, then look for the “Indexing allowed? No: ‘noindex’ detected in ‘X‑Robots-Tag’ http header”
If you want to check for this issue across your site, run a crawl in Shen e-Services Site Audit, then use the “Robots information in HTTP header” filter in the Page Explorer:
Ask your developer to stop pages you want indexed from returning this header.
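You can also spot-check this yourself: fetch a page with any HTTP client and inspect its response headers for an X‑Robots-Tag containing “noindex”. A minimal sketch that checks an already-fetched set of headers (the header values shown are hypothetical):

```python
def blocked_by_x_robots(headers):
    """Return True if an X-Robots-Tag response header contains 'noindex'.

    headers: iterable of (name, value) pairs, as most HTTP clients return them.
    """
    for name, value in headers:
        if name.lower() == "x-robots-tag" and "noindex" in value.lower():
            return True
    return False

# Example: a response whose server adds a noindex header.
sample_headers = [
    ("Content-Type", "text/html"),
    ("X-Robots-Tag", "noindex, nofollow"),
]
print(blocked_by_x_robots(sample_headers))
```

Unlike the meta tag, this directive never appears in the page source, which is why it’s easy to miss without checking headers.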
3) Include the page in your sitemap
A sitemap tells Google which pages on your site are important, and which aren’t. It may also give some guidance on how often they should be re-crawled.
Google should be able to find pages on your website regardless of whether they’re in your sitemap, but it’s still good practice to include them. After all, there’s no point making Google’s life difficult.
To check if a page is in your sitemap, use the URL inspection tool in Search Console. If you see the “URL is not on Google” error and “Sitemap: N/A,” then it isn’t in your sitemap or indexed.
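If the page is missing, add a matching entry to your sitemap. For illustration, here’s how a bare-bones XML sitemap can be generated with Python’s standard library; the `build_sitemap` helper and the URLs are examples, not part of any standard tool:

```python
import xml.etree.ElementTree as ET

# The sitemap protocol's required namespace (sitemaps.org).
NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Return a minimal XML sitemap listing the given page URLs."""
    ET.register_namespace("", NS)
    urlset = ET.Element(f"{{{NS}}}urlset")
    for loc in urls:
        url = ET.SubElement(urlset, f"{{{NS}}}url")
        ET.SubElement(url, f"{{{NS}}}loc").text = loc
    return ET.tostring(urlset, encoding="unicode")

print(build_sitemap(["https://example.com/", "https://example.com/page"]))
```

Most CMSs and SEO plugins generate this file for you; the point is simply that each indexable page should have its own <url> entry.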
Continue to the next article >>> Coming Soon