
Why Your Site Isn’t Indexed by Google – 8 Most Common Reasons

November 4, 2019 | Natalia Dyatko | 7 min read

A popular SEO joke says that the best place to hide a dead body is on the second page of Google results. Not ranking high enough is indeed a common problem among website owners. But you might also encounter an even more serious issue: what if your site doesn’t appear in search results at all?

Just imagine: you’ve worked for weeks or months on a new site, and you really like the result. It’s informative and well-designed, so you expect a good amount of traffic once you launch. And yet that traffic doesn’t materialize.

You try not to panic and run some searches that should lead to your site. Once again, it’s not there – not on the first page, not on the second, not even on the fifth.

What happened? Does Google hate you? Is your site somehow invisible? Have you done something wrong when coding?

In this article, I’ll walk you through the eight most common reasons. But first, let’s go over some indexing basics.

Crawling vs indexing vs ranking: understanding the difference

In order to index a site, Google first needs to find it. The search giant uses special programs (called spiders) to crawl the web. A spider starts from a site it already knows. If it contains any links – external or internal – the bot crawls them, too. If they, in turn, contain links to other sites, the spider looks at those, and so on. This goes on 24/7, on billions of pages every day.

Whenever Google discovers a new page, it tries to understand what it’s about. It “reads” the title, headings, meta descriptions, content, pictures, videos, and so on. This information is added to the index – the complete list of web pages used by Google search.

Thus, when a user runs a search, Google doesn’t look for results on the live web – it analyzes the index. It ranks all the relevant results according to a complex (and secret) formula – and voila, you get your SERP (search engine results page). If you’d like to learn more about how Google Search works, watch this video.

Is your site really not indexed?

It can happen that your pages are indexed but simply don’t appear in the first few pages of results. To find out which problem you have, run a simple test:

type site:your-site’s-address in Google’s search bar.

Don’t forget to replace the placeholder with your website’s actual URL – for example, site:example.com.

[Screenshot: results of a site: search in Google]

Note the number of indexed pages. If it looks right to you, then your site is properly indexed. Keep in mind, however, that not every page of a site needs to be indexed.

If your page is indexed by Google but you still get little or no traffic, the problem could indeed lie in your SEO.

But if you discover that some or all of your pages don’t appear in the search results, something must be wrong with the indexing. Read on to find out the possible reasons. Don’t worry: whatever went wrong, it’s probably easy to fix.

Reason 1: your site is too new

How long should it take for Google to index a new site? For most sites, the waiting period ranges from 4 days to 1 month. Some people report getting indexed faster, but there is also anecdotal evidence of indexing taking months.

If your site is only a few days old, wait for a week or so, then run the test again. If you are still not indexed, try to ask Google to do it for you.

Solution: the URL Inspection tool

Google Search Console has a great tool called URL Inspection.

[Screenshot: the URL Inspection tool in Google Search Console. Source: Google Help]

There, you enter the address of any page on your site, see its current index status, and request indexing. Google will send its bot to crawl and index the page, but it’s not an instant or guaranteed fix – it can easily take days. Still, since it’s so easy and free, why not try it?

Reason 2: You don’t have a sitemap

A sitemap is an XML document that helps Google crawl and index your website. It lists all of your site’s pages, along with details such as when each page was last updated.

[Screenshot. Source: Statcounter]

In theory, Google should be able to index your pages even without a sitemap. But if your pages aren’t well linked to each other – or if there aren’t many external links pointing to your site – the normal crawling algorithm might not work so well. This is especially true for new sites. So I do recommend that you create a sitemap and submit it to Google.
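To give you an idea of the format, here’s what a minimal sitemap looks like – the URLs and dates below are just placeholders:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://example.com/</loc>
        <lastmod>2019-11-01</lastmod>
      </url>
      <url>
        <loc>https://example.com/about/</loc>
        <lastmod>2019-10-20</lastmod>
      </url>
    </urlset>

Each <url> entry names one page (<loc>) and, optionally, the date it was last modified (<lastmod>), which helps Google decide what to recrawl.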

Solution: Sitemap generators

If your site lives on WordPress, you’re in luck. You can use a simple plugin like Google XML Sitemaps or Sitemap Generator to generate the XML file you need automatically. Google will find it and read it.

For other types of sites, you’ll first need to use a third-party tool, such as the XML Sitemaps Generator, Online XML Sitemap Generator, a sitemap generator extension for Chrome, and others. Once you have your .xml file, you’ll need to submit it to Google through the Search Console.
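Alternatively, you can point crawlers to your sitemap from your robots.txt file with a single line (the path below assumes the sitemap sits at the root of your domain):

    Sitemap: https://example.com/sitemap.xml

Google will pick it up on its next visit, though submitting through the Search Console also gives you feedback about any errors in the file.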

Reason 3: “noindex” tags

The way your page is coded could also be blocking Google from indexing it. The usual culprits are “noindex” tags, which are often added while a page is still under construction. It makes sense: if your website isn’t ready yet, you wouldn’t want anyone to stumble upon it, right?

You’ll know that your page is hidden from crawlers if its <head> section contains something like this:

    <meta name="robots" content="noindex">
    <meta name="googlebot" content="noindex">

The first tag blocks all search engines from indexing the page; the second means that Google won’t index your site, but other search engines (such as Bing) will.

Solution: check your pages’ <head> sections and remove any “noindex” bits you find. You can read more about this issue here.
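Keep in mind that the same directive can also be delivered as an HTTP response header rather than a meta tag, so if a page’s HTML looks clean, it’s worth checking the server’s responses too:

    X-Robots-Tag: noindex

This header has the same effect as the meta tag and is easy to overlook, since it doesn’t show up in the page source.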

Reason 4: Canonicalization errors

The verbs “to canonicalize” or even “to canonical” can seem clumsy – and they are. But they’re verbs you need to know. The canonical version of a page is the one that Google treats as the main, preferred one when it encounters duplicate content. Perhaps you’re thinking, “but there’s no duplicate content on my site!”. Well, think again. Is there just one way to reach your homepage? Usually not. A typical homepage can be reached at several addresses, for example:

    http://example.com
    https://example.com
    http://www.example.com
    https://www.example.com

… and others. Any of these will work and look exactly the same – but for Google, they are four different pages with duplicate content! And if your site is translated into other languages, the problem is exacerbated – you’ll have even more “identical” pages.

A canonical tag is found in the <head> section of a page and looks like this:

    <link rel="canonical" href="https://example.com/" />

Solution: run the Google URL Inspection tool. It will tell you if there’s anything wrong with the canonical tags on any of your pages. In case of a problem, you’ll see something like this:

[Screenshot: a canonicalization issue flagged by the URL Inspection tool. Source: Google Help]

Once you know which pages are affected, you can fix the canonical tags in their <head> sections so that each page points to the correct URL. It might take a bit of time if you have a large site, but it’s time well invested.

Reason 5: A recent switch to HTTPS

Google actively discourages sites that still don’t have an SSL certificate. So if your site still runs on plain “http://” URLs, you should definitely switch to HTTPS – especially since you can get a certificate for free. However, after you do, you might find that your pages aren’t indexed anymore!

The culprits are once again the canonicals. You see, the transition changes all the URLs on your site from http://… to https://… But if your old HTTP addresses were set as canonical, they will still be marked as the preferred versions – even though the pages themselves have moved!

Solution: First, you’ll need to make sure that you are the verified owner of each and every version of your site (http, https, www, and non-www). Then, you’ll be able to add the HTTPS property in the Search Console, so that your new SSL-enabled page versions are preferred. More info is available here.
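It’s also good practice to redirect all HTTP traffic to HTTPS with permanent (301) redirects, so neither visitors nor Googlebot land on the old addresses. Here’s a minimal sketch for an Apache server, assuming you can edit your site’s .htaccess file:

    # Send every HTTP request to its HTTPS equivalent
    RewriteEngine On
    RewriteCond %{HTTPS} off
    RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]

If your site runs on a different server (nginx, for example), the syntax differs, but the idea is the same.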

Reason 6: Your site is too big for your crawl budget

The crawl budget isn’t about real money – in fact, you can’t pay Google to index your site any faster. Rather, it’s about the search engine’s resources, which are huge but not infinite. Googlebot has to crawl millions of sites per day, so if your site has a large number of pages, it probably won’t crawl them all in one visit. And if you keep generating new URLs, Google may not be able to keep up at all.

Solution: in the Search Console, you’ll find your Crawl Stats, though it will probably take you some time to make sense of the crawl budget statistics.

[Screenshot: crawl stats in Google Search Console. Source: Hackernoon]

But what you really want to do is increase your crawl budget – that is, make Google crawl your site more often. The best way to achieve that is link building: if lots of links on high-quality resources point to your website, Google will conclude that your site must be good, too. In this guide you’ll learn how to build quality external links.

Reason 7: Orphaned pages

As I explained at the beginning, Googlebot discovers new pages by following links from other pages or sites. So if a page isn’t linked to from anywhere, the bot won’t find it. Such a page is called an orphan page.

Solution: Google Search Console only lets you inspect individual pages. To check your whole site for orphans, use a dedicated crawler or plugin such as Screaming Frog SEO Spider, Yoast SEO, or SEMrush. Once you’ve found the orphans, the fix is to link to them from other pages on your site, as shown below.
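Rescuing an orphan can be as simple as adding an ordinary internal link to it from a page that Google already crawls – the URL and anchor text here are just placeholders:

    <a href="https://example.com/orphaned-page/">Read the full guide</a>

Linking from a frequently crawled page (like your homepage or a category page) gets the orphan discovered fastest; listing it in your sitemap helps too.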

Reason 8: Hosting issues

The problem may lie not with you but with your hosting provider. If Googlebot happens to visit your site while it’s down, it won’t crawl anything. Similarly, if a page loads too slowly, the bot will give up and leave. In either case, the page won’t be indexed.

Solution: Speed and uptime matter a lot to SEO in general – and hosting companies differ widely in this regard. You can see how your provider compares to the industry leaders using a web hosting ranking tool. If you find that your hosting company leaves much to be desired, simply switch to a new one. Changing your hosting provider isn’t difficult at all – and it’s not expensive, either. In fact, you might catch a promotion from one of the leading companies and pay even less than before.

This isn’t an exhaustive list of the reasons why Google may fail to list your site – for instance, you could get penalized for violating Google’s guidelines. But in the vast majority of cases, one of the solutions on my list will work for you. Just be patient and try them one by one until the problem is fixed. As an additional bonus, you’ll learn to use Google Search Console, understand how search works, and even practice some coding. After all, having a website means continuous learning.


Natalia Dyatko

Natalia Dyatko is a freelance writer and content marketing/SEO specialist. Follow her on LinkedIn and Twitter for more SEO tips, tutorials, and analysis.
