Google Search Console (GSC) sometimes provides “Discovered - currently not indexed” as a reason for a webpage not being indexed in Google’s search results. This notification means that Google’s crawlers have located a URL but haven’t yet crawled it or indexed it in search.
If your website is new or small, it’s normal for this to be a reason that your webpages aren’t yet indexed. Google’s crawlers will dedicate only so many resources to crawling your website and indexing your pages. However, if you’re regularly running into these issues when you publish new pages, there may be other issues you need to address.
There are a number of things you can do to make Google crawl and index your website faster. In this article, we’ll provide a step-by-step process for getting your “Discovered - currently not indexed” webpages indexed.
Step 1: Request Indexing
For websites with a relatively small number of “Discovered - currently not indexed” pages, a good first step is to request that Google index your pages. You can do this directly in GSC.
To get started, inspect the URL in question by clicking on the row and then clicking on Inspect URL on the right-hand side:
Then click on Request Indexing after the URL has been inspected:
It might take a minute or two for GSC to test the URL. After it’s finished, you’ll see a message confirming that indexing has been requested:
Amazing! While this process is easy, it’s often a bit finicky. Google will allow you to submit only a certain number of URLs for indexing per day, typically 10 to 15.
In my experience, you’ll often need to request indexing multiple times for certain pages. If this happens to you, know that there is likely a bigger issue that you need to address.
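Because of that daily cap, it can help to plan your manual submissions in daily batches rather than trying to push an entire backlog through at once. Below is a minimal sketch; the quota of 10 per day and the example URLs are assumptions, not official limits:

```python
# Split a backlog of unindexed URLs into daily batches that respect
# GSC's approximate manual-submission quota (assumed here to be 10/day).

DAILY_QUOTA = 10  # assumption: GSC typically allows ~10-15 requests per day

def daily_batches(urls, quota=DAILY_QUOTA):
    """Yield one list of URLs per day of manual submissions."""
    for start in range(0, len(urls), quota):
        yield urls[start:start + quota]

backlog = [f"https://example.com/post-{i}" for i in range(25)]
for day, batch in enumerate(daily_batches(backlog), start=1):
    print(f"Day {day}: submit {len(batch)} URLs")
```

This keeps you under the quota and gives each batch a chance to be processed before you request the next one.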
Step 2: Improve Internal Link Structure
An internal link is simply a link from one page on your website to another. Internal links are a critically important component of technical SEO, and they play a key role in signaling to Google which pages on your website should be indexed.
For Google, internal links communicate which pages on your website are important. They also help Google understand how your pages are interconnected and related to one another.
If you’ve recently published a large amount of new content, chances are that these pages don’t have many internal links pointing to them from other pages on your website. Or even worse, some of these pages might be orphaned, or have no internal links pointing to them.
If your pages still need to be indexed after you request indexing in GSC, the best next step is to place internal links to these pages on your website’s other pages. You can add contextual internal links in the body of other pages — for example, in blog posts. You can also add internal links to these pages from your website’s navigational elements — for example, the header or footer section.
At Positional, our Internals toolset finds missing internal link opportunities across your website, on both existing pages and the new pages you’re creating. Positional can be helpful for internally linking your new pages and indexing them faster, but you can also do this process manually.
As a best practice, you shouldn’t force an internal link. Instead, do what’s best for the website visitor and place internal links only where they’re helpful. When creating internal links, you’ll want to use targeted anchor text to further communicate to Google what your linked page is about and what keywords it should rank for in organic search.
If you have a very large website, you’ll want to ensure that your site architecture is clear and that it’s easy for Google’s crawlers to navigate your pages.
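If you want to audit this yourself, finding orphaned pages comes down to building a link graph of your site and looking for pages with zero inbound internal links. A minimal sketch, assuming you already have a crawl map of which pages link to which (the URLs here are toy data):

```python
# Find orphaned pages: URLs on the site that receive no internal links
# from any other page. In practice, build this graph from a site crawl
# or a crawler export; the data below is illustrative.

site_links = {
    "/": ["/blog", "/pricing"],
    "/blog": ["/blog/post-a", "/"],
    "/blog/post-a": ["/pricing"],
    "/blog/post-b": [],        # newly published, not linked from anywhere
    "/pricing": ["/"],
}

def find_orphans(link_graph):
    """Return pages with zero inbound internal links."""
    linked_to = {target for targets in link_graph.values() for target in targets}
    return sorted(page for page in link_graph if page not in linked_to)

print(find_orphans(site_links))  # → ['/blog/post-b']
```

Any page this surfaces is a strong candidate for a new contextual internal link from a related post.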
Step 3: Removing Crawl Budget Friction
In a 2021 Google SEO office hours hangout, John Mueller from Google’s search team was asked directly about “Discovered - currently not indexed” issues. Specifically, he was asked whether the issue was typically caused by a website’s crawl budget, meaning the amount of resources Google dedicates to crawling and indexing the website. John stated that this wouldn’t be the case for most smaller websites; however, for websites with a very large number of pages, in the millions, he said crawl budget could be an issue.
If you’re managing a large website, there may be a few things you can do to reduce friction and increase your crawl budget. If you manage a small website with only a few hundred or a couple thousand pages, you can skip this section and move on to Step 4.
- Reduce redirecting internal links: Have you recently 301 redirected a page but still link to the old URL from other pages? Updating those internal links to point to the final destination URL can help.
- Duplicate versions of your pages: Are you serving nearly identical copies of your pages in multiple places, for example, on both www. and non-www.? If so, you should pick a primary location of your pages and redirect the non-primary version to it. The same goes for HTTPS and HTTP.
- Speed things up: If you use resource-heavy themes or plugins, you might consider removing them to improve server response time and page loading speed, which impact crawling. In addition, consider using a content delivery network (CDN) as a way to improve server initial response time.
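One piece of the first item that’s easy to automate is finding internal links that point at redirecting URLs. A sketch, assuming you have a crawl export mapping URLs to HTTP status codes (the data here is illustrative):

```python
# Flag internal links whose target URL 301/302-redirects, given a map
# of URL -> HTTP status collected from a crawl. A real audit would pull
# these statuses from a crawler export or an HTTP client; toy data here.

statuses = {
    "https://example.com/old-page": 301,
    "https://example.com/new-page": 200,
    "https://example.com/about": 200,
}

internal_links = [
    ("https://example.com/", "https://example.com/old-page"),
    ("https://example.com/", "https://example.com/about"),
]

def links_to_redirects(links, status_map):
    """Return (source, target) pairs where the target redirects."""
    return [(src, dst) for src, dst in links
            if status_map.get(dst) in (301, 302)]

for src, dst in links_to_redirects(internal_links, statuses):
    print(f"{src} links to redirecting URL {dst}")
```

Each pair this returns is an internal link you can update to point directly at the redirect’s destination.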
Step 4: Improve Content Quality
Content quality issues are a common reason for Google discovering but not indexing your pages.
In the previously mentioned Google SEO office hours, John was asked whether content quality issues on a website might cause the “Discovered - currently not indexed” issue. He replied that Google likely wouldn’t be determining an individual URL's quality and deciding not to index it, but said that Google looks at a website’s overall quality — and that if there are large amounts of low-quality pages, Google’s crawlers may not want to index pages from the website.
John reiterated that if a large number of their pages are failing to index, small websites shouldn’t focus on technical issues but should instead look at potential quality issues.
There are a few different types of content quality issues:
AI-Generated Content
Companies publishing large amounts of low-quality AI-generated content may face indexing challenges.
Google has made mixed and nuanced statements about using AI-generated content. It’s clear that there’s a place for AI-generated content, but if you’re using AI in your content creation process, you should use it as a starting point in the editorial process and not for mass publishing large amounts of content to your website — or, as Google would say, using AI-generated content as a means to manipulate search results.
Google is increasingly asking website publishers to create uniquely valuable content. There are many ways to make a piece of content uniquely valuable — for example, by using expert quotes, first-party data, high-quality graphics, or videos.
If you’ve published a lot of AI-generated content and are experiencing indexing issues, go back to that content and improve it or delete it altogether, as it may be causing Google to judge your website as low quality.
Also, if you’ve published a large amount of AI-generated content to a subdomain as a way to reduce the risk of this content impacting the perceived quality of your primary domain, know that Gary Illyes from Google’s search team has recently stated that Google looks at sitewide signals to determine quality, which could include content on a subdomain.
Thin Content
Thin content is low-quality content that provides little or no value for searchers. From Google’s perspective, showing thin content in search results would reduce the quality of its experience for its customers (the searchers).
Thin content is often described as content with too few words on a page. While that is one type of thin content, there are other types of thin content, including low-quality programmatically created content, AI-generated content, duplicate content, doorway pages, and overly promotional content.
As with many things in SEO, quality is more important than quantity. If you’ve published many pages that aren’t helpful, these pages might be considered thin content.
And if you have a large number of thin pages, that might lower the perceived quality of your entire website to Google’s algorithms; as a result, Google might be reluctant to crawl and index new pages on your website.
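A crude first-pass triage is to flag pages below a word-count threshold, keeping in mind, as noted above, that length is only one signal of thinness. A sketch with an assumed threshold and toy page data:

```python
# First-pass thin-content triage: flag pages under an assumed word-count
# threshold. Length alone doesn't make a page thin or helpful, so treat
# the output as a starting list for manual review, not a verdict.

WORD_THRESHOLD = 300  # assumption: tune for your site and content type

pages = {
    "/blog/deep-dive": "word " * 1200,  # stand-in for a long article
    "/blog/stub": "word " * 80,         # stand-in for a very short page
}

def flag_short_pages(page_texts, threshold=WORD_THRESHOLD):
    """Return paths whose body text falls under the word threshold."""
    return sorted(path for path, text in page_texts.items()
                  if len(text.split()) < threshold)

print(flag_short_pages(pages))  # → ['/blog/stub']
```

From there, review each flagged page and decide whether to expand it, consolidate it into another page, or remove it.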
Duplicate Content
Duplicate content is content that duplicates other content, either elsewhere on your own website or on a third-party website. Duplicate content can cause a number of SEO issues, including indexing issues and keyword cannibalization.
If you have a large number of pages with duplicate content or significantly duplicative content, that may cause Google to perceive the quality of your website as low. If you’re running a programmatic SEO strategy, you may have substantial amounts of similar content from one page to another.
If you’ve copied content from one page to another to move faster and are experiencing indexing issues, you’ll want to return to those previously published pages and make them substantially unique.
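To spot near-duplicates at a small scale, you can compare page texts with a word-level similarity ratio from Python’s standard library. This is a rough sketch with illustrative texts and thresholds; dedicated SEO tools use more robust techniques such as shingling:

```python
# Rough near-duplicate check between pages using word-level similarity
# from the standard library. Texts and thresholds are illustrative.

from difflib import SequenceMatcher

def similarity(text_a, text_b):
    """Ratio in [0, 1]; higher means more duplicated wording."""
    return SequenceMatcher(None, text_a.split(), text_b.split()).ratio()

page_a = "best running shoes for beginners reviewed and ranked for 2024"
page_b = "best running shoes for beginners reviewed and ranked for 2023"
page_c = "a guide to choosing a trail hiking backpack"

assert similarity(page_a, page_b) > 0.85  # near-duplicates
assert similarity(page_a, page_c) < 0.30  # distinct content
```

Pairs that score high are the pages to rewrite first so each one targets its own distinct topic and keywords.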
Step 5: Improve Domain Authority
Domain authority is a measure of the strength and quality of external backlinks pointing to your website. New or smaller websites will generally have low domain authority, whereas large and established websites will often have hundreds or even thousands of backlinks and, as a result, a high domain authority.
The concept of domain authority is closely related to PageRank, the original system Google used to measure the quality and strength of the links pointing to a page.
By increasing your website’s domain authority, you can signal to Google that your website is an authoritative and high-quality resource. Google wants to index authoritative and high-quality websites.
By building backlinks to your website, you can positively impact the speed and rate at which Google indexes your new webpages.
There are many different types of backlinks and many different ways to build backlinks. As a general rule, quality is always more important than quantity. You’ll want to build backlinks from reputable and authoritative websites in your industry. You don’t want to buy a large number of backlinks from low-quality websites or user-generated websites. And you’ll want to be very careful if you’re working with a link-building agency to do this work for you.
Final Thoughts
If you are running into the “Discovered - currently not indexed” issue in Google Search Console, you are not alone. Newer and smaller websites often run into this issue.
As a starting point, if you have a low number of these pages, you can manually request indexing within Google Search Console.
If your pages still aren’t indexed after you request indexing, there is likely a more significant issue with your website’s content. Start by focusing on internal linking as a way to show Google which pages you care about.
From there, you’ll want to be very critical of your pages' quality. If you’re using large amounts of AI-generated content, have a large number of thin pages, or have run into a duplicate content issue, know that these issues are fixable. After fixing these issues, your pages should be indexed appropriately.
Building backlinks and increasing your website’s domain authority can also positively impact indexing and the rate at which Google crawls your website.
And for very large websites, with tens of thousands or millions of pages, you can make your website more efficient for Google to crawl.
At Positional, we’ve built a number of tools for content marketing and SEO teams, including our Internals toolset for internal linking, which is particularly helpful for internally linking existing pages on your website and the new content you’re creating. We have a number of other tools for everything from content optimization to duplicate content detection and analytics.