How to unblock pages blocked by robots.txt

Pages that aren't indexed can't be served on Google.
To unblock pages blocked by robots.txt, follow these steps:
1. Check the robots.txt file: Identify the specific rules blocking search engine crawlers in the robots.txt file, which lives at the root of your domain (for example, https://example.com/robots.txt).
2. Remove or modify the rules: Edit the robots.txt file to remove or relax the rules blocking crawlers. For example, delete a blanket "Disallow: /" rule (robots.txt permits everything by default), or narrow a Disallow rule so it no longer covers the pages you want crawled.
3. Test the robots.txt file: Use the robots.txt report in Google Search Console (which replaced the standalone Robots.txt Tester) or third-party SEO tools to confirm the changes allow crawlers to reach the previously blocked pages.
4. Submit a request for recrawl: Inform search engines about the changes by requesting indexing through the URL Inspection tool in Google Search Console or the URL submission feature in Bing Webmaster Tools.
5. Verify website ownership: Ensure you have verified website ownership in search engine webmaster tools to submit recrawl requests.
6. Wait for recrawl: Allow search engines time to recrawl and reindex the previously blocked pages.
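As a sketch of step 2, here is a robots.txt that blocks an entire site next to a corrected version that only blocks one directory (the domain and the /private/ path are placeholders):

```
# Before: blocks all crawlers from the entire site
User-agent: *
Disallow: /

# After: allow everything except one private directory
User-agent: *
Disallow: /private/

Sitemap: https://example.com/sitemap.xml
```

Because allowing is the default, simply deleting the "Disallow: /" line would also unblock the site.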
Additionally, consider:
- Using meta tags: To keep a specific page out of the index, use a robots meta tag such as <meta name="robots" content="noindex"> instead of robots.txt. Note that robots.txt only blocks crawling; a blocked page can still be indexed (without its content) if other sites link to it, and Google can only see a noindex tag on pages it is allowed to crawl.
- Sitemap submission: Submit a sitemap to help search engines discover and crawl all pages.
- Regularly update content: Keep content fresh to encourage search engines to recrawl and reindex pages.
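For the sitemap suggestion above, a minimal sitemap file looks like this (the URL and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2024-01-01</lastmod>
  </url>
</urlset>
```

Upload it to your site's root and submit its URL in Google Search Console or Bing Webmaster Tools.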
Remember to carefully review and test changes to avoid inadvertently blocking important pages.
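Testing which URLs a robots.txt blocks can be scripted. A minimal sketch using Python's standard urllib.robotparser; the rules and URLs here are illustrative (in practice you would point set_url() at your live robots.txt and call read()):

```python
from urllib.robotparser import RobotFileParser

# Illustrative robots.txt content; replace with your site's actual rules
rules = """
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Googlebot may fetch the homepage, but not anything under /private/
print(parser.can_fetch("Googlebot", "https://example.com/"))           # True
print(parser.can_fetch("Googlebot", "https://example.com/private/x"))  # False
```

Running this check for each important URL before and after editing robots.txt makes it easy to confirm you unblocked the right pages without opening up anything you meant to keep closed.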
Pages that aren't indexed by Google cannot be served in Google's search results. Crawling is how Google discovers content; indexing is the process of analyzing and storing that content so it can be matched against search queries.
Here's what happens when a page isn't indexed:
1. Not in the index: The page's content isn't stored in Google's index, so it can't be matched against any search query.
2. No search visibility: The page won't appear in search results, even if it's relevant to the search query.
3. No organic traffic: Since the page isn't indexed, it won't receive organic traffic from Google search.
To make a page indexable:
1. Ensure crawlability: Make sure Google's crawlers can access and crawl the page.
2. Use proper meta tags: Make sure the page doesn't carry a "noindex" robots meta tag or X-Robots-Tag HTTP header; "index" and "follow" are the defaults and don't need to be declared.
3. Submit a sitemap: Submit a sitemap to help Google discover and crawl all pages.
4. Regularly update content: Keep content fresh to encourage Google to recrawl and reindex pages.
By following these steps, you can increase the chances of getting your pages indexed and served in Google's search results.
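As an illustration of step 2 above, nothing is needed to opt a page in, since indexing is the default; a blocking directive, if present, would look like this in the page's <head> (hypothetical snippet):

```html
<!-- Remove a tag like this if the page SHOULD be indexed. -->
<!-- Keep or add it only on pages you want excluded from the index: -->
<meta name="robots" content="noindex, nofollow">
```

The same directive can also be sent as an X-Robots-Tag HTTP response header, which is useful for non-HTML files such as PDFs.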