If you see the "Blocked by robots.txt" issue in Google Search Console, it means Googlebot is unable to crawl certain pages on your website due to restrictions in your robots.txt file. This can affect your SEO performance, especially if important pages are blocked.

In this guide, we'll cover:

- What causes this issue
- How to check which pages are blocked
- A step-by-step method to fix it
What Causes the "Blocked by robots.txt" Issue?
- Blocking search pages: Blogger often blocks label pages (/search/label/) to avoid duplicate content.
- Blocking unnecessary pages: Pages like /feeds/, /admin/, or login pages are often blocked for security and SEO purposes.
- Incorrect robots.txt settings: If important pages are mistakenly blocked, they won’t appear in Google search results.
For example, a label page URL like https://www.creatoryt.in/search/label/Blogging is typically affected.
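To see why this happens, here is the kind of rule Blogger usually places in its default robots.txt. The `Disallow: /search` line blocks every URL under /search, which includes all label pages:

```
User-agent: *
Disallow: /search
Allow: /
```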
How to Check if Your Pages Are Blocked?
- Open Google Search Console
- Go to Indexing > Pages
- Find the section "Why pages aren’t indexed"
- Look for "Blocked by robots.txt"
- Click on the affected pages to see the exact URLs
Alternatively, you can test a specific URL:

- Go to Google Search Console > Settings > Robots.txt Tester
- Enter a blocked URL and click Test
- A red warning means the page is blocked by your robots.txt file
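You can also replicate this check offline with Python's standard urllib.robotparser module. This is a sketch: paste in your own robots.txt content (the sample below mirrors Blogger's usual default) and the URLs you want to test.

```python
from urllib import robotparser

# Sample robots.txt content -- replace with your site's actual file.
# This mirrors Blogger's usual default, which disallows /search.
ROBOTS_TXT = """\
User-agent: *
Disallow: /search
Allow: /
"""

parser = robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# URLs to test -- both are illustrative placeholders.
urls = [
    "https://www.creatoryt.in/search/label/Blogging",
    "https://www.creatoryt.in/2024/01/sample-post.html",
]

for url in urls:
    verdict = "allowed" if parser.can_fetch("Googlebot", url) else "BLOCKED"
    print(f"{verdict}: {url}")
```

With the default file above, the label URL is reported as blocked while a normal post URL is allowed, matching what the tester shows.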
How to Fix the "Blocked by robots.txt" Issue?
- Go to Blogger Dashboard
- Click on Settings
- Scroll to Crawlers and Indexing
- Enable Custom robots.txt
- Copy and paste one of the robots.txt files below

To allow label pages to be crawled, use:

```
User-agent: *
Allow: /search
Allow: /

Sitemap: https://www.creatoryt.in/sitemap.xml
```

To keep label pages blocked (Blogger's default behavior), use:

```
User-agent: *
Disallow: /search
Allow: /

Sitemap: https://www.creatoryt.in/sitemap.xml
```
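Before saving either version, you can sanity-check how each one treats your URLs locally with Python's urllib.robotparser. This is a sketch; the label and post URLs are placeholders for your own pages.

```python
from urllib import robotparser

# The two candidate robots.txt files: one allows /search (label pages),
# the other keeps Blogger's default Disallow on /search.
VARIANTS = {
    "allow label pages": "User-agent: *\nAllow: /search\nAllow: /\n",
    "block label pages": "User-agent: *\nDisallow: /search\nAllow: /\n",
}

# Placeholder URLs -- replace with pages from your own blog.
urls = [
    "https://www.creatoryt.in/search/label/Blogging",
    "https://www.creatoryt.in/2024/01/sample-post.html",
]

for name, robots_txt in VARIANTS.items():
    parser = robotparser.RobotFileParser()
    parser.parse(robots_txt.splitlines())
    print(name)
    for url in urls:
        verdict = "allowed" if parser.can_fetch("Googlebot", url) else "blocked"
        print(f"  {verdict}: {url}")
```

Only the label URL changes verdict between the two files; normal post URLs stay crawlable either way.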
After saving the new robots.txt, ask Google to recheck the affected pages:

- Open Google Search Console
- Go to Indexing > Pages
- Click on the "Blocked by robots.txt" error
- Click Validate Fix
- Wait for Google to recrawl your site (may take 1-2 weeks)
Should You Fix It or Ignore It?
- If your blog posts or important pages are blocked → fix it immediately
- If only label pages (/search/label/) or feeds (/feeds/) are blocked → you can safely ignore it
Frequently Asked Questions (FAQs)
Q1. What happens if I don’t fix the "Blocked by robots.txt" issue?
If important pages are blocked, they won’t appear in Google search results, reducing your organic traffic.
Q2. How long does it take for Google to update the changes?
Google usually updates indexing within 1-2 weeks after you submit a fix in Search Console.
Q3. Should I allow label pages to be indexed?
Allowing label pages can increase traffic, but it may also create duplicate content issues.