UPDATE: I had always (wrongly) thought that Google wouldn’t index pages that were blocked in robots.txt. But John Mueller clarified this for me (thank you John): robots.txt controls crawling BUT NOT indexing. A good explanation of that is here:

—————-

Google sometimes indexes pages DESPITE their being explicitly blocked in robots.txt:

[Screenshot: Google Search Console index coverage report showing blocked pages indexed anyway]

The screenshot above is from the new Search Console index coverage report, and shows that Google has chosen to index 36 pages even though it sees they are explicitly blocked in robots.txt.
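To make the crawling-vs-indexing distinction concrete, here is a minimal sketch (the `/private/` path is a hypothetical example). A robots.txt `Disallow` only stops Googlebot from fetching the page; if Google discovers the URL some other way (e.g. via links), it can still index it. To actually keep a page out of the index, the page must be crawlable and return a `noindex` directive:

```
# robots.txt — controls CRAWLING only, not indexing:
User-agent: *
Disallow: /private/

# To keep a page OUT of the index, do NOT block it in robots.txt,
# and instead serve a noindex directive, either as a meta tag:
<meta name="robots" content="noindex">

# or as an HTTP response header:
X-Robots-Tag: noindex
```

Note the catch: if a page is both blocked in robots.txt and marked `noindex`, Googlebot never fetches it, so it never sees the `noindex` — which is exactly how blocked pages end up indexed anyway.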

Categories: Technical SEO

Michael Cottam

Michael is an independent SEO consultant, specializing in organic SEO, technical SEO implementation, and Google penalty recovery. Michael lives in Portland, Oregon with his son Ben.


Copyright 2025 OzTech, Inc. All rights reserved.