Google indexing pages despite being explicitly blocked in robots.txt

UPDATE: I had always (wrongly) thought that Google wouldn’t index pages that were blocked in robots.txt. But John Mueller clarified this for me (thank you John): robots.txt controls crawling BUT NOT indexing. A good explanation of that is here:
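To make the distinction concrete, here is a sketch of the two different mechanisms (the site and paths are hypothetical placeholders). A robots.txt Disallow rule only tells Google not to crawl a URL; if the URL is linked from elsewhere, it can still be indexed. To keep a page out of the index, it needs a noindex directive, and for Google to see that directive the page must NOT be blocked in robots.txt, since Google has to crawl it to read the tag or header:

```
# robots.txt — controls CRAWLING only.
# Google may still index /private/ URLs it discovers via links.
User-agent: *
Disallow: /private/

# To control INDEXING instead, leave the page crawlable and add either:

# (a) a meta tag in the page's <head>:
<meta name="robots" content="noindex">

# (b) or an HTTP response header:
X-Robots-Tag: noindex
```

The counterintuitive consequence is that blocking a page in robots.txt can prevent Google from ever seeing its noindex directive, which is exactly how a "blocked" page ends up indexed anyway.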

—————-

Google is now sometimes indexing pages DESPITE their being explicitly blocked in robots.txt:

[Screenshot: Google Search Console report showing blocked pages indexed anyway]

The screenshot above is from the new Search Console index coverage report. It shows that Google has chosen to index 36 pages even though it can see they are explicitly blocked in robots.txt.