I try to pay attention to the different coverage issues in Google Search Console. One thing I have been battling and losing is that it keeps flagging my outgoing links to casinos as "Indexed, though blocked by robots.txt". All my links are /go/betonline, /go/5dimes, etc. I have always blocked these URLs in my robots.txt file and then redirect them to the actual destination URL in my .htaccess file.
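Here's roughly what I've got, in case it helps (shortened, and the destination URLs here are just examples, not my exact affiliate links):

# robots.txt - keep crawlers out of all the /go/ redirect links
User-agent: *
Disallow: /go/

# .htaccess - send the cloaked links on to the real sites
Redirect 301 /go/betonline https://www.betonline.ag/
Redirect 301 /go/5dimes https://www.5dimes.eu/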
So why does Google still say they are indexed but blocked by the robots file? If they are blocked, shouldn't Google know not to index them? I have even put <meta name="robots" content="noindex"/> in the /go/5dimes/ files, hoping that would help.
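The tag sits in the head of each of those /go/ pages, something like this (simplified):

<!-- /go/5dimes/ page - the noindex tag I added -->
<head>
<meta name="robots" content="noindex"/>
</head>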
I know it is probably not a big deal, but I just wish they would quit showing as "Valid with Warnings" in my coverage section.
Thanks,
Travis

