  1. #1
    TravG is offline Private Member
    Join Date
    September 2008
    Posts
    2,069
    Thanks
    34
    Thanked 179 Times in 131 Posts

    Default Coverage Issues in Google Search Console

    I try to pay attention to the different coverage issues in Google Search Console. One thing I have been battling and losing is that it keeps showing my outgoing links to casinos as "Indexed, though blocked by robots.txt". All my links are /go/betonline, /go/5dimes, etc. I have always blocked these URLs in my robots.txt file and then redirect them to the actual linking URL in my .htaccess file.
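    For reference, the setup looks roughly like this — the redirect targets and rewrite patterns below are simplified placeholders, not my exact rules:

    ```
    # robots.txt — ask crawlers not to fetch anything under /go/
    User-agent: *
    Disallow: /go/
    ```

    ```apache
    # .htaccess (Apache mod_rewrite) — send the /go/ paths to the real sites.
    # Destination URLs here are placeholders for illustration.
    RewriteEngine On
    RewriteRule ^go/betonline/?$ https://example-betonline.com/ [R=302,L]
    RewriteRule ^go/5dimes/?$ https://example-5dimes.com/ [R=302,L]
    ```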

    So why does Google still say they are indexed but blocked by the robots.txt file? If they are blocked, shouldn't Google know not to index them? I have even put <meta name="robots" content="noindex"/> in the /go/5dimes/ files, hoping that would help.

    I know it is probably not a big deal, but I just wish they would stop showing as "Valid with warnings" in my Coverage report.

    Thanks,

    Travis

  2. #2
    ddm
    ddm is offline Former Member
    Join Date
    July 2006
    Posts
    1,125
    Thanks
    418
    Thanked 470 Times in 287 Posts

    Default

    If you remove them from the search index (via the removal tool in GSC), they should stay out, because when Google next tries to recrawl them it can't, due to the robots.txt directive.

  3. #3
    MrFWebmaster is offline Public Member
    Join Date
    February 2019
    Posts
    14
    Thanks
    0
    Thanked 3 Times in 3 Posts

    Default

    Ensure the links pointing to these URLs have the attribute rel="nofollow" as well.
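    Something like this on the outgoing anchors (link text and href taken from the /go/ paths mentioned above):

    ```html
    <!-- nofollow tells Google not to pass signals through the link -->
    <a href="/go/5dimes" rel="nofollow">5Dimes</a>
    ```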

  4. #4
    Michael Martinez is offline Public Member
    Join Date
    March 2017
    Location
    USA
    Posts
    79
    Thanks
    9
    Thanked 55 Times in 31 Posts

    Default

    Quote Originally Posted by TravG View Post
    So why does Google still say they are indexed but blocked by the robots.txt file? If they are blocked, shouldn't Google know not to index them? I have even put <meta name="robots" content="noindex"/> in the /go/5dimes/ files, hoping that would help.
    Google has always indexed URLs even when it could not crawl them. You used to see them all the time; we called them "URL-only listings". While they don't appear in search results much anymore, Google still "knows" about them, because they are found on other pages that Google crawls.

    These reports don't imply you have a problem on your site. You don't need to do anything about them.
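    One wrinkle worth noting about the noindex tag mentioned above: since the /go/ URLs are disallowed in robots.txt, Googlebot never fetches them, so it never sees the meta tag — the two directives work against each other. If someone really wanted Google to drop such URLs, one approach (a sketch, assuming Apache with mod_headers, and it would require removing the robots.txt block so the header can be seen) is a server-level noindex header:

    ```apache
    # Illustrative only: send "noindex" for everything under /go/.
    # Google only honors this if it can actually crawl the URL,
    # so the robots.txt Disallow would have to be lifted first.
    <IfModule mod_headers.c>
      <If "%{REQUEST_URI} =~ m#^/go/#">
        Header set X-Robots-Tag "noindex"
      </If>
    </IfModule>
    ```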
