  1. #1
    Jonui's Avatar
    Jonui is offline Public Member
    Join Date
    October 2020
    Location
    Jakarta
    Posts
    16
    Thanks
    3
    Thanked 0 Times in 0 Posts

    Question How to hide / block backlink crawlers?

    Hello everyone,

    I want to ask how to block the robots from Ahrefs / Majestic / Semrush / Moz.

    My question is: what do I need to put in .htaccess to hide from / block them?

    Thank you

  2. #2
    robyroy is offline Private Member
    Join Date
    March 2009
    Posts
    161
    Thanks
    7
    Thanked 18 Times in 16 Posts

    Default

    Try something like this in robots.txt:

    # Ahrefs
    User-agent: AhrefsBot
    Disallow: /

    # Majestic
    User-agent: MJ12bot
    Disallow: /

    # Semrush
    User-agent: SemrushBot
    Disallow: /

    Or via .htaccess:

    RewriteEngine On
    # Match the crawlers' user agents, case-insensitively
    RewriteCond %{HTTP_USER_AGENT} AhrefsBot [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} MJ12bot [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} SemrushBot [NC]
    # Answer every request from them with 403 Forbidden
    RewriteRule . - [F,L]

    (Note there is no [OR] on the last condition - a trailing [OR] can make the rule fire for everything.)

    I don't know off-hand which bot Moz uses, but I'm pretty sure you can find it.
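    If memory serves, Moz crawls with rogerbot and DotBot, but treat those names as an assumption and double-check them against Moz's documentation. If they're right, the same pattern covers them in robots.txt:

    # Moz (bot names assumed, not verified here)
    User-agent: rogerbot
    Disallow: /

    User-agent: dotbot
    Disallow: /

    Or add them as extra RewriteCond lines in the .htaccess block above.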

    Hope that helps a little.

  3. The Following User Says Thank You to robyroy For This Useful Post:

    Jonui (17 October 2020)

  4. #3
    artrust99's Avatar
    artrust99 is offline Private Member
    Join Date
    September 2020
    Posts
    76
    Thanks
    13
    Thanked 18 Times in 11 Posts

    Default

    IRON WITHIN IRON WITHOUT
    Ukrainian online casinos
    ru.slots43.com & Cosmolot24 | Best Australian online casinos

  5. The Following User Says Thank You to artrust99 For This Useful Post:

    Jonui (17 October 2020)

  6. #4
    universal4's Avatar
    universal4 is offline Forum Administrator
    Join Date
    July 2003
    Location
    Courage is being scared to death...and saddling up anyway. John Wayne
    Posts
    26,906
    Thanks
    1,851
    Thanked 7,706 Times in 4,858 Posts

    Default

    What I find ironic is that these guerrilla-marketing outfits claim to do something they don't do themselves.

    What they claim they do in their own robots.txt file is not what is actually in their own robots.txt file.

    Rick
    Universal4

  7. The Following User Says Thank You to universal4 For This Useful Post:

    robyroy (18 October 2020)

  8. #5
    TheGooner's Avatar
    TheGooner is online now Private Member
    Join Date
    March 2007
    Location
    New Zealand
    Posts
    4,192
    Thanks
    1,908
    Thanked 4,131 Times in 1,961 Posts

    Default

    Here is a little trick I've tried in .htaccess:

    # Skip the search engines and crawlers we actually want to keep
    RewriteCond %{HTTP_USER_AGENT} !(Google|msnbot|bingbot|facebook|duckduck) [NC]
    # ...but catch anything calling itself a bot, spider or crawler
    RewriteCond %{HTTP_USER_AGENT} (Bot|Spider|Crawl) [NC]
    # ...unless it is already asking for the dummy page (REQUEST_URI starts with a slash)
    RewriteCond %{REQUEST_URI} !^/bots\.htm
    RewriteRule .* /bots.htm [L]

    That basically says:
    - if it's not Google, MSN, Bing, Facebook or DuckDuckGo
    - and its user agent contains the word bot, spider or crawl
    - and it's not already asking for the file called bots.htm
    - then send it to bots.htm

    This doesn't give them an error code; all they ever see is bots.htm, no matter what they ask for.
    That hopefully screws up their plans, and they don't notice they've been redirected to a dummy file.

  9. The Following 3 Users Say Thank You to TheGooner For This Useful Post:

    casinoportal (18 October 2020), Michael Martinez (18 October 2020), universal4 (17 October 2020)

  10. #6
    universal4's Avatar
    universal4 is offline Forum Administrator
    Join Date
    July 2003
    Location
    Courage is being scared to death...and saddling up anyway. John Wayne
    Posts
    26,906
    Thanks
    1,851
    Thanked 7,706 Times in 4,858 Posts

    Default

    Ha....and if you wanna be mean, make the bots.htm for certain useragents an infinite loop....or at least one that goes on for a few days....lol

    Rick
    Universal4

  11. #7
    Michael Martinez is offline Public Member
    Join Date
    March 2017
    Location
    USA
    Posts
    38
    Thanks
    5
    Thanked 18 Times in 11 Posts

    Default

    Quote Originally Posted by universal4 View Post
    Ha....and if you wanna be mean, make the bots.htm for certain useragents an infinite loop....or at least one that goes on for a few days....lol

    Rick
    Universal4
    Normally I wouldn't mind sending rogue bots on a wild crawl, but an infinite loop would probably crash your server or max out your shared hosting account. But for that, I'd give your half-joking idea a thumbs up (if I could).
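    A gentler way to get the same "goes on for days" effect, sketched here on the assumption that you're on Apache 2.4 with mod_ratelimit enabled and using the bots.htm file from the posts above, is to throttle the dummy page instead of looping it:

    # Serve bots.htm at roughly 1 KB/s so a rogue crawler wastes its time
    # waiting on a trickle of bytes, without tying up your own server.
    <Files "bots.htm">
        SetOutputFilter RATE_LIMIT
        SetEnv rate-limit 1
    </Files>

    The bot sits there getting nowhere while your server barely notices, which is the opposite of what an endlessly self-linking page would do to you.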
