Be Confident! Face Custom Robots.txt for BlogSpot Positively.


Robots.txt lets you keep web crawlers and other web robots from accessing all or part of a publicly viewable website. Essentially, a robots.txt file on a website serves as a request that specified robots ignore specified files or directories when crawling the site.
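As a quick illustration (a generic sketch, not Blogger-specific, and /private/ is just a placeholder directory), a minimal robots.txt asks every crawler to skip one directory while leaving the rest of the site open:

# applies to every crawler
User-agent: *
# please do not crawl anything under /private/
Disallow: /private/
# everything else may be crawled
Allow: /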




If you blog on the WordPress.org platform, you can create the robots.txt file yourself through the File Manager in cPanel. If you are a Blogspot user, however, a few steps are needed before a custom robots.txt can be activated (it is disabled by default).

1. Login to your Blogger account and go to your blog dashboard.

2. Click on Settings >> Search preferences



3. Click Edit next to Custom robots.txt, then choose Yes to enable custom robots.txt content



4. Then fill in the text box with the following directives

# Google AdSense crawler: allow everything so ads can be matched to content
User-agent: Mediapartners-Google
Allow: /

# Googlebot: block the duplicate mobile/desktop parameter URLs (?m=1 / ?m=0)
User-agent: Googlebot
Disallow: /?m=1
Disallow: /?m=0
Disallow: /*?m=1
Disallow: /*?m=0

# Twitter's crawler: allow everything
User-agent: Twitterbot
Allow: /

# All other crawlers: skip search/label result pages, static pages and dynamic views
User-agent: *
Disallow: /search
Disallow: /search?*
Disallow: /p/*
Disallow: /view/*
Allow: /

Sitemap: http://blogsnucleus.blogspot.com/feeds/posts/default?orderby=UPDATED

 Note: replace blogsnucleus.blogspot.com in the Sitemap line with your own blog's address

5. Save your changes
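Once saved, Blogger serves the file at the root of your blog's address, so you can check the result by opening it directly in your browser (shown with the example blog used above; substitute your own address):

http://blogsnucleus.blogspot.com/robots.txt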


For websites with several subdomains, each subdomain must have its own robots.txt file. If blogsnucleus.com had a robots.txt file but sub.blogsnucleus.com did not, the rules that apply to blogsnucleus.com would not apply to sub.blogsnucleus.com.
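In other words, keeping the hypothetical subdomain from the example above, each host publishes its own file at its own root, and crawlers read them independently:

http://blogsnucleus.com/robots.txt
http://sub.blogsnucleus.com/robots.txt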

If you liked this post, click here and enter your email address to follow new posts.

