Be Confident: Setting Up a Custom Robots.txt for BlogSpot

A robots.txt file lets you keep web crawlers (web robots) away from all or part of a publicly viewable website. Essentially, a robots.txt file on a website serves as a request that the specified robots ignore the specified files or directories when crawling the site.
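To illustrate how a crawler interprets these rules, Python's standard `urllib.robotparser` module can check whether a given URL is allowed. This is a minimal sketch; the rules and the `example.com` URLs are hypothetical examples, not your blog's actual configuration.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt asking all robots to skip the /private/ directory
rules = """\
User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("*", "https://example.com/private/page.html"))  # False
print(rp.can_fetch("*", "https://example.com/public/page.html"))   # True
```

A well-behaved crawler performs exactly this kind of check before fetching each URL; robots.txt is a request, not an enforcement mechanism.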

If your blog is self-hosted, you can create the robots.txt file yourself via the file manager in cPanel. If you are a Blogspot user, however, there are a few steps to follow before you can activate a custom robots.txt (it is disabled by default).

1. Login to your Blogger account and go to your blog dashboard.

2. Click on Settings >> Search preferences


3. Next to Custom robots.txt, click Edit, then choose Yes to enable custom robots.txt content


4. Then fill the field with the following rules

User-agent: Mediapartners-Google
User-agent: Googlebot
Disallow: /?m=1
Disallow: /?m=0
Disallow: /*?m=1
Disallow: /*?m=0

User-agent: Twitterbot
User-agent: *
Disallow: /search
Disallow: /p/*
Disallow: /view/*
Allow: /

Note: replace any blog-specific values in the rules with your own blog's details

5. Save your changes
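Once saved, you can sanity-check rules like those in step 4 before relying on them, again with Python's standard `urllib.robotparser`. One caveat: the standard-library parser matches paths as literal prefixes and does not expand `*` wildcards the way Googlebot does, so this is only a rough check of the non-wildcard rules. The `yourblog.blogspot.com` URLs below are placeholders.

```python
from urllib.robotparser import RobotFileParser

# Rules equivalent to the custom robots.txt shown in step 4
rules = """\
User-agent: Mediapartners-Google
User-agent: Googlebot
Disallow: /?m=1
Disallow: /?m=0
Disallow: /*?m=1
Disallow: /*?m=0

User-agent: Twitterbot
User-agent: *
Disallow: /search
Disallow: /p/*
Disallow: /view/*
Allow: /
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Generic crawlers may fetch post pages but not search-result pages
print(rp.can_fetch("*", "https://yourblog.blogspot.com/2024/01/post.html"))  # True
print(rp.can_fetch("*", "https://yourblog.blogspot.com/search?q=test"))      # False
```

For the wildcard rules (`/*?m=1` and similar), test with Google Search Console's robots.txt report instead, which uses Google's own matching logic.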

For websites with several subdomains, each subdomain must have its own robots.txt file. If the main domain has a robots.txt file but a subdomain does not, the rules that apply to the main domain will not apply to the subdomain.
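The reason is that crawlers always look for robots.txt at the root of the exact host they are fetching from. A small sketch makes this concrete; `blog.example.com` and `example.com` are hypothetical hosts used only for illustration.

```python
from urllib.parse import urlsplit, urlunsplit

def robots_url(page_url: str) -> str:
    """Build the robots.txt URL for the host serving page_url.

    Each host -- including every subdomain -- gets its own robots.txt.
    """
    parts = urlsplit(page_url)
    return urlunsplit((parts.scheme, parts.netloc, "/robots.txt", "", ""))

print(robots_url("https://blog.example.com/post/1"))  # https://blog.example.com/robots.txt
print(robots_url("https://example.com/about"))        # https://example.com/robots.txt
```

Because the two hosts resolve to different robots.txt URLs, rules published on the main domain never carry over to the subdomain.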

If you liked this post, click here and enter your email address to follow future updates.
