Sunday, July 17, 2016

Get Custom Robots.txt Code for your Blogger blog

First of all, you need to know what a custom robots.txt is and how to use it, so here I am going to explain why a custom robots.txt is necessary for your blog. A custom robots.txt sets out rules and guidelines for search engine crawlers: it tells a crawler which pages of your blog should be crawled and indexed and which should not. So it plays a very important role in getting your site or blog indexed by search engines such as Google.
Blogger has a very easy interface for managing posts and SEO (Search Engine Optimization), and it allows you to set a custom robots.txt for your blog. Here I am going to show you how to get and set a custom robots.txt for your blog. Follow these steps:

1) Copy The Following Code

User-agent: Mediapartners-Google
Disallow: 

User-agent: *
Disallow: /search
Allow: /

Sitemap: http://toolsmeadow.blogspot.com/sitemap.xml

2) Change The URL 

Change the sitemap URL in the code above so that it points to your own blog instead of toolsmeadow.blogspot.com, as shown below.
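
Here is a minimal sketch of what the finished file would look like, assuming your blog lived at yourblogname.blogspot.com (a placeholder address, not a real blog):

User-agent: Mediapartners-Google
Disallow: 

User-agent: *
Disallow: /search
Allow: /

# "yourblogname" below is a placeholder; use your blog's actual subdomain
Sitemap: http://yourblogname.blogspot.com/sitemap.xml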


3) Set in Blogger 

Now go to your Blogger dashboard, then click Settings > Search preferences, and there you will see the Custom robots.txt option, as shown below.
[Screenshot: the Custom robots.txt option under Settings > Search preferences]

So you need to click Edit and select the Yes option, then paste the code you copied (for example from Notepad) into the box and click the save button.
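
If you want to double-check that the rules behave the way you expect once they are live, the short Python sketch below uses the standard urllib.robotparser module to fetch and test the published file. The yourblogname.blogspot.com address is a placeholder assumption, not a URL from this post:

# Quick check of a published robots.txt with Python's standard library.
# "yourblogname.blogspot.com" is a placeholder -- substitute your own blog's address.
from urllib.robotparser import RobotFileParser

BLOG = "http://yourblogname.blogspot.com"

rp = RobotFileParser()
rp.set_url(BLOG + "/robots.txt")  # Blogger serves the file at /robots.txt
rp.read()                         # fetch and parse the live file

# Search/label pages should be blocked for ordinary crawlers (Disallow: /search)
print(rp.can_fetch("*", BLOG + "/search/label/example"))       # expected: False

# Normal post pages should remain crawlable (Allow: /)
print(rp.can_fetch("*", BLOG + "/2016/07/example-post.html"))  # expected: True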
