About robots.txt and How to Use Custom robots.txt in Your Blogger Blog

robots.txt gives instructions about a website to web robots. In a robots.txt file, site owners can set permissions: which pages are accessible to robots and which pages they don't want to show in search engines.

In this post you will read about what robots.txt is and how to set a custom robots.txt in Blogger. Blogger gives you an option to set your own robots.txt, and it is easy to do, so let's start. Also read this: Five Best and Trusted Ways to Earn Money Online.

What is robots.txt?

Website owners use the /robots.txt file to give instructions about their site to web robots. This is called the Robots Exclusion Protocol.

How does it work?

When a visitor comes to your website, he simply opens a web page such as http://92tricks.blogspot.com. When robots scan your web pages, however, they first check your robots.txt file. Suppose they find the following rule in your robots.txt file:

User-agent: *
Disallow: /

The line User-agent: * means the rule applies to all robots. The line Disallow: / tells those robots not to crawl any page of the site, so the pages should not appear in search engines.
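To see how a well-behaved robot reads such a rule, here is a minimal sketch in Python using the standard library's urllib.robotparser; the robot name ExampleBot and the blog URL are only placeholders for illustration.

# a minimal sketch: how a polite robot checks robots.txt before fetching a page
from urllib import robotparser

# the two lines the robot found in robots.txt
rules = ["User-agent: *", "Disallow: /"]

rp = robotparser.RobotFileParser()
rp.parse(rules)  # parse the rules instead of downloading the file

# with "User-agent: *" and "Disallow: /", no robot may crawl any page
print(rp.can_fetch("ExampleBot", "http://92tricks.blogspot.com/"))  # prints False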

There are two important considerations when using /robots.txt:

Robots can ignore your /robots.txt. In particular, malware robots that scan the web for security vulnerabilities and email-address harvesters used by spammers will pay no attention to it.
The /robots.txt file is a publicly available file. Anyone can see which sections of your server you don't want robots to use.
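The second point is easy to verify: a robots.txt file can be downloaded like any other page. The short Python sketch below simply fetches and prints this blog's file (using the same URL as above).

# a quick sketch: robots.txt is a public file, so anyone can read it
from urllib.request import urlopen

with urlopen("http://92tricks.blogspot.com/robots.txt") as response:
    print(response.read().decode("utf-8"))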





To allow all robots complete access

use the following lines in your robots.txt file:
User-agent: *
Disallow:

or just create an empty robots.txt file.

To exclude all robots from the entire server

User-agent: *
Disallow: /

To exclude a single robot

use this robots.txt:
User-agent: BadBot
Disallow: /

If you want to allow just one robot to access your site, use the following rules. In this example, only Google's robot is allowed to access the site:

User-agent: Google
Disallow:

User-agent: *
Disallow: /
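If you want to double-check rules like these before publishing them, you can run them through Python's urllib.robotparser as a quick sanity check; the robot name ExampleBot and the URL are again only examples.

# sanity check: only the "Google" robot is allowed, every other robot is blocked
from urllib import robotparser

rules = [
    "User-agent: Google",
    "Disallow:",
    "",
    "User-agent: *",
    "Disallow: /",
]

rp = robotparser.RobotFileParser()
rp.parse(rules)

print(rp.can_fetch("Google", "http://92tricks.blogspot.com/"))      # prints True
print(rp.can_fetch("ExampleBot", "http://92tricks.blogspot.com/"))  # prints False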



I hope you now understand robots.txt. If not, don't worry, I am here to help: just ask me in the comments.

How to set custom robots.txt in Blogger?


  1. Log in to your Google account and go to blogger.com.
  2. Select the blog where you want to set a custom robots.txt.
  3. From the blog dashboard, click on the Settings link.
  4. Then go to Search preferences.
  5. You will see the Custom robots.txt option; click on the Edit link in front of it.
  6. Check the Yes input to enable custom robots.txt.
  7. After checking Yes, a text box will appear; just paste your desired robots.txt there and click on the Save changes button.
The following robots.txt works well for a Blogger blog, so you can use it as your robots.txt:
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: http://www.92tricks.blogspot.com/sitemap.xml
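Before saving, you can sanity-check this file the same way as above. Here is a small sketch, again using Python's urllib.robotparser and a made-up post URL, to confirm that ordinary robots can reach your posts but not the /search pages:

# sanity check for the suggested Blogger robots.txt (Sitemap line omitted,
# since it does not affect crawl permissions)
from urllib import robotparser

rules = [
    "User-agent: Mediapartners-Google",
    "Disallow:",
    "",
    "User-agent: *",
    "Disallow: /search",
    "Allow: /",
]

rp = robotparser.RobotFileParser()
rp.parse(rules)

blog = "http://www.92tricks.blogspot.com"
# label/search pages are blocked for ordinary robots...
print(rp.can_fetch("ExampleBot", blog + "/search/label/seo"))           # prints False
# ...but blog posts are still crawlable
print(rp.can_fetch("ExampleBot", blog + "/2017/01/example-post.html"))  # prints True

If both checks print the expected values, the file is safe to paste into the custom robots.txt box described above.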
