Do you want to know the purpose of a custom robots.txt file and how to set one up on Blogger? You’re in the right place.
In this article, you will learn what a custom robots.txt file is, how to set up a perfect custom robots.txt in Blogger, and how to add a sitemap in Blogger using it.
What is Custom robots.txt? Website owners use the custom robots.txt file to tell web robots which URLs, directories, and pages of a website may be crawled.
Web robots, also known as spiders or crawlers, are programs that traverse the web automatically. Search engines such as Google and Bing use them to index web content.
When a web robot visits your website, it first checks robots.txt to see which pages, directories, folders, and URLs it is allowed to crawl. This matters because sometimes we don’t want robots to crawl unnecessary pages or non-public areas, such as admin URLs or directories.
Below is the custom robots.txt code for your Blogger blog. Copy this code and paste it into the Custom robots.txt field, as shown below in How To Set Up Custom Robots Txt In Blogger.
In the Sitemap line, replace “example” with your own Blogger subdomain, e.g. “https://techguroo.blogspot.com”.
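The code below follows the standard pattern for a Blogger custom robots.txt, matching the directives explained in the bullets that follow; the “example” domain is a placeholder you should swap for your own:

```
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: https://example.blogspot.com/sitemap.xml
```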
- The “User-agent: *” means this section applies to all robots.
- The User-agent: Mediapartners-Google section, with its empty Disallow:, allows the Google AdSense crawler to access every page, so ads can display anywhere on your blog.
- The Disallow: /search line tells robots to ignore all URLs containing /search, such as https://www.example.com/search/label/Blogger.
- The Allow directive overrides Disallow directives in the same robots.txt file. Here, Allow: / tells web robots that they may crawl any page or URL of the website except the disallowed ones.
- A Sitemap is an XML file that lists your blog’s pages, making sure Google and other search engines can find and crawl them all.
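For reference, a sitemap is just an XML file of the following shape; the post URL and date here are placeholders, and Blogger generates the real file for you automatically:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.blogspot.com/2024/01/my-post.html</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
</urlset>
```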
How To Set Up Custom Robots Txt In Blogger?
In this section, I will guide you through setting up a custom robots.txt in Blogger.
- Sign in to your Blogger account.
- Go to Sidebar and click Settings.
- Scroll down to the Crawlers and indexing section.
- Turn on Enable custom robots.txt by toggling the switch to the right.
- Click “Custom robots.txt“.
- Now copy and paste the custom robots.txt code here.
- Click Save.
Now it is time to test your custom robots.txt file. Append /robots.txt to the end of your blog URL and press Enter, like https://techguroo.blogspot.com/robots.txt.
It will display the same code that you added in Custom robots.txt.
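If you’d like to check the rules themselves rather than just eyeball the file, Python’s standard-library robot parser can evaluate them offline. This is a minimal sketch; the blog URL and post path are placeholders, not real addresses:

```python
from urllib import robotparser

# Parse the Blogger-style rules directly, without any network request.
rules = """\
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /
""".splitlines()

parser = robotparser.RobotFileParser()
parser.parse(rules)

# /search URLs are blocked for ordinary crawlers...
print(parser.can_fetch("*", "https://example.blogspot.com/search/label/Blogger"))  # False
# ...regular posts stay crawlable...
print(parser.can_fetch("*", "https://example.blogspot.com/2024/01/my-post.html"))  # True
# ...and the AdSense crawler may fetch everything.
print(parser.can_fetch("Mediapartners-Google", "https://example.blogspot.com/search/label/Blogger"))  # True
```

If the first call prints True for your own rules, your Disallow: /search line is missing or mistyped.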
- If you found this post helpful, don’t forget to share it with your friends, and leave your feedback in the comments section below.