How to Create a Robots.txt File in WordPress:
We often talk about SEO for blogs and websites, and the robots.txt file is an important part of search engine optimization: it tells search engine bots not to crawl specific parts of your website. However, a wrongly configured robots.txt file can make your site disappear from search results entirely. Since it is such an important factor, keep a few things in mind. Only block the pages you do not want indexed. When Google's crawler visits your site, it reads the robots.txt instructions first. The robots.txt file must exist in the root directory of your server.
SEO consists of hundreds of elements, and robots.txt is one of the necessary parts. Most webmasters tend to skip editing this file, usually because they are not aware of how it works. My suggestion: practice it yourself. Editing this file does not require advanced knowledge; anyone can do it with the basics. If you are using an SEO plugin on your WordPress blog, such as Yoast SEO or All in One SEO, a robots.txt option is already built into it. Otherwise, create a plain text file in Notepad, rename it robots.txt, and upload it to your root directory via an FTP client. Check my blog's robots.txt file screenshot.
The robots.txt file shown above is the one I created for my blog.
For WordPress users, a robots.txt file exists in the root directory by default. For other, static websites, create a plain text file on your desktop, rename it robots.txt, and upload it to the root directory via an FTP client or directly through the File Manager option in cPanel; both work. You can disallow category or tag pages to avoid duplicate content in search engines. Remember that not every bot honors robots.txt: some nasty bots will even read the file to find which pages and directories to target first. Excellent webmasters always keep their robots.txt file optimized.
How to Make a robots.txt File:
As I mentioned earlier, you don't need advanced skills to build a robots.txt file. Just create a plain Notepad file, rename it robots.txt, and upload it to your website's root directory via an FTP client such as FileZilla. Next, you need to know that every robots.txt file contains records, and every record consists of specific directives for search bots.
User-agent: googlebot
This line addresses Google's bot specifically. Combined with Disallow rules, it lets Googlebot index every page of the website except, for example, the cgi-bin folder in the root directory and the category or tag pages. As mentioned above, category and tag pages can cause duplicate content in search engines, which is why we suggest disallowing them and letting bots target your main post pages. Wherever you add a Disallow rule for a folder or page, search engines are not allowed to crawl that specific part of your website.
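A record along these lines might look like the following sketch (the cgi-bin, category, and tag paths are examples; adjust them to your own site's structure):

```text
User-agent: googlebot
Disallow: /cgi-bin/
Disallow: /category/
Disallow: /tag/
```

Each record starts with a User-agent line naming the bot, followed by one or more Disallow lines listing the paths that bot may not crawl.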
User-agent: *
A question arises here: there is not just a single search engine on the internet. If you want traffic from multiple search engines, use the wildcard User-agent: * so the rules apply to every bot, allowing all search engines to index your website.
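For example, a record that applies to all bots and blocks nothing (assuming you want everything indexed) looks like this:

```text
User-agent: *
Disallow:
```

An empty Disallow line means no path is blocked, so every compliant bot may crawl the whole site.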
Some Things to Watch Out For:
A few things deserve attention while creating a robots.txt file. Don't add stray spaces inside directive names, for example writing "User- agent: *": that form is invalid and the line will be ignored, so beware of these mistakes. Comments are permitted, but only when they start with the # character. Paths in robots.txt are case-sensitive, so be careful when writing folder and page names: if the directory on your server is named "Images", write "Images" exactly, not the lowercase "images".
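The following fragment illustrates both points (the /Images/ path is a hypothetical example):

```text
# Comments start with a hash sign
User-agent: *
Disallow: /Images/
```

If the directory on the server is actually named /images/, the rule above would not match it, because paths are compared case-sensitively.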
Create a WordPress Robots.txt:
After working through all of the instructions above, you are now able to add a robots.txt file to your root directory via an FTP client. Many users prefer the Robots Meta plugin because it is easy to use and accessible from the WordPress admin interface. Here is a useful tip: mention your sitemap URL in your robots.txt file. This makes it easy for bots to find your sitemap, so they index your website's pages faster. You can reuse my blog's robots.txt file; just replace the domain in the sitemap URL with your own.
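A sitemap reference is a single line that can appear anywhere in the file. A minimal sketch (example.com is a placeholder; substitute your own domain and sitemap path):

```text
User-agent: *
Disallow: /cgi-bin/

Sitemap: https://example.com/sitemap.xml
```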
How to Check the robots.txt Response:
A common question arises once everything is done: how do we verify that our robots.txt file actually works, and whether the updated file has any effect? Let's check with Google Webmaster Tools: Fetch as Google, found under the Crawl option.
Fetch as Google is one of the finest tools for checking how your robots.txt file responds. If the URL you submit succeeds, it is accessible to bots; if it returns a failure, it is not. I hope this article helps you build a better robots.txt file. For more SEO tips and tricks, subscribe to our newsletter.
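Besides Fetch as Google, you can test robots.txt rules locally with Python's standard-library robot parser. This sketch parses an example rule set (the paths and the example.com domain are assumptions for illustration) and asks whether a bot may fetch given URLs:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content for illustration
rules = """\
User-agent: *
Disallow: /cgi-bin/
Disallow: /category/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())  # parse the rules directly, no network fetch

# can_fetch(user_agent, url) reports whether that bot may crawl the URL
print(parser.can_fetch("*", "https://example.com/my-post/"))   # True: not disallowed
print(parser.can_fetch("*", "https://example.com/cgi-bin/x"))  # False: under /cgi-bin/
```

This mirrors what compliant crawlers do: match the request path against the Disallow rules for the applicable User-agent record.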