Smart Robots.txt Tools allows you to improve and customize the ‘robots.txt’ file used by search engines and other types of bots when indexing your website. Through this file you can disallow access to some areas of the website for all bots or only for selected ones.
This plugin allows you to use improved basic rules in the robots.txt file with a few extra predefined rules, or you can create your own custom rules that can include any number of bot user agents and directives to allow or disallow access.
When the plugin is installed, it adopts the WordPress search engine visibility setting; if your website is set to be hidden from search engines (WordPress uses the robots file for this), the plugin will be set to Deny All. You can change the robots.txt content to anything through the plugin settings, but as a convenience, the WordPress setting is used on installation only.
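For reference, the ‘Deny All’ state corresponds to the standard robots.txt form that blocks every crawler from the entire site:

```
User-agent: *
Disallow: /
```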
Fully customized set of rules in robots.txt
Add rules for different bots / search engines or any custom user agent, allowing or disallowing access to parts of the website, setting the URL of one or more sitemaps, and setting a crawl delay value.
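As an illustration, a robots.txt combining these directives might look like the sketch below; the paths, bot name, and sitemap URL are examples only, not output generated by this plugin:

```
# Rules for all bots
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

# Rules for a specific bot
User-agent: Googlebot
Crawl-delay: 10

Sitemap: https://example.com/sitemap.xml
```

Note that not all search engines honor every directive; for example, some crawlers ignore Crawl-delay.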
Predefined or custom content for robots.txt
See the status and availability of the robots.txt file, and enable override through this plugin. Choose between predefined and custom rules, and add extra sets of rules targeting common search engines.
Create robots.txt rules targeting any search engine or custom user agent
Add rules for any search engine bot from the list or by any user agent string. Add any number of allow/disallow rules, one or more sitemaps, and a custom crawl delay. Modify rules for all user agents.
Other Plugin Features Included
- Extra panel to reset plugin settings including all custom rules.
- Easy to use export and import for transferring settings from one website to another.
- Support for WordPress Multisite mode; each website can configure the plugin on its own.
- Support for translation; a POT file is included.
System and WordPress Requirements
- PHP 5.2.4 or newer
- WordPress 3.5 or newer
Important about Robots.txt
Before you proceed, be aware of one very important limitation. It comes from the ‘robots.txt’ specification itself and is not related to this plugin: the robots file must be located in the root of a domain or subdomain, and it can’t be used if the website is in a subdirectory of the main domain or subdomain. So, if you have installed your WordPress website in a subdirectory of the main website domain, you can’t use robots.txt for that WordPress installation. Search engines load robots.txt only from the root of the domain, nowhere else.
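To illustrate the root-only rule (example.com is a placeholder domain):

```
https://example.com/robots.txt         # the only location crawlers will check
https://blog.example.com/robots.txt    # valid, because a subdomain has its own root
https://example.com/blog/robots.txt    # ignored; subdirectory locations are never read
```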
The plugin package contains a PDF user guide inside the ‘docs’ directory. Check out this document for information on plugin options, usage and more.
Version 1.0 / 2014.08.07.
- First release