Robots.txt is a visual editor for Robot Exclusion Files combined with a log analyzer. It lets a user quickly and easily create the robots.txt files that instruct search engine spiders which parts of a Web site are not to be indexed and made searchable by the general Web public, and then identify spiders that do not keep to those instructions. The program lets the user log onto an FTP or local network server and select the documents and directories that are not to be made searchable. With this program you can visually generate industry-standard robots.txt files; identify malicious and unwanted spiders and ban them from your site; direct search engine crawlers to the appropriate pages on multilingual sites; keep spiders out of sensitive and private areas of your Web site; track spider visits; and more. Free program updates and upgrades, unrestricted in time; unlimited number of Web sites to work with.
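The files the editor generates follow the Robots Exclusion standard. A minimal hand-written example of the kind of output described above (the directory names and spider name are hypothetical, for illustration only) might look like:

```
# Keep all compliant crawlers out of sensitive areas
User-agent: *
Disallow: /admin/
Disallow: /private/

# Ban one specific unwanted spider from the entire site
User-agent: BadBot
Disallow: /
```

Rules apply to the most specific matching `User-agent` group, so the `BadBot` block overrides the wildcard group for that spider; since compliance is voluntary, spiders that ignore these rules are the ones the log analyzer is meant to catch.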
License: Free to try
Platforms: Windows 98/Me/NT/2000/XP/2003 Server