The Log File Analyser is light but extremely powerful, able to process, store and analyse millions of lines of log file event data in a smart database. It gathers key log file data to allow SEOs to make informed decisions. Common uses include:

- Identify Crawled URLs: View and analyse exactly which URLs Googlebot and other search bots are able to crawl, when and how frequently.
- Discover Crawl Frequency: Get insight into which search bots crawl most frequently, how many URLs are crawled each day and the total number of bot events.
- Find Broken Links & Errors: Discover all response codes, broken links and errors that search engine bots have encountered while crawling your site.
- Audit Redirects: Find temporary and permanent redirects encountered by search bots, which may differ from those seen in a browser or simulated crawl.
- Improve Crawl Budget: Analyse your most and least crawled URLs and directories to identify waste and improve crawl efficiency.
- Identify Large & Slow Pages: Review the average bytes downloaded and time taken to identify large pages or performance issues.
- Find Uncrawled & Orphan Pages: Import a list of URLs and match it against log file data to identify orphan or unknown pages, or URLs which Googlebot hasn't crawled.
- Combine & Compare Any Data: Import and match any data with a 'URLs' column against log file data, such as crawls, directives, or external link data, for advanced analysis.
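To make the underlying idea concrete, here is a minimal sketch of the kind of analysis the tool automates: parsing access-log lines, counting Googlebot crawl frequency per URL, and matching an imported URL list against the log to surface uncrawled pages. This assumes the common Apache/NGINX combined log format and is only an illustration, not how the Log File Analyser itself is implemented.

```python
import re
from collections import Counter

# Rough pattern for the combined log format:
# ip ident user [timestamp] "METHOD /url PROTO" status bytes "referer" "user-agent"
LOG_LINE = re.compile(
    r'\S+ \S+ \S+ \[[^\]]+\] "(?P<method>\S+) (?P<url>\S+)[^"]*" '
    r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

def crawl_stats(log_lines):
    """Count how often each URL was requested by Googlebot."""
    hits = Counter()
    for line in log_lines:
        m = LOG_LINE.match(line)
        if m and "Googlebot" in m.group("agent"):
            hits[m.group("url")] += 1
    return hits

def uncrawled(known_urls, hits):
    """URLs from an imported list (e.g. a site crawl) absent from the log."""
    return sorted(set(known_urls) - set(hits))

if __name__ == "__main__":
    sample = [
        '66.249.66.1 - - [01/Jan/2024:00:00:01 +0000] "GET /page-a HTTP/1.1" '
        '200 1234 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; '
        '+http://www.google.com/bot.html)"',
        '10.0.0.1 - - [01/Jan/2024:00:00:02 +0000] "GET /page-b HTTP/1.1" '
        '200 99 "-" "Mozilla/5.0"',
    ]
    stats = crawl_stats(sample)
    print(stats)                                   # Googlebot hits per URL
    print(uncrawled(["/page-a", "/page-b"], stats))  # crawl gaps
```

In practice, serious log analysis also verifies Googlebot by reverse DNS (user-agent strings can be spoofed) and handles rotated, compressed log files; those steps are omitted here for brevity.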
|License||Free to try|
|File Size||137.97 MB|
|Operating System||Windows 98, Windows Me, Windows NT, Windows 2000, Windows XP, Windows 2003, Windows Vista, Windows Server 2008, Windows 7, Windows 8, Windows 10|
|System Requirements||Java 8|