Screaming Frog SEO Log File Analyzer is designed for SEO professionals. For example, you can upload your server log files, check which URLs search bots have requested, and verify whether those bots are genuine. The tool parses log files and stores their data in a local database. With it, you will be able to:
- Crawled URLs – See and analyze exactly which URLs have been crawled by search bots such as Googlebot and Bingbot.
- Crawl frequency – Analyze the most and least frequently crawled URLs by bot user agent.
- Full event data – Access the full, timestamped log file event data for every URL discovered.
- Errors – Identify client and server errors, such as broken links and server faults (4XX and 5XX response codes).
- Redirects – View permanent and temporary redirects (301 and 302 responses).
- Inconsistent response codes – See at a glance which URLs returned inconsistent response codes over a period of time.
- Last crawl time – See exactly when a search bot last crawled each URL (and the first time, as well as every event in between!).
- Average bytes – Analyze the average size in bytes of each URL, taken directly from log file event data.
- Average response time (ms) – Explore the average response time for each URL.
- Events – See the number of log events detected for each URL.
- Directories – Analyze which directories and sections of the website are crawled most and least often.
- Uncrawled URLs – Import a list of URLs and discover which ones have not yet been crawled.
- Orphan URLs – Import a list of URLs and discover orphan URLs that appear in the log data but that you didn't know about.
- Analyze bots over time – Import multiple log files at once, or over time, to analyze and measure bot activity.
- Compare any data – Import any data with a 'URL' column to automatically match it against the log file data for analysis.
- Verify search bots – Automatically verify search bots such as Googlebot, and identify requests from spoofed IPs.
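The uncrawled and orphan URL checks above boil down to a set comparison between a known URL list and the URLs that appear in the logs. A minimal sketch of the idea in Python, assuming logs in the common Combined Log Format (this is an illustration, not the tool's actual implementation):

```python
import re

# Combined Log Format, e.g.:
# 66.249.66.1 - - [10/Jan/2024:12:00:00 +0000] "GET /page HTTP/1.1" 200 5120 "-" "Googlebot/2.1"
LOG_LINE = re.compile(
    r'^\S+ \S+ \S+ \[[^\]]+\] '
    r'"(?:GET|POST|HEAD) (\S+) [^"]*" '   # request path
    r'(\d{3}) (\d+|-) '                   # status code, bytes
    r'"[^"]*" "([^"]*)"'                  # referrer, user agent
)

def bot_urls(log_lines, bot="Googlebot"):
    """Return the set of URL paths requested by a given bot user agent."""
    urls = set()
    for line in log_lines:
        m = LOG_LINE.match(line)
        if m and bot in m.group(4):
            urls.add(m.group(1))
    return urls

def compare(known_urls, log_lines, bot="Googlebot"):
    """Split a known URL list into uncrawled URLs and orphan URLs."""
    crawled = bot_urls(log_lines, bot)
    uncrawled = set(known_urls) - crawled  # known, but the bot never fetched them
    orphans = crawled - set(known_urls)    # fetched by the bot, but not in your list
    return uncrawled, orphans
```

In practice you would feed `compare()` the URL list exported from a site crawl and the lines of your access log; URLs only in the first set are uncrawled, URLs only in the second are orphans.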
See the features new in version 3.0: https://www.screamingfrog.co.uk/log-file-analyser-3-0/
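Verifying search bots, as mentioned in the feature list above, is commonly done with the reverse-then-forward DNS check that Google documents for Googlebot: reverse-resolve the requesting IP, confirm the hostname is in a Google-owned domain, then forward-resolve that hostname and confirm it maps back to the same IP. A hedged sketch of that check (not the Log File Analyser's actual code; the resolver arguments are injectable so it can be tested without network access):

```python
import socket

def is_real_googlebot(ip,
                      reverse=lambda ip: socket.gethostbyaddr(ip)[0],
                      forward=socket.gethostbyname):
    """Return True if `ip` passes the reverse + forward DNS check for Googlebot."""
    try:
        host = reverse(ip)                 # e.g. crawl-66-249-66-1.googlebot.com
    except OSError:
        return False                       # no reverse record at all
    if not host.endswith((".googlebot.com", ".google.com")):
        return False                       # reverse record is not Google-owned
    try:
        return forward(host) == ip         # forward lookup must round-trip to the IP
    except OSError:
        return False
```

A spoofed request that merely sets a Googlebot user agent string fails this check, because the attacker cannot control the reverse DNS record of their own IP.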