
SEO Log File Analyzer Tool That Increases Googlebot Crawling Efficiency

crawlOPTIMIZER is a new SEO software tool that helps you optimize your crawl budget, so you can guide Googlebot through your website as precisely and efficiently as possible.

Log File Analysis Tool from an SEO Perspective

Do you know which content of your website is being crawled by Googlebot? No? Then you should take a closer look at this tool – check it out!

Made in Germany

John Müller

Webmaster Trends Analyst at Google

“Log files are so underrated, so much good information in them.” (Twitter)


crawlOPTIMIZER, a cloud-based tool, analyzes your log files and tells you exactly which of your website’s content is being crawled by Google.

 

Based on predefined analyses & evaluations, crawlOPTIMIZER shows you untapped SEO potential for maximizing crawl efficiency. This lets you feed Google freshly updated content for its index as quickly as possible and earn more unpaid search engine traffic.

Dashboard crawlOPTIMIZER

crawlOPTIMIZER analyzes & saves log files automatically

crawlOPTIMIZER analyzes Googlebot log files fully automatically and stores them on a secure server in Germany for up to 5 years. Our clearly arranged crawlOPTIMIZER dashboard lets you monitor Googlebot activities 24/7. Thanks to the stored log files and our specially developed log file explorer, retroactive analyses are possible at any time.
Ready-to-use evaluations

Predefined analyses & evaluations help even SEO novices discover crawling problems easily.

Server location Germany & 100% GDPR-compliant

Only Googlebot log files are analyzed & stored – no user-related data! All GDPR requirements are met.

About crawlOPTIMIZER

Bastian Grimm, Peak Ace AG

“Log file analysis has become an essential part of technical search engine optimization. crawlOPTIMIZER, with its multitude of predefined dashboards, tremendously eases the process of log file evaluation, especially for newcomers. Fortunately, the tool integrates seamlessly into existing workflows and is also super flexible. This makes connecting various data sources for the delivery and evaluation of log files fairly easy.”

Bastian Grimm, CEO & Director Organic Search at Peak Ace AG

Stephan Czysch, Dept Agency

“If you have an extensive website and SEO is important to your website’s success, then regular monitoring and optimization of crawling behavior is mandatory. crawlOPTIMIZER prepares your server’s log files flawlessly and allows you to identify areas of improvement very quickly. By adapting robots.txt following log file analyses, we were able to achieve significant traffic gains, especially for large online shops.”

Stephan Czysch, Author of several SEO textbooks & lecturer for SEO

Stephan Czysch | crawlOPTIMIZER

Who was crawlOPTIMIZER developed for?

Log file analysis from an SEO perspective offers every digital marketer massive optimization potential.

1. Audience

crawlOPTIMIZER is for every professional who is responsible for organic search engine traffic in a company and does not have endless time for detailed log file analyses.

• Head of SEO
• SEO Managers
• Head of Online Marketing
• Online Marketing Managers
• Head of E-Commerce
• E-Commerce Managers
• Webmasters / IT Admins
• SEO agencies

2. Websites

crawlOPTIMIZER was specifically developed for websites with a few thousand pages and for online ventures that want to increase their unpaid search engine traffic.


• Web shops / online shops
• E-Commerce websites
• Informational websites
• News / weather portals
• Big blogs / forums

Why do I need crawlOPTIMIZER?

A simple explanation of an SEO process

Without crawling there is no indexing and thus no Google ranking.

Your website can only be found via Google Search if it has been added to the Google index. That is why Googlebot (also called a spider) crawls every accessible website worldwide every day, searching for new and updated content.

With various on-site optimization measures you can influence the crawling process and decide which content should be added to the Google index and crawled more frequently.

Only web server log files reveal which content Googlebot crawls and how often. That is why continuous crawl monitoring should become the standard for every online venture. crawlOPTIMIZER was developed for exactly this purpose, true to the mantra: ‘Keep it as simple as possible.’
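To illustrate what raw log files reveal, here is a minimal Python sketch (not part of crawlOPTIMIZER) that filters Googlebot requests out of a standard combined access log. The file name and log format are assumptions, and verifying Googlebot via reverse DNS is deliberately omitted:

```python
import re
from collections import Counter

# Combined log format (assumption): IP - - [time] "METHOD path HTTP/x" status size "referer" "user-agent"
LINE_RE = re.compile(
    r'"(?:GET|POST|HEAD) (?P<path>\S+) HTTP/[^"]*" (?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

def googlebot_hits(logfile_path):
    """Yield (path, status) for every request whose user agent claims to be Googlebot."""
    with open(logfile_path, encoding="utf-8", errors="replace") as fh:
        for line in fh:
            m = LINE_RE.search(line)
            if m and "Googlebot" in m.group("agent"):
                yield m.group("path"), int(m.group("status"))

if __name__ == "__main__":
    # Hypothetical file name; adjust to your server's access log location.
    hits = Counter(path for path, status in googlebot_hits("access.log"))
    for path, count in hits.most_common(20):
        print(f"{count:6d}  {path}")
```

Even this simple count of the most-crawled paths often shows how much of the crawl budget goes to pages that are irrelevant for SEO.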

If you want to learn more about this topic, please visit the page Google’s Indexing Process.

Google’s Indexing Process

‘Googlebot is Google’s web crawling bot (sometimes also called a ‘spider’). Crawling is the process by which Googlebot discovers new and updated pages to be added to the Google index.’ Source: Google Search Console

What makes crawlOPTIMIZER so valuable?

All features were developed to improve and boost unpaid organic search engine traffic.

1. Google spends money on crawling

“If you use Google products, our servers in our data centers do the work for you – worldwide and around the clock. Our servers support many different products at the same time. This is the so-called ‘cloud’. By keeping our servers busy, we can do more with less […]”, source: Google

 

• Google runs 15 data centers worldwide
• Operating costs amount to more than $10 billion per year
• Google spends a lot of money on crawling websites
• Your server resources are also heavily strained by needless crawl volume

2. Usage & added value

crawlOPTIMIZER offers insights into how Google crawls your website: which content Google is able to crawl and which it isn’t. This information is crucial, since pages that aren’t crawled won’t rank and therefore won’t generate organic search engine traffic.

Examples:


• What does Google use its resources for and where are they wasted?
• Is my content being crawled?
• Which content is not being crawled?
• Does Googlebot have problems while crawling my website?
• Where is potential for optimization?
• and many more…

3. Goals of crawling optimization

✔ New & edited content should be included in the Google index as fast as possible (e.g. indexing new products, title tag & meta description changes, new robots instructions, new or changed text content, etc.)

✔ Only SEO landing pages should be crawled – no wasted Google crawls (see the sketch after this list)

✔ Shift of originally wasted Google crawls to business relevant SEO landing pages

✔ An almost complete crawl of the entire website (SEO-relevant pages) in the shortest possible time

✔ Detect errors on your own website or infrastructure quickly and eliminate them immediately (for example 4xx and 5xx errors)

✔ Gentle and efficient use of resources (server infrastructure, traffic, etc.), both internally and with Google

✔ Strengthening the brand
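One common lever for the second goal above is robots.txt. As a rough illustration (the URLs are placeholders and this is not crawlOPTIMIZER functionality), Python’s standard urllib.robotparser can check which URLs your current robots.txt allows Googlebot to fetch:

```python
from urllib.robotparser import RobotFileParser

# Placeholder domain: point this at your own robots.txt.
rp = RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")
rp.read()

# Hypothetical URLs: an SEO landing page vs. a parameter URL that only wastes crawls.
for url in [
    "https://www.example.com/category/shoes/",
    "https://www.example.com/cart?add=123&session=abc",
]:
    print(url, "->", "crawlable" if rp.can_fetch("Googlebot", url) else "blocked")
```

Comparing such a check against the URLs Googlebot actually requests in your logs shows quickly where crawl budget is being wasted.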

Outcome: increased visibility & traffic

Highlights

Why crawlOPTIMIZER stands out

Googlebot activities

See which pages are crawled by Google. Goal: decide on measures that improve the crawling process accordingly.

Complete analyses & evaluations

The dashboard is the tool’s core, with ready-to-use, relevant evaluations. It will save you a lot of precious time and money.

Data storage up to 5 years

Finally, retroactive log file analyses can be done without any effort, thanks to log file storage in a secure cloud. Easy access 24/7!

Simple & easy to understand

All analyses & evaluations are prepared in an easy, understandable way – even if you don’t have an SEO background.

Fast & easy connection

To make accessing your log files as easy as possible, we have developed many connection options to choose from. The one-time setup is extremely user-friendly.

24/7 support

If you have any questions, we are always there to help. We’d love to assist you with your initial setup and are more than happy to reach out to your IT or hosting provider.

Unique Features

crawlOPTIMIZER’s core pieces are the dashboard and the log file explorer.

Simple dashboard + KPIs

To ease your daily routine, our dashboard contains complete evaluations and KPIs based on your log files. Time-consuming, manual log file analyses via Excel or ELK (Elasticsearch, Logstash, Kibana) become obsolete. All imported KPIs are presented graphically and can be viewed in the dashboard. Everything is prepared in a way that even people without an SEO background can understand.


• Crawl budget statistics & crawl budget waste
• Status codes & crawling statistics
• Crawling of business-relevant pages
• Top 50 crawled URLs & resources
• Top 10 crawled sitemaps & products
• Crawled content types, parameters & http(s)
• Different Googlebots
• And many more

The log file explorer

Anyone who has tried to analyze log files in Excel knows the typical problem: larger log files with around a million entries or more cannot be processed. Because of this, professionals have often only been able to analyze a snapshot of Googlebot activity. Our solution: crawlOPTIMIZER’s log file explorer, which easily processes nearly unlimited amounts of data.
Processing of nearly unlimited volumes of data

Your log files are automatically stored and archived on our high-performance servers. This gives you easy access from anywhere, at any time. Another benefit worth mentioning: long-term & retroactive analyses are no longer a problem.

Easy and nearly endless filtering options + export features

Our log file explorer offers various filter options to personalize your analyses & evaluations. Just a few clicks give you access to the data & information you need, which can then be exported as an XLSX or CSV file.
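Outside the tool, the same kind of filtering and export can be sketched in a few lines of Python with pandas. This is only an illustration under assumptions: it expects the Googlebot hits to already sit in a CSV with hypothetical columns url, status and timestamp, filters for 404 responses from the last 30 days and exports the result as CSV:

```python
import pandas as pd

# Hypothetical input: one row per Googlebot request, columns "url", "status", "timestamp".
hits = pd.read_csv("googlebot_hits.csv", parse_dates=["timestamp"])

# Filter: crawled URLs that returned 404 during the last 30 days of the data.
recent = hits[hits["timestamp"] >= hits["timestamp"].max() - pd.Timedelta(days=30)]
not_found = recent[recent["status"] == 404]

# Aggregate and export, similar to a CSV export of a filtered view.
summary = (not_found.groupby("url").size()
           .sort_values(ascending=False)
           .rename("googlebot_404_hits"))
summary.to_csv("googlebot_404_summary.csv")
```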

Google Search Console API

You can connect your project(s) to Google Search Console in just one click. This enriches your log data with search data (Clicks, Impressions, CTR, etc.). Yeah!
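For readers curious what such an enrichment looks like at the API level, the following sketch (not crawlOPTIMIZER’s internal code) queries the Search Console API for clicks, impressions and CTR per page using the google-api-python-client library; the property URL, date range and credentials file are placeholders:

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]
# Placeholder service-account key file; the account needs access to the property.
creds = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES
)
gsc = build("searchconsole", "v1", credentials=creds)

response = gsc.searchanalytics().query(
    siteUrl="https://www.example.com/",
    body={
        "startDate": "2024-01-01",
        "endDate": "2024-01-31",
        "dimensions": ["page"],
        "rowLimit": 100,
    },
).execute()

# Each row carries clicks, impressions and CTR that can be joined to crawl data by URL.
for row in response.get("rows", []):
    print(row["keys"][0], row["clicks"], row["impressions"], round(row["ctr"], 4))
```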

Google Indexing API

You can connect your project(s) to the Google Indexing API and push or delete URLs to/from the Google index in seconds.
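At the API level, such a push corresponds to a URL notification. The sketch below is again only an illustration with placeholder credentials and URL; it publishes a URL_UPDATED notification via the Indexing API (which Google currently documents primarily for job posting and livestream pages):

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/indexing"]
# Placeholder service-account key file; the account must be verified for the property.
creds = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES
)
indexing = build("indexing", "v3", credentials=creds)

# "URL_UPDATED" asks Google to (re)crawl the URL; "URL_DELETED" signals removal.
notification = {"url": "https://www.example.com/new-product", "type": "URL_UPDATED"}
result = indexing.urlNotifications().publish(body=notification).execute()
print(result)
```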

Logfile Explorer | crawlOPTIMIZER

Data mining thanks to GSC API

Google Search Console API

Prices & packages

Do you need more than 500,000 requests per day, or would you like to use the tool for your agency?
Then request an individual offer. Get in touch now!
MATT
*7-day trial period. Cancel any time before it ends, free of charge.

€ 199 monthly + VAT

Properties / websites: 1
Max. Googlebot requests per day: 75,000
Log data storage: 24 months
Minimum contract duration: 6 months
✔ Analysis dashboard
✔ Secure German server location
✔ Log file explorer
Google Search Console API
Google Indexing API
✔ CSV / XLSX export
✔ Personal setup support
Premium WhatsApp support
MARTIN
*7-day trial period. Cancel any time before it ends, free of charge.

€ 269 monthly + VAT

Properties / websites: 1
Max. Googlebot requests per day: 150,000
Log data storage: 24 months
Minimum contract duration: 3 months
✔ Analysis dashboard
✔ Secure German server location
✔ Log file explorer
Google Search Console API
Google Indexing API
✔ CSV / XLSX export
✔ Personal setup support
Premium WhatsApp support
GARY
*7-day trial period. Cancel any time before it ends, free of charge.

€ 459 monthly + VAT

Properties / websites: 1
Max. Googlebot requests per day: 300,000
Log data storage: 24 months
Minimum contract duration: 3 months
✔ Analysis dashboard
✔ Secure German server location
✔ Log file explorer
Google Search Console API
Google Indexing API
✔ CSV / XLSX export
✔ Personal setup support
✔ Premium WhatsApp support
LARRY
*7-day trial period. Cancel any time before it ends, free of charge.

€ 649 monthly + VAT

Properties / websites: 1
Max. Googlebot requests per day: 500,000
Log data storage: 24 months
Minimum contract duration: 3 months
✔ Analysis dashboard
✔ Secure German server location
✔ Log file explorer
Google Search Console API
Google Indexing API
✔ CSV / XLSX export
✔ Personal setup support
✔ Premium WhatsApp support

* No risk – 7-day free trial

After your contract is concluded and we have set up your personal account, you can try crawlOPTIMIZER free of charge for 7 days. During this trial period we will import and analyze 35% of your log files. We will inform you as soon as your trial period starts. Within these 7 days you can cancel the contract at any time, with no commitments. How to cancel? A short e-mail to info@crawloptimizer.com stating the reason for your cancellation is sufficient. You won’t be charged for these 7 days of testing, and your contract ends with the cancellation of your trial version. If our tool’s features satisfy you and you do not cancel within these 7 days, the contract becomes binding from the 8th day and we will start importing 100% of your logs. Let’s rock!

Specials

ZOE (very small business)
*7-day trial period. Cancel any time before it ends, free of charge.

€ 79 monthly + VAT
Instead of € 99 monthly + VAT: save € 20 now

Properties / websites: 1
Max. Googlebot requests per day: 25,000
Log data storage: 24 months
Minimum contract duration: 6 months
✔ Analysis dashboard
✔ Secure German server location
✔ Log file explorer
Google Search Console API
✔ CSV / XLSX export
✔ Personal setup support
BUSINESS SUITE AGENCY

Individual pricing, monthly + VAT

Properties / websites: starting from 1
Max. Googlebot requests per day: starting from 500,000
Log data storage: starting from 24 months
Minimum contract duration: individual
✔ Analysis dashboard
✔ Secure German server location
✔ Log file explorer
Google Search Console API
✔ CSV / XLSX export
✔ Personal setup support
✔ SEO consulting: 20% discount
✔ Premium WhatsApp support

We only sell to entrepreneurs/companies.
The above prices do not include VAT.
For further information, feel free to contact us via email or by phone at +43 (1) 577 35 49.
Note: The limited Black Week offer is only valid until 15/12/22. The reduced price applies to the entire contract period.

Contact us

Need support or got questions? Get in touch!


    I have read the privacy policy and accept it by clicking the 'Send now' button.


    Further ways to get in touch