Published on 2025-08-07T06:18:08Z

Qwantbot

Qwantbot is the official web crawler for Qwant, a French, privacy-focused search engine. Its purpose is to discover and index public web content to build Qwant's independent search database. For website owners, being indexed by Qwantbot is beneficial for reaching a European and privacy-conscious audience that prefers alternatives to the dominant search engines.

What is Qwantbot?

Qwantbot is the web crawler for the French search engine Qwant. It functions as a traditional search engine bot, systematically browsing the web to discover and index content for its search results. The bot identifies itself in server logs with user-agent strings like Mozilla/5.0 (compatible; Qwantbot/1.0; +https://help.qwant.com/bot/). Qwantbot operates from a range of IP addresses located primarily in France and is designed to respect standard web protocols, including robots.txt directives.
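Because the user-agent header is trivially spoofed, spotting this string in server logs only identifies traffic that claims to be Qwantbot; see the verification section later in this article for confirming authenticity. A minimal sketch in Python for flagging such requests (the helper name is_qwantbot is ours for illustration):

```python
import re

# Matches a self-declared Qwantbot user-agent such as
# "Mozilla/5.0 (compatible; Qwantbot/1.0; +https://help.qwant.com/bot/)".
QWANTBOT_RE = re.compile(r"\bQwantbot/[\d.]+")

def is_qwantbot(user_agent: str) -> bool:
    """Return True if the user-agent string declares itself as Qwantbot.

    Note: this only detects *claimed* Qwantbot traffic; the string can
    be forged, so authenticity must be verified separately."""
    return bool(QWANTBOT_RE.search(user_agent))

print(is_qwantbot(
    "Mozilla/5.0 (compatible; Qwantbot/1.0; +https://help.qwant.com/bot/)"))  # True
print(is_qwantbot("Mozilla/5.0 (Windows NT 10.0; Win64; x64)"))  # False
```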

Why is Qwantbot crawling my site?

Qwantbot is crawling your website to index its content for the Qwant search engine. Its goal is to make your site's content discoverable by Qwant users. The frequency of its visits depends on factors like your site's popularity and how often its content is updated. New content or changes to existing pages will often trigger a visit from the bot. This is a standard and authorized activity for a legitimate search engine crawler.

What is the purpose of Qwantbot?

The purpose of Qwantbot is to support the Qwant search engine by building and maintaining its independent search index. Qwant positions itself as a privacy-focused alternative to other search engines, and the data collected by its bot is used exclusively to power its search results without user tracking. For website owners, being crawled by Qwantbot provides the benefit of visibility in Qwant's search results, which can drive traffic from users who prefer a privacy-respecting search engine, particularly in France and Europe.

How do I block Qwantbot?

If you wish to prevent Qwantbot from accessing your website, you can add a disallow rule to your robots.txt file. This will prevent your pages from appearing in Qwant search results.

To block this bot, add the following lines to your robots.txt file:

User-agent: Qwantbot
Disallow: /
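As a sanity check, the effect of this rule can be confirmed with Python's standard urllib.robotparser. The robots.txt content below is the example above; SomeOtherBot is a placeholder name for any crawler the rule does not target:

```python
from urllib.robotparser import RobotFileParser

# The robots.txt rule from this article, blocking Qwantbot site-wide.
robots_txt = """\
User-agent: Qwantbot
Disallow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Qwantbot is blocked from every path...
print(parser.can_fetch("Qwantbot", "https://example.com/any/page"))  # False
# ...while crawlers not named by the rule remain unaffected.
print(parser.can_fetch("SomeOtherBot", "https://example.com/any/page"))  # True
```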

How do I verify the authenticity of a Qwantbot user-agent?

Reverse DNS lookup technique

To verify a user-agent's authenticity, run the Linux host command twice, starting with the IP address of the requester.
  1. > host IPAddressOfRequest
    This command returns the PTR (reverse DNS) hostname registered for the IP address (for example, host 8.8.4.4 returns dns.google).
  2. > host HostnameFromTheOutputOfTheFirstCommand
If the output of the second command matches the original IP address and the hostname belongs to a domain associated with a trusted operator (e.g., Qwant), the user-agent can be considered legitimate.

IP list lookup technique

Some operators publish a list of the IP addresses used by their crawlers, which can be cross-referenced against server logs to verify a user-agent's authenticity. However, these lists are not always kept up to date, so use this method with caution and in combination with other verification techniques, such as the reverse DNS check above.
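The cross-referencing can be sketched with Python's standard ipaddress module. The CIDR ranges below are HYPOTHETICAL placeholders (documentation TEST-NET blocks), not Qwant's actual address space; substitute the operator's published list:

```python
import ipaddress

# HYPOTHETICAL crawler ranges for illustration only.
CRAWLER_RANGES = [ipaddress.ip_network(cidr) for cidr in (
    "192.0.2.0/24",      # placeholder range (TEST-NET-1)
    "198.51.100.0/24",   # placeholder range (TEST-NET-2)
)]

def ip_in_published_ranges(ip: str) -> bool:
    """Return True if `ip` falls inside one of the published ranges."""
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in CRAWLER_RANGES)

print(ip_in_published_ranges("192.0.2.44"))   # True
print(ip_in_published_ranges("203.0.113.9"))  # False
```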