Published on 2025-08-07T06:18:08Z
SiteAuditBot
SiteAuditBot is a specialized web crawler for the SEO platform Semrush. It is not an autonomous crawler but is activated only when a Semrush user initiates a technical audit of a website. Its purpose is to examine a site for over 120 technical SEO parameters, such as broken links, page speed, and mobile-friendliness, to identify issues that could be harming its search engine performance. Its presence indicates a direct interest in your site's technical health.
What is SiteAuditBot?
SiteAuditBot is the web crawler for the Semrush Site Audit tool. It is a technical auditing bot that examines websites for issues related to SEO performance and best practices. The bot only visits a website when a Semrush user specifically triggers a site audit. It identifies itself in server logs with the user-agent string Mozilla/5.0 (compatible; SiteAuditBot/0.97; +http://www.semrush.com/bot.html). The bot mimics either a mobile or desktop browser, depending on the audit settings, to conduct its analysis.
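As a minimal sketch, a server log's user-agent field can be checked for this token with a regular expression. The function name below is illustrative, and the pattern is derived only from the user-agent string shown above:

```python
import re

# Matches the SiteAuditBot token and captures its version number,
# based on the user-agent string documented above.
SITEAUDITBOT_RE = re.compile(r"SiteAuditBot/(\d+(?:\.\d+)*)")

def detect_siteauditbot(user_agent: str):
    """Return the claimed bot version if the user-agent string
    identifies itself as SiteAuditBot, otherwise None."""
    match = SITEAUDITBOT_RE.search(user_agent)
    return match.group(1) if match else None

# Example user-agent value as it would appear in a log entry.
ua = "Mozilla/5.0 (compatible; SiteAuditBot/0.97; +http://www.semrush.com/bot.html)"
print(detect_siteauditbot(ua))  # prints 0.97
```

Note that a matching user-agent string only shows what the client claims to be; the verification steps later in this article confirm whether the claim is genuine.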
Why is SiteAuditBot crawling my site?
SiteAuditBot is crawling your website because a Semrush user is actively analyzing its technical performance. This user could be a member of your own team, an SEO agency you have hired, or even a competitor researching your site's structure. The bot's visits are not on a regular schedule but are directly correlated with when an audit is initiated on the Semrush platform. Each audit session can generate multiple requests as the bot examines various aspects of your site.
What is the purpose of SiteAuditBot?
The purpose of SiteAuditBot is to power the Semrush Site Audit tool, which evaluates over 120 technical SEO parameters. It identifies issues like broken links, redirect chains, slow page speeds, and duplicate content. The data it collects helps website owners find and fix technical problems that could be harming their search engine rankings or user experience. Regular audits with this tool can lead to improved search visibility and a healthier website.
How do I block SiteAuditBot?
To prevent Semrush users from running a site audit on your website, you can add a disallow rule to your robots.txt file. This will block SiteAuditBot from accessing your site.
To block this bot, add the following lines to your robots.txt file:
User-agent: SiteAuditBot
Disallow: /
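To sanity-check the rule before deploying it, Python's standard urllib.robotparser module can evaluate whether a given user-agent is allowed. This is a small sketch using the two lines above; example.com is a placeholder domain:

```python
from urllib.robotparser import RobotFileParser

# Parse the robots.txt rules shown above directly from a list of lines.
rp = RobotFileParser()
rp.parse([
    "User-agent: SiteAuditBot",
    "Disallow: /",
])

# SiteAuditBot is blocked from every path...
print(rp.can_fetch("SiteAuditBot", "https://example.com/any/page"))  # False
# ...while crawlers not named in robots.txt are unaffected by this rule.
print(rp.can_fetch("SomeOtherBot", "https://example.com/any/page"))  # True
```

Because the rule names SiteAuditBot specifically, other Semrush crawlers (such as SemrushBot) and regular search engine bots continue to crawl the site normally.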
How do I verify that a request claiming to be SiteAuditBot is authentic?
Reverse IP lookup technique
Run the host Linux command twice:
First, run it with the IP address of the requester. This returns the reverse DNS hostname (e.g., 4.4.8.8.in-addr.arpa.).
> host IPAddressOfRequest
Then, run it with the hostname returned by the first command. The IP address it resolves to should match the original requester's IP; if it does not, the user-agent is likely spoofed.
> host ReverseDNSFromTheOutputOfFirstRequest
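This forward-confirmed reverse DNS check can also be scripted. The sketch below uses Python's socket module; the resolver functions are injectable so the logic can be exercised without live DNS, and Semrush's own documentation, not this sketch, is the authority on which hostnames its bots resolve to:

```python
import socket

def verify_bot_ip(ip, resolve_ptr=None, resolve_a=None):
    """Forward-confirmed reverse DNS check: do a PTR lookup on the IP,
    then a forward lookup on the resulting hostname, which must resolve
    back to the same IP. Returns False on any lookup failure."""
    resolve_ptr = resolve_ptr or (lambda addr: socket.gethostbyaddr(addr)[0])
    resolve_a = resolve_a or socket.gethostbyname
    try:
        hostname = resolve_ptr(ip)        # step 1: reverse (PTR) lookup
        return resolve_a(hostname) == ip  # step 2: forward lookup must match
    except OSError:
        return False
```

A spoofed client can fake its user-agent header, but it cannot pass this round trip, because it does not control the PTR records for the IP range it is sending from.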