Published on 2025-08-07T06:18:08Z

linkdexbot

linkdexbot is a specialized web crawler for the enterprise SEO platform Authoritas (formerly Linkdex). It is a user-directed crawler, meaning it only visits a website when an Authoritas client initiates an SEO analysis or audit. It gathers data on a site's structure, content, and backlink profile to power the platform's technical SEO and competitive intelligence tools.

What is linkdexbot?

linkdexbot is the official web crawler for the enterprise SEO platform Authoritas (formerly Linkdex). It is a technical SEO crawler that systematically browses websites to collect data for SEO auditing and analysis. A key characteristic is that it operates only under the explicit direction of an Authoritas client; it does not crawl the web autonomously. The bot identifies itself in server logs with user-agent strings such as Mozilla/5.0 (compatible; linkdexbot/2.1; +https://www.linkdex.com/en-us/about/bots/).
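If you want to check your own logs for this crawler, a minimal sketch of matching the linkdexbot product token in a User-Agent string might look like this (the loose version pattern is an assumption, since the version number can change over time):

```python
import re

# Match the "linkdexbot" product token in a User-Agent header.
# The version part is kept loose because it may vary across releases.
LINKDEXBOT_RE = re.compile(r"\blinkdexbot/(\d+(?:\.\d+)*)", re.IGNORECASE)

def is_linkdexbot(user_agent: str) -> bool:
    """Return True if the User-Agent string claims to be linkdexbot."""
    return LINKDEXBOT_RE.search(user_agent) is not None

ua = "Mozilla/5.0 (compatible; linkdexbot/2.1; +https://www.linkdex.com/en-us/about/bots/)"
print(is_linkdexbot(ua))  # True
print(is_linkdexbot("Mozilla/5.0 (Windows NT 10.0) Chrome/120.0"))  # False
```

Note that a matching User-Agent string only tells you what the client claims to be; see the verification section below for confirming the claim.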

Why is linkdexbot crawling my site?

The presence of linkdexbot in your server logs indicates that an Authoritas client, such as an SEO professional or marketing agency, is actively analyzing your website. The bot is examining your site's structure, content, and technical elements to provide SEO insights to that user. The frequency of visits depends on the client's settings and the scope of their analysis. While the crawl is initiated by a third party, it is generally considered authorized as part of legitimate SEO research.

What is the purpose of linkdexbot?

The purpose of linkdexbot is to serve the Authoritas SEO platform by collecting the data that helps users understand and improve a website's search engine optimization. It supports backlink analysis by mapping link graphs and conducts technical audits to detect issues like broken links or non-canonical URLs. For website owners, the indirect value of this crawling comes from the insights that SEO professionals gain from the data, which can lead to recommendations that improve your site's performance in search results.

How do I block linkdexbot?

To prevent linkdexbot from analyzing your site, you can add a disallow rule to your robots.txt file. This is the standard method for managing access for legitimate crawlers.

To block linkdexbot, add the following lines to your robots.txt file:

User-agent: linkdexbot
Disallow: /
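If you would rather limit the crawler than block it entirely, a narrower rule restricts it to specific sections only (the /private/ path below is a placeholder; substitute the directories you want to protect):

```
User-agent: linkdexbot
Disallow: /private/
```

With this rule, linkdexbot may still crawl the rest of the site but should skip anything under the disallowed path.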

How do I verify the authenticity of the linkdexbot user-agent?

Reverse IP lookup technique

To verify a user-agent's authenticity, run the host command (available on Linux and macOS) twice, starting with the IP address of the requester.
  1. > host IPAddressOfRequest
    This performs a reverse (PTR) lookup and returns a hostname (for example, host 8.8.4.4 returns dns.google).
  2. > host HostnameFromTheFirstLookup
If the forward lookup returns the original IP address and the hostname belongs to a domain associated with a trusted operator (e.g., Authoritas), the user-agent can be considered legitimate.
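The two-step check above can be sketched in code. The snippet below uses Python's standard socket lookups by default, with injectable resolver functions so the logic can be exercised without live DNS; the trusted-suffix value you pass in is an assumption, since Authoritas does not document a fixed reverse-DNS hostname pattern:

```python
import socket

def verify_crawler_ip(ip, trusted_suffixes, reverse=None, forward=None):
    """Two-step reverse-DNS verification:
    1. Reverse-resolve the IP to a hostname (PTR lookup).
    2. Forward-resolve that hostname and confirm it maps back to the same IP.
    Returns True only if both steps succeed and the hostname ends with one
    of the trusted domain suffixes.
    """
    # Default to real DNS lookups; callers/tests can inject fake resolvers.
    reverse = reverse or (lambda addr: socket.gethostbyaddr(addr)[0])
    forward = forward or (lambda host: socket.gethostbyname(host))
    try:
        hostname = reverse(ip)
        if not any(hostname.endswith(sfx) for sfx in trusted_suffixes):
            return False  # PTR hostname is not under a trusted domain
        return forward(hostname) == ip  # forward lookup must round-trip
    except OSError:
        return False  # lookup failed; treat as unverified
```

A call might look like verify_crawler_ip("203.0.113.10", (".authoritas.com",)); confirm the hostname suffix the operator actually uses before relying on this check.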

IP list lookup technique

Some operators provide a public list of IP addresses used by their crawlers. This list can be cross-referenced to verify a user-agent's authenticity. However, both operators and website owners may find it challenging to maintain an up-to-date list, so use this method with caution and in conjunction with other verification techniques.
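Cross-referencing a published list can be sketched with Python's standard ipaddress module. The CIDR ranges below are documentation placeholders, not real Authoritas ranges; substitute whatever list the operator actually publishes:

```python
import ipaddress

def ip_in_published_ranges(ip, published_ranges):
    """Return True if the IP falls inside any of the published CIDR ranges.
    published_ranges is an iterable of CIDR strings, e.g. loaded from an
    operator's published IP list."""
    addr = ipaddress.ip_address(ip)
    return any(addr in ipaddress.ip_network(cidr) for cidr in published_ranges)

# Placeholder ranges for illustration only (RFC 5737 documentation blocks).
ranges = ["203.0.113.0/24", "198.51.100.0/24"]
print(ip_in_published_ranges("203.0.113.42", ranges))  # True
print(ip_in_published_ranges("192.0.2.1", ranges))     # False
```

Because such lists can lag behind the operator's actual infrastructure, treat a miss as inconclusive and fall back to the reverse-DNS check above.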