Published on 2025-08-07T06:18:08Z

YandexSitelinks bot

The YandexSitelinks bot is a specialized web crawler from the Russian search engine Yandex. Its purpose is to validate the URLs that are intended for use as 'sitelinks' in Yandex's search results. Sitelinks are the additional navigation links that appear under a main search result, and this bot ensures they remain functional and relevant, which helps improve user experience and click-through rates for website owners.

What is the YandexSitelinks bot?

The YandexSitelinks bot is a web crawler from Yandex designed specifically to validate and maintain the integrity of the sitelinks that are displayed in its search results. Sitelinks are the secondary links that appear beneath a website's main search result. The bot identifies itself in server logs with the user-agent string YandexSitelinksBot. It operates independently of the main Yandex crawler and focuses exclusively on monitoring the URLs that have been designated as potential sitelinks.
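Because the bot announces itself via its user-agent token, you can spot its visits by scanning your access logs. The sketch below is a minimal, hedged example: the sample log lines are fabricated (documentation IPs, and an assumed full user-agent string around the YandexSitelinksBot token), so adapt the parsing to your server's actual log format and path.

```python
# Minimal sketch: find access-log lines from the YandexSitelinks bot.
# The sample lines are fabricated (combined log format, documentation IPs);
# the full user-agent string shown is an assumption -- only the
# "YandexSitelinksBot" token itself comes from server-log observations.

SAMPLE_LOG = [
    '198.51.100.23 - - [07/Aug/2025:06:18:08 +0000] "GET /about HTTP/1.1" '
    '200 5123 "-" "Mozilla/5.0 (compatible; YandexSitelinksBot)"',
    '203.0.113.7 - - [07/Aug/2025:06:19:12 +0000] "GET / HTTP/1.1" '
    '200 1024 "-" "Mozilla/5.0 (Windows NT 10.0)"',
]

def sitelinks_hits(lines):
    """Return the log lines whose user-agent field mentions YandexSitelinksBot."""
    return [line for line in lines if "YandexSitelinksBot" in line]

for hit in sitelinks_hits(SAMPLE_LOG):
    print(hit)
```

In practice you would read the lines from your web server's access log (e.g., via `open()` on the log path) rather than a hard-coded list.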

Why is the YandexSitelinks bot crawling my site?

The YandexSitelinks bot is visiting your site to verify the availability and relevance of pages that Yandex has identified as potential sitelinks for your domain. It periodically checks these pages to ensure they are still accessible and serve their intended purpose. The bot focuses on important navigational pages like 'About Us' or 'Contact' pages. This is an authorized and standard activity for a major search engine.

What is the purpose of the YandexSitelinks bot?

The purpose of the YandexSitelinks bot is to enhance the quality of Yandex search results by ensuring that sitelinks are functional and relevant. This helps Yandex maintain user satisfaction by preventing broken links from appearing in search results. For website owners, having sitelinks displayed in Yandex search can significantly improve click-through rates by allowing users to navigate directly to important sections of your site. The bot's work benefits both search users and website owners.

How do I block the YandexSitelinks bot?

To prevent the YandexSitelinks bot from accessing your website, you can add a specific disallow rule to your robots.txt file. This is the standard method for managing crawler access.

To block this bot, add the following lines to your robots.txt file:

User-agent: YandexSitelinksBot
Disallow: /

How do I verify the authenticity of a user-agent claiming to be operated by Yandex?

Reverse IP lookup technique

To verify user-agent authenticity, you can run the Linux host command twice, starting with the IP address of the requester.
  1. > host IPAddressOfRequest
    This command returns the reverse DNS (PTR) hostname for the IP; for a genuine Yandex crawler, the hostname belongs to a Yandex domain such as yandex.ru, yandex.net, or yandex.com.
  2. > host HostnameFromTheOutputOfFirstRequest
If the second command resolves back to the original IP address and the hostname belongs to a trusted operator's domain (e.g., Yandex), the user-agent can be considered legitimate. This forward-confirmed reverse DNS check matters because anyone can fake a user-agent string, but an impostor cannot control the PTR records for an IP address they do not own.
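The two-step check above can be sketched in Python using the standard socket module instead of the host command. The trusted-domain suffixes below are an assumption based on domains Yandex is known to use for its crawlers; confirm them against Yandex's own documentation before relying on this. The resolver functions are injectable parameters so the logic can be tested or cached without live DNS.

```python
import socket

# Assumed trusted suffixes for Yandex crawler hostnames -- verify against
# Yandex's official documentation before using in production.
TRUSTED_SUFFIXES = (".yandex.ru", ".yandex.net", ".yandex.com")

def verify_crawler_ip(ip,
                      reverse_lookup=lambda ip: socket.gethostbyaddr(ip)[0],
                      forward_lookup=lambda host: socket.gethostbyname(host)):
    """Forward-confirmed reverse DNS: True only if the IP's PTR hostname
    is in a trusted domain AND that hostname resolves back to the same IP."""
    try:
        hostname = reverse_lookup(ip)            # step 1: host <ip>
    except OSError:
        return False                             # no PTR record at all
    if not hostname.rstrip(".").endswith(TRUSTED_SUFFIXES):
        return False                             # PTR points outside Yandex
    try:
        return forward_lookup(hostname) == ip    # step 2: host <hostname>
    except OSError:
        return False
```

With the default arguments this performs live DNS lookups; passing your own `reverse_lookup` and `forward_lookup` callables lets you unit-test the logic or plug in a caching resolver.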

IP list lookup technique

Some operators provide a public list of IP addresses or CIDR ranges used by their crawlers. This list can be cross-referenced to verify a user-agent's authenticity. However, operators do not always keep these lists up to date, so use this method with caution and in conjunction with other verification techniques, such as the reverse DNS check above.
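Cross-referencing an IP against published ranges can be sketched with Python's standard ipaddress module. The CIDR blocks below are placeholders (reserved documentation ranges), not Yandex's actual ranges; in practice you would load the operator's current published list.

```python
import ipaddress

# Placeholder ranges (RFC 5737 / RFC 3849 documentation blocks) standing in
# for an operator's published crawler list -- NOT real Yandex ranges.
PUBLISHED_RANGES = [
    ipaddress.ip_network("198.51.100.0/24"),
    ipaddress.ip_network("2001:db8::/32"),
]

def ip_in_published_ranges(ip, ranges=PUBLISHED_RANGES):
    """Return True if the IP falls inside any of the operator's ranges."""
    addr = ipaddress.ip_address(ip)
    return any(addr in network for network in ranges)
```

Because published lists go stale, treat a match as supporting evidence and a miss as inconclusive, falling back to the reverse DNS check.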