Published on 2025-08-07T06:18:08Z
SeekportBot
SeekportBot is the official web crawler for Seekport, a Germany-based independent search engine operated by SISTRIX. Its purpose is to discover and index public web content to build Seekport's search database. Seekport positions itself as a free, public, and privacy-focused alternative to the major search engines, and for website owners, being indexed by its bot offers a channel to reach this audience.
What is SeekportBot?
SeekportBot is the web crawler for the German independent search engine Seekport, which is operated by the platform intelligence provider SISTRIX. The bot systematically browses the internet to discover and index web content for its search results. It identifies itself with the user-agent string Mozilla/5.0 (compatible; SeekportBot; +https://bot.seekport.com). SeekportBot is a well-behaved bot that respects robots.txt directives and is designed to have a light footprint on server resources. Its independence from major search platforms is central to Seekport's mission.
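If you want to spot SeekportBot traffic in your own logs or request handlers, a simple pattern match on the User-Agent header is enough as a first pass. The following is a minimal Python sketch; the function name is_seekportbot and the sample string are illustrative, not part of Seekport's documentation, and a user-agent match only tells you what the client claims to be (see the verification section further down).

import re

SEEKPORTBOT_UA = re.compile(r"SeekportBot", re.IGNORECASE)

def is_seekportbot(user_agent: str) -> bool:
    # True if the request's User-Agent header claims to be SeekportBot
    return bool(SEEKPORTBOT_UA.search(user_agent or ""))

sample = "Mozilla/5.0 (compatible; SeekportBot; +https://bot.seekport.com)"
print(is_seekportbot(sample))  # True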
Why is SeekportBot crawling my site?
SeekportBot is crawling your site to index its public content for inclusion in the Seekport search engine. Its goal is to find new and updated content for its search index. The crawler is designed to be lightweight, making as few calls per second as possible. The frequency of visits depends on your site's size, update schedule, and relevance to Seekport's users. This crawling is a standard and authorized activity for a legitimate search engine.
What is the purpose of SeekportBot?
The purpose of SeekportBot is to build and maintain the search index for the Seekport search engine. Seekport promotes itself as a privacy-focused alternative that does not store user data or create user profiles. The data collected by its bot is used exclusively to power its search results, without the influence of advertising. For website owners, being indexed by Seekport provides another channel for discovery, allowing you to reach an audience that values privacy and independent search results.
How do I block SeekportBot?
To prevent SeekportBot from accessing your website, you can add a disallow rule to your robots.txt file. This will prevent your pages from appearing in Seekport's search results.
To block this bot, add the following lines to your robots.txt file:
User-agent: SeekportBot
Disallow: /
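If you want to confirm the rule behaves as intended before deploying it, Python's standard urllib.robotparser module can evaluate the robots.txt rules against the SeekportBot user-agent. This is a minimal sketch; the example URLs are placeholders.

from urllib.robotparser import RobotFileParser

# The rules from above, as they would appear in robots.txt
rules = [
    "User-agent: SeekportBot",
    "Disallow: /",
]

parser = RobotFileParser()
parser.parse(rules)

# SeekportBot is blocked everywhere; other crawlers are unaffected
print(parser.can_fetch("SeekportBot", "https://example.com/any-page"))   # False
print(parser.can_fetch("SomeOtherBot", "https://example.com/any-page"))  # True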
How to verify the authenticity of the user-agent operated by Seekport?
Reverse IP lookup technique
Run the host Linux command two times, starting with the IP address of the requester:
> host IPAddressOfRequest
This returns the reverse lookup (PTR) record for the address (the query is made against the in-addr.arpa zone, e.g. 4.4.8.8.in-addr.arpa for 8.8.4.4); note the hostname it points to.
> host ReverseDNSFromTheOutputOfFirstRequest
This second, forward lookup of that hostname should resolve back to the original IP address. If it does, and the hostname belongs to Seekport, the request is genuine.
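The same reverse-then-forward check can be automated. The sketch below is a minimal Python version using the standard socket module; the function name verify_crawler_ip and the placeholder IP are illustrative, and the "seekport.com" suffix used in the example is an assumption based on the bot's info URL (bot.seekport.com), not something stated in Seekport's documentation.

import socket

def verify_crawler_ip(ip_address: str, expected_suffix: str) -> bool:
    # Reverse-then-forward DNS check: does the IP map to a hostname under
    # expected_suffix, and does that hostname resolve back to the same IP?
    try:
        # Step 1: reverse lookup (equivalent to `host IPAddressOfRequest`)
        hostname, _, _ = socket.gethostbyaddr(ip_address)
    except socket.herror:
        return False  # no PTR record at all
    if not hostname.rstrip(".").endswith(expected_suffix):
        return False  # hostname does not belong to the expected domain
    try:
        # Step 2: forward lookup (equivalent to the second host command)
        _, _, forward_ips = socket.gethostbyname_ex(hostname)
    except socket.gaierror:
        return False
    return ip_address in forward_ips

# Example usage (placeholder IP; "seekport.com" is an assumed suffix,
# check Seekport's own documentation for the exact crawler domain):
print(verify_crawler_ip("203.0.113.10", "seekport.com"))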