Robot Configuration

Status: stable
Developer: Martijn Koster
Extension type: MediaWiki
Edition: BlueSpice free
Dependencies:
License: -
Activated: -
Category: -
Documentation on MediaWiki.org

The "robots.txt" file was introduced to protect one's own pages on the Internet from web crawlers.

This file is located in the root directory of the web server and acts as a control file that can ban web crawlers from specified areas of the site.

It is often used to reduce the data traffic caused by the large number of web crawlers, or to block certain areas of the site from them entirely.
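As an illustration, a minimal "robots.txt" might look like the following. The paths shown are hypothetical examples, not defaults of any particular wiki installation:

```
# Apply the rules below to all crawlers
User-agent: *

# Block crawlers from edit URLs and special pages (example paths)
Disallow: /index.php?
Disallow: /Special:

# Allow everything else
Allow: /
```

Each record starts with a "User-agent" line naming the crawler it applies to ("*" matches all), followed by "Disallow" and "Allow" rules for URL path prefixes. Note that the file is advisory: well-behaved crawlers respect it, but it is not an access-control mechanism.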


For further information about this file, please refer to the Wikipedia article on robots.txt.
