What is a web robot / spider / crawler?
Robots (also known as spiders or crawlers) are programs, not humans, that visit web pages (documents) and download information. Robots with good intentions are sometimes welcome, but others are not, because they perform unwanted activities. It is important for us to know whether a request for a web page, or a submission to add or update lyrics, is being made by a human or by a robot program. If the request is made by a program, we want to know why and by whom. One of the most important reasons we need this information is that it is necessary for us to comply with copyright requirements, which mandate that we prevent the distribution and use of lyrics without compensation to the lyrics' copyright owners.
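A well-behaved robot typically announces itself with a descriptive User-Agent string and honors the site's robots.txt rules before downloading pages. As a minimal sketch using Python's standard urllib.robotparser (the bot name, robots.txt contents, and paths here are hypothetical):

```python
from urllib import robotparser

# Hypothetical robots.txt that a polite crawler would fetch from the site
ROBOTS_TXT = """\
User-agent: *
Disallow: /private/
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# A polite robot identifies itself so the site knows why and by whom
# requests are being made (hypothetical bot name and info URL)
USER_AGENT = "ExampleBot/1.0 (+http://example.com/bot-info)"

print(rp.can_fetch(USER_AGENT, "/lyrics/song.html"))    # True: allowed
print(rp.can_fetch(USER_AGENT, "/private/data.html"))   # False: disallowed
```

Robots that skip these steps, or that disguise themselves with a browser-like User-Agent, are the kind that sites cannot distinguish from humans without further analysis.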