What is a Robot / Spider / Crawler?
Robots (also known as “spiders” or “crawlers”) are programs that visit web pages and download information. Sometimes such robots are welcome and their intentions are good; sometimes they are not welcome and perform unwanted activities. It is important for us to know whether a web page request is being made by a human or by a program, and if it is made by a program, we would like to know why and by whom. There are many reasons why this matters to us; the most important one is to comply with copyright requirements and to prevent unauthorized distribution and use of lyrics (e.g.
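To make the definition concrete, here is a minimal sketch (Python, standard library only) of the basic loop every robot performs: fetch a page, extract its links, and visit those in turn. The starting URL and the "ExampleBot/1.0" User-Agent string are illustrative assumptions, not real identifiers; note that a well-behaved robot announces itself through the User-Agent header, which is one signal a site can use to tell programs from human visitors.

```python
# A minimal sketch of what a web robot does, using only the standard library:
# fetch a page, collect its links, and queue them for later visits.
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import Request, urlopen

class LinkCollector(HTMLParser):
    """Collects href attributes from <a> tags on a fetched page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url, max_pages=3):
    """Breadth-first fetch of a few pages, following links found on each."""
    queue, seen = [start_url], set()
    while queue and len(seen) < max_pages:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        # Well-behaved robots identify themselves in the User-Agent header;
        # "ExampleBot/1.0" is a placeholder name for this sketch.
        request = Request(url, headers={"User-Agent": "ExampleBot/1.0"})
        with urlopen(request) as response:
            html = response.read().decode("utf-8", errors="replace")
        parser = LinkCollector()
        parser.feed(html)
        # Resolve relative links against the current page and queue them.
        queue.extend(urljoin(url, link) for link in parser.links)
    return seen

if __name__ == "__main__":
    print(crawl("https://example.com/"))
```

A robot that omits or forges the User-Agent header is much harder to distinguish from a human visitor, which is why request patterns (rate, order, volume of pages) are also examined when deciding who, or what, is behind a request.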