The Web Robot Pages

Web Robots (also known as Web Wanderers, Crawlers, or Spiders) are programs that traverse the Web automatically. Search engines such as Google use them to index web content, spammers use them to scan for email addresses, and they have many other uses.

On this site you can learn more about web robots.

About /robots.txt explains what /robots.txt is, and how to use it.

The FAQ answers many frequently asked questions, such as How do I stop robots visiting my site? and How can I get the best listing in
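As a rough sketch of how a well-behaved robot checks /robots.txt before fetching a page, Python's standard library ships a parser for the format. The rules below are a hypothetical example, not taken from any real site:

```python
from urllib import robotparser

# Hypothetical /robots.txt content: block all robots from /private/
rules = """
User-agent: *
Disallow: /private/
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# A polite crawler asks before fetching each URL
print(rp.can_fetch("*", "/private/secret.html"))  # False: disallowed
print(rp.can_fetch("*", "/public/page.html"))     # True: allowed
```

In practice a crawler would call `rp.set_url("https://example.com/robots.txt")` followed by `rp.read()` to fetch the live file instead of parsing a string.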