crawler
A crawler is a program used by search engines to collect data from the internet. When a crawler visits a website, it reads the site's content (i.e. the text) and stores it in a database. By following the links it finds on each page, the crawler discovers new pages to visit, so it eventually captures and indexes every website that is linked to from at least one other website.
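A minimal sketch of this fetch, extract, follow loop, using only Python's standard library; the start URL and page limit are placeholders, and a real crawler would additionally respect robots.txt, throttle its requests, and write the captured text into a proper search index rather than an in-memory dictionary.

```python
import urllib.request
import urllib.parse
from collections import deque
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects the href values of all <a> tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url, max_pages=10):
    """Breadth-first crawl: fetch a page, store its content, follow its links."""
    queue = deque([start_url])
    visited = set()
    store = {}  # URL -> raw HTML; stands in for the database mentioned above

    while queue and len(visited) < max_pages:
        url = queue.popleft()
        if url in visited:
            continue
        visited.add(url)
        try:
            with urllib.request.urlopen(url, timeout=5) as response:
                html = response.read().decode("utf-8", errors="replace")
        except Exception:
            continue  # skip unreachable or malformed pages

        store[url] = html  # the "capture" step: keep the page content

        # The "discovery" step: queue every link found on this page.
        parser = LinkExtractor()
        parser.feed(html)
        for link in parser.links:
            absolute = urllib.parse.urljoin(url, link)  # resolve relative links
            if absolute.startswith("http") and absolute not in visited:
                queue.append(absolute)

    return store

if __name__ == "__main__":
    pages = crawl("https://example.com")  # placeholder start URL
    print(f"Fetched {len(pages)} page(s)")
```

The breadth-first queue means pages closer to the start URL are visited first; the `visited` set is what guarantees the crawler does not fetch the same page twice even when many sites link to it.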