
Automated Gathering

"Agents" (also known as robots, spiders, or crawlers) gather information automatically. Current agent programs exploit network interfaces such as File Transfer Protocol or Hyper Text Transfer Protocol. More advanced agents can generate locator records using context in addition to content (see, for example, the Advanced Search Facility).

