GILS
Automated Gathering
"Agents" (also known as robots, spiders, or crawlers) gather information
automatically. Current agent programs exploit network interfaces such as the File
Transfer Protocol (FTP) or the Hypertext Transfer Protocol (HTTP). More advanced
agents can generate locator records using context in addition to content (see, for
example, the Advanced Search Facility).
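A minimal sketch of such a gathering agent, using only the Python standard library; the User-Agent string and the data: URL (standing in for a data provider's HTTP server) are illustrative assumptions, not part of any GILS implementation:

```python
import urllib.request

def fetch(url: str) -> str:
    """Retrieve a resource the way a simple gathering agent would,
    identifying itself so server operators can recognize the crawler."""
    req = urllib.request.Request(
        url, headers={"User-Agent": "example-agent/0.1"})
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8", errors="replace")

# A data: URL stands in for a real provider's server in this sketch;
# a production agent would fetch http:// or ftp:// resources instead.
page = fetch("data:text/plain,hello%20from%20a%20data%20provider")
print(page)
```

A real agent would wrap this fetch in a queue of discovered URLs and respect each server's crawling policies.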
- Mercury - every night, Mercury rebuilds its database using the metadata
  harvested from the data providers' servers.
- CERES - uses the full range of techniques, from fully automated agents to
  human-cataloged directory entries.
- Washington State GILS - uses a combination of a traditional catalog plus
  metadata harvested automatically from Web sites throughout the state.
- DESIRE (Development of a European Service for Information on Research and
  Education) - also exploits metadata embedded within documents encoded in
  HTML.
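Harvesting metadata embedded in HTML, as DESIRE does, amounts to collecting the name/content pairs of a page's META tags. A minimal sketch with the standard library's html.parser; the sample page and its DC.* field names are illustrative assumptions:

```python
from html.parser import HTMLParser

class MetaHarvester(HTMLParser):
    """Collects <meta name="..." content="..."> pairs from an HTML page."""
    def __init__(self):
        super().__init__()
        self.metadata = {}

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            a = dict(attrs)
            if "name" in a and "content" in a:
                # Normalize field names so DC.Title and dc.title match.
                self.metadata[a["name"].lower()] = a["content"]

# Hypothetical page from a data provider, for illustration only.
sample = """<html><head>
<meta name="DC.Title" content="Annual Water Quality Report">
<meta name="DC.Creator" content="Washington State Department of Ecology">
</head><body>...</body></html>"""

harvester = MetaHarvester()
harvester.feed(sample)
print(harvester.metadata)
```

A nightly rebuild in the style of Mercury would run such a harvester over each provider's pages and load the collected pairs into the locator database.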