BotSeer
From Wikipedia, the free encyclopedia
BotSeer is a Web-based information system and search tool that provides resources and services for research on Web robots and on trends in Robots Exclusion Protocol deployment and adherence. It was created and designed by Yang Sun, Isaac G. Councill, Ziming Zhuang and C. Lee Giles.
BotSeer provides three major services: robots.txt search, robot bias analysis[1], and robot-generated log analysis. The prototype of BotSeer also allows users to search six thousand documentation files and source code files from 18 open source crawler projects. BotSeer serves as a resource for studying the regulation and behavior of Web robots, as well as for information about creating effective robots.txt files and crawler implementations. It is publicly available on the Web, hosted by the College of Information Sciences and Technology at the Pennsylvania State University. BotSeer has indexed and analyzed 2.2 million robots.txt files obtained from 13.2 million websites, as well as a large Web server log of real-world robot behavior and related analysis. BotSeer's goal is to assist researchers, webmasters, web crawler developers and others with research and information needs related to Web robots.
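The robot bias that BotSeer's analysis measures is the tendency of robots.txt files to favor some crawlers over others. A hypothetical robots.txt illustrating such bias might look like the following (the user-agent name is a real crawler identifier, but the rules are an invented example, not taken from any site indexed by BotSeer):

```
# Hypothetical example: one crawler is favored, all others are blocked
User-agent: Googlebot
Disallow:

User-agent: *
Disallow: /
```

Here the empty Disallow line grants Googlebot access to the whole site, while the wildcard rule excludes every other robot; aggregated over millions of sites, patterns like this are what the bias analysis quantifies.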
BotSeer has also set up a honeypot[1] to test the ethicality, performance and behavior of web crawlers.
References
- ^ Y. Sun, Z. Zhuang, I. Councill, C. L. Giles, "Determining Bias to Search Engines from Robots.txt," Proceedings of the IEEE/WIC/ACM International Conference on Web Intelligence (WI 2007), pp. 149–155, 2007.
- Webmasters May Shape Search Results. Associated Press (November 28, 2007). Retrieved on 2008-01-15.
- Google favored by Web admins. Network World (November 15, 2007). Retrieved on 2007-12-19.