Link rot

From Wikipedia, the free encyclopedia

Link rot is the process by which hyperlinks on a website gradually become irrelevant or broken over time, as the websites they link to disappear, change their content, or redirect to new locations.

The phrase also describes the effects of failing to update web pages, so that they become out of date, contain old and useless information, and clutter up search engine results. This most frequently occurs on personal web pages and is prevalent on free web hosts such as GeoCities, where there is no financial incentive to fix link rot (many such sites have not been updated for years).

Prevalence

The 404 "not found" response is familiar to even the occasional Web user. A number of studies have examined the prevalence of link rot on the Web, in academic literature, and in digital libraries. In a 2003 experiment, Fetterly et al. discovered that about 0.5% of web pages disappeared each week. McCown et al. (2005) discovered that half of the URLs cited in D-Lib Magazine articles were no longer accessible 10 years after publication, and other studies have shown link rot in academic literature to be even worse (Spinellis, 2003; Lawrence et al., 2001). Nelson and Allen (2002) examined link rot in digital libraries and found that 3% of the objects were no longer accessible after one year.
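
As a rough illustration (an extrapolation, not a figure from the studies themselves), a constant weekly loss rate compounds over a year. The short Python sketch below assumes the 0.5%-per-week figure from Fetterly et al. holds uniformly:

    # Back-of-the-envelope extrapolation: assume a constant 0.5% of web pages
    # disappear each week (figure from Fetterly et al., 2003).
    weekly_loss = 0.005
    surviving_after_year = (1 - weekly_loss) ** 52
    print(f"Estimated share of pages lost per year: {1 - surviving_after_year:.1%}")
    # roughly 23% under this constant-rate assumption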

News sites contribute to the problem by keeping only recent articles freely accessible at their original URLs, then removing them or moving them behind a paid subscription. This causes a heavy loss of supporting links on sites that discuss newsworthy events and cite news articles as references.

Discovering

Detecting link rot for a given URL is difficult with automated methods. If a URL returns an HTTP 200 (OK) response, it may be considered accessible, but the contents of the page may have changed and may no longer be relevant. Some web servers also return a soft 404: a page served with a 200 (OK) response (instead of a 404) that nevertheless indicates the URL is no longer accessible. Bar-Yossef et al. (2004) developed a heuristic for automatically discovering soft 404s.
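
One way to catch soft 404s automatically, in the spirit of Bar-Yossef et al. though not their exact method, is to request a deliberately nonsensical path on the same host: a server that answers it with 200 (OK) is probably serving soft 404s, so a 200 response for the real URL cannot be trusted on its own. A minimal sketch of such a checker, using only Python's standard library and a randomly generated probe path:

    import urllib.error
    import urllib.parse
    import urllib.request
    import uuid

    def http_status(url, timeout=10):
        """Return the HTTP status code for a GET request, or None on network failure."""
        req = urllib.request.Request(url, headers={"User-Agent": "link-rot-checker"})
        try:
            with urllib.request.urlopen(req, timeout=timeout) as resp:
                return resp.status
        except urllib.error.HTTPError as e:
            return e.code                      # 404, 410, 500, ...
        except (urllib.error.URLError, OSError):
            return None                        # DNS failure, timeout, connection refused

    def check_link(url):
        """Classify a URL as 'broken', 'ok' or 'suspect soft 404' (heuristic only)."""
        status = http_status(url)
        if status is None or status >= 400:
            return "broken"
        # Probe a path that almost certainly does not exist on the same host.
        parts = urllib.parse.urlsplit(url)
        probe = urllib.parse.urlunsplit(
            (parts.scheme, parts.netloc, "/" + uuid.uuid4().hex, "", ""))
        if http_status(probe) == 200:
            return "suspect soft 404"          # server answers 200 even for missing pages
        return "ok"

    print(check_link("https://example.com/"))

Even a checker like this cannot tell whether a page that responds normally has drifted away from the content originally linked.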

Modern management

On Wikipedia and other wiki-based websites, broken external links still present a maintenance problem; at least 10% of the external links on Wikipedia are broken. Wikipedia uses a clear color system for internal links, so the user can see whether the target page exists before clicking. When referencing an old website or dated information, editors can link externally to pages saved by a web archiving service, providing a reliable permanent link.

Combating

Web archiving

To combat link rot, web archivists are actively engaged in collecting the Web, or particular portions of it, and ensuring the collection is preserved in an archive, such as an archive site, for future researchers, historians, and the public. The largest web archiving organization is the Internet Archive, which strives to maintain an archive of the entire Web. National libraries, national archives and various consortia of organizations are also involved in archiving culturally important Web content.

Individuals may also use a number of tools that allow them to archive web resources that may go missing in the future:

  • WebCite, a tool specifically for scholarly authors, journal editors and publishers to permanently archive and retrieve cited Internet references "on demand" (Eysenbach and Trudel, 2005).
  • Archive-It, a subscription service that allows institutions to build, manage and search their own web archives.
  • hanzo:web, a personal web archiving service created by Hanzo Archives that can archive a single web resource, a cluster of resources or an entire website, as a one-off, scheduled/repeated or RSS/Atom feed collection, or on demand via Hanzo's open API.
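
Individuals can also ask the Internet Archive's Wayback Machine for an on-demand capture. The minimal sketch below assumes the public "Save Page Now" endpoint at https://web.archive.org/save/ is available and accepts a GET request with the target URL appended:

    import urllib.request

    def request_wayback_snapshot(url, timeout=60):
        """Ask the Wayback Machine to capture `url` on demand (assumes the public
        Save Page Now endpoint at web.archive.org/save/ is reachable)."""
        req = urllib.request.Request(
            "https://web.archive.org/save/" + url,
            headers={"User-Agent": "link-rot-example"})
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            # After any redirects, the final URL should point at the archived copy.
            return resp.geturl()

    print(request_wayback_snapshot("https://example.com/"))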

Webmasters

Webmasters have developed a number of best practices for combating link rot:

  • Avoiding unmanaged hyperlink collections
  • Avoiding links to pages deep in a website ("deep linking")
  • Using hyperlink checking software or a Content Management System (CMS) that automatically checks links
  • Using permalinks
  • Using redirection mechanisms (e.g. "301: Moved Permanently") to automatically refer browsers and crawlers to the new location of a URL (see the sketch after this list)
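
As a concrete illustration of the last point, the sketch below (hypothetical paths, Python standard library only) answers requests for a retired URL with a "301 Moved Permanently" response pointing at its new location, which is how browsers and crawlers learn to update their links. In practice the same mapping is usually expressed in the web server's own configuration (for example an Apache or nginx redirect rule) rather than in application code.

    from http.server import BaseHTTPRequestHandler, HTTPServer

    # Hypothetical mapping from retired paths to their new homes.
    MOVED = {
        "/old-page.html": "/articles/new-page.html",
    }

    class RedirectHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            if self.path in MOVED:
                # Tell browsers and crawlers the resource has moved for good.
                self.send_response(301, "Moved Permanently")
                self.send_header("Location", MOVED[self.path])
                self.end_headers()
            else:
                self.send_response(404, "Not Found")
                self.end_headers()

    if __name__ == "__main__":
        HTTPServer(("localhost", 8000), RedirectHandler).serve_forever()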

Authors citing URLs

A number of studies have shown how widespread link rot is in academic literature (see below). Authors of scholarly publications have also developed best practices for combating link rot in their work.

References

Link rot on the Web

In academic literature

In digital libraries
