[Screenshot: An example of WikiTrust]

Developer(s) | UCSC Online Collaboration Lab
Stable release | v2.12 / November 19, 2008 [1]
Preview release | v3.0.pre1 / August 21, 2009 [2]
Written in | PHP, Objective Caml [3]
Operating system | Cross-platform
Type | MediaWiki plug-in
License | BSD, GPL [4]
Website | wikitrust.soe.ucsc.edu
WikiTrust is a software product that uses an automated algorithm to assess the credibility of content and the reputation of authors of wiki articles. It is a plug-in for servers running the MediaWiki platform, such as Wikipedia. When installed on a MediaWiki website, it lets users of that site obtain information about the author, origin, and reliability of its wiki text.[5] Content that is stable, based on an analysis of the article's history, is displayed in normal black-on-white type, while content that is not stable is highlighted in varying shades of yellow or orange.
WikiTrust is a project undertaken by the Online Collaboration Lab at the University of California, Santa Cruz, in response to a Wikipedia:Meta quality initiative sponsored by the Wikimedia Foundation.[5]
The project, discussed at Wikimania 2009, is one of a number of quality/rating tools for Wikipedia content that the Wikimedia Foundation is considering.[6]
The August 2011 issue of Communications of the ACM features an article on WikiTrust.[7]
WikiTrust can be used by anyone in English and German via the Wiki-Watch page details for Wikipedia articles,[8] in several languages via a Firefox plug-in, or by installing it in any MediaWiki configuration.[9]
WikiTrust computes, for each word, three pieces of information: the word's origin, its author, and its trust.
The trust of a word is computed according to how much the word, and the surrounding text, have been revised by users that WikiTrust considers to be of "high reputation". To decide which users have high reputation, WikiTrust uses data-mining algorithms that assess the credibility of editors (users who make changes to a wiki's content) by tracking their contributions.[10][11] The project is still in a beta-test stage.[11][12]
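The reputation mechanism described above can be illustrated with a minimal sketch: an editor gains reputation when later editors keep their words, and loses it when their words are removed. All function names, the starting score, and the update rule below are illustrative assumptions, not WikiTrust's actual algorithm.

```python
def update_reputations(reputation, revisions):
    """Sketch of persistence-based editor reputation.

    reputation: dict mapping author -> float score.
    revisions: ordered list of (author, text) tuples for one article.
    Each author gains reputation when the next editor keeps their
    words, and loses it when their words are removed.
    """
    for i in range(len(revisions) - 1):
        author, text = revisions[i]
        judge, later_text = revisions[i + 1]
        if judge == author:
            continue  # self-edits do not affect one's own reputation
        words = set(text.split())
        kept = words & set(later_text.split())
        survival = len(kept) / max(len(words), 1)  # fraction retained
        # Weight the implicit judgment by the later editor's own
        # reputation, so high-reputation editors matter more.
        weight = reputation.get(judge, 0.1)
        reputation[author] = (reputation.get(author, 0.1)
                              + weight * (2 * survival - 1))
    return reputation
```

Under this toy rule, text that survives review raises its author's score, while a reversion lowers it, which mirrors the behavior (and the criticism) discussed in the sources.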
The criticism has been raised[10] that "the software doesn't really measure trustworthiness, and the danger is that people will trust the software to measure something that it does not." Generally, users whose content persists for a long time without being "reverted" by other editors are deemed more trustworthy by the software.[11][12] This may mean that users who edit controversial articles subject to frequent reversion are found to be less trustworthy than others.[12] Notably, the software uses a variation of the Levenshtein distance to measure how much of a user's edit is kept or rearranged, so that users can receive "partial credit" for their work.[13]
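The "partial credit" idea can be sketched with a plain word-level Levenshtein distance: the closer a later revision stays to a user's contribution, the larger the fraction of credit that contribution earns. The helper names are assumptions; WikiTrust's real algorithm operates on text chunks and is more elaborate.

```python
def levenshtein(a, b):
    """Classic Levenshtein edit distance between two token lists,
    using a rolling single-row dynamic-programming table."""
    m, n = len(a), len(b)
    prev = list(range(n + 1))
    for i in range(1, m + 1):
        cur = [i] + [0] * n
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            cur[j] = min(prev[j] + 1,        # deletion
                         cur[j - 1] + 1,     # insertion
                         prev[j - 1] + cost) # substitution / match
        prev = cur
    return prev[n]

def partial_credit(contributed, surviving):
    """Fraction of a contribution that survives in a later revision:
    1.0 if kept verbatim, approaching 0.0 if fully rewritten."""
    a, b = contributed.split(), surviving.split()
    if not a:
        return 1.0  # nothing contributed, nothing to lose
    d = levenshtein(a, b)
    return max(0.0, 1.0 - d / max(len(a), len(b)))
```

For example, an edit of four words with one word later replaced would earn a credit of 0.75 rather than losing all credit, which is the point of measuring distance instead of exact persistence.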
The software has also been described as measuring the amount of consensus in an article.[14] The community of editors collaborates on articles, revising each other's work until agreement is reached. Users whose edits are more similar to the final agreed version receive more reputation. The point has also been made that consensus reflects the beliefs of the community, so the reputation computed is likewise a reflection of that community.