Reliability of Wikipedia

The reliability of Wikipedia and its volunteer-driven, community-regulated editing model, particularly in its English-language edition, has been questioned and tested. Wikipedia is written and edited by volunteer editors (known as Wikipedians) who generate online content with the editorial oversight of other volunteer editors via community-generated policies and guidelines. The reliability of the project has been tested statistically through comparative review, analysis of historical patterns, and examination of the strengths and weaknesses inherent in its editing process.[3] The online encyclopedia has been criticized for factual unreliability, principally regarding its content, presentation, and editorial processes. Studies and surveys attempting to gauge the reliability of Wikipedia have yielded mixed results. Wikipedia's reliability was frequently criticized in the 2000s but has since improved; its English-language edition was generally praised in the late 2010s and early 2020s.[4][5][6]

Select assessments of its reliability have examined how quickly vandalism—content perceived by editors to constitute false or misleading information—is removed. Two years after the project was started, in 2003, an IBM study found that "vandalism is usually repaired extremely quickly—so quickly that most users will never see its effects".[7][8] The inclusion of false or fabricated content has, at times, lasted for years on Wikipedia due to its volunteer editorship.[9][10] Its editing model facilitates multiple systemic biases, namely selection bias, inclusion bias, participation bias, and group-think bias. The majority of the encyclopedia is written by male editors, leading to a gender bias in coverage, and the makeup of the editing community has prompted concerns about racial bias, spin bias, corporate bias, and national bias, among others.[11][12][13] An ideological bias on Wikipedia has also been identified, on both conscious and subconscious levels. A series of studies from Harvard Business School in 2012 and 2014 found Wikipedia "significantly more biased" than Encyclopædia Britannica but attributed the finding more to the length of Wikipedia's articles than to slanted editing.[14][15]
Instances of non-neutral or conflict-of-interest editing and the use of Wikipedia for "revenge editing" have attracted attention to false, biased, or defamatory content in articles, especially biographies of living people.[16][17] Articles on less technical subjects, such as the social sciences, humanities, and culture, have been known to deal with misinformation cycles, cognitive biases, coverage discrepancies, and editor disputes. The online encyclopedia does not guarantee the validity of its information. Nevertheless, it is regarded as a valuable "starting point" for researchers seeking to examine the listed references, citations, and sources. Academics suggest reviewing reliable sources when assessing the quality of articles.[18][19]
Its coverage of medical and scientific topics such as pathology,[20] toxicology,[21] oncology,[22] pharmaceuticals,[23] and psychiatry[24] was compared to professional and peer-reviewed sources in a 2005 Nature study.[25] A year later Encyclopædia Britannica disputed the Nature study, whose authors, in turn, replied with a further rebuttal.[26][27] Concerns regarding readability and the overuse of technical language were raised in studies published by the American Society of Clinical Oncology (2011),[28] Psychological Medicine (2012),[24] and European Journal of Gastroenterology and Hepatology (2014).[29] The Simple English Wikipedia offers simplified versions of articles, written in Basic English, to make complex topics more accessible to the layperson. Wikipedia's popularity, mass readership, and free accessibility have led the encyclopedia to command a substantial second-hand cognitive authority across the world.[30][31][nb 1]
Wikipedia editing model
Wikipedia allows anonymous editing; contributors (known as "editors") are not required to provide any identification or an email address. A 2009 Dartmouth College study of the English Wikipedia noted that, contrary to usual social expectations, anonymous editors were some of Wikipedia's most productive contributors of valid content.[32] The Dartmouth study was criticized by John Timmer of Ars Technica for its methodological shortcomings.[33]
Wikipedia trusts the same community to self-regulate and become more proficient at quality control. Wikipedia has harnessed the work of millions of people to produce the world's largest knowledge-based site, along with software to support it, resulting in more than nineteen million articles written across more than 280 language versions in fewer than twelve years.[34] For this reason, there has been considerable interest in the project, both academically and from fields as diverse as information technology, business, project management, knowledge acquisition, software programming, other collaborative projects, and sociology: whether the Wikipedia model can produce quality results, what collaboration of this kind can reveal about people, and whether the scale of involvement can overcome the obstacles of individual limitations and poor editorship that would otherwise arise.[citation needed]
Wikipedia's degree of truthfulness stems from its technology, policies, and editor culture. Edit histories are publicly visible. Footnotes show the origins of claims. Editors remove unverifiable claims and overrule ("revert") claims not phrased from a neutral point of view (NPOV). Wikipedia editors also tend towards self-examination and acknowledge Wikipedia's flaws. Its open model permits article-tampering (vandalism), including short-lived jokes and longer hoaxes. Some editors dedicate as much time to trolling (creating vandalism, spam, and harassment) as others do to improving the encyclopedia. The English Wikipedia's editor pool, roughly 40,000 active editors making at least five edits monthly, skews largely male and white, leading to gender- and race-based systemic biases in coverage. Variations in coverage mean that Wikipedia can be, as online communities professor Amy S. Bruckman put it, "the most accurate form of information ever created by humans" on the whole, even as short articles can be "total garbage".[35]
Academics view Wikipedia as representing a "consensus truth" in which readers can check reality in an age of contested facts. For example, when facts surrounding the COVID-19 pandemic rapidly changed or were debated, editors removed claims that did not adhere to the "verifiability" and "NPOV" guidelines.[35]
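The publicly visible edit histories mentioned above can also be inspected programmatically. As a minimal sketch (assuming Python with only the standard library), the following builds a query against the documented MediaWiki Action API and extracts the revision list from a response; the sample response below is an illustrative stand-in for a live reply, not real data.

```python
# Sketch: inspecting an article's public edit history via the MediaWiki
# Action API. Endpoint and parameter names are the standard documented ones;
# the sample JSON is an illustrative stand-in for a live response.
from urllib.parse import urlencode

API_ENDPOINT = "https://en.wikipedia.org/w/api.php"

def revision_history_url(title, limit=5):
    """Build a query URL for the most recent revisions of an article."""
    params = {
        "action": "query",
        "prop": "revisions",
        "titles": title,
        "rvprop": "timestamp|user|comment",
        "rvlimit": limit,
        "format": "json",
    }
    return API_ENDPOINT + "?" + urlencode(params)

def summarize_revisions(response):
    """Extract (timestamp, user, comment) tuples from an API response."""
    revisions = []
    for page in response["query"]["pages"].values():
        for rev in page.get("revisions", []):
            revisions.append((rev["timestamp"], rev["user"], rev["comment"]))
    return revisions

# Illustrative response shape (abridged); a live call returns this structure.
sample = {
    "query": {
        "pages": {
            "12345": {
                "title": "Example article",
                "revisions": [
                    {"timestamp": "2024-01-02T03:04:05Z",
                     "user": "ExampleEditor",
                     "comment": "Reverted vandalism"},
                ],
            }
        }
    }
}

print(summarize_revisions(sample))
```

A live request to the generated URL returns JSON in the same `query.pages` shape, so the same parser applies; this is how third-party tools audit who changed what, and when.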
Fact-checking
Fact-checking of Wikipedia is the process through which Wikipedia editors perform fact-checking of content published in Wikipedia, while fact-checking using Wikipedia is the use of Wikipedia for fact-checking other publications. The broader topic of fact checking in the context of Wikipedia also includes the cultural discussion of the place of Wikipedia in fact-checking. Major platforms including YouTube[36] and Facebook[37] use Wikipedia's content to confirm the accuracy of information in their own media collections. Seeking public trust is a major part of Wikipedia's publication philosophy.[38]
Wikipedia has grown beyond a simple encyclopedia to become what The New York Times called a "factual netting that holds the digital world together".[35] Common questions asked of search engines are answered using knowledge ingested from Wikipedia, and often credit or link to Wikipedia as their source. Wikipedia is likely the most important single source used to train generative artificial intelligence (AI) models, such as ChatGPT, for which Wikipedia is valued as a well-curated data set with highly structured formatting.[39] The accuracy of AI models depends on the quality of their training data, but these models are also fundamentally unable to cite the original source of their knowledge, so AI users draw on Wikipedia's knowledge without knowing that Wikipedia is its source. AI users also receive results that intertwine facts originating from Wikipedia with fictional data points (AI hallucinations), lowering the quality of information absent a real-time fact-check during information retrieval.[35]
Assessments
Criteria for evaluating reliability
The reliability of Wikipedia articles can be measured by the following criteria:

- Accuracy of information provided within articles
- Appropriateness of the images provided with the article
- Appropriateness of the style and focus of the articles[40]
- Susceptibility to, and exclusion and removal of, false information
- Comprehensiveness, scope and coverage within articles and in the range of articles
- Identification of reputable third-party sources as citations
- Verifiability of statements by respected sources[18]
- Stability of the articles
- Susceptibility to editorial and systemic bias
- Quality of writing
Several "market-oriented" extrinsic measures demonstrate that large audiences trust Wikipedia in one way or another. For instance, "50 percent of [US] physicians report that they've consulted ... [Wikipedia] for information on health conditions", according to a report from IMS Institute for Healthcare Informatics.[41]
Comparative studies
On October 24, 2005, the British newspaper The Guardian published a story entitled "Can you trust Wikipedia?" in which a panel of experts was asked to review seven entries related to their fields, scoring each article from 0 to 10. Most of the reviewed articles received marks between 5 and 8. The most common criticisms were poor prose or ease-of-reading issues (three mentions); omissions or inaccuracies, often small but including key omissions in some articles (three mentions); and poor balance, with less important areas given more attention and vice versa (one mention). The most common praises were factual soundness with no glaring inaccuracies (four mentions) and much useful information, including well-selected links, making it possible to "access much information quickly" (three mentions).[42]
In December 2005, the journal Nature published results of an attempted blind study seeking reviewer evaluations of the accuracy of a small subset of articles from Wikipedia and Encyclopædia Britannica. The non-peer-reviewed study was based on Nature's selection of 42 articles on scientific topics, including biographies of well-known scientists. Reviewers found 162 factual errors, omissions, or misleading statements in the sampled Wikipedia articles and 123 in the Britannica articles (a ratio of roughly 4:3). For serious errors, such as misinterpretations of important concepts, four were found in Wikipedia and four in Britannica (1:1). The study concluded that "Wikipedia comes close to Britannica in terms of the accuracy of its science entries",[25] although Wikipedia's articles were often "poorly structured".[25]
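The figures quoted for the Nature comparison can be checked with a few lines of arithmetic. A short sketch using only the numbers above (42 articles, 162 versus 123 flagged items):

```python
# Arithmetic check of the 2005 Nature comparison figures quoted in the text:
# 42 sampled articles, 162 flagged items for Wikipedia, 123 for Britannica.
wikipedia_errors = 162
britannica_errors = 123
articles = 42

per_article_wp = wikipedia_errors / articles   # ≈ 3.86 flagged items per Wikipedia article
per_article_eb = britannica_errors / articles  # ≈ 2.93 flagged items per Britannica article
ratio = wikipedia_errors / britannica_errors   # ≈ 1.32, the "roughly 4:3" ratio

print(round(per_article_wp, 2), round(per_article_eb, 2), round(ratio, 2))
```

The 1.32 ratio is the basis for the competing framings in the dispute that followed: "comes close" when read as errors per article, "a third more inaccuracies" when read as a relative difference.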
Encyclopædia Britannica expressed concerns, leading Nature to release further documentation of its survey method.[43] Based on this additional information, Encyclopædia Britannica denied the validity of the Nature study, stating that it was "fatally flawed". Among Britannica's criticisms were that excerpts rather than the full texts of some of their articles were used, that some of the extracts were compilations that included articles written for the youth version, that Nature did not check the factual assertions of its reviewers, and that many points the reviewers labeled as errors were differences of editorial opinion. Britannica further stated that "While the heading proclaimed that 'Wikipedia comes close to Britannica in terms of the accuracy of its science entries,' the numbers buried deep in the body of the article said precisely the opposite: Wikipedia in fact had a third more inaccuracies than Britannica. (As we demonstrate below, Nature's research grossly exaggerated Britannica's inaccuracies, so we cite this figure only to point out the slanted way in which the numbers were presented.)"[44] Nature acknowledged the compiled nature of some of the Britannica extracts, but denied that this invalidated the conclusions of the study.[45] Encyclopædia Britannica also argued that a breakdown of the errors indicated that the mistakes in Wikipedia were more often the inclusion of incorrect facts, while the mistakes in Britannica were "errors of omission", making "Britannica far more accurate than Wikipedia, according to the figures".[44] Nature has since rejected the Britannica response,[46] stating that any errors on the part of its reviewers were not biased in favor of either encyclopedia, that in some cases it used excerpts of articles from both encyclopedias, and that Britannica did not share particular concerns with Nature before publishing its "open letter" rebuttal.[47][48]
The point-for-point disagreement between the two parties concerned the excerpting of compiled texts and the very small sample size, argued to bias the outcome in favor of Wikipedia over a comprehensive, full-article, large-sample study that would have favored Britannica's quality-controlled format. These criticisms have been echoed in online discussions,[49][50] including of articles citing the Nature study, where commentators have noted a "flawed study design" due to the manual selection of articles or article portions, a lack of statistical power in comparing 40 articles drawn from over 100,000 Britannica articles and over 1 million English Wikipedia articles, and the absence of any statistical analysis (such as reported confidence intervals).[51] Science communicator Jonathan Jarry said in 2024 that the study was historically important and had been cited in almost every scientific paper on Wikipedia's reliability since, but that research of this kind provides only a "snapshot" and quickly becomes unreliable.[52]
In June 2006, Roy Rosenzweig, a professor specializing in American history, published a comparison of the Wikipedia biographies of 25 Americans to the corresponding biographies found on Encarta and American National Biography Online. He wrote that Wikipedia is "surprisingly accurate in reporting names, dates, and events in U.S. history" and described some of the errors as "widely held but inaccurate beliefs". However, he stated that Wikipedia often fails to distinguish important from trivial details, and does not provide the best references. He also complained about Wikipedia's lack of "persuasive analysis and interpretations, and clear and engaging prose".[53][nb 2]
A web-based survey conducted from December 2005 to May 2006 by Larry Press, a professor of Information Systems at California State University at Dominguez Hills, assessed the "accuracy and completeness of Wikipedia articles".[54] Fifty people accepted an invitation to assess an article. Of the fifty, 76% agreed or strongly agreed that the Wikipedia article was accurate, and 46% agreed or strongly agreed that it was complete. Eighteen people compared the article they reviewed to the article on the same topic in the Encyclopædia Britannica. Opinions on accuracy were almost equal between the two encyclopedias (6 favoring Britannica, 7 favoring Wikipedia, 5 stating they were equal), and eleven of the eighteen (61%) found Wikipedia somewhat or substantially more complete, compared to seven of the eighteen (39%) for Britannica. The survey did not attempt a random selection of participants, and it is not clear how the participants were invited.[55]
The German computing magazine c't performed a comparison of Brockhaus Multimedial, Microsoft Encarta, and the German Wikipedia in October 2004: Experts evaluated 66 articles in various fields. In overall score, Wikipedia was rated 3.6 out of 5 points (B-).[56] A second test by c't in February 2007 used 150 search terms, of which 56 were closely evaluated, to compare four digital encyclopedias: Bertelsmann Enzyklopädie 2007, Brockhaus Multimedial premium 2007, Encarta 2007 Enzyklopädie and Wikipedia. It concluded: "We did not find more errors in the texts of the free encyclopedia than in those of its commercial competitors."[57]
Viewing Wikipedia as fitting the economists' definition of a perfectly competitive marketplace of ideas, George Bragues (University of Guelph-Humber) examined Wikipedia's articles on seven top Western philosophers: Aristotle, Plato, Immanuel Kant, René Descartes, Georg Wilhelm Friedrich Hegel, Thomas Aquinas, and John Locke. Wikipedia's articles were compared to a consensus list of themes culled from four reference works in philosophy. Bragues found that, on average, Wikipedia's articles covered only 52% of consensus themes. No errors were found, though there were significant omissions.[58]
PC Pro magazine (August 2007) asked experts to compare four articles (a small sample) in their scientific fields between Wikipedia, Britannica and Encarta. In each case Wikipedia was described as "largely sound", "well handled", "performs well", "good for the bare facts" and "broadly accurate". One article had "a marked deterioration towards the end" while another had "clearer and more elegant" writing, a third was assessed as less well written but better detailed than its competitors, and a fourth was "of more benefit to the serious student than its Encarta or Britannica equivalents". No serious errors were noted in Wikipedia articles, whereas serious errors were noted in one Encarta and one Britannica article.[59]
In October 2007, the Australian magazine PC Authority published a feature article on the accuracy of Wikipedia. The article compared Wikipedia's content to other popular online encyclopedias, namely Britannica and Encarta. The magazine asked experts to evaluate articles pertaining to their field. A total of four articles were reviewed by three experts. Wikipedia was comparable to the other encyclopedias, topping the chemistry category.[60]
In December 2007, German magazine Stern published the results of a comparison between the German Wikipedia and the online version of the 15-volume edition of Brockhaus Enzyklopädie. The test was commissioned to a research institute (Cologne-based WIND GmbH), whose analysts assessed 50 articles from each encyclopedia (covering politics, business, sports, science, culture, entertainment, geography, medicine, history and religion) on four criteria (accuracy, completeness, timeliness and clarity), and judged Wikipedia articles to be more accurate on the average (1.6 on a scale from 1 to 6 versus 2.3 for Brockhaus, with 1 as the best and 6 as the worst). Wikipedia's coverage was also found to be more complete and up to date; however, Brockhaus was judged to be more clearly written, while several Wikipedia articles were criticized as being too complicated for non-experts, and many as too lengthy.[61][62][63]
In its April 2008 issue, British computing magazine PC Plus compared the English Wikipedia with the DVD editions of World Book Encyclopedia and Encyclopædia Britannica, assessing for each the coverage of a series of random subjects. It concluded, "The quality of content is good in all three cases" and advised Wikipedia users "Be aware that erroneous edits do occur, and check anything that seems outlandish with a second source. But the vast majority of Wikipedia is filled with valuable and accurate information."[64]
A 2008 paper in Reference Services Review compared nine Wikipedia entries on historical topics to their counterparts in Encyclopædia Britannica, The Dictionary of American History and American National Biography Online. The paper found that Wikipedia's entries had an overall accuracy rate of 80 percent, whereas the other encyclopedias had an accuracy rate of 95 to 96 percent.[65]
A 2010 study assessed the extent to which Wikipedia pages about the history of countries conformed to the site's policy of verifiability. It found that, in contradiction of this policy, many claims in these articles were not supported by citations, and that many of those that were cited popular media and government websites rather than academic journal articles.[66]
In April 2011, a study was published by Adam Brown of Brigham Young University in the journal PS: Political Science & Politics which examined "thousands of Wikipedia articles about candidates, elections, and officeholders". The study found that while the information in these articles tended to be accurate, the articles examined contained many errors of omission.[67]
A 2012 study co-authored by Shane Greenstein examined a decade of Wikipedia articles on United States politics and found that the more contributors there were to a given article, the more neutral it tended to be, in line with a narrow interpretation of Linus's law.[68]
Reavley et al. (2012) compared the quality of articles on select mental health topics on Wikipedia with corresponding articles in Encyclopædia Britannica and a psychiatry textbook. They asked experts to rate article content with regard to accuracy, up-to-dateness, breadth of coverage, referencing and readability. Wikipedia scored highest on all criteria except readability, and the authors concluded that Wikipedia is as good as or better than Britannica and a standard textbook.[24]
A 2014 perspective piece in the New England Journal of Medicine examined Wikipedia pages about 22 prescription drugs to determine if they had been updated to include the most recent FDA safety warnings. It found that 41% of these pages were updated within two weeks after the warning, 23% were updated more than two weeks later, and the remaining 36% still had not been updated as of January 2014, more than a year after the warning.[69]
A 2014 study in the Journal of the American Pharmacists Association examined 19 Wikipedia articles about herbal supplements, and concluded that all of these articles contained information about their "therapeutic uses and adverse effects", but also concluded that "several lacked information on drug interactions, pregnancy, and contraindications". The study's authors therefore recommended that patients not rely solely on Wikipedia as a source for information about the herbal supplements in question.[70]
Another study published in 2014 in PLOS ONE found that Wikipedia's information about pharmacology was 99.7% accurate when compared to a pharmacology textbook, and that the completeness of such information on Wikipedia was 83.8%. The study also determined that completeness of these Wikipedia articles was lowest (68%) in the category "pharmacokinetics" and highest (91.3%) in the category "indication". The authors concluded that "Wikipedia is an accurate and comprehensive source of drug-related information for undergraduate medical education".[71]
Expert opinion
Librarians' views
In a 2004 interview with The Guardian, self-described information specialist and Internet consultant[72] Philip Bradley said that he would not use Wikipedia and was "not aware of a single librarian who would". He then explained that "the main problem is the lack of authority. With printed publications, the publishers have to ensure that their data are reliable, as their livelihood depends on it. But with something like this, all that goes out the window."[73]
In 2005, the library at Trent University in Ontario stated Wikipedia had many articles that are "long and comprehensive", but that there is "a lot of room for misinformation and bias [and] a lot of variability in both the quality and depth of articles". It adds that Wikipedia has advantages and limitations, that it has "excellent coverage of technical topics" and articles are "often added quickly and, as a result, coverage of current events is quite good", unlike traditional sources, which are unable to keep pace. It concludes that, depending upon the need, one should think critically and assess the appropriateness of one's sources, "whether you are looking for fact or opinion, how in-depth you want to be as you explore a topic, the importance of reliability and accuracy, and the importance of timely or recent information", and adds that Wikipedia can be used in any event as a "starting point".[74]
A 2006 review of Wikipedia by Library Journal, using a panel of librarians, "the toughest critics of reference materials, whatever their format", asked "long standing reviewers" to evaluate three areas of Wikipedia (popular culture, current affairs, and science), and concluded: "While there are still reasons to proceed with caution when using a resource that takes pride in limited professional management, many encouraging signs suggest that (at least for now) Wikipedia may be granted the librarian's seal of approval". A reviewer who "decided to explore controversial historical and current events, hoping to find glaring abuses" said, "I was pleased by Wikipedia's objective presentation of controversial subjects" but that "as with much information floating around in cyberspace, a healthy degree of skepticism and skill at winnowing fact from opinion are required". Other reviewers noted that there is "much variation" but "good content abounds".[75]
In 2007, Michael Gorman, former president of the American Library Association (ALA) stated in an Encyclopædia Britannica blog that "A professor who encourages the use of Wikipedia is the intellectual equivalent of a dietician who recommends a steady diet of Big Macs with everything".[76]
Information Today (March 2006) cites librarian Nancy O'Neill (principal librarian for Reference Services at the Santa Monica Public Library System) as saying that "there is a good deal of skepticism about Wikipedia in the library community" but that "she also admits cheerfully that Wikipedia makes a good starting place for a search. You get terminology, names, and a feel for the subject."[77]
PC Pro (August 2007) cites the head of the European and American Collection at the British Library, Stephen Bury, as stating "Wikipedia is potentially a good thing—it provides a speedier response to new events, and to new evidence on old items". The article concludes: "For [Bury], the problem isn't so much the reliability of Wikipedia's content so much as the way in which it's used." "It's already become the first port of call for the researcher", Bury says, before noting that this is "not necessarily problematic except when they go no further". According to Bury, the trick to using Wikipedia is to understand that "just because it's in an encyclopedia (free, web or printed) doesn't mean it's true. Ask for evidence ... and contribute."[59]
Articles on contentious issues
A 2006 article for the Canadian Library Association (CLA)[78] discussed the Wikipedia approach, process and outcome in depth, commenting for example that in controversial topics, "what is most remarkable is that the two sides actually engaged each other and negotiated a version of the article that both can more or less live with". The author comments that:
In fact Wikipedia has more institutional structure than at first appears. Some 800 experienced users are designated as administrators, with special powers of binding and loosing: they can protect and unprotect, delete and undelete and revert articles, and block and unblock users. They are expected to use their powers in a neutral way, forming and implementing the consensus of the community. The effect of their intervention shows in the discussion pages of most contentious articles. Wikipedia has survived this long because it is easier to reverse vandalism than it is to commit it...
Shi et al. extended this analysis in discussing "The wisdom of polarized crowds" in 2017 based on content analysis of all edits to English Wikipedia articles relating to politics, social issues and science from its start to December 1, 2016. This included almost 233,000 articles representing approximately 5 percent of the English Wikipedia. They wrote: "Political speech [at least in the United States] has become markedly more polarized in recent years ... . [D]espite early promise of the world-wide-web to democratize access to diverse information, increased media choice and social networking platforms ... [create] echo chambers that ... degrade the quality of individual decisions, ... discount identity-incongruent opinions, stimulate and reinforce polarizing information ... foment conflict and even make communication counter-productive. Nevertheless, a large literature documents the largely positive effect that social differences can exert on the collaborative production of information, goods and services. Research demonstrates that individuals from socially distinct groups embody diverse cognitive resources and perspectives that, when cooperatively combined ... outperform those from homogeneous groups." They translated edit histories of millions of Wikipedia editors into a 7-point political identification scale and compared that with Wikipedia's six-level article quality score (stub, start, C, B, good, featured) assigned via a machine learning algorithm. They found that "articles attracting more attention tend to have more balanced engagement ... [and] higher polarization is associated with higher quality."[79]
Academia
Early years of Wikipedia (2000–2019)
Academics have also criticized Wikipedia for its perceived failure as a reliable source and because Wikipedia editors may have no expertise, competence, or credentials in the topics on which they contribute.[80][81] Adrian Riskin, a mathematician at Whittier College, commented that while highly technical articles may be written by mathematicians for mathematicians, more general mathematics topics, such as the article on polynomials, are written in a very amateurish fashion with a number of obvious mistakes.[82]
Because Wikipedia cannot be considered a reliable source, its use is not accepted in many schools and universities when writing a formal paper; some educational institutions have banned it as a primary source while others have limited its use to a pointer to external sources.[80][83][84] The criticism of not being a reliable source, however, may apply not only to Wikipedia but to encyclopedias in general—some university lecturers are not impressed when students cite print-based encyclopedias in assigned work.[85] However, it seems that instructors have underestimated the use of Wikipedia in academia because of these concerns. Researchers and academics contend that while Wikipedia may not be used as a completely accurate source for final papers, it is a valuable jumping-off point for research that can lead to many possibilities if approached critically. What may be missing in academia is an emphasis on critical analysis with regard to the use of Wikipedia in secondary and higher education. According to Polk, Johnston and Evers, academics should not dismiss Wikipedia entirely (there are fewer inaccuracies than there are errors of omission) but rather begin to support it, and teach the use of Wikipedia as an education tool in tandem with critical thinking skills that will allow students to filter the information found on the online encyclopedia and help them critically analyze their findings.[86]
An empirical study conducted in 2006 by a Business School lecturer in Information Systems at the University of Nottingham,[87] and the subject of a review on the technical website Ars Technica,[88] asked 55 academics to review specific Wikipedia articles that either were in their expert field (group 1) or were chosen at random (group 2). It concluded: "The experts found Wikipedia's articles to be more credible than the non-experts. This suggests that the accuracy of Wikipedia is high. However, the results should not be seen as support for Wikipedia as a totally reliable resource as, according to the experts, 13 percent of the articles contain mistakes (10% of the experts reported factual errors of an unspecified degree, 3% of them reported spelling errors)."[89]
The Gould Library at Carleton College in Minnesota has a web page describing the use of Wikipedia in academia. It asserts that "Wikipedia is without question a valuable and informative resource", but that "there is an inherent lack of reliability and stability" to its articles, again drawing attention to similar advantages and limitations as other sources. As with other reviews, it comments that one should assess one's sources and what is desired from them, and that "Wikipedia may be an appropriate resource for some assignments, but not for others." It cites Wikipedia co-founder Jimmy Wales' view that Wikipedia may not be ideal as a source for all academic uses, and (as with other sources) suggests that, at the least, one strength of Wikipedia is that it provides a good starting point for current information on a very wide range of topics.[90]
In 2007, the Chronicle of Higher Education published an article written by Cathy Davidson, Professor of Interdisciplinary Studies and English at Duke University, in which she asserts that Wikipedia should be used to teach students about the concepts of reliability and credibility.[91]
In 2008, Hamlet Isakhanli, founder and president of Khazar University, compared the Encyclopædia Britannica and English Wikipedia articles on Azerbaijan and related subjects. His study found that Wikipedia covered the subject much more widely, more accurately and in more detail, though with some lack of balance, and that Wikipedia was the best source for the first approximation.[92]
In 2011, Karl Kehm, associate professor of physics at Washington College, said: "I do encourage [my students] to use [Wikipedia] as one of many launch points for pursuing original source material. The best Wikipedia entries are well researched with extensive citations".[93]
Some academic journals do refer to Wikipedia articles, though without elevating it to the same level as traditional references. For instance, Wikipedia articles have been referenced in "enhanced perspectives" provided online in the journal Science. The first of these perspectives to provide a hyperlink to Wikipedia was "A White Collar Protein Senses Blue Light" in 2002,[94] and dozens of enhanced perspectives have provided such links since then. The publisher of Science states that these enhanced perspectives "include hypernotes—which link directly to websites of other relevant information available online—beyond the standard bibliographic references".[95]
2020s onwards
Sverrir Steinsson investigated factors that influenced the credibility of English Wikipedia in 2023, and found that "Wikipedia transformed from a dubious source of information in its early years to an increasingly reliable one over time."[96] This was due to it becoming "an active fact-checker and anti-fringe",[97] with "pro-fringe editors" leaving the site as the Wikipedia community changed its interpretation of the NPOV policy and began to more accurately label misleading content as pseudoscience, conspiracy theory, etc., in harmony with the citations used to source that content.[97] This reinterpretation of NPOV "had meaningful consequences, turning an organization that used to lend credence and false balance to pseudoscience, conspiracy theories, and extremism into a proactive debunker, fact-checker and identifier of fringe discourse."[96]
Educational and cognitive psychologist Sam Wineburg said in 2024 that "No, Wikipedia isn't an unreliable source that anyone can edit and that should be avoided. In 2024, it has become a remarkably rigorous self-correcting resource that all of us should be using more often."[98]
Journalism and use of Wikipedia in the newsroom
In his 2014 book Virtual Unreality, Charles Seife, a professor of journalism at New York University, noted Wikipedia's susceptibility to hoaxes and misinformation, including manipulation by commercial and political organizations "masquerading as common people" making edits to Wikipedia. In conclusion, Seife presented the following advice:[99]
Wikipedia is like an old and eccentric uncle.
He can be a lot of fun—over the years he's seen a lot, and he can tell a great story. He's also no dummy; he's accumulated a lot of information and has some strong opinions about what he's gathered. You can learn quite a bit from him. But take everything he says with a grain of salt. A lot of the things he thinks he knows for sure aren't quite right or are taken out of context. And when it comes down to it, sometimes he believes things that are a little bit, well, nuts.
If it ever matters to you whether something he said is real or fictional, it's crucial to check it out with a more reliable source.[99]
Seife observed that when false information from Wikipedia spreads to other publications, it sometimes alters truth itself.[99] On June 28, 2012, for example, an anonymous Wikipedia contributor added the invented nickname "Millville Meteor" to the Wikipedia biography of baseball player Mike Trout. A couple of weeks later, a Newsday sports writer reproduced the nickname in an article, and "with that act, the fake nickname became real".[99] Seife pointed out that while Wikipedia, by some standards, could be described as "roughly as accurate" as traditional publications, and is more up to date, "there's a difference between the kind of error one would find in Wikipedia and what one would in Britannica or Collier's or even in the now-defunct Microsoft Encarta encyclopedia ... the majority of hoaxes on Wikipedia could never have appeared in the old-fashioned encyclopedias."[99] Dwight Garner, reviewing Seife's book in The New York Times, said that he himself had "been burned enough times by bad online information", including "Wikipedia howlers", to have adopted a very sceptical mindset.[100]
In November 2012, judge Brian Leveson was accused of having forgotten "one of the elementary rules of journalism" when he named a "Brett Straub" as one of the founders of The Independent newspaper in his report on the culture, practices and ethics of the British press. The name had been added to the Wikipedia article on The Independent over a year prior, and turned out to be that of a 25-year-old Californian, whose friend had added his name to a string of Wikipedia pages as a prank.[101] Straub was tracked down by The Telegraph and commented, "The fact someone, especially a judge, has believed something on Wikipedia is kind of shocking. My friend went on and edited a bunch of Wikipedia pages and put my name there. [...] I knew my friend had done it but I didn't know how to change them back and I thought someone would. At one point I was the creator of Coca-Cola or something. You know how easy it is to change Wikipedia. Every time he came across a red linked name he put my name in its place."[102]
A 2016 BBC article by Ciaran McCauley similarly noted that "plenty of mischievous, made-up information has found its way" on to Wikipedia and that "many of these fake facts have fallen through the cracks and been taken as gospel by everyone from university academics to major newspapers and broadcasters."[103] Listing examples of journalists being embarrassed by reproducing hoaxes and other falsifications from Wikipedia in their writing, including false information propagated by major news organizations in their obituaries of Maurice Jarre and Ronnie Hazlehurst, McCauley stated: "Any journalist in any newsroom will likely get a sharp slap across the head from an editor for treating Wikipedia with anything but total skepticism (you can imagine the kicking I've taken over this article)."[103]
The Daily Mail—itself banned as a source on Wikipedia in 2017 because of its perceived unreliability—has publicly stated that it "banned all its journalists from using Wikipedia as a sole source in 2014 because of its unreliability".[104]
Slate said in 2022 that "Screenshots of vandalized Wikipedia articles, even when reverted within minutes, often have a much longer afterlife in news reports and on social media, creating the public impression that the platform is more vulnerable to abuse than it actually is."[105]
Science and medicine
Science and medicine are areas where accuracy is of high importance and peer review is the norm. While some of Wikipedia's content has passed a form of peer review, most has not.[106]
A 2008 study examined 80 Wikipedia drug entries. The researchers found few factual errors in this set of articles, but determined that the articles were often missing important information, such as contraindications and drug interactions. One of the researchers noted that "If people went and used this as a sole or authoritative source without contacting a health professional...those are the types of negative impacts that can occur." The researchers also compared Wikipedia to Medscape Drug Reference (MDR) by looking for answers to 80 different questions covering eight categories of drug information, including adverse drug events, dosages, and mechanism of action. They determined that MDR provided answers to 82.5 percent of the questions, while Wikipedia could answer only 40 percent, and that Wikipedia's answers were also less likely to be complete. None of Wikipedia's answers was determined to be factually inaccurate, while four inaccurate answers were found in MDR. But the researchers found 48 errors of omission in the Wikipedia entries, compared to 14 for MDR. The lead investigator concluded: "I think that these errors of omission can be just as dangerous [as inaccuracies]", and he pointed out that drug company representatives have been caught deleting information from Wikipedia entries that makes their drugs look unsafe.[23]
A 2009 survey asked US toxicologists how accurately they rated the portrayal of health risks of chemicals in different media sources. It was based on the answers of 937 members of the Society of Toxicology and found that these experts regarded Wikipedia's reliability in this area as far higher than that of all traditional news media:
In perhaps the most surprising finding in the entire study, all these national media outlets [U.S. newspapers, news magazines, health magazines, broadcast and cable television networks] are easily eclipsed by two representatives of "new media": WebMD and Wikipedia. WebMD is the only news source whose coverage of chemical risk is regarded as accurate by a majority (56 percent) of toxicologists, closely followed by Wikipedia's 45 percent accuracy rating. By contrast, only 15 percent describe as accurate the portrayals of chemical risk found in The New York Times, Washington Post, and Wall Street Journal.[21]
In 2010, researchers compared information about 10 types of cancer on Wikipedia to similar data from the National Cancer Institute's Physician Data Query and concluded "the Wiki resource had similar accuracy and depth to the professionally edited database" and that "sub-analysis comparing common to uncommon cancers demonstrated no difference between the two", but that ease of readability was an issue.[107]
A 2011 study found that the categories most frequently missing from Wikipedia's drug articles were drug interactions and medication use during breastfeeding.[108] Other categories with incomplete coverage were descriptions of off-label indications, contraindications and precautions, adverse drug events, and dosing.[108] The information that most frequently deviated from the other sources used in the study concerned contraindications and precautions, drug absorption, and adverse drug events.[108]
A 2012 study reported that Wikipedia articles about pediatric otolaryngology contained twice as many errors and omissions as the medical database eMedicine.[109]
In a 2014 U.S. study, 10 researchers examined the Wikipedia articles on the 10 most costly medical conditions in the United States and found that 90% of the entries contained errors and statements that contradicted the latest medical research. However, according to Stevie Benton of Wikimedia UK, the sample size used in the research may have been too small to be considered representative.[110][111] Only part of the data was made public, and for two of the statements released for other researchers to examine, the claim that Wikipedia contradicted the peer-reviewed literature was called into question.[112] However, more open studies, published in 2017 and 2020, concluded that Wikipedia provided less accurate medical information than paid-access online encyclopedias.[113][114]
A 2014 study published in PLOS One examined the quality of Wikipedia articles on pharmacology, comparing articles from the English and German Wikipedias with academic textbooks. The analysis found that the accuracy of the drug information on Wikipedia was 99.7%, while its completeness was estimated at 83.8%. It concluded that "the collaborative and participatory design of Wikipedia does generate high quality information on pharmacology that is suitable for undergraduate medical education".[115]
A 2024 review of online information sources for healthcare-related research cautioned against using Wikipedia as a primary reference, and noted its value as a resource to identify sources of information.[116] Jarry said in 2024 that evaluating Wikipedia's reliability on medicine or any subject is challenging and that researchers "have to pick a sample and hope it is representative," saying also that "Wikipedia, overall, has no business being this good."[52]
Judiciary
References to Wikipedia in United States judicial opinions have increased each year since 2004. In a 2017 ruling, the Supreme Court of Texas advised against reliance on the information in Wikipedia for judicial rulings, arguing that its lack of reliability prevents using it as a source of authority in legal opinions.[117][118]
The Supreme Court of India in its judgment in Commr. of Customs, Bangalore vs. ACER India Pvt. (Citation 2007(12)SCALE581) held that "We have referred to Wikipedia, as the learned Counsel for the parties relied thereupon. It is an online encyclopaedia and information can be entered therein by any person and as such it may not be authentic."[119]
Editors of Encyclopædia Britannica
In a 2004 piece called "The Faith-Based Encyclopedia", Robert McHenry, a former editor-in-chief of Encyclopædia Britannica, stated that Wikipedia errs in billing itself as an encyclopedia, because that word implies a level of authority and accountability that he believes cannot be possessed by an openly editable reference. McHenry argued that "the typical user doesn't know how conventional encyclopedias achieve reliability, only that they do".[120] He added:
[H]owever closely a Wikipedia article may at some point in its life attain to reliability, it is forever open to the uninformed or semiliterate meddler... The user who visits Wikipedia to learn about some subject, to confirm some matter of fact, is rather in the position of a visitor to a public restroom. It may be obviously dirty, so that he knows to exercise great care, or it may seem fairly clean, so that he may be lulled into a false sense of security. What he certainly does not know is who has used the facilities before him.[120]
Similarly, Britannica's executive editor, Ted Pappas, was quoted in The Guardian as saying:
The premise of Wikipedia is that continuous improvement will lead to perfection. That premise is completely unproven.[73]
In the September 12, 2006, edition of The Wall Street Journal, Jimmy Wales debated with Dale Hoiberg, editor-in-chief of Encyclopædia Britannica. Hoiberg focused on a need for expertise and control in an encyclopedia and cited Lewis Mumford that overwhelming information could "bring about a state of intellectual enervation and depletion hardly to be distinguished from massive ignorance". Wales emphasized Wikipedia's differences, and asserted that openness and transparency lead to quality. Hoiberg replied that he "had neither the time nor space to respond to [criticisms]" and "could corral any number of links to articles alleging errors in Wikipedia", to which Wales responded: "No problem! Wikipedia to the rescue with a fine article", and included a link to the Wikipedia article Criticism of Wikipedia.[121]
Tools for testing the reliability of articles
While experienced editors can view an article's history and discussion page, it is not so easy for ordinary users to check whether information from Wikipedia is reliable. University projects from California, Switzerland and Germany have tried to improve this through formal analysis and data mining. Wiki-Watch from Germany, which was inspired by WikiBu from Switzerland, shows a rating of up to five stars for every English or German Wikipedia article. Part of this rating is provided by the tool WikiTrust, which indicates the trustworthiness of individual text passages in Wikipedia articles with white (trustworthy) or orange (not trustworthy) markings.[122]
Information loop
Sources accepted as reliable for Wikipedia may rely on Wikipedia as a reference source, sometimes indirectly. If the original information in Wikipedia was false, once it has been reported in sources considered reliable, Wikipedia can use them to reference the false information, giving an apparent credibility to falsehood. This in turn increases the likelihood of the false information being reported in other media.[123] A known example is the Sacha Baron Cohen article, where false information added in Wikipedia was apparently used by two newspapers, leading to it being treated as reliable in Wikipedia.[124][125] This process of creating reliable sources for false facts has been termed "citogenesis" by xkcd webcomic artist Randall Munroe.[126][127][128]
Propagation of misinformation
Somewhat related to the "information loop" is the propagation of misinformation to other websites (Answers.com is just one of many) which will often quote misinformation from Wikipedia verbatim, and without mentioning that it has come from Wikipedia. A piece of misinformation originally taken from a Wikipedia article will live on in perhaps dozens of other websites, even if Wikipedia itself has deleted the unreliable material.[129]
Other
In one article, Information Today (March 2006) likened comparisons between Wikipedia and Britannica to "apples and oranges":[77]
Even the revered Encyclopædia Britannica is riddled with errors, not to mention the subtle yet pervasive biases of individual subjectivity and corporate correctness... There is no one perfect way. Britannica seems to claim that there is. Wikipedia acknowledges there's no such thing. Librarians and information professionals have always known this. That's why we always consult multiple sources and counsel our users to do the same.
Andrew Orlowski, a columnist for The Register, expressed similar criticisms in 2005, writing that the use of the term "encyclopedia" to describe Wikipedia may lead users into believing it is more reliable than it may be.[130]
BBC technology specialist Bill Thompson wrote that "Most Wikipedia entries are written and submitted in good faith, and we should not let the contentious areas such as politics, religion or biography shape our view of the project as a whole", that it forms a good starting point for serious research but that:[131]
No information source is guaranteed to be accurate, and we should not place complete faith in something which can so easily be undermined through malice or ignorance... That does not devalue the project entirely, it just means that we should be skeptical about Wikipedia entries as a primary source of information... It is the same with search engine results. Just because something comes up in the top 10 on MSN Search or Google does not automatically give it credibility or vouch for its accuracy or importance.[131]
Thompson adds the observation that since most popular online sources are inherently unreliable in this way, one byproduct of the information age is a wiser audience who are learning to check information rather than take it on faith due to its source, leading to "a better sense of how to evaluate information sources".[131]
In his 2007 Guide to Military History on the Internet, Simon Fowler rated Wikipedia as "the best general resource" for military history research, and stated that "the results are largely accurate and generally free of bias".[132] When rating Wikipedia as the No. 1 military site he mentioned that "Wikipedia is often criticised for its inaccuracy and bias, but in my experience the military history articles are spot on."[133]
In July 2008, The Economist magazine described Wikipedia as "a user-generated reference service" and noted that Wikipedia's "elaborate moderation rules put a limit to acrimony" generated by cyber-nationalism.[134]
Jimmy Wales, a co-founder of Wikipedia, stresses that encyclopedias of any type are not usually appropriate as primary sources, and should not be relied upon as being authoritative.[135]
Carnegie Mellon professor Randy Pausch offered the following anecdote in his book The Last Lecture. He was surprised that the entry he wrote for the World Book Encyclopedia on virtual reality was accepted without question, so he concluded, "I now believe Wikipedia is a perfectly fine source for your information, because I know what the quality control is for real encyclopedias."[136]
Removal of false information
Fernanda Viégas of the MIT Media Lab and Martin Wattenberg and Kushal Dave of IBM Research studied the flow of editing in the Wikipedia model, with emphasis on breaks in flow (from vandalism or substantial rewrites), showing the dynamic flow of material over time.[137] From a sample of vandalism edits on the English Wikipedia during May 2003, they found that most such acts were repaired within minutes, summarizing:
We've examined many pages on Wikipedia that treat controversial topics, and have discovered that most have, in fact, been vandalized at some point in their history. But we've also found that vandalism is usually repaired extremely quickly—so quickly that most users will never see its effects.[8]
They also stated that "it is essentially impossible to find a crisp definition of vandalism".[137]
Lih (2004) compared articles before and after they were mentioned in the press, and found that externally referenced articles were of higher quality. An informal assessment by the popular IT magazine PC Pro for its 2007 article "Wikipedia Uncovered"[59] tested Wikipedia by introducing 10 errors that "varied between bleeding obvious and deftly subtle" into articles (the researchers later corrected the articles they had edited). Labeling the results "impressive", it noted that all but one were spotted and fixed within the hour, and that "the Wikipedians' tools and know-how were just too much for our team." A second series of another 10 tests, using "far more subtle errors" and additional techniques to conceal their nature, met with similar results: "despite our stealth attempts the vast majority... were discovered remarkably quickly... the ridiculously minor Jesse James error was corrected within a minute and a very slight change to Queen Anne's entry was put right within two minutes". Two of the latter series were not detected. The article concluded that "Wikipedia corrects the vast majority of errors within minutes, but if they're not spotted within the first day the chances... dwindle as you're then relying on someone to spot the errors while reading the article rather than reviewing the edits".
A study in late 2007 systematically inserted inaccuracies into Wikipedia entries about the lives of philosophers. Depending on how exactly the data are interpreted, either one third or one half of the inaccuracies were corrected within 48 hours.[138]
A 2007 peer-reviewed study[139] that measured the actual number of page views with damaged content stated: "42% of damage is repaired almost immediately, i.e., before it can confuse, offend, or mislead anyone. Nonetheless, there are still hundreds of millions of damaged views."[139]
Loc Vu-Quoc, professor for Mechanical and Aerospace Engineering at the University of Florida, stated in 2008 that "sometimes errors may go for years without being corrected as experts don't usually read Wikipedia articles in their own field to correct these errors".[140]
Susceptibility to bias
Individual bias and the WikiScanner tool
In August 2007, WikiScanner, a tool developed by Virgil Griffith of the California Institute of Technology, was released to match anonymous IP edits in the encyclopedia with an extensive database of addresses. News stories appeared about IP addresses from various organizations such as the Central Intelligence Agency, the Democratic Congressional Campaign Committee, Diebold, Inc. and the Australian government being used to make edits to Wikipedia articles, sometimes of an opinionated or questionable nature.[141] The BBC quoted a Wikimedia spokesperson as praising the tool: "We really value transparency and the scanner really takes this to another level. Wikipedia Scanner may prevent an organization or individuals from editing articles that they're really not supposed to."[142]
The WikiScanner story was also covered by The Independent, which stated that many "censorial interventions" by editors with vested interests on a variety of articles in Wikipedia had been discovered:
[Wikipedia] was hailed as a breakthrough in the democratisation of knowledge. But the online encyclopedia has since been hijacked by forces who decided that certain things were best left unknown... Now a website designed to monitor editorial changes made on Wikipedia has found thousands of self-serving edits and traced them to their original source. It has turned out to be hugely embarrassing for armies of political spin doctors and corporate revisionists who believed their censorial interventions had gone unnoticed.[143]
Not everyone hailed WikiScanner as a success for Wikipedia. Oliver Kamm, in a column for The Times, argued instead that:
The WikiScanner is thus an important development in bringing down a pernicious influence on our intellectual life. Critics of the web decry the medium as the cult of the amateur. Wikipedia is worse than that; it is the province of the covert lobby. The most constructive course is to stand on the sidelines and jeer at its pretensions.[144]
WikiScanner only reveals conflict of interest when the editor does not have a Wikipedia account and their IP address is used instead. Conflict of interest editing done by editors with accounts is not detected, since those edits are anonymous to everyone—except for "a handful of privileged Wikipedia admins".[145]
Coverage
Wikipedia has been accused of systemic bias, which is to say its general nature leads, without necessarily any conscious intention, to the propagation of various prejudices. Although many newspaper articles have concentrated on minor, indeed trivial, factual errors in Wikipedia articles, there are also concerns about large-scale, presumably unintentional effects from the increasing influence and use of Wikipedia as a research tool at all levels. In an article in the Times Higher Education magazine (London), philosopher Martin Cohen frames Wikipedia as having "become a monopoly" with "all the prejudices and ignorance of its creators", which he describes as a "youthful cab-drivers" perspective.[146] Cohen draws a grave conclusion from these circumstances: "To control the reference sources that people use is to control the way people comprehend the world. Wikipedia may have a benign, even trivial face, but underneath may lie a more sinister and subtle threat to freedom of thought."[146] That freedom is undermined by what he sees as what matters on Wikipedia: "not your sources but the 'support of the community'."[146]
Critics also point to the tendency to cover topics in a detail disproportionate to their importance. For example, Stephen Colbert once mockingly praised Wikipedia for having a "longer entry on 'lightsabers' than it does on the 'printing press'."[147] In an interview with The Guardian, Dale Hoiberg, the editor-in-chief of Encyclopædia Britannica, noted:
People write of things they're interested in, and so many subjects don't get covered; and news events get covered in great detail. In the past, the entry on Hurricane Frances was more than five times the length of that on Chinese art, and the entry on Coronation Street was twice as long as the article on Tony Blair.[73]
This critical approach has been satirised as "Wikigroaning", a term coined by Jon Hendren[148] of the website Something Awful.[149] In the game, two articles (preferably with similar names) are compared: one on an acknowledged serious or classical subject and the other on a popular topic or current event.[150] Defenders of broad inclusion criteria have held that the encyclopedia's coverage of pop culture does not impose space constraints on the coverage of more serious subjects (see "Wiki is not paper"). Ivor Tossell wrote:
That Wikipedia is chock full of useless arcana (and did you know, by the way, that the article on "Debate" is shorter than the piece that weighs the relative merits of the 1978 and 2003 versions of Battlestar Galactica?) isn't a knock against it: Since it can grow infinitely, the silly articles aren't depriving the serious ones of space.[151]
Wikipedia has been accused of deficiencies in comprehensiveness because of its voluntary nature, and of reflecting the systemic biases of its contributors. Wikipedia co-founder Larry Sanger stated in 2004, "when it comes to relatively specialized topics (outside of the interests of most of the contributors), the project's credibility is very uneven."[152] He expanded on this 16 years later in May 2020, by comparing how coverage impacts tone between the articles of U.S. presidents Donald Trump (seen as negative) and Barack Obama (seen as positive).[153][154]
In a GamesRadar editorial, columnist Charlie Barrat juxtaposed Wikipedia's coverage of video game-related topics with its smaller content about topics that have greater real-world significance, such as God, World War II and former U.S. presidents.[155] Wikipedia has been praised for making it possible for articles to be updated or created in response to current events. Its editors have also argued that, as a website, Wikipedia is able to include articles on a greater number of subjects than print encyclopedias can.[156]
A 2011 study reported evidence of cultural bias in Wikipedia articles about famous people on both the English and Polish Wikipedias. These biases included those pertaining to the cultures of both the United States and Poland on each of the corresponding-language Wikipedias, as well as a pro-U.S./English-language bias on both of them.[157]
Notability of article topics
Wikipedia's notability guidelines, which are used by editors to determine if a subject merits its own article, and the application thereof, are the subject of much criticism.[158] In May 2018, a Wikipedia editor rejected a draft article about Donna Strickland before she won the Nobel Prize in Physics in November of the same year, because no independent sources were given to show that Strickland was sufficiently notable by Wikipedia's standards. Journalists highlighted this as an indicator of the limited visibility of women in science compared to their male colleagues.[159][160]
The gender bias on Wikipedia is well documented and has prompted a movement to increase the number of notable women on Wikipedia through the Women in Red WikiProject. In an article entitled "Seeking Disambiguation", Annalisa Merelli interviewed Catalina Cruz, a candidate for office in Queens, New York in the 2018 election who had the notorious SEO disadvantage of having the same name as a porn star with a Wikipedia page. Merelli also interviewed the Wikipedia editor who wrote the candidate's ill-fated article (which was deleted, then restored, after she won the election). She described the Articles for Deletion process and pointed to other candidates who had pages on the English Wikipedia despite never having held office.[161]
Novelist Nicholson Baker, critical of deletionism, writes: "There are quires, reams, bales of controversy over what constitutes notability in Wikipedia: nobody will ever sort it out."[162]
Journalist Timothy Noah wrote of his treatment: "Wikipedia's notability policy resembles U.S. immigration policy before 9/11: stringent rules, spotty enforcement". In the same article, Noah mentions that the Pulitzer Prize-winning writer Stacy Schiff was not considered notable enough for a Wikipedia entry until she wrote her article "Know it All" about the Wikipedia Essjay controversy.[163]
On a more general level, a 2014 study found no correlation between the characteristics of a given Wikipedia page about an academic and the academic's notability as determined by citation counts. The metrics examined for each Wikipedia page included its length, the number of links to it from other articles, and the number of edits made to it. The study also found that Wikipedia did not properly cover the ISI's most highly cited researchers.[164]
In 2020, Wikipedia was criticized for the amount of time it took for an article about Theresa Greenfield, a candidate for the 2020 United States Senate election in Iowa, to leave Wikipedia's Articles for Creation process and become published. Particularly, the criteria for notability were criticized, with The Washington Post reporting: "Greenfield is a uniquely tricky case for Wikipedia because she doesn't have the background that most candidates for major political office typically have (like prior government experience or prominence in business). Even if Wikipedia editors could recognize she was prominent, she had a hard time meeting the official criteria for notability."[165] Jimmy Wales also criticized the long process on his talk page.[166]
Political bias
Wikipedia co-founder Jimmy Wales stated in 2006: "The Wikipedia community is very diverse, from liberal to conservative to libertarian and beyond. If averages mattered, and due to the nature of the wiki software (no voting) they almost certainly don't, I would say that the Wikipedia community is slightly more liberal than the U.S. population on average, because we are global and the international community of English speakers is slightly more liberal than the U.S. population. There are no data or surveys to back that."[167]
A number of politically conservative commentators have argued that Wikipedia's coverage is affected by liberal bias.[168] Andrew Schlafly created Conservapedia because he found Wikipedia "increasingly anti-Christian and anti-American" for its frequent use of British spelling and coverage of topics like creationism and the effect of Christianity on the Renaissance.[169] In 2007, an article in The Christian Post criticized Wikipedia's coverage of intelligent design, saying that it was biased and hypocritical.[170] Lawrence Solomon of the National Review stated that Wikipedia articles on subjects like global warming, intelligent design, and Roe v. Wade are slanted in favor of liberal views.[171][non-primary source needed] In a September 2010 issue of the conservative weekly Human Events, Rowan Scarborough presented a critique of Wikipedia's coverage of American politicians prominent in the approaching midterm elections as evidence of systemic liberal bias. Scarborough compared the biographical articles of liberal and conservative opponents in Senate races in the Alaska Republican primary and the Delaware and Nevada general election, emphasizing the quantity of negative coverage of Tea Party movement-endorsed candidates. He also cited some criticism by Lawrence Solomon and quoted in full the lead section of Wikipedia's article on the conservative wiki Conservapedia as evidence of an underlying bias.[172][non-primary source needed] Jonathan Sidener of The San Diego Union-Tribune wrote that "vandalism and self-serving misinformation [are] common particularly in the political articles".[173][non-primary source needed] A 2015 study found that negative facts are more likely to be removed from Wikipedia articles on U.S. senators than positive facts but did not find any significant difference relating to political affiliation.[174]
Amid the George Floyd protests, there were several disputes over racial justice on Wikipedia.[168] The Wikipedia community voted against a proposal to black out the website in support of Black Lives Matter, on the grounds that doing so might threaten Wikipedia's reputation for neutrality.[168][nb 3] The protests also led to the creation of the WikiProject Black Lives Matter, in line with AfroCROWD's Juneteenth efforts to improve the coverage of civil rights movement-related topics; the Black Lives Matter project was nominated for deletion on the grounds that it was "non-neutral advocacy".[168] On Wikipedia, neutrality is less a fixed state than a process achieved through consensus. Social scientist Jackie Koerner took issue with the word neutrality and said she preferred the word balance, because she believed that one of Wikipedia's goals should be knowledge equity.[168]
The Japanese Wikipedia has been accused of right-wing historical revisionism, particularly on articles related to its role in World War II and colonialism, by a number of scholars.[175][176][177][178] The issue has been the subject of research supported by the Wikimedia Foundation.[179]
Reliability as a source in other contexts
Despite its status as a non-primary source, Wikipedia has been cited as evidence in some legal cases. In January 2007, The New York Times reported that U.S. courts vary greatly in their treatment of Wikipedia as a source of information, with over 100 judicial rulings having relied on the encyclopedia, including cases involving taxes, narcotics, and civil matters such as personal injury and matrimonial disputes.[180]
In April 2012, The Wall Street Journal reported that in the five years since the 2007 The New York Times story, federal courts of appeals had cited Wikipedia about 95 times. The story also reported that the U.S. Court of Appeals for the Fourth Circuit vacated convictions in a cockfighting case because a juror used Wikipedia to research an element of the crime, expressing in its decision concerns about Wikipedia's reliability.[181]
In one notable case, a decision concerning the Formula One racing trademark,[182] the UK Intellectual Property Office considered both Wikipedia's reliability and its usefulness as a source of evidence:
Wikipedia has sometimes suffered from the self-editing that is intrinsic to it, giving rise at times to potentially libellous statements. However, inherently, I cannot see that what is in Wikipedia is any less likely to be true than what is published in a book or on the websites of news organizations. [Formula One's lawyer] did not express any concerns about the Wikipedia evidence [presented by the plaintiff]. I consider that the evidence from Wikipedia can be taken at face value.
The case turned substantively upon evidence cited from Wikipedia in 2006 as to the usage and interpretation of the term Formula One.
In the United States, the United States Court of Federal Claims has ruled that "Wikipedia may not be a reliable source of information"[183] and that "...Articles [from Wikipedia] do not—at least on their face—remotely meet this reliability requirement...A review of the Wikipedia website reveals a pervasive and, for our purposes, disturbing series of disclaimers...".[180][184] Such disclaimers include Wikipedia's inability to guarantee the validity of the information in its articles and its lack of formal peer review.
Among the reasons given for such statements about Wikipedia's reliability is the instability of its articles: ongoing editing may cause new readers to find information that differs from what was originally cited. In addition, according to Stephen Gillers, a professor at New York University Law School, "the most critical fact is public acceptance", and therefore "a judge should not use Wikipedia when the public is not prepared to accept it as authority".[185]
Wikipedia has also become a key source for some current news events, such as the 2007 Virginia Tech massacre, when The New York Times cited Wikimedia figures in reporting 750,000 page views of the article in the two days after the event:
Even The Roanoke Times, which is published near Blacksburg, Virginia, where the university is located, noted on Thursday that Wikipedia "has emerged as the clearinghouse for detailed information on the event".[186]
The Washington Post commented, in the context of 2008 presidential election candidate biographies, that despite occasional brief vandalism, "it's hard to find a more up-to-date, detailed, thorough article on Obama than Wikipedia's. As of Friday (14 September 2007), Obama's article—more than 22 pages long, with 15 sections covering his personal and professional life—had a reference list of 167 sources."[187]
Broad opinions
Several commentators have staked out a middle ground, asserting that the project contains much valuable knowledge and has some reliability, even if the degree has not yet been assessed with certainty. Commentators taking this view include danah boyd [sic], who in 2005 discussed Wikipedia as an academic source, concluding that "[i]t will never be an encyclopedia, but it will contain extensive knowledge that is quite valuable for different purposes",[188] and Bill Thompson, who stated, "I use the Wikipedia a lot. It is a good starting point for serious research, but I would never accept something that I read there without checking."[131]
Information Today's March 2006 article[77] concludes on a similar theme:
The inconvenient reality is that people and their products are messy, whether produced in a top-down or bottom-up manner. Almost every source includes errors... Many non-fiction books are produced via an appallingly sloppy process... In this author's opinion, the flap over Wikipedia was significantly overblown, but contained a silver lining: People are becoming more aware of the perils of accepting information at face value. They have learned not to consult just one source.
Dan Gillmor, a Silicon Valley commentator and author commented in October 2004 that, "I don't think anyone is saying Wikipedia is an absolute replacement for a traditional encyclopedia. But in the topics I know something about, I've found Wikipedia to be as accurate as any other source I've found."[73]
Larry Sanger stated on Kuro5hin in 2001 that "Given enough eyeballs, all errors are shallow",[189] a paraphrase of Linus's law of open-source software development.
Likewise, technology figure Joi Ito wrote on Wikipedia's authority, "[a]lthough it depends a bit on the field, the question is whether something is more likely to be true coming from a source whose resume sounds authoritative, or a source that has been viewed by hundreds of thousands of people (with the ability to comment) and has survived."[190]
In a 2008 letter to the editor of Physics Today, Gregg Jaeger, an associate professor at Boston University,[191] characterized Wikipedia as a medium susceptible to fostering "anarchy and distortions" in relation to scientific information.[192][nb 4]
People known to use or recommend Wikipedia as a reference source include film critic Roger Ebert,[193][194][195][196] comedian Rosie O'Donnell,[197] University of Maryland physicist Robert L. Park,[198] Rutgers University sociology professor Ted Goertzel[199][200] and scientific skepticism promoter and investigator James Randi.[201] Periodicals that publish articles featuring citations of Wikipedia as a source include the American science magazines Skeptic[202][203] and Skeptical Inquirer.[204] In the January 2013 episode of his talk show, Stossel, about how ideas can flourish without regulation, journalist John Stossel interviewed Wikipedia co-founder Jimmy Wales, and discussed the success of Wikipedia's model versus that of Britannica, during which Stossel stated that his own Wikipedia article exhibited only one error.[205]
Jean Goodwin has written on the reasons why Wikipedia may be trusted: while readers may not be able to assess the actual expertise of a given article's authors, they can assess the passion of Wikipedians, which may itself provide a reason for trust.[206]
Dariusz Jemielniak, a Wikimedia Foundation Board of Trustees member, suggested in 2019 that given the arrival of Wikipedia's 18th birthday, "maybe academics should start treating it as an adult".[207]
Notable incidents
False biographical information
Inaccurate information may persist in Wikipedia for a long time before it is challenged. The most prominent cases reported by mainstream media have involved biographies of living persons. The Seigenthaler incident demonstrated that the subject of a biographical article must sometimes fix blatant lies about his or her own life. In May 2005, a user edited the biographical article on John Seigenthaler Sr. so that it contained several false and defamatory statements.[10] The inaccurate claims went unnoticed from May until September 2005, when they were discovered by Victor S. Johnson, Jr., a friend of Seigenthaler. Wikipedia content is often mirrored at sites such as Answers.com, which means that incorrect information can be replicated alongside correct information across a number of web sources. Such information can develop a misleading air of authority because of its presence at such sites: "Then [Seigenthaler's] son discovered that his father's hoax biography also appeared on two other sites, Reference.com and Answers.com, which took direct feeds from Wikipedia. It was out there for four months before Seigenthaler realized and got the Wikipedia entry replaced with a more reliable account. The lies remained for another three weeks on the mirror sites downstream."[208]
Seth Finkelstein reported in an article in The Guardian on his efforts to remove his own biography page from Wikipedia, simply because it was subjected to defamation: "Wikipedia has a short biography of me, originally added in February 2004, mostly concerned with my internet civil liberties achievements. After discovering in May 2006 that it had been vandalised in March, possibly by a long-time opponent, and that the attack had been subsequently propagated to many other sites which (legally) repackage Wikipedia's content, the article's existence seemed to me overall to be harmful rather than helpful." He added: "For people who are not very prominent, Wikipedia biographies can be an 'attractive nuisance'. It says, to every troll, vandal, and score-settler: 'Here's an article about a person where you can, with no accountability whatsoever, write any libel, defamation, or smear. It won't be a marginal comment with the social status of an inconsequential rant, but rather will be made prominent about the person, and reputation-laundered with the institutional status of an encyclopedia.'"[209]
In the same article, Finkelstein recounts how he voted to have his own biography deemed "not notable enough" so that it would be removed from Wikipedia. He goes on to recount a similar story involving Angela Beesley, previously a prominent member of the foundation that runs Wikipedia. Taner Akçam, a Turkish history professor at the University of Minnesota, was detained at the Montreal airport in 2007 after his Wikipedia article was vandalized by Turkish nationalists; although the mistake was resolved, he was detained again in the United States two days later on the same suspicion.[210]
On March 2, 2007, MSNBC reported that Hillary Clinton had been incorrectly listed for 20 months in her Wikipedia biography as valedictorian of her class of 1969 at Wellesley College. (Hillary Rodham was not the valedictorian, though she did speak at commencement.)[211] The article included a link to the Wikipedia edit,[212] where the incorrect information was added on July 9, 2005. After the msnbc.com report, the inaccurate information was removed the same day.[213][nb 5]
Attempts to perpetrate hoaxes may not be confined to editing existing Wikipedia articles. In October 2005, Alan Mcilwraith, a former call center worker from Scotland, created a Wikipedia article in which he claimed to be a highly decorated war hero. The article was quickly identified by other users as unreliable (see the Wikipedia Signpost article of April 17, 2006); however, Mcilwraith had also succeeded in convincing a number of charities and media organizations that he was who he claimed to be: "The 28-year-old, who calls himself Captain Sir Alan McIlwraith, KBE, DSO, MC, has mixed with celebrities for at least one fundraising event. But last night, an Army spokesman said: 'I can confirm he is a fraud. He has never been an officer, soldier or Army cadet.'"[214]
In May 2010, French politician Ségolène Royal publicly praised the memory of Léon-Robert de l'Astran, an 18th-century naturalist, humanist and son of a slave trader, who had opposed the slave trade. The newspaper Sud-Ouest revealed a month later that de l'Astran had never existed—except as the subject of an article in the French Wikipedia. Historian Jean-Louis Mahé discovered that de l'Astran was fictional after a student, interested by Royal's praise of him, asked Mahé about him. Mahé's research led him to realize that de l'Astran did not exist in any archives, and he traced the hoax back to the Rotary Club of La Rochelle. The article, created by members of the Club in January 2007, had thus remained online for three years—unsourced—before the hoax was uncovered. Upon Sud-Ouest's revelation—repeated in other major French newspapers—French Wikipedia administrator DonCamillo immediately deleted the article.[9][215][216][217][218][219]
There have also been instances of users deliberately inserting false information into Wikipedia in order to test the system and demonstrate its alleged unreliability. Journalist Gene Weingarten ran such a test in 2007 by anonymously inserting false information into his own biography. The fabrications were removed 27 hours later by a Wikipedia editor who was regularly watching changes to that article.[220] Television personality Stephen Colbert lampooned this drawback of Wikipedia, calling it wikiality.[221]
"Death by Wikipedia" is a phenomenon in which a person is erroneously proclaimed dead through vandalism. Articles about the comedian Paul Reiser, British television host Vernon Kay, French professor Bertrand Meyer, and the West Virginia Senator Robert Byrd, who died on June 28, 2010, have been vandalized in this way.[222][223][224][nb 6]
Other false information
In June 2007, an anonymous Wikipedia contributor became involved in the Chris Benoit double murder and suicide because of an unverified piece of information he added to the "Chris Benoit" English Wikipedia article. This information regarding Benoit's wife's death was added fourteen hours before police discovered the bodies of Benoit and his family.[225] Police detectives seized computer equipment from the man held responsible for the postings, but believed he was uninvolved and did not press charges.[226] The IP address from which the edit was made was traced to earlier instances of Wikipedia vandalism. The contributor apologized on Wikinews, saying: "I will never vandalize anything on Wikipedia or post wrongful information. I will never post anything here again unless it is pure fact ..."[227]
On August 29, 2008, shortly after the first-round draw for the UEFA Cup was completed, an edit adding a false nickname for the club's fans was made to the article on the football club AC Omonia, apparently by users of the website B3ta.[228][nb 7] On September 18, 2008, David Anderson, a British journalist writing for the Daily Mirror, quoted the nickname in his match preview ahead of Omonia's game with Manchester City, which appeared in the web and print versions of the Mirror; the nickname was quoted again in subsequent editions on September 19.[229][230]
In May 2009, University College Dublin sociology student Shane Fitzgerald added an incorrect quote to the article on the recently deceased composer Maurice Jarre. Fitzgerald wanted to demonstrate the potential dangers of news reporters' reliance on the internet for information.[231] Although Fitzgerald's edits were removed three times from the Wikipedia article for lack of sourcing,[232] they were nevertheless copied into obituary columns in newspapers worldwide.[233] Fitzgerald believes that if he had not come forward his quote would have remained in history as fact.[232]
After the 2010 FIFA World Cup, FIFA president Sepp Blatter was presented with the Order of the Companions of Oliver Reginald Tambo. The citation, however, read: "The Order of the Companions of OR Tambo in Gold—awarded to Joseph Sepp Bellend Blatter (1936–) for his exceptional contribution to the field of football and support for the hosting of the Fifa World Cup on the African continent", after the name on his Wikipedia entry was vandalized.[234]
The death of Norman Wisdom in October 2010 led several major newspapers to repeat the false claim, drawn from Wikipedia, that he was the author of the lyrics of the Second World War song "(There'll Be Bluebirds Over) The White Cliffs of Dover".[235]
In October 2012, the Asian Football Confederation official website published an article about the United Arab Emirates national football team's bid to qualify for the 2015 AFC Asian Cup, in which the team's nickname was stated to be the "Sand Monkeys". This was the indirect result of vandalism of the Wikipedia article on the team, and the AFC was forced to apologise for what was perceived as a racist slur.[236][237]
In December 2012, an article titled "Bicholim conflict"[238] was deleted after standing since 2007.[239] It described a war supposedly fought in India between 1640 and 1641, but was later confirmed to be completely fictitious.[240] The hoax article had won Wikipedia's "Good Article" status, conferred on fewer than 1 percent of articles on the site, a few months after its creation in 2007, and held that status for five years.[241]
In March 2013, it was discovered that both Wikipedia and IMDb had for three-and-a-half years contained articles on a fictitious Russian filmmaker named Yuri Gadyukin. False information had been planted in both sites as part of a viral promotion campaign for an upcoming film.[242]
In May 2014, The New Yorker reported that a 17-year-old student had added an invented nickname to the Wikipedia article on the coati in 2008, saying coatis were also known as "Brazilian aardvarks". The taxonomically false information, inserted as a private joke, lasted for six years in Wikipedia and over this time came to be propagated by hundreds of websites, several newspapers (one of which was later cited as a source in Wikipedia) and even books published by university presses. It was only removed from Wikipedia after publication of the New Yorker article, in which the student explained how the joke had come about.[1][2]
In March 2015, it became known that an article on Wikipedia entitled "Jar'Edo Wens", purportedly about an Australian aboriginal deity of that name, was a hoax. The article had survived for more than nine years before being deleted, making it one of the longest-lived documented hoax articles in Wikipedia's history. The article spawned mentions of the fake god on numerous other websites as well as in a book titled Atheism and the Case Against Christ.[243][244][245]
In August 2019, a discredited theory was removed from the article Warsaw concentration camp, over 10 years after it was debunked in mainstream scholarly literature. The article was first drafted in August 2004 by an established editor who presented as fact a fringe theory that the camp contained gas chambers in which 200,000 people perished. With the misinformation presented as fact for 15 years, media sources dubbed it "Wikipedia's longest-standing hoax".[246][247][248]
In June 2022, it was discovered that an editor known as Zhemao (Chinese: 折毛) had created over 200 articles on the Chinese Wikipedia about fabricated events in medieval Russian history.[249] Dubbed the Zhemao hoaxes, the hoax articles combined research and fantasy, creating an alternate history centered around a "Kashin silver mine" and political ties between "princes of Tver" and "dukes of Moscow".[250]
In August 2022, Wikipedia criticism site Wikipediocracy published an interview with a hoaxer who ten years prior had added a hoax to Wikipedia, claiming that an "Alan MacMasters" had invented the electric toaster. The false information was widely reproduced online as well as in newspapers and books subsequently cited in Wikipedia.[251][252][253]
In 2023, Jan Grabowski and Shira Klein published an article in the Journal of Holocaust Research in which they claim to have discovered a "systematic, intentional distortion of Holocaust history" on the English-language Wikipedia.[254] Analysing 25 Wikipedia articles and almost 300 back pages (including talk pages, noticeboards and arbitration cases), Grabowski and Klein believe they have shown how a small group of editors managed to impose a fringe narrative on Polish–Jewish relations, informed by Polish nationalist propaganda and far removed from evidence-driven historical research. Supposed nationalist editing on these and other articles allegedly included content ranging "from minor errors to subtle manipulations and outright lies", examples of which the authors offer.[254] A response to Grabowski and Klein's article, which argues that their main conclusions are misleading or false, was published by Piotr Konieczny in the journal Holocaust Studies in 2025.[255]
Conflict-of-interest editing on Wikipedia
Political interests and advocacy
While Wikipedia policy requires articles to have a neutral point of view, there have been attempts to place a spin on articles. In January 2006, several staffers of members of the U.S. House of Representatives attempted to cleanse their respective bosses' biographies on Wikipedia, and to insert negative remarks on political opponents. References to a campaign promise by Martin Meehan to surrender his seat in 2000 were deleted, and negative comments were inserted into the articles on U.S. Senator Bill Frist and Eric Cantor, a congressman from Virginia. Numerous other changes were made from an IP address assigned to the House of Representatives.[256] In an interview, Jimmy Wales remarked that the changes were "not cool".[257]
On August 31, 2008, The New York Times ran an article detailing the edits made to the biography of Sarah Palin in the wake of her nomination as running mate of John McCain. During the 24 hours before the McCain campaign announcement, 30 edits, many of them flattering details, were made to the article by Wikipedia single-purpose user identity Young Trigg. This person later acknowledged working on the McCain campaign, and having several Wikipedia user accounts.[258][259]
Larry Delay and Pablo Bachelet write that, from their perspective, some articles dealing with Latin American history and groups (such as the Sandinistas and Cuba) lack political neutrality, being written from a sympathetic Marxist standpoint that treats socialist dictatorships favorably at the expense of alternative positions.[260][261]
In November 2007, libelous accusations were made against two politicians from southwestern France, Jean-Pierre Grand and Hélène Mandroux-Colas, on their Wikipedia biographies. Jean-Pierre Grand asked the president of the French National Assembly and the prime minister of France to reinforce the legislation on the penal responsibility of Internet sites and of authors who peddle false information in order to cause harm.[262] Senator Jean Louis Masson then requested the Minister of Justice to tell him whether it would be possible to increase the criminal responsibilities of hosting providers, site operators, and authors of libelous content; the minister declined to do so, recalling the existing rules in the LCEN law.[263]
In 2009, Wikipedia banned the Church of Scientology from editing any articles on its site, after articles concerning Scientology had been edited by members of the group to improve its portrayal.[264]
On August 25, 2010, the Toronto Star reported that the Canadian "government is now conducting two investigations into federal employees who have taken to Wikipedia to express their opinion on federal policies and bitter political debates."[265]
In 2010, Al Jazeera's Teymoor Nabili suggested that the Cyrus Cylinder article had been edited for political purposes by "an apparent tussle of opinions in the shadowy world of hard drives and 'independent' editors that comprise the Wikipedia industry." He suggested that after the 2009 Iranian presidential election and the ensuing "anti-Iranian activities", a "strenuous attempt to portray the cylinder as nothing more than the propaganda tool of an aggressive invader" became visible. According to his analysis of the edits made during 2009 and 2010, they represented "a complete dismissal of the suggestion that the cylinder, or Cyrus' actions, represent concern for human rights or any kind of enlightened intent", in stark contrast to Cyrus' own reputation among the people of Babylon as recorded in the Old Testament.[266]
Arab–Israeli conflict
In April 2008, the Boston-based Committee for Accuracy in Middle East Reporting in America (CAMERA) organized an e-mail campaign to encourage readers to correct perceived Israel-related biases and inconsistencies in Wikipedia.[267] Excerpts of some of the e-mails were published in the July 2008 issue of Harper's Magazine under the title of "Candid camera".[268]
CAMERA argued the excerpts were unrepresentative and that it had explicitly campaigned merely "toward encouraging people to learn about and edit the online encyclopedia for accuracy".[269] According to some defenders of CAMERA, serious misrepresentations of CAMERA's role emanated from the competing Electronic Intifada group; moreover, it is said, some other Palestinian advocacy groups have been guilty of systematic misrepresentations and manipulative behaviors but have not suffered bans of editors amongst their staff or volunteers.[270][271]
Five editors involved in the campaign were sanctioned by Wikipedia administrators.[272] Israeli diplomat David Saranga said that Wikipedia is generally fair in regard to Israel. When confronted with the fact that the entry on Israel mentioned the word "occupation" nine times, whereas the entry on the Palestinian people mentioned "terror" only once, he replied: "It means only one thing: Israelis should be more active on Wikipedia. Instead of blaming it, they should go on the site much more, and try and change it."[273]
Political commentator Haviv Rettig Gur, reviewing widespread perceptions in Israel of systemic bias in English-language Wikipedia articles, has argued that deeper structural problems create this bias: anonymous editing favors biased results, especially when those whom Gur calls "pro-Palestinian activists" organize concerted campaigns, as has putatively been done in articles dealing with Arab-Israeli issues, and current Wikipedia policies, while well-meant, have proven ineffective in handling this.[274]
On August 3, 2010, it was reported that the Yesha Council together with Israel Sheli (My Israel), a network of online pro-Israel activists committed to spreading Zionism online, were organizing people at a workshop in Jerusalem to teach them how to edit Wikipedia articles in a pro-Israeli way.[275][276][277] Around 50 people took part in the course.[277]
The project organiser, Ayelet Shaked, who has since been elected to Israel's parliament, was interviewed on Arutz Sheva Radio. She emphasized that the information has to be reliable and meet Wikipedia's rules, and cited examples such as the use of the term "occupation" in Wikipedia entries and the editing of entries that link Israel with Judea and Samaria and with Jewish history.[278]
"We don't want to change Wikipedia or turn it into a propaganda arm," commented Naftali Bennett, director of the Yesha Council. "We just want to show the other side. People think that Israelis are mean, evil people who only want to hurt Arabs all day."[279] "The idea is not to make Wikipedia rightist but for it to include our point of view," he said in another interview.[277]
A course participant explained that the course is not a "Zionist conspiracy to take over Wikipedia"; rather, it is an attempt to balance information about disputed issues presented in the online encyclopedia.
[T]he goal of this workshop was to train a number of pro-Israelis how to edit Wikipedia so that more people could present the Israeli side of things, and thus the content would be more balanced... Wikipedia is meant to be a fair and balanced source, and it is that way by having people from all across the spectrum contributing to the content.[280]
Following the course announcement, Abdul Nasser An-Najar, the head of the Palestinian Journalists Syndicate, said there were plans to set up a counter-group to ensure the Palestinian view is presented online, as the "next regional war will be [a] media war."[279]
In 2011, Wikipedia founder Jimmy Wales stated in retrospect about the course organized by Israel Sheli, "we saw absolutely no impact from that effort whatsoever. I don't think it ever—it was in the press but we never saw any impact."[281]
Editing for financial rewards
In an October 2012 Salon story, Wikipedia co-founder Jimmy Wales stated that he was against the practice of paid editing of Wikipedia, as are a number of long-time members of Wikipedia's community. Nonetheless, a number of organizations do pay employees to edit Wikipedia articles; one writer, Soraya Field Fiorio, stated that she writes commissioned Wikipedia articles for writers and musicians for $30 an hour. According to Fiorio, her clients control the article's content in the same way that they control press releases, which function as part of publicity strategies.[282] In January 2007, Rick Jelliffe claimed in a story carried by CBS[283] and IDG News Service[284][285] that Microsoft had offered him compensation in exchange for his future editorial services on OOXML. A Microsoft spokesperson, quoted by CBS, commented that "Microsoft and the writer, Rick Jelliffe, had not determined a price and no money had changed hands, but they had agreed that the company would not be allowed to review his writing before submission".
In a story covered by the BBC, Jeffrey Merkey claimed that in exchange for a donation his Wikipedia entry was edited in his favor. Jay Walsh, a spokesman for Wikipedia, flatly denied the allegations in an interview given to the Daily Telegraph.[286]
In a story covered by InformationWeek, Eric Goldman, assistant law professor at Santa Clara University in California, argued that "eventually, marketers will build scripts to edit Wikipedia pages to insert links and conduct automated attacks on Wikipedia",[287] thus putting the encyclopedia beyond the ability of its editors to provide countermeasures against the attackers, particularly because of a vicious circle in which the strain of responding to these attacks drives core contributors away, increasing the strain on those who remain.[288][nb 8]
Conflicts involving Wikipedia policy makers
In February 2008, British technology news and opinion website The Register stated that a prominent administrator of Wikipedia had edited a topic area where he had a conflict of interest to keep criticism to a bare minimum, as well as altering the Wikipedia policies regarding personal biography and conflict of interest to favour his editing.[289]
Some of the most scathing criticism of Wikipedia's claimed neutrality came in The Register, which in turn was allegedly criticized by founding members of the project. According to The Register: "In short, Wikipedia is a cult. Or at least, the inner circle is a cult. We aren't the first to make this observation. On the inside, they reinforce each other's beliefs. And if anyone on the outside questions those beliefs, they circle the wagons. They deny the facts. They attack the attacker. After our Jossi Fresco story, Fresco didn't refute our reporting. He simply accused us of 'yellow journalism'. After our Overstock.com article, Wales called us 'trash'."[290]
Charles Arthur in The Guardian said that "Wikipedia, and so many other online activities, show all the outward characteristics of a cult."[291]
In February 2015, a longstanding Wikipedia administrator was site-banned after Wikipedia's Arbitration Committee found that they had, over a period of several years, manipulated the content of Wikipedia articles to add positive content and remove negative content about the controversial Indian Institute of Planning and Management and its dean, Arindam Chaudhuri. An Indian journalist commented in Newsweek on the importance of the Wikipedia article to the institute's PR campaign and voiced the opinion that "by letting this go on for so long, Wikipedia has messed up perhaps 15,000 students' lives".[292][293]
Scientific disputes
The 2005 Nature study also gave two brief examples of challenges that Wikipedian science writers purportedly faced on Wikipedia. The first concerned the addition of a section on violence to the schizophrenia article, which, in the view of one of the article's regular editors, neuropsychologist Vaughan Bell, was little more than a "rant" about the need to lock people up; editing it stimulated him to look up the literature on the topic.[25]
The second dispute reported by Nature involved the climatologist William Connolley and related to protracted disputes between editors of climate change topics, in which Connolley was placed on parole and several opponents were banned from editing climate-related articles for six months;[25] a separate paper commented that this was more about etiquette than bias and that Connolley did "not suffer fools gladly".[294]
See also
- Bourgeois v. Peters (2004), one of the earliest court opinions to cite and quote Wikipedia
- Essjay controversy
- Fictitious entry
- Ideological bias on Wikipedia
- Wikipedia:List of hoaxes on Wikipedia
- The Truth According to Wikipedia (2008)
- Truth in Numbers? (2010)
- WikiTrust, a reputation system for Wikipedia authors and content
- Woozle effect
Further reading
- Thomas Leitch (October 2, 2014), Wikipedia U: Knowledge, Authority, and Liberal Education in the Digital Age, Johns Hopkins University Press, Wikidata Q108733210
Notes
- ^ Its reliability has received widespread media coverage and is frequently featured in popular culture.
- ^ Wikipedia's policies on original research, including unpublished synthesis of published data, disallow new analysis and interpretation not found in reliable sources.
- ^ There were also disputes on the "George Floyd", "George Floyd protests", and "Murder of George Floyd" articles over whether they should mention Floyd's prior criminal charges, whether to use the word riot (rejected because most reliable sources did not refer to the events as riots), and whether to retitle the last from Death to Killing, respectively. While death was the more neutral term, editors felt that killing was the more accurate term and neutral by definition. As for the criminal charges, those in favour cited the policy that Wikipedia is not censored, while those opposed cited the weight policy, positing that the addition would be undue because his past criminal history had no relevance to his murder.[168]
- ^ The letter was in response to a review of his book Quantum Information: An Overview, that had questioned "whether there is an audience for such encyclopedic texts, especially given the easy access to online sources of information such as the arXiv e-print server and Wikipedia."
- ^ Between the two edits, the wrong information had stayed in the Clinton article while it was edited more than 4,800 times over 20 months.
- ^ Wikipedia considers vandalism as "any addition, removal, or change of content in a deliberate attempt to compromise the integrity of Wikipedia". The Wikipedia page "Researching with Wikipedia" states: "Wikipedia's radical openness means that any given article may be, at any given moment, in a bad state: for example, it could be in the middle of a large edit or it could have been recently vandalized. While blatant vandalism is usually easily spotted and rapidly corrected, Wikipedia is certainly more subject to subtle vandalism than a typical reference work."
- ^ It added the following erroneous information to the section titled "The fans": "A small but loyal group of fans are lovingly called "The Zany Ones"—they like to wear hats made from discarded shoes and have a song about a little potato."
- ^ Wikipedia operates bots to aid in the detection and removal of vandalism, and uses nofollow and a CAPTCHA to discourage and filter additions of external links.
References
- ^ a b Randall, Eric (May 19, 2014). "How a raccoon became an aardvark". The New Yorker. Archived from the original on December 29, 2016. Retrieved November 24, 2016.
- ^ a b Kolbe, Andreas (January 16, 2017). "Happy birthday: Jimbo Wales' sweet 16 Wikipedia fails. From aardvark to Bicholim, the encylopedia [sic] of things that never were". The Register. Archived from the original on July 8, 2017. Retrieved June 4, 2017.
- ^ Seelye, Katharine Q. (December 5, 2005). "Snared in the Web of a Wikipedia Liar". The New York Times. Archived from the original on September 7, 2014. Retrieved February 23, 2017.
- ^ "Wikipedia is 20, and its reputation has never been higher". The Economist. January 9, 2021. Archived from the original on January 8, 2021. Retrieved February 25, 2021.
- ^ Cooke, Richard (February 17, 2020). "Wikipedia Is the Last Best Place on the Internet". Wired. Archived from the original on January 10, 2021. Retrieved October 13, 2020.
- ^ "Happy Birthday, Wikipedia". The Economist. January 9, 2021. Archived from the original on January 8, 2021. Retrieved February 16, 2022.
- ^ Fernanda B. Viégas, Martin Wattenberg, Kushal Dave: Studying Cooperation and Conflict between Authors with history flow Visualizations Archived January 25, 2006, at the Wayback Machine. Proceedings of the SIGCHI conference on Human factors in computing systems, 575–582, Vienna 2004, ISBN 1-58113-702-8
- ^ a b Viégas, Fernanda B.; Wattenberg, Martin; Dave, Kushal (2003). "History flow: results". research.ibm.com. IBM Collaborative User Experience Research Group. Archived from the original on November 2, 2006. Retrieved July 7, 2016.
- ^ a b Sage, Adam (June 9, 2010). "Ségolène Royal and Wikipedia duped by tale of anti-slavery activist". The Times. London. Archived from the original on July 28, 2014. Retrieved June 17, 2011.
- ^ a b Seigenthaler, John (November 29, 2005). "A false Wikipedia "biography"". USA Today. Archived from the original on January 6, 2012. Retrieved September 10, 2017.
- ^ Torres, Nicole (June 2, 2016). "Why Do So Few Women Edit Wikipedia?". Harvard Business Review. ISSN 0017-8012. Archived from the original on June 17, 2020. Retrieved June 26, 2020.
- ^ Cassano, Jay (January 29, 2015). "Black History Matters, So Why Is Wikipedia Missing So Much Of It?". Fast Company. Archived from the original on May 10, 2015. Retrieved April 13, 2015.
- ^ Cooke, Richard (January 2, 2020). "Wikipedia Is the Last Best Place on the Internet". Wired. ISSN 1059-1028. Archived from the original on January 10, 2021. Retrieved January 2, 2020.
- ^ Frick, Walter (December 3, 2014). "Wikipedia Is More Biased Than Britannica, but Don't Blame the Crowd". Harvard Business Review. ISSN 0017-8012. Archived from the original on June 26, 2020. Retrieved June 20, 2020.
- ^ Greenstein, Shane; Zhu, Feng (January 1, 2012). "Is Wikipedia Biased?". American Economic Review. 103. Harvard Business School. Archived from the original on December 30, 2019. Retrieved June 26, 2020.
- ^ Leonard, Andrew (May 17, 2013). "Revenge, ego and the corruption of Wikipedia". Salon. Archived from the original on May 31, 2016. Retrieved June 4, 2016.
- ^ Pinsker, Joe (August 11, 2015). "The Covert World of People Trying to Edit Wikipedia—for Pay". The Atlantic. Archived from the original on June 1, 2016. Retrieved June 4, 2016.
- ^ a b Petiška, Eduard; Moldan, Bedřich (December 9, 2019). "Indicator of quality for environmental articles on Wikipedia at the higher education level". Journal of Information Science. 47 (2): 269–280. doi:10.1177/0165551519888607. ISSN 0165-5515. S2CID 214401940.
- ^ Harrison, Stephen (March 19, 2020). "The Coronavirus Is Stress-Testing Wikipedia's Systems—and Editors". Slate Magazine. Archived from the original on April 18, 2020. Retrieved July 10, 2020.
- ^ Wood, A; Struthers, K (2010). "Pathology education, Wikipedia and the Net generation". Medical Teacher. 32 (7): 618–620. doi:10.3109/0142159X.2010.497719. PMID 20653388.
We have identified Wikipedia as an informative and accurate source for Pathology education and believe that Wikipedia is potentially an important learning tool for the 'Net Generation'.
- ^ a b S. Robert Lichter, Ph.D.: Are chemicals killing us? Statistical Assessment Service, May 21, 2009
- ^ Leithner, A; Maurer-Ertl, W; Glehr, M; Friesenbichler, J; Leithner, K; Windhager, R (July–August 2010). "Wikipedia and osteosarcoma: a trustworthy patients' information?". Journal of the American Medical Informatics Association. 17 (4): 373–4. doi:10.1136/jamia.2010.004507. PMC 2995655. PMID 20595302.
- ^ a b Clauson KA; Polen HH; Kamel Boulos MN; Dzenowagis JH (2008). "Scope, completeness, and accuracy of drug information in Wikipedia" (PDF). Annals of Pharmacotherapy. 42 (12): 1814–21. doi:10.1345/aph.1L474. PMID 19017825. S2CID 2072846. Archived from the original (PDF) on March 25, 2009. Retrieved September 25, 2009.
- Anne Harding (November 25, 2008). "Wikipedia often omits important drug information: study". Reuters. Archived from the original on October 5, 2020. Retrieved July 1, 2017.
- ^ a b c Reavley, N. J.; MacKinnon, A. J.; Morgan, A. J.; Alvarez-Jimenez, M.; Hetrick, S. E.; Killackey, E.; Nelson, B.; Purcell, R.; Yap, M. B. H.; Jorm, A. F. (2011). "Quality of information sources about mental disorders: A comparison of Wikipedia with centrally controlled web and printed sources". Psychological Medicine. 42 (8): 1753–1762. doi:10.1017/S003329171100287X. hdl:11343/59260. PMID 22166182. S2CID 13329595.
- ^ a b c d e Giles, J. (2005). "Internet encyclopaedias go head to head: Jimmy Wales' Wikipedia comes close to Britannica in terms of the accuracy of its science entries". Nature. 438 (7070): 900–1. Bibcode:2005Natur.438..900G. doi:10.1038/438900a. PMID 16355180. The study (which was not in itself peer-reviewed) was cited in many news articles such as this: "Wikipedia survives research test". BBC News. BBC. December 15, 2005. Archived from the original on August 7, 2012. Retrieved July 18, 2006.
- ^ Nature (March 30, 2006). "Nature's responses to Encyclopaedia Britannica". Nature.com. Archived from the original on November 5, 2006. Retrieved March 19, 2012.
- ^ Fatally Flawed: Refuting the recent study on encyclopedic accuracy by the journal Nature. Archived July 9, 2016, at the Wayback Machine Encyclopædia Britannica, March 2006
- ^ Rajagopalan, M. S.; Khanna, V. K.; Leiter, Y.; Stott, M.; Showalter, T. N.; Dicker, A. P.; Lawrence, Y. R. (2011). "Patient-Oriented Cancer Information on the Internet: A Comparison of Wikipedia and a Professionally Maintained Database". Journal of Oncology Practice. 7 (5): 319–323. doi:10.1200/JOP.2010.000209. PMC 3170066. PMID 22211130.
- ^ Azer, S. A. (2014). "Evaluation of gastroenterology and hepatology articles on Wikipedia". European Journal of Gastroenterology & Hepatology. 26 (2): 155–63. doi:10.1097/MEG.0000000000000003. PMID 24276492. S2CID 7760287.
- ^ Barnett, David (February 17, 2018). "Can we trust Wikipedia? 1.4 billion people can't be wrong". The Independent. Archived from the original on February 11, 2019. Retrieved July 15, 2021.
- ^ Mak, Aaron (May 28, 2019). "Inside the Brutal, Petty War Over Donald Trump's Wikipedia Page". Slate. Archived from the original on June 15, 2021. Retrieved July 15, 2020.
- ^ Anthony, Denise; Smith, Sean W.; Williamson, Timothy (July 20, 2009). "Reputation and Reliability in Collective Goods". Rationality and Society. 21 (3): 283–306. CiteSeerX 10.1.1.299.9401. doi:10.1177/1043463109336804. S2CID 146210753.
- ^ Timmer, John (October 18, 2007). "Anonymous "good samaritans" produce Wikipedia's best content, says study". Ars Technica. Archived from the original on October 26, 2007. Retrieved October 27, 2007.
Good samaritans with less than 100 edits made higher-quality contributions than those with registered accounts and equal amounts of content. In fact, anonymous contributors with a single edit had the highest quality of any group. But quality steadily declined, and more-frequent anonymous contributors were anything but Samaritans; their contributions generally didn't survive editing... The authors also recognize that contributions in the form of stubs on obscure topics might survive unaltered indefinitely, inflating the importance of single contributions...Objective ratings of quality are difficult, and it's hard to fault the authors for attempting to find an easily-measured proxy for it. In the absence of independent correlation, however, it's not clear that the measurement used actually works as a proxy. Combined with the concerns regarding anonymous contributor identity, there are enough problems with this study that the original question should probably be considered unanswered, regardless of how intuitively satisfying these results are.
- ^ "WikiStats by S23 – List of Wikipedias". s23Wiki. Archived from the original on July 18, 2011.
- ^ a b c d Gertner, Jon (July 18, 2023). "Wikipedia's Moment of Truth". The New York Times. ISSN 0362-4331. Archived from the original on July 20, 2023. Retrieved May 23, 2024.
- ^ Glaser, April (August 14, 2018). "YouTube Is Adding Fact-Check Links for Videos on Topics That Inspire Conspiracy Theories". Slate Magazine. Archived from the original on January 26, 2021. Retrieved January 23, 2021.
- ^ Flynn, Kerry (October 5, 2017). "Facebook outsources its fake news problem to Wikipedia—and an army of human moderators". Mashable. Archived from the original on December 4, 2020. Retrieved January 23, 2021.
- ^ Iannucci, Rebecca (July 6, 2017). "What can fact-checkers learn from Wikipedia? We asked the boss of its nonprofit owner". Poynter Institute. Archived from the original on January 12, 2021. Retrieved January 23, 2021.
- ^ Gertner 2023: "While estimates of its influence can vary, Wikipedia is probably the most important single source in the training of A.I. models. ... In fact, no one I spoke with in the tech community seemed to know if it would even be possible to build a good A.I. model without Wikipedia."
- ^ Stvilia, Besiki; Twidale, Michael B.; Smith, Linda C.; Gasser, Les (April 2008). "Information Quality Work Organization in Wikipedia" (PDF). Journal of the American Society for Information Science and Technology. 59 (6): 983–1001. CiteSeerX 10.1.1.163.5109. doi:10.1002/asi.20813. S2CID 10156153. Archived from the original (PDF) on August 20, 2007.
- ^ Fearnow, Benjamin (January 31, 2014). "Report: Wikipedia The Top Source Of Health Care Info For Doctors, Patients". CBS. Archived from the original on January 31, 2014. Retrieved February 1, 2014.
- ^ Mike Barnes (October 24, 2005). "Can you trust Wikipedia?". The Guardian. ISSN 0261-3077. Wikidata Q110613135.
- ^ "Supplementary information to accompany Nature news article 'Internet encyclopedias go head to head'". Nature. December 22, 2005. Archived from the original on September 28, 2007.
- ^ a b "Fatally Flawed – Refuting the recent study on encyclopedic accuracy by the journal Nature" (PDF). Encyclopædia Britannica, Inc. March 2006. Archived (PDF) from the original on July 9, 2016. Retrieved June 30, 2009.
- ^ "Britannica attacks". Nature. 440 (7084): 582. March 30, 2006. Bibcode:2006Natur.440R.582.. doi:10.1038/440582b. PMID 16572128.
- ^ "Wikipedia study 'fatally flawed'". BBC News. March 24, 2006. Archived from the original on July 14, 2006. Retrieved May 31, 2011.
- ^ "Encyclopædia Britannica and Nature: a response" (PDF). Nature. March 23, 2006. Archived (PDF) from the original on October 31, 2007. Retrieved October 31, 2007.
- ^ "Encyclopædia Britannica and Nature: a response" (PDF). Nature Press release. March 23, 2006. Archived (PDF) from the original on October 31, 2007. Retrieved May 31, 2011.
- ^ "Seth's Blog » Blog Archive » One-Sided Critiques of the Day". Blog.sethroberts.net. June 2, 2007. Archived from the original on April 28, 2014. Retrieved April 29, 2014.
- ^ "Seven years after Nature, pilot study compares Wikipedia favorably to other encyclopedias in three languages — Wikimedia blog". Blog.wikimedia.org. August 2, 2012. Archived from the original on June 15, 2014. Retrieved April 29, 2014.
- ^ See author-acknowledged comments in response to the citation of the Nature study, at PLoS One, 2014, Citation of fundamentally flawed Nature quality "study", in response to T. Yasseri et al. (2012), Dynamics of Conflicts in Wikipedia, published June 20, 2012, DOI 10.1371/journal.pone.0038869. Retrieved July 21, 2014. Archived January 16, 2016, at the Wayback Machine.
- ^ a b Jarry, Jonathan (September 6, 2024). "Can You Trust Dr. Wikipedia?". Office for Science and Society. Retrieved September 7, 2024.
- ^ Rosenzweig, Roy (June 2006). "Can History be Open Source? Wikipedia and the Future of the Past". The Journal of American History. 93 (1): 117–146. doi:10.2307/4486062. JSTOR 4486062. Archived from the original on April 25, 2010. Retrieved August 11, 2006. (Center for History and New Media)
- ^ "Survey of Wikipedia accuracy and completeness". California State University at Dominguez Hills. May 2006. Archived from the original on July 21, 2014. Retrieved December 10, 2012.
- ^ "Survey of Wikipedia accuracy and completeness". Larry Press, Professor of Computer Information Systems, California State University. 2006. Archived from the original on September 28, 2011. Retrieved October 31, 2007.
- ^ Michael Kurzidim: Wissenswettstreit. Die kostenlose Wikipedia tritt gegen die Marktführer Encarta und Brockhaus an, in: c't 21/2004, October 4, 2004, S. 132–139.
- ^ Dorothee Wiegand: "Entdeckungsreise. Digitale Enzyklopädien erklären die Welt." c't 6/2007, March 5, 2007, pp. 136–145. Original quote: "Wir haben in den Texten der freien Enzyklopädie nicht mehr Fehler gefunden als in denen der kommerziellen Konkurrenz" [We found no more errors in the texts of the free encyclopedia than in those of its commercial competitors]
- ^ Bragues, George (April 2007). "Wiki-Philosophizing in a Marketplace of Ideas: Evaluating Wikipedia's Entries on Seven Great Minds". SSRN 978177.
- ^ a b c PC Pro magazine, August 2007, p. 136, "Wikipedia Uncovered".
- ^ "PC Authority – 'Wikipedia Uncovered'". Archived from the original on February 26, 2009. Retrieved December 31, 2008.
- ^ Schönert, Ulf; Güntheroth, Horst (December 2007). "Wikipedia: Wissen für alle" [Wikipedia: Knowledge for Everyone]. Stern (in German). Vol. 2007, no. 50. pp. 30–44. Archived from the original on January 11, 2023. Retrieved January 11, 2023.
Einige Wikipedia-Artikel sind für Laien schlicht zu kompliziert, viele zu weitschweifig, urteilten die Tester. [Some Wikipedia articles are simply too complicated for laypersons, many too long-winded, judged the testers.]
- ^ "Wikipedia schlägt Brockhaus" [Wikipedia beats Brockhaus]. Stern (in German). Gruner + Jahr. December 5, 2007. Archived from the original on August 2, 2009. Retrieved September 6, 2016.
- ^ K.C. Jones: German Wikipedia Outranks Traditional Encyclopedia's Online Version Archived December 12, 2007, at the Wayback Machine. InformationWeek, December 7, 2007
- ^ Williams, Simon (April 21, 2008). "Wikipedia vs Encyclopaedia: A question of trust?". Techradar.com. Archived from the original on July 5, 2008. Retrieved September 6, 2016.
- ^ Rector, Lucy Holman (2008). "Comparison of Wikipedia and other encyclopedias for accuracy, breadth, and depth in historical articles". Reference Services Review. 36 (1): 7–22. doi:10.1108/00907320810851998.
- ^ Luyt, Brendan; Tan, Daniel (April 1, 2010). "Improving Wikipedia's credibility: References and citations in a sample of history articles". Journal of the American Society for Information Science and Technology. 61 (4): 715–722. doi:10.1002/asi.21304. hdl:10356/95416. ISSN 1532-2890.
- ^ Brown, Adam R. (April 8, 2011). "Wikipedia as a Data Source for Political Scientists: Accuracy and Completeness of Coverage". PS: Political Science & Politics. 44 (2): 339–343. doi:10.1017/S1049096511000199. S2CID 154963796.
- ^ Greenstein, Shane; Zhu, Feng (June 2012). "Collective Intelligence and Neutral Point of View: The Case of Wikipedia". NBER Working Paper No. 18167. doi:10.3386/w18167.
- ^ Hwang, Thomas J.; Bourgeois, Florence T.; Seeger, John D. (June 26, 2014). "Drug Safety in the Digital Age". New England Journal of Medicine. 370 (26): 2460–2462. doi:10.1056/NEJMp1401767. PMID 24963564.
- ^ Phillips, Jennifer; Lam, Connie; Palmisano, Lisa (July 1, 2014). "Analysis of the accuracy and readability of herbal supplement information on Wikipedia". Journal of the American Pharmacists Association. 54 (4): 406–14. doi:10.1331/JAPhA.2014.13181. PMID 25063262.
- ^ Kräenbring, Jona; Monzon Penza, Tika; Gutmann, Joanna; Muehlich, Susanne; Zolk, Oliver; Wojnowski, Leszek; Maas, Renke; Engelhardt, Stefan; Sarikas, Antonio; Lovis, Christian (September 24, 2014). "Accuracy and Completeness of Drug Information in Wikipedia: A Comparison with Standard Textbooks of Pharmacology". PLOS ONE. 9 (9) e106930. Bibcode:2014PLoSO...9j6930K. doi:10.1371/journal.pone.0106930. PMC 4174509. PMID 25250889.
- ^ Self description taken from blog biography, "Phil Bradley – biography". Phil Bradley. 2007. Archived from the original on November 3, 2007. Retrieved October 31, 2007.
- ^ a b c d Waldman, Simon (October 26, 2004). "Who knows?". The Guardian. London. Archived from the original on August 25, 2014. Retrieved February 3, 2011.
- ^ "About Wikipedia". Trent University Library. Trent University. April 30, 2007. Archived from the original on December 4, 2005. Retrieved April 13, 2010.
- ^ "I want my Wikipedia!". Library Journal. April 2006. Archived from the original on December 4, 2015. Retrieved October 23, 2015.
- ^ Gorman, Michael. "Jabberwiki: The Educational Response, Part II". Encyclopædia Britannica Blog. Encyclopædia Britannica, Inc. Archived from the original on April 24, 2017. Retrieved April 23, 2017.
- ^ a b c "Wikipedia and Britannica: The kid's all right". Searcher. Information Today, Inc. March 2006. Archived from the original on November 4, 2007. Retrieved October 31, 2007.
- ^ Peter Binkley (2006). "Wikipedia Grows Up". Feliciter (2): 59–61. Wikidata Q66411582.
- ^ Feng Shi; Misha Teplitskiy; Eamon Duede; James A. Evans (March 4, 2019). "The wisdom of polarized crowds". Nature Human Behaviour. 3 (4): 329–336. arXiv:1712.06414. doi:10.1038/S41562-019-0541-6. ISSN 2397-3374. PMID 30971793. Wikidata Q47248083. They continued, "To explore whether political diversity has an upper bound beyond which polarization hampers performance, we re-estimated the regression models of quality with a quadratic polarization term. Estimates suggest that quality may eventually decline with increasing polarization, but the optimal level of polarization is above that realized by 95% of the teams in this study. For the 5% most polarized teams, there is no statistically significant pattern between polarization and quality. In other words, we do not find evidence that very high levels of political polarization hampers Wikipedia performance." (p. 11)
- ^ a b Chen, Lysa (March 28, 2007). "Several colleges push to ban Wikipedia as resource". Duke Chronicle. Archived from the original on April 13, 2009.
- ^ Youngwood, Susan (April 1, 2007). "Wikipedia: What do they know; when do they know it, and when can we trust it?". Rutland Herald. Archived from the original on November 8, 2016. Retrieved May 16, 2019.
Perhaps the most important thing to understand about Wikipedia—both its genius and its Achilles heel—is that anyone can create or modify an entry. Anyone means your 10-year-old neighbor or a Nobel Prize winner—or an editor like me, who is itching to correct a grammar error in that Wikipedia entry that I just quoted. Entries can be edited by numerous people and be in constant flux. What you read now might change in five minutes. Five seconds, even.
- ^ Riskin, Adrian (October 21, 2013). "Elementary Mathematics on Wikipedia". Archived from the original on October 21, 2013. Retrieved October 24, 2013.
- ^ "A Stand Against Wikipedia", Inside Higher Ed (January 26, 2007). Archived August 10, 2012, at the Wayback Machine. Retrieved January 27, 2007.
- ^ McHenry, Robert (November 15, 2004). "The Faith-Based Encyclopedia". Tech Central Station. Archived from the original on June 13, 2006. Retrieved October 12, 2008.
- ^ Cohen, Noam (February 27, 2007). "Wikipedia on an academic hit list". NY Times News Service. Archived from the original on March 5, 2007. Retrieved April 16, 2007.
Middlebury professor Thomas Beyer, of the Russian department, said: 'I guess I am not terribly impressed by anyone citing an encyclopedia as a reference point, but I am not against using it as a starting point.'
- ^ Polk, Tracy; Johnston, Melissa P.; Evers, Stephanie (2015). "Wikipedia Use in Research: Perceptions in Secondary Schools". TechTrends: Linking Research & Practice to Improve Learning. 59 (3): 92–102. doi:10.1007/s11528-015-0858-6. S2CID 62595811.
- ^ Chesney, Thomas (May 16, 2006). "An empirical examination of Wikipedia's credibility". First Monday. doi:10.5210/fm.v11i11.1413. Archived from the original on October 2, 2025. Retrieved January 20, 2010.
- ^ Study cited in "Experts rate Wikipedia's accuracy higher than non-experts". Ars Technica. November 27, 2006. Archived from the original on November 5, 2007. Retrieved October 31, 2007.
- ^ The study explains that "In the survey, all respondents under Condition 1 were asked if there were any mistakes in the article they had been asked to read. Only five reported seeing mistakes and one of those five reported spelling mistakes rather than factual errors. This suggests that 13 percent of Wikipedia's articles have errors." Thus 80% of the 13% related to factual errors and 20% of the 13% related to spelling errors. Chesney, Thomas (May 16, 2006). "An empirical examination of Wikipedia's credibility". First Monday. doi:10.5210/fm.v11i11.1413. Archived from the original on April 11, 2010. Retrieved January 20, 2010.
- ^ Bailey, Matt (October 2, 2007). "Using Wikipedia". Lawrence McKinley Gould Library, Carleton College. Archived from the original on November 3, 2007. Retrieved October 31, 2007.
- ^ "We Can't Ignore the Influence of Digital Technologies". The Chronicle of Higher Education. Chronicle of Higher Education. March 23, 2007. Archived from the original on January 16, 2016. Retrieved December 15, 2015.
- ^ What is Happening in the Educational System of the Contemporary World and How "The State Program on Reforms of the Higher Education System in the Republic of Azerbaijan for the Period of 2008–2012" May Best be Carried Out (in Azeri). Khazar University Press, 2008
- ^ Burnsed, Brian (June 20, 2011). "Wikipedia Gradually Accepted in College Classrooms". U.S. News & World Report. Archived from the original on June 12, 2018. Retrieved June 2, 2018.
- ^ Linden, Hartmut (August 2, 2002). "A White Collar Protein Senses Blue Light". Science. 297 (5582): 777–778. doi:10.1126/science.1075485. PMID 12161636. S2CID 41282143. (subscription access only)
- ^ George, Yolanda S. & Malcolm, Shirley S. "Perspectives from AAAS" (PDF). American Association for the Advancement of Science. Archived from the original (PDF) on October 29, 2007. Retrieved October 27, 2007.
- ^ a b Steinsson, Sverrir (March 9, 2023). "Rule Ambiguity, Institutional Clashes, and Population Loss: How Wikipedia Became the Last Good Place on the Internet". American Political Science Review. 118. Cambridge University Press: 235–251. doi:10.1017/s0003055423000138. ISSN 0003-0554. S2CID 257434844.
- ^ a b ShahBano Ijaz, Syeda (May 29, 2023). "How Conflicts and Population Loss Led to the Rise of English Wikipedia's Credibility". Political Science Now. Archived from the original on June 5, 2023. Retrieved June 20, 2023.
- ^ Wineburg, Sam; Ziv, Nadav (October 17, 2024). "Go ahead and use Wikipedia for research - The Boston Globe". The Boston Globe. Archived from the original on November 6, 2024. Retrieved October 18, 2024.
- ^ a b c d e Seife, Charles (2014). Virtual Unreality: Just Because the Internet Told You, how Do You Know It's True?. Penguin Publishing Group. pp. 26–29, 32–34, 201. ISBN 978-0-670-02608-1. Archived from the original on March 31, 2019. Retrieved June 4, 2017.
- ^ Garner, Dwight (July 1, 2014). "Online, the Lying Is Easy. In 'Virtual Unreality,' Charles Seife Unfriends Gullibility". The New York Times. Archived from the original on September 1, 2017. Retrieved June 4, 2017.
- ^ McSmith, Andy (November 30, 2012). "Leveson's Wikipedia moment: how internet 'research' on The Independent's history left him red-faced". The Independent. Archived from the original on December 4, 2012. Retrieved March 25, 2014.
- ^ Allen, Nick (December 5, 2012). "Wikipedia, the 25-year–old student and the prank that fooled Leveson". The Telegraph. Archived from the original on January 31, 2014. Retrieved March 25, 2014.
- ^ a b McCauley, Ciaran (February 8, 2017). "Wikipedia hoaxes: From Breakdancing to Bilcholim". BBC. Archived from the original on May 20, 2017. Retrieved June 4, 2017.
- ^ Jackson, Jasper (October 3, 2016). "Wikipedia bans Daily Mail as 'unreliable' source". The Guardian. Archived from the original on January 18, 2019. Retrieved June 4, 2017.
- ^ Breslow, Samuel (August 11, 2022). "How a False Claim About Wikipedia Sparked a Right-Wing Media Frenzy". Slate Magazine. Archived from the original on January 22, 2023. Retrieved September 1, 2022.
- ^ Heilman, JM; Kemmann, E; Bonert, M; Chatterjee, A; Ragar, B; Beards, GM; Iberri, DJ; Harvey, M; Thomas, B; Stomp, W; Martone, MF; Lodge, DJ; Vondracek, A; de Wolff, JF; Liber, C; Grover, SC; Vickers, TJ; Meskó, B; Laurent, MR (January 31, 2011). "Wikipedia: a key tool for global public health promotion". Journal of Medical Internet Research. 13 (1): e14. doi:10.2196/jmir.1589. PMC 3221335. PMID 21282098.
- ^ Rajagopalan; et al. (2010). "Accuracy of cancer information on the Internet: A comparison of a Wiki with a professionally maintained database". Journal of Clinical Oncology. 28 (15_suppl): 6058. doi:10.1200/jco.2010.28.15_suppl.6058. Archived from the original on January 9, 2014. Retrieved June 5, 2010.
- ^ a b c Lavsa, S. M.; Corman, S. L.; Culley, C. M.; Pummer, T. L. (2011). "Reliability of Wikipedia as a medication information source for pharmacy students". Currents in Pharmacy Teaching and Learning. 3 (2): 154–158. doi:10.1016/j.cptl.2011.01.007.
- ^ Volsky, Peter G.; Baldassari, Cristina M.; Mushti, Sirisha; Derkay, Craig S. (September 2012). "Quality of Internet information in pediatric otolaryngology: A comparison of three most referenced websites". International Journal of Pediatric Otorhinolaryngology. 76 (9): 1312–1316. doi:10.1016/j.ijporl.2012.05.026. PMID 22770592.
- ^ "Trust your doctor, not Wikipedia, say scientists". BBC News. May 27, 2014. Archived from the original on May 27, 2014. Retrieved May 27, 2014.
- ^ Hasty, RT; Garbalosa, RC; Barbato, VA; Valdes, PJ Jr; Powers, DW; Hernandez, E; John, JS; Suciu, G; Qureshi, F; Popa-Radu, M; San Jose, S; Drexler, N; Patankar, R; Paz, JR; King, CW; Gerber, HN; Valladares, MG; Somji, AA (May 1, 2014). "Wikipedia vs Peer-Reviewed Medical Literature for Information About the 10 Most Costly Medical Conditions". The Journal of the American Osteopathic Association. 114 (5): 368–373. doi:10.7556/jaoa.2014.035. PMID 24778001.
- ^ Jonathan Leo; Jeffrey R Lacasse (October 1, 2014). "Wikipedia vs peer-reviewed medical literature for information about the 10 most costly medical conditions-II". Journal of Osteopathic Medicine. 114 (10): 761–764. doi:10.7556/JAOA.2014.147. ISSN 0098-6151. PMID 25288708. Wikidata Q56888119.
- ^ Matheson, David; Matheson-Monnet, Catherine (2017). "Wikipedia as Informal Self-Education for Clinical Decision-Making in Medical Practice". Open Medicine Journal. 4: 15–25. doi:10.2174/1874220301704010015. hdl:2436/620739.
- ^ Yacob, Michael; Lotfi, Shamim; Tang, Shannon; Jetty, Prasad (2020). "Wikipedia in Vascular Surgery Medical Education: Comparative Study". JMIR Medical Education. 6 (1) e18076. doi:10.2196/18076. PMC 7334757. PMID 32417754.
- ^ Kräenbring, J.; Monzon Penza, T.; Gutmann, J.; Muehlich, S.; Zolk, O.; Wojnowski, L.; Maas, R.; Engelhardt, S.; Sarikas, A. (2014). "Accuracy and Completeness of Drug Information in Wikipedia: A Comparison with Standard Textbooks of Pharmacology". PLOS ONE. 9 (9) e106930. Bibcode:2014PLoSO...9j6930K. doi:10.1371/journal.pone.0106930. PMC 4174509. PMID 25250889.
- ^ Banchik, L. H.; Gray, B. (2024). "What happened to my Index Medicus?". Nutrition in Clinical Practice. 39 (4): 743–750. doi:10.1002/ncp.11173. PMID 38864650.
- ^ Volokh, Eugene (March 17, 2017). "When should courts rely on Wikipedia?". Reason. Archived from the original on August 21, 2021. Retrieved August 21, 2021.
- ^ Texas Supreme Court. "D Magazine Partners v. Rosenthal" (PDF). Archived (PDF) from the original on June 29, 2021. Retrieved August 21, 2021.
- ^ Sinham, B. Appeal (civil) 2321 of 2007 Archived April 10, 2009, at the Wayback Machine. Supreme Court of India.
- ^ a b McHenry, Robert (November 15, 2004). "The Faith-Based Encyclopedia". Tech Central Station. Archived from the original on June 13, 2006. Retrieved October 31, 2007.
- ^ "The Wall Street Journal Online". September 12, 2006. Archived from the original on August 9, 2017. Retrieved September 13, 2006.
- ^ Mann, Selena (January 14, 2011). "New tool used to evaluate Wikipedia". IT World Canada.
- ^ Metz, Cade (January 26, 2009). "Google and the Great Wikipedia Feedback Loop" Archived August 10, 2017, at the Wayback Machine. The Register.
- ^ Akbar A (November 17, 2006). "Baron Cohen comes out of character to defend Borat". The Independent. Retrieved February 1, 2026.
- ^ "The Wall Street wizards find gold in these ills". The Guardian. November 17, 2006. Retrieved April 10, 2007.
- ^ Munroe, Randall. "Citogenesis". xkcd. Archived from the original on November 18, 2011.
- ^ "Citogenesis – Neologisms". Rice University. Archived from the original on August 14, 2014. Retrieved June 7, 2014.
- ^ Lemire, Daniel (January 27, 2012). "Citogenesis in science and the importance of real problems". Archived from the original on May 28, 2014. Retrieved June 7, 2014.
- ^ Rosenzweig, Roy (2011). Clio Wired: The Future of the Past in the Digital Age. New York: Columbia University Press. p. 71. ISBN 978-0-231-15085-9. Archived from the original on March 31, 2019. Retrieved September 25, 2016.
- ^ Orlowski, Andrew (December 12, 2005). "Who's responsible for Wikipedia?". The Register. Archived from the original on February 6, 2009. Retrieved June 30, 2009.
"The public has a firm idea of what an 'encyclopedia' is, and it's a place where information can generally be trusted, or at least slightly more trusted than what a labyrinthine, mysterious bureaucracy can agree upon, and surely more trustworthy than a piece of spontaneous graffiti—and Wikipedia is a king-sized cocktail of the two."
- ^ a b c d Thompson, Bill (December 16, 2005). "What is it with Wikipedia?". BBC. Archived from the original on August 16, 2007. Retrieved October 31, 2007.
- ^ Fowler, Simon. Guide to Military History on the Internet. UK: Pen & Sword. ISBN 978-1-84415-606-1. p. 7.
- ^ Fowler, Simon. Guide to Military History on the Internet. UK: Pen & Sword. ISBN 978-1-84415-606-1. p. 201.
- ^ "Cyber-nationalism | The brave new world of e-hatred". The Economist. July 24, 2008. Archived from the original on December 1, 2009. Retrieved April 13, 2010.
- ^ Wikipedia: "A Work in Progress" Archived April 21, 2012, at the Wayback Machine, BusinessWeek (December 14, 2005). Retrieved January 29, 2007.
- ^ Pausch, Randy (April 8, 2008). The Last Lecture. Hachette Books. p. PT42. ISBN 978-1-4013-9551-3. Archived from the original on May 5, 2016. Retrieved September 25, 2016.
- ^ a b Viégas, Fernanda B.; Wattenberg, Martin; Dave, Kushal (April 29, 2004). "Studying Cooperation and Conflict between Authors with history flow Visualizations" (PDF). CHI 2004, Vol. 6 No. 1. Archived (PDF) from the original on January 25, 2006. Retrieved October 31, 2007.
- ^ Magnus, P.D. Early response to false claims in Wikipedia Archived December 3, 2010, at the Wayback Machine. First Monday Archived July 31, 2009, at the Wayback Machine, 13 (9): September 1, 2008
- ^ a b Reid Priedhorsky, Jilin Chen, Shyong (Tony) K. Lam, Katherine Panciera, Loren Terveen, John Riedl, "Creating, destroying, and restoring value in wikipedia", Proc. GROUP 2007.
- ^ Vu-Quoc, L. Configuration integral Archived April 28, 2012, at the Wayback Machine, VQWiki, 2008.
- ^ Blakely, Rhys (August 15, 2007). "Exposed: guess who has been polishing their Wikipedia entries?". The Times of London. Archived from the original on June 12, 2011.
- ^ Fildes, Jonathan (August 15, 2007). "Wikipedia 'shows CIA page edits'". BBC News. Archived from the original on January 11, 2009. Retrieved March 14, 2021.
- ^ Verkaik, Robert (August 18, 2007). "Wikipedia and the art of censorship". The Independent. London. Archived from the original on January 9, 2009. Retrieved October 27, 2007.
- ^ Kamm, Oliver (August 16, 2007). "Wisdom? More like dumbness of the crowds". The Times. London. Archived from the original on May 9, 2009. Retrieved March 14, 2021.
- ^ Metz, Cade (December 18, 2007). "Truth, anonymity and the Wikipedia Way: Why it's broke and how it can be fixed". The Register. Archived from the original on August 10, 2017. Retrieved March 14, 2021.
- ^ a b c Cohen, Martin (August 27, 2008). "Encyclopaedia Idiotica". Times Higher Education (August 28, 2008): 26. Archived from the original on September 6, 2011. Retrieved May 31, 2011.
- ^ Stephen Colbert, The Colbert Report, episode 3109, August 21, 2007.
- ^ Brophy-Warren, Jamin (June 17, 2007). "Oh, that John Locke". The Wall Street Journal (June 16, 2007): P3. Archived from the original on September 4, 2017. Retrieved August 8, 2017.
- ^ Hendren, Johnny "DocEvil" (June 5, 2007). "The Art of Wikigroaning". Something Awful. Archived from the original on June 16, 2007. Retrieved June 17, 2007.
- ^ Brown, Andrew (June 14, 2007). "No amount of collaboration will make the sun orbit the Earth". The Guardian (June 14, 2007). London. Archived from the original on June 23, 2007. Retrieved March 27, 2010.
- ^ Tossell, Ivor (June 15, 2007). "Duality of Wikipedia". The Globe and Mail. Archived from the original on December 21, 2012. Retrieved October 4, 2012.
- ^ Sanger, Larry (December 31, 2004). "Why Wikipedia Must Jettison Its Anti-Elitism". Kuro5hin. Archived from the original on January 4, 2006. Retrieved October 31, 2007.
- ^ Sanger, Larry (May 14, 2020). "Wikipedia Is Badly Biased". larrysanger.org. Archived from the original on November 23, 2021. Retrieved March 8, 2025.
- ^ Baker, Gerard (May 27, 2020). "Big tech is blatantly biased against Trump". The Times and The Sunday Times. Retrieved March 8, 2025.
- ^ Barratt, Charlie (June 25, 2008). "The WTF World of Wikipedia". Future Publishing. pp. 1–5. Archived from the original on July 21, 2008. Retrieved February 20, 2009.
- ^ "Wikipedia:Replies to common objections", Wikipedia, 22:53 April 13, 2005.
- ^ Callahan, Ewa S.; Herring, Susan C. (October 2011). "Cultural bias in Wikipedia content on famous persons". Journal of the American Society for Information Science and Technology. 62 (10): 1899–1915. doi:10.1002/asi.21577. S2CID 14767483.
- ^ Kirby, J.P. (October 20, 2007). "The Problem with Wikipedia". J.P.'s Random Ramblings [blog]. Archived from the original on August 9, 2011.
- ^ Corinne Purtill; Zoë Schlanger (October 2, 2018). "Wikipedia had rejected Nobel Prize winner Donna Strickland because she wasn't famous enough". Quartz. Archived from the original on October 25, 2018. Retrieved November 20, 2018.
- ^ Resnick, Brian (October 3, 2018). "The 2018 Nobel Prize reminds us that women scientists too often go unrecognized". Vox. Archived from the original on October 25, 2018. Retrieved October 3, 2018.
- ^ Annalisa Merelli (August 18, 2018). "Seeking Disambiguation: Running for office is hard when you have a porn star's name. This makes it worse". Quartz. Archived from the original on November 21, 2018. Retrieved November 20, 2018.
- ^ Baker, Nicholson (March 20, 2008). "The Charms of Wikipedia". The New York Review of Books. 55 (4). Archived from the original on March 3, 2008. Retrieved August 30, 2015.
- ^ Noah, Timothy (February 24, 2007). "Evicted from Wikipedia". Slate. Archived from the original on June 21, 2009. Retrieved March 31, 2010.
- ^ Samoilenko, Anna; Yasseri, Taha (January 22, 2014). "The distorted mirror of Wikipedia: a quantitative analysis of Wikipedia coverage of academics". EPJ Data Science. 3 (1) 1. arXiv:1310.8508. doi:10.1140/epjds20. S2CID 4971771.
- ^ Steinsson, Sverrir. "Senate candidate Theresa Greenfield finally got her Wikipedia page. Here's why it took so long". The Washington Post. Retrieved October 28, 2020.
- ^ Harrison, Stephen (October 27, 2020). "Why Did It Take So Long for the Democratic Senate Candidate in Iowa to Get a Wikipedia Page?". Slate. Retrieved October 28, 2020.
- ^ Glaser, Mark (April 21, 2006). "Wales Discusses Political Bias on Wikipedia". PBS Mediashift. Archived from the original on August 19, 2007. Retrieved August 21, 2007.
- ^ a b c d e f Harrison, Stephen (June 9, 2020). "How Wikipedia Became a Battleground for Racial Justice". Slate. Archived from the original on November 3, 2021. Retrieved August 17, 2021.
- ^ Johnson, Bobbie (March 1, 2007). "Conservapedia—the US religious right's answer to Wikipedia". The Guardian. London. Archived from the original on February 17, 2022. Retrieved March 27, 2010.
- ^ Huntington, Doug (May 9, 2007). "'Design' Proponents Accuse Wikipedia of Bias, Hypocrisy". The Christian Post. Archived from the original on May 14, 2011. Retrieved June 1, 2018.
- ^ Solomon, Lawrence (July 8, 2008). "Wikipropaganda On Global Warming". National Review. CBS News. Archived from the original on August 28, 2008. Retrieved July 20, 2008.
- ^ Scarborough, Rowan (September 27, 2010). "Wikipedia Whacks the Right". Human Events. Archived from the original on December 7, 2010. Retrieved October 3, 2010.
- ^ Sidener, Jonathan (September 23, 2006). "Wikipedia co-founder looks to add accountability, end anarchy". The San Diego Union-Tribune. Archived from the original on January 17, 2018. Retrieved January 16, 2017.
- ^ Kalla, Joshua L.; Aronow, Peter M. (September 2, 2015). "Editorial Bias in Crowd-Sourced Political Information". PLOS ONE. 10 (9) e0136327. Bibcode:2015PLoSO..1036327K. doi:10.1371/journal.pone.0136327. PMC 4558055. PMID 26331611.
- ^ Schneider, Florian (August 16, 2018). China's Digital Nationalism. Oxford University Press. pp. 123–124. ISBN 978-0-19-087681-4. Archived from the original on December 14, 2024. Retrieved September 16, 2023.
- ^ Gustafsson, Karl (July 18, 2019). "International reconciliation on the Internet? Ontological security, attribution and the construction of war memory narratives in Wikipedia". International Relations. 34 (1): 3–24. doi:10.1177/0047117819864410. ISSN 0047-1178. S2CID 200020669.
- ^ Sato, Yumiko (March 19, 2021). "Non-English Editions of Wikipedia Have a Misinformation Problem". Slate. The Slate Group. Archived from the original on August 25, 2023. Retrieved August 23, 2021.
- ^ Sato, Yumiko (January 9, 2021). 日本語版ウィキペディアで「歴史修正主義」が広がる理由と解決策 [Reasons Why "Historical Revisionism" is Widespread on Japanese Wikipedia and Solutions for It]. Yumiko Sato's Music Therapy Journal (in Japanese). Archived from the original on August 6, 2021. Retrieved August 23, 2021.
- ^ Kim, Taehee; Garcia, David; Aragón, Pablo (May 11, 2023). "Controversies over Historical Revisionism in Wikipedia" (PDF). Wiki Workshop 2023. Wikimedia Foundation.
- ^ a b Cohen, Noam (January 29, 2007). "Courts Turn to Wikipedia, but Selectively" Archived March 18, 2017, at the Wayback Machine. The New York Times.
- ^ Palazzolo, Joe (April 23, 2012). "Which Federal Appeals Court Cites Wikipedia Most Often?" Archived November 20, 2018, at the Wayback Machine. The Wall Street Journal.
- ^ "Case ref. O-169-07: In the matter of application no 2277746C by Formula One Licensing B.V., to register the trade mark: "F1"" (PDF). UK Government Intellectual Property Office. June 14, 2007. Archived (PDF) from the original on October 31, 2007. Retrieved October 31, 2007.
- ^ Nordwall v. Secretary of Health & Human Services, No. 05-123V, 2008 WL 857661, at *7 n.6 (Fed. Cl. February 19, 2008) as cited in Capcom Co., Ltd, et al. v. The MKR Group, Inc., No. C 08-0904 RS Archived September 27, 2012, at the Wayback Machine
- ^ Campbell v. Sec'y of Health & Human Servs., 69 Fed. Cl. 775, 781 (Ct. Cl. 2006)
- ^ Cohen, Noam (January 29, 2007). "Courts Turn to Wikipedia, but Selectively". The New York Times. ISSN 0362-4331. Archived from the original on September 22, 2015. Retrieved December 8, 2015.
- ^ "Wikipedia emerges as key source for Virginia Tech shootings". Cyberjournalist.net. April 24, 2007. Archived from the original on October 22, 2007. Retrieved October 31, 2007.—cyberjournalist.net cites this article Cohen, Noam (April 23, 2007). "The Latest on Virginia Tech, From Wikipedia". The New York Times. Archived from the original on April 15, 2009. Retrieved October 31, 2007. for the above quote.
- ^ Vargas, Jose Antonio (September 17, 2007). "On Wikipedia, Debating 2008 Hopefuls' Every Facet". The Washington Post, Page A01. Archived from the original on November 3, 2012. Retrieved October 31, 2007.
- ^ boyd, danah (January 4, 2005). "Academia and Wikipedia". Many-to-Many. Archived from the original on January 12, 2006. Retrieved October 31, 2007.
- ^ Sanger, Larry (September 24, 2001). "Wikipedia is wide open. Why is it growing so fast? Why isn't it full of nonsense?". Kuro5hin. Archived from the original on October 10, 2007. Retrieved October 31, 2007.
- ^ Ito, Joi (August 29, 2004). "Wikipedia attacked by ignorant reporter". Joi Ito's Web. Archived from the original on September 28, 2007. Retrieved October 31, 2007.
- ^ Jaeger, G. professional webpage Archived May 17, 2003, at the Wayback Machine
- ^ Jaeger, Greg (July 2008). "Bits on Quantum Information". Physics Today. Vol. 61, no. 7. p. 10. doi:10.1063/1.2962992. Archived from the original on January 11, 2023.
- ^ Ebert, Roger (November 18, 2008). Roger Ebert's Movie Yearbook 2009. Andrews McMeel Publishing. p. 529. ISBN 978-0-7407-7745-5.
- ^ Ebert, Roger (October 7, 2009). Review of Good Hair Archived December 21, 2012, at the Wayback Machine. RogerEbert.com.
- ^ Ebert, Roger. "Why 3D doesn't work and never will. Case closed." Archived May 1, 2011, at the Wayback Machine Chicago Sun-Times. January 23, 2011
- ^ Ebert, Roger. "The Last Mountain" Archived December 21, 2012, at the Wayback Machine, rogerebert.com, June 22, 2011
- ^ Hall, Sarah. "Rosie vs. Donald: She Said, He Said" Archived April 26, 2011, at the Wayback Machine, E! Online, December 21, 2006
- ^ Park, Robert L. (August 28, 2009). "What's New". bobpark.org. Archived from the original on October 3, 2009. Retrieved July 13, 2010.
- ^ Goertzel, Ted (2011). "Letters to the Editor: Conspiracy Thinking". Skeptical Inquirer. 35 (3): 64.
- ^ "The Conspiracy Meme". Skeptical Inquirer. Vol. 35, no. 1. January/February 2011. p. 37.
- ^ Randi, James (March 18, 2012). "Popoff's Still At It". James Randi Educational Foundation. Archived from the original on March 20, 2012. Retrieved March 22, 2012.
- ^ Hillshafer, David (2013). "The Mass Murder Problem". Skeptic. 18 (1): 24–32.
- ^ Lippard, Jim (2012). "The Decline and (Probable) Fall of the Scientology Empire!". Skeptic Vol. 17 No. 1. pp. 18–27. The citations in question are Citations 10, 14 and 16, as seen on page 27.
- ^ Sheaffer, Robert (2014). "Between a Beer Joint and a Highway Warning Sign: The 'Classic' Cash-Landrum Case Unravels". "Psychic Vibrations". Skeptical Inquirer. 38 (2): 28.
- ^ "Wikipedia Defies Need for Regulation" Archived May 6, 2015, at the Wayback Machine. Stossel. Fox Business. January 4, 2013.
- ^ Goodwin, Jean. (2010). The authority of Wikipedia. In Juho Ritola (Ed.), Argument cultures: Proceedings of the Ontario Society for the Study of Argumentation Conference. Windsor, ON, Canada: Ontario Society for the Study of Argumentation. CD-ROM. 24 pp.
- ^ Jemielniak, Dariusz (2019). "Wikipedia: Why Is the Common Knowledge Resource Still Neglected by Academics?". GigaScience. 8 (12) giz139. doi:10.1093/gigascience/giz139. PMC 6889752. PMID 31794014.
- ^ "Mistakes and hoaxes on-line". Australian Broadcasting Corporation. April 15, 2006. Archived from the original on November 13, 2012. Retrieved April 28, 2007.
- ^ Finkelstein, Seth (September 28, 2006). "I'm on Wikipedia, get me out of here" Archived November 12, 2016, at the Wayback Machine. The Guardian. Inside IT.
- ^ "Top 10 Wikipedia Moments". Time. January 13, 2011. Archived from the original on August 24, 2011. Retrieved August 21, 2011.
- ^ Dedman, Bill (March 3, 2007). "Reading Hillary Clinton's hidden thesis". NBC News. Archived from the original on July 12, 2020. Retrieved March 17, 2007.
- ^ "Hillary Rodham Clinton". Wikipedia. July 9, 2005. Archived from the original on February 16, 2016. Retrieved March 17, 2007.
- ^ "Hillary Rodham Clinton". Wikipedia. March 2, 2007. Archived from the original on February 16, 2016. Retrieved March 17, 2007.
- ^ Paige, Cara (April 11, 2006). "Exclusive: Meet the Real Sir Walter Mitty". Daily Record. Archived from the original on September 30, 2007. Retrieved November 24, 2007.
- ^ "Ségolène Royal et Léon-Robert de l'Astran, le savant qui n'a jamais existé" [Ségolène Royal and Léon-Robert de l'Astran, the scholar who never existed] Archived June 7, 2010, at the Wayback Machine, Le Monde, June 7, 2010
- ^ "Léon-Robert de L'Astran, celui qui n'a jamais existé" [Léon-Robert de L'Astran, the man who never existed] Archived June 11, 2010, at the Wayback Machine, Sud-Ouest, June 7, 2010
- ^ "Ségolène Royal tombe dans le piège de Wikipédia" [Ségolène Royal falls into Wikipedia's trap] Archived June 10, 2010, at the Wayback Machine, Le Figaro, June 8, 2010
- ^ "Royal, toute une Histoire" [Royal, a whole History] Archived June 10, 2010, at the Wayback Machine, Le Journal du Dimanche, June 7, 2010
- ^ "Léon Robert de L'Astran". Archived from the original on September 13, 2006. Retrieved June 8, 2010., article in the French Wikipedia, deleted on June 7, 2010
- ^ Weingarten, Gene (March 16, 2007). "A wickedly fun test of Wikipedia". The News & Observer. Archived from the original on March 20, 2007. Retrieved April 8, 2006.
- ^ McCarthy, Caroline (August 1, 2006). "Colbert speaks, America follows: All hail Wikiality!". CNET. Archived from the original on November 17, 2022. Retrieved November 17, 2022.
- ^ Metz, Cade (January 22, 2009). "Jimbo Wales ends death by Wikipedia". The Register. Archived from the original on April 27, 2010. Retrieved March 31, 2010.
- ^ Goss, Patrick (September 15, 2008). "Vernon Kay shocked at death by Wikipedia". Techradar.com. Archived from the original on May 19, 2011. Retrieved March 31, 2010.
- ^ Pershing, Ben (January 21, 2009). "Kennedy, Byrd the Latest Victims of Wikipedia Errors". The Washington Post. Archived from the original on August 11, 2011. Retrieved May 31, 2011.
- ^ Bachelor, Blane (June 28, 2007). "Web Time Stamps Indicate Benoit Death Reported About 14 Hours Before Police Found Bodies". Fox News. Archived from the original on May 17, 2008. Retrieved May 21, 2008.
- ^ Schoetz, David (June 29, 2007). "Police: Wiki Confession an 'Unbelievable Hindrance'". ABC News. Archived from the original on May 16, 2008. Retrieved May 21, 2008.
- ^ Spring, Corey (June 29, 2007). "The College Student Who 'Knew' About the Benoit Murder-Suicide Before Police". Newsvine. Archived from the original on September 11, 2016. Retrieved May 21, 2008.
- ^ Mirror duped by Wikipedia 'fact' Archived February 17, 2022, at the Wayback Machine (Web User, September 19, 2008)
- ^ "New-look Manchester City side begin their UEFA Cup campaign in earnest". Daily Mirror. September 18, 2008. Archived from the original on August 19, 2009. Retrieved April 13, 2010.
- ^ "Omonia Nicosia 1–2 Manchester City: Goals start to flow for Jo". Daily Mirror. April 9, 2010. Archived from the original on August 13, 2009. Retrieved April 13, 2010.
- ^ Fitzgerald, Shane (May 7, 2009). "Lazy journalism exposed by online hoax". The Irish Times. Archived from the original on November 1, 2011. Retrieved January 8, 2010.
- ^ a b Pogatchnik, Shawn (May 11, 2009). "Irish Student Hoaxes World's Media With Fake Quote". ABC News. Archived from the original on June 28, 2011. Retrieved January 8, 2010.
- ^ "Wikipedia hoax points to limits of journalists' research". arstechnica.com. May 8, 2009. Archived from the original on February 21, 2010. Retrieved January 8, 2010.
- ^ "Sepp Blatter called a 'bellend' during award of South African medal". The Metro. July 15, 2010. Archived from the original on August 6, 2010. Retrieved August 11, 2010.
- ^ "'Wikipedia vandals' strike again in Norman Wisdom obits". The Guardian. London. October 5, 2010. Archived from the original on January 16, 2016. Retrieved December 3, 2010.
- ^ Bailey, Ryan (October 25, 2012). "Asian Football Confederation apologize for calling UAE national team 'Sand Monkeys'". Yahoo! Sports. Archived from the original on March 25, 2014. Retrieved March 25, 2014.
- ^ "Asian soccer body apologizes for 'sand monkeys' slur of UAE team". CBS News. Associated Press. October 15, 2012. Archived from the original on March 25, 2014. Retrieved March 25, 2014.
- ^ a b Archived copy of article at time of deletion
- ^ "Wikipedia:Articles for deletion/Bicholim conflict". Wikipedia. Archived from the original on February 17, 2022. Retrieved May 22, 2013.
- ^ Pfeiffer, Eric (January 4, 2013). "War is over: Imaginary 'Bicholim Conflict' page removed from Wikipedia after five years". Yahoo! News. Archived from the original on January 9, 2013. Retrieved January 8, 2013.
- ^ Morris, Kevin (January 1, 2013). "After a half-decade, massive Wikipedia hoax finally exposed". The Daily Dot. Archived from the original on April 10, 2014. Retrieved March 25, 2014.
- ^ Morris, Kevin (April 25, 2013). "The greatest movie that never was". The Daily Dot. Archived from the original on April 26, 2013. Retrieved March 25, 2014.
- ^ Dewey, Caitlin (April 15, 2015). "The story behind Jar'Edo Wens, the longest-running hoax in Wikipedia history". The Washington Post. Archived from the original on April 19, 2015. Retrieved April 19, 2015.
- ^ "Aussie's Jar'Edo Wens prank sets new record as Wikipedia's longest-running hoax". The Sydney Morning Herald. March 23, 2015. Archived from the original on July 1, 2015. Retrieved July 8, 2015.
- ^ Cush, Andy. "How One Man Made Himself Into an Aboriginal God With Wikipedia". Weird Internet. Gawker Media. Archived from the original on July 9, 2015.
- ^ Benjakob, Omer (October 4, 2019). "The Fake Nazi Death Camp: Wikipedia's Longest Hoax, Exposed". Haaretz. Archived from the original on October 19, 2019. Retrieved December 29, 2021.
- ^ "Wikipedia's 'longest-running hoax' about fake Warsaw death camp revealed". Cleveland Jewish News. October 4, 2019. Archived from the original on November 24, 2021. Retrieved December 29, 2021.
- ^ "Wikipedia page on fake Warsaw concentration camp was 15-year hoax — report". The Times of Israel. October 5, 2019. Archived from the original on November 7, 2021. Retrieved December 29, 2021.
- ^ "A Bored Chinese Housewife Spent Years Falsifying Russian History on Wikipedia". Vice News. July 13, 2022. Archived from the original on July 17, 2022. Retrieved July 13, 2022.
- ^ Diamond, Jonny (June 28, 2022). "A "Chinese Borges" wrote millions of words of fake Russian history on Wikipedia for a decade". Literary Hub. Archived from the original on July 17, 2022. Retrieved August 6, 2022.
- ^ "Alan MacMasters: How the great online toaster hoax was exposed". BBC News. November 19, 2022. Archived from the original on November 20, 2022. Retrieved November 20, 2022.
- ^ "Wikipedia's Credibility Is Toast | Wikipediocracy". wikipediocracy.com. August 11, 2022. Archived from the original on September 4, 2022. Retrieved September 11, 2022.
- ^ Rauwerda, Annie (August 12, 2022). "A long-running Wikipedia hoax and the problem of circular reporting". Input. Archived from the original on September 4, 2022. Retrieved September 11, 2022.
- ^ a b Grabowski, Jan; Klein, Shira (February 9, 2023). "Wikipedia's Intentional Distortion of the History of the Holocaust". The Journal of Holocaust Research. 37 (2): 133–190. doi:10.1080/25785648.2023.2168939. ISSN 2578-5648. S2CID 257188267.
- ^ Piotr Konieczny (2025). "Fake news, an internet troll, and a conspiracy theory about 'Wikipedia's Intentional Distortion of the History of the Holocaust'". Holocaust Studies. 31 (4). Published online June 5, 2025. doi:10.1080/17504902.2025.2511459. Archived from the original on June 7, 2025. Retrieved June 7, 2025.
- ^ Kane, Margaret (January 30, 2006). "Politicians notice Wikipedia". Cnet news.com. Archived from the original on March 11, 2016. Retrieved January 28, 2007.
- ^ "Senator staffers spam Wikipedia". Archived from the original on March 29, 2006. Retrieved September 13, 2006.
- ^ Cohen, Noam (August 31, 2008). "Don't Like Palin's Wikipedia Story? Change It" Archived February 28, 2018, at the Wayback Machine. Technology. The New York Times.
- ^ "Sarah Palins Wikipedia entry glossed over by mystery user hrs. before VP announcement" Archived May 24, 2011, at the Wayback Machine, Thaindian News (September 2, 2008)
- ^ Bachelet, Pablo (May 3, 2006). "War of Words: Website Can't Define Cuba". The Miami Herald. Archived from the original on December 22, 2012. Retrieved July 8, 2008.
- ^ Delay, Larry (August 3, 2006). "A Pernicious Model for Control of the World Wide Web: The Cuba Case" (PDF). Association for Study of the Cuban Economy(ASCE). Archived from the original (PDF) on September 10, 2008. Retrieved July 8, 2008.
- ^ "Wikipédia en butte à une nouvelle affaire de calomnie" [Wikipedia faces a new defamation affair] Archived May 16, 2008, at the Wayback Machine, Vnunet.fr, November 28, 2007.
- ^ Masson, Jean-Louis (November 29, 2007). "Responsabilité pénale des intervenants sur Internet: hébergeur du site, responsable du site et auteur d'allégations diffamatoires" [Criminal liability of Internet participants: site host, site manager, and author of defamatory allegations]. Sénat. Archived from the original on July 21, 2011.
- ^ Narasimhan, Balaji (June 3, 2009). "Handling controversy on Wikipedia". MiD DAY. Archived from the original on October 26, 2011. Retrieved November 7, 2011.
- ^ Woods, Allan (August 25, 2010). "Ottawa investigating Wikipedia edits". Toronto Star. Archived from the original on August 27, 2010. Retrieved August 26, 2010.
- ^ Nabili, Teymoor (September 11, 2010). "The Cyrus Cylinder, Wikipedia and Iran conspiracies". Al Jazeera Blogs. Archived from the original on March 11, 2012. Retrieved March 19, 2012.
- ^ Metz, Cade. "US Department of Justice banned from Wikipedia" Archived August 10, 2017, at the Wayback Machine. The Register, April 29, 2008.
- ^ "Candid camera". Harper's Magazine. July 2008. Archived from the original on June 15, 2011. Retrieved May 31, 2011.
- ^ "Letter in Harper's Magazine About Wikipedia Issues". Camera. August 14, 2008. Archived from the original on June 15, 2010. Retrieved March 31, 2010.
- ^ Oboler, Andre (May 14, 2008). "Exposed – Anti-Israel Subversion on Wikipedia". Media Critiques. Archived from the original on March 16, 2010. Retrieved May 31, 2011.
- ^ Oboler, Andre (May 13, 2008). "Wiki Warfare: Battle for the on-line encyclopedia". The Jerusalem Post. Archived from the original on June 29, 2011.
- ^ McElroy, Damien (May 8, 2008). "Israeli battles rage on Wikipedia". The Daily Telegraph. London. Archived from the original on May 9, 2008. Retrieved May 8, 2008.
- ^ Liphshiz, Cnaan (December 25, 2007). "Your wiki entry counts" Archived June 5, 2011, at the Wayback Machine. Haaretz.
- ^ Gur, Haviv Rettig (May 16, 2010). "Israeli-Palestinian conflict rages on Wikipedia". The Jerusalem Post. Archived from the original on June 29, 2011. Retrieved May 31, 2011.
- ^ Shabi, Rachel; Kiss, Jemima (August 18, 2010). "Wikipedia editing courses launched by Zionist groups". The Guardian. London. Archived from the original on August 19, 2013. Retrieved December 16, 2016.
- ^ Benari, Elad (August 3, 2010). "Zionist Internet Struggle to Hit Wikipedia". Arutz Sheva. Archived from the original on August 21, 2010. Retrieved August 18, 2010.
- ^ a b c Hasson, Nir (March 12, 2012). "The right's latest weapon: 'Zionist editing' on Wikipedia". Haaretz. Archived from the original on May 24, 2012. Retrieved March 19, 2012.
- ^ Benari, Elad (March 8, 2010). "Zionist Internet Struggle to Hit Wikipedia". Arutz Sheva. Archived from the original on December 11, 2011. Retrieved March 19, 2012.
- ^ a b "The battle for Wikipedia: Palestinians counter Israeli editing group". Ynetnews. June 20, 1995. Archived from the original on January 8, 2012. Retrieved March 19, 2012.
- ^ Mackey, Robert (August 23, 2010). "Readers Discuss Wikipedia Editing Course That Aims for 'Balanced and Zionist' Entries" Archived May 9, 2012, at the Wayback Machine.
- ^ Wikipedia founder: Israel-Palestine is heavily debated, but we're vigilant on neutrality Archived May 29, 2012, at the Wayback Machine, Haaretz
- ^ Ewing, Maura (October 23, 2012). "Is Wikipedia going commercial?" Archived June 16, 2013, at the Wayback Machine. Salon.
- ^ Bergstein, Brian (January 24, 2007). "Microsoft Violates Wikipedia's Sacred Rule". The Associated Press. Archived from the original on June 4, 2011.
- ^ Gohring, Nancy (January 23, 2007). "Microsoft said to offer payment for Wikipedia edits" Archived May 17, 2009, at the Wayback Machine. IDG News Service. Retrieved September 3, 2008.
- ^ Gohring, Nancy. "Microsoft's step into Wikipedia prompts debate" Archived May 17, 2009, at the Wayback Machine. IDG News Service.
- ^ "Wiki boss 'edited for donation'". BBC News. March 12, 2008.
- ^ Goldman, Eric (December 5, 2005). "Wikipedia Will Fail Within 5 Years". Archived from the original on January 5, 2010. Retrieved January 16, 2010.
- ^ Claburn, Thomas (December 5, 2006). "Law Professor Predicts Wikipedia's Demise". InformationWeek. Archived from the original on September 5, 2007. Retrieved December 16, 2006.
- ^ Metz, Cade (February 6, 2008). "Wikipedia ruled by 'Lord of the Universe'". The Register. Archived from the original on January 22, 2023. Retrieved January 22, 2023.
- ^ Metz, Cade (March 6, 2008). "Why you should care that Jimmy Wales ignores reality" Archived December 20, 2012, at the Wayback Machine. The Register. Retrieved April 27, 2010.
- ^ Arthur, Charles (December 15, 2005). "Log on and join in, but beware the web cults". The Guardian. London. Archived from the original on May 3, 2006. Retrieved July 14, 2006.
- ^ Sloan, Alastair (March 24, 2015). "Manipulating Wikipedia to Promote a Bogus Business School". Newsweek. Archived from the original on July 3, 2015. Retrieved July 8, 2015.
- ^ Chari, Mridula (March 25, 2015). "Wikipedia bans editor for consistent bias in favour of Arindam Chaudhuri's IIPM". Scroll.in. Archived from the original on April 30, 2015. Retrieved July 8, 2015.
- ^ O'Neil, Mathieu (March 2010). "Shirky and Sanger, or the costs of crowdsourcing". Journal of Science Communication. 9 (1). International School for Advanced Studies. Archived from the original on April 29, 2011. Retrieved May 31, 2011.
External links
- Librarians' Claims and Opinions Regarding Wikipedia Archived February 11, 2011, at the Wayback Machine at LISWiki.
- How pranks, hoaxes and manipulation undermine the reliability of Wikipedia Archived August 8, 2014, at the Wayback Machine, a harsh essay by Andreas Kolbe.
- Wikipedia has become a science reference source even though scientists don't cite it Archived February 10, 2018, at the Wayback Machine. Science News. February 5, 2018.
- Moore, Ryan (March 26, 2025). "When Did RFK Jr. Graduate from UVA Law?". Virginia Law Weekly. Archived from the original on April 10, 2025. Retrieved March 28, 2025.
Wikipedia project pages
- Press coverage
- Replies to common objections
- Researching with Wikipedia
- Statistics
- Wikipedia as a court source (list of cited uses)
- Wikipedia as an academic source (list of cited uses)
- Wikipedia in academic studies (list of studies)
- WikiProject Wikipedia reliability
- America's Top Newspapers Use Wikipedia
- Comparison to Stanford Encyclopedia of Philosophy
- Wikipedia:External peer review
Fundamental Mechanisms Affecting Reliability
Editing Model and Core Policies
Wikipedia's editing model operates on an open collaboration principle, allowing any internet user—registered or anonymous—to add, modify, or remove content in real time, with oversight provided through peer review, reversion of problematic edits, and discussion on article talk pages. This decentralized system enables swift incorporation of new information and collective error correction, as demonstrated by analyses showing that article quality typically improves through iterative edits, with many reaching a stable, high-quality state after sufficient revisions. However, the model is vulnerable to vandalism, hoaxes, and coordinated manipulation, as low barriers to entry permit bad-faith actors to introduce falsehoods that may persist briefly before detection, and contentious topics often devolve into prolonged edit wars where wholesale reversions hinder progress.[6][7] The core content policies—neutral point of view (NPOV), verifiability, and no original research (NOR)—form the foundational guidelines for maintaining reliability. NPOV mandates that articles represent all significant viewpoints from reliable sources in proportion to their prominence, aiming to avoid advocacy or undue weight; verifiability requires all material to be attributable to secondary sources deemed reliable by consensus, excluding primary interpretation; and NOR bars unpublished analyses or syntheses, confining contributions to summarization of existing knowledge. These policies theoretically promote factual integrity by prioritizing evidence over opinion, with verifiability serving as a check against unsubstantiated claims.[8] In practice, enforcement of these policies depends on volunteer consensus, which introduces variability and potential for bias, as disputes are resolved through majority voting or administrative discretion rather than impartial arbitration. 
Empirical studies reveal inconsistent compliance, particularly with NPOV: political articles often exhibit left-leaning slants due to selective sourcing and editor demographics favoring progressive viewpoints, undermining the policy's neutrality ideal. For example, a 2012 analysis of U.S. political entries found deviations from balance aligning more closely with liberal media patterns. Verifiability, while strengthening scientific and historical coverage through citation requirements, falters when "reliable" sources are drawn disproportionately from ideologically aligned institutions such as mainstream media and academia, whose systemic biases render them non-neutral in social and political domains. Overall, while the model and policies enable broad coverage and self-correction on neutral topics, their reliance on community goodwill amplifies reliability risks in polarized areas, where groupthink and participation imbalances distort outcomes.[3][9][10]

Editor Demographics and Motivations
Wikipedia editors exhibit a pronounced demographic imbalance, characterized by overwhelming male dominance, with a 2011 survey of 5,073 respondents reporting 91% male participation.[11] Subsequent Wikimedia Foundation data from the 2020s maintains this skew at approximately 87% male among contributors, alongside underrepresentation of racial minorities such as fewer than 1% identifying as Black or African American in the U.S. editor base.[12] Age demographics center on younger adults, with an average of 32 years in the 2011 study, while educational attainment skews high, as 61% held at least an associate or bachelor's degree.[11] Geographically, editing activity concentrates in Western nations, with the U.S. (20%), Germany (12%), and the U.K. (6%) comprising a significant share of respondents in early surveys, reflecting limited input from the Global South.[11][13] This profile persists despite diversity initiatives, contributing to coverage gaps in non-Western and minority perspectives.[14] Motivations for editing primarily stem from intrinsic and altruistic impulses. In the 2011 survey, 69% initiated contributions to volunteer and disseminate knowledge, a factor sustaining 71% of ongoing participation, while 60% cited enjoyment as a key driver.[11] Peer-reviewed analyses reinforce this, identifying self-concept enhancement—such as bolstering personal identity through communal knowledge-building—as a dominant force surpassing extrinsic incentives like recognition.[15] Other research highlights task-oriented drives, including correcting errors and exchanging information, alongside social rewards from community interaction.[16] For specialized contributors like scientists, motivations include leveraging expertise to counter misinformation, viewing Wikipedia as a public good warranting voluntary upkeep.[17] However, demographic homogeneity intersects with motivations in ways that foster potential bias. 
Direct surveys of political affiliations remain limited, but editing patterns reveal ideological segregation: contributors to politically slanted articles on U.S. topics cluster by partisan leanings, with left-leaning content drawing distinct networks from right-leaning ones.[18] This suggests that for some, editing serves advocacy purposes, advancing preferred narratives under the guise of neutrality, a dynamic critiqued by co-founder Larry Sanger, who attributes systemic left-wing bias to editors' skewed credibility assessments favoring progressive viewpoints over conservatism or traditionalism.[19] Such motivations, amplified by low diversity, can prioritize worldview alignment over empirical detachment, as evidenced by content analyses showing disproportionate negativity toward right-leaning figures.[9] Wikimedia's self-reported data, while useful, may underemphasize these tensions due to institutional incentives for portraying inclusivity.[12] Overall, while core drives emphasize altruism, the editor pool's composition incentivizes selective fact-emphasis, undermining comprehensive reliability.

Governance Structures and Dispute Resolution
Wikipedia employs a decentralized, peer-governed structure where content decisions emerge from community consensus among volunteer editors, without a central editorial board or formal hierarchy. Administrators, elected by the community through requests for adminship, hold technical privileges such as page protection, deletion, and user blocking to enforce policies, but their actions are subject to community oversight via review processes. This adhocratic model, blending anarchic participation with bureaucratic elements like peer accountability, aims to distribute authority but has evolved into informal hierarchies influenced by editor experience and tenure.[20][21] Dispute resolution follows a multi-tiered escalation process: initial discussions on article talk pages, followed by informal mechanisms like third opinions or Requests for Comments (RfCs), specialized noticeboards, volunteer mediation, and ultimately the Arbitration Committee (ArbCom) for conduct violations. RfCs solicit broader input to gauge consensus on contentious issues, while ArbCom, comprising 7-15 elected arbitrators serving staggered terms, issues binding remedies in severe cases, such as topic bans or indefinite blocks. These processes prioritize verifiability, neutrality, and no original research policies to maintain content integrity.[22] Empirical analyses reveal inefficiencies in dispute resolution that undermine reliability. A study of approximately 7,000 RfCs from 2011-2017 found that about 33% remained unresolved due to vague proposals, protracted arguments, and insufficient third-party engagement, allowing contentious content to persist without closure. 
Qualitative insights from frequent closers highlighted deviations from deliberative norms, such as biased framing or niche topic disinterest, exacerbating delays and potential for unaddressed errors.[23] ArbCom decisions exhibit influences from disputants' social capital, defined by edit histories and community ties, which correlates with lighter sanctions to preserve social cohesion over strict norm enforcement. Analysis of 524 cases from 2004-2020 showed a negative relationship between an editor's Wikipedia-related edit volume and sanction severity, with well-connected individuals leveraging preliminary statements for favorable outcomes, often at the expense of newcomers or outsiders. Interviews with 28 editors underscored factional dynamics and "power plays," suggesting that governance favors entrenched networks of predominantly experienced, Western editors, potentially perpetuating ideological imbalances in content control.[24] Critics argue that inconsistent policy application and power concentration in a small administrative corps enable capture by motivated subgroups, hindering neutral resolution and allowing biases to embed in articles. While rapid consensus works for uncontroversial topics, high-stakes disputes risk stalemates or rulings that prioritize harmony over factual rigor, contributing to variable article quality and vulnerability to systemic skews.[20]

Empirical Evaluations of Factual Accuracy
Comparisons with Traditional Encyclopedias
A 2005 investigation by Nature compared the accuracy of 42 biomedical science articles from Wikipedia and Encyclopædia Britannica, enlisting experts to identify factual errors, omissions, or misleading statements. The review found 162 such issues in Wikipedia entries versus 123 in Britannica, yielding averages of 3.9 and 2.9 errors per article, respectively, suggesting comparable overall reliability despite Wikipedia's collaborative model.[1] Britannica disputed the methodology, claiming Nature inflated errors through inconsistent criteria, such as counting disputed interpretations as mistakes and overlooking Britannica's corrections; their independent audit identified four serious errors in Britannica (none major in Wikipedia) but 45 major errors in Wikipedia across the sample.[25] Subsequent empirical work has reinforced similarities in raw factual accuracy while highlighting differences in systematic biases. For instance, analyses of historical and political topics, such as national histories, reveal Wikipedia entries occasionally omitting contentious events (e.g., wars) more frequently than Britannica equivalents, potentially due to editorial self-censorship in crowd-sourced environments lacking centralized expertise.[26] Traditional encyclopedias like Britannica employ paid subject-matter experts and rigorous peer review, minimizing factual deviations through controlled revision cycles, whereas Wikipedia's volunteer-driven process enables quicker factual corrections but exposes content to transient inaccuracies during disputes.[27] Quantitative assessments of ideological slant further differentiate the platforms. Greenstein and Zhu's 2012–2014 studies, examining U.S. 
political biographies and election coverage, measured slant via linguistic markers (e.g., partisan word usage) and found Wikipedia articles exhibited 25% greater bias toward Democratic viewpoints than Britannica's, with Wikipedia's median slant score at 1.96 versus Britannica's 1.62 on a normalized scale; however, articles with higher edit volumes trended toward neutrality over time.[28] These findings attribute Wikipedia's edge in volume (over 6 million articles by 2014 versus Britannica's curated ~65,000) to its scalability, but underscore traditional encyclopedias' advantage in consistent neutrality via expert gatekeeping, reducing vulnerability to demographic skews among contributors.[29] Overall, while factual error rates align closely, Wikipedia's reliability lags in bias-prone domains due to its decentralized governance compared to Britannica's professional curation.

Quantitative Studies on Error Rates
A 2005 comparative study published in Nature examined the accuracy of 42 Wikipedia articles on scientific topics by having experts review them alongside corresponding Encyclopædia Britannica entries. The analysis identified 162 factual errors, omissions, or misleading statements in Wikipedia compared to 123 in Britannica, yielding an average of approximately four errors per Wikipedia article and three per Britannica article.[1] Britannica contested the methodology, arguing that the error count was inflated by including minor issues and that their own review found Wikipedia's inaccuracies to be about a third higher when applying stricter criteria.[25] In a 2006 evaluation of historical content, Roy Rosenzweig assessed 25 Wikipedia biographies of notable Americans against Encarta and the American National Biography Online. Wikipedia achieved an 80% factual accuracy rate, lower than the 95-96% for the professional sources, with errors primarily in minor details and more frequent omissions of nuance. The study attributed this partly to Wikipedia's reliance on volunteer editors without specialized historical training, though it noted comparable rates to other encyclopedias in broad factual claims.[30] A 2012 analysis in Public Relations Review reviewed 60 Wikipedia articles on companies, cross-checked against official filings and websites. It found factual errors in 60% of entries, including inaccuracies in founding dates, revenue figures, and executive names, suggesting vulnerabilities in coverage of self-interested or commercial topics due to unverified edits. Focusing on medicine, a 2014 study by Hasty et al. in the Journal of the American Osteopathic Association compared Wikipedia articles on the 10 most costly U.S. health conditions (e.g., diabetes, lung cancer) to peer-reviewed literature. 
Nine out of 10 articles contained errors, defined as contradictions or inconsistencies with evidence-based sources, with issues in treatment efficacy, risk factors, and prognosis. The authors highlighted Wikipedia's limitations for clinical decision-making, as errors persisted despite citations to primary research.[31]

| Study | Domain | Sample Size | Error Rate in Wikipedia | Comparison |
|---|---|---|---|---|
| Nature (2005) | Science | 42 articles | ~4 errors/article (162 total) | Britannica: ~3 errors/article (123 total)[1] |
| Rosenzweig (2006) | U.S. History Biographies | 25 articles | 80% accuracy (20% error/omission) | Encarta/ANBO: 95-96% accuracy |
| PRR (2012) | Companies | 60 articles | 60% with factual errors | Official sources (e.g., SEC filings) |
| Hasty et al. (2014) | Medical Conditions | 10 articles | 90% with errors/inconsistencies | Peer-reviewed medical literature |
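The per-article averages in the table follow directly from the raw counts; a short check of the arithmetic, using only the figures reported in the Nature comparison above:

```python
# Per-article error rates from the Nature (2005) comparison:
# total reported errors/omissions divided by the 42-article sample.
def errors_per_article(total_errors: int, articles: int) -> float:
    return total_errors / articles

wikipedia_rate = errors_per_article(162, 42)   # ~3.9, as reported
britannica_rate = errors_per_article(123, 42)  # ~2.9, as reported

print(f"Wikipedia:  {wikipedia_rate:.1f} errors/article")
print(f"Britannica: {britannica_rate:.1f} errors/article")
# Relative gap: 162/123 - 1 is roughly 0.32, i.e. about a third more
# errors in Wikipedia, consistent with Britannica's "a third higher" claim.
```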
Domain-Specific Accuracy Assessments
Assessments of Wikipedia's factual accuracy reveal variability across domains, with stronger performance in empirical sciences and medicine compared to humanities and politically sensitive topics. A 2005 comparative study by Nature magazine evaluated 42 science articles, identifying 162 factual errors in Wikipedia entries versus 123 in Encyclopædia Britannica, concluding that Wikipedia's science coverage approached professional encyclopedia standards despite occasional minor inaccuracies.[1] Similarly, the 2012 EPIC-Oxford study, involving expert evaluations of articles in English, Spanish, and Arabic across disciplines including biology and physics, found Wikipedia scoring higher on accuracy (mean 3.6 out of 5) than competitors like Citizendium in several scientific categories, though it lagged in depth for specialized subfields.[32] In medicine, Wikipedia demonstrates high factual reliability, particularly for drug monographs and disease descriptions, though completeness remains a limitation. A 2011 analysis of 100 drug articles rated 99.7%±0.2% for factual accuracy against professional pharmacology references, with errors primarily in omissions rather than fabrications.[33] A 2020 review of health-related content corroborated this, noting that 83% of medical articles achieved "good" or "very good" quality ratings by experts, outperforming non-medical entries due to stricter sourcing norms enforced by domain-specific volunteer editors and citation to peer-reviewed journals.[34] However, studies highlight gaps in nuanced topics like nutrition, where accuracy averaged 78% in a 2014 evaluation, often due to oversimplification of conflicting evidence.[35] Historical articles exhibit lower accuracy, attributed to interpretive disputes and reliance on secondary sources prone to revisionism. 
A comparative analysis of historical encyclopedia entries reported Wikipedia's accuracy at 80%, compared to 95-96% for established references, with errors stemming from uncited claims or selective framing of events.[36] The EPIC-Oxford evaluation echoed this, assigning history articles a mean accuracy score of 3.2 out of 5, below science but above popular online alternatives, due to challenges in verifying primary sources amid edit wars over contentious narratives.[32] In politics and biographies, factual details on verifiable events and careers are generally reliable, especially for prominent figures, but prone to inconsistencies in lesser-covered topics. A 2011 study of U.S. congressional candidate biographies found Wikipedia provided accurate political experience data in 100% of cases examined, sufficient for quantitative analysis.[37] Brigham Young University research similarly validated its utility for political education, with error rates under 5% for election-related facts, though coverage completeness favored high-profile individuals over niche or historical politicians.[38] These strengths derive from cross-verification by ideologically diverse editors, yet domain experts caution that factual precision does not preclude subtle distortions in aggregation of sourced material.[39]

Systemic Biases and Neutrality Challenges
Evidence of Ideological Left-Leaning Bias
A 2024 computational analysis by David Rozado examined over 1,000 Wikipedia articles on public figures and U.S. politics, using large language models to annotate sentiment and emotional associations. The study found that Wikipedia content tends to link right-of-center figures and terms with more negative sentiment and emotions like anger or disgust compared to left-of-center equivalents, indicating a mild to moderate left-leaning bias.[2] Similar patterns emerged in assessments of political terminology, where right-leaning concepts received disproportionately negative framing.[40] Earlier quantitative research by Shane Greenstein and Feng Zhu compared Wikipedia's coverage of U.S. political topics to Encyclopædia Britannica across thousands of articles from 2008 to 2017. Their findings revealed that Wikipedia exhibited greater left-leaning slant in phrasing and emphasis, particularly on contentious issues like economics and civil rights, exceeding Britannica's neutrality in 2016 and 2018 updates. A 2012 precursor study by the same authors measured slant in 28,000 U.S. 
politics articles via dictionary-based methods, confirming Wikipedia's entries leaned left on average, though revisions reduced but did not eliminate the disparity.[3] Wikipedia co-founder Larry Sanger has publicly asserted since 2020 that the platform's articles reflect a systemic left-wing bias, citing examples such as the deprecation of conservative-leaning sources like the Daily Wire while permitting left-leaning outlets, and skewed portrayals of topics like socialism and gender issues.[9] Sanger attributes this to editor demographics and enforcement of neutrality policies that favor "establishment" views, a claim echoed in analyses showing persistent ideological asymmetry in high-controversy articles despite policy guidelines.[41] These patterns align with broader empirical observations of content divergence: for instance, articles on right-leaning politicians often emphasize controversies with higher frequency and intensity than analogous left-leaning profiles, as quantified through natural language processing of revision histories.[42] While Wikipedia's neutral point of view policy aims for balance, studies indicate it fails to fully counteract the aggregate effect of editor incentives and sourcing preferences, resulting in measurable left-leaning distortions in political coverage.[2]

Coverage Gaps and Selection Biases
Wikipedia's article coverage reveals systematic gaps, particularly in topics aligned with the interests and demographics of its predominantly Western, male, and left-leaning editor base, resulting in underrepresentation of non-Western perspectives, female-centric subjects, and conservative viewpoints. A 2025 global analysis of over 6 million articles identified disparities tied to factors including citizenship and political ideology, with coverage skewed toward contributors from high-income, urbanized regions and excluding events or figures from lower-wealth or peripheral areas.[13] Similarly, a 2023 study of event articles on Wikipedia quantified regional biases, showing that events in wealthier nations receive disproportionate attention relative to their global occurrence, while those in developing regions face coverage shortfalls of up to 50% compared to population-adjusted expectations.[43] These gaps stem from self-selection among editors, who prioritize familiar subjects, amplifying imbalances in notability assessments under Wikipedia's guidelines.[44] Gender-related coverage exhibits pronounced selection biases, with biographies of women comprising fewer than 20% of total entries and receiving shorter treatments alongside reduced visual elements.[45] A 2022 controlled analysis of over 1,000 biographies confirmed that articles on racial and gender minorities are systematically shorter and employ less neutral language patterns, attributing this to editor demographics where women constitute under 20% of active contributors.[46] In scholarly citations, publications by female authors are cited 10-15% less frequently in Wikipedia than expected based on academic impact metrics, reflecting sourcing preferences that favor male-dominated fields.[47] Such patterns indicate not mere oversight but structural selection against topics perceived as niche by the core editing community. 
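The population-adjusted comparison used in the event-coverage studies can be sketched as a simple representation ratio: a region's share of articles divided by its share of a baseline such as population. The regional shares below are hypothetical placeholders for illustration, not figures from the cited studies:

```python
# Illustrative population-adjusted coverage check (hypothetical figures).
# ratio = article_share / population_share; 1.0 means proportionate
# coverage, and shortfall is how far coverage falls below expectation.
regions = {
    # region: (share_of_event_articles, share_of_world_population)
    "high-income":   (0.55, 0.30),
    "middle-income": (0.35, 0.50),
    "low-income":    (0.10, 0.20),
}

for region, (article_share, pop_share) in regions.items():
    ratio = article_share / pop_share
    shortfall = max(0.0, 1.0 - ratio)
    print(f"{region}: ratio {ratio:.2f}, shortfall {shortfall:.0%}")
```

With these placeholder numbers, the low-income region's ratio of 0.5 corresponds to the kind of 50% shortfall relative to population-adjusted expectations described above.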
Ideological selection biases manifest in the deprecation of conservative-leaning sources and undercoverage of right-of-center figures or events, as mainstream media, often cited as "reliable" under Wikipedia policy, exhibits documented left-leaning tilts that influence notability decisions. A 2025 report by the Media Research Center documented instances where conservative outlets like the Daily Wire were blacklisted as unreliable, limiting their use in verifying alternative narratives and contributing to article deletions or stubs on topics like U.S. conservative policy debates.[48] In political science coverage, entries disproportionately feature American male scholars, with living female and non-U.S. political scientists underrepresented by factors of 2-3 relative to their field prominence, per a 2022 assessment of over 500 academics.[49] A 2024 causal analysis of 1,399 articles further linked these gaps to editor ideological clustering, where left-leaning majorities enforce sourcing norms that marginalize dissenting views, reducing overall neutrality in topic selection. This reliance on ideologically aligned secondary sources perpetuates exclusion, as empirical reviews show Wikipedia's political entries lag in breadth compared to balanced encyclopedic benchmarks.[9]

Conflict-of-Interest and Advocacy Editing
Conflict-of-interest (COI) editing on Wikipedia occurs when individuals or groups edit articles to promote external interests, such as financial gains, corporate reputations, or ideological agendas, rather than adhering to neutral point-of-view principles.[50] This practice undermines the encyclopedia's reliability by introducing biased content that may persist due to inconsistent enforcement by volunteer moderators.[50] Academic analyses have identified thousands of such articles; for instance, one dataset compiled 3,280 COI-affected entries through content-based detection methods, highlighting patterns like promotional language and self-referential sourcing.[51] Paid editing services represent a prominent form of COI, often involving public relations firms hired to enhance client pages without disclosure. In 2013, the firm Wiki-PR was exposed for using hundreds of undisclosed accounts to edit on behalf of paying clients, leading to widespread blocks after community investigations revealed systematic manipulation.[52] Similarly, medical device company Medtronic employed staff and consultants to favor edits promoting kyphoplasty procedures, attempting to alter articles on related treatments despite lacking independence.[50] More recently, as of 2025, large law firms have been documented paying undisclosed editors to excise mentions of scandals from their entries, violating transparency rules and prioritizing reputational control over factual completeness.[53] Advocacy-driven editing further exacerbates reliability issues, particularly in politically charged topics, where coordinated groups advance partisan narratives. 
A 2025 investigation identified at least 30 editors collaborating on over 1 million edits across more than 10,000 articles related to Israel and the Israeli-Palestinian conflict, with activity spiking after October 7, 2023.[54] These editors, linked to advocacy efforts like Tech for Palestine recruitment, removed citations documenting terrorism (e.g., Samir Kuntar's convictions) and reframed events to minimize Palestinian violence while amplifying anti-Israel framing, such as portraying Zionism as colonialist.[54] Such coordination, evidenced by 18 times more inter-editor communications than neutral groups, evades detection tools, allowing biased content to influence high-traffic pages.[54] Enforcement challenges compound these problems, as declining volunteer numbers (e.g., a 40% drop in medical topic editors from 2008 to 2013) limit oversight, enabling undisclosed edits to proliferate.[50] Machine learning approaches for early detection of undisclosed paid editors have shown promise, outperforming baselines in identifying anomalous patterns, yet widespread adoption remains limited.[55] Consequently, COI and advocacy editing contribute to systemic distortions, where external incentives override empirical sourcing, eroding trust in Wikipedia as a verifiable reference.[50]

Error Propagation and Correction Processes
Vandalism and Rapid Reversion Mechanisms
Vandalism on Wikipedia refers to deliberate edits intended to disrupt or degrade content quality, including insertions of falsehoods, obscenities, or nonsensical material. The English Wikipedia encounters roughly 9,000 such malicious edits each day.[56] These acts constitute a small but persistent fraction of overall activity, estimated at about 2% of edits in sampled periods.[57] To counter vandalism, Wikipedia relies on rapid detection and reversion through layered mechanisms combining human oversight and automation. Human patrollers, including approximately 5,000 rollbackers and 1,400 administrators, monitor recent changes feeds to identify and undo suspicious edits via rollback tools that restore prior versions en masse.[56] Assisted tools like Huggle and STiki queue potentially problematic edits for review using algorithms analyzing metadata, language patterns, and edit characteristics.[56] Automated bots form the frontline for swift intervention, scanning edits within seconds of submission. 
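Patrol tools of the kind described above rank incoming edits by suspicion using metadata and language signals. The scoring rule below is a toy illustration of that idea under assumed feature names; it is not the actual feature set or model of Huggle, STiki, or any real anti-vandalism bot, which use trained classifiers over far richer inputs:

```python
# Toy edit-scoring heuristic in the spirit of anti-vandalism review queues.
# All thresholds, weights, and the word list are illustrative assumptions.
PROFANITY = {"stupid", "idiot"}  # placeholder word list

def vandalism_score(edit: dict) -> float:
    """Higher score = more suspicious; used to order a review queue."""
    score = 0.0
    if edit["anonymous"]:
        score += 0.3            # unregistered editors carry higher base risk
    if edit["chars_removed"] > 500:
        score += 0.4            # large blanking is a classic vandalism signal
    words = edit["added_text"].lower().split()
    if any(w in PROFANITY for w in words):
        score += 0.5            # obscene insertions
    if edit["added_text"].isupper():
        score += 0.2            # ALL-CAPS insertions
    return min(score, 1.0)      # cap at 1.0 for queue ordering

edit = {"anonymous": True, "chars_removed": 0, "added_text": "HELLO EVERYONE"}
print(vandalism_score(edit))    # anonymous + all-caps signals fire
```

A real system would feed such features into a trained model (as ClueBot NG does with neural networks) rather than hand-set weights, but the queue-ordering principle is the same.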
Prominent examples include ClueBot NG, which employs neural networks trained on human-classified data to detect anomalies in edit behavior, achieving reversions in as little as 5 seconds and accumulating over 3 million such actions since 2011.[56] These bots revert approximately one edit per minute on average and eliminate about 50% of vandalistic contributions autonomously.[56] Edit filters, numbering around 100, preemptively block or warn on high-risk edits from new or unregistered users based on predefined rules.[56] The combined efficacy of these systems ensures most obvious vandalism is corrected rapidly, often within minutes, contributing to reverts comprising up to 10% of total edits by 2010.[58] Vandalism prevalence fluctuates, reaching one in six edits during off-peak hours and one in three during high-activity periods, yet reversion analyses confirm high precision (82.8%) and recall (84.7%) in identifying damaging changes after the fact.[59] While effective against blatant disruptions, these mechanisms are less adept at subtle or coordinated efforts, allowing some persistence until manual review.[58]
Circular Referencing and Information Loops
Circular referencing occurs when Wikipedia articles cite external sources that, in turn, derive their information directly or indirectly from Wikipedia, forming interdependent loops that mask the absence of independent verification. This practice violates Wikipedia's verifiability policy, which requires claims to be supported by reliable, published sources rather than self-reinforcing cycles.[60] Such loops often arise when unverified or fabricated details are added to Wikipedia, subsequently copied by secondary sources like news outlets or blogs, which then serve as citations back to the original article, creating an illusion of multiple corroborating references.[61] The term "citogenesis," describing this process of fact generation through circular reporting, was coined by Randall Munroe in his October 25, 2011, xkcd comic, which depicted a sequence where a false claim enters Wikipedia, propagates to external media, and returns as "sourced" validation.[62] In practice, this enables the persistence of misinformation, as editors and readers perceive looped citations as evidence of consensus rather than tracing back to potentially dubious origins. For instance, niche historical or biographical details lacking primary documentation can gain entrenched status when media outlets, seeking quick references, reproduce Wikipedia content and are then cited in return, amplifying errors across the information ecosystem.[60] These information loops exacerbate reliability challenges by eroding traceability to empirical or authoritative primaries, particularly in under-sourced topics where volunteer editors prioritize apparent sourcing over origin scrutiny. Academic research guides warn that such cycles facilitate "fact-laundering," where low-quality or invented information acquires undue legitimacy, complicating efforts to correct or debunk it once embedded.
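The loop structure described above can be modeled as a directed citation graph, where a cycle signals a potential citogenesis chain. A minimal sketch, with invented node names; real detection would parse actual citation data:

```python
# Detect circular sourcing: model "A cites B" as a directed edge and
# look for cycles with a depth-first search. Node names are invented
# for illustration.

def find_cycle(graph, start):
    """Return a citation cycle reachable from start, or None."""
    path, on_path = [], set()

    def dfs(node):
        path.append(node)
        on_path.add(node)
        for cited in graph.get(node, []):
            if cited in on_path:                      # back-edge: loop found
                return path[path.index(cited):] + [cited]
            cycle = dfs(cited)
            if cycle:
                return cycle
        path.pop()
        on_path.discard(node)
        return None

    return dfs(start)

citations = {
    "wiki:ClaimX": ["news:OutletA"],   # article cites a news story...
    "news:OutletA": ["wiki:ClaimX"],   # ...which itself drew on the article
    "wiki:ClaimY": ["book:Primary"],   # independently sourced claim
}
print(find_cycle(citations, "wiki:ClaimX"))  # loop: ClaimX -> OutletA -> ClaimX
print(find_cycle(citations, "wiki:ClaimY"))  # None: traces back to a primary source
```

The hard part in practice is not the graph search but building the edges: determining that an outlet's story "derives from" a Wikipedia revision requires provenance analysis, which is why such loops often go undetected.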
Wikipedia acknowledges the risk through guidelines prohibiting self-citation and templates flagging circular sources, yet detection relies on vigilant community oversight, which is inconsistent for obscure entries.[60] Empirical observation from documented cases shows that once loops form, reversion requires dismantling multiple interdependent references, often delaying accurate updates for months or years.[61] The causal mechanism here stems from Wikipedia's open-editing model intersecting with the web's copy-paste culture: initial insertions evade scrutiny due to volume, secondary sources prioritize speed over verification, and feedback reinforces the loop, perpetuating inaccuracies until challenged by external fact-checkers or primary evidence. This dynamic disproportionately affects fringe or evolving subjects, where source scarcity invites speculation disguised as fact, undermining the platform's claim to encyclopedic neutrality and verifiability.[60]
Persistence of Misinformation Despite Policies
Despite Wikipedia's core policies requiring verifiability from independent reliable sources and adherence to a neutral point of view, misinformation has repeatedly endured for years without detection or correction. The "Bicholim Conflict," a fabricated account of an undeclared 1640–1641 war between Portugal and the Indian Maratha Empire, persisted as a detailed article from 2007 until its deletion in January 2013, evading scrutiny for over five years despite multiple citations to nonexistent sources.[63][64] Empirical analyses of hoaxes underscore this vulnerability: a comprehensive study of 496 detected hoaxes on English Wikipedia revealed that while 90% were flagged within one hour of creation, 1% endured for more than one year, and another 1% garnered over 1,000 page views or citations in other articles, amplifying their reach before removal.[65] Subtle fabrications, mimicking legitimate content with plausible but invented references, often bypass community patrols and revert mechanisms, as policies depend on editor vigilance rather than automated enforcement.[4] Deliberate insertion experiments further quantify persistence risks. In a 2015 test embedding false but innocuous claims across articles, 63% of the misinformation remained uncorrected for weeks or months, highlighting delays in challenging entries lacking immediate controversy.[66] A 2023 replication and extension of an earlier false-claims study found that while over one-third of added disinformation was reverted within hours, the majority lingered longer, particularly in low-traffic pages where policy enforcement is sporadic. 
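The survival statistics cited above reduce to simple fractions over detection delays. A minimal sketch, with invented sample delays standing in for the per-hoax data such studies collect:

```python
# Summarize hoax lifetimes in the style of the 496-hoax study:
# what fraction is flagged within an hour, and what fraction
# survives more than a year? The sample delays are invented.

HOUR, YEAR = 3600, 365 * 24 * 3600  # durations in seconds

def survival_summary(detection_delays_s):
    n = len(detection_delays_s)
    return {
        "flagged_within_1h": sum(d <= HOUR for d in detection_delays_s) / n,
        "survived_over_1y": sum(d > YEAR for d in detection_delays_s) / n,
    }

delays = [120, 600, 1800, 3000, 7200, 86400, 2 * YEAR, 40, 300, 900]
print(survival_summary(delays))  # 70% flagged within an hour; 10% survive over a year
```

The skew this produces—most fabrications caught almost immediately, a long tail enduring for years—is exactly the pattern the hoax studies report.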
Scientific misinformation exhibits similar inertia: a September 2025 investigation of citations to retracted papers showed that invalid research continues to be referenced in Wikipedia articles long after retractions, with incomplete updates failing to mitigate propagation to readers.[67] This persistence stems from the decentralized editing model, where policy violations require consensus among volunteers, often delaying action on entrenched or unchallenged content until external verification intervenes.
Notable Incidents and Case Studies
High-Profile Hoaxes and Fabricated Content
One of the most elaborate fabrications on Wikipedia was the "Bicholim conflict," an invented article detailing a supposed 17th-century border war between the villages of Bicholim and Satari in Goa, India. Created on November 2, 2007, by an anonymous editor, the 4,500-word entry described fictional battles, diplomatic maneuvers, and a fabricated treaty, complete with citations to non-existent historical sources.[63] It evaded detection for over five years until December 20, 2012, when an editor identified inconsistencies in the referenced materials, leading to its deletion on January 4, 2013.[68] The hoax ranked among Wikipedia's longer-running deceptions, highlighting how detailed pseudohistory can persist amid limited expert oversight for obscure topics.[69] Another record-setting hoax, "Jar'Edo Wens," claimed to describe an ancient Australian Aboriginal deity embodying the physical form of earthly knowledge and creator of human suffering through physical contact. Added on May 29, 2005, by an anonymous editor from an Australian IP address—later identified as user "John666"—the article survived nine years, nine months, and five days before deletion on March 3, 2015, after a user queried its legitimacy on the Reliable Sources Noticeboard.[70] It achieved apparent credibility through mutual citations with other hoax entries, such as "Dilga" and "Wagyl," and edits by unwitting contributors, exemplifying circular reinforcement where fabricated content bolsters itself.[71] The perpetrator, an Australian serial hoaxer, had inserted similar fabrications elsewhere, underscoring systemic challenges in verifying culturally niche claims without primary source verification.[72] In a case of rapid misinformation spread, Dublin student Shane Fitzgerald inserted a phony quote—"Gandhi said: 'When I despair, I remember that all through history the way of truth and love has always won.
There have been tyrants and murderers and for a time they seem invincible, but in the end, they always fall. Think of it, always' "—into the Mahatma Gandhi Wikipedia page in 2009, falsely attributing it to a 1940s interview.[73] The fabrication circulated to over 100 news sites, including ABC News and The Guardian, within hours, before Fitzgerald revealed it as a test of media verification practices six days later.[73] This incident demonstrated Wikipedia's potential as an unvetted source for secondary dissemination, with outlets failing to independently confirm the quote despite its absence from verified Gandhi archives.[73] These hoaxes reveal persistent vulnerabilities in Wikipedia's model, where anonymous edits on under-monitored subjects can endure through superficial plausibility and lack of contradictory evidence, even as policies mandate reliable sourcing.[74] In 2021, investigations uncovered a coordinated effort by a Chinese editor fabricating over 200 articles on invented historical events, further illustrating how state-influenced or prank-driven deceptions exploit low-traffic pages.[75] Such cases, often uncovered only by vigilant users rather than proactive checks, call into question the encyclopedia's safeguards against deliberate invention.[76]
Political and Ideological Editing Scandals
In 2006, investigations tracing edits to official IP addresses—a technique later automated by the WikiScanner tool released in 2007—revealed that staffers from United States congressional offices had made over a thousand edits to Wikipedia articles, often to remove embarrassing details or insert favorable information about politicians.[77] For instance, edits from Senator Joe Biden's office altered his biography to downplay a plagiarism controversy involving his 1988 presidential campaign speeches.[77] Similar changes were traced to offices of other members, including efforts to delete references to scandals or ethical issues, prompting Wikipedia administrators to block edits from certain government IPs and sparking debates over conflict-of-interest policies.[78] By 2014, persistent disruptive edits from US House of Representatives computers—totaling hundreds annually and focusing on political topics—led to a formal ban on anonymous editing from those IPs, as administrators deemed them violations of neutrality guidelines.[78] Analysis showed patterns of whitewashing controversies, such as softening criticisms of lawmakers' voting records or campaign finance issues, highlighting how institutional access enabled ideological or self-serving manipulations despite Wikipedia's volunteer oversight.[77] More recent scandals involve coordinated ideological campaigns by clusters of editors.
A 2025 Anti-Defamation League report documented at least 30 Wikipedia editors collaborating over years to insert anti-Israel narratives into articles on the Israeli-Palestinian conflict, circumventing policies by using sockpuppet accounts and selectively citing sources to amplify biased framing while suppressing counterviews.[54] This included systematic downgrading of pro-Israel perspectives as "propaganda" and elevation of contentious claims without balanced sourcing, illustrating how small, ideologically aligned groups can dominate contentious topics.[79] Wikipedia co-founder Larry Sanger has publicly attributed such incidents to systemic left-leaning bias among editors, arguing in 2024-2025 statements that the platform's reliance on a self-selecting volunteer base—predominantly holding progressive views on politics, religion, and culture—fosters "capture" by ideologues who enforce viewpoints through blacklists of conservative sources and revert edits challenging dominant narratives.[80] A 2024 Manhattan Institute study empirically supported this, finding Wikipedia articles on US politics more likely to adopt Democratic-aligned language (e.g., "civil rights") than Republican equivalents, with conservative topics showing higher rates of negative framing based on citation patterns.[9] These cases underscore vulnerabilities in Wikipedia's decentralized model, where ideological editing scandals erode claims of neutrality without robust external verification.[81]
Scientific and Medical Misinformation Events
In a 2014 analysis published in the Journal of the American Osteopathic Association, researchers compared Wikipedia entries on the ten most costly medical conditions in the United States—such as ischemic heart disease, lung cancer, and hypertension—with information from peer-reviewed medical literature and UpToDate, a clinical decision support resource. The study identified factual errors or omissions in 90% of the Wikipedia articles, including misleading statements on diagnosis (e.g., implying hypertension could be diagnosed from a single elevated reading without specifying confirmatory protocols) and treatment (e.g., incomplete or inaccurate guidance on managing conditions like diabetes or stroke). These discrepancies arose from reliance on secondary sources, outdated data, or unsourced edits, leading the authors to recommend caution in using Wikipedia for medical information, particularly by patients and trainees.[82][83] A 2025 study on Wikipedia's handling of retracted scientific papers revealed persistent citation of invalid research, with 71.6% of 1,181 analyzed instances deemed problematic: many citations were added before retraction but not removed afterward, while others were introduced post-retraction without noting the invalidation. For example, approximately 50% of retracted papers cited in Wikipedia articles lacked any indication of their retraction status, allowing flawed scientific claims—such as fabricated data in biomedical studies—to propagate despite Wikipedia's verifiability policies and tools like Citation Bot. This issue spans fields like medicine and biology, where retracted papers on topics from drug efficacy to genetic mechanisms continued influencing article content years later, highlighting gaps in editor vigilance and automated detection. 
The analysis, drawing from Retraction Watch data and Wikipedia edit histories, underscored how collaborative editing fails to systematically purge discredited science, potentially misleading readers on empirical validity.[67][84] Chemical inaccuracies provide another documented case of enduring scientific errors. A 2017 letter in the Journal of Chemical Education detailed multiple instances of glaring structural errors in Wikipedia chemistry articles, such as incorrect depictions of molecular bonds and functional groups in entries for compounds like tryptamine and certain neurotransmitters, which persisted for years despite reports to editors. One example involved a misdrawn Kekulé structure for benzene derivatives, violating basic valence rules, while another featured erroneous stereochemistry in alkaloid pages; these flaws remained uncorrected even after direct notifications, attributed to editor inexperience in specialized domains and resistance to non-consensus changes. Such errors, often sourced from unreliable images or unverified uploads, undermine Wikipedia's utility for scientific education and research reference.[85] During the COVID-19 pandemic, Wikipedia's coverage of the SARS-CoV-2 lab-leak hypothesis exemplified delayed correction of scientific narratives amid ideological editing pressures. Until mid-2021, the platform's articles frequently framed the hypothesis—supported by U.S. intelligence assessments and virological analyses of furin cleavage sites—as a "conspiracy theory" or "debunked," citing early consensus statements from sources like The Lancet while downplaying counter-evidence from gain-of-function research at the Wuhan Institute of Virology. Editing wars, documented through talk-page disputes and revert logs, involved blocks of pro-lab-leak edits as "misinformation," with the hypothesis only reclassified as a viable origin scenario after FBI and Department of Energy endorsements in 2023. 
This persistence reflected broader challenges in neutral sourcing for contentious science, where reliance on mainstream outlets—often aligned with natural-origin advocacy—delayed updates despite accumulating empirical indicators like proximity to high-risk labs and database deletions.[86]
Expert and Institutional Perspectives
Academic and Research Evaluations
A 2005 comparative analysis published in Nature examined 42 science articles from Wikipedia and Encyclopædia Britannica, finding that Wikipedia contained on average four serious errors and omissions per article, compared to three in Britannica, leading to the conclusion that Wikipedia approached Britannica's accuracy in scientific entries.[1] However, Britannica contested the methodology, arguing that the study's expert reviewers were not blinded to article sources, potentially introducing bias, and that Wikipedia had 162 factual errors versus Britannica's 123 across the reviewed content.[25] Subsequent pilot studies, such as a 2012 multilingual comparison, echoed similar findings of parity in select topics but highlighted variability by language edition and subject depth.[87] In medical and health domains, evaluations have yielded mixed results; a 2014 review of Wikipedia's coverage of mental health disorders found it comparable to professional sources in accuracy but often lacking in completeness and clinical nuance.[88] A 2011 assessment of pharmacological articles reported high factual overlap with textbooks, yet a broader 2016 analysis of patient drug information revealed gaps in completeness and readability relative to official medication guides.[35] These inconsistencies underscore that while Wikipedia performs adequately in verifiable scientific facts, it frequently underperforms in synthesizing complex, evidence-based recommendations, with accuracy rates varying from 70-90% depending on the metric and topic.[89] Research on ideological bias has identified systematic left-leaning slants, particularly in political and biographical content; a 2012 econometric study of over 28,000 Wikipedia articles developed a slant index based on partisan media citations, revealing a leftward bias stronger than in Britannica or Encyclopædia.com.[3] More recent computational analyses, including a 2024 examination of sentiment associations in articles on public figures, found 
Wikipedia more likely to link right-of-center terms and individuals with negative connotations, with effect sizes indicating mild to moderate asymmetry not fully mitigated by editor diversity.[9] Field experiments, such as a Yale study randomizing edits to political stubs, confirmed that crowd-sourced contributions exhibit detectable biases mirroring contributors' ideologies, persisting despite reversion policies.[90] These findings suggest that while factual reliability holds in neutral domains, interpretive neutrality falters under open editing, influenced by editor demographics skewed toward progressive viewpoints. Overall, academic consensus acknowledges Wikipedia's utility for broad overviews and as a starting point for research, with error rates often comparable to traditional encyclopedias in STEM fields, but cautions against reliance in contentious or specialized areas due to bias propagation and incomplete sourcing.[88] Longitudinal metrics from multilingual quality assessments further indicate that article reliability correlates positively with edit volume and reference density, yet systemic underrepresentation of conservative perspectives raises questions about causal mechanisms in content curation.[91]
Journalistic Reliance and Internal Critiques
Journalists frequently consult Wikipedia for background information and quick reference during reporting, despite guidelines from organizations like the Poynter Institute advising against direct citation due to its editable nature and potential for transient errors.[92] This reliance can propagate inaccuracies when unverified content from the encyclopedia is incorporated into news articles without independent fact-checking. A notable 2009 experiment by University College Dublin student Shane Fitzgerald illustrated this vulnerability: he inserted a fabricated quote falsely attributed to Mahatma Gandhi—"When I despair, I remember that all through history the way of truth and love has always won. There have been tyrants and for a time they seem invincible, but in the end, they always fall. Think of it, always."—into the Wikipedia entry on the Indian independence leader; the hoax persisted for five weeks, during which it was reproduced without verification by outlets including The Huffington Post, The Washington Post, and The Globe and Mail.[73][93] Such incidents underscore how Wikipedia's open-editing model, while enabling rapid updates, exposes journalistic workflows to risks of "citation pollution," where media reports citing erroneous Wikipedia content create circular validation loops.[94] Internal critiques of Wikipedia's reliability have emerged prominently from within its founding and editing community, highlighting systemic issues in editorial control and bias mitigation.
Larry Sanger, who co-founded Wikipedia in 2001 alongside Jimmy Wales, has been a leading voice, arguing since his departure in 2002 that the platform's volunteer-driven model has devolved into ideological capture by anonymous activists prioritizing agendas over neutrality.[80] In a May 2020 essay, Sanger detailed how Wikipedia exhibits "serious bias problems" on politically charged topics, such as conservative figures and events, where sourced facts are downplayed or omitted in favor of narratives aligned with left-leaning viewpoints prevalent among editors.[95] By September 2025, in an op-ed for The Free Press, he proposed reforms including stricter expert verification and reduced anonymity to restore reliability, claiming the site's current state renders it untrustworthy for contentious subjects due to unchecked manipulation by a small cadre of ideologically motivated contributors.[80] These concerns align with broader internal acknowledgments of uneven enforcement of neutral point of view policies, as evidenced by Sanger's observation that Wikipedia's reliance on secondary sources from biased institutions amplifies distortions rather than correcting them through first-hand scrutiny.[96] While Sanger's critiques, informed by his foundational role, emphasize causal failures in Wikipedia's decentralized governance—such as the dominance of unaccountable editors over credentialed experts—defenders within the community often counter that aggregate editing corrects errors over time, though empirical cases like prolonged hoaxes suggest otherwise.[97] This internal discord reflects deeper tensions between Wikipedia's aspirational openness and the practical realities of human biases influencing content persistence, with Sanger attributing much of the degradation to a shift from collaborative knowledge-building to factional control post-2000s expansion.[98]
Legal, Judicial, and Policy Usage
Courts in the United States have cited Wikipedia in over 400 judicial opinions, sometimes taking judicial notice of its content or basing legal reasoning on it.[99] A 2022 study by researchers at MIT and the University of California analyzed the impact of Wikipedia articles on judicial behavior, finding that the creation of a Wikipedia entry on a specific legal case increased its citations in subsequent court opinions by more than 20 percent, suggesting the encyclopedia shapes judges' awareness and application of precedents.[100] This influence persisted even after controlling for other factors, with a randomized control trial confirming that exposure to Wikipedia articles affects judicial decision-making in experimental settings.[101] Despite such usage, numerous courts have explicitly criticized Wikipedia's reliability for legal purposes, emphasizing its editable nature and potential for anonymous alterations. In a 2008 Texas appellate decision, the court deemed Wikipedia entries "inherently unreliable" because they can be modified by anyone without verification, rejecting their use as evidence.[102] The Texas Supreme Court in 2017 similarly disfavored reliance on Wikipedia, advising against it in formal legal analysis due to risks of inaccuracy.[103] Federal courts have issued parallel warnings, with some circuits holding it as an unreliable source and cautioning against its evidentiary weight.[104] In the United Kingdom, a 2023 analysis highlighted concerns that senior judges' frequent reference to Wikipedia could propagate erroneous information, potentially undermining judgment quality amid unverified edits.[105] Policy contexts reflect similar skepticism; for instance, many academic and professional guidelines in legal education prohibit citing Wikipedia in formal submissions, viewing it as insufficiently authoritative for policy formulation or regulatory reliance. 
Government entities have occasionally monitored or sought to influence Wikipedia editing rather than adopting it as a policy tool, underscoring doubts about its stability for official use.[75] Overall, while Wikipedia permeates informal judicial research, explicit policy discourages its standalone role in binding decisions to mitigate risks of factual distortion.
Views from Traditional Encyclopedia Producers
Robert McHenry, former editor-in-chief of Encyclopædia Britannica, critiqued Wikipedia in a 2004 essay titled "The Faith-Based Encyclopedia," arguing that its reliance on anonymous, volunteer editors without verifiable expertise fosters a system akin to communal faith rather than scholarly accountability, where errors persist due to the absence of identifiable authorship and pre-publication review.[106] He illustrated this by examining Wikipedia's article on Alexander Hamilton, noting multiple factual inaccuracies and speculative content that remained uncorrected despite the platform's purported self-correcting mechanism.[106] Encyclopædia Britannica Inc. further challenged Wikipedia's reliability in a 2006 response to a Nature journal study that purported to find comparable error rates in science articles between the two (3.9 errors per Wikipedia article versus 2.9 for Britannica).[107] The company deemed the study "fatally flawed," citing methodological issues such as undisclosed reviewer identities, inconsistent error classification (e.g., counting reviewer misinterpretations as encyclopedia errors), and selective article sampling that overlooked Britannica's broader editorial standards, which include commissioning named experts and rigorous fact-checking.[25] Britannica maintained that its professional processes ensure higher factual precision and depth, contrasting Wikipedia's vulnerability to vandalism, bias from unvetted contributors, and incomplete sourcing.[25] These views underscore a core contention from traditional producers: encyclopedic reliability demands hierarchical expertise and editorial gatekeeping, absent in Wikipedia's decentralized model, which prioritizes volume and accessibility over sustained accuracy.[108] While acknowledging Wikipedia's utility for broad overviews, such critiques emphasize its inadequacy for authoritative reference, particularly in complex or contentious topics where anonymous edits can propagate misinformation without
institutional recourse.[107]
Tools, Metrics, and Recent Developments
Internal and External Reliability Tools
Wikipedia maintains internal tools primarily designed to detect vandalism, assess edit quality, and support content moderation to bolster article reliability. The Objective Revision Evaluation Service (ORES), launched in 2015, employs machine learning models to score revisions for potential damage and evaluate overall article quality, enabling editors to prioritize problematic edits.[109][110] These models are trained on human-labeled data from tools like Wiki labels, achieving high precision in identifying revertable edits across languages.[109] Automated bots complement ORES by rapidly reverting vandalism; for instance, systems using statistical language models and active learning detect subtle disruptions like sneaky vandalism, reducing response times compared to manual patrols.[111][112] Internal quality assessment frameworks, such as those rating articles from stub to featured class, further guide improvements by evaluating factual completeness and sourcing, though these rely on community consensus rather than automated metrics alone.[113] Externally, third-party tools like WikiTrust provide independent reliability indicators by analyzing revision history and author reputation to color-code text based on trustworthiness.[114] Introduced around 2009, WikiTrust highlights words from anonymous or low-reputation contributors in orange, with fading intensity for persistent content, aiming to alert readers to potentially unreliable passages without altering Wikipedia's core process.[115][116] Evaluations of WikiTrust demonstrated its utility in surfacing vandalism-prone revisions, though adoption waned as it required extensions for MediaWiki and browsers.[117] Recent external efforts include datasets like Wiki-Reliability for training models on content accuracy, facilitating broader research into propagation of errors.[118]
Quantitative Metrics for Article Quality
Wikipedia maintains an internal content assessment system that categorizes articles into quality classes ranging from stub (minimal content) to featured article (highest standard, requiring comprehensive sourcing, neutrality, and stability). This system, applied by volunteer editors, provides a quantitative distribution metric: as of October 2023, the English Wikipedia's approximately 6.7 million articles include roughly 6,800 featured articles and 35,000 good articles, representing less than 0.1% and about 0.5% of the total, respectively, while over 80% are stubs or start-class with limited depth and verification.[119][120] Featured articles demonstrate measurably higher stability, maintaining high-quality content for 86% of their lifecycle compared to 74% for non-featured articles, as measured by edit reversion rates and content persistence in a 2010 statistical analysis.[121] Expert-reviewed studies yield error rate metrics, often revealing variability by topic. A 2005 blind peer review by Nature of 42 science articles identified 162 factual errors, omissions, or misleading statements in Wikipedia entries (averaging 3.9 per article) versus 123 in Encyclopædia Britannica (averaging 2.9 per article), indicating comparable but slightly higher error density in Wikipedia.[1] Britannica disputed the findings, claiming methodological flaws such as selective error counting inflated Wikipedia's inaccuracies by a factor of three relative to their own.[25] Subsequent domain-specific assessments show higher precision in technical fields; for instance, a 2014 evaluation of pharmacology articles found Wikipedia's drug information accurate in 99.7% ± 0.2% of cases against expert consensus. Automated predictive models offer scalability metrics for quality estimation. 
The Objective Revision Evaluation Service (ORES), deployed by the Wikimedia Foundation, uses machine learning to classify articles into six quality tiers, achieving up to 64% accuracy in multi-class prediction and a mean absolute error of 0.09 in regression-based scoring on held-out datasets. Systematic reviews of such models indicate random forest classifiers reach 51-60% accuracy using features like reference count, edit history, and structural elements, though performance drops for lower classes like stubs due to sparse data.[122] These metrics correlate positively with manual assessments: articles with more references and edits (e.g., over 100 revisions) are 2-3 times more likely to reach B-class or higher, per lifecycle analyses.[123]
| Metric Type | Example Value | Domain/Notes | Source |
|---|---|---|---|
| Featured Article Proportion | <0.1% of total articles | English Wikipedia, 2023 | [119] |
| Error Rate (Errors per Article) | 3.9 (Wikipedia) vs. 2.9 (Britannica) | Science topics, 2005 | [1] |
| Accuracy in Specialized Topics | 99.7% ± 0.2% | Pharmacology, 2014 | |
| ML Prediction Accuracy | 64% (6-class) | Article quality models, 2023 | |
| High-Quality Lifetime | 86% (featured) vs. 74% (non-featured) | Edit stability, 2010 | [121] |
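The feature-based approach used by models like ORES can be illustrated with a toy scorer. The sketch below is not ORES's actual model; the features are the structural signals the literature cites (reference count, revision count, section count), but the thresholds, weights, and tier labels are invented for demonstration.

```python
# Toy stand-in for an article-quality predictor. Real systems such as ORES
# train machine-learning classifiers on many features; here we hand-weight
# three structural signals and map the combined score to a coarse tier.
# All cutoffs below are illustrative assumptions, not ORES parameters.

def predict_quality(references: int, revisions: int, sections: int) -> str:
    """Map simple structural features to a coarse quality tier."""
    score = 0.0
    score += min(references, 50) / 50    # sourcing depth, capped at 50 refs
    score += min(revisions, 100) / 100   # edit-history maturity, capped at 100
    score += min(sections, 10) / 10      # structural completeness, capped at 10
    if score >= 2.5:
        return "FA/GA"   # featured/good-article tier
    if score >= 1.5:
        return "B/C"     # mid tier
    if score >= 0.5:
        return "Start"
    return "Stub"

# A heavily edited, well-sourced article versus a brand-new stub:
print(predict_quality(references=60, revisions=250, sections=12))  # FA/GA
print(predict_quality(references=1, revisions=5, sections=1))      # Stub
```

Even this crude rule reproduces the correlation noted above: articles with many references and over 100 revisions land in the higher tiers, while sparsely sourced, rarely edited pages score as stubs.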
Reform Efforts and Ongoing Criticisms (2020–2025)
In response to persistent concerns over content reliability and bias, the Wikimedia Foundation advanced several initiatives during 2020–2025 as part of its broader Movement Strategy, finalized in 2020, which emphasized structural changes for sustainability, inclusion, and equitable decision-making to indirectly bolster editorial quality.[125] These included investments in tools for provenance tracking and knowledge integrity, such as projects to curate information sources more efficiently and to combat disinformation through community-driven verification processes.[126] Additionally, the WikiCred program, ongoing since earlier years but intensified in this period, hosted events such as WikiCredCon 2025, which targeted improvements in the handling of reliable sources amid rising threats like editor harassment and doxxing attempts that undermine neutral editing.[127]

Despite these measures, criticisms of Wikipedia's reliability intensified, particularly regarding ideological imbalances in article framing. Co-founder Larry Sanger, who has alleged a systemic left-leaning bias since a 2020 analysis citing uneven sourcing and editorial suppression of conservative viewpoints, reiterated in 2025 that anonymous editing and administrative overreach perpetuate censorship of dissenting perspectives, including on topics such as COVID-19 origins and political figures.[95][128] A 2024 Manhattan Institute study analyzing over 1,500 Wikipedia articles via natural language processing found statistically significant negative sentiment toward right-leaning terms and figures compared with left-leaning equivalents, attributing this to editor demographics skewed toward urban, progressive contributors.[40][10]

External scrutiny escalated in 2025 with a U.S. House Oversight Committee investigation launched by Republicans in August, probing allegations of coordinated manipulation by foreign actors and U.S. academics to insert anti-Israel and pro-Russia content, and demanding details on Wikipedia's detection of such "bad actors" and on editor transparency.[129][130] Critics, including Sanger and outlets documenting these issues, argued that Wikipedia's reliable sources policy favors establishment media, often critiqued for left-wing tilts, while marginalizing primary or alternative data, exacerbating reliability gaps in politically charged topics.[131] Concurrently, active editor numbers stagnated at around 130,000 monthly contributors by mid-decade, with the core of experienced editors declining due to burnout and to AI tools such as ChatGPT diverting research traffic, further straining quality oversight.[132] These developments underscored unresolved tensions between Wikipedia's open model and demands for rigorous, unbiased curation.

References
- https://www.mediawiki.org/wiki/ORES
- https://meta.wikimedia.org/wiki/Wikipedia_featured_articles
- https://meta.wikimedia.org/wiki/Movement_Strategy/Recommendations
