Wikipedia stands as an undeniable success in online participation and collaboration. However, previous attempts at studying collaboration within Wikipedia have focused on simple metrics like rigor (i.e., the number of revisions in an article’s revision history) and diversity (i.e., the number of authors that have contributed to a given article) or have made generalizations about collaboration within Wikipedia based upon the content validity of a few select articles. By looking more closely at metrics associated with each extant Wikipedia article (N=3,427,236) along with all revisions (N=225,226,370), this study attempts to understand what collaboration within Wikipedia actually looks like under the surface. Findings suggest that typical Wikipedia articles are not rigorous, in a collaborative sense, and do not reflect much diversity in the construction of content and macro–structural writing, leading to the conclusion that most articles in Wikipedia are not reflective of the collaborative efforts of the community but, rather, represent the work of relatively few contributors.
The English section of Wikipedia boasts over 3.4 million articles that “have been written collaboratively by volunteers around the world” (Wikipedia, 2010) and has consistently remained in the list of the top 10 most visited sites on the Internet (Alexa, 2010). Despite its generally open format, Wikipedia has gained wider acceptance and support since its inception as Internet users have warmed up to the idea of community–generated content being potentially as accurate as that generated by experts. Most studies on Wikipedia up to this point have focused on addressing the validity of select articles (Chesney, 2006; Giles, 2005; Kittur, et al., 2007; Kostakis, 2009; Lih, 2004; Lindsey, 2010). Yet, putting the validity of individual articles aside, the mere existence and popularity of Wikipedia should be of interest to scholars of many fields, because it may be a shining example of truly organic and successful collaboration on the Internet. Addressing this critical issue, an important study conducted by Wilkinson and Huberman (2007) utilizing all existing Wikipedia articles at the time found a general correlation between article quality and number of edits, suggesting that the quality of Wikipedia articles is indeed connected to on–going collaborative efforts. However, the actual form that this collaboration takes within Wikipedia has not been adequately explored, and it is the purpose of the current study to determine what collaboration in the online encyclopedia actually looks like in order to better understand the development of organic online communities and what constitutes collaborative success within them.
Determining the validity of information on Wikipedia has become a major issue in recent years as more and more users are attracted to the site’s seemingly endless supply of community–generated information. Early on in Wikipedia’s development, Lih (2004) sought to establish two metrics for determining the reputability of any given Wikipedia article, which were termed rigor and diversity. Lih’s rigor refers to the number of revisions an article has undergone in its lifespan, while diversity, or diversity of opinion, represents the number of unique contributors taking part in the revision cycle of a given article. By identifying rigor and diversity, Lih hoped that individual Wikipedia articles could be assessed at any given time in their development to determine if they exhibited sufficient validity (depth of treatment of the subject and diversity of perspective) to be used as legitimate sources in journalism. Though Lih’s framework lacked empirical testing at the time, Wilkinson and Huberman’s (2007) later finding that article quality did, in fact, correlate with number of edits (i.e., rigor) suggests that articles did indeed improve with successive edits and followed an overall pattern of improvement as more and more people contributed to them (i.e., diversity).
To corroborate these findings, most attempts at studying Wikipedia have focused upon evaluating the quality of select Wikipedia articles, evaluating them as discrete knowledge products, and have generally yielded positive results. Giles (2005) reported that Wikipedia was no more flawed than was the professionally–written Encyclopædia Britannica in terms of its content, based upon a study which compared numbers of errors found in a handful of articles from each. Similarly, Chesney (2006) distributed Wikipedia articles to expert and non–expert research staff to assess for credibility and concluded that Wikipedia accuracy was high, based upon low reports of errors from subject matter experts. The overall empirical validity of such approaches, however, cannot be said to extend to all articles within Wikipedia, since both studies only focused on featured articles, or those selected by the Wikipedia community or administrators as being of exemplary value.
Attempts at understanding Wikipedia as a collaborative system have followed similar patterns of focusing on featured articles, but again, the findings of such approaches cannot be generalized to Wikipedia as a whole, because though certain featured articles may reflect deep collaboration, it does not follow that collaboration is a common characteristic of Wikipedia articles in general. An important collaborative finding by Jones (2008), however, was that the manner in which users contribute to Wikipedia featured articles changes drastically over time and that the time at which a user begins contributing to a featured article impacts how he or she contributes: whereas early contributors supply article structure and content, later contributors provide copy editing and grammatical changes. This raises questions for the study conducted by Wilkinson and Huberman (2007), discussed earlier, which found correlations between rigor, diversity, and article quality, because if Jones (2008) is correct, then the differences between featured (or high–quality) articles and non–featured articles may be attributable to differences in spelling, grammar, typography, and other micro–structural edits rather than differences in content validity or depth of treatment of the subject. If this is the case, then, as we move past mere rigor to determine what diverse authors are actually contributing to the development of articles, the content of articles may turn out to be the product of a very select few authors and may, thereby, not truly reflect the shared perspective and wisdom of the community at large.
The purpose of this study is to begin to peel back the surface of Wikipedia and to understand what rigor and diversity actually look like in article development generally. By so doing, it is hoped that we can gain a better understanding of what collaboration on Wikipedia actually looks like from the perspective of individual contributors and what role rigor and diversity actually play in the development and validity of an article. As such, this study considers all current Wikipedia articles (N=3,427,236) along with their complete revision histories (N=225,226,370). To understand the collaborative essence of each Wikipedia article and its constituent revisions, analysis takes into consideration the following:
- Rigor, or the number of edits an article undergoes;
- Diversity, or the number of authors contributing to an article;
- Diversity index, or the percentage of edits that are made by unique contributors (i.e., diversity/rigor);
- Revision chaining, or instances wherein the same author edits the same article multiple times in succession without someone else responding;
- Collaborative rigor, or the number of edits an article undergoes if revision chains are collapsed;
- Revision lengths;
- Contributions made by registered users;
- Contribution index, or the percent of revisions made by a contributor that were made to unique articles (i.e., articles revised/revisions).
Revision chaining was deemed to be important, because an author merely saving an edit to continue editing at a later date does not seem to be a valid signifier of rigor from a collaborative perspective, because it does not reflect on–going collaboration or negotiated meaning–making between authors. For this reason, collaborative rigor was used to signify a more accurate representation of the back–and–forth discussions that are indicative of truly collaborative work.
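The per–article metrics defined above can be sketched in a few lines. The study’s actual extraction used PHP scripts and MySQL queries; the Python function below is an illustrative reimplementation (the function and variable names are mine, not from the original scripts), taking an article’s revision authors in chronological order:

```python
def article_metrics(revision_authors):
    """Compute the study's per-article collaboration metrics from a
    chronological list of revision authors. Illustrative sketch only;
    names do not come from the original PHP/MySQL pipeline."""
    rigor = len(revision_authors)           # total revisions
    diversity = len(set(revision_authors))  # unique contributors
    # Collapse revision chains: successive revisions by the same author
    # count once, since they resemble one continuous editing session
    # rather than back-and-forth collaboration.
    collaborative_rigor = sum(
        1 for i, author in enumerate(revision_authors)
        if i == 0 or author != revision_authors[i - 1]
    )
    diversity_index = diversity / rigor     # share of edits by unique contributors
    return rigor, diversity, collaborative_rigor, diversity_index

# Example: A edits twice in a row (a chain), then B, then A again.
print(article_metrics(["A", "A", "B", "A"]))   # → (4, 2, 3, 0.5)
```

Note that collapsing the chain reduces rigor from 4 to 3 while diversity is unaffected, which is exactly the distinction the study draws between rigor and collaborative rigor.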
Data gathering and analysis for this study consisted of three main steps:
- downloading the most recent Wikipedia dump (22 July 2011);
- utilizing a PHP–based XML parser and custom scripts to extract desired information and store it in a MySQL database; and,
- utilizing custom PHP scripts and MySQL queries to output results.
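The parsing step can be sketched as a single streaming pass over the dump. The study used a PHP–based XML parser; the Python sketch below is illustrative only, and assumes the standard MediaWiki XML export layout (a <page> containing a <title> and a series of <revision> elements, each with a <contributor> holding a <username> or <ip>):

```python
import io
import xml.etree.ElementTree as ET

def local(tag):
    """Strip an XML namespace prefix like '{http://...}' from a tag name."""
    return tag.rsplit("}", 1)[-1]

def iter_revisions(xml_source):
    """Yield (page_title, contributor) pairs from a MediaWiki export stream.
    Streaming (iterparse) keeps memory flat even for a multi-hundred-GB dump.
    Illustrative sketch of the study's PHP parsing step, not the original code."""
    title = None
    for _event, elem in ET.iterparse(xml_source, events=("end",)):
        tag = local(elem.tag)
        if tag == "title":
            title = elem.text
        elif tag == "revision":
            contributor = None
            for child in elem.iter():
                if local(child.tag) in ("username", "ip"):
                    contributor = child.text
            yield title, contributor
            elem.clear()   # free the finished revision subtree

# Minimal fabricated sample in the export layout assumed above.
sample = io.StringIO(
    "<mediawiki><page><title>Example</title>"
    "<revision><contributor><username>Alice</username></contributor></revision>"
    "<revision><contributor><ip>127.0.0.1</ip></contributor></revision>"
    "</page></mediawiki>"
)
print(list(iter_revisions(sample)))   # → [('Example', 'Alice'), ('Example', '127.0.0.1')]
```

In a real pipeline each yielded pair would be written to a database table (as the study did with MySQL) rather than collected into a list.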
The English version of Wikipedia has grown rapidly since articles first started to be created in 2001. Until recently, the number of articles created and the number of revisions made each year grew annually as more and more people became aware of the encyclopedia and submitted contributions to it. Since 2007, however, there has been a steady decrease in articles created each year, and since 2008, the number of revisions made each year has decreased slightly. The reason for this decrease likely has little to do with the popularity of Wikipedia as a collaborative platform and source of knowledge and probably has more to do with the fact that there are only a finite number of topics that encyclopedia articles can be written about and that article contributors will eventually feel less of a need to contribute to articles as they become more complete. Yet, participation in Wikipedia seems to remain as healthy as ever, as the number of revisions made per article created has increased every year since 2001 without exception.
Though millions of revisions are made to Wikipedia articles each year, the calculation of rigor for these articles reveals a very large range and margin of error. The rigor (i.e., number of revisions) of articles spans between 1 and 30,629, with a median rigor of 21 and a mean of 66 ±263. If we take revision chaining into consideration, however, and discount successive revisions made by the same user, the collaborative rigor that emerges is significantly smaller, spanning from 1 to 26,842, with a median collaborative rigor of 14 and a mean of 44 ±186. By removing chained revisions, the overall number of revisions decreases by more than 33 percent, revealing that one–in–three revisions in Wikipedia consists of users responding to their own edits or continuing an on–going edit they began themselves. This finding seems important, because if one–third of the rigor of Wikipedia articles consists of authors merely picking up where they left off in an article revision, then these contributions seem no more collaborative than would be a lone writer saving a manuscript for later editing. Though rigor’s connection to article quality may not necessarily be in dispute by this finding, its connection to collaborative input is, suggesting that even the most rigorous Wikipedia articles may have significantly weaker rigor when viewed from a collaborative perspective.
Further, these calculations of rigor differ greatly from Lih’s comparison of selected topics, in which he found a median rigor of 61 (2003), a difference certainly due to the non–random selection of articles used in that study and its overlooking of revision chains. The lower calculations of rigor found in the present study suggest that though some Wikipedia articles may develop to a certain level of quality or validity through successive edits, the standard of rigor established by Lih’s study is far above the collaborative rigor of Wikipedia articles in general (and indeed above that found in nearly 81 percent of all Wikipedia articles), suggesting that if rigor is an indicator of quality, then four out of five Wikipedia articles do not meet this standard. That having been said, if we look at only the most rigorous Wikipedia articles (n=120,000), then these numbers jump drastically. The median rigor for these more thoroughly developed articles is 520, with a median collaborative rigor of 343, which is well above that found by Lih and, therefore, seems to suggest a comparatively high level of quality.
Similar to the calculation of rigor, the calculation of diversity revealed that the number of unique users contributing to each article spanned a wide range from 1 to 12,437, with a median of 12 contributors per article and a mean diversity of 32 ±113. Again, this stands in stark contrast to Lih’s study, which found a median of 37 unique contributors per article (2003), and suggests that the vast majority of Wikipedia articles fall short of the diversity standard. In fact, 40 percent of all Wikipedia articles have a diversity of 10 or less, with two–thirds of all articles having a diversity of 20 or less.
Further, a very high correlation was found between diversity and rigor (r² = .922) and between diversity and collaborative rigor (r² = .962), indicating a nearly linear relationship among these measures. This extremely high correlation suggests that diversity, rigor, and collaborative rigor may actually be measurements of the same phenomenon and that establishing diversity as a separate measure may represent an overcomplication. To further illustrate this relationship, a diversity index was generated by dividing diversity by rigor, resulting in a range from .05 percent (low diversity) to 100 percent (full diversity). The average diversity index for all articles was 63.1 percent ±21.3 percent, with a median of 63.6 percent, revealing that about two–thirds of all revisions to article pages reflected additions from unique contributors (i.e., those who had not previously made a revision to that same article), thereby confirming that most contributors do not continue working on an article beyond their initial edit and that diversity and rigor are closely related.
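Correlations of this kind reduce to a standard coefficient of determination over paired per–article values. The helper below is an illustrative sketch of that computation, not the study’s actual analysis code:

```python
def r_squared(xs, ys):
    """Squared Pearson correlation (coefficient of determination) between
    two equal-length samples, e.g. per-article diversity vs. rigor."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov * cov / (var_x * var_y)

# Perfectly linear data gives r² = 1.0; article-level diversity vs. rigor
# in the study came out close to that (r² = .922).
print(r_squared([1, 2, 3, 4], [2, 4, 6, 8]))   # → 1.0
```

Values near 1.0, as the study reports, mean knowing an article’s diversity almost fully determines its rigor, which is the basis for the claim that the two measures may capture the same phenomenon.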
From a collaborative perspective, these considerations of rigor and diversity only matter if revisions contributed by users have individual value. That is, even if an article were to have a high rigor and diversity, depending upon the nature of each contributor’s revisions, the article may exhibit very little collaborative meaning–making (e.g., if one contributor writes an article and then 20 other contributors make punctuation adjustments). Thus, it is also important to recognize the nature of revisions that are being made in order to understand collaboration. Of all revisions made, the median character difference (i.e., the number of letters, numbers, etc. added or removed from the previous revision) is 27 with a mode of 0, meaning that the most common character change of a revision did not change the length of the revision at all. Additionally, the number of revisions that have a particular number of character differences reduces gradually as the character difference increases.
Thus, most revisions make very little change in the number of characters from the previous revision, with 31 percent of all revisions consisting of a character change of less than 10 and 51 percent of less than 30. This suggests that most revisions are micro–structural or typographical in nature, as Jones (2008) observed, since 30 characters represents a very short sentence (around five words).
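One plausible reading of the character–difference measure is the absolute change in length between each revision’s text and its predecessor’s; the original scripts may have defined it somewhat differently, so the sketch below is an assumption:

```python
def character_differences(revision_texts):
    """Absolute change in character count between each revision and its
    predecessor. A plausible reading of the study's 'character difference'
    measure; the original definition may differ (e.g., signed changes)."""
    return [
        abs(len(curr) - len(prev))
        for prev, curr in zip(revision_texts, revision_texts[1:])
    ]

# An expansion followed by a same-length fix ('.' → '!'), illustrating how
# a typographical edit can register a character difference of zero.
revisions = ["A stub.", "A stub, expanded.", "A stub, expanded!"]
print(character_differences(revisions))   # → [10, 0]
```

A mode of 0, as the study reports, corresponds to the second case: edits that replace characters without changing revision length at all.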
The 3.2 million registered users who made revisions to articles on Wikipedia wrote an average of 49 ±2,231 revisions each, with a median of two. This low median and high standard deviation reflect the fact that 51 percent of all registered contributing users made only two or fewer revisions to Wikipedia, with 81 percent making fewer than 10 revisions, thereby confirming that the typical user does not continue collaborating after contributing one or two edits. Further, a staggering 93 percent of all revisions made by registered contributors were made by 10 percent (n=314,768) of those registered, with 78 percent of all revisions made by registered contributors coming from the top one percent (n=31,914) of contributors.
This finding contradicts the notion that Wikipedia as a whole represents the collaborative efforts of the community of registered volunteers, since top contributors so dominate overall contributions. Further, if we generate a contribution index for contributors by dividing the number of articles revised by a given contributor by the number of revisions made by that same contributor, we find that 43 percent of contributors have a contribution index of 100 percent (that is, they make only a single edit to any article they modify) and that the number of articles edited by a contributor and the number of revisions made by a contributor are very highly correlated (r² = .931). This finding, again, suggests that as users contribute to Wikipedia, they in many cases are not taking an on–going interest in articles but are, rather, correcting minor errors or making stylistic changes that do not reflect continued development of the article’s content, jumping between various articles rather than focusing continually upon specific ones.
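The contribution index can be sketched from (contributor, article) pairs; again, this is an illustrative Python version of what the study’s MySQL queries would compute, with names of my own choosing:

```python
from collections import defaultdict

def contribution_indexes(revisions):
    """Per-contributor contribution index: unique articles revised divided
    by total revisions made. `revisions` is a list of (contributor, article)
    pairs; an illustrative sketch, not the study's original queries."""
    articles = defaultdict(set)   # contributor -> set of articles touched
    counts = defaultdict(int)     # contributor -> total revisions made
    for contributor, article in revisions:
        articles[contributor].add(article)
        counts[contributor] += 1
    return {c: len(articles[c]) / counts[c] for c in counts}

# Alice touches three different articles once each (index 1.0, the
# "drive-by" pattern the study describes); Bob revises one article
# three times (index ≈ 0.33, sustained interest in a single article).
edits = [("Alice", "X"), ("Alice", "Y"), ("Alice", "Z"),
         ("Bob", "X"), ("Bob", "X"), ("Bob", "X")]
print(contribution_indexes(edits))
```

An index of 100 percent, which 43 percent of contributors exhibit, corresponds to Alice’s pattern: never revising the same article twice.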
This study raises some important questions about what collaboration may actually look like in Wikipedia and may cast doubt on idealized notions of open, community–generated knowledge. Rather than reflecting the contributions and expertise of a large group of people, the typical article in Wikipedia reflects the efforts of a relatively small group of users (median of 12) making a relatively small number of edits (median of 21). Further, the nature of revisions made and user contribution histories suggest that the majority of revisions made by users are micro–structural, stylistic, or typographical and, therefore, may have little impact on the validity of article content.
Beyond the questions that these findings raise for collaboration within Wikipedia itself, those interested in similarly open, community–developed projects, including organizations that seek to use wiki–type technologies for documentation purposes and educational endeavors that seek to use wiki–type technologies to support collaborative writing and project development, should consider how accurately the final products (e.g., articles) of a wiki truly reflect the quality and depth of collaboration taking place within them. And, it seems prudent that any attempts at understanding underlying collaborative efforts taking place with the aid of participatory technologies, like a wiki, in the future should account for such factors as collaborative rigor and the nature of individual contributions for the resulting findings to have robust explanatory power.
About the author
Royce Kimmons is a Graduate Research Assistant and doctoral candidate in Instructional Technology at the University of Texas at Austin.
E–mail: royce [at] kimmonsdesign [dot] com
Alexa, 2010. “Alexa top 500 global sites,” at http://www.alexa.com/topsites, accessed 24 April 2010.
T. Chesney, 2006. “An empirical examination of Wikipedia’s credibility,” First Monday, volume 11, number 11, at http://firstmonday.org/htbin/cgiwrap/bin/ojs/index.php/fm/article/view/1413/1331, accessed 1 December 2011.
J. Giles, 2005. “Internet encyclopaedias go head to head,” Nature, volume 438, number 7070, pp. 900–901. http://dx.doi.org/10.1038/438900a
J. Jones, 2008. “Patterns of revision in online writing: A study of Wikipedia’s featured articles,” Written Communication, volume 25, number 2, pp. 262–289. http://dx.doi.org/10.1177/0741088307312940
A. Kittur, E. Chi, B. Pendleton, B. Suh, and T. Mytkowicz, 2007. “Power of the few vs. wisdom of the crowd: Wikipedia and the rise of the bourgeoisie,” alt.CHI at 25th Annual ACM Conference on Human Factors in Computing Systems (CHI 2007), at http://www.parc.com/publication/1749/power-of-the-few-vs-wisdom-of-the-crowd.html, accessed 1 December 2011.
V. Kostakis, 2009. “Identifying and understanding the problems of Wikipedia’s peer governance: The case of inclusionists versus deletionists,” First Monday, volume 15, number 3, at http://www.uic.edu/htbin/cgiwrap/bin/ojs/index.php/fm/article/view/2613/2479, accessed 18 March 2010.
A. Lih, 2004. “Wikipedia as participatory journalism: Reliable sources? Metrics for evaluating collaborative media as a news resource,” paper presented at the Fifth International Symposium on Online Journalism (16–17 April; Austin), at http://jmsc.hku.hk/faculty/alih/publications/utaustin-2004-wikipedia-rc2.pdf, accessed 1 December 2011.
D. Lindsey, 2010. “Evaluating quality control of Wikipedia’s feature articles,” First Monday, volume 15, number 4, at http://firstmonday.org/htbin/cgiwrap/bin/ojs/index.php/fm/article/view/2721/2482, accessed 1 December 2011.
Wikipedia, 2010. “Wikipedia,” at http://en.wikipedia.org/wiki/Wikipedia, accessed 3 June 2011.
D. Wilkinson and B. Huberman, 2007. “Assessing the value of cooperation in Wikipedia,” First Monday, volume 12, number 4, at http://firstmonday.org/htbin/cgiwrap/bin/ojs/index.php/fm/article/view/1763/1643, accessed 1 December 2011.
Received 6 June 2011; revised 9 September 2011; accepted 28 November 2011.
“Understanding collaboration in Wikipedia” by Royce Kimmons is licensed under a Creative Commons Attribution–NonCommercial–ShareAlike 3.0 Unported License.
First Monday, Volume 16, Number 12 - 5 December 2011