
This research essay was authored by Eva-Maria Schillinger, a student in the master's course Open Organizations and Organizing Openness at the University of Innsbruck.
As a worldwide, open, and collaborative knowledge-sharing platform, Wikipedia has largely replaced the need for physical encyclopedias for many. From children using it for their first school presentations to university professors and academics, the readership is broad and diverse. And since Wikipedia prides itself on being a space open for anyone to collaborate, one would expect the large community of editors and contributors to be similarly diverse, but this is not the case. While multiple dimensions of diversity could be examined in the case of Wikipedia, gender is the most visible and prominent facet. Diversity among users, however, is not the only thing that is scarcer than expected: there is also a lack of inclusivity towards perspectives and alternative opinions, which in some cases leads to misrepresentation. Recognizing the lack of diversity among editors and the resulting bias in the content displayed is crucial for any attempt to increase inclusivity in both areas, which, according to Wikipedia's principle of equality, should be the goal of the organization and its vast community. Finding the causes of this exclusion is the next step. This essay examines potential exclusionary effects of boundaries around and within Wikipedia and how they inhibit inclusivity for both contributors and content. In total, four boundaries and their potential for exclusion are discussed. The focus lies on the English version of Wikipedia to keep an appropriate scope.
Various Types of Boundaries
The first set of boundaries to be examined is boundaries of conduct. This encompasses all formal and informal rules and guidelines that aim to guide and regulate conduct and behaviour within the community. Some of these rules have an excluding effect, especially on the diversity of opinions and perspectives. One that stands out in particular is the Neutral Point of View (NPOV) rule. The goal of the NPOV rule is to ensure that articles are written in a neutral way. Its flaw is that there is no truly neutral point of view, so while the intention behind it is pure, it is often instrumentalized in an attempt to discipline and silence users who hold an opinion different from one's own. This has happened on many occasions; for instance, users tried to skew the telling of the Philippine-American War by labelling parts of the article as either pro- or anti-American and demanding that it be changed according to their own opinions (Luyt, 2012). But even when the NPOV rule is not used in a malicious manner, it still leads to a narrowing of perspectives by allowing dominant facts to be presented as objective truths, and by favouring certain approaches that appear neutral over others. Articles on ageing, for example, are written mostly from a biomedical perspective, and there is little to no room for intersectionality or alternative approaches to the concept, such as a cultural perspective on ageing. This results in ageing being presented as an illness that needs to be combatted instead of being seen in a cultural or social context (Gauthier, 2017).
While Wikipedia does not award its users an official expert status, there are community members who are regarded as experts on certain topics. This could be because of a certain number of edits in their area, competencies and experience listed on their user profiles, or simply because they are perceived as knowledgeable about a topic. These boundaries of expertise on Wikipedia are drawn broadly, and more people are included in this expert circle than would be the case in the outside world. Many people who have gained this informal status within the community lack the depth of knowledge and understanding required to be an actual expert, yet their opinions are still held in high regard, and because of this status they have the power to drown out opposing claims, since part of the community will follow their lead. This poses a risk, as it can lead to the exclusion of opinions and perspectives that would be valid but are not in line with what the experts believe to be true. One of the most dominant contributors to the article on the history of the Philippines was regarded as an expert because of his numerous edits on the article and a textbook chapter he had written on the topic. His perspective influenced the tone of the article and the opinions of many users, and the claims he made were rarely questioned. When both his contributions and his writings were examined more closely, it turned out that he did not have a sufficient understanding of the history of the Philippines and was not qualified to speak on the topic as an expert, even though he appeared to be competent (Luyt, 2012). Awarding this expert status to users who may not possess the necessary expertise is dangerous, as it gives them the power to influence both other people's opinions and the corresponding Wikipedia pages in their area of perceived expertise without being questioned to an appropriate degree.
The most powerful formal excluding effects are found in the vertical boundaries within the Wikipedia community. They manifest in the form of different roles that Wikipedia users are awarded, depending on how long their accounts have existed, how many edits they have made, and, for some roles, whether their formal request to fill the role was granted by the community through consensus-based decision-making. These roles include registered and auto-confirmed users, extended-confirmed users who are able to edit certain protected articles, and administrators and bureaucrats. Administrators hold the right to block and unblock other users and to delete or protect pages, and they have certain editing privileges. Bureaucrats can add members to the bureaucrat group and remove users from the administrator group. At first glance, this is a governance tool that empowers the community to regulate itself by electing members to fill those positions, and this is also what it is meant to be. But it also means that some users have significantly more power than others.
This power imbalance in the community can be abused, and those with more power inevitably impose their personal biases on the rest of the community. This manifests in the judgement administrators are expected to exercise when disputes require blocking, deletion, or other measures (Wikipedia contributors, n.d.). These judgements are of course affected by the personality of the administrator making them and will reflect any bias they may hold. The imbalance thus leads to the active exclusion of users through blocking or banning, where, for instance, women have been blocked for reporting harassment (Paling, 2015), as well as to the proactive exclusion of people from these groups by not awarding all community members similar levels of power. Furthermore, bias for and against individuals and within certain topics results in the additional exclusion of people and content.
Lastly, mental boundaries pose an underlying and insidious danger to inclusiveness. Wikipedia has an image of being a "men's club", and it seems to have a culture to match. While not all areas of Wikipedia are the same, there is an undoubted gender gap across all disciplines, and there are frequent instances of blatant misogyny and harassment towards female editors. Scholars estimate the share of male editors at 85% (of editors who disclosed their gender) (Hinnosaar, 2019), and among users with a higher number of edits the gap widens: only 5% of editors with more than 500 edits are women (Paling, 2015). Among readers of Wikipedia, by contrast, there is no noticeable gender gap (Collier, 2012), so the website attracts both men and women as readers, but contributing seems to be more attractive to men than to women. Furthermore, the Arbitration Committee, which is responsible for last-resort conflict resolution, is mostly male. Wikipedia has started multiple initiatives to find ways to close the gender gap, but none of them have yielded noteworthy results.
Why is closing the gender gap so difficult?
There are multiple theories on why closing the gap is so difficult, and why it exists at all. Some involve individual mental boundaries, such as lower confidence in one's own competence and a general dislike, found in many women, for the confrontational and quarrelsome culture of Wikipedia (Collier, 2012). There are, however, also collective mental boundaries within the community that drive away female users. Since the majority of people in the community are men and the content has been edited predominantly by men, this not only has a deterrent effect on women who are or would like to be part of the community; it can also be expected to result in bias in the content of Wikipedia and in the representation of topics and persons (Adams et al., 2019).
While most of this happens silently, there are cases of misogyny that reveal a radicalization in small parts of the community. A female member of the community filed a complaint against another user because he was posting pornographic pictures claiming they were hers. Not only was she disregarded, but the Arbitration Committee also decided that since she had made complaints against others in the past, she must be the problem. This resulted in her being blocked from contributing (Paling, 2015). She had also brought up the idea of an all-female space within the Gender Gap Task Force, an initiative by Wikipedia, because she felt the space was being infiltrated by men who claimed there was no gap and no problem. She was met with strong resistance, with many users claiming they would be "fighting this to death" (Paling, 2015). Another female user reported being treated noticeably differently after disclosing her gender; she had previously used a gender-neutral username and pronouns. She claimed that whenever she then tried to incorporate intersectional perspectives or notions of feminism, reactions were harsh and interactions quickly turned insulting, with terms such as "Feminazi" being used (Paling, 2015). These mental boundaries and biases affect not only Wikipedia's inclusiveness of women but also which topics are covered in more detail and how content is written and presented.
This essay shows that boundaries of Wikipedia, some formal and initially drawn with good intentions, some emergent and informal, contribute to the exclusion of editors and perspectives. These boundaries and their effects lead to a lack of inclusion of diverse contributors, and while only the dimension of gender was discussed here, this is an intersectional issue. Furthermore, bias, fake expertise, and misuse of the NPOV rule result in narrow coverage of topics. This lack of inclusion is particularly dangerous because Wikipedia presents itself as an open and equal space for all. If the excluding effects and their results are not considered, biased content can be mistaken for objective truth, and alternatives are forgotten. Awareness is the first step, and Wikipedia has acknowledged some of these issues. Beyond that, there is a need for a culture shift in the community. While only small parts of it are as extreme as portrayed above, the overall culture is in large part a result of the predominance of male editors, and it is to be expected that this culture is not as open and attuned to diversity issues as it could be. This culture shift is not only the responsibility of Wikipedia as an organisation; it concerns anyone who is part of the community. And whether this change can be achieved, or whether it is merely a Sisyphean task that will never come to fruition, is a matter of time, collective effort, and participation.
References
- Adams, J., Brückner, H., & Naslund, C. (2019). Who Counts as a Notable Sociologist on Wikipedia? Gender, Race, and the "Professor Test." Socius, 5. https://doi.org/10.1177/2378023118823946
- Collier, B., & Bear, J. (2012, February). Conflict, confidence, or criticism: An empirical examination of the gender gap in Wikipedia. In Proceedings of the ACM 2012 Conference on Computer Supported Cooperative Work (pp. 383–392). New York.
- Gauthier, M. T., & Sawchuk, K. (2017). Not notable enough: feminism and expertise in Wikipedia. Communication and Critical/Cultural Studies, 14(4), 385–402. https://doi.org/10.1080/14791420.2017.1386321
- Hinnosaar, M. (2019). Gender inequality in new media: Evidence from Wikipedia. Journal of Economic Behavior and Organization, 163, 262–276. https://doi.org/10.1016/j.jebo.2019.04.020
- Jemielniak, D. (2016). Wikimedia movement governance: the limits of a-hierarchical organization. Journal of Organizational Change Management, 29(3), 361-378.
- Luyt, B. (2012). The inclusivity of Wikipedia and the drawing of expert boundaries: An examination of talk pages and reference lists. Journal of the Association for Information Science and Technology, 63(9), 1868–1878. https://doi.org/10.1002/asi.22671
- Paling, E. (2015, October 30). The Sexism of Wikipedia. The Atlantic. https://www.theatlantic.com/technology/archive/2015/10/how-wikipedia-is-hostile-to-women/411619/
- Wikipedia contributors. (n.d.). Wikipedia:User access levels. Wikipedia. https://en.wikipedia.org/wiki/Wikipedia:User_access_levels
- Wikipedia contributors. (2023). Gender bias on Wikipedia. Wikipedia. https://en.wikipedia.org/wiki/Gender_bias_on_Wikipedia