<!DOCTYPE html>
<html xmlns="http://www.w3.org/1999/xhtml" lang="en" xml:lang="en">
<head>
<meta charset="utf-8" />
<meta name="generator" content="pandoc" />
<meta name="viewport" content="width=device-width, initial-scale=1.0, user-scalable=yes" />
<title>Trust &lt;in&gt; formation</title>
<style type="text/css">
code{white-space: pre-wrap;}
span.smallcaps{font-variant: small-caps;}
span.underline{text-decoration: underline;}
div.column{display: inline-block; vertical-align: top; width: 50%;}
</style>
<style type="text/css">
body{
background-color: rgba(96, 0, 255, 0.2);
max-width:750px;
font-size:18px;
line-height:1.6;
margin:2em;
}
blockquote{
font-size:125%;
margin:2em;
width:calc(100% + 2em);
}
pre, code{
font-size:14px;
line-height:1.25;
color:blue;
white-space:pre-wrap;
}
pre#title{
white-space:pre;
}
h1, h2, h3, h4, h5, h6{
line-height:1.1;
}
hr{
border:0;
border-bottom:1px solid black;
}
img{
max-width:100%;
}
small, small a{
display:inline-block;
line-height:1.2;
}
</style>
</head>
<body>
<header>
<h1 class="title">Trust &lt;in&gt; formation</h1>
</header>
<nav id="TOC">
<ul>
<li><a href="#introduction">0. Introduction</a></li>
<li><a href="#relation-between-fake-news-and-wikipedia">1. Relation between fake news and Wikipedia</a></li>
<li><a href="#wikipedia-internal-struggle-against-vandalists">2. Wikipedia's internal struggle against vandals</a><ul>
<li><a href="#cluebotng">2.1 CluebotNG</a></li>
<li><a href="#ores-a-feminist-machine-learning-project-the-immune-system-of-wikimedia">2.2 ORES, a feminist machine learning project (the immune system of Wikimedia)</a></li>
</ul></li>
<li><a href="#exercises">3. Exercises</a><ul>
<li><a href="#wikilabels">3.1 WikiLabels</a></li>
<li><a href="#writing-under-ores">3.2 Writing under ORES</a></li>
</ul></li>
<li><a href="#links">Links:</a></li>
</ul>
</nav>
<pre id="title" style="font-weight:bold;">
_____ _ __ _ __ __ _ _
|_ _| | | / /(_) \ \ / _| | | (_)
| | _ __ _ _ ___ | |_ / / _ _ __ \ \ | |_ ___ _ __ _ __ ___ __ _ | |_ _ ___ _ __
| || '__|| | | |/ __|| __| < < | || '_ \ > > | _|/ _ \ | '__|| '_ ` _ \ / _` || __|| | / _ \ | '_ \
| || | | |_| |\__ \| |_ \ \ | || | | | / / | | | (_) || | | | | | | || (_| || |_ | || (_) || | | |
\_/|_| \__,_||___/ \__| \_\|_||_| |_|/_/ |_| \___/ |_| |_| |_| |_| \__,_| \__||_| \___/ |_| |_|
</pre>
<p>Workshop during IMPAKT Festival<br /> Sunday 28th of October<br /> 13:30h - 15:30h<br /> <a href="http://impakt.nl/nl/festival/programme/workshops/trust-formation/">impakt.nl/nl/festival/programme/workshops/trust-formation</a></p>
<blockquote style="color:red;">
Please use this document for your further reading only. Due to the sensitive nature of the interview, we still need to confirm details with Amir before making the document public.
</blockquote>
<h1 id="introduction">0. Introduction</h1>
<p>In the current media climate, we find ourselves turning to Wikipedia as a reliable source of information. This might be surprising for someone who has known Wikipedia since its early beginnings, when reputable academic publications were seeding doubt about it. Even in high schools, Wikipedia was considered an unreliable source, and it’s fresh in our memories how we were not allowed to quote from or refer to the platform. Meanwhile, the discussion around article quality, ‘objectivity’ and vandalism on Wikipedia is still ongoing, and becomes especially challenging when the daily edit count reaches half a million.</p>
<p>However, with the tireless help of volunteers from the Wikimedia community, <em>‘all bugs are shallow’</em><a href="#fn1" class="footnote-ref" id="fnref1"><sup>1</sup></a>. Especially when you add machine learning algorithms to the mix.</p>
<p>Machine learning can be regarded as the ability of a computer programme to spot patterns in large sets of data using advanced statistical models. Wikipedia uses machine learning to build forms of automation that help editors with the assessment of new edits, unfinished articles or acts of vandalism. One of the reasons why we decided to focus on the machine learning services that Wikimedia is working on is the way the work presents itself: ORES, the project under consideration today, is described by Aaron Halfaker as a feminist-inspired project. We became curious to dive deeper into the project, to explore what this feminist approach contains, how it deals with consensus making and what it could teach us when thinking about ethical ways to do machine learning.</p>
<p>Last year, in November 2017, we met Amir Sarabadani, who is one of the developers of ORES working at Wikimedia Deutschland. We invited him in the context of an algorithmic literature event in Brussels, where we asked him to speak about ORES. It was our first introduction to the project.</p>
<p>On the sidelines of that event, Cristina Cochior and Femke Snelting interviewed Amir. We propose to read a part of the interview together, in which Amir speaks about objectivity and vandalism within Wikipedia.</p>
<blockquote>
<p><strong>Cristina</strong>: <em>I thought it was interesting that yesterday in your talk you referred to the concepts of ‘subjective’ and ‘objective’. You said that the assessment of vandalism is subjective, because it comes down to the personal interpretation of what vandalism is, but then you referred also to the objectivity principle on which Wikipedia is based. You seemed to view these two concepts as coexisting on the same platform. Did I read that correctly?</em></p>
<p><strong>Amir</strong>: <em>Well, the thing about Wikipedia, especially the policies, is that it’s not very objective. It’s very open to interpretations and it’s very complicated. I don’t know if I told you but there is a law of Wikipedia that says ignore all rules. It means do everything you think is correct and if there’s a problem and you’re violating anything, it could be that you come to a conclusion that maybe we should change that law. It happens all the time. If there’s a consensus, it can be changed. There is a page on Wikipedia that’s called Five Pillars and the five pillars say that except these five pillars, you can change everything. Although I don’t think that’s very objective, everything is subjective on Wikipedia. But when there is an interaction of lots of people, it becomes more natural and objective in a way because there is a lot of discussion and sometimes there are people who try to change others’ opinions about some issues. When this happens, it makes everything more neutral. In another way, [the sounds are hard to distinguish at this point] by making battles, they both fight and the result is something neutral, which to some degree is not great, but…</em></p>
<p><strong>Cristina</strong>: <em>Do you think that the result is aiming to be neutral?</em></p>
<p><strong>Amir</strong>: <em>The result is aiming to be neutral. And it is, because of the integration of lots of people that are cooperating with each other and who are trying to get things done in a way that doesn’t violate policies. So they tolerate things that they don’t like in the article or sometimes they even add them [themselves] to make it more neutral.</em></p>
<p><strong>Femke</strong>: <em>Could you give any examples of that?</em></p>
<p><strong>Amir</strong>: <em>The biggest problem is usually writing about religion. I have seen people who are against a religion and try to make a criticism and when they are writing the article they try to add something in there, like a defence of Muslims, in order to make it more neutral. The people who contribute usually value these pillars, including the pillar of neutrality.</em></p>
<p><strong>Femke</strong>: <em>I was wondering – when you were speaking – you could say there is vandalism that is not targeted, that is about ‘can I write something’, but there is also vandalism that is about breaking something to signal disagreement or irritation with a certain topic. Do you ever look at the relation between where the vandalism goes and what topics are being attacked?</em></p>
<p><strong>Amir</strong>: <em>Well, I didn’t, but there are lots of topics about that and things that people have strong feelings about are always good targets for vandals. There is always vandalism around things that have strong politics. It can be sports, it can be religion, it can be any sensitive subject like homosexuality, abortion, in these matters it happens all the time. One thing that I think about is that sometimes when people are reading articles on Wikipedia, sometimes it’s outside of their comfort zone, so they try to change the article and bring it back in, instead of expanding it.</em></p>
<p><strong>Cristina</strong>: <em>I was wondering actually about this sense of ownership that some editors have over their articles. A lot of the reverts and debates that happen behind the scenes of a specific article are due to the fact that one person started creating the page and put a lot of effort into it, and someone else wants to implement some changes with which the first person does not agree. Do you think that the fact that there is only one face of Wikipedia is related to that? If you would have the possibility to have multiple readings of one page, then there would also be more views on one subject.</em></p>
<p><strong>Amir</strong>: <em>There have been lots of debates about this. I don’t know if you know Vox, the media company from the United States? One thing that they tried to implement was to make a version of Wikipedia that is customisable. For example, if you are pro-Trump, you are given a different article than someone who is a democrat. But immediately you can see the problem, it diverges people. Like what Facebook is doing right now, making people live inside their bubbles. I think this is the reason why the people on Wikipedia are fighting against anything that has this divisive effect.</em></p>
<p><strong>Femke</strong>: <em>That’s one way of seeing it, I understand. If you would make multiple wikis, you would support, and that’s happening a lot, that separation of world views that is algorithmically induced. I understand why that raises concern. But on the other hand, there is somehow the self-defined need to always come to a consensus. I’m wondering if this is always helpful for keeping the debate alive.</em></p>
<p><strong>Amir</strong>: <em>For Wikipedia, they knew that consensus is not something you can always reach. They invented a process called Conflict Resolution. When people talk and they see that they cannot reach any consensus, they ask for a third-party opinion. If they couldn’t find any agreement with the third-party opinion, they call for the mediator. But mediators do not have any enforcement authority. If mediators can resolve the conflict, then it’s done, otherwise the next step is arbitration. For example the case of Chelsea Manning. What was her name before the transitioning? I think it’s Brandon Manning, right? So, there was a discussion over what the name on Wikipedia should be: Chelsea Manning or Brandon Manning. So there was lots of transphobia in the discussion and when nothing worked, it went to an ArbCom (Arbitration Committee). An arbitration committee is like a very scary place, it has a court and they have clerks that read the discussion, and the outcome was obviously that it should stay Chelsea Manning. It’s not like you need to reach consensus all the time, sometimes consensus will be forced on you. Wikipedia has a policy saying “Wikipedia is not”. One of the things Wikipedia is not is a place for democracy.</em></p>
<p><strong>Femke</strong>: <em>The Chelsea Manning case is interesting, I didn’t think about it. Is this - let’s say - verdict archived in the article somewhere?</em></p>
<p><strong>Amir</strong>: <em>The cases usually happen on the ArbCom page and on the page of the case itself. But finding this in the discussion is hard.</em></p>
</blockquote>
<h1 id="relation-between-fake-news-and-wikipedia">1. Relation between fake news and Wikipedia</h1>
<p>In the past, Wikipedia has divided researchers with regard to how trustworthy it is as a reference. The comparison between Wikipedia and Encyclopedia Britannica was often made and still is to this day. See for example <a href="http://www.hbs.edu/faculty/Publication%20Files/15-023_e044cf50-f621-4759-a827-e9a3bf8920c0.pdf">this recently published article</a> by the Harvard Business School, which looks at a selection of 4000 articles and concludes that there is a bias on the English Wikipedia towards the US Democratic party, but that this bias is not very strong. Although research like this still needs to be examined more deeply (while it is a fact that the majority of the English Wikipedia editors come from the US, there are plenty of editors from elsewhere who do not relate to the Democrat/Republican divide), it shows that doubt is still cast on the encyclopedia. There’s even a page on Wikipedia that documents the many times the site has been called into question<a href="#fn2" class="footnote-ref" id="fnref2"><sup>2</sup></a>.</p>
<p>However, in the last few years, as the term ‘fake news’ has received increasing coverage, Wikipedia has gained a new image. Now it is also pictured as a trustworthy knowledge platform. See for example the titles of the following articles:</p>
<ul>
<li><a href="https://www.wired.co.uk/article/fake-news-wikipedia-arbitration-committee">Inside Wikipedia’s volunteer-run battle against fake news</a></li>
<li><a href="https://www.thetimes.co.uk/article/youtube-fights-fake-news-with-wikipedia-frkpc8nm2">YouTube uses Wikipedia to fight fake news</a></li>
<li><a href="https://www.vice.com/en_uk/article/4w54bd/a-wikipedian-told-us-how-wikipedia-stays-reliable-in-the-fake-news-era">A Wikipedian Explains How Wikipedia Stays Reliable in the Fake News Era</a></li>
<li><a href="https://www.digitaltrends.com/social-media/facebook-about-this-article-wikipedia/">Facebook’s new fake news tool is partially powered by Wikipedia</a></li>
<li>…</li>
</ul>
<p>The discussion around this topic has also reached Wikipedia editors. There is already a page that collects all the fake news websites that the editors encounter: <a href="https://en.wikipedia.org/wiki/List_of_fake_news_websites" class="uri">https://en.wikipedia.org/wiki/List_of_fake_news_websites</a>. The volunteers also reached an agreement to ban the Daily Mail<a href="#fn3" class="footnote-ref" id="fnref3"><sup>3</sup></a> and Breitbart<a href="#fn4" class="footnote-ref" id="fnref4"><sup>4</sup></a> from being used as references in articles. However, as happens with large-scale organisations, this decision was taken by a small group and it might take a while until it travels across the whole community.</p>
<p>Wikipedia is still in the top 10 of most visited websites in the world, which says a lot about the visibility and influence of the project. If, for example, someone decides that they want to make a practical joke and changes the national anthem of Bulgaria to Despacito, this will in turn prompt Siri to adopt this false idea<a href="#fn5" class="footnote-ref" id="fnref5"><sup>5</sup></a>.</p>
<p>Wikipedia’s governing structure is often compared to democratic principles. However, the heavy bureaucratic structures of Wikipedia are hierarchical and the overall goal of the project is to reach consensus, not to follow the majority’s opinion. The case of the Chelsea Manning page and the Arbitration Committee is a good example of this. The Wikipedia page <a href="https://en.wikipedia.org/wiki/Wikipedia:What_Wikipedia_is_not">What Wikipedia is not</a> describes how Wikipedia is not a democracy:</p>
<blockquote>
<em>‘Wikipedia is not an experiment in democracy or any other political system. Its primary (though not exclusive) means of decision making and conflict resolution is editing and discussion leading to consensus—not voting (voting is used for certain matters such as electing the Arbitration Committee). Straw polls are sometimes used to test for consensus, but polls or surveys can impede, rather than foster, discussion and should be used with caution’.</em>
</blockquote>
<p>Wikipedia’s striving for objectivity and a <a href="https://en.wikipedia.org/wiki/Wikipedia:Neutral_point_of_view">Neutral Point of View</a>, which is the general guideline for Wikipedia editors, shows that not all content is accepted. This has sparked a lot of debate and backlash from critics, but also from fringe actors. As a result, groups of people with fringe political positions who did not feel represented by Wikipedia have decided to make their own encyclopedias. Some examples:</p>
<ul>
<li><a href="https://infogalactic.com/info/Main_Page"><em>Infogalactic</em></a> <em>“The Planetary Knowledge Core”</em>, an alt-right version of Wikipedia, started by Vox Day.</li>
<li><em>There’s <a href="https://www.metapedia.org"><em>Metapedia</em></a>, a wiki with a white supremacist bent, which is published in 16 languages but is especially popular in Hungary and Germany. (On Metapedia, Barack Obama isn’t just a former president, he’s a “mixed race former president,” and the Holocaust is a genocide only according to “politically correct history.”)</em></li>
<li><em>Or there’s <a href="https://www.conservapedia.com/Main_Page"><em>Conservapedia</em></a>, a version aimed at religious conservatives and created by Andrew Schlafly, son of the conservative activist Phyllis Schlafly.</em><a href="#fn6" class="footnote-ref" id="fnref6"><sup>6</sup></a></li>
<li>Or <a href="https://rationalwiki.org/wiki/Main_Page"><em>Rationalwiki</em></a>.</li>
<li>Or the trolling project <a href="https://encyclopediadramatica.rs/Main_Page"><em>Encyclopedia Dramatica</em></a>.</li>
</ul>
<p>Now, we could do a little <em>Chelsea Manning test</em>, where we look for a page about Chelsea Manning on each of these wikis to see what name they chose for the article.</p>
<h1 id="wikipedia-internal-struggle-against-vandalists">2. Wikipedia's internal struggle against vandals</h1>
<p>But how does a website that is open to editing float above the fake news waters? Wikipedia has a lot of systems in place that attempt to identify edits made with damaging intentions. For example, there are ways to limit the ‘editability’ of a page to editors whose account has existed for a certain amount of time, or to editors who have a higher status (e.g. administrators). There are also multiple types of machine learning algorithms in place that are mobilised to detect vandalism. We can mention two: CluebotNG and ORES.</p>
<h2 id="cluebotng">2.1 CluebotNG</h2>
<p><img src="https://upload.wikimedia.org/wikipedia/commons/5/50/US_Air_Force_021105-O-9999G-001_Spirit_in_the_blue_sky.jpg" /> <small>Profile image of CluebotNG. Source: <a href="https://en.wikipedia.org/wiki/User:ClueBot_NG" class="uri">https://en.wikipedia.org/wiki/User:ClueBot_NG</a></small></p>
<p>CluebotNG is a machine learning programme that uses neural networks to identify and revert vandalism on Wikipedia. It is made and maintained by Christopher Breneman (Crispy1989), Tim1357, and Jacobi Carter (Cobi). It was filed to become a bot account on Monday October 25, 2010. Since its approval, it has been active on the English Wikipedia.</p>
<p>CluebotNG raised a heated discussion when it was first proposed as an alternative way to fight vandalism. The community opposed the initial parameters, which allowed the algorithm to catch more vandalism while at the same time generating many side-effects, such as false positives.</p>
<p>Currently the bot is very popular with the community, receiving a lot of praise for being very efficient in fighting vandals. However, some might argue that despite its popularity, CluebotNG is driving newcomers away through its categorical decision making (something is either vandalism or not, and if it is, the edit will be reverted directly).</p>
<h2 id="ores-a-feminist-machine-learning-project-the-immune-system-of-wikimedia">2.2 ORES, a feminist machine learning project (the immune system of Wikimedia)</h2>
<p><img src="https://upload.wikimedia.org/wikipedia/commons/c/c3/ORES_edit_quality_flow.svg" /> <small>Diagram that is used on the ORES page to illustrate the project. Source: <a href="https://www.mediawiki.org/wiki/ORES#/media/File:ORES_edit_quality_flow.svg" class="uri">https://www.mediawiki.org/wiki/ORES#/media/File:ORES_edit_quality_flow.svg</a></small></p>
<p><strong>what is it?</strong></p>
<p>ORES (which stands for Objective Revision Evaluation Service) is a feminist machine learning service developed at the Wikimedia Foundation, the non-profit organisation that hosts Wikipedia and other free knowledge projects. The project was developed to maintain the quality of Wikipedia at the large scale it operates on right now: currently Wikipedia is edited half a million times per day. To empower its volunteers in the processing of all these edits, ORES is built both to make quality control more efficient and to make Wikipedia a more welcoming place for new editors. Wikipedia, especially the English one, is considered to be a hostile environment for newcomers: very often, when users do not directly comply with the guidelines of Wikipedia, their first edits will be reverted.</p>
<p><small>Speedy deletion anecdote: <a href="https://en.wikipedia.org/wiki/User_talk:Clco" class="uri">https://en.wikipedia.org/wiki/User_talk:Clco</a></small></p>
<p><em>By highlighting edits that need review, we can reduce the overall reviewing workload of our volunteers by a factor of 10. This turns a 270 hours per day job into a 27 hours per day job. This also means that Wikipedia could grow by 10 times and our volunteers could keep up with the workload.</em> <a href="https://wikimediafoundation.org/2018/10/10/mitigating-biases-in-artificial-intelligences-the-wikipedian-way/" class="uri">https://wikimediafoundation.org/2018/10/10/mitigating-biases-in-artificial-intelligences-the-wikipedian-way/</a></p>
<p>ORES is not a machine learning product, like for example Siri or Google Translate. Rather, it is a machine learning service. This means that ORES is not built to perform a specific task at a specific place. Instead, it provides results as data endpoints (a so-called API, an Application Programming Interface) that other projects and tools can use. It curates and highlights, but doesn’t revert any edits that are made. It only provides the machine learning calculations, in order to stimulate and support the many other tools that can be built on top of them.</p>
<p>An example API request to ORES requires an <em>input</em> (a so-called revid number, which represents a specific edit) and returns a list of numbers as output (structured in the JSON data format). You can choose what kind of prediction you would like to get back by choosing a specific <em>model</em> (Edit Quality or Article Quality). A full API request looks like the following: <a href="https://ores.wmflabs.org/v3/scores/enwiki/?models=draftquality%7Cwp10&revids=34854345%7C485104318">https://ores.wmflabs.org/v3/scores/enwiki/?models=draftquality%7Cwp10&revids=34854345%7C485104318</a>.</p>
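<p>To get a feeling for what such a request returns, here is a minimal sketch in Python (ours, not part of ORES) that asks the two Edit Quality models about one of the revids from the example above. The response structure shown in the comment is abridged:</p>
<pre><code>import requests

# Ask ORES for two Edit Quality predictions about one edit on the English Wikipedia.
response = requests.get(
    "https://ores.wmflabs.org/v3/scores/enwiki/",
    params={"models": "damaging|goodfaith", "revids": 34854345},
)
scores = response.json()["enwiki"]["scores"]["34854345"]

# Each model answers with a prediction plus the probabilities behind it, roughly:
# {"score": {"prediction": false, "probability": {"false": 0.97, "true": 0.03}}}
for model, result in scores.items():
    print(model, result["score"]["prediction"], result["score"]["probability"])</code></pre>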
<p>One thing that makes ORES different from many other machine learning projects is that the team chose not to focus on profiling the user. Instead of rating the edits that an editor makes on the basis of all the previous work that the editor has done, the ORES team decided to only work with information that comes from the edits themselves, or the activities around the edits, such as deletion. ORES looks at four different aspects of the editing process:</p>
<p><em>Edit Quality</em></p>
<ul>
<li><em>reverted</em> (Basic support) - Based on the history of reverted edits, using wordlists of BWDS, badwords and stopwords in different languages <a href="https://meta.wikimedia.org/wiki/Research:Revision_scoring_as_a_service/Word_lists" class="uri">https://meta.wikimedia.org/wiki/Research:Revision_scoring_as_a_service/Word_lists</a></li>
<li><em>damaging/goodfaith</em> (Advanced support) - Based on a training process done by editors, using the WikiLabels tool <a href="https://labels.wmflabs.org/" class="uri">https://labels.wmflabs.org/</a></li>
</ul>
<p><em>Article Quality</em></p>
<ul>
<li><em>draftquality</em> (Curation support) - Based on deletion logs and written deletion comments <a href="https://en.wikipedia.org/wiki/Special:Log?type=delete&user=&page=&wpdate=&tagfilter=" class="uri">https://en.wikipedia.org/wiki/Special:Log?type=delete&user=&page=&wpdate=&tagfilter=</a></li>
<li><em>wp10</em> (Assessment scale support) - Based on the structure of an article, following the Content Assessment scheme <a href="https://en.wikipedia.org/wiki/Wikipedia:Content_assessment" class="uri">https://en.wikipedia.org/wiki/Wikipedia:Content_assessment</a></li>
</ul>
<p>More detailed information about the different types of models can be found here: <a href="https://www.mediawiki.org/wiki/ORES" class="uri">https://www.mediawiki.org/wiki/ORES</a>. The same model names also appear in the API, as the sketch below shows.</p>
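<p>As a quick way to check which of these models a wiki actually supports, the scores endpoint can also be asked without any revids. Again a small sketch of ours, assuming the same public endpoint:</p>
<pre><code>import requests

# Without revids, the v3 scores endpoint describes the models available per wiki.
info = requests.get("https://ores.wmflabs.org/v3/scores/enwiki/").json()
print(sorted(info["enwiki"]["models"]))
# expected to include: damaging, goodfaith, draftquality, reverted, wp10</code></pre>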
<p><strong>who is behind it?</strong></p>
<p>The service is developed by the Wikimedia Scoring Platform team, which currently consists of 6 people:</p>
<pre><code>Aaron Halfaker (Principal Research Scientist, Team Lead)
Amir Sarabadani (Software Engineer (WMDE))
Adam Wight (Software Engineer (WMF))
James Hare (Associate Product Manager (WMF))
Max Klein (Software Engineer (WMF))
Marius Hoch (Software Engineer (WMDE))</code></pre>
<p>More information about the team can be found here: <a href="https://www.mediawiki.org/wiki/Wikimedia_Scoring_Platform_team" class="uri">https://www.mediawiki.org/wiki/Wikimedia_Scoring_Platform_team</a></p>
<p><strong>how does it process edits?</strong></p>
<p>To predict whether a new Wikipedia edit is written in good faith and whether it is damaging, ORES needs to work with <em>features</em>. Features are information points that can be extracted from the edits or their surroundings; they function as the informative data that ORES uses to make its calculations.</p>
<p>For example, not only the content of the edit itself, but also the short summaries that editors attach to their edits are regarded as features. The same goes for information about the type of page on which the edit is made, or even the section or element in which the edit appears.</p>
<p>The following code from the Wikimedia Github shows what type of features are used to calculate Edit Quality.</p>
<pre>
damaging = wikipedia.page + \
    wikitext.parent + wikitext.diff + mediawiki.user_rights + \
    mediawiki.protected_user + mediawiki.comment + \
    badwords + informals + dict_words
</pre>
<p><small><a href="https://github.com/wikimedia/editquality/blob/master/editquality/feature_lists/enwiki.py">Source of this code</a></small></p>
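<p>To make the composition above concrete: each of those names refers to a list of feature objects, so the feature set of a model is assembled by concatenating smaller, reusable lists. The toy sketch below imitates that pattern with made-up strings; the real lists live in the <code>feature_lists</code> files linked below:</p>
<pre><code># Toy illustration of how feature lists compose; these names are ours,
# not the real revscoring feature objects.
page = ["page.is_articleish", "page.is_mainspace"]
diff = ["diff.badwords_added", "diff.words_added"]
user = ["user.is_anon", "user.is_bot"]

# '+' on Python lists concatenates, so a model's feature set is just
# the union of smaller building blocks.
damaging_features = page + diff + user
print(len(damaging_features), "features would feed the 'damaging' model")</code></pre>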
<p>On the following pages, you can further inspect how features are defined:</p>
<ul>
<li><a href="https://github.com/wikimedia/editquality/blob/master/editquality/feature_lists/wikipedia.py">wikipedia.py</a> - to request the type of page features (e.g. articleish, mainspace, draftspace)</li>
<li><a href="https://github.com/wikimedia/editquality/blob/master/editquality/feature_lists/wikitext.py">wikitext.py</a> - to request the type of article section or element in which the edit is made (e.g. external link, heading or template)</li>
<li><a href="https://github.com/wikimedia/editquality/blob/master/editquality/feature_lists/mediawiki.py">mediawiki.py</a> - to request the comment of the edit and the user rights of the editor (e.g. sysop, bot or eliminator)</li>
</ul>
<p>Disclaimer: this is something we are still trying to grasp.<a href="#fn7" class="footnote-ref" id="fnref7"><sup>7</sup></a></p>
<p><strong>why is it a feminist endeavor?</strong></p>
<p>In the initial blogpost that Aaron Halfaker wrote to introduce ORES, Aaron mentions how the design of ORES is based on feminist principles.<a href="#fn8" class="footnote-ref" id="fnref8"><sup>8</sup></a><a href="#fn9" class="footnote-ref" id="fnref9"><sup>9</sup></a></p>
<p>Some other reasons why ORES is considered feminist in its approach:</p>
<p><em>Infrastructural approach</em>: One of the main ideas behind ORES was not to build a ‘full stack’ machine learning project. It is deliberately designed as a service that other developers can build on top of. By doing this, the threshold for making new tools or utilities is much lower, as the groundwork of gathering and labelling data, as well as training the machine learning models, has already been taken care of.</p>
<p><em>Transparency</em>: as opposed to many other machine learning algorithms, the decisions that have led to the making of the models are openly available on the many pages that document the project. Despite the labyrinthine expansion of the documentation, which can be quite confusing to navigate, the members of the team are very quick to respond to queries and try to include as many volunteers in the process as possible through participation in Wikimedia-organised hackathons.</p>
<p><em>Refusal to profile</em>: Cluebot NG profiles editors by looking up whether the user is anonymous and, if so, where the IP address of the user is coming from, or the time at which the edit is made. In contrast, ORES specifically refers only to the quality and intention of the text itself.</p>
<p><em>Newcomer friendliness</em>: ORES began in the first place as a response to the number of newcomers who felt discouraged from participating in the editing process by anti-vandalism bots, such as Cluebot NG, or by the hostile attitude of some long-time editors.</p>
<p><em>Participation in the decision making process</em>: editors are allowed to play an active role in the algorithm’s mechanisms. The WikiLabels tool was specially made to invite editors to train the <em>Edit Quality</em> model. This also became a useful tool for understanding the decision making process of ORES better. Another tool that is currently being developed by the Scoring Platform team is JADE, a Mediawiki extension that editors can use to annotate their editing work. This information is then connected to the ORES workflow, to create a feedback loop that re-trains ORES on the basis of the latest editing work.</p>
<h1 id="exercises">3. Exercises</h1>
<h2 id="wikilabels">3.1 WikiLabels</h2>
<p>First we will start with a 15-minute exercise in which we will be introduced to the WikiLabels system. WikiLabels is a human computing service for Wikipedia, developed by the ORES developers to involve Wikipedia users in the process of validating ORES results. We will be validating a set of edits on two parameters: <em>good faith</em> and <em>damaging</em>.</p>
<p>We will start by reading a little bit more from the interview with Amir, in which he starts to speak about the ideas behind good faith and damaging edits.</p>
<blockquote>
<p><strong>Amir</strong>: <em>The thing we are trying to tackle in terms of Wikipedia editing, we are trying to make a model not just in terms of binary separation. We have a good faith model, which predicts with the same system between one and zero that an edit has been made in good faith or not. For example, if you see if an edit is damaging, but it was made with a good intention. You see many people that want to help, but because they are new, they make mistakes. We try to tackle this by having another model. So if an edit has both a high vandalist score and a high bad intent score, we can remove it with bots and we can interact with people who make mistakes but have a good intention.</em></p>
<p><strong>Cristina</strong>: <em>And how do you see the good faith principle in relation to neutrality?</em></p>
<p><strong>Amir</strong>: <em>I think it’s completely related and I think it comes down to is this user trying to help Wikipedia or not: this is our brainstorm.</em></p>
<p><strong>Femke</strong>: <em>If you talk about the distinction between good faith and bad faith, it is still about faith-in-something. If you plot the faith according to the line of a neutral point of view, you’re dealing with a different type of good faith and goodness than if you plot the faith along the line of wanting more points of view.</em></p>
<p><strong>Amir</strong>: <em>I see. I think good faith means good intent. By defining what is ‘good’ in this way, we are following the principles of the whole Wikipedia, good is helping people. Although, it is a very subjective term, and what we are trying to do right now is to make some sort of survey. To take out things that are very computative and can’t be measured easily, like quality, and ask people whether they think an edit looks good or bad. To make things more objective, to make things come together from the integration of observations of lots of people. Obviously, there are a lot of gray areas.</em></p>
</blockquote>
<p>Exercise:</p>
<ol type="1">
<li>Go to <a href="https://labels.wmflabs.org/">labels.wmflabs.org</a> and choose one of the available wikis</li>
<li>Choose to work on the ‘edit quality’ workset</li>
<li>Label a few edits for 5 minutes.</li>
<li>Recap & reflection. Try to describe your understanding of ‘good faith’ and ‘damaging’ on the etherpad in 5 minutes.</li>
</ol>
<h2 id="writing-under-ores">3.2 Writing under ORES</h2>
<p>For the purpose of the workshop, we’ve written a script that uses ORES to rate a Wikipedia article. We will be writing the article together; the article topic is chosen by us as a group. ORES will rate it using the following two parameters (see the sketch after this list):</p>
<ul>
<li>goodfaith</li>
<li>damaging</li>
</ul>
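<p>The actual script can be found in the code repo linked at the end of this reader. As a rough sketch of its general shape (the helper names here are ours, and the polling is a simplification): it keeps asking the MediaWiki API for the latest revision of the sandbox page and passes that revision to ORES:</p>
<pre><code>import time
import requests

API = "https://en.wikipedia.org/w/api.php"
ORES = "https://ores.wmflabs.org/v3/scores/enwiki/"
PAGE = "User:Trustinformationws/sandbox"

def latest_revid():
    # Ask the MediaWiki API for the newest revision id of the sandbox page.
    params = {"action": "query", "prop": "revisions", "titles": PAGE,
              "rvprop": "ids", "format": "json"}
    pages = requests.get(API, params=params).json()["query"]["pages"]
    return list(pages.values())[0]["revisions"][0]["revid"]

def scores(revid):
    # Ask ORES for the goodfaith and damaging predictions of one revision.
    params = {"models": "goodfaith|damaging", "revids": revid}
    result = requests.get(ORES, params=params).json()
    return result["enwiki"]["scores"][str(revid)]

seen = None
while True:
    revid = latest_revid()
    if revid != seen:  # a new edit has landed, so score it
        seen = revid
        for model, result in scores(revid).items():
            print(revid, model, result["score"]["probability"])
    time.sleep(5)</code></pre>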
<p>Amir described 3 types of vandals: <em>newcomers, cyber-warriors (who push their agendas/messages), and people doing it for fun</em>. We will be working with the first two types. One half of the group will be writing as newcomers, the other as cyber-warriors.</p>
<p>To start:</p>
<ol type="1">
<li>Log into the English Wikipedia at <a href="https://en.wikipedia.org/wiki/Main_Page">en.wikipedia.org</a>. Your collective Wikipedia account: username: trustinformationws, password: trust</li>
<li>Go to the Sandbox page of this user <a href="https://en.wikipedia.org/wiki/User:Trustinformationws/sandbox">en.wikipedia.org/wiki/User:Trustinformationws/sandbox</a>.</li>
<li>The Mediawiki software is designed for collaborative writing. However, it is not designed for collective editing in real time: if multiple editors edit an article at the same time, the software does not know how to process the different edits and raises a ‘conflict error’. To circumvent such conflicts, we will write in a chain format.</li>
<li>We divide the group in two: TEAM ONE, the newcomers; TEAM TWO, the cyber-warriors.</li>
<li>Before we begin, we choose a topic that we will be writing the article on.</li>
<li>We take 10 minutes in which each editor individually writes a text in the spirit of their team.</li>
<li>We do an initial round of adding edits to the Sandbox page. Each goodfaith and damaging rating will be written down in the spreadsheet we have prepared.</li>
<li>Everyone reads what is on the page at the end of the first round.</li>
<li>From here on, we will do editing rounds of 1 minute. In the meantime, the scores will continue to be logged.</li>
<li>The first team that reaches 10 points, wins.</li>
</ol>
<p>! There will be a timekeeper who keeps an eye on the time and warns you when a minute has passed !</p>
<p>! If you get a badfaith you are out !</p>
<ol start="11" type="1">
<li>Recap & reflection. Now that the article is finished, we will finish the exercise by trying to describe our understanding of ‘good faith’ and ‘damaging’ on the etherpad.</li>
</ol>
<h1 id="links">Links:</h1>
<ul>
<li>Official Wikimedia ORES page: <a href="https://www.mediawiki.org/wiki/ORES" class="uri">https://www.mediawiki.org/wiki/ORES</a></li>
<li>Code repo of the writing exercise: <a href="https://git.vvvvvvaria.org/mb/trust-in-formation" class="uri">https://git.vvvvvvaria.org/mb/trust-in-formation</a></li>
<li>Notes of Amir’s lecture during Algoliterary Encounters, November 2017 in Brussels: <a href="https://pad.constantvzw.org/p/algoliterary.lectures" class="uri">https://pad.constantvzw.org/p/algoliterary.lectures</a> (from line 265)</li>
<li>April Glaser & Will Oremus (2018) <em>Fact and Fiction on Wikipedia - The Wikimedia Foundation’s executive director explains how the site’s volunteer editors fight misinformation and harassment.</em> <a href="http://www.slate.com/articles/podcasts/if_then/2018/10/wikimedia_s_katherine_maher_on_how_wikipedia_sorts_fact_from_misinformation.html?via=gdpr-consent" class="uri">http://www.slate.com/articles/podcasts/if_then/2018/10/wikimedia_s_katherine_maher_on_how_wikipedia_sorts_fact_from_misinformation.html?via=gdpr-consent</a></li>
<li><a href="https://commons.wikimedia.org/wiki/File:Using_AI_to_keep_Wikipedia_open.webm">Using AI to keep Wikipedia open - video</a></li>
</ul>
<hr />
<p><code>$ curl https://pad.vvvvvvaria.org/trust%3Cin%3Eformation.css/export/txt > stylesheet.css && curl https://pad.vvvvvvaria.org/trust%3Cin%3Eformation/export/txt | pandoc -f markdown -t html --toc -H stylesheet.css -s -o reader.trust-in-formation.html && rm stylesheet.css</code></p>
<section class="footnotes">
<hr />
<ol>
<li id="fn1"><p><em>“With enough eyes, all bugs are shallow.” - Linus’ Law</em> mentioned by Eric S. Raymond in <em>The Cathedral And The Bazaar</em> (1999)<a href="#fnref1" class="footnote-back">↩</a></p></li>
<li id="fn2"><p><a href="https://en.wikipedia.org/wiki/Reliability_of_Wikipedia" class="uri">https://en.wikipedia.org/wiki/Reliability_of_Wikipedia</a><a href="#fnref2" class="footnote-back">↩</a></p></li>
<li id="fn3"><p>Article in The Guardian reporting the story: <a href="https://www.theguardian.com/technology/2017/feb/08/wikipedia-bans-daily-mail-as-unreliable-source-for-website" class="uri">https://www.theguardian.com/technology/2017/feb/08/wikipedia-bans-daily-mail-as-unreliable-source-for-website</a>. How the Daily Mail saw this ban: <a href="https://www.dailymail.co.uk/news/article-4280502/Anonymous-Wikipedia-activists-promote-warped-agenda.html" class="uri">https://www.dailymail.co.uk/news/article-4280502/Anonymous-Wikipedia-activists-promote-warped-agenda.html</a><a href="#fnref3" class="footnote-back">↩</a></p></li>
<li id="fn4"><p>How Breitbart received the news: <a href="https://www.breitbart.com/tech/2018/10/03/breitbart-blacklisted-from-use-on-wikipedia-as-reliable-source/" class="uri">https://www.breitbart.com/tech/2018/10/03/breitbart-blacklisted-from-use-on-wikipedia-as-reliable-source/</a><a href="#fnref4" class="footnote-back">↩</a></p></li>
<li id="fn5"><p>This has actually happened: <a href="https://www.reddit.com/r/softwaregore/comments/74epbw/siri_thinks_the_national_anthem_of_bulgaria_is/" class="uri">https://www.reddit.com/r/softwaregore/comments/74epbw/siri_thinks_the_national_anthem_of_bulgaria_is/</a> Thanks again to Amir for telling us about this great case.<a href="#fnref5" class="footnote-back">↩</a></p></li>
<li id="fn6"><p>Alexis Sobel Fitts (2017) <em>Welcome to the Wikipedia of the Alt-Right</em>, in Wired <a href="https://www.wired.com/story/welcome-to-the-wikipedia-of-the-alt-right/" class="uri">https://www.wired.com/story/welcome-to-the-wikipedia-of-the-alt-right/</a><a href="#fnref6" class="footnote-back">↩</a></p></li>
<li id="fn7"><p>Many thanks to Amir for his patience to guide us through the documentation.<a href="#fnref7" class="footnote-back">↩</a></p></li>
<li id="fn8"><p>Blogpost on the Wikimedia Foundation website announcing the initial release of ORES: Aaron Halfaker & Dario Taraborelli (2015) <em>Artificial intelligence service “ORES” gives Wikipedians X-ray specs to see through bad edits</em> <a href="https://wikimediafoundation.org/2015/11/30/artificial-intelligence-x-ray-specs/" class="uri">https://wikimediafoundation.org/2015/11/30/artificial-intelligence-x-ray-specs/</a><a href="#fnref8" class="footnote-back">↩</a></p></li>
<li id="fn9"><p>Designing The Numbers That Govern Wikipedia: Aaron Halfaker on Machine Learning in Large-Scale Open Production <a href="https://civic.mit.edu/2016/02/05/designing-the-numbers-that-govern-wikipedia-aaron-halfaker-on-machine-learning-in/" class="uri">https://civic.mit.edu/2016/02/05/designing-the-numbers-that-govern-wikipedia-aaron-halfaker-on-machine-learning-in/</a><a href="#fnref9" class="footnote-back">↩</a></p></li>
</ol>
</section>
</body>
</html>