

Article

21 Dec 2018

Author:
Amnesty International

Troll Patrol findings: Using crowdsourcing, data science & machine learning to measure violence & abuse against women on Twitter


These findings are the result of a collaboration between Amnesty International and Element AI, a global artificial intelligence software product company. Together, we surveyed millions of tweets received by 778 journalists and politicians from the UK and US throughout 2017, representing a variety of political views and media spanning the ideological spectrum...

Amnesty International has repeatedly urged Twitter to publicly share comprehensive and meaningful information about reports of violence and abuse against women, as well as other groups, on the platform, and how they respond to it. On 12 December 2018 Twitter released an updated Transparency Report in which it included for the first time a section on 'Twitter Rules Enforcement'. This was one of Amnesty International’s key recommendations to Twitter, and we see the inclusion of this data as an encouraging step. We are disappointed, however, that the information provided in the transparency report does not go far enough...

Our study found that 7.1% of tweets sent to the women in the study were problematic or abusive. This amounts to 1.1 million problematic or abusive mentions of these 778 women across the year, or one every 30 seconds on average. Women of colour were more likely to be impacted, with black women disproportionately targeted with problematic or abusive tweets.
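The headline rate follows directly from the quoted figures. A minimal sanity-check sketch, assuming only the published numbers above (1.1 million abusive mentions over 2017 and a 7.1% abusive share); the implied total-mentions estimate is derived here for illustration and is not a figure published in the study:

```python
# Sanity check of the rates quoted in the study findings.
# Assumes only the published figures: 1.1 million problematic or abusive
# mentions across 2017 and a 7.1% share of all tweets sent to the 778 women.

ABUSIVE_MENTIONS_PER_YEAR = 1_100_000   # from the study
ABUSIVE_SHARE = 0.071                   # 7.1% of all mentions

SECONDS_PER_YEAR = 365 * 24 * 60 * 60   # 31,536,000 seconds

# Average interval between problematic or abusive mentions.
seconds_per_mention = SECONDS_PER_YEAR / ABUSIVE_MENTIONS_PER_YEAR
print(f"One abusive mention roughly every {seconds_per_mention:.0f} seconds")
# -> about 29 seconds, consistent with "one every 30 seconds on average"

# Implied total number of mentions analysed (an estimate, not a published figure).
implied_total_mentions = ABUSIVE_MENTIONS_PER_YEAR / ABUSIVE_SHARE
print(f"Implied total mentions analysed: ~{implied_total_mentions:,.0f}")
```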

... Amnesty International and Element AI’s experience using machine learning to detect online abuse against women highlights the risks of leaving it to algorithms to determine what constitutes abuse... Human judgement by trained moderators remains crucial for contextual interpretation... Amnesty International’s full set of recommendations to Twitter is available here.

Timeline