

Article

30 May 2024

Author:
Daniel Holznagel, Verfassungsblog

Follow Me to Unregulated Waters! Are Major Online Platforms Violating the DSA’s Rules on Notice and Action?

The Digital Services Act (DSA) aims to make the internet safer. Among other things, the DSA empowers users to notify platforms about illegal content and, crucially, to make them take action on it: the so-called "notice and action" mechanism of Art. 16 DSA. Just a few weeks ago, the European Commission opened proceedings against Meta concerning, among other things, its reporting mechanisms. The Commission evidently suspects infringements by Meta in this area, though no details have been published to date.

In this article, I will demonstrate how some major platforms fail to properly implement the DSA's rules on notice and action mechanisms. In my view, many platforms unduly nudge potential notice-senders (hereinafter: reporters) towards submitting weak, largely unregulated Community Standards flags, while at the same time deterring users from submitting (strong) notices regulated under the DSA.

TikTok will serve as the illustrative example. [...]

The findings in this article are based on a collaboration with the human rights organisation HateAid, which has launched broad investigations into the reporting mechanisms of all major platforms. [...]

Such a platform design does not only violate Art. 16(1) DSA (reporting mechanisms that are not "easy to access and user-friendly"). In my view, it also leads to follow-on mistakes: as we have seen, platforms nudge reporters into submitting "weak" ToS-flags instead of "strong" DSA-notices. However, this only determines the de facto treatment of notices: from the platforms' perspective, most reports will end up in the bucket of mere "unregulated" ToS-flags. Yet how a platform categorises a given report does not ultimately determine that report's true legal nature. [...]

Conclusion and Outlook: DSA-Proceedings?

Through the design of their reporting flows, platforms nudge users into submitting weak ToS-flags, with the result that fewer (strong) DSA-notices fall under the DSA's regulatory oversight. Such a design might be described as a "follow me to unregulated waters" approach. In my view, this amounts to a violation of Art. 16(1) DSA. It may also lead to follow-on mistakes when DSA-notices are erroneously not treated as such. [...]
