

Commentary

1 December 2020

Author: Ashley Nancy Reynolds, Business & Human Rights Resource Centre

Human rights due diligence within the tech sector: Developments and challenges

This blog is part of our series on Beyond Social Auditing.

In August 2017, the Myanmar military launched a campaign of ethnic cleansing against the Rohingya, resulting in more than 10,000 deaths, widespread sexual violence, and 730,000 refugees.

More than a year after the violence began, Facebook released the findings of a human rights impact assessment that determined the social media platform was used to “foment division and incite offline violence” in Myanmar, as UN investigators and human rights groups had argued. BSR (Business for Social Responsibility) detailed how Facebook employees failed to stop misinformation, hate speech and inflammatory posts that were part of a systematic campaign to target the Rohingya.

This is not the first time technology and social media companies have been involved in human rights abuses: from exacerbating racial discrimination and facilitating ICE raids to contributing to the arrest of peaceful protesters in Russia and spreading hate speech against LGBT+ people in the MENA region.

While the United Nations has recognized the role of new technologies in the realization of economic, social and cultural rights, it also acknowledges that digital technologies can further exacerbate inequality, restrictions on freedom of expression and discrimination, and it urges tech companies to conduct due diligence and impact assessments.

Although it is clear technology companies can have far-reaching and devastating impacts, human rights due diligence efforts struggle to fully capture them, despite emerging attempts to do so. Will they be able to account for the myriad and widespread effects that technology and digital activities have on human rights?

Challenges to a Tech-Sector HRIA

“In the past couple of years, it has become evident that tech companies must evolve and improve their due diligence practices in order to adequately respect human rights,” explained Emil Lindblad Kernell, Adviser on Human Rights and Business at the Danish Institute for Human Rights. “Companies must ‘know’ their impacts and ‘show’ they have processes in place to address them. Human rights impact assessments can be a key due diligence tool in those efforts. The digital transition of societies all over is at full speed and the negative impacts will follow unless adequate measures are in place. There is little time to waste.”

Human rights impact assessments (HRIAs) identify actual and potential human rights implications of business projects and activities. For instance, a company operating a factory might find that while it has positive impacts on the right to work and the right to housing, it also has severe negative impacts on the right to water, the right to form and join unions, and the right to an adequate standard of living. Although a number of methodologies and guides on HRIA are currently available, very few specifically target the digital sphere and its unique characteristics.

Assessing the impacts of technology (particularly digital products and services such as social media platforms) presents a number of challenges. First, technology evolves faster than the law, making exact obligations difficult to pin down. “Revenge porn” and “upskirt” photos were widespread long before legal developments made them a crime. This raises questions of responsibility and culpability not only for the uploader, but also for the platform hosting and potentially spreading the content.

Application and enforcement of laws are also extremely difficult, given the global nature of the internet and the difficulty of holding users across borders accountable for violations. Effectively monitoring for hate speech, for instance, would require a large number of staff proficient in many languages. Different cultures may also hold different understandings of rights such as free speech; in these cases, which rights, and which understandings, are tech companies expected to adhere to? Finally, it can be difficult to predict how a new technology will be used or what effects it will have, especially if that technology has little precedent.

A Way Forward

But progress is being made. A few days ago, the Danish Institute for Human Rights released its guidance on human rights impact assessments of digital activities. It has also released a guidance document on addressing digital technologies in National Action Plans on Business and Human Rights.

According to the Institute, in-depth HRIAs in the tech sector must be adequately scoped to a particular country or regional context, product, and/or user base. Rightsholder engagement is essential, and adequate resources and time must be dedicated to engaging with those who might be impacted by the technology in question.

Other organizations weighing in include the Global Network Initiative (GNI), which recently hosted a panel on human rights due diligence in ICT. JustPeace Labs published a report on conflict sensitivity for the tech industry, identifying risks including the weaponization of social media, facial recognition and state surveillance, and AI-driven warfare.

Consulting firms such as Article One and BSR have conducted several assessments for tech-sector companies, including Google, Yahoo, Facebook, and Intel. Following an assessment of its salient human rights issues, Microsoft commissioned an HRIA of its artificial intelligence technology. While some of these assessments are publicly available, many are fully or partially confidential.

“The tech industry has spent the last few years grappling with the unintended consequences of its products and services,” said Chloe Poynton, Co-Founder of Article One. “The industry’s embrace of HRIAs shows the value of the global human rights framework, which allows the management of borderless technology to be grounded in internationally recognized norms. At the same time, HRIAs provide the ability for tech companies to understand the unique experiences of rightsholders in specific locations and ensure those insights are brought back to engineers and corporate policy teams to better mitigate risks on an ongoing basis. Our hope is that more technology companies embrace the UNGP framework and work to prioritize the experience of rightsholders, ensuring the products they develop contribute to a better, more open world.”

