US social media giant Facebook has invested proportionately far less in preventing hate speech and misinformation on its platform for users in India and other developing, non-English-speaking nations than it has for users in the United States, according to US media reports based on internal documents made public on Saturday.
The United States accounts for less than 10% of Facebook’s daily users but receives 84% of the company’s global budget for this purpose, The Washington Post reported, based on a 2020 company document. Only 16% went to the “Rest of World,” which includes India, France and Italy.
One Facebook document viewed and cited by The Washington Post showed that the company had not developed algorithms for Hindi and Bengali, the world’s fifth and seventh most widely spoken languages. The report cited a company spokesperson as saying hate-speech classifiers for Hindi and Bengali were introduced in 2018 and 2019, and systems for detecting violence and incitement in the two languages were added only as recently as 2021.
Facebook was acutely aware throughout of the weakness of its systems for monitoring and taking down hate speech and misinformation in India.
In February 2019, the company’s India staff ran an integrity test of the network through a dummy user whose profile identified her as a 21-year-old woman in north India. The user’s newsfeed was at first flooded with soft-porn posts. However, in the aftermath of the Pulwama terrorist attack, India’s retaliatory airstrike and the resulting escalation in India-Pakistan tensions, the feed filled with unsolicited pro-Indian-government propaganda and anti-Muslim hate speech.
The dummy test was called an “integrity nightmare” by Facebook in an internal document.
Documents cited by The Washington Post in this report included copies of internal papers, memos and reports that whistleblower Frances Haugen provided to the US stock market regulator, the Securities and Exchange Commission, with redacted versions going to the US Congress. Some of these papers formed the basis for earlier reports by The Wall Street Journal that the social media giant prioritised profit over public safety. Those reports led to one of the most damaging Congressional hearings yet for Facebook.
Facebook spokesperson Dani Lever told The Washington Post that the company had made “progress” and had “dedicated teams working to stop abuse on our platform in countries where there is heightened risk of conflict and violence. We also have global teams with native speakers reviewing content in over 70 languages along with experts in humanitarian and human rights issues”.
In India, Lever said, the “hypothetical test account inspired deeper, more rigorous analysis of our recommendation systems”.