Staffers alerted Facebook to misinformation, polarising content in India, but company ignored warnings




The report notes that in just three weeks, the test user's news feed had "become a near constant barrage of polarizing nationalistic content, misinformation, and violence and gore". The test user followed only the content recommended by the platform's algorithm. The account was created on February 4; it did not 'add' any friends, and its news feed was initially "pretty empty".

"The quality of this content is… not ideal," the employee's report said, adding that the algorithm often suggested "a bunch of softcore porn" to the user.

Over the next two weeks, and especially following the February 14 Pulwama terror attack, the algorithm began suggesting groups and pages centred mostly on politics and military content. The test user said he/she had "seen more images of dead people in the past 3 weeks than I have seen in my entire life total".

To tackle inflammatory content, the researchers recommended that Facebook invest more resources in building out the underlying technical systems that detect and enforce on inflammatory content in India, the way human reviewers might.