Meta removed 180,000 pieces of Australian content over harmful health misinformation

Meta removed 180,000 pieces of Australian content last year for containing ‘harmful health misinformation.’ Meta also said that content removed in other countries benefited Australians as well, with a total of 11 million individuals affected internationally.

Last year, more than 180,000 pieces of content were removed from Australian Facebook and Instagram pages and accounts for breaching Meta’s Community Standards on harmful health misinformation, up from 110,000 in 2020.

The figures were published in Meta’s Australia-focused transparency report, released through the Digital Industry Group (DIGI) as part of its monitoring of the Australian Code of Practice on Disinformation and Misinformation. In the fourth quarter of 2021, Meta recorded more than 3.5 million visits by Australians to its Covid-19 information centre across its platforms.

Meta removed nearly 3,000 accounts, pages, and groups between the start of the pandemic and June 2021 for violating its policies against spreading Covid-19 and vaccine misinformation.

The report also acknowledged that some observers are concerned social media spreads disinformation while encouraging echo chambers and polarisation.

According to the report, academic research on the role of algorithms in political and social polarisation has produced contradictory results, with many studies finding that social media is not the major driver of polarisation. “However, Meta aspires to provide people more transparency and control over how algorithms rank and distribute material. To that end, we’ve added an extra responsibility to give transparency about the work we perform here,” Meta said.

In February 2021, Google, Microsoft, TikTok, Twitter, Facebook, and Redbubble (an online marketplace for print-on-demand items) signed DIGI’s voluntary code of conduct to combat the spread of disinformation in Australia. The code has since attracted two additional signatories: Apple and Adobe.

Signatories to the Australian Code of Practice on Disinformation and Misinformation have committed to measures to combat online falsehoods, including establishing and implementing policies on their approach and allowing users to report content that may violate those policies.

The code also commits each company to publishing transparency reports on its efforts across its respective platforms.

DIGI added, “If we can deepen awareness of these complex challenges over time, industry, government, civil society, and academia can all continually improve their policies and practices.”

According to Sunita Bose, managing director of DIGI, the code drives greater transparency and public accountability in tech companies’ efforts to counter harmful disinformation, and DIGI looks forward to working with the next government and others to maximise its impact.

She said, “The 2021 transparency reports provide updated statistics on misinformation in Australia, as well as a range of initiatives to remove and flag false claims and accounts, promote trustworthy content, and engage with researchers.”
