Asianet News English

Twitter, Facebook get to work; focus on regaining trust and transparency

First Published Feb 26, 2021, 2:33 PM IST

While Twitter has said it will invest in a simple and quick appeals process to address mistakes promptly, Facebook has assured that it will make adjustments where needed.

Social media giants are finally talking seriously about accountability and transparency.
 

Both Facebook and Twitter are now on course to not just improve content moderation but also address the trust issues that are now more pronounced than ever.
 

Addressing Twitter's 2021 Analyst Day, CEO Jack Dorsey confessed that many people do not trust the platform.
 

Acknowledging this, Dorsey said that trust can only be earned through transparency, accountability, reliability and choice.


"Twitter is making a lot of progress with accountability by owning up mistakes and correcting, and reliability by following published principles and not wavering. However, it lacks transparency and giving people more choice and control," Dorsey said.
 

He went on to say that Twitter will make its content moderation practices more transparent and give people more controls to moderate their interactions.
 

On Thursday, Indian IT and Communications Minister Ravi Shankar Prasad said social media platforms would now have to establish a grievance redressal mechanism for receiving and resolving complaints from users.
 

In a country where Twitter witnessed 74 per cent year-on-year growth in monetizable daily active usage in Q4 2020, such a requirement can't be brushed under the carpet. And Twitter agrees.


According to Twitter's Legal, Policy and Trust chief Vijaya Gadde, the platform needs to continue investing in a simple and quick appeals process to address mistakes promptly.
 

Gadde reiterated that transparency, consistency and proportionality are key to building trust in Twitter's enforcement process.
 

Noting that we live in a highly dynamic global regulatory environment, she said there is significant interest from governments around the world in content moderation policies and their enforcement.
 

In 2020, there was a rapid increase in proposed legislation related to content moderation, privacy and data protection, Gadde said, adding that the trend will continue in 2021.


As for enforcement actions, Twitter says it has already been developing a broader range of remedies including interstitials, account labels, Tweet labels, and de-amplification.
 

These remedies, Gadde said, allow more speech to remain on the platform while providing a toolkit of enforcement actions that can be tailored and proportionate to the violation.
 

These remarks come even as Twitter is engaged in a face-off with the Indian government over hateful and anti-India content spread on the platform. The two also locked horns over the blocking of certain accounts that the government claimed were projecting India in a bad light and inciting violence in the name of fighting for farmers' rights.


Facebook, meanwhile, will look to improve its content moderation.
 

Facebook has taken note of the recommendations made by the Oversight Board -- an independent body instituted by Facebook to tackle issues and make suggestions on content moderation.
 

Nick Clegg, Vice President for Global Affairs and Communications at Facebook, announced that the company has been working on a Transparency Centre, which will educate people about the platform's Community Standards and their enforcement.
 

The Transparency Centre is expected to be up and running in the coming months.


Facebook is also working on tweaking its use of automation with regard to the detection and removal of harmful content.
 

"Technology allows us to detect and remove harmful content before people report it, sometimes before people see it. We typically launch automated removals when they are at least as accurate as those by content reviewers. We will continue to evaluate which kind of reviews or appeals should be done by people and which can be safely handled by automated systems, and how best to provide transparency about how decisions were made," Clegg said.
 

Facebook's Oversight Board has already backed the suggestion made by Union Minister Ravi Shankar Prasad that users be informed when their content is removed by automation.
 

Facebook says it will continue testing the impact of telling people more about how an enforcement action decision was made. It has assured that it will continue to monitor enforcement and appeals systems to ensure that there is an appropriate level of manual review and will make adjustments where needed.

