- Facebook will allow people to live-stream attempts they make at 'self-harm' because it “doesn’t want to censor or punish people in distress”.
- Facebook has a strange way of deciding what counts as credible violence.
- Photos of animal abuse and torture are deemed 'permissible' and can be shared for awareness purposes.
Last week, we saw how Facebook's Community Standards aim to keep insensitive, hurtful and hateful content at bay. However, a leaked company document now reveals that this is only half the truth. The leak comes at a time when an increasing number of suicides are being broadcast live on Facebook.
So, does all the talk about a suicide prevention tool now seem like hogwash? According to the document obtained by The Guardian, Facebook will allow people to live-stream attempts at 'self-harm' because it “doesn’t want to censor or punish people in distress”.
The video will then be taken down, unless Facebook wants to keep it, of course. "What’s best for the safety of people watching these videos is for us to remove them once there’s no longer an opportunity to help the person. We also need to consider newsworthiness, and there may be particular moments or public events that are part of a broader public conversation that warrant leaving up," the leaked document reveals, according to the report.
Facebook also states that videos showing violent deaths need not be removed, as they can help create awareness of mental health issues. In effect, parents and relatives of the deceased must live with this horrific and disturbing content remaining freely available to all.
Facebook has a strange way of deciding what counts as credible violence. For instance, "Someone shoot Trump" qualifies as a credible threat, but 'to snap a bitch's neck, make sure to apply all your pressure to the middle of her throat' can be allowed. "Little girl needs to keep to herself before daddy breaks her face" is likewise not treated as credible violence.
As for animal cruelty, 'permissible' posts that include photos of animal abuse and torture can be shared for awareness purposes. Pictures showing non-sexual physical abuse of children may not necessarily be deleted either, unless there is a 'sadistic' element.
Last Updated: 31 Mar 2018, 6:53 PM