Facebook is at fault for the viral suicide video, not TikTok.
Despite Facebook's promise to remove graphic or violent videos and livestreams — a commitment it reiterated after the Christchurch mosque attacks — Facebook clearly failed to act on a live stream of a man taking his own life.
Trigger Warning: This article discusses suicide and a related incident on Facebook.
Just yesterday, a live stream of a man taking his own life circulated on TikTok and other platforms, including Twitter. Almost immediately, everyone was quick to blame TikTok for the video making the rounds on social media — so much so that high schools warned students against using TikTok, and warnings were issued about the platform. But what many didn't realise was that the video originated from a Facebook live stream, NOT TikTok.
Facebook is responsible for this video. It allowed the man in question to live stream a very distressing situation in which he ended up taking his own life. People then reportedly downloaded the live stream and re-uploaded it to other platforms, often for abusive purposes, including trolling and memes. What's more, when the footage was reported to Facebook, the company did not remove it until a day later.
Unlike Facebook, TikTok is swiftly taking action against the viral video: removing uploads and banning users who continue to post it. TikTok has warned people against watching, sharing or interacting with the video, as seen in the tweet below. Facebook, meanwhile, has made no comment on the viral video, despite the fact that removing it is its responsibility.
The questions remain: How did Facebook, despite promising to ban violent or suicidal livestreams, allow this live stream to stay up long enough for people to share it onto other platforms, including TikTok and Twitter? Why did Facebook decide this footage did not violate its community standards, yet ban a live stream of baby owls for "nudity" — when there was no human nudity in the owl footage at all? And why does Facebook treat nudity as worse than suicide, even where human nudity is involved, given that companies like AJ Hackett offer nude bungy jumping and nudity is, after all, natural?
Both TikTok and the family of the man who tragically took his own life should take legal action against Facebook for allowing the video to spread, causing damage and further grief. TikTok has grounds to pursue Facebook for defamation, as well as for the damage caused by Facebook's failure to act on the video.
Facebook needs to do better. Facebook needs to step up — and accept responsibility. Facebook needs to learn there are consequences for allowing videos like this to remain on the platform. And more importantly, Facebook needs to be held to account when they don’t take action on graphic and distressing content.
Where to find help if you need it:
Internationally:
A global list of suicide hotlines can be found at this link.
Locally (New Zealand):
The link below includes a list of helplines for different purposes. Some are for addictions; others deal with issues like sexual assault and gender identity.