TikTok is once again facing criticism over its failure to filter indecent content from users' feeds.
Recently, a video in which an elderly man was seen dying by suicide appeared in viewers' recommended sections. The clip had been circulating on the platform since Sunday, and because of its harmless-looking thumbnail, users opened it by accident. Content of this kind should never have been left available for casual viewing.
Although TikTok claims the video first surfaced on other platforms such as Facebook, the company has now removed the disturbing footage from its service and is banning accounts that post anything related to it.
A statement by TikTok read: “Our systems have been automatically detecting and flagging these clips for violating our policies against content that displays, praises, glorifies, or promotes suicide. We are banning accounts that repeatedly try to upload clips, and we appreciate our community members who’ve reported content and warned others against watching, engaging, or sharing such videos on any platform out of respect for the person and their family. If anyone in our community is struggling with thoughts of suicide or concerned about someone who is, we encourage them to seek support, and we provide access to hotlines directly from our app and in our Safety Center.”
This incident has brought the shortcomings of TikTok’s content moderation systems to light.