If you happen to run afoul of TikTok and find your video deleted from the platform, you’ll now know why.
TikTok just announced that it’s adding “clarity” to its content removal policies going forward. In an official blog post, TikTok confirmed that it will now indicate which policy you violated when your content is removed. Previously, you received little more than a vague notice that you had broken a rule, with no real explanation.
Now, you’ll be able to see the specific portion of the community guidelines you’ve violated when reviewing a removal. You can also submit an appeal upon receiving the notification. According to TikTok, the company has been testing these notifications over the past few months and has seen a 14 percent decrease in appeals as a result.
In its July transparency report, TikTok indicated that it had been tracking the reasons it removed offensive videos from the platform. The company hasn’t offered much more detail about what users can expect from the new notification process, but the upshot is that users should no longer be left in the dark about what is and isn’t acceptable.
Meanwhile, TikTok remains embroiled in controversy over national security. With the Trump administration working toward banning TikTok, the app’s future is uncertain. Either way, the platform still has a long way to go if it’s ever going to seriously rid itself of questionable content, especially given its swath of underage users.