Chinese short-form video app TikTok reportedly shows videos of child sexual abuse to its content moderators as part of their training, according to a media report. According to Forbes, a large unsecured collection of child sexual abuse images has been made available to third-party TikTok content moderators as reference material.

Former TikTok moderator Whitney Turner said, "These parents don't know that we have this picture, this video, that this crime is stored. Had the parents known, I am pretty sure they would have set TikTok on fire." Turner worked on third-party moderation company Teleperformance's TikTok program in El Paso, Texas. Moderators there were given access to a shared spreadsheet filled with content that violated TikTok's community guidelines, including pictures of naked children and hundreds of images of them being abused. Sources told Forbes that hundreds of people at both companies had access to the document. This spreadsheet, referred to as the DRR, and other training content were stored in Lark, internal workplace software developed by TikTok's China-based parent company ByteDance, the report said. Turner also reported the matter to the Federal Bureau of Investigation (FBI), but to no avail.

A TikTok spokesperson said, "The training material has strict access controls and does not include visible examples of CSAM (child sexual abuse material)." However, the spokesperson added that the company works with third-party firms that may have their own procedures. Teleperformance also denied showing employees sexually abusive content. The report, however, said that Teleperformance showed employees graphic photos and videos as examples of what to tag on TikTok.