Instagram has failed to remove accounts that attract hundreds of sexual comments on photos of children in swimsuits or other partial clothing, even after those accounts were reported. Although the platform's parent company, Meta, claims to have a zero-tolerance policy on child exploitation, accounts flagged as suspicious have been deemed acceptable by its automated moderation tools. In one case, a researcher reported an account that shared images of children in sexualized poses. Instagram responded the same day, saying that "due to high volume" it had not been able to view the report, but that its "technology has found that this account probably doesn't go against our Community Guidelines". The researcher was then advised to block the account, unfollow it, or report it again.
Similar accounts remain active on Twitter. One account, which shared pictures of a man performing sexual acts on images of a 14-year-old TikTok influencer, was deemed not to breach Twitter's guidelines. More worrying still, the man used his posts to seek out like-minded users. "Looking to trade some younger stuff," one of his tweets read. It was deleted only after the campaign group Collective Shout publicized it. Andy Burrows, head of online safety policy at the NSPCC, called the accounts a "shop window" for pedophiles. He urged MPs to address the technical details of the online safety bill, which seeks to regulate social media companies and is due to be debated in Parliament on April 19. These companies, he said, should be required to deal not only with illegal content but also with material that is harmful yet falls short of the criminal threshold.