Instagram's new TV service recommended videos of potential child abuse
Friday - 21/09/2018 13:18
Potential child exploitation was not the only questionable content being recommended by IGTV's algorithm.
A Business Insider investigation into Instagram's new TV service found it recommending a crop of graphic and disturbing videos, including footage that appeared to depict child exploitation and genital mutilation.
Two of the videos discovered by Business Insider were reported to the police by the National Society for the Prevention of Cruelty to Children, a British children's charity. Instagram removed them five days after Business Insider reported them through the app's official reporting channel.
Instagram apologized to users who saw the videos and said it wanted to make IGTV a safe space for young people.
The findings come at a time when Facebook is under extraordinary scrutiny over inappropriate content on its platforms. Facebook and Instagram share a community-operations team, and Mark Zuckerberg's company has hired an army of 7,500 moderators and is using artificial intelligence to snuff out posts that break its guidelines.
Instagram launched IGTV in June, a move many viewed as Facebook pushing into YouTube's territory. It allows users to set up their own channels and upload videos lasting up to an hour. Anyone with an Instagram account can make a channel, and users swipe through them much as they would flick through channels on a television.
IGTV recommends content in three ways: a For You tab, which plays videos as soon as you open IGTV; a Popular section; and a Following menu, which offers videos from people you follow.
Instagram did not answer Business Insider's questions on how IGTV's algorithm recommends certain videos and why videos were suggested that appeared to show child exploitation. But it appears that the For You section recommends things users will like, possibly based on past activity. The Popular tab seems to gather trending content from across IGTV.
Users can scroll through the recommended videos by swiping left, or IGTV will automatically play the next video. It is clearly designed to encourage scrolling and continued viewing, in much the same way the YouTube algorithm recommends content through its Up Next bar.
Disturbing videos of young girls
Business Insider monitored the For You and Popular tabs for almost three weeks to establish what kinds of content IGTV's algorithm was serving up for users.
We did so in two ways: first through the accounts of this author and other Business Insider journalists, and then with an anonymous login set up as a child's account. This second account had no activity history on Instagram and a user age set to 13, the minimum age at which people can officially sign up on the app.
Within days of monitoring IGTV through Business Insider accounts, a video appeared in the For You section titled "Hot Girl Follow Me." It showed a young girl, who appeared to be 11 or 12, in a bathroom. She glanced around before moving to take her top off. Just as she was about to remove her clothing, the video ended.
The video, uploaded by a user Business Insider is not naming for legal reasons, also appeared under the Popular tab on IGTV, and was among the first videos recommended under the For You section to the child account set up by Business Insider.
The same user who uploaded the "Hot Girl Follow Me" video posted another video, titled "Patli Kamar follow me guys plzz," which was also recommended to our child Instagram account under the For You section. It featured another clearly underage girl exposing her belly and pouting for the camera.
The same two videos were separately uploaded by a different user, whom Business Insider has again chosen not to identify. This second user titled the "Hot Girl Follow Me" video "Follow me guys," and it was also circulating in IGTV's suggested posts.
Comments on the videos showed they were being recommended to other IGTV users, some of whom interpreted them as sexually suggestive.
Some condemned the videos and questioned why they had been suggested. "BRO SHE'S LIKE FUCKING 10 WHY THE FUCK IS THIS IN MY INSTAGRAM RECOMMENDED," one user said, commenting on the "Hot Girl Follow Me" video.
Others were more predatory in tone. "Superb," one user commented on the "Patli Kamar follow me guys plzz" video. "Sexy grl," another added.
The National Society for the Prevention of Cruelty to Children, which is frequently involved in law-enforcement activities around child abuse, reviewed the videos and reported them to the police. It was concerned that they could constitute illegal indecent images under UK law because they appeared to feature footage of erotic posing.
"This is yet another example of Instagram falling short by failing to remove content that breaches its own guidelines," a spokeswoman for the group said.
Business Insider reported the videos through Instagram's official reporting function. Because there was no obvious category for alerting the company to potential child exploitation, they were logged as "nudity or pornography."
The videos remained online for five days. It was only after Business Insider contacted Instagram's press office that the content was removed. By this time, the two videos — and other versions uploaded by the second user — had more than 1 million views.
Instagram left the accounts that posted the videos active, however. Business Insider asked why the accounts were left up, as Instagram has a "zero-tolerance policy" on child abuse. Instagram said the policy applied to the content and not to the account uploading it.