Instagram's Algorithms Promote Vast Pedophile Network, WSJ Investigation Finds
An investigation by the Wall Street Journal has found that Instagram's recommendation systems readily surface underage sexual content. According to the report, the platform's algorithms connected and even promoted a “vast pedophile network” whose accounts advertised the sale of illicit “child-sex images and videos”.
The report also highlighted how Instagram's recommendation systems connect users with shared interests, making it easier for members of the network to find one another. As a result, the network was able to grow and circulate its content far faster than it otherwise could have.
In response to the report, Adam Mosseri, the head of Instagram, has vowed to take action to keep such content off the platform. He has said that Instagram uses a combination of algorithms, processes, and classifiers to determine the most relevant content for each user, and that the company will take steps to ensure this kind of material is neither recommended nor shared.
The findings will come as a shock to many users who had assumed the platform was policed against such activity. They make clear that Instagram needs to do more to protect its users and to keep illegal activity, and the networks that profit from it, off its platform.