“Vast Pedophile Network” Discovered on Dem Donor Social Platform

In yet another sobering reminder that children should be kept far away from most forms of social media, a damning new report has uncovered a “vast pedophile network” on the immensely popular Instagram platform.

For the unfamiliar, Instagram — owned by Facebook parent Meta — is a more visually oriented form of social media (whereas Twitter originally focused on text and its famous 140-character limit, and Facebook sits somewhere in between).

Communication on the platform is largely relegated to the comments under photos, and that’s where The Wall Street Journal found much of this perversion hiding in plain sight.

The report, published Wednesday, said those implicated had been busted using innocuous emojis as code to help proliferate their sick network.

Those emojis included a map (“MAP” is an acronym for “minor-attracted person”), a slice of cheese pizza (“CP” doubles as shorthand for “child pornography”) and a reverse arrow next to an age (so “Age 31” in a bio actually signals a 13-year-old).

It’s all sickening stuff, but it’s made all the more ominous as the report detailed just how deep this network goes.

The Journal, in conjunction with researchers at Stanford University and the University of Massachusetts Amherst, found that Instagram, through either willful ignorance or something more disturbing, has allowed this community of pedophiles to fester, grow and network.

Worse yet?

The investigation found that, because of how social media recommendation algorithms work, a single visit to one of these pedophile profiles is enough for the app to begin flooding your feed with related accounts and suggestions.

Yes, in short, Instagram’s automation has paved the way for the system to be gamed by pedophiles.

And it’s those promotions and “follow” suggestions that make this even more horrific.

The report noted that while pedophiles have long used the darkest corners of the internet to satiate themselves, finding that material was always a conscious choice, something the pervert had to seek out.

Instagram and its algorithms essentially eliminate that conscious step by automating and streamlining it all.

The Journal added another sickening layer to this: “The researchers found that Instagram enabled people to search explicit hashtags such as #pedowhore and #preteensex and connected them to accounts that used the terms to advertise child-sex material for sale. Such accounts often claim to be run by the children themselves and use overtly sexual handles incorporating words such as ‘little slut for you.’”

Those hashtags aren’t even subtle, and yet they could easily be found on Instagram.

Alex Stamos, the head of the Stanford Internet Observatory and Meta’s chief security officer until 2018, blasted how easy it was for people with “limited access” to dive so deep into this network of pedophiles.

“That a team of three academics with limited access could find such a huge network should set off alarms at Meta,” Stamos told the Journal. “I hope the company reinvests in human investigators.”

Brian Levine, director of the UMass Rescue Lab, which researches and combats online child victimization, expressed concern that even if Instagram began restricting reach and access, the popular platform would remain something of a “gateway drug” to far worse corners of the internet.

“Instagram is an on-ramp to places on the internet where there’s more explicit child sexual abuse,” Levine said.

That fact alone makes Instagram’s failure to squelch the promotion of pedophilia a sore spot for him. When told that Meta and Instagram were working on safeguards against pedophilia, Levine responded with derision.

“Pull the emergency brake,” he said. “Are the economic benefits worth the harms to these children?”

Meta, for its part, said it was looking into the situation but also admitted a shocking sequence of errors.

A company representative told the Journal that an internal review of how Meta handled child sex abuse reports found a number of problems, including a “software glitch” that prevented such reports from reaching the proper people, as well as staff who weren’t properly enforcing the rules even when a report did make it through.

Given all that, it might be in Meta’s best interest to indeed pull that “emergency brake.”

via unsilencednews
