THE AMERICA ONE NEWS
Oct 21, 2024
Breccan F. Thies, Investigative Reporter


Pornhub had roadblock for reviewing potential child sexual content, documents show

Child sexual abuse videos published on Pornhub needed to be reported numerous times before content moderators would review them, documents show.

Emails and direct messages surfaced in the discovery phase of a California lawsuit against MindGeek, the parent company of the pornography streaming site, showed that a video needed to be flagged more than 15 times before any action was taken.


The lawsuit argues that "for more than a decade, MindGeek has knowingly received and distributed child sexual abuse material (CSAM) and benefitted financially from child sex trafficking on its network of wildly popular pornography websites."

The partially redacted 2020 emails show conversations between MindGeek executives and senior personnel about how to respond to a Mastercard inquiry, in which the card company flagged suspected child pornography on the site by sending a hyperlink and asking about MindGeek's protocols.

"Let's NOT poke MC," the redacted MindGeek CEO, who at the time was founder Feras Antoon, said in a May 27 email chain. "This is very frustrating cuz imagine they are planning to shut us down."

The CEO, who stepped down last year, asked, "Can we say, it slipped through? ... The question is precise? What did u do verify age. They are aware we don't do that to UGC [user-generated content]."

Mastercard, along with Visa and Discover, eventually severed ties with Pornhub in December 2020, blocking card users from making purchases on the site. In 2022, Mastercard went further, suspending ties with MindGeek's advertising wing, TrafficJunky.

MindGeek's standards were revealed in another May 27 email chain involving the CEO and an unnamed chief product officer, likely Karim El Marazi, discussing what to tell Mastercard. The emails reveal that the "only way" videos entered the queue for manual review was by accumulating more than 15 flags. Once that threshold was met, review took about 72 hours.

The number of active videos on the site with between one and 15 flags, meaning they had not been reviewed, was 706,425. Moreover, only one person per day, five days per week, was responsible for reviewing flagged content. When this process was laid out, the MindGeek CEO said it "seems good and reasonable."

A separate threshold was discussed, where the CPO said review could be triggered after 1.8% of the average views per video were flags, to which the CEO replied, "1 million views at 1.8% flags is 18,000 flags," adding, "My fear with a % number instead of 15 might look like we want to on purpose profit from a popular high viewed video."

Laila Mickelwait, founder and CEO of Justice Defense Fund and founder of the Traffickinghub Movement, told the Washington Examiner: "It means it was intentional. So what I see here is we've gone from civil prosecution to a potential criminal prosecution, which I've been calling for from the beginning.

"But now we actually have admissions from the owners themselves of knowledge, of intention, of knowing that there was criminal content on the site like child abuse and intentionally allowing it to be there, allowing it to be monetized [and] distributed, which are felony-level offenses in the United States.

"They were making them up as they went, asking themselves, 'What can we say? How do we frame this for Mastercard to make it sound less bad?'

"Pornhub employed a policy where a child victim of rape could flag their abuse video 15 times without triggering any review and Pornhub only employed one person to work five days a week reviewing flagged videos and they had a backlog of over 700,000 flagged videos."

Pornhub told the Washington Examiner that a video could be flagged for a variety of reasons, not just child sexual abuse materials.

"It is categorically inaccurate and grossly misleading to categorize any social media content as criminal just because it has been flagged on a platform for potentially violating its terms of service," a spokesperson said.

"Just because a piece of content is flagged doesn't mean it is CSAM, and it is disingenuous of anyone to present it as such.

"We absolutely have sufficient protocols in place, and in fact have a Trust and Safety process that far surpasses that of any other major platform on the internet."

A 2020 New York Times article, "The Children of Pornhub," detailed accounts of CSAM on the site and compared Pornhub's CSAM statistics with those of other platforms. The National Center for Missing & Exploited Children (NCMEC) received 69.2 million reports in 2019. For their part, Facebook removed 12.2 million child exploitation images in 2020, and Twitter closed 264,000 accounts in six months of 2019.

By contrast, Pornhub had only reported 118 instances of CSAM in the preceding three years.

Mickelwait told the Washington Examiner she has spoken to victims who would "beg for their videos to be removed and were reporting them all the time and nothing would happen and when they would be taken down, they would just go right back up again."

MindGeek is embroiled in several lawsuits over these practices.

In December 2020, Pornhub changed the way it operates, announcing it would remove all "unverified" content, knocking the total number of videos available on the site from 13.5 million to under 3 million.

The changes also altered how creators are verified, requiring them to upload a picture of themselves with their username, though Mickelwait contends that the "site is still a crime scene of nonconsensual content."

Further muddying MindGeek's CSAM protocols are direct messages and depositions from employees, numerous examples of which Mickelwait posted on social media. In one exchange, employees directly reference child pornography and appear to suggest that management did not want rules on content enforced.

In a redacted April 2020 DM chain between JD and AH, JD said: "he doenst wnat [sic] to know how much cp [child porn] we have ignored for the past 5 years?"

Another exchange that raises questions is a June 2020 DM chain among MM, JM, and PD, where the trio discussed flagging content to "Matt," whom Mickelwait speculates was MindGeek vice president for operations Matthew Kilicci.

The trio complained the rules were "too vague," adding: "The problem is that management doesn't want the rules enforced as written."


Separately, an apparent MindGeek employee testified in a deposition that CSAM was not reported to law enforcement in November 2019, but the company "began reporting to NCMEC in April of 2020."


Last month, MindGeek announced it would be rebranding as Aylo.

MindGeek did not respond to multiple requests for comment regarding its monitoring and reporting practices prior to 2020 or the DM exchanges, but told the Washington Examiner: "Aylo has instituted some of the most comprehensive safeguards in user-generated platform history in order to prevent and eliminate illegal material from its platforms.

"NCMEC reported that Aylo platforms removed material they flagged quicker upon being notified than other major tech platforms, including Facebook, X, Google, Instagram, and more."