Mastercard Investigates Claims of Child Abuse on Pornhub


Mastercard has launched an investigation into Pornhub after a recent New York Times article alleged that the Montreal-based adult entertainment website hosts depictions of child abuse and non-consensual sexual behaviour.

The credit card company accepts payments through Pornhub and says it has a “zero tolerance” policy for “illegal activity” within its network.

“We work closely with law enforcement and organizations like the National and International Center for Missing and Exploited Children to monitor, detect and prevent illegal transactions,” says a Mastercard senior vice president of communications, in an email shared with Daily Hive.

Mastercard says it is “investigating the allegations raised in the New York Times” and is “working with MindGeek’s bank to understand this situation, in addition to the other steps they have already taken.”

The credit card company says that if the claims prove to be true, it will take “immediate action.” Mastercard says its policy is to “terminate the relationship” when illegal activity takes place, “unless an effective compliance plan is put in place.”

In the New York Times article, Pulitzer Prize winner Nicholas Kristof describes videos on Pornhub depicting assaults on unconscious women and girls, including child sexual abuse material (CSAM). Kristof says Pornhub allows videos to be downloaded directly from its servers, enabling content to be spread, reshared, and reuploaded after it has been taken down.

In an email sent to Daily Hive, a Pornhub spokesperson says “any assertion that we allow CSAM is irresponsible and flagrantly untrue.” Pornhub says it has a zero-tolerance policy for CSAM and is “unequivocally committed to combating CSAM, and has instituted an industry-leading trust and safety policy to identify and eradicate illegal material from our community.”

Pornhub says due to the nature of its industry, people’s preconceived notions of the website’s values and process “often differ from reality.” The adult entertainment website says it is “counterproductive to ignore the facts regarding a subject as serious as CSAM.”

The company shared figures indicating that “family-friendly sites like Facebook” reported removing 84,100,000 incidents of CSAM over two and a half years, compared with 118 CSAM incidents removed from Pornhub over a three-year period.

In October, as part of the International Day of Non-Violence, a group organized by Arreter ExploitationHub gathered outside Pornhub’s main office in Montreal, calling for “the site to be shut down for allegedly enabling and profiting from the sex trafficking and rape of women and children, as evidenced by a growing number of public cases.”

A petition to shut down Pornhub has surpassed two million signatures from 192 countries, alleging that Pornhub does not verify the age of video participants.