Child Sexual Abuse Material Policy
June 2022
We have a zero-tolerance policy towards any material that sexualizes, sexually exploits, or endangers children
on our platform. If we find or are made aware of it, we will report it.
RedTube Premium is deeply committed to fighting the spread of child sexual abuse material (CSAM). This includes visual media, text, and illustrated or computer-generated images. We view it as our responsibility to ensure our platform is not used for sharing or consuming CSAM, and to deter users from searching for it.
Any content featuring or depicting a child (real, fictional, or animated) or promoting child sexual exploitation
is strictly forbidden on our platform and is a severe violation of our Terms of Service.
Written content (including, but not limited to, comments, content titles, content descriptions, messages,
usernames, or profile descriptions) that promotes, references, or alludes to the sexual exploitation or abuse of
a child is also strictly prohibited.
For the purposes of this policy, a child is any person under eighteen (18) years of age. We report all cases of
apparent CSAM to the National Center for Missing and Exploited Children
(NCMEC), a nonprofit organization which operates a centralized clearinghouse for reporting incidents of online
sexual exploitation of children. NCMEC makes reports available to appropriate law enforcement agencies globally.
Additionally, at RedTube Premium we endorse and stand behind the objectives of the
Voluntary Principles to Counter Online Child Sexual Exploitation and Abuse,
a collaborative initiative launched by the Five Country Ministerial (5 Eyes)1 and backed by industry-leading tech companies to combat online child sexual exploitation and abuse. While some bad actors seek to exploit advances in technology and the digital world, we believe that robust, efficient, and flexible policies, combined with participation in and support for global cross-sector collaboration, can effectively eradicate the spread of online abuse.
If you encounter child sexual abuse material on RedTube Premium, please report it to us via our
CONTENT REMOVAL REQUEST FORM.
For information on how to report Child Sexual Abuse Material, please refer to the Additional Resources and Support section below.
1 The Five Country Ministerial is made up of the Homeland Security, Public Safety and Immigration
Ministers of Australia, Canada, New Zealand, the United Kingdom, and the United States, who gather annually to
collaborate on meeting common security challenges.
Anyone can report potential violations of this policy.
For more information on how to report content, see the section titled “How can you help us?”.
All complaints and reports to RedTube Premium are kept confidential and are reviewed by human
moderators who work swiftly to handle the content appropriately. If you believe a child is in imminent
danger, please also alert your local law enforcement authorities immediately.
Guidelines
DO NOT post material (whether visual, audio, or written content) that*:
- Features, involves, or depicts a child.
- Sexualizes a child. This includes content that features, involves, or depicts a child (including any illustrated, computer-generated, or other realistic depiction of a human child) engaged in sexually explicit conduct or sexually suggestive acts.
* This is an indicative list, not an exhaustive one. For a more detailed description, please review our Terms of Service, under the section entitled “Prohibited Uses”. RedTube Premium reserves the right at all times to determine whether content is appropriate and in compliance with our Terms of Service, and may, without prior notice and in its sole discretion, remove content at any time.
Enforcement
We have strict policies, operational mechanisms, and technologies in place to tackle and take swift action
against CSAM. We also cooperate with law-enforcement investigations and promptly respond to valid legal requests
received to assist in combating the dissemination of CSAM on our platform.
Our team of human moderators works around the clock to review all uploaded content and prevent any content that may violate our CSAM or other policies from appearing on our platform. Additionally, when we are alerted to an actual or potential instance of CSAM appearing on the platform, we remove the content, investigate it, and report any material identified as CSAM. As part of our ongoing efforts, we regularly audit our websites to update and expand our list of banned search terms, titles, and tags, ensuring our community remains safe, inclusive, diverse, and free from abusive and illegal content.
In conjunction with our team of human moderators and regular audits of our platform, we also rely on innovative
industry-standard technical tools to assist in identifying, reporting, and removing CSAM and other types of
illegal content from our platform. We use automated detection technologies as added layers of protection to keep
CSAM off our platform.
These technologies include:
- YouTube’s CSAI Match, a tool that assists in identifying known child sexual abuse videos;
- Microsoft’s PhotoDNA, a tool that aids in detecting and removing known images of child sexual abuse;
- Google’s Content Safety API, a cutting-edge artificial intelligence (AI) tool that scores and prioritizes content based on the likelihood of illegal imagery, to assist reviewers in detecting unknown CSAM;
- Safer, Thorn’s comprehensive CSAM detection tool, utilized to keep platforms free of abusive material;
- the MediaWise® service from Vobile®, a state-of-the-art fingerprinting software and database that scans all new user uploads to help prevent previously identified offending content from being re-uploaded;
- Safeguard, our proprietary image fingerprinting and recognition technology, designed to combat both child sexual abuse imagery and the distribution of non-consensual intimate images, and to help prevent the re-uploading of that content to our platform.
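The tools listed above share a common underlying approach: each new upload is fingerprinted and compared against databases of previously identified material before it can be published. The snippet below is only a minimal, simplified sketch of that matching step, not a description of any of the named products: it uses plain SHA-256 hashes and a hypothetical KNOWN_BAD_HASHES set, whereas the production tools rely on proprietary perceptual fingerprints that can also match re-encoded or altered copies of known content.

import hashlib

# Hypothetical database of fingerprints of previously identified content.
# Real systems use perceptual fingerprints shared through industry databases,
# not exact cryptographic hashes.
KNOWN_BAD_HASHES: set[str] = set()

def file_fingerprint(path: str) -> str:
    # Compute a SHA-256 digest of the file, reading it in chunks so large
    # uploads do not need to fit in memory.
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def should_block_upload(path: str) -> bool:
    # Block publication if the upload matches a known fingerprint; matched
    # content would then be escalated for human review and reporting.
    return file_fingerprint(path) in KNOWN_BAD_HASHES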
We also utilize age estimation capabilities, using a combination of internal proprietary software and the
Microsoft Azure Face API,
to analyze content uploaded to our platform and strengthen the various methods we use to prevent the upload and publication of potential or actual CSAM.
Together, these tools play a fundamental role in our shared fight against the dissemination of CSAM on our
platform, as well as our mission to assist in collective industry efforts to eradicate the horrendous global
crime that is online child sexual exploitation and abuse.
How can you help us?
If you believe you have come across CSAM, or any other content that otherwise violates our
Terms of Service, we strongly
encourage you to immediately alert us by flagging the content for our review.
If you are the victim or have first-hand knowledge that content violates our CSAM policy, please report the
content to us by completing and submitting our CONTENT REMOVAL REQUEST FORM.
Please include all relevant URL links to the content in question and we will address your request confidentially
and remove the content expeditiously.
Anyone can report violations of this policy using our CONTENT REMOVAL REQUEST FORM,
whether they have an account on our platform or not.
Consequences for violating this policy
We have a zero-tolerance policy towards any content that involves a child or constitutes child sexual abuse
material. All child sexual abuse material that we identify or are made aware of results in the immediate removal
of the content in question and the banning of its uploader. We report all cases of apparent CSAM to the National
Center for Missing and Exploited Children.
Additional Resources and Support
If you believe a child is in imminent danger, you should reach out to your local law enforcement agency to
report the situation immediately.
You may also choose to report cases of child sexual exploitation or abuse material to any of the following resource organizations dedicated to eliminating and preventing child sexual exploitation. Reports can be made anonymously and play an integral part in protecting the safety of children.
- International Association of Internet Hotlines
- National Center for Missing and Exploited Children (NCMEC)
- Canadian Centre for Child Protection
- Internet Watch Foundation
We partner with multiple organizations whose work is dedicated to fighting child sexual exploitation around the
world.