Child Sexual Abuse and Exploitation (CSAE) Policy
Last Updated: August 11, 2025
1. Policy Statement
Pentabit Apps maintains a zero-tolerance policy towards Child Sexual Abuse and
Exploitation (CSAE), including the creation, distribution, or promotion of Child Sexual Abuse Material
(CSAM) in any form. We are committed to protecting children from sexual exploitation, abuse, grooming,
and any conduct that endangers their safety across all our apps and related services. This policy is
designed to comply with the Google Play Developer Program Policies, applicable international child
protection laws, and industry safety standards.
2. Definition of CSAE
For the purposes of this policy, Child Sexual Abuse and Exploitation (CSAE) refers to
any content, conduct, or behaviour that sexually exploits, abuses, or endangers a minor (any person
under the age of majority in their jurisdiction). This includes, but is not limited to:
- Grooming: Establishing an emotional connection or trust with a child,
often online, to manipulate or lower their inhibitions with the intent of sexual exploitation.
- Luring: Using communication, persuasion, or other means to entice a child
to meet in person or engage in sexual activity, whether the contact occurs online or offline.
- Sextortion: Coercing, blackmailing, or threatening a child to obtain
sexually explicit material or to engage in sexual activity, typically via digital communication.
- Child Trafficking for Sexual Purposes: Recruiting, transporting,
transferring, harbouring, or receiving a child for the purpose of sexual exploitation.
- Sexual Exploitation or Abuse: Engaging a child in sexual activity,
including the creation, distribution, or possession of Child Sexual Abuse Material (CSAM),
regardless of whether the abuse was perpetrated in person or via digital means.
3. App Content and Conduct Rules
Our apps strictly prohibit:
- Any form of Child Sexual Abuse Material (CSAM) or content that depicts,
promotes, instructs on, or glorifies CSAE in any format, including text, images, videos, audio, or
links to external material.
- Any user behaviour intended to groom, lure, solicit, or exploit children in any
way.
- Any communications that encourage, pressure, or manipulate children to share
sexually explicit content or engage in sexual activity.
- Any functionality, feature, or tool within the app that could reasonably be used
to harm, exploit, or endanger children, including features that bypass privacy settings or allow
unwanted contact.
- Attempts to evade detection systems or reporting processes related to CSAE.
- Any third-party integrations, embedded content, or in-app browser features that do
not comply with the CSAE-related rules and safety measures in this policy.
4. User Responsibilities
All users of our apps are expected to comply with the following child safety
obligations:
- Follow all applicable laws relating to child protection and the reporting
of suspected CSAE.
- Report violations promptly using the in-app reporting tools or our
external reporting channels if they encounter any suspected CSAE or harmful conduct toward minors.
- Respect age restrictions and avoid creating accounts or profiles that
misrepresent age.
- Avoid contact with minors unless it is lawful, appropriate, and consented
to by a parent or guardian where required.
- Do not attempt to bypass safety features such as privacy settings,
blocking tools, or restricted modes intended to protect minors.
- Cooperate with investigations by providing accurate information when
contacted by our safety team in relation to a child protection matter.
Failure to meet these responsibilities may result in account suspension, a permanent ban,
and referral to law enforcement or child protection agencies.
5. Safeguarding Measures
We have implemented the following measures to prevent, detect, and respond to CSAE
risks:
- User Verification: Verification and authentication procedures are in
place to help prevent unauthorised or inappropriate access to the app, particularly by individuals
seeking to harm minors.
- Content Moderation: A combination of manual review and automated
detection tools is used to identify, block, and remove prohibited content or behaviour. This
includes proactive scanning of user-generated content, including text, images, videos, links,
metadata, and filenames, for indicators of prohibited material.
- Reporting Mechanisms: Users can report inappropriate content or behaviour
at any time via the in-app reporting feature or the support email. All reports relating to suspected
CSAE are treated with the highest priority and investigated without delay.
- Collaboration with Authorities: Confirmed CSAE-related incidents are
immediately reported to the appropriate authorities, such as the National Crime Agency (NCA), the
Internet Watch Foundation (IWF), or equivalent bodies in the relevant jurisdiction.
- Staff Training: Our safety and moderation teams receive regular training
on identifying, handling, and escalating suspected CSAE cases in compliance with applicable laws and
best practices.
6. Reporting and Response Procedures
We take all reports of suspected CSAE extremely seriously and follow strict procedures
to ensure swift and lawful action:
- Receiving Reports: Reports can be submitted through the in-app reporting
tool or via our support email. We also accept reports from parents, guardians, law enforcement,
child protection organisations, and other trusted sources.
- Immediate Review: All CSAE-related reports are prioritised for immediate
review by trained safety personnel. Content under review may be hidden or removed pending
investigation.
- Preservation of Evidence: Any relevant data, including user information,
chat logs, or uploaded content, is securely preserved in accordance with applicable data protection
and evidence-handling laws to support potential law enforcement investigations.
- Escalation to Authorities: Where there is a reasonable belief that CSAE
has occurred or is being attempted, the matter is promptly escalated to the appropriate law
enforcement agencies and recognised child protection hotlines in the relevant jurisdiction. Where
applicable, incidents may also be reported to international bodies such as the National Center for
Missing & Exploited Children (NCMEC) or equivalent organisations.
- User Account Actions: Accounts involved in CSAE are immediately suspended
or permanently banned. We may also restrict related accounts or devices to prevent re-registration.
- Confidentiality and Protection: The identity of reporters is kept
confidential to the extent permitted by law, and retaliation against individuals who make good-faith
reports is strictly prohibited.
7. Data Handling and Privacy in CSAE Cases
We recognise that CSAE cases involve highly sensitive information and are committed to
handling such data with the utmost care and in compliance with applicable privacy laws worldwide:
- Data Minimisation: Only the information strictly necessary to assess,
investigate, and report CSAE incidents is collected and retained.
- Secure Storage: All CSAE-related data is stored using strong encryption
and secure access controls to prevent unauthorised access, modification, or disclosure.
- Limited Access: Access to CSAE-related data is restricted to authorised
personnel who have received specialised training in handling such cases.
- Retention Period: CSAE-related data is retained only for as long as
necessary to fulfil legal obligations, assist in investigations, and comply with applicable law
enforcement requests. Once no longer required, the data is securely deleted or anonymised.
- Legal Disclosures: Information may be shared with law enforcement
agencies, recognised child protection organisations, and other authorised bodies when legally
required or when there is a good-faith belief that such disclosure is necessary to protect a child
from harm.
8. Legal Compliance
This policy is designed to comply with applicable laws and regulations in all
jurisdictions where our apps are available, as well as recognised international standards, including:
- Relevant national laws addressing child sexual abuse, exploitation, trafficking,
and related offences in each applicable jurisdiction.
- International frameworks such as the United Nations Convention on the Rights of
the Child and the Optional Protocol on the Sale of Children, Child Prostitution, and Child
Pornography.
- Google Play Developer Program Policies, particularly those concerning child
endangerment, sexual abuse material, and CSAE.
9. Enforcement and Action
Enforcement actions may extend to all related Pentabit Apps accounts, services, and
associated devices to prevent re-offending. Breaches of this policy will result in:
- Immediate suspension or termination of accounts involved in CSAE activities.
- Permanent removal of any offending content from the platform.
- Reporting to law enforcement agencies for investigation and legal action where
necessary.
Users who believe enforcement action was taken in error may contact us via the provided
support channels to request a review.
10. Policy Review
We regularly review and update this policy to reflect changes in laws, technology, and
safeguarding best practice.
11. Contact Information
For support, queries, or to report CSAE-related concerns:
Email: support@pentabitapps.com