Combating CSAM at Megirlnextdoor


Protecting our community is vital at Megirlnextdoor, and we are committed to preventing the creation and distribution of Child Sexual Abuse Material (CSAM) on our platform. We stand firm: CSAM is unacceptable, illegal, and strictly prohibited by our Terms of Service and Acceptable Use Policy.

 

What is CSAM?

CSAM refers to any visual depiction of sexually explicit conduct involving a person under the age of 18. This content constitutes child sexual abuse and exploitation.

 

How We Address CSAM at Megirlnextdoor

We have a dedicated team working relentlessly to prevent and swiftly remove any suspected CSAM. Here’s how we combat CSAM:

 

1. Proactive Monitoring and Reporting:

- We use automated scanning technology to detect suspected CSAM and block it before it can be posted.

- All content is reviewed by our trained moderators within 24 hours. Any content suspected to be CSAM is escalated and removed immediately.

 

2. Training and Technology:

- Our content moderators are extensively trained to identify and report suspected CSAM.

- We match content against hash databases of known CSAM used by law enforcement and child-safety organizations, preventing the redistribution of previously identified material.

- Content not found in those databases is manually reviewed, and any confirmed CSAM is reported to the relevant authorities.
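At a high level, matching uploads against a database of known material works like a hash lookup. The sketch below is a simplified illustration only: `KNOWN_HASHES` and `is_known_match` are hypothetical names, the blocklist entry is a placeholder (the SHA-256 of empty input), and production systems use perceptual hashes such as PhotoDNA or PDQ rather than cryptographic hashes.

```python
import hashlib

# Placeholder blocklist of hex-encoded SHA-256 digests.
# The single entry is the well-known digest of empty input, not real data.
KNOWN_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def is_known_match(file_bytes: bytes) -> bool:
    """Return True if the file's SHA-256 digest appears in the blocklist."""
    digest = hashlib.sha256(file_bytes).hexdigest()
    return digest in KNOWN_HASHES
```

A cryptographic hash like SHA-256 only catches byte-identical copies; real detection pipelines rely on perceptual hashes, which survive resizing, cropping, and re-encoding, precisely so that trivially altered copies of known material are still caught.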

 

3. Immediate Action and Reporting:

- Suspected CSAM is promptly removed and reported to the National Center for Missing & Exploited Children (NCMEC) via their CyberTipline.

- We cooperate with NCMEC and law enforcement agencies as they investigate and prosecute individuals who attempt to create or distribute CSAM on our platform.

- Users involved in such activities are permanently banned.

 

Handling Direct Messages and Private Posts

Megirlnextdoor does not use end-to-end encryption for content shared on our platform, including direct messages. All content can be reviewed by our team of trained moderators, so there are no hidden or secret areas where CSAM could be shared.

 

Does Our Subscription Model Facilitate CSAM?

No. Our subscription model makes illegal activity on our platform more difficult:

- Every user must pass rigorous identity verification checks, ensuring we know the legal identity of all members.

- Since anonymity is not allowed, the risk of CSAM creation and distribution is significantly reduced.

- If any user tries to create or distribute CSAM, we can identify, report, and ban them from our platform immediately.

 

Reporting Suspected CSAM

If you encounter content you suspect might be CSAM:

- Use the report button available on every post and account.

- Alternatively, email your concerns to support@megirlnextdoor.com.

 

Our Commitment to Transparency

We are committed to operating a safe digital media platform. We regularly publish transparency reports with data on the actions we take against CSAM, and we engage independent third-party monitors to evaluate our processes.

 

Additional Preventive Measures

- We work closely with governments, regulatory bodies, law enforcement, non-profits, charities, and other organizations to fight CSAM.

- We proactively gather intelligence on trends in child safety and offender prevention.

- As a participant in NCMEC’s Take It Down initiative, we offer support to individuals under 18 who are concerned about self-generated explicit content being shared online.

 

For more information on our efforts to combat CSAM or to get involved, please contact us at compliance@megirlnextdoor.com.

 

Thank you for helping us maintain a safe and secure platform for all users.