NCPCR supports parental warnings for children viewing adult content

The National Commission for Protection of Child Rights (NCPCR) held a meeting with social media platforms including Meta, X, Snapchat, Reddit, Sharechat and Bumble to discuss protecting children from encountering adult content and preventing the spread of child sexual abuse material (CSAM), IANS reported. The commission made a series of recommendations to the platforms, including requiring parental consent for using social media and making parents liable for children accessing adult content.

The NCPCR is a statutory body established under Section 3 of the Commission for Protection of Child Rights Act, 2005. The commission's functions include monitoring the proper and effective implementation of the Protection of Children from Sexual Offences (POCSO) Act, 2012 and the Juvenile Justice (Care and Protection of Children) Act, 2015.

What recommendations has the NCPCR made to social media platforms?

  • Introduction of KYC requirements as prescribed under the DPDP Act: Section 9 of the Digital Personal Data Protection Act, 2023 (DPDP Act) sets the age threshold at 18 years and mandates that data fiduciaries (the Act's term for data controllers) and online platforms obtain “verifiable consent” from the parent or legal guardian of minor users before processing their personal data, in a manner “as prescribed”. The NCPCR has recommended that platforms implement KYC requirements for users, a recommendation it had earlier made to the Ministry of Electronics and Information Technology (MeitY) in August this year. The NCPCR said such verification would help ensure children's safety online.
  • Mandatory reporting of CSAM: Platforms are legally required to report any instance of child sexual abuse material to law enforcement agencies. This is in line with Section 19 of the POCSO Act, which states that anyone who “suspects” that an offence is likely to be committed or “knows that such an offence has been committed” must report it.
  • Parental permission before entering into a contract: The NCPCR has also recommended that platforms seek parental consent before entering into a contract with children. That is, parents must consent to all terms and conditions that a child must accept when using a platform.
  • Display disclaimers for adult content: Social media platforms should display disclaimers in English, Hindi and other local languages before showing adult content. The NCPCR said this pertains to Section 11 of the POCSO Act, which classifies showing an object to a child in any form or media for pornographic purposes as sexual harassment, and Section 75 of the Juvenile Justice Act, which punishes guardians for cruelty to children. It said these disclaimers should warn parents that they could be liable under the said legal provisions if a child watches adult content.
  • Sharing data with NCMEC: It also asked social media platforms to provide the National Center for Missing and Exploited Children (NCMEC) with data on the number of cases filed from January 2024 to June 2024, i.e., two quarters. It said the data should include categories of image or video hashes, content details (e.g., child pornography, child exploitation), timestamped logs, metadata and any other relevant information.

What was the meeting about?

The NCPCR said the following topics were also discussed during the meeting:

  1. Age verification mechanisms
  2. Security tools used by platforms
  3. Mechanisms for detecting and reporting CSAM
  4. Support for law enforcement agencies
  5. Restriction or blocking of CSAM
  6. Tools to identify deepfakes and locate predators
  7. Measures to protect victims and their privacy
  8. Parameters for reporting to the NCMEC
  9. Measures to protect children from explicit content

Challenges with age verification

Age verification is one way to protect children from sexually explicit content, but it comes with several challenges, as Snap’s head of public policy Uthara Ganesh explained at MediaNama’s PrivacyNama 2023 event. “There’s self-disclosure, which is easy to circumvent. The second is of course ID-based verification, which of course we know has data privacy risks, but then there’s also the trade-off that some people may not even have an ID. There’s a trade-off there between access and accuracy. The third is of course using some kind of biometrics, which some experts think could actually be quite good from an accuracy perspective, but then there’s variations because of things like skin color and your physical characteristics, etc.,” she said.
