NCPCR urges social media platforms to strengthen child protection

India’s National Commission for Protection of Child Rights (NCPCR) met with major social media companies, including Meta (Facebook, Instagram and WhatsApp), Snapchat, X, Google, YouTube, Reddit, ShareChat and Bumble, to discuss curbing minors’ access to inappropriate content online. The Commission presented guidelines to improve digital safety, prevent exposure to adult material and stem the spread of content that exploits children.

As reported by IANS, the cornerstone of the NCPCR’s recommendations is a proposal to make parental consent mandatory for children using social media platforms. The Commission stressed that platforms should require verifiable consent from a parent or legal guardian, in line with the provisions of the Digital Personal Data Protection (DPDP) Act.

Section 9 of the DPDP Act treats anyone under 18 as a child and requires verifiable parental consent before a minor’s personal data can be processed.

Moreover, under the POCSO Act and the Juvenile Justice Act, parents can be held liable if their children access adult content.

In a bid to enhance child safety, the NCPCR reiterated its recommendation for Know Your Customer (KYC) verification for all social media users. The measure, previously proposed to the Ministry of Electronics and Information Technology (MeitY), aims to ensure that only verified users can access platforms, reducing the chances of minors encountering harmful content.

Platforms are legally required to report cases of child sexual abuse material (CSAM) to law enforcement agencies. The NCPCR stressed the importance of adhering to Section 19 of the POCSO Act, which mandates reporting of any known or suspected case of child sexual exploitation.

Failure to do so could have serious legal consequences for the platforms.

Social media platforms were also urged to display explicit disclaimers in multiple languages, including English and Hindi, before any adult content is shown. These warnings are intended to inform users, especially parents, that they could face legal consequences if a child under their supervision is exposed to such material.

Another key recommendation asks platforms to share detailed data with the US-based National Center for Missing and Exploited Children (NCMEC). This data should include case summaries, image or video hashes, metadata and timestamps for CSAM cases detected between January and June 2024.

The NCPCR meeting provided a forum for in-depth discussion of mechanisms for protecting children on social media. Topics included the tools platforms use to detect and report CSAM, how they cooperate with law enforcement, and the technical measures they employ to block explicit content.

In addition, emerging threats such as deepfakes and the challenge of verifying users’ ages online were discussed.
