Summary of the Fourth Report of the Parliamentary Committee on the Empowerment of Women on Cyber Crimes and Cyber Safety of Women

Parliamentary Committee Report of the Committee on the Empowerment of Women (CEW) dated March 2026 

Introduction 

The Fourth Report of the Parliamentary Standing Committee (Committee) on the Empowerment of Women (CEW) titled “Cyber Crimes and Cyber Safety of Women” (the Report) was presented before both Houses of Parliament on 23 March 2026. The Committee's mandate is to review and monitor measures taken by the Union Government to secure equality, status, and dignity for women. Its functions include examining reports submitted by the National Commission for Women (NCW) and reporting on steps taken by the Union Government to improve the condition of women. Prominent members of the Committee include Chairperson Dr. D. Purandeswari (Bharatiya Janata Party), Harsimrat Kaur Badal (Shiromani Akali Dal, Bathinda), Hema Malini (Bharatiya Janata Party), Sagarika Ghose (All India Trinamool Congress), Sunetra Ajit Pawar (Nationalist Congress Party) and Sudha Murty (Nominated member of the Rajya Sabha). 

The Report is based on inputs received from the Ministry of Home Affairs (MHA), the Ministry of Electronics and Information Technology (MeitY), the Centre for Development of Advanced Computing (CDAC), the Cyber Peace Foundation, and Social Media Intermediaries (SMIs) such as Google and Meta, across four Committee sittings held between June and August 2025. It examines the vulnerabilities women and children face in the digital space and evaluates the legal, institutional, and technological response to the rise of cybercrimes against them. It also scrutinises the role and accountability of SMIs. 

In their written submissions, Google, Meta, and X outlined their respective measures for platform safety and law enforcement cooperation, which include dedicated channels for requests from Law Enforcement Agencies (LEAs), automated detection systems for harmful content, partnerships with child safety organisations, and training programmes for Indian investigative agencies. (Pages 37–41 of the Report) 

This blogpost provides an overview of the Report's key findings and a summary of the Committee's recommendations across cybercrime investigation, platform accountability, victim support, international cooperation, and the regulatory outlook for SMIs in India. 

About the Report 

The Report is structured in two parts. Part I examines the existing legal framework governing SMIs under the Information Technology Act, 2000 (IT Act), the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (IT Rules) and other acts relevant in terms of protecting women and children from threats in the digital space. Part II sets out the Committee's recommendations for MeitY and MHA, based on the findings. 

The following findings by the Committee form the basis of the Report and shape the direction of the recommendations: 

  • As per data by the National Crime Records Bureau (NCRB), crimes against women rose by 239 percent between 2017 and 2022. Cases involving children rose twentyfold in the same period. 

  • Over 2.48 lakh complaints related to women and children were filed on the National Cybercrime Reporting Portal (NCRP) between 2019 and April 2025.  

  • Generative AI has introduced deepfakes, synthetic explicit content, and scalable impersonation as new and rapidly evolving threat vectors.  

  • Existing technological interventions, like the Proactive Monitoring Tool (PMT), SAHYOG, and NCRP, require significant upgradation. The Committee found that AI-enabled crimes, encrypted platforms, and anonymisation tools are outpacing investigative and regulatory capacity. 

  • The Committee noted serious underreporting due to stigma, digital illiteracy, and fear of retaliation, suggesting actual incidence is significantly higher. 

  • Multiple States report acute shortages of skilled cyber investigators, digital forensic analysts, and prosecutors with techno-legal expertise. Training participation remains highly variable, with several frontline police stations still struggling with digital evidence preservation, deepfake detection, and cross-jurisdictional cybercrime investigations. 

Recommendations of the Committee 

The Committee's recommendations span platform accountability and SMI obligations, as well as broader measures on law enforcement capacity, victim support, international cooperation, and legislative reform. 

1. Compliance timelines and periodic audits: Despite prescribed timelines, the Committee found that harmful content often remains accessible for extended periods. During one of the Committee's meetings with Google and Meta, Members of Parliament (MPs) raised concerns about delays in the removal of Non-Consensual Intimate Imagery (NCII), deepfakes, and Child Sexual Exploitative and Abuse Material (CSEAM), the unchecked proliferation of fake accounts, lack of transparency in complaint handling, and weak regional language safeguards. MPs also pressed Google and Meta to provide year-wise and city-wise complaint data, staff strength deployed in India, and average turnaround time for responding to law enforcement requests through SAHYOG. They suggested new women-centric tools, such as a Safe Comment Section on YouTube and Safe DM filters on Instagram, alongside stronger awareness campaigns in regional languages and cyber safety education in schools. (Appendix IV, Page 116 of the Report) 

The Committee recommended that SMIs strictly adhere to removal timelines under the IT Rules, 2021. Additionally, given their real-time monitoring capabilities, it recommended that intermediaries adopt near-immediate action for the highest-priority categories, i.e. NCII, CSEAM, malicious deepfakes, and impersonation content. The Committee recommended that MeitY conduct periodic audits and initiate proceedings under Rule 7 of the IT Rules, 2021 (loss of safe harbour) whenever there are compliance gaps. (Pages 81–82 of the Report) 

2. Personal accountability of Designated Officers (DOs): It was noted that DOs under Rule 4(1) of the IT Rules, 2021 are not currently personally accountable for compliance failures (like delays or negligence in responding to government orders related to women’s online safety). The Committee recommended that the DOs appointed by Significant Social Media Intermediaries (SSMIs) be made personally responsible for any such delays or negligence related to women’s online safety. (Page 82 of the Report) 

3. Mandatory KYC-based verification: The Committee noted that fake profiles, impersonation, and anonymous harassment are among the most frequently reported forms of online abuse against women. The unchecked proliferation of anonymous accounts and the use of Virtual Private Networks (VPNs) create practical impunity for perpetrators. (Pages 81 and 87 of the Report) 

The Committee recommended introducing mandatory Know Your Customer (KYC)-based identity verification across all social media, dating, and gaming platforms. It was also stated that platforms must carry out periodic re-verification and maintain high-risk flags for accounts repeatedly reported for abuse. Strict age-verification and licensing norms must apply to dating and gaming apps, with penalties for platforms that fail to comply. (Page 82 of the Report) 

4. Deepfake governance: The Committee noted that deepfake pornography and synthetic explicit content targeting women are rising sharply. These advancements have significantly complicated the detection, classification, and takedown of such content. (Pages 68 and 77 of the Report) The Report also mentions the advisories issued by MeitY reiterating SMIs’ obligations under the IT Rules. It emphasised that intermediaries must use technologies (like AI) responsibly, implementing content labelling, user awareness measures, and mechanisms for user reporting. (Page 28 of the Report) 

The Committee recommended that MeitY and MHA frame rules on deepfake governance, particularly on: (a) safety filters for deepfake detection; (b) watermarking of AI-generated content; (c) real-time detection systems; and (d) severe penalties for non-compliance. Intermediaries must deploy advanced automated tools under Rule 4(4), including hash-matching, image forensics, and behavioural analysis. (Page 82 of the Report) The Committee further recommended a specialised “Central Deepfake Detection Infrastructure” under MeitY to support States, police units, and courts with rapid verification. (Page 78 of the Report) 

Note: Several of these recommendations have since been partially addressed by the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Amendment Rules, 2026, notified on 10 February 2026 and effective from 20 February 2026. Several elements recommended by the Committee (such as a dedicated deepfake governance rules framework and a Central Deepfake Detection Infrastructure) go beyond what the 2026 amendment addresses and remain under consideration. 

5. Mandatory SAHYOG integration: The Report finds that the onboarding of intermediaries onto SAHYOG and usage across States is uneven. (Pages 85 and 112 of the Report) The Committee recommended that all intermediaries (including social media companies, messaging services, cloud platforms, dating apps, and Virtual Asset Service Providers (VASPs)) be mandated to integrate with SAHYOG for seamless, time-bound compliance. It was also recommended that MeitY establish enforceable timelines, automated compliance dashboards, and financial or legal penalties for non-compliance in the removal of harmful content. (Page 78 of the Report) 

The Report also notes that the SAHYOG portal is proposed to be upgraded. In its next phase, the portal will enable LEAs to submit structured data requisition requests to SMIs, going beyond the current content takedown notices. (Page 15 of the Report) 

6. CSEAM: Direct reporting to Indian LEAs: The Committee highlighted that SMIs would report CSEAM solely to the National Center for Missing and Exploited Children (NCMEC), a US-based non-profit body that operates a global cyber tipline and routes these reports to the Indian Cyber Crime Coordination Centre (I4C) under a Memorandum of Understanding (MoU) between the two bodies. The Supreme Court, in the Just Rights for Children Alliance case, acknowledged this MoU but observed that SMIs were using it as a substitute for direct reporting to Indian authorities (taking an “easy path”) without assuming the timely responsibility of alerting local LEAs who can take immediate enforcement action. The Committee stated that this practice by the SMIs does not satisfy the legal obligation under Rule 11 of the Protection of Children from Sexual Offences Rules, 2020 (POCSO Rules) and the Supreme Court’s direction in the said case. SMIs are to report CSEAM directly to Indian LEAs, Special Juvenile Police Units, and cybercrime.gov.in. Non-reporting attracts liability under the POCSO Act and its rules, as well as under the IT Act. SMIs must also hand over the relevant material, including source information, to local police. (Pages 61–62 of the Report) 

7. Age-appropriate regulation: The Committee noted that digital platforms, particularly social media, messaging, and hosting services, must be held to higher accountability standards on children’s and women's safety. To protect children and adolescents from adverse psychological impacts, the introduction of age-appropriate regulations and usage limits on social media platforms, alongside safety-by-design standards, was recommended. (Page 70 of the Report) 

8. Central Compliance Review Board: To enhance SMIs’ accountability, the Committee recommended the creation of a Central Compliance Review Board, jointly operated by MeitY and MHA. The Board would evaluate platform-wise adherence to deadlines, responsiveness to State Police requests, and proactive removal of harmful content. Persistent non-compliance would attract penalties and result in the loss of safe harbour protection. (Page 86 of the Report) 

9. Public awareness and community outreach: The Committee found that despite wide-ranging campaigns by MHA and MeitY, cyber safety awareness remains inadequate, particularly in rural and low-digital-literacy areas. Most initiatives are episodic rather than institutionalised, resulting in limited recall and low legal understanding. Women often hesitate to report due to stigma and unfamiliarity with reporting mechanisms. 

The Committee recommended that MHA and MeitY jointly design a sustained, community-anchored national cyber safety awareness programme, covering schools, Panchayati Raj Institutions, Self-Help Groups, and Anganwadi and ASHA networks. Awareness material must be produced in all major Indian languages and local dialects. The Committee also recommended an annual "National Cyber Safety Week" and the training of frontline workers as "Cyber Safety Ambassadors." (Pages 70–71 of the Report) 

10. Strengthening investigation, forensics and law enforcement capacity: Existing capacity-building initiatives remain fragmented and uneven across States. These include CyTrain, the Cyber Crime Prevention against Women and Children (CCPWC) Scheme, the National Cyber Forensic Laboratory (NCFL), State Connect, and Cyber Commandos. Multiple States report acute shortages of skilled cyber investigators and digital forensic analysts. Most States lack adequate tools for deepfake detection and advanced mobile forensics. 

The Committee recommended that MHA urgently develop a National Cyber Capacity-Building Framework for women-related cybercrimes, setting uniform training and certification standards for police, prosecutors, and judicial officers. It recommended rapid expansion of regional forensic units and mobile cyber forensic vans, and that fast-track courts for women-related cybercrimes have judges specifically trained in digital evidence. MeitY must significantly scale up CyberShakti with a dedicated vertical for training women police officers and cyber cell personnel. (Pages 71–76 of the Report) 

11. Unified National Cyber Coordination Grid: Despite existing coordination structures like the Joint Cyber Crime Coordination Teams (JCCTs), SAMANVAYA, and the Cyber Multi Agency Centre (CyMAC), States continue to face difficulties securing timely cooperation from SMIs and obtaining critical metadata for investigation. Delays of even a few hours allow morphed images and harmful content to go viral, causing irreparable harm to victims. 

The Committee recommended the establishment of a Unified National Cyber Coordination Grid, integrating all Ministries, State Police Forces, CERT-In, DoT, Cyber Forensic Labs, JCCTs, and SMIs onto a common, real-time communication loop for immediate response in women-related cybercrime cases. State-wise Cyber Coordination Control Rooms for Women's Safety, integrated with NCRP and the 1930 helpline, were also recommended. (Pages 84–85 of the Report)

12. International cooperation: Many SMIs are headquartered abroad, and Indian LEAs face significant delays in obtaining user data and digital evidence. Most platforms respond only through Mutual Legal Assistance Treaty (MLAT) channels, which can take several months. There is no fast-track global mechanism for real-time cross-border cooperation in emergency cases. 

The Committee recommended that India leverage the UN Convention on Cybercrime to establish fast-track cross-border assistance protocols for cybercrimes against women, negotiate bilateral rapid-response agreements with countries hosting major digital platforms, and establish 24x7 cyber liaison officers in key Indian Missions abroad. India must also push for global obligations on digital intermediaries regardless of physical presence, to cooperate with lawful requests within defined timelines. (Pages 86–88 of the Report)

13. Counselling and rehabilitation of cyber victims: Robust systems for counselling, psychosocial support, and rehabilitation of women cyber victims remain inadequate and inconsistently accessible across States. Victims of online harassment, sextortion, and identity theft frequently experience severe emotional distress, social stigma, and economic hardship. The Committee recommended a victim-centred framework ensuring every cyber victim who files a complaint is automatically connected to the nearest One Stop Centre (OSC) or MWCD-supported services. It also recommended dedicated Cyber Counselling Units within OSCs and district police cyber cells, Regional Cyber Rehabilitation Centres, and the establishment of a Cyber Survivor Compensation Fund for victims of severe cyber offences. (Pages 88–89 of the Report) 

14. Need for comprehensive cybercrime legislation: Cyber offences against women are currently addressed through multiple dispersed statutes, i.e. the IT Act, the Bharatiya Nyaya Sanhita, 2023 (BNS), the POCSO Act, and the Indecent Representation of Women (Prohibition) Act, 1986. This results in overlapping mandates, uneven enforcement, and procedural delays. The law remains largely offence-specific and incident-driven, rather than victim-centric and future-ready. 

The Committee recommended that the Government initiate a structured, time-bound examination toward comprehensive, gender-sensitive cybercrime legislation that consolidates substantive offences, delineates intermediary liabilities, and statutorily mandates victim support and rehabilitation. The Committee framed this as complementing, not replacing, existing statutes. (Pages 89–90 of the Report) 

Steps going forward 

The Report carries persuasive weight and is treated as influential advice to the Government. Under the Practice and Procedure of Parliament, MeitY and MHA are required to submit action taken replies on each recommendation. These are examined by the Committee, and an Action Taken Report is subsequently presented and discussed in Parliament. That said, there is no fixed timeline by which the Ministries are required to respond. 

_______________

Author credits: This article has been authored by Chhavi Sharma, Associate, with inputs from Nirmal Bhansali, Associate, and Rahil Chatterjee, Principal Associate at Ikigai Law. 

Image source: Photo by Apex 360 on Unsplash 

For any queries, get in touch with us at contact@ikigailaw.com

 

 
