The European Parliament and the Council of the European Union (EU) have agreed to ban artificial intelligence (AI) systems that generate non-consensual sexually explicit content and child sexual abuse material (CSAM), including nudification apps, as part of a deal to simplify the AI Act [1].

This ban makes it illegal to build or sell a nudification tool within the EU [1].

India's legal framework, in contrast, contains no comparable prohibition [1].

India's IT Amendment Rules 2026, notified on February 10, 2026, and in force from February 20, require platforms to prevent the creation and sharing of unlawful synthetically generated information (SGI), including CSAM and non-consensual intimate imagery (NCII), and impose a three-hour takedown window [1].

However, these rules do not include a ban on nudification tools themselves [1].

The IT Amendment Rules 2026 place proactive obligations only on platforms and intermediaries; a standalone nudification app that does not operate as a social media intermediary therefore faces no obligation under Indian law [1].

MeitY's NCII standard operating procedure (SOP), issued in November 2025, mandates a 24-hour takedown but covers only women and girls [1].

A Madras High Court order prompted MeitY to outline what "a victim girl must do", anchoring the framework to female victims [1].

It took a Rajasthan High Court order to get a minor boy's obscene images removed from Instagram, underscoring the framework's limits [1].

The Digital Personal Data Protection (DPDP) Act, 2023, and its rules, notified in November 2025, carry substantive obligations that become enforceable only from May 2027, and neither contains a content regulation framework [1].

Nudification tools fall entirely outside the scope of the DPDP Act [1].
