Delhi High Court Directs Centre to Nominate Members for Deepfake Committee
The Delhi High Court, vide order dated 21 November 2024, directed the Centre to nominate members for a committee constituted to examine the issue of deepfakes. The court was informed by the Union Ministry of Electronics and Information Technology (MeitY) that a committee on deepfake matters had been formed on 20 November 2024. The order was passed while hearing two writ petitions concerning the non-regulation of deepfake technology in the country and the threat of its potential misuse. The Centre submitted that it was actively taking measures to address and mitigate the issues related to deepfake technology. The court directed the central government to nominate the members within a week.
The court further stated that the committee shall examine and take into consideration the suggestions filed by the petitioners, and shall consider the regulations and statutory frameworks of foreign jurisdictions such as the European Union. The court directed the committee to invite the experiences and suggestions of stakeholders such as intermediary platforms, telecom service providers, victims of deepfakes, and websites that provide and deploy deepfake technology. Counsel for the petitioners stated that delay in addressing the creation, detection, and removal of deepfakes is causing immense hardship to the public at large. The court further directed the committee to submit its report as expeditiously as possible, preferably within three months. The matter is next listed on 24 March 2025.
CyberPeace Outlook
With the growing misuse of deepfakes by bad actors, it has become increasingly difficult for users to differentiate between genuine and altered content. This misuse has led to a rise in cyber crimes and poses dangers to users' privacy. Bad actors use pictures and images collected from across the internet to create non-consensual deepfake content. Such deepfake videos also pose risks of misinformation and fake news campaigns, with the potential to sway elections, sow confusion, create mistrust in authorities, and more.
Dedicated legislation governing deepfakes is the need of the hour. It is important to foster the regulated, ethical, and responsible consumption of technology. Comprehensive legislation on the issue can help ensure that technology is used responsibly. Dedicated deepfake regulation, combined with ethical practices deployed through a coordinated approach by concerned stakeholders, can effectively manage the problems presented by the misuse of deepfake technology. Legal frameworks need to be equipped to handle the challenges posed by deepfakes and AI. Accountability in AI is also a complex issue that requires comprehensive legal reform. The government should draft policies and regulations that balance innovation with regulation. Through a multifaceted approach and a comprehensive regulatory landscape, we can mitigate the risks posed by deepfakes and safeguard privacy, trust, and security in the digital age.
References
- https://www.devdiscourse.com/article/law-order/3168452-delhi-high-court-calls-for-action-on-deepfake-regulation
- https://images.assettype.com/barandbench/2024-11-23/w63zribm/Chaitanya_Rohilla_vs_Union_of_India.pdf
Modern international trade relies heavily on data transfers for the exchange of digital goods and services. User data travels across multiple jurisdictions and legal regimes, each with different rules for processing it. Since international treaties and standards for data protection are inadequate, states, in an effort to protect their citizens' data, have begun extending their domestic privacy laws beyond their borders. However, this opens a Pandora's box of legal and administrative complexities for both data protection authorities and data processors. The former must balance the harmonization of domestic data protection laws with their extraterritorial enforcement, without overreaching into the sovereignty of other states. The latter must comply with the data privacy laws of every state in which they collect, store, and process data. While the international legal community continues to grapple with these challenges, India can draw valuable lessons to refine the Digital Personal Data Protection Act, 2023 (DPDP) in a way that effectively addresses them.
Why Extraterritorial Application?
Since data moves freely across borders, and entities collecting such data from users in multiple states can misuse it or use it to gain an unfair competitive advantage in local markets, data privacy laws carry a clause on their extraterritorial application. States utilize this principle to frame laws that ensure comprehensive data protection for their citizens, irrespective of the data’s location. The foremost example is the European Union’s (EU) General Data Protection Regulation (GDPR), 2016, which applies to any entity that processes the personal data of individuals in the EU, regardless of where the entity is located. India’s recently enacted DPDP Act, 2023 includes a similar clause on extraterritorial application.
The Extraterritorial Approach: GDPR and DPDP Act
The GDPR is considered the toughest data privacy law in the world and sets a global standard in data protection. According to Article 3, its provisions apply not only to data processors within the EU but also to those established outside its territory, if they offer goods and services to, or monitor the behaviour of, data subjects within the EU. The enforcement of this regulation relies on heavy penalties for non-compliance: fines of up to €20 million or 4% of the company’s global annual turnover, whichever is higher, in case of severe violations. As a result, US-based corporations like Meta and Clearview AI have been fined over €1.5 billion and €5.5 million respectively under the GDPR.
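The fine ceiling for severe violations described above is a simple "whichever is higher" rule under Article 83(5) GDPR, which can be sketched as follows (a minimal illustration only; actual fines are set case by case by supervisory authorities and may be far below this ceiling):

```python
def gdpr_fine_ceiling(global_annual_turnover_eur: float) -> float:
    """Upper limit for severe GDPR violations (Article 83(5)):
    EUR 20 million or 4% of worldwide annual turnover,
    whichever is higher."""
    return max(20_000_000.0, 0.04 * global_annual_turnover_eur)

# A firm with EUR 1 billion turnover faces a ceiling of EUR 40 million;
# a firm with EUR 100 million turnover faces the EUR 20 million floor.
```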
Like the GDPR, the DPDP Act extends its jurisdiction, under Section 3(b), to foreign companies dealing with the personal data of data principals within Indian territory. It has a similar extraterritorial reach and prescribes a penalty of up to Rs 250 crore in case of breaches. However, neither the Act nor the DPDP Rules, 2025, which are currently under deliberation, elaborates an enforcement mechanism through which foreign companies can be held accountable.
Lessons for India’s DPDP on Managing Extraterritorial Application
- Clarity in Definitions: The GDPR clearly defines ‘personal data’, covering direct information such as names and identification numbers, indirect identifiers like location data, and online identifiers that can be used to establish the physical, physiological, genetic, mental, economic, cultural, or social identity of a natural person. It also prohibits revealing special categories of personal data, such as religious beliefs and biometric data, to protect the fundamental rights and freedoms of data subjects. The DPDP Act and Rules, by contrast, define ‘personal data’ vaguely, leaving broad scope for Big Tech and ad-tech firms to bypass obligations.
- International Cooperation: Compliance is complex for companies due to varying data protection laws in different countries. The success of regulatory measures in such a scenario depends on international cooperation for governing cross-border data flows and enforcement. For DPDP to be effective, India will have to foster cooperation frameworks with other nations.
- Adequate Safeguards for Data Transfers: The GDPR regulates data transfers outside the EU via pre-approved legal mechanisms such as standard contractual clauses or binding corporate rules to ensure that the same level of protection applies to EU citizens’ data even when it is processed outside the EU. The DPDP should adopt similar safeguards to ensure that Indian citizens’ data is protected when processed abroad.
- Revised Penalty Structure: The GDPR mandates that penalties be effective, proportionate, and dissuasive. The supervisory authority in each member state has the power to impose administrative fines as per these principles, up to an upper limit set by the GDPR. The DPDP’s penalty structure, on the other hand, is simplistic and will disproportionately impact smaller businesses. It must take into account factors such as the nature, gravity, and duration of the infringement, its consequences, and the compliance measures taken.
- Governance Structure: The GDPR envisages a multi-tiered governance structure comprising:
- National-level Data Protection Authorities (DPAs), which enforce national data protection laws and the GDPR,
- the European Data Protection Supervisor (EDPS), which monitors the processing of personal data by EU institutions and bodies,
- the European Commission (EC), which develops GDPR legislation, and
- the European Data Protection Board (EDPB), which enables coordination between the EC, EDPS, and DPAs.
In contrast, the Data Protection Board (DPB) under the DPDP will be a single, centralized body overseeing compliance and enforcement. Since its members are to be appointed by the Central Government, questions arise about the Board’s autonomy and its ability to apply regulations consistently. Further, its investigative and enforcement capabilities are not well defined.
Conclusion
The protection of the human right to privacy (under the International Covenant on Civil and Political Rights and the Universal Declaration of Human Rights) in today’s increasingly interconnected digital economy warrants international standard-setting on cross-border data protection. In the meantime, states’ reliance on the extraterritorial application of domestic laws is unavoidable. While India’s DPDP takes steps in this direction, they must be refined to ensure clarity regarding implementation mechanisms, to push for alignment with the data protection laws of other states, and to account for the complexity of enforcement in cases involving extraterritorial jurisdiction. As India sets out to position itself as a global digital leader, a well-crafted extraterritorial framework under the DPDP Act will be essential to promote international trust in India’s data governance regime.
Sources
- https://gdpr-info.eu/art-83-gdpr/
- https://gdpr-info.eu/recitals/no-150/
- https://gdpr-info.eu/recitals/no-51/
- https://www.meity.gov.in/static/uploads/2024/06/2bf1f0e9f04e6fb4f8fef35e82c42aa5.pdf
- https://www.eqs.com/compliance-blog/biggest-gdpr-fines/
- https://gdpr-info.eu/art-3-gdpr/
- https://www.legal500.com/developments/thought-leadership/gdpr-v-indias-dpdpa-key-differences-and-compliance-implications/
Introduction and Brief Analysis
The movie “The Artifice Girl” portrays a law enforcement agency developing an AI-based personification of a 12-year-old girl who appears exactly like a real person. Believing her to be an actual girl, perpetrators of child sexual exploitation were caught attempting to seek sexual favours. The movie showed AI aiding law enforcement, but in reality the emergence of artificial intelligence has posed numerous challenges in multiple directions. The example illustrates both the promise and the complexity of using AI in sensitive areas like law enforcement, where technological innovation must be carefully balanced with ethical and legal considerations.
Detection and protection tools are constantly competing with technologies that generate content, automate grooming, and challenge legal boundaries. Such technological advancements have provided fertile ground for the proliferation of Child Sexual Exploitation and Abuse Material (CSEAM). Termed ‘child pornography’ under Section 2(da) of the Protection of Children from Sexual Offences (POCSO) Act, 2012, it is defined as “any visual depiction of sexually explicit conduct involving a child, which includes a photograph, video, digital or computer-generated image indistinguishable from an actual child, and an image created, adapted, or modified, but which appears to depict a child.”
Artificial intelligence is a category of technologies that attempt to simulate human thought and behaviour using algorithms and datasets. Two primary applications can be considered in the context of CSEAM: classifiers and content generators. Classifiers are programs that learn from large datasets, which may be labelled or unlabelled, and then classify content as restricted or illegal. Generative AI is also trained on large datasets but uses that knowledge to create new content. The majority of current AI research related to CSEAM uses artificial neural networks (ANNs), a type of AI that can be trained to identify connections between items (classification) and to generate novel combinations of items (e.g., elements of a picture) based on the training data used.
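The classifier/generator distinction above can be sketched abstractly. The toy below is purely hypothetical (a lookup table and random sampling over a neutral fruit dataset stand in for trained neural networks) and is not a moderation system; it only illustrates that a classifier maps inputs to learned labels, while a generator recombines elements of its training data into new items:

```python
import random

# Hypothetical "training data": labelled feature tuples.
LABELLED = {
    ("red", "round"): "apple",
    ("yellow", "long"): "banana",
    ("green", "round"): "apple",
}

def classify(features):
    """A classifier maps an input to a label learned from examples."""
    return LABELLED.get(features, "unknown")

def generate(rng=random):
    """A generator produces new combinations of elements seen in training."""
    colours = sorted({f[0] for f in LABELLED})
    shapes = sorted({f[1] for f in LABELLED})
    return (rng.choice(colours), rng.choice(shapes))
```

Real CSEAM classifiers learn statistical patterns rather than exact matches, but the input-to-label versus training-data-to-new-item contrast is the same.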
Current Legal Landscape
The legal landscape for AI is still unclear and evolving, with different nations trying to track the evolution of AI and develop laws accordingly. However, some laws directly address CSEAM. The International Centre for Missing and Exploited Children (ICMEC) combats illegal sexual content involving children and has published “Model Legislation” setting out recommended sanctions and sentencing. According to research conducted in 2018, illegal sexual content involving children is outlawed in 118 of the 196 Interpol member states; this figure represents countries with legislation sufficient to meet four or five of the five criteria defined by the ICMEC.
CSEAM in India can be reported on various portals, such as the ‘National Cyber Crime Reporting Portal’. Online crimes related to children, including CSEAM, can be reported by visiting cybercrime.gov.in. The portal allows anonymous reporting, automatic FIR registration, and complaint tracking. The ‘I4C Sahyog Portal’, managed by the Indian Cyber Crime Coordination Centre (I4C), is another platform; it integrates with social media platforms.
The Indian legal front for AI is evolving, and CSEAM is well addressed in Indian law and through judicial pronouncements. The Supreme Court judgment in Just Rights for Children Alliance and Anr. v. S. Harish and Ors. is a landmark in this regard. The following principles were highlighted in the judgment:
- The term “child pornography” should be substituted with “Child Sexual Exploitation and Abuse Material” (CSEAM) and should not be used in any future judicial proceeding, order, or judgment. Parliament should likewise amend POCSO to endorse the term CSEAM.
- Parliament should consider amending Section 15(1) of POCSO to make reporting more convenient for the general public by way of an online portal.
- Sex education programmes should be implemented to give young people a clear understanding of consent and the consequences of exploitation. To help prevent problematic sexual behaviour (PSB), schools should teach students about consent, healthy relationships, and appropriate behaviour.
- Support services to the victims and rehabilitation programs for the offenders are essential.
- Early identification of at-risk individuals and implementation of intervention strategies for youth.
Distinctive Challenges
According to a report by the National Centre for Missing and Exploited Children (NCMEC), a significant number of reports about child sexual exploitation and abuse material (CSEAM) are linked to perpetrators based outside the country. This highlights major challenges related to jurisdiction and anonymity in addressing such crimes. Since the issue concerns children, and considering the cross-border nature of the internet and the emergence of AI, nations across the globe need to come together to solve this matter. Delays in extradition procedures and inconsistent legal processes across jurisdictions hinder the apprehension of offenders and the delivery of justice to victims.
CyberPeace Recommendations
For effective regulation of AI-generated CSEAM, laws must be strengthened so that AI developers and trainers prevent misuse of their tools. AI should be designed with ethical considerations in mind, ensuring respect for privacy, consent, and child rights. AI models could also carry self-regulation mechanisms that recognise and restrict red flags related to CSEAM and indicators of grooming or potential abuse.
A distinct Indian CSEAM reporting portal is urgently needed as cybercrime rises throughout the nation. Relying on the integrated portal alone risks AI-based CSEAM cases being overlooked, whereas a dedicated portal would enable faster response and focused tracking. Since AI-generated content is often detectable, the portal should also include an automated AI-content detection system linked directly to law enforcement for swift action.
Furthermore, international cooperation is of utmost importance to address AI-enabled challenges and fill jurisdictional gaps; a united global effort is required. Common technology and harmonised international laws are essential to tackle AI-driven child sexual exploitation across borders and protect children everywhere. CSEAM is an extremely serious issue, and children are among the most vulnerable to such harmful content. The threat must be addressed without delay, through stronger policies, dedicated reporting mechanisms, and swift action to protect children from exploitation.
References:
- https://www.sciencedirect.com/science/article/pii/S2950193824000433?ref=pdf_download&fr=RR-2&rr=94efffff09e95975
- https://aasc.assam.gov.in/sites/default/files/swf_utility_folder/departments/aasc_webcomindia_org_oid_4/portlet/level_2/pocso_act.pdf
- https://www.manupatracademy.com/assets/pdf/legalpost/just-rights-for-children-alliance-and-anr-vs-sharish-and-ors.pdf
- https://www.icmec.org
- https://www.missingkids.org/theissues/generative-ai
In the past decade, India’s gaming sector has seen surprising but swift growth, attracting millions of players and billions in investment; the sector has been estimated at $23 billion. Whether it is fantasy cricket, Ludo apps, high-stakes poker, or rummy platforms, investing real money in online gaming and gambling has become a beloved hobby for many. The sector has not only boosted the economy but also contributed to creative innovation and employment generation.
Behind the glossy numbers, however, lie real concerns: tales of addiction, financial detriment, and a never-ending game of cat and mouse with legal loopholes. The sector’s meteoric rise has raised concerns relating to national financial integrity, regulatory clarity, and consumer safety.
In light of this, the Promotion and Regulation of Online Gaming Act, 2025, passed by Parliament and signed into law on 22 August, stands out as a significant development. Positioned as a consumer protection and sector-defining law, the Act aims to distinguish innovation from exploitation by acknowledging eSports as a legitimate activity and establishing unambiguous boundaries around the larger gaming industry.
Key Highlights of the Act
- Complete ban on real-money games: All e-games that involve monetary stakes, whether based on skill or chance, are banned.
- Prohibition of Ads: Promotion of such e-games has also been disallowed across all platforms.
- Legal Ramifications: Operating such games may lead to up to 3 years in prison and a Rs 1 crore fine; advertising them may lead to up to 2 years in prison and a Rs 50 lakh fine. For repeat offences, this may rise to 3-5 years in prison and fines of up to Rs 2 crore.
- Creation of an Online Gaming Authority: A national-level regulatory body will classify and monitor games, register platforms, and enforce the dedicated rules.
- Support for eSports and social & educational games: Non-monetary games that promote social and educational growth will not only be recognised but encouraged, while eSports will gain official recognition under the Ministry of Sports.
Positive Impacts
- Tackling addiction and financial ruin: The major reason behind the ban is to protect vulnerable users, mainly youth, from falling into gambling and losing huge amounts of money to betting apps and games.
- Boost to eSports & regulatory clarity: The law legitimises the eSports sector and provides opportunities for scholarships and other financial benefits, along with windows for professional tournaments and platforms on global stages. It also aims to bring order to the distinction between e-games of skill and games of chance.
- Fraud monitoring & control: The law blocks avenues for money laundering, gambling, and illegal betting networks.
- Promotion of a safe digital ecosystem: The Act encourages social, developmental, and educational games that focus on skill, learning, and fun.
Challenges
It must be recognised that the Promotion and Regulation of Online Gaming Act, 2025 is still in its early stages. In the end, its effectiveness will rely not only on the letter of the law but on the strength of its enforcement and the wisdom of its application. If applied carefully and clearly, the Act has the potential to safeguard at-risk youth from the dangers of gambling and addiction, while maintaining the digital ecosystem as a place of innovation, equity, and trust.
- Blanket Ban: By imposing a blanket ban on games long justified as skill-based, such as rummy or fantasy cricket, the Act risks suppressing legitimate enterprises and centres of innovation. Many startups once hailed as being at the forefront of India’s digital innovation may now find it difficult to thrive in an unpredictable regulatory environment.
- Rise of Illegal Platforms: History offers a sobering lesson: prohibition does not eliminate demand; it simply drives it underground. The prohibition of money games may encourage the growth of unregulated offshore sites, where players are more vulnerable to fraud, data theft, and abuse, and have no avenue for consumer protection.
Conclusion
The Act is definitely a tough and bold attempt to check and regulate India’s digital gaming industry, but it is also a double-edged sword. It puts much-needed consumer protection regulations in place and legitimises eSports. However, it also casts a long shadow over a thriving industry and risks fostering a black market more harmful than the issue it was intended to address.
Therefore, striking a balance between innovation and protection, between law and liberty, will be considered more important in the coming years than the success of regulations alone. India’s legitimacy as a digital economy ready for global leadership, as well as the future of its gaming industry, will depend on how it handles this delicate balance.
References:
- https://economictimes.indiatimes.com/tech/technology/gaming-bodies-write-to-amit-shah-urge-to-block-blanket-ban-warn-of-rs-20000-crore-tax-loss/articleshow/123392342.cms
- https://m.economictimes.com/news/india/govt-estimates-45-cr-people-lose-about-rs-20000-cr-annually-from-real-money-gaming/articleshow/123408237.cms
- https://www.cyberpeace.org/resources/blogs/promotion-and-regulation-of-online-gaming-bill-2025-gets-green-flag-from-both-houses-of-parliament
- https://www.thehindu.com/business/Industry/real-money-gaming-firms-wind-down-operations/article69965196.ece