# FactCheck: AI-Generated Photo Shared to Claim Boycott of Hindu Sammelan
Executive Summary:
A photo circulating on social media shows a stage with the words “Hindu Sammelan” (Hindu Conference) written in large letters. In front of the stage, rows of chairs appear largely empty, with only a few people seated while most seats remain vacant.
Users sharing the image claim that the event, held under the banner of a “Hindu Sammelan,” was in fact a “Brahmin Sammelan,” and that indigenous communities chose to stay away, resulting in poor attendance.
It is noteworthy that, on the occasion of the centenary year of the Rashtriya Swayamsevak Sangh (RSS), various “Hindu Sammelan” events are being organized across the country. The viral image is being linked to this broader context.
However, research conducted by CyberPeace found the viral claim to be false. The image being shared on social media is not authentic; it is AI-generated and is being circulated with a misleading narrative.
Claim
On February 21, 2026, a Facebook user shared the viral image. The original and archived links are provided below:
- https://www.facebook.com/photo?fbid=935049042540479&set=gm.2425972001215469&idorvanity=465387370607285
- https://ghostarchive.org/archive/sxC6d

Fact Check:
A keyword search on Google confirmed that several “Hindu Sammelan” events have indeed been organized across the country as part of the RSS centenary year. For instance, media reports have covered such events in different cities, including Nagpur.

However, upon closely examining the viral image, we observed certain visual inconsistencies and unnatural elements that raised suspicion of AI generation. We first analyzed the image using the AI detection tool Hive Moderation, which indicated a 79.3 percent probability that the image was AI-generated.

To further verify, we scanned the image using another AI detection platform, Sightengine. The results showed a 97 percent likelihood that the image was AI-generated.
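Detector outputs like the two above are probabilities, not binary verdicts, and fact-checkers typically map them to coarse labels before reporting. The sketch below shows one way to do that mapping; the JSON layout and field name `ai_generated` are illustrative assumptions, not the actual response schema of Hive Moderation or Sightengine.

```python
import json

def ai_generated_verdict(response_json: str, threshold: float = 0.5):
    """Extract an AI-generation probability from a (hypothetical) detector
    response and map it to a coarse verdict for reporting."""
    data = json.loads(response_json)
    score = float(data["ai_generated"])  # probability in [0, 1]
    if score >= 0.9:
        verdict = "very likely AI-generated"
    elif score >= threshold:
        verdict = "likely AI-generated"
    else:
        verdict = "likely authentic"
    return score, verdict

# Scores comparable to those reported above (79.3% and 97%):
print(ai_generated_verdict('{"ai_generated": 0.793}'))  # (0.793, 'likely AI-generated')
print(ai_generated_verdict('{"ai_generated": 0.97}'))   # (0.97, 'very likely AI-generated')
```

In practice, a single detector score is treated as one signal among several; corroboration across tools, as done here, reduces the chance of a false positive.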

Conclusion
Our research confirms that the image circulating on social media is not genuine. It has been artificially created using AI technology and is being shared with a misleading claim.

A war in the twenty-first century does not start when the first bullet or missile is fired. It begins much earlier, covertly, and without any official announcement. Cyberspace is this new battlefield. States now deploy ransomware, malicious code, and disinformation campaigns to undermine their enemies' capabilities before launching an offensive. These pre-conflict cyber operations are now the primary frontline of contemporary hybrid warfare, reshaping how conflicts are fought.
The Birth of a Digital Battlefield
Hybrid warfare blends conventional military force with non-military tactics such as economic coercion, disinformation, and cyberattacks, and it has evolved rapidly in recent decades. Hybrid methods of warfare are nothing new, but the scale and sophistication of cyber operations in modern conflicts are unprecedented. Russia’s actions in Ukraine demonstrated the capability of digital tools to paralyse critical systems before heavy munitions were deployed for combat operations. Within days of the 2022 invasion, Ukraine faced massive Distributed Denial of Service (DDoS) attacks targeting banks, government websites, and energy infrastructure. The digital frontlines had softened the physical defences long before conventional warfare began.
According to the FP Analytics’ “Digital Front Lines” Project, cyber operations are no longer an auxiliary tactic but a core component of hybrid warfare, blurring the boundary between peace and war. They enable states to exert pressure, gather intelligence, and disrupt adversaries, often without being attributed or held accountable.
Cyber Operations: The Modern Prelude to War
Cyber operations are the use of digital technologies for surveillance, disruption of information networks, or destruction of critical infrastructure. Their ambiguity and stealth make them especially useful instruments for pre-conflict manipulation. Unlike conventional military strikes, cyberattacks can accomplish strategic goals while providing plausible deniability.
Prior to Russia's invasion of Ukraine, coordinated cyberattacks disrupted government communication systems, spread misinformation, and damaged public confidence. Such incidents highlight the integrated nature of cyber and kinetic operations, where digital assaults often serve as the initial phase of modern wars.
The Expanding Spectrum of Threat Actors
Cyberspace has democratized warfare: what once required an army can now be initiated by a handful of skilled programmers with access to the right tools. The present cyber landscape features a wide spectrum of threat actors, which can be understood as follows:
- State actors like intelligence or military agencies conduct cyber operations as part of official foreign policy.
- Cybercriminals pursue financial gains, often overlapping with political motives.
- Terrorist groups use cyberspace to spread propaganda for coordinated attacks.
- Cyber mercenaries, hired by both state and non-state clients, blur ethical and legal boundaries.
This diversity complicates attribution: determining who is actually behind a cyberattack can be notoriously difficult, allowing states to hide behind “plausible deniability.” This ‘gray zone’ of conflict, below the threshold of declared war but above mere diplomacy, has become the preferred arena for modern power struggles.
Civilian Involvement and Ethical Dilemmas
Unlike traditional warfare, the cyber domain entangles civilians as both participants and targets. Much of a nation’s critical infrastructure, including energy grids, hospitals, transportation, and communication systems, is owned and operated by private entities. As a result, civilian industries and experts are becoming central to both cyber defence and offence.
During the Russia–Ukraine war, volunteer hackers from around the world, many of them coordinated through the app Telegram under the banner of the ‘IT Army of Ukraine’, conducted digital strikes on Russian networks. Conversely, Russia-affiliated hacker groups such as Conti vowed to retaliate against any nation that supported Ukraine.
This civilian participation raises profound legal and moral questions: does a private company defending its networks become a combatant? Do retaliatory cyberattacks on civilian infrastructure constitute war crimes? International law has yet to provide clear answers, leaving dangerous gaps in the governance of cyber conflict.
Susceptibility of Contemporary Society to Cyber Warfare
Cyberwarfare can impact the entire global digital ecosystem because of its interconnectedness. Power grids, hospitals, air traffic systems, and even automation devices can be compromised. The WannaCry ransomware attacks of 2017 paralysed hospitals throughout the UK's National Health Service, while NotPetya, destructive malware cloaked as ransomware, caused billions in losses and worldwide economic damage, from shipping companies to pharmaceutical firms.
Taken as a whole, these incidents show that cyberattacks are no longer limited to espionage and can have real-world consequences comparable to those of conventional warfare. As our dependence on technology grows, the consequences of cyberattacks could increase dramatically. These effects are not merely technical or economic; they are profoundly psychological, seeking to sow fear, mistrust, and social disintegration.
The Future: Permanent Cyber Frontlines
Technological developments have made cyberspace a permanent theatre of conflict alongside land, sea, air, and space. Countries are currently making significant investments in cyber capabilities for both deterrence and defence. Security scholars such as Eriksson and Giacomello argue that our increasing reliance on information technologies has made societies inherently fragile.
Cyber operations in this context are about strategic dominance in a globalised world, not just digital espionage. The future of war will be determined not only by who controls the skies or the seas, but by who controls the networks and algorithms that run contemporary civilisation. In this new reality, before the first bomb drops, a silent war in cyberspace will already be underway.
References
- https://digitalfrontlines.io/2023/05/25/the-evolution-of-cyber-operations-in-armed-conflict/
- https://theses.ubn.ru.nl/server/api/core/bitstreams/9d74149e-fb9a-402f-aa65-a90445ad7603/content
- https://cybersecurityguide.org/resources/cyberwarfare/
- https://re.public.polimi.it/retrieve/e0c31c0b-ce6c-4599-e053-1705fe0aef77/21%20Century%20Cyber%20Warfare.pdf

In the vast, interconnected cosmos of the internet, where knowledge and connectivity are celebrated as the twin suns of enlightenment, there lurk shadows of a more sinister nature. Here, in these darker corners, the innocence of childhood is not only exploited but also scarred, indelibly and forever. The production, distribution, and consumption of Child Sexual Abuse Material (CSAM) have surged to alarming levels globally, casting a long, ominous shadow over the digital landscape.
In response to this pressing issue, the National Human Rights Commission (NHRC) has unfurled a comprehensive four-part advisory, a beacon of hope aimed at combating CSAM and safeguarding the rights of children in this digital age. This advisory dated 27/10/23 is not merely a reaction to the rising tide of CSAM, but a testament to the imperative need for constant vigilance in the realm of cyber peace.
The statistics paint a sobering picture. In 2021, more than 1,500 instances of publishing, storing, and transmitting CSAM were reported, shedding a harsh light on the scale of the problem. Even more alarming is the upward trend in cases reported in subsequent years. By 2023, a staggering 450,207 cases of CSAM had already been reported, marking a significant increase from the 204,056 and 163,633 cases reported in 2022 and 2021, respectively.
The Key Aspects of Advisory
The NHRC's advisory commences with a fundamental recommendation - a redefinition of terminology. It suggests replacing the term 'Child Pornography' with 'Child Sexual Abuse Material' (CSAM). This shift in language is not merely semantic; it underscores the gravity of the issue, emphasizing that this is not about pornography but child abuse.
Moreover, the advisory calls for the definition of 'sexually explicit' under Section 67B of the IT Act, 2000. This step is crucial for ensuring the prompt identification and removal of online CSAM. By giving a clear definition, law enforcement can act swiftly in removing such content from the internet.
The digital world knows no borders, and CSAM can easily cross jurisdictional lines. NHRC recognizes this challenge and proposes that laws be harmonized across jurisdictions through bilateral agreements. Moreover, it recommends pushing for the adoption of a UN draft Convention on 'Countering the Use of Information and Communications Technologies for Criminal Purposes' at the General Assembly.
One of the critical aspects of the advisory is the strengthening of law enforcement. NHRC advocates for the creation of Specialized State Police Units in every state and union territory to handle CSAM-related cases. The central government is expected to provide support, including grants, to set up and equip these units.
The NHRC further recommends establishing a Specialized Central Police Unit under the government of India's jurisdiction. This unit will focus on identifying and apprehending CSAM offenders and maintaining a repository of such content. Its role is not limited to law enforcement; it is expected to cooperate with investigative agencies, analyze patterns, and initiate the process for content takedown. This coordinated approach is designed to combat the problem effectively, both on the dark web and open web.
The role of internet intermediaries and social media platforms in controlling CSAM is undeniable. The NHRC advisory emphasizes that intermediaries must deploy technology, such as content moderation algorithms, to proactively detect and remove CSAM from their platforms. This places the onus on the platforms to be proactive in policing their content and ensuring the safety of their users.
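One widely used proactive-detection technique is hash matching: platforms compare uploads against databases of digests of known prohibited material. The sketch below illustrates the idea with exact cryptographic hashes and an invented blocklist; production systems rely on perceptual hashes (e.g. PhotoDNA) that survive re-encoding, combined with human review, rather than this simplified exact match.

```python
import hashlib

# Hypothetical blocklist of SHA-256 digests of known prohibited files.
# Real platforms use vetted hash databases maintained by clearinghouses.
KNOWN_HASHES = {
    hashlib.sha256(b"known-bad-file-bytes").hexdigest(),
}

def should_block(file_bytes: bytes) -> bool:
    """Flag an upload whose exact bytes match a known-bad hash."""
    return hashlib.sha256(file_bytes).hexdigest() in KNOWN_HASHES

print(should_block(b"known-bad-file-bytes"))  # True
print(should_block(b"harmless-file-bytes"))   # False
```

Exact hashing is cheap and has no false positives, which is why it is the first filter in most moderation pipelines; its weakness, that any single-byte change defeats it, is what perceptual hashing addresses.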
New Developments
Platforms using end-to-end encryption services may be required to create additional protocols for monitoring the circulation of CSAM. Failure to do so may invite the withdrawal of the 'safe harbor' clause under Section 79 of the IT Act, 2000. This measure ensures that platforms using encryption technology are not inadvertently providing safe havens for those engaged in illegal activities.
NHRC's advisory extends beyond legal and law enforcement measures; it emphasizes the importance of awareness and sensitization at various levels. Schools, colleges, and institutions are called upon to educate students, parents, and teachers about the modus operandi of online child sexual abusers, the vulnerabilities of children on the internet, and the early signs of online child abuse.
To further enhance awareness, a cyber curriculum is proposed to be integrated into the education system. This curriculum will not only boost digital literacy but also educate students about relevant child care legislation, policies, and the legal consequences of violating them.
NHRC recognizes that survivors of CSAM need more than legal measures and prevention strategies. Survivors are recommended to receive support services and opportunities for rehabilitation through various means. Partnerships with civil society and other stakeholders play a vital role in this aspect. Moreover, psycho-social care centers are proposed to be established in every district to facilitate need-based support services and organization of stigma eradication programs.
NHRC's advisory is a resounding call to action, acknowledging the critical importance of protecting children from the perils of CSAM. By addressing legal gaps, strengthening law enforcement, regulating online platforms, and promoting awareness and support, the NHRC aims to create a safer digital environment for children.
Conclusion
In a world where the internet plays an increasingly central role in our lives, these recommendations are not just proactive but imperative. They underscore the collective responsibility of governments, law enforcement agencies, intermediaries, and society as a whole in safeguarding the rights and well-being of children in the digital age.
NHRC's advisory is a pivotal guide to a more secure and child-friendly digital world. By addressing the rising tide of CSAM and emphasizing the need for constant vigilance, NHRC reaffirms the critical role of organizations, governments, and individuals in ensuring cyber peace and child protection in the digital age. The active contribution of cyber resilience organizations such as the CyberPeace Foundation amplifies this collective action in forging a secure digital space, highlighting the pivotal role think tanks play in ensuring cyber peace and resilience.
References:
- https://www.hindustantimes.com/india-news/nhrc-issues-advisory-regarding-child-sexual-abuse-material-on-internet-101698473197792.html
- https://ssrana.in/articles/nhrcs-advisory-proliferation-of-child-sexual-abuse-material-csam/
- https://theprint.in/india/specialised-central-police-unit-use-of-technology-to-proactively-detect-csam-nhrc-advisory/1822223/

Executive Summary
A postcard claiming that Uttar Pradesh Deputy Chief Minister Keshav Prasad Maurya commented on the Supreme Court’s stay on the new UGC regulations is being widely shared on social media. The viral postcard suggests that Maurya stated the Modi government would “fight till its last breath” to implement the UGC law and appealed to Dalit, backward and tribal communities to trust the government as their true well-wisher. However, research by CyberPeace has found that the viral postcard is fake. Keshav Prasad Maurya has not made any such statement.
Claim
A Facebook user shared the postcard with the caption: “Now read it yourself. Statement of Deputy CM Keshav Prasad Maurya — the Modi government will fight till its last breath to implement the UGC law. An appeal to Dalit, backward and tribal communities to trust the government, calling it their true well-wisher.”
(Archived version of the post available here.)

Fact Check:
During our research, we did not find any credible news reports mentioning such a statement by Deputy Chief Minister Keshav Prasad Maurya regarding the UGC regulations or the Supreme Court’s order. A closer examination of the viral postcard revealed several inconsistencies. Notably, the text on the postcard lacks proper punctuation, such as commas and full stops, which is unusual for professionally designed news graphics. The postcard carries the logo of Navbharat Times (NBT); however, when compared with genuine NBT postcards, the font style used in the viral image does not match NBT’s official design. We also traced the original NBT postcard that appears to have been edited to create the fake one. In the authentic postcard, shared by NBT on January 20, Keshav Prasad Maurya is quoted as saying: “Where the lotus has bloomed, it will continue to bloom, and where it has not, under the guidance of PM Modi and the leadership of Nitin Nabin, the lotus will bloom.”

The original statement was digitally altered, and a fabricated quote was inserted to create the viral postcard.
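Tracing an edited graphic back to its source, as done above, can be assisted by perceptual hashing: an edited copy of an image produces a hash very close to the original's, while an unrelated image does not. The toy sketch below implements a difference hash (dHash) over tiny grayscale grids; real use would first load and downscale the images, a step omitted here for brevity.

```python
def dhash(pixels):
    """Difference hash: for each row, compare adjacent grayscale values
    (left > right -> bit 1). `pixels` is a small rows x cols grid."""
    bits = []
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits.append(1 if left > right else 0)
    return bits

def hamming(a, b):
    """Count differing bits; a small distance suggests one image
    is a lightly edited copy of the other."""
    return sum(x != y for x, y in zip(a, b))

original  = [[10, 20, 30], [40, 30, 20]]  # toy 2x3 grayscale grid
edited    = [[10, 20, 30], [40, 35, 20]]  # one pixel value tweaked
unrelated = [[90, 10, 80], [5, 60, 7]]

h0, h1, h2 = dhash(original), dhash(edited), dhash(unrelated)
print(hamming(h0, h1))  # 0 — the edit survives hashing
print(hamming(h0, h2))  # 2 — clearly different content
```

Because dHash records only the sign of adjacent-pixel differences, small localized edits (like a swapped quote on a postcard) leave most bits unchanged, which is what makes the technique useful for matching doctored graphics to their originals.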
Conclusion
CyberPeace research clearly establishes that the viral postcard is fake. The original Navbharat Times postcard has been tampered with, and Keshav Prasad Maurya’s actual statement has been replaced with a fabricated quote, which is now being circulated with a misleading claim.