Navigating the Path to CyberPeace: Insights and Strategies
Featured #factCheck Blogs

Executive Summary
Amid the ongoing tensions between the United States, Israel, and Iran, a video circulating on social media claims that Israeli Prime Minister Benjamin Netanyahu was seen running after Iran launched an attack on Israel. However, research by CyberPeace found the viral claim to be misleading. Our research revealed that the video has no connection with the current tensions between the United States, Israel, and Iran. In reality, the clip dates back to 2021, when Netanyahu was rushing inside Israel’s parliament to cast his vote after arriving late.
Claim:
On the social media platform X (formerly Twitter), a user shared the video on March 5, 2026, claiming that Netanyahu had fled and gone into hiding due to fear of Iran. The post included inflammatory remarks suggesting that Iran had demonstrated its power and that Netanyahu had abandoned his country out of fear.

Fact Check
To verify the authenticity of the video, we extracted several keyframes and conducted a reverse image search on Google. During the research, we found the same video on the official X account of Benjamin Netanyahu, posted on December 14, 2021. In the post, Netanyahu wrote in Hebrew, which translates to, “I am always proud to run for you. Photographed half an hour ago in the Knesset.”
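The keyframe-sampling step described above can be sketched in a few lines of Python. This is a minimal illustration of how frames might be pulled from a clip before running them through a reverse image search; the function names are our own, it is not CyberPeace's actual tooling, and the extraction part assumes OpenCV (`cv2`) is installed locally.

```python
# Sketch: sample one frame every few seconds from a video so the frames
# can be fed to a reverse image search. Hypothetical helper names.

def frame_indices(total_frames: int, fps: float, every_seconds: float = 2.0) -> list[int]:
    """Return evenly spaced frame indices, one every `every_seconds`."""
    step = max(1, int(round(fps * every_seconds)))
    return list(range(0, total_frames, step))

def extract_keyframes(video_path: str, every_seconds: float = 2.0) -> list[str]:
    """Save sampled frames as JPEGs and return their file paths."""
    import cv2  # imported lazily; only needed for actual extraction
    cap = cv2.VideoCapture(video_path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 25.0
    total = int(cap.get(cv2.CAP_PROP_FRAME_COUNT))
    saved = []
    for idx in frame_indices(total, fps, every_seconds):
        cap.set(cv2.CAP_PROP_POS_FRAMES, idx)
        ok, frame = cap.read()
        if not ok:
            continue
        path = f"keyframe_{idx:06d}.jpg"
        cv2.imwrite(path, frame)
        saved.append(path)
    cap.release()
    return saved

# A 30 fps, 10-second clip sampled every 2 seconds yields 5 keyframes:
# frame_indices(300, 30) -> [0, 60, 120, 180, 240]
```

Sampling at a fixed interval is a simple stand-in for true keyframe (scene-change) detection, but it is usually enough to give a reverse image search a few distinctive frames to match.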

Further research also led us to a Hebrew news website where the same video was published.

According to the report, voting in the Knesset (Israel’s parliament) continued throughout the night, and an explosives-related bill was passed by a very narrow margin. At the time, opposition leader Benjamin Netanyahu was in his room inside the Knesset building. When he was called for the vote, he hurried through the parliament corridors to reach the chamber in time to cast his vote.
Conclusion:
Our research found that the viral video is unrelated to the ongoing tensions involving the United States, Israel, and Iran. The footage is from 2021 and shows Benjamin Netanyahu rushing inside the Knesset to participate in a parliamentary vote after being called in at the last moment.

Executive Summary
A 57-second video featuring India’s Chief of Army Staff Upendra Dwivedi is widely circulating on social media. The clip is being shared with the claim that the Army chief admitted India had “betrayed” Iran by providing the location of an Iranian naval ship to Israel, allegedly leading to its destruction. The video is spreading amid heightened tensions in West Asia involving the United States, Israel, and Iran. According to posts sharing the claim, the Iranian naval vessel IRIS Dena, which had participated in a naval event in Visakhapatnam and was returning to Iran with around 130 personnel onboard, was torpedoed by a US submarine near the southern coast of Sri Lanka on March 4 while sailing in the Indian Ocean.
In the viral clip, the speaker—presented as the Indian Army chief—appears to say that India informed Israel about the exact location of the Iranian ship after it left Indian waters, describing Israel as a strategic ally and suggesting that the attack occurred in international waters. The clip also claims that India had no direct involvement in the alleged joint US-Israel torpedo strike.
However, research conducted by CyberPeace found the claim to be false. Our research shows that the video does not contain a genuine statement from Army Chief Upendra Dwivedi and is in fact a manipulated clip.
Claim
On X (formerly Twitter), a page named GPX (@GPX_Press) shared the video on March 9 with the caption: “India confesses it BETRAYED Iran by leaking the location of an Iranian ship to Israel, leading to its total destruction!”

Fact Check
During the verification process, researchers noticed a ticker in the viral video reading “Raisina Dialogue 2026 × Firstpost.” Using this clue, we conducted a keyword search on YouTube and located a video uploaded by Firstpost on March 7 titled “India’s Army Chief Speaks on Op Sindoor, Pakistan and Future of Warfare | Raisina Dialogue 2026.”
In the 21-minute interview, Army Chief Upendra Dwivedi is seen speaking with strategic affairs expert Harsh V. Pant. According to the video description, the discussion focuses on lessons from Operation Sindoor and the evolving nature of modern warfare.

The viral clip appears to be taken from this interview. However, throughout the conversation, Dwivedi does not mention any conflict involving the United States, Israel, and Iran, nor does he refer to the sinking of an Iranian naval ship in the Indian Ocean. This indicates that the circulating clip has been edited and misrepresented to create a misleading narrative.
For additional verification, the viral video was analyzed using the AI detection tool Hive Moderation. The results suggested a 99.9% probability that the speech in the clip was generated using AI, indicating manipulation of the original footage.

Conclusion
The research makes it clear that the viral video does not reflect an authentic statement by India’s Army Chief Upendra Dwivedi. The clip has been altered and the audio appears to be AI-generated. In other words, the circulating video is a deepfake being shared with a misleading claim.
Executive Summary:
A video showing a car catching fire is rapidly going viral on social media. In the clip, a family can be seen bursting firecrackers in front of a newly purchased car. Moments later, the vehicle also appears to catch fire. The video is being shared with the claim that the family was celebrating the purchase of a new car with fireworks, which accidentally led to the vehicle going up in flames. Many users are circulating the clip as footage of a real incident. However, an investigation by CyberPeace found that the video is not from a real-life event but was created using Artificial Intelligence (AI).
Claim
On February 25, 2026, an X user named “Mamta Rajgarh” shared the viral video with the caption: “This was supposed to be a grand celebration for buying a new car, but it turned into a ceremony of burning the car. What do you say? Comment below.”
- Post link: https://x.com/rajgarh_mamta1/status/2026696175311786408?s=20
- Archived link: https://perma.cc/22AA-KBS4

Fact Check:
To verify the claim, we conducted a keyword search on Google but found no credible news reports supporting the alleged incident. Upon closely examining the video, we noticed several technical inconsistencies. The car’s number plate is unclear, a common flaw often seen in AI-generated content. Additionally, the sequence of events appears unnatural — the firecrackers seem to extinguish first, and only after a delay does the car suddenly catch fire. These irregularities raised suspicion that the video may have been artificially generated. To further verify, we analyzed the clip using AI detection tools. Hive Moderation indicated a 98.7 percent likelihood that the video was generated using Artificial Intelligence.

Another AI detection tool, Undetectable.ai, suggested a 77 percent probability that the video was AI-generated.
Conclusion
Our research confirms that the viral video does not depict a real incident. It has been created using Artificial Intelligence and is being misleadingly shared as genuine footage.

Executive Summary:
A video circulating on social media shows a group of people tearing Congress posters and raising controversial slogans. The clip is being shared with the claim that the individuals seen in the video are workers of the Congress party who were protesting against Rahul Gandhi and raising slogans against him. However, research by CyberPeace found the viral claim to be misleading. Our research revealed that the video dates back to February 21, 2026. On that day, members of the Bharatiya Janata Yuva Morcha (BJYM) staged a protest outside a Congress office. During the demonstration, they raised slogans and tore Congress posters. The same video is now being circulated with a false narrative.
Claim
On February 24, 2026, a Facebook user shared the viral video with the caption: “Rebellion against Rahul Gandhi in Congress’ own stronghold! Party workers themselves tore posters and raised slogans — ‘Rahul Gandhi is a thief… a thief!’ This video exposes the internal truth of Congress. Congress itself is Muslim League.”

Fact Check
To verify the claim, we extracted key frames from the viral video and conducted a reverse image search using Google Lens. During the search, we found the same video uploaded on YouTube on February 21, 2026.
According to the description accompanying the video, BJP workers had staged a protest outside a Congress building. The report mentioned vandalism and stone-pelting during the protest, resulting in injuries to several individuals.
- https://www.youtube.com/watch?v=pW-13mSvJ2c
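Reverse image search engines match frames on perceptual similarity rather than exact bytes, which is why a recompressed or re-uploaded copy of the same footage can still be traced back to its source. As a toy illustration of the underlying idea (a simplified sketch, not the algorithm Google Lens actually uses), here is a pure-Python "average hash": two copies of the same frame produce hashes only a small Hamming distance apart, while unrelated frames differ in many bits.

```python
# Toy perceptual hashing: each bit records whether a pixel in a small
# downscaled grayscale grid is brighter than the grid's mean brightness.

def average_hash(pixels: list[list[int]]) -> int:
    """Hash a small grayscale grid (e.g. an 8x8 downscale of a keyframe)."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(h1: int, h2: int) -> int:
    """Count differing bits; a small distance means visually similar frames."""
    return bin(h1 ^ h2).count("1")

# Identical grids hash identically (distance 0); an inverted grid flips
# every bit, giving the maximum possible distance for the grid size.
checker = [[0, 255], [255, 0]]
inverted = [[255, 0], [0, 255]]
```

In practice, production systems downscale frames to a fixed grid (commonly 8x8 or larger) before hashing, so that cropping artifacts, compression noise, and small overlays barely change the hash.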

Using this lead, we conducted a keyword search on Google and found a report published on February 21, 2026, by the Hindi news website Raj Express. The visuals in the report closely matched those seen in the viral clip.

According to the report, the protest in Bhopal was organized by the Bharatiya Janata Yuva Morcha in response to a T-shirt protest staged by the Youth Congress during an AI Summit held at Bharat Mandapam in New Delhi. The situation escalated when protesters marched toward the state Congress office in Shivaji Nagar. Police attempted to disperse the crowd using water cannons, but some protesters reportedly entered the Congress office premises, leading to tension.
Further, we found the same viral video on the official Facebook page of Indian National Congress - Madhya Pradesh, where it was posted on February 26, 2026. In the post, the Congress unit alleged that BJYM workers and BJP-affiliated individuals had entered the Congress office, vandalized property, and created chaos in the presence of police officials.

Conclusion
Our research found that the viral claim is misleading. The video is from February 21, 2026, when BJYM workers protested outside a Congress office and engaged in vandalism. The footage is now being falsely shared as evidence of an internal rebellion by Congress workers against Rahul Gandhi.

Executive Summary:
Following India’s heavy defeat to South Africa in the T20 World Cup 2026, the team has been facing intense trolling on social media. Amid this backdrop, a video of Indian cricket team head coach Gautam Gambhir has gone viral. In the clip, Gambhir can be heard saying, “Even people who have nothing to do with cricket have made comments. An IPL owner also wrote about split coaching. It’s surprising. People must stay in their own domain. If we don’t interfere in someone else’s domain, they have no right to interfere in ours.” The video is being shared with the claim that Gambhir made these remarks recently in response to trolling after India’s loss to South Africa in the T20 World Cup 2026. However, research by CyberPeace found the claim to be misleading. The viral video is not related to the T20 World Cup 2026. It is from December 2025 and pertains to India’s Test series defeat against South Africa. An old video is being circulated with a misleading context.
Claim
An Instagram user, ‘rns_news200’, shared the viral video on February 23, 2026, claiming that after the loss to South Africa, head coach Gautam Gambhir issued a stern warning to Indian fans. The caption stated that Suryakumar Yadav was heavily trolled on social media after the match, and Gambhir responded strongly, saying players should not be unfairly targeted and the team deserves support, especially during difficult times.

Fact Check
To verify the claim, we conducted a keyword search on Google. We found the same video on the official X (formerly Twitter) account of sports journalist Vikrant Gupta. The video was posted on December 7, 2025. According to the caption, Gambhir was expressing dissatisfaction following India’s performance.

We also found the longer version of the video on the official website of the Board of Control for Cricket in India (BCCI), where it was published on December 6, 2025. In the full video, Gambhir is clearly seen speaking about India’s defeat to South Africa in a Test match. The specific segment that went viral appears around the 1 minute 58 second mark.

Conclusion
Our research found that the viral claim about Gautam Gambhir’s video being linked to trolling after the T20 World Cup 2026 is misleading. The clip is from December 2025 and relates to India’s Test series defeat against South Africa, not the T20 World Cup 2026. An old video is being reshared with a false and misleading context.

Executive Summary:
Amid escalating tensions between Afghanistan and Pakistan, a video is being widely shared on social media claiming that Afghanistan has shot down a Pakistani fighter jet. The posts further allege that the incident marks the formal beginning of a war between the two countries. However, research conducted by CyberPeace found the viral claim to be false: the circulating video is not authentic but AI-generated.
Claim
On February 24, 2026, a user on X (formerly Twitter) shared the viral video with the caption: “Afghanistan has shot down a Pakistani fighter jet! Afghanistan announces that war with Pakistan has begun.”
- Original post link: https://x.com/JyotiDevSpeaks/status/2026348257186545914
- Archived link: https://ghostarchive.org/archive/7l00Y

Fact Check:
A careful review of the viral video revealed unusual visual patterns and artificial-looking effects, raising suspicions that it may have been created using artificial intelligence. We analyzed the video using the AI detection tool Hive Moderation, which indicated an 86 percent probability that the video was AI-generated.

To further verify the findings, we scanned the footage using another AI detection platform, Sightengine. The results showed a 99 percent likelihood that the video was AI-generated.

To understand the broader context of the ongoing tensions, we conducted a keyword search and found a report published on February 22, 2026, by BBC Hindi. According to the report, Pakistan claimed it had targeted “seven terrorist hideouts and camps” along the Pakistan–Afghanistan border based on intelligence inputs. Meanwhile, a spokesperson for the Taliban government in Afghanistan stated that Pakistani airstrikes in Nangarhar and Paktika provinces resulted in the deaths of dozens of people, including women and children.
- https://www.bbc.com/hindi/articles/clyz8141397o
Conclusion
Our research confirms that the viral video claiming Afghanistan shot down a Pakistani fighter jet and formally declared war on Pakistan is fake. The footage is AI-generated and is being circulated with a false and misleading narrative.

Executive Summary:
A photo circulating on social media shows a stage with the words “Hindu Sammelan” (Hindu Conference) written in large letters. In front of the stage, rows of chairs appear largely empty, with only a few people seated while most seats remain vacant.
Users sharing the image claim that the event, held under the banner of a “Hindu Sammelan,” was in fact a “Brahmin Sammelan,” and that indigenous communities chose to stay away, resulting in poor attendance.
It is noteworthy that, on the occasion of the centenary year of the Rashtriya Swayamsevak Sangh (RSS), various “Hindu Sammelan” events are being organized across the country. The viral image is being linked to this broader context.
However, research conducted by CyberPeace found the viral claim to be false. Our research revealed that the image being shared on social media is not authentic but AI-generated and is being circulated with a misleading narrative.
Claim
On February 21, 2026, a Facebook user shared the viral image. The original and archived links are provided below:
- https://www.facebook.com/photo?fbid=935049042540479&set=gm.2425972001215469&idorvanity=465387370607285
- https://ghostarchive.org/archive/sxC6d

Fact Check:
A keyword search on Google confirmed that several “Hindu Sammelan” events have indeed been organized across the country as part of the RSS centenary year. For instance, media reports have covered such events in different cities, including Nagpur.

However, upon closely examining the viral image, we observed certain visual inconsistencies and unnatural elements that raised suspicion of AI generation. We first analyzed the image using the AI detection tool Hive Moderation, which indicated a 79.3 percent probability that the image was AI-generated.

To further verify, we scanned the image using another AI detection platform, Sightengine. The results showed a 97 percent likelihood that the image was AI-generated.

Conclusion
Our research confirms that the image circulating on social media is not genuine. It has been artificially created using AI technology and is being shared with a misleading claim.

Executive Summary
A video circulating on social media shows an electric car allegedly being powered by a portable generator attached to it. The clip is being shared with the claim that the generator is directly running the vehicle, suggesting a groundbreaking or unusual technological feat. However, research conducted by CyberPeace found the viral claim to be false. Our research revealed that the video is not authentic but AI-generated.
Claim
On February 22, 2026, a user on X (formerly Twitter) shared the viral video with the caption: “After watching this video, Newton might turn in his grave.” The post implied that the video demonstrates a scientific impossibility.

Fact Check:
To verify the claim, we conducted a keyword search on Google. However, we found no credible reports from any reputable media organization supporting the assertion made in the viral post. A close examination of the video revealed several visual inconsistencies and unnatural elements, raising suspicion that the footage may have been generated using artificial intelligence. We then analyzed the video using the AI detection tool Hive Moderation. The results indicated a 96 percent probability that the video was AI-generated.

In the next step of our research, we scanned the video using another AI detection platform, WasItAI, which also concluded that the viral video was AI-generated.

Conclusion
Our research confirms that the viral video is not real. It has been artificially created using AI technology and is being circulated with a misleading claim.

Executive Summary
A photo is going viral on social media showing a young man dressed in traditional Arab attire warmly embracing an elderly woman. The post claims that the man flew in from Saudi Arabia to Kerala just to meet his “Hindu mother,” portraying the image as a heartwarming example of communal harmony. However, research by CyberPeace found that the claim being shared with the image is misleading.
Claim
The viral post narrates an emotional story, alleging that years ago a Hindu woman from Kerala worked in Saudi Arabia caring for children and loved a young boy like her own son. After she returned to India, the boy—now grown up—reportedly searched for her for months, booked a flight, and finally reached Kerala to reunite with her. The post describes an emotional reunion filled with tears, affection, and a bond beyond religion and nationality.

Fact Check
A reverse image search of the viral picture led us to a video uploaded on August 18, 2023, on the YouTube channel of social media influencer Hashim Abbas. In the video, he is seen meeting and hugging the elderly woman while extending Onam greetings.

Further examination of Hashim Abbas’ social media accounts revealed several other videos from his Kerala visit. Our research also found that Abbas played a significant role in the Malayalam film Kondotty Pooram.

Additionally, we found a video posted on August 13, 2023, by actress and theatre artist Sandhya Rajendran, daughter of veteran Malayalam actress Vijayakumari. The video shows Vijayakumari teaching Onam songs to Hashim Abbas.

Conclusion
The evidence clearly establishes that the viral claim is misleading. The man seen in the image is Hashim Abbas, who was meeting senior Malayalam actress Vijayakumari to extend Onam greetings. The emotional story about a son flying from Saudi Arabia to reunite with his Hindu mother is fictional and not connected to the viral image.

Executive Summary
The film ‘Yadav Ji Ki Love Story’, scheduled to release on February 27, has become embroiled in controversy over its title. Several organizations have expressed objections, registering their displeasure regarding the name of the film. Amid the row, a video is being widely circulated on social media. The footage shows a large crowd holding banners and posters while staging a protest. Users sharing the clip claim that it is from South India, where members of the Yadav community have allegedly launched a large-scale agitation against the film. However, research conducted by CyberPeace found the viral claim to be false. Our research revealed that the video is not authentic but AI-generated, and is being shared with a misleading narrative.
Claim
On February 22, 2026, a Facebook user shared the viral video claiming it depicts protests by the Yadav community in South India against the film. The original and archived links to the post are provided below:

Fact Check:
Upon closely examining the viral video, we noticed several anomalies in the visuals, crowd movements, and certain frames. The unnatural patterns and inconsistencies raised suspicions that the footage may have been generated using artificial intelligence. To verify this, we analyzed the video using the AI detection tool Aurigin AI, which indicated that the footage was AI-generated.

We further scanned the clip using another AI detection platform, Hive Moderation. The results showed a 99 percent probability that the video was AI-generated.

Conclusion
Our findings confirm that the viral video is not real. It has been artificially created using AI technology and is being circulated with a false and misleading claim.

Executive Summary
A video circulating on social media shows a lion carrying away a woman who was washing clothes near a pond. Users are sharing the clip claiming it depicts a real incident. However, research by CyberPeace found the viral claim to be false. The research revealed that the video is not real but AI-generated.
Claim
A user on Facebook shared the viral video claiming that a lion attacked and carried away a woman from a pond while she was washing clothes. The link to the post and its archived version are provided below:

Fact Check:
Upon closely examining the viral clip, we noticed several visual inconsistencies that raised suspicion about its authenticity. The video was then analyzed using the AI-detection tool Sightengine. According to the analysis results, the viral video was identified as AI-generated.

Conclusion
The research confirms that the viral video does not depict a real incident. The clip was digitally created using artificial intelligence and is being falsely shared as a genuine event.

Executive Summary
A video circulating on social media shows a woman using abusive language in front of a camera. Users sharing the clip claim that the woman is a professor at Galgotias University and that the video exposes her alleged reality. However, an investigation by CyberPeace found the claim to be misleading. The probe revealed that the woman seen in the viral video has no connection with Galgotias University and is not a professor there. Fact-checking further showed that the video is not recent but around seven years old. The woman featured in the clip was identified as Shubhrastha, who is a political strategist by profession.
Claim:
A user on X (formerly Twitter) shared the viral video on February 18, 2026, claiming: “A ‘class in abuse studies’ at Galgotias University? An obscene video of a professor teaching ethics has gone viral. Another shameful chapter has been added to the list of controversies surrounding Galgotias University.” The post further alleged that after falsely claiming a Chinese robot as its own, the university’s “Culture and Ethics” faculty member was seen publicly using abusive language in the viral clip. The post link and its archived version are provided below:

Fact Check:
To verify the authenticity of the viral claim, we extracted key frames from the video and conducted a reverse image search using Google Lens. During the research, we found the same video uploaded on the Indian Spectator’s YouTube channel on June 9, 2018.

The video was also found on another YouTube channel, where it had been uploaded on June 12, 2018.

Conclusion
The research clearly establishes that the woman seen in the viral video has no association with Galgotias University and is not a professor there. The clip is also not recent but approximately seven years old. The woman in the video was identified as Shubhrastha, a political strategist.
Executive Summary
The U.S. Department of Justice recently released nearly three million pages of documents, along with thousands of videos and photographs, related to its investigation into convicted offender Jeffrey Epstein. Meanwhile, a video showing a massive crowd protesting on a street is going viral on social media. The video, which had earlier circulated with false claims linking it to anti-government protests in Iran, is now being shared by several users who claim that the protest took place in the United States after the release of the Epstein files. Research by CyberPeace found the viral claim to be false. The video being linked to protests in the United States following the release of the Epstein files is not real and was generated using artificial intelligence (AI).
Claim:
An Instagram user uploaded the viral video on February 9, 2026, with the caption: “After Epstein files released in America. All eyes on America.”
- https://www.instagram.com/reel/DUjLe-XE5lA
- https://ghostarchive.org/archive/tkP6W

Fact Check:
To verify the claim, we first conducted a reverse search of the viral video using Google Lens. The same video was found posted on January 10, 2026, by an Instagram account named “elnaz555,” where it was shared in the context of recent protests in Iran. The post also mentioned that the video was created using AI.

Based on this lead, we further analyzed a higher-quality version of the viral video using Hive Moderation, a tool used to detect AI-generated images and videos. The analysis indicated a 97.9% probability that the video was generated using artificial intelligence. The research clearly shows that the video is not authentic and has been falsely linked to protests in the United States after the release of the Epstein files.

Conclusion:
The claim circulating on social media is false. The viral video allegedly showing protests in the United States following the release of the Epstein files is AI-generated and not related to any real event.

Executive Summary
A video is going viral on social media showing a woman performing a pre-wedding ritual called “Roka” for a couple at a metro station. Many users are sharing the clip believing it to be a real incident. CyberPeace found in its research that the viral claim is false. The video is actually scripted.
Claim:
An Instagram user posted the video on February 7, 2026, with the caption, “A mother performed her son’s Roka with his girlfriend at a metro station.”

Fact Check:
To verify the claim, we conducted a reverse image search using Google Lens on screenshots from the viral video. We found that the same video was first uploaded on February 5, 2026, by an Instagram account named “chalte_phirte098.” The profile belongs to digital content creator Aarav Mavi, who regularly posts relationship and breakup-related videos.

Although the viral clip does not include any disclaimer stating that it is scripted, an older video posted by the creator on December 16, 2025, clarifies that his content is based on real-life stories shared by people but is filmed using professional actors. Several similar staged videos are also available on his profile on Instagram.

Conclusion:
Our research clearly shows that the viral video claiming to show a pre-wedding Roka ceremony at a metro station is not real. It was created by a content creator for entertainment purposes. Therefore, the claim circulating on social media is misleading.