#FactCheck: Viral Video Claims Pakistan Shot Down an Indian Air Force MiG-29 Fighter Jet
Executive Summary
Recent claims circulating on social media allege that an Indian Air Force MiG-29 fighter jet was shot down by Pakistani forces during "Operation Sindoor." These reports suggest the incident involved a jet crash attributed to hostile action. However, these assertions have been officially refuted. No credible evidence supports the existence of such an operation or the downing of an Indian aircraft as described. The Indian Air Force has not confirmed any such event, and the claim appears to be misinformation.

Claim
A social media rumor has been circulating, suggesting that an Indian Air Force MiG-29 fighter jet was shot down by Pakistani forces during "Operation Sindoor." The claim is accompanied by images purported to show the wreckage of the aircraft.

Fact Check
The social media posts have falsely claimed that the Pakistani Air Force shot down an Indian Air Force MiG-29 during "Operation Sindoor." This claim has been confirmed to be untrue. The image being circulated is not related to any recent IAF operations and has been previously used in unrelated contexts. The content being shared is misleading and does not reflect any verified incident involving the Indian Air Force.

We extracted key frames from the video and ran reverse image searches, which traced the footage to a post first published in 2024 and covered in news articles from The Hindu and The Times of India.
A MiG-29 fighter jet of the Indian Air Force (IAF), engaged in a routine training mission, crashed near Barmer, Rajasthan, on the evening of Monday, September 2, 2024. The pilot ejected safely and escaped unscathed. The footage therefore predates "Operation Sindoor" and has been repurposed to spread misinformation.
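The keyframe-extraction step used in this verification can be sketched conceptually: one simple approach is to measure the pixel-level change between frames and keep those that differ sharply from the last kept frame. The snippet below is a minimal, self-contained Python illustration on synthetic grayscale "frames"; the frame data and threshold are hypothetical, and a real workflow would read actual video frames with a library such as OpenCV.

```python
def frame_diff(a, b):
    # mean absolute pixel difference between two grayscale frames
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

def extract_keyframes(frames, threshold=30):
    # keep the first frame, then any frame that differs sharply
    # from the most recently kept frame
    if not frames:
        return []
    keyframes = [0]
    for i in range(1, len(frames)):
        if frame_diff(frames[keyframes[-1]], frames[i]) > threshold:
            keyframes.append(i)
    return keyframes

# synthetic "video": three nearly identical frames, then a scene change
frames = [
    [10, 10, 10, 10],
    [12, 11, 10, 9],
    [11, 10, 12, 10],
    [200, 200, 200, 200],  # abrupt change -> new keyframe
]
print(extract_keyframes(frames))  # [0, 3]
```

Only the selected keyframes then need to be submitted to a reverse image search, which keeps the verification workload small.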

Conclusion
The claims regarding the downing of an Indian Air Force MiG-29 during "Operation Sindoor" are unfounded and lack any credible verification. The image being circulated is outdated and unrelated to current IAF operations. There has been no official confirmation of such an incident, and the narrative appears to be misleading. People are advised to rely on verified sources for accurate information regarding defence matters.
- Claim: Pakistan shot down an Indian fighter jet, MiG-29
- Claimed On: Social Media
- Fact Check: False and Misleading

Executive Summary
Amid reports of a two-week ceasefire announced on April 8, 2026, between the United States and Iran, a video showing a sudden explosion inside a building has gone viral on social media. The clip shows a fire brigade vehicle stationed outside a structure, with people entering the premises moments before a blast occurs. Social media users are sharing the video with claims that Iran carried out an attack on Israeli Defence Minister Yoav Gallant, alleging that the building shown is linked to Israel’s defence ministry.
However, research by CyberPeace has found the claim to be false. The viral video is not recent and has no connection to Israel or any ongoing conflict.
Claim
A Facebook user shared the video on April 3, 2026, claiming that Iran had attacked Israeli Defence Minister Yoav Gallant and severely damaged a building associated with him.

Fact Check
To verify the claim, we extracted keyframes from the viral video and conducted a reverse image search. This led us to the same video posted on an X account named Fernanda Melchionna on December 31, 2025.
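Reverse image matching of the kind used in this step typically relies on perceptual hashing: an image is reduced to a small fingerprint so that near-duplicates (re-encoded or lightly edited copies) produce similar hashes. Below is a minimal difference-hash (dHash) sketch on toy grayscale grids; the pixel values are hypothetical, and real pipelines would first downscale actual images (e.g., with Pillow or the `imagehash` library) before hashing.

```python
def dhash_bits(grid):
    # difference hash: for each row, record whether each pixel is
    # brighter than its right-hand neighbour
    return [int(row[i] > row[i + 1])
            for row in grid
            for i in range(len(row) - 1)]

def hamming(bits_a, bits_b):
    # number of differing bits between two fingerprints
    return sum(a != b for a, b in zip(bits_a, bits_b))

# toy 3x3 grayscale grids: an "original" frame and a re-encoded copy
original = [[10, 20, 30], [40, 30, 20], [5, 50, 5]]
reupload = [[12, 21, 29], [41, 28, 22], [6, 52, 4]]  # slight pixel noise

h1, h2 = dhash_bits(original), dhash_bits(reupload)
print(hamming(h1, h2))  # 0: fingerprints match despite the noise
```

A small Hamming distance between fingerprints flags two images as likely copies, which is how a keyframe from a viral clip can be matched against earlier postings of the same footage.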

According to the available information, the video is from Santana do Livramento, where a major fire broke out in a supermarket. Further keyword searches led us to a report published on December 31, 2025, by the Brazilian news website GZH (gaúcha zh clicrbs). The report stated that a fire had erupted in a supermarket in Santana do Livramento, and firefighters had reached the spot to control the blaze. During the operation, an explosion occurred, leaving around 17 people injured. The injured were later taken to a hospital.

We also found the same video uploaded on the YouTube channel Terra Brasil on January 1, 2026, further confirming its origin and timeline.

Conclusion
The viral claim is false and misleading. The explosion video being shared as an attack on Israeli Defence Minister Yoav Gallant is unrelated to the ongoing Middle East situation. The footage is actually from December 2025 and shows an incident in Brazil, where a fire in a supermarket led to an explosion during firefighting operations. There is no evidence to suggest any such attack took place in Israel. The video has been taken out of context and circulated with a fabricated narrative to mislead users and exploit geopolitical tensions.

Introduction
Prebunking is a technique that shifts the focus from directly challenging falsehoods or telling people what to believe to understanding how people are manipulated and misled online in the first place. It is a growing field of research that aims to help people resist persuasion by misinformation. Prebunking, or "attitudinal inoculation," teaches people to spot and resist manipulative messages before they encounter them. The crux of the approach is to take a step back and nip the problem in the bud by deepening our understanding of it, instead of designing redressal mechanisms to tackle it after the fact. It has been shown to be effective in helping a wide range of people build resilience to misleading information.
Prebunking is a psychological strategy for countering the effects of misinformation: it assists individuals in identifying and resisting deceptive content, thereby increasing resilience against future misinformation. Online manipulation is a complex issue, and multiple approaches are needed to curb its worst effects. Prebunking offers a way to get ahead of online manipulation, adding a layer of protection before individuals encounter malicious content, and helps them discern and refute misleading arguments across a variety of online manipulations.
Prebunking builds mental defenses against misinformation by providing warnings and counterarguments before people encounter malicious content. Inoculating people against false or misleading information is a powerful method for building trust and understanding along with a personal capacity for discernment and fact-checking. It teaches people to separate facts from myths by encouraging them to think in terms of 'how you know what you know' and consensus-building. Prebunking uses examples and case studies to explain the types and risks of misinformation so that individuals can apply these learnings to reject false claims and manipulation in the future.
How Prebunking Helps Individuals Spot Manipulative Messages
Prebunking helps individuals identify manipulative messages by equipping them with the tools and knowledge to recognize common techniques used to spread misinformation. Successful prebunking strategies include:
- Warnings: alerting individuals in advance that they may encounter manipulative content.
- Preemptive Refutation: explaining the narrative or technique and how particular information is manipulative in structure. Inoculation messages typically include two to three counterarguments and their refutations. An effective rebuttal equips the viewer with skills to counter erroneous or misleading information they may encounter in the future.
- Micro-dosing: presenting a weakened, innocuous example of misinformation for practice.
All of these alert individuals to potential manipulation attempts. By offering weakened examples of misinformation, prebunking lets individuals practice identifying deceptive content and activates mental defenses against persuasion attempts. Misinformation exploits cognitive biases: people tend to put faith in things they have heard repeatedly, a tendency malicious actors manipulate by flooding the Internet with their claims to create familiarity and lend them legitimacy. Prebunking thus builds resilience against misinformation and protects our minds from its harmful effects.
Prebunking essentially helps people control the information they consume by teaching them how to discern between accurate and deceptive content. It enables one to develop critical thinking skills, evaluate sources adequately and identify red flags. By incorporating these components and strategies, prebunking enhances the ability to spot manipulative messages, resist deceptive narratives, and make informed decisions when navigating the very dynamic and complex information landscape online.
CyberPeace Policy Recommendations
- Preventing and fighting misinformation necessitates joint efforts between different stakeholders. The government and policymakers should sponsor prebunking initiatives and information literacy programmes to counter misinformation and adopt systematic approaches. Regulatory frameworks should encourage accountability in the dissemination of online information on various platforms. Collaboration with educational institutions, technological companies and civil society organisations can assist in the implementation of prebunking techniques in a variety of areas.
- Higher education institutions should support prebunking and media literacy, offer professional development opportunities for educators and scholars, and work with academics and professionals in the field to produce research on the grey areas and challenges associated with misinformation.
- Technological companies and social media platforms should improve algorithm transparency, create user-friendly tools and resources, and work with fact-checking organisations to incorporate fact-check labels and tools.
- Civil society organisations and NGOs should promote digital literacy campaigns to spread awareness on misinformation and teach prebunking strategies and critical information evaluation. Training programmes should be available to help people recognise and resist deceptive information using prebunking tactics. Advocacy efforts should support legislation or guidelines that support and encourage prebunking efforts and promote media literacy as a basic skill in the digital landscape.
- Media outlets and journalists, across print and social media, should uphold high journalistic standards and engage in fact-checking activities to ensure information accuracy before publication. Collaboration with prebunking professionals, cyber security experts, researchers and advocacy analysts can result in instructional content and initiatives that promote media literacy, prebunking strategies and misinformation awareness.
Final Words
The World Economic Forum's Global Risks Report 2024 identifies misinformation and disinformation as the most significant risks of the next two years. Misinformation and disinformation are rampant in today's digital-first reality, and the ever-growing popularity of social media will only compound the challenge. It is imperative for all netizens and stakeholders to adopt proactive approaches to counter the growing problem of misinformation. Prebunking is a powerful tool in this regard because it aims at 'protection through prevention' instead of limiting the strategy to harm reduction and redressal. We can draw parallels with vaccination or inoculation, which reduces the probability of infection: prebunking exposes us to a weakened form of misinformation and provides ways to identify it, reducing the chance that false information takes root in our psyches.
The most compelling attribute of this approach is that the focus is not only on preventing damage but also creating widespread ownership and citizen participation in the problem-solving process. Every empowered individual creates an additional layer of protection against the scourge of misinformation, not only making safer choices for themselves but also lowering the risk of spreading false claims to others.
References
- [1] https://www3.weforum.org/docs/WEF_The_Global_Risks_Report_2024.pdf
- [2] https://prebunking.withgoogle.com/docs/A_Practical_Guide_to_Prebunking_Misinformation.pdf
- [3] https://ijoc.org/index.php/ijoc/article/viewFile/17634/3565

Introduction
Misinformation is rampant worldwide and affects people at large. In 2023, UNESCO commissioned a survey, conducted by Ipsos, on the impact of fake news. The survey covered 16 countries due to hold national elections in 2024, representing a total of 2.5 billion voters, and found that 85% of people are apprehensive about the repercussions of online disinformation or misinformation, underscoring how pressing the need for effective regulation has become. In light of these worries, UNESCO has introduced a plan to regulate social media platforms, which have become major sources of misinformation and hate speech online. This action plan is supported by the worldwide opinion survey, highlighting the urgent need for strong action. It outlines the fundamental principles that must be respected and concrete measures to be implemented by all stakeholders involved, i.e., governments, regulators, civil society and the platforms themselves.
The Key Areas in Focus of the Action Plan
The action plan focuses on the protection of freedom of expression, alongside access to information and other human rights, in digital platform governance. It works on the basic premise that the impact on human rights should be the compass for all decision-making, at every stage and by every stakeholder. Groups of independent regulators are to work in close coordination as part of a wider network, to prevent digital companies from exploiting disparities between national regulations. Content moderation must be feasible and effective at the required scale, in all regions and all languages.
The algorithms of these online platforms, particularly social media platforms, are too often geared towards maximizing engagement rather than the reliability of information. Platforms are required to take more initiative to educate and train users to be critical thinkers rather than passive consumers. Regulators and platforms are also expected to take strong measures during particularly sensitive periods, from elections to crises, when information overload is at its peak.
Key Principles of the Action Plan
- Human Rights Due Diligence: Platforms are required to assess their impact on human rights, including gender and cultural dimensions, and to implement risk mitigation measures. This would ensure that the platforms are responsible for educating users about their rights.
- Adherence to International Human Rights Standards: Platforms must align their design, content moderation, and curation with international human rights standards. This includes ensuring non-discrimination, supporting cultural diversity, and protecting human moderators.
- Transparency and Openness: Platforms are expected to operate transparently, with clear, understandable, and auditable policies. This includes being open about the tools and algorithms used for content moderation and the results they produce.
- User Access to Information: Platforms should provide accessible information that enables users to make informed decisions.
- Accountability: Platforms must be accountable to their stakeholders which would include the users and the public, which would ensure that redressal for content-related decisions is not compromised. This accountability extends to the implementation of their terms of service and content policies.
Enabling Environment for the application of the UNESCO Plan
The UNESCO Action Plan to counter misinformation seeks to create an environment where freedom of expression and access to information flourish, while ensuring safety and security for digital platform users and non-users alike. This endeavour calls for collective action: societies as a whole must work together, and all relevant stakeholders, from vulnerable groups to journalists and artists, have a role in upholding the right to expression.
Conclusion
The UNESCO Action Plan is a response to the dilemma created by information overload, particularly because the distinction between information and misinformation has become so clouded. The Ipsos survey revealed the urgency of addressing these challenges, with most users fearing the repercussions of misinformation.
The UNESCO action plan provides a comprehensive framework that prioritises the protection of human rights, particularly freedom of expression, alongside transparency, accountability, and education in the governance of digital platforms. By advocating for independent regulators and encouraging platforms to align with international human rights standards, UNESCO is setting the stage for a more responsible and ethical digital ecosystem.
The recommendations include integrating regulators through collaboration and promoting global cooperation to harmonize regulations, expanding digital literacy campaigns to educate users about misinformation risks and online rights, ensuring inclusive access to diverse content in multiple languages and contexts, and monitoring and refining technological advancements and regulatory strategies as challenges evolve, ultimately promoting a trustworthy online information landscape.
References
- https://www.unesco.org/en/articles/online-disinformation-unesco-unveils-action-plan-regulate-social-media-platforms
- https://www.unesco.org/sites/default/files/medias/fichiers/2023/11/unesco_ipsos_survey.pdf
- https://dig.watch/updates/unesco-sets-out-strategy-to-tackle-misinformation-after-ipsos-survey