#FactCheck - Viral image of a bridge claimed to be in Mumbai is actually located in Qingdao, China
Executive Summary:
The photograph of a bridge, allegedly in Mumbai, India, that circulated on social media has been found to be falsely attributed. Through reverse image searches, examination of similar videos, and comparison with reputable news sources and Google Images, we found that the bridge in the viral photo is the Qingdao Jiaozhou Bay Bridge, located in Qingdao, China. Multiple pieces of evidence, including matching architectural features and corroborating videos, show that the bridge is not in Mumbai. No credible reports or sources attest to the existence of a similar bridge in Mumbai.

Claims:
Social media users claim a viral image of the bridge is from Mumbai.



Fact Check:
Once the image was received, we ran a reverse image search to find any leads or related information. We found an image published by the Mirror news media outlet; while not conclusive on its own, it shows the same upper pillars and the same white foundation pillars seen in the viral image.

The bridge is the Jiaozhou Bay Bridge, located in China, which connects the country's eastern port city of Qingdao to an offshore island named Huangdao.
Taking a cue from this, we then searched for the bridge to find other related images or videos. We found a YouTube video uploaded by a channel named xuxiaopang showing similar structures, including the pillars and road design.

The reverse image search also surfaced another news article about the same bridge in China, which closely resembles the one in the viral image.

Given the lack of evidence or credible sources for the opening of a similar bridge in Mumbai, and after a thorough investigation, we conclude that the claim made in the viral post is misleading and false. The bridge is located in China, not in Mumbai.
Conclusion:
In conclusion, the fact check found that the viral image of a bridge allegedly in Mumbai, India was falsely attributed. The bridge in the picture, claimed to be in Mumbai, is actually the Qingdao Jiaozhou Bay Bridge, located in Qingdao, China. Reverse image searches, videos, and reliable news outlets all confirm this. No evidence suggests that a similar bridge exists in Mumbai. The claim is therefore false: the actual bridge is in China, not in Mumbai.
- Claim: The bridge seen in the popular social media posts is in Mumbai.
- Claimed on: X (formerly known as Twitter), Facebook
- Fact Check: Fake & Misleading
Related Blogs

Introduction
Assisted Reproductive Technology (“ART”) refers to a diverse set of medical procedures designed to aid individuals or couples in achieving pregnancy when conventional methods are unsuccessful. This umbrella term encompasses various fertility treatments, including in vitro fertilization (IVF), intrauterine insemination (IUI), and gamete and embryo manipulation. ART procedures involve the manipulation of both male and female reproductive components to facilitate conception.
The dynamic landscape of data flows within the healthcare sector, notably in the realm of ART, demands a nuanced understanding of the complex interplay between privacy regulations and medical practices. In this context, the Information Technology (Reasonable Security Practices And Procedures And Sensitive Personal Data Or Information) Rules, 2011, play a pivotal role, designating health information as "sensitive personal data or information" and underscoring the importance of safeguarding individuals' privacy. This sensitivity is particularly pronounced in the ART sector, where an array of personal data, ranging from medical records to genetic information, is collected and processed. The recent Assisted Reproductive Technology (Regulation) Act, 2021, in conjunction with the Digital Personal Data Protection Act, 2023, establishes a framework for the regulation of ART clinics and banks, presenting a layered approach to data protection.
A note on data generated by ART
Data flows in any sector are scarcely uniform and often resist neat classification, so mapping and identifying data and its types becomes pivotal. Most data flows in the healthcare sector are highly sensitive and personal in nature, and a breach may severely compromise an individual's privacy and safety. The Information Technology (Reasonable Security Practices And Procedures And Sensitive Personal Data Or Information) Rules, 2011 (“SPDI Rules”) categorize any information pertaining to physical, physiological, or mental conditions, or to medical records and history, as “sensitive personal data or information”; this definition is broad enough to encompass any data collected by an ART facility or its equipment. This includes information collected during the screening of patients, pertaining to ovulation and menstrual cycles, follicle and sperm count, ultrasound results, blood work, etc. It also includes pre-implantation genetic testing on embryos to detect genetic abnormalities.
But data flows extend beyond mere medical procedures and technology. Health data also covers any medical procedures undertaken, the amount of medicine and drugs administered during a procedure, resultant side effects, recovery, etc. Processing the above information may, in turn, generate further personal data points relating to an individual's political affiliations, race, ethnicity, and genetic data such as biometrics and DNA; different ethnicities and races are known to react differently to the same or similar medication and to have different propensities to genetic diseases. Further, data is not only collected by professionals but also by AI-enabled equipment that a facility may employ to render its services. Additionally, dissemination of information under exceptional circumstances (e.g. a medical emergency) also affects how data may be classified. Considerations grow further nuanced when the fundamental right to identity of a child conceived and born via ART may conflict with a donor's fundamental right to privacy and anonymity.
Intersection of Privacy laws and ART laws:
In India, ART technology is regulated by the Assisted Reproductive Technology (Regulation) Act, 2021 (“ART Act”). With this, the Union aims to regulate and supervise assisted reproductive technology clinics and ART banks, prevent misuse and ensure safe and ethical practice of assisted reproductive technology services. When read with the Digital Personal Data Protection Act, 2023 (“DPDP Act”) and other ancillary guidelines, the two legislations provide some framework regulations for the digital privacy of health-based apps.
The ART Act establishes a National Assisted Reproductive Technology and Surrogacy Registry (“National Registry”), which acts as a central database for all clinics and banks and the nature of their services. The Act also establishes a National Assisted Reproductive Technology and Surrogacy Board (“National Board”) under the Surrogacy Act to monitor the implementation of the Act and advise the central government on policy matters. The National Board also supervises the functioning of the National Registry, liaises with State Boards, and curates a code of conduct for professionals working in ART clinics and banks. Under the DPDP Act, these bodies (i.e. the National Board, State Boards, ART clinics and banks) are most likely classified as data fiduciaries (primarily clinics and banks), data processors (these may include the National Board and State Boards), or an amalgamation of both (including any appropriate authority established under the ART Act for the investigation of complaints, suspension or cancellation of the registration of clinics, etc.), depending on the nature of the work they undertake. If so classified, the duties and liabilities of data fiduciaries and processors would necessarily apply to these bodies. As a result, all of them would have to adopt Privacy Enhancing Technologies (PETs) and other organizational measures to ensure compliance with the privacy laws in place. This may be one of the most critical considerations for any ART facility, since any data it collects would be sensitive personal data pertaining to health, regulated by the Information Technology (Reasonable Security Practices And Procedures And Sensitive Personal Data Or Information) Rules, 2011 (“SPDI Rules 2011”). These rules provide for how sensitive personal data or information is to be collected, handled, and processed.
The ART Act also independently provides for the duties of ART clinics and banks in the country. ART clinics and banks are required to inform the commissioning couple or woman of all procedures to be undertaken and of all costs, risks, advantages, and side effects of the selected procedure. The Act mandates that information collected by such clinics and banks must not be disclosed to anyone except the database established by the National Registry, in cases of medical emergency, or on the order of a court. Data collected by clinics and banks (including details of donor oocytes, sperm, or embryos, used or unused) must be detailed and submitted to the National Registry online. ART banks are also required to collect donors' personal information, including name, Aadhaar number, address, and other details. By mandating online submission, the ART Act is harmonized with the DPDP Act, which regulates all digital personal data and emphasises free, informed consent.
Conclusion
With the increase in active opt-ins for ART, data privacy becomes a vital consideration for all healthcare facilities and professionals. Safeguard measures are required not only at the corporate level but also at the governmental level. Notably, in the 262nd Session of the Rajya Sabha, the Ministry of Electronics and Information Technology reported 165 data breach incidents involving citizen data from the Central Identities Data Repository between January 2018 and October 2023, despite earlier public denials. This disclosure calls into question the safety and integrity of data that may be submitted to the National Registry database, especially given the type of data (both personal and sensitive information) it aims to collate. At present, the ART Act is well supported by the DPDP Act. However, further judicial and legislative deliberation is required to effectively regulate and balance the interests of all stakeholders.
References
- The Information Technology (Reasonable Security Practices And Procedures And Sensitive Personal Data Or Information) Rules, 2011
- Caring for Intimate Data in Fertility Technologies, https://dl.acm.org/doi/pdf/10.1145/3411764.3445132
- The Digital Personal Data Protection Act, 2023
- Pharmacogenomics and race: can heritage affect drug disposition?, https://www.wolterskluwer.com/en/expert-insights/pharmacogenomics-and-race-can-heritage-affect-drug-disposition
The Digital Covenant: Aligning Communication with SDG Goals
“Rethinking Communication, Cyber Responsibility, and Sustainability in a Connected World”
Introduction
As Antonio Guterres, United Nations Secretary-General, rightly said: “Everyone should be able to express themselves freely without fear of attack. Everyone should be able to access a range of views and information sources.” In 2024, recognising that the era of digital transformation, with technology advancing at breakneck speed, brings various risks and threats, the Global Alliance for PR and Communication Management called on global leaders and stakeholders to proclaim ‘Responsible Communication’ as the 18th Sustainable Development Goal (SDG). On May 17th, as we celebrate World Telecommunication and Information Society Day (WTISD) 2025, we must align our personal, professional, and virtual spaces with a safe and sustainable information age.
In terms of digital growth, India is indubitably advancing at a brisk and consistent pace alongside its South Asian and Western counterparts, and it has incorporated international covenants on digital personal data and cyber crimes into its domestic regime.
UN Global Principles for Information Integrity
The United Nations has displayed constant commitment to achieving the seventeen SDGs, a process initiated at the 2012 United Nations Conference on Sustainable Development in Rio de Janeiro. It recognises that digital transformation, technology, and digitisation cannot be isolated from the other areas covered by the SDGs, such as health, education, and poverty. In June 2023, the UN Secretary-General released Policy Brief 8, which seeks to empirically gather data on the threats posed to information integrity and then develop norms to guide member states, digital platforms, and other stakeholders. These norms must conform with the right to freedom of opinion and expression and the right of access to information.
In line with this agenda, it has formulated the Global Principles for Information Integrity, which include “Societal Trust and Resilience”, “Healthy Incentives”, “Public Empowerment”, “Independent, Free and Pluralistic Media” and “Transparency and Research”. The principles recognise the harm caused by hatred, misinformation, and disinformation propagated through the misuse of advances in Artificial Intelligence (AI) technology.
Breaking the Binary: Bridging the Gender Digital Divide
How far we have come, and how far we have to go, can be captured in a single phrase: using digital technologies to promote gender equality. This reads as both a paradox and a pressing call to action. As we celebrate WTISD 2025, the day highlights the fundamental role of Information and Communication Technologies (ICTs) in accelerating progress and bringing those excluded from the digital transformation into this change, especially the female population that remains isolated from mainstream growth. As per ITU data, “Out of the world population, 70 per cent of men are using the internet, compared with 65 per cent of women.”
This exclusion is not merely a technical gap but a societal and economic chasm that reinforces existing inequalities. The inclusion of such an important goal in this day's theme marks a critical moment for forming gender-sensitive digital policies, promoting digital literacy among women and girls, and ensuring safe, affordable, and meaningful connectivity. We can then explore a future in which technology is a true instrument of gender parity, not a mirror of old hierarchies.
India and its courts have time and again proven their commitment to cultivating digital transformation as an inherent strength to bridge this digital divide, and the recent judgement where the court declared the right to digital access an intrinsic part of the right to life and liberty is a single instance among many.
CyberPeace Resolution on World Telecommunication and Information Society Day
CyberPeace is actively bridging the gap between digital safety and sustainable development through its initiatives, in alignment with the principles of the Sustainable Development Goals (SDGs). The ‘CyberPeace Corps’ empowers communities by fostering cyber hygiene awareness and building digital resilience. The ‘CyberPeace Initiative’, a project with Google.org, tackles digital misinformation and promotes informed online engagement. Additionally, Digital Shakti, now in its fifth phase, empowers women by enhancing their digital literacy and safety. These are just a few of CyberPeace's many impactful initiatives aimed at spreading awareness, promoting responsible tech use, and strengthening the foundation for a safer and more inclusive digital future. Let us be resolute this World Telecommunication and Information Society Day: “Clean Data. Safe Clicks. Stronger Future. Pledge to Cyber Hygiene Today!”
Introduction
The recent inauguration of the Google Safety Engineering Centre (GSEC) in Hyderabad on 18th June 2025 marks a pivotal moment not just for India but for the entire Asia-Pacific region's digital future. As only the fourth such centre in the world, after Munich, Dublin, and Málaga, its presence signals a shift in how AI safety, cybersecurity, and digital trust are being decentralised, pointing toward a more globalised and inclusive tech ecosystem. India's digitisation has grown at a rapid scale over the years, introducing millions of first-time internet users who, depending on their awareness, are susceptible to online scams, phishing, deepfakes, and AI-driven fraud. The establishment of GSEC is not just the launch of a facility but a step toward addressing AI readiness, user protection, and ecosystem resilience.
Building a Safer Digital Future in the Global South
The GSEC is set to operationalise the Google Safety Charter, designed around three core pillars: empowering users by protecting them from online fraud, strengthening cybersecurity for government and enterprise, and advancing responsible AI in platform design and execution. This represents a shift from standard reactive safety responses to proactive, AI-driven risk mitigation. The goal is to make safety tools not only effective but tailored to threats unique to the Global South, from multilingual phishing to financial fraud via unofficial lending apps. The centre is expected to stimulate regional cybersecurity ecosystems by creating jobs, fostering public-private partnerships, and enabling collaboration across academia, law enforcement, civil society, and startups. In doing so, it positions Asia-Pacific not as a consumer of standard Western safety solutions but as an active contributor to the next generation of digital safeguards and customised solutions.
Solutions previously piloted by Google include DigiKavach, a real-time fraud detection framework, as well as tools like spam protection in mobile operating systems and app vetting mechanisms. What GSEC might add is the scaling and integration of these efforts into system-level responses, where threat detection, safety warnings, and reporting mechanisms ensure seamless coordination across platforms. This reimagines safety as a core design principle of India's digital public infrastructure rather than an after-the-fact response to attacks.
CyberPeace Insights
The launch aligns with events such as the AI Readiness Methodology Conference recently held in New Delhi, which brought together researchers, policymakers, and industry leaders to discuss ethical, secure, and inclusive AI implementation. As the world grapples with AI technologies ranging from generative content to algorithmic decision-making, centres like GSEC can play a critical role in defining the safeguards and governance structures that support rapid innovation without compromising public trust and safety. The region's experiences and innovations in AI governance must shape global norms, and tech firms have a significant role to play in that. Beyond this, efforts to build digital infrastructure and the safety centres that protect it resonate with India's vision of becoming a global leader in AI.
References
- https://www.thehindu.com/news/cities/Hyderabad/google-safety-engineering-centre-india-inaugurated-in-hyderabad/article69708279.ece
- https://www.businesstoday.in/technology/news/story/google-launches-safety-charter-to-secure-indias-ai-future-flags-online-fraud-and-cyber-threats-480718-2025-06-17?utm_source=recengine&utm_medium=web&referral=yes&utm_content=footerstrip-1&t_source=recengine&t_medium=web&t_content=footerstrip-1&t_psl=False
- https://blog.google/intl/en-in/partnering-indias-success-in-a-new-digital-paradigm/
- https://blog.google/intl/en-in/company-news/googles-safety-charter-for-indias-ai-led-transformation/
- https://economictimes.indiatimes.com/magazines/panache/google-rolls-out-hyderabad-hub-for-online-safety-launches-first-indian-google-safety-engineering-centre/articleshow/121928037.cms?from=mdr