#FactCheck - AI-Cloned Audio in Viral Anup Soni Video Promoting Betting Channel Revealed as Fake
Executive Summary:
A morphed video of actor Anup Soni, popular on social media and apparently promoting an IPL betting Telegram channel, has been found to be fake. The audio in the morphed video was produced through AI voice cloning, as identified by AI detection and deepfake analysis tools. In the original footage, Mr Soni narrates a crime case as part of the popular show Crime Patrol, which is unrelated to betting. It can therefore be concluded that Anup Soni is in no way associated with the betting channel.

Claims:
A Facebook post claims that an IPL betting Telegram channel belonging to Rohit Khattar is promoted by actor Anup Soni.

Fact Check:
Upon receiving the post, the CyberPeace Research Team closely analyzed the video and found major discrepancies of the kind commonly seen in AI-manipulated videos: the lip movements in the video do not match the audio. Taking a cue from this, we analyzed the video using a deepfake detection tool by True Media, which found the voice in the video to be 100% AI-generated.



We then extracted the audio and checked it with an audio deepfake detection tool named Hive Moderation, which found the audio to be 99.9% AI-generated.

We then divided the video into keyframes and reverse-searched one of them, which led us to the original video uploaded by the YouTube channel LIV Crime.
Upon analysis, we found that the video was edited at the 3:18 mark and overlaid with an AI-generated voice.
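The keyframe-sampling step described above can be sketched as follows. This is a minimal illustration of the general technique, not the exact tool the team used: it only computes which frame indices to sample at a fixed interval; in a real pipeline each sampled frame would then be exported (for example with OpenCV or ffmpeg) and submitted to a reverse image search.

```python
def keyframe_indices(total_frames: int, fps: float, every_seconds: float = 2.0) -> list[int]:
    """Return the frame indices to sample, one every `every_seconds` of video.

    A real pipeline would export each sampled frame to an image file
    (e.g. via OpenCV or ffmpeg) and reverse image search the stills
    to locate the original source footage.
    """
    step = max(1, round(fps * every_seconds))  # frames between samples
    return list(range(0, total_frames, step))

# Example: a 10-second clip at 30 fps, sampled every 2 seconds
print(keyframe_indices(300, 30.0, 2.0))  # → [0, 60, 120, 180, 240]
```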

Hence, the viral video is AI-manipulated and not real. We have previously debunked similar AI voice manipulations featuring other celebrities and politicians, used to misrepresent the original context. Netizens must be careful before believing such AI-manipulated videos.
Conclusion:
In conclusion, the viral video claiming that actor Anup Soni promotes an IPL betting Telegram channel is false. The video was manipulated using AI voice-cloning technology, as confirmed by both the Hive Moderation AI detector and the True Media deepfake detection tool. The claim is therefore baseless and misleading.
- Claim: An IPL betting Telegram channel belonging to Rohit Khattar is promoted by actor Anup Soni.
- Claimed on: Facebook
- Fact Check: Fake & Misleading

Introduction
In the digital realm of social media, Meta Platforms, the driving force behind Facebook and Instagram, faces intense scrutiny following The Wall Street Journal's investigative report. This exploration delves deeper into critical issues surrounding child safety on these widely used platforms, unravelling algorithmic intricacies, enforcement dilemmas, and the ethical maze surrounding monetisation features. Instances of "parent-managed minor accounts" leveraging Meta's subscription tools to monetise content featuring young individuals have raised eyebrows. While skirting the line of legality, this practice prompts concern due to its potential appeal to adults and the associated inappropriate interactions. It's a nuanced issue demanding nuanced solutions.
Failed Algorithms
The very heartbeat of Meta's digital ecosystem, its algorithms, has come under intense scrutiny. These algorithms, designed to curate and deliver content, were found to be actively promoting accounts featuring explicit content to users with known pedophilic interests. The revelation sparks a crucial conversation about the ethical responsibilities tied to the algorithms shaping our digital experiences. Striking the right balance between personalised content delivery and safeguarding users is a delicate task.
While algorithms play a pivotal role in tailoring content to users' preferences, Meta needs to reevaluate the algorithms to ensure they don't inadvertently promote inappropriate content. Stricter checks and balances within the algorithmic framework can help prevent the inadvertent amplification of content that may exploit or endanger minors.
Major Enforcement Challenges
Meta's enforcement challenges have come to light as previously banned parent-run accounts resurrect, gaining official verification and accumulating large followings. The struggle to remove associated backup profiles adds layers to concerns about the effectiveness of Meta's enforcement mechanisms. It underscores the need for a robust system capable of swift and thorough actions against policy violators.
To enhance enforcement mechanisms, Meta should invest in advanced content detection tools and employ a dedicated team for consistent monitoring. This proactive approach can mitigate the risks associated with inappropriate content and reinforce a safer online environment for all users.
The financial dynamics of Meta's ecosystem expose concerns about the exploitation of videos that are eligible for cash gifts from followers. The decision to expand the subscription feature before implementing adequate safety measures poses ethical questions. Prioritising financial gains over user safety risks tarnishing the platform's reputation and trustworthiness. A re-evaluation of this strategy is crucial for maintaining a healthy and secure online environment.
To address safety concerns tied to monetisation features, Meta should consider implementing stricter eligibility criteria for content creators. Verifying the legitimacy and appropriateness of content before allowing it to be monetised can act as a preventive measure against the exploitation of the system.
Meta's Response
In the aftermath of the revelations, Meta's spokesperson, Andy Stone, took centre stage to defend the company's actions. Stone emphasised ongoing efforts to enhance safety measures, asserting Meta's commitment to rectifying the situation. However, critics argue that Meta's response lacks the decisive actions required to align with industry standards observed on other platforms. The debate continues over the delicate balance between user safety and the pursuit of financial gain. A more transparent and accountable approach to addressing these concerns is imperative.
To rebuild trust and credibility, Meta needs to implement concrete and visible changes. This includes transparent communication about the steps taken to address the identified issues, continuous updates on progress, and a commitment to a user-centric approach that prioritises safety over financial interests.
The formation of a task force in June 2023 was a commendable step to tackle child sexualisation on the platform. However, the effectiveness of these efforts remains limited. Persistent challenges in detecting and preventing potential child safety hazards underscore the need for continuous improvement. Legislative scrutiny adds an extra layer of pressure, emphasising the urgency for Meta to enhance its strategies for user protection.
To overcome ongoing challenges, Meta should collaborate with external child safety organisations, experts, and regulators. Open dialogues and partnerships can provide valuable insights and recommendations, fostering a collaborative approach to creating a safer online environment.
Drawing a parallel with competitors such as Patreon and OnlyFans reveals stark differences in child safety practices. While Meta grapples with its challenges, these platforms maintain stringent policies against certain content involving minors. This comparison underscores the need for universal industry standards to safeguard minors effectively. Collaborative efforts within the industry to establish and adhere to such standards can contribute to a safer digital environment for all.
To align with industry standards, Meta should actively participate in cross-industry collaborations and adopt best practices from platforms with successful child safety measures. This collaborative approach ensures a unified effort to protect users across various digital platforms.
Conclusion
Navigating the intricate landscape of child safety concerns on Meta Platforms demands a nuanced and comprehensive approach. The identified algorithmic failures, enforcement challenges, and controversies surrounding monetisation features underscore the urgency for Meta to reassess and fortify its commitment to being a responsible digital space. As the platform faces this critical examination, it has an opportunity to not only rectify the existing issues but to set a precedent for ethical and secure social media engagement.
This comprehensive exploration aims not only to shed light on the existing issues but also to provide a roadmap for Meta Platforms to evolve into a safer and more responsible digital space. The responsibility lies not just in acknowledging shortcomings but in actively working towards solutions that prioritise the well-being of its users.
References
- https://timesofindia.indiatimes.com/gadgets-news/instagram-facebook-prioritised-money-over-child-safety-claims-report/articleshow/107952778.cms
- https://www.adweek.com/blognetwork/meta-staff-found-instagram-tool-enabled-child-exploitation-the-company-pressed-ahead-anyway/107604/
- https://www.tbsnews.net/tech/meta-staff-found-instagram-subscription-tool-facilitated-child-exploitation-yet-company
Introduction
The Digital Personal Data Protection (DPDP) Act, 2023, introduces a framework for the protection of personal data in India. A data fiduciary is the entity that determines the purpose and means of processing personal data, and small-scale industries also fall within the ambit of the term. Startups, small companies, and Micro, Small, and Medium Enterprises (MSMEs) that determine the purpose of processing personal data in the capacity of a 'data fiduciary' are likewise required to comply with the provisions of the DPDP Act. The obligations set for data fiduciaries apply to them equally, though compliance with the Act can be challenging due to resource constraints and limited expertise in data protection.
Section 17(3) of the DPDP Act, 2023 empowers the Central Government to exempt startups from certain obligations under the Act, taking into account the volume and nature of the personal data they process. The Act is the nation's first standalone law on data protection and privacy, setting strict rules on how data fiduciaries may collect and process personal data, with a focus on consent-based mechanisms and personal data protection. Small-scale industries are given more time to comply with the DPDP Act; the detailed provisions are to be notified through further rulemaking, the 'DPDP Rules'.
Obligations on Data Fiduciary under the DPDP Act, 2023
The DPDP Act focuses on processing digital personal data in a manner that recognizes both the right of individuals to protect their personal data and the need to process such personal data for lawful purposes and for matters connected therewith or incidental thereto. Hence, small-scale industries also need to comply with provisions aimed at protecting digital personal data.
The key requirements to be considered:
- Data Processing Principles: Ensuring that data processing is done lawfully, fairly, and transparently. Further, the collection and processing of personal data is only for specific, clear, and legitimate purposes and only the data necessary for the stated purpose. Ensuring that the data is accurate and up to date is also necessary. An important part is that the data is not retained longer than necessary and appropriate security measures are taken to protect the said data.
- Consent Management: Clear and informed consent should be obtained from individuals before collecting their personal data. Further, individuals have the option to withdraw their consent easily.
- Rights of Data Principals: Data principals (the individuals whose data is collected) have the right to information, the right to correction and erasure of data, the right to grievance redressal, and the right to nominate. Data fiduciaries need mechanisms to handle requests from data principals regarding these rights.
- Data Breach Notifications: Data fiduciaries are required to notify the data protection board and the affected individuals in case a data breach has occurred.
- Appropriate Technical and Organisational Measures: A data fiduciary shall implement appropriate technical and organisational measures to ensure effective observance of the provisions of this Act and the rules made thereunder.
- Cross-border Data Transfers: Compliance with regulations on the transfer of personal data outside India should be ensured.
Challenges for Small Scale Industries for the DPDP Act Compliance
Small-scale industries aspire to organisational growth, and in the digital age they must also rely on sound online security measures and careful handling of personal data; with the DPDP Act in the picture, this is now a legal obligation. Small-scale industries, including MSMEs, may face certain challenges in fulfilling these obligations, but strong digital data protection measures will also boost their competitiveness and customer growth. Reforming methods for better data governance in today's digital era is significant.
One of the major challenges for small-scale industries could be ensuring a skilled workforce that understands and educates internal stakeholders about the DPDP Act compliances. This could undoubtedly become an additional burden.
Further, the limited resources can make the implementation of data protection, which is oftentimes complex for a layperson in the case of a small-scale industry, difficult to implement. Limitations in resources are often financial or human resources.
Cybersecurity, cyber awareness, and protection from cyber threats require expertise that small enterprises often lack. The decision to outsource such expertise is sometimes taken too late, leaving a window in which an incident can cause harm.
Investment in the core business often does not extend to technology beyond the basic requirements of running the enterprise, nor to ensuring that data is secure and all compliances are met. However, in the fast-moving digital world, all industries need to be mindful of their efforts to protect personal data and ensure proper data governance.
Recommendations
To ensure proper and effective personal data handling practices under the Act, small companies and startups need to work on both their backend and frontend processes and take adequate measures to comply. While such industries have been given more time to ensure compliance, the following suggestions can help them meet the new law's requirements.
Small companies can ensure compliance with the DPDP Act by implementing robust data protection policies, investing in employee training on data privacy, using age-verification mechanisms, and adopting privacy-by-design principles. They should conduct a gap analysis to identify areas where current practices fall short of DPDP Act requirements. Regular audits, secure data storage solutions, and transparent communication with users about data practices are also essential, along with cost-effective tools and technologies for data protection and management.
Conclusion
Small-scale industries must take proactive steps to align with the DPDP Act, 2023 provisions. By understanding the requirements, leveraging external expertise, and adopting best practices, small-scale industries can ensure compliance and protect personal data effectively. In the long run, complying with the new law would lead to greater trust and better business for the enterprises, resulting in a larger revenue share for them.
References
- https://pib.gov.in/PressReleaseIframePage.aspx?PRID=1959161
- https://www.financialexpress.com/business/digital-transformation-dpdp-act-managing-data-protection-compliance-in-businesses-3305293/
- https://economictimes.indiatimes.com/tech/technology/big-tech-coalition-seeks-12-18-month-extension-to-comply-with-indias-dpdp-act/articleshow/104726843.cms?from=mdr

Introduction
Social media is the new platform for free speech and for expressing one's opinions. The latest news breaks on social media, and political parties often use it to campaign during elections. The hashtag (#) is the new weapon: a powerful hashtag goes a long way in making an impact on society, and that too at a global level. Various hashtags have gained popularity in recent years, such as #blacklivesmatter, #metoo, #pride, and #cybersecurity, which were influential in spreading awareness of various social issues and taboos across cultures. Social media is strengthened by influencers, famous personalities with massive followings who create regular content that users consume and share with their friends. Social media is all about the message and its speed, and issues like misinformation and disinformation are widespread on nearly all platforms, so influencers play a key role in ensuring that content on social media complies with community and privacy guidelines.
The Know-How
The Department of Consumer Affairs under the Ministry of Consumer Affairs, Food and Public Distribution released a guide, 'Endorsements Know-hows!', for celebrities, influencers, and virtual influencers on social media platforms. The guide aims to ensure that individuals do not mislead their audiences when endorsing products or services and that they comply with the Consumer Protection Act and any associated rules or guidelines. Advertisements are no longer limited to traditional media like print, television, or radio; with the increasing reach of digital platforms and social media such as Facebook, Twitter, and Instagram, there has been a rise in the influence of virtual influencers, celebrities, and social media influencers. This has increased the risk of consumers being misled by advertisements and unfair trade practices by these individuals on social media platforms. Endorsements must be made in simple, clear language, using terms such as "advertisement," "sponsored," or "paid promotion." Influencers should not endorse any product or service on which they have not done due diligence, or that they have not personally used or experienced. The Act establishes guidelines for protecting consumers from unfair trade practices and misleading advertisements. The Department of Consumer Affairs published the Guidelines for Prevention of Misleading Advertisements and Endorsements for Misleading Advertisements, 2022, on 9th June 2022. These guidelines outline the criteria for valid advertisements and the responsibilities of manufacturers, service providers, advertisers, and advertising agencies, and also touch upon celebrities and endorsers. They state that misleading advertisements in any form, format, or medium are prohibited by law.
The guidelines apply to social media influencers as well as virtual avatars promoting products and services online. Disclosures should be easy to notice in post descriptions, where hashtags or links are usually found, and should be prominent enough to be noticeable within the content itself.
Changes Expected
The new guidelines will bring uniformity to social media content with respect to privacy and the opinions of different people. The primary issue being addressed is misinformation, which was at its peak during the Covid-19 pandemic and impacted millions of people worldwide. Digital literacy and digital etiquette are fundamental parts of social media ethics, and influencers and celebrities can go a long way in spreading awareness about them among ordinary users. The growing threats of cybercrime and exploitation in cyberspace can be reduced through effective awareness and education among the youth and vulnerable populations, which influencers are well placed to provide; it is time influencers understood their responsibility in leading the masses online and creating a healthy, secure cyber ecosystem. Failing to follow the guidelines will make social media influencers liable for a fine of up to Rs 10 lakh. For repeat offenders, the penalty can go up to Rs 50 lakh.
Conclusion
The size of the social media influencer market in India in 2022 was $157 million. It could reach as much as $345 million by 2025. Indian advertising industry’s self-regulatory body Advertising Standards Council of India (ASCI), shared that Influencer violations comprise almost 30% of ads taken up by ASCI, hence this legal backing for disclosure requirements is a welcome step. The Ministry of Consumer Affairs had been in touch with ASCI to review the various global guidelines on influencers. The social media guidelines from Clairfirnia and San Fransisco share the same basis, and hence guidelines inspired by different countries will allow the user and the influencer to understand the global perspective and work towards securing the bigger picture. As we know that cyberspace has no geographical boundaries and limitations; hence now is the time to think beyond conventional borders and start contributing towards securing and safeguarding global cyberspace.