#12 Fake fan pages to hoax calls
The latest AI scams
Recently, a fake fan page of the Indian actor Sidharth Malhotra scammed a fan in the United States by claiming that Kiara Advani was a threat to Sidharth’s life. The fan was scammed out of 50 lakh INR. Not only this, the American actor Chris Evans has also been an unfortunate part of such scams, with fans claiming he messaged them asking for money.
Honestly, the idea that Chris Evans, aka Captain America, is messaging them, that too asking for money, should have been the biggest red flag in the first place.
Hoax phone calls impersonating police officers, made using AI-powered voice cloning software, are also on the rise in India. In one such case, the “police officer” told a woman that she was an accused in a money laundering case and successfully extracted 3 lakh INR from her. Now, even if you have nothing to hide, your first natural instinct will be fear and panic instead of rational thought. This is precisely what these scamsters prey on.

Other scams of this kind involve voice impersonation of your friends and family. A woman in Madhya Pradesh got a call from an unknown number telling her that her daughter had been kidnapped. She could even hear her “daughter” crying in the background, while her real daughter was asleep in her hostel with her phone switched off, a fact the woman only learnt later from her husband. Unfortunately, by then she had already transferred 50,000 INR to the fraudsters.
Almost 77% of AI voice scam victims end up losing money. Despite the high numbers, awareness of such scams remains low.
A study suggests that just 3 seconds of your voice is enough to create an 85% match to your actual voice. But how do scammers get their hands on these voices in the first place?
Social media. Every time you upload even a short reel, you risk your voice getting cloned. If three seconds of your voice is enough to get an 85% match, imagine what 90 seconds of the same can achieve.
With the increasing number of deepfakes (videos, images or recordings made with AI to look real but which are, of course, fake) and the rapid advancement of the technology, there also needs to be equally rapid development of laws and regulations surrounding AI.
To make any laws and regulations surrounding AI, we first need to define AI itself. While everyone’s working on building AI, who’s working on regulating it?
Well, not many. Only a handful of countries have formulated laws and regulations surrounding AI at this point. The US is working on applying fair and appropriate regulations to ensure responsible AI development and deployment. Similarly, the European Union (EU) has proposed the AI Act, which categorizes AI applications into different risk levels and imposes stricter regulations on high-risk AI systems. China, on the other hand, has a national plan to become a world leader in AI by 2030, including ethical guidelines and regulatory measures. Countries like Canada, the United Kingdom, Singapore, and Japan are also taking steps to create regulatory frameworks to govern AI usage responsibly and ethically.
Now, how can you protect yourself? Well, the first step is to read my newsletter to stay informed (and share it with your friends and family to make them aware as well😝).
No, but for real, awareness is the first form of protection you can give yourself. If you read the newspaper daily, you will definitely find at least one report of such frauds. Always verify the identity of the person on the other end if you can, and try to limit the information you put out. Use strong passwords to protect your accounts, monitor and cross-check any suspicious transactions, and limit how much of your information you share on social media.
AI is here to stay, and in this era of deepfakes, one needs to keep their eyes and ears open. While AI can help us in tremendous ways, it can also harm us equally. It is up to us to decide which side we want to be on. So, if you ever get a call from me asking for money, please let me know so that I can share my actual bank details.😌
PS. If you like my newsletters, feel free to share them with your friends and family!
If this was shared with you, you can subscribe here:
Reminder: Don’t forget to move my emails to your primary inbox to make sure you keep receiving them!