Beware: AI Voice Scams on the Rise as Fraudsters Exploit Technology to Deceive Victims

Pune, 18th November 2023 – We frequently hear about online fraud and scams, underscoring the importance of staying vigilant. A new scam has now surfaced in which fraudsters use AI to clone a person's voice and extort money. In a recent incident, a 59-year-old woman fell victim to the scam and lost around Rs 1.5 lakh.

While AI has gained considerable popularity, the risks that come with it are escalating just as quickly, and each day brings news of a new scam. In this particular incident, the scammers used AI-generated voices to pose as acquaintances or family members and pressured the victim into parting with her money.

As for how the scam works: AI has already enabled a range of fraudulent activities, including deepfakes, and AI voice scams are the latest addition. Fraudsters use AI tools to produce convincingly authentic copies of human voices, making the deception difficult to detect and leaving people vulnerable. Scammers exploit this technique to extract personal information and money from unsuspecting victims.

To stay safe, a few precautions are essential. Share personal information on a call only after verifying the caller's identity. Be cautious if someone on the phone asks for money or personal details, as they may be a scammer. If a number or call seems suspicious, hang up immediately and verify it with the relevant company. And if a caller claims to be a family member, cross-check the number and confirm their identity before sharing any information.