Protection from Phone Scammers in the Future with AI

In the future, AI-enhanced voice cloning will advance: phone scammers will use increasingly sophisticated AI to clone the voices of people the victim knows in order to extract money from them. They will also use AI to gather contextual information about victims from social media and other online sources, making scams more personalized and believable and therefore more likely to succeed. Victims will be chosen in advance, either manually or with dedicated algorithms, and their data will be carefully collected so that the scam appears credible.

Kaspersky Who Calls will also develop new technologies to counter these new AI scam tools. Voice analysis will be used to detect subtle anomalies in speech patterns and flag potential scam calls. A special form of audio protection will be developed that can be applied to your voice in audio messages and phone calls to prevent scammers from capturing and cloning it: the technology will mix a particular noise frequency into the audio and change it with every recording to protect your data.
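
To make the idea concrete, here is a minimal, purely illustrative sketch in Python of how such a per-recording audio mark could work: a faint tone at a randomly chosen near-inaudible frequency is mixed into each recording, and its presence or absence can later be checked with a simple spectrum search. All names, frequencies, and thresholds here are assumptions made for the example; this is not how Kaspersky Who Calls actually works.

```python
import numpy as np

SAMPLE_RATE = 44_100  # Hz; assumed sample rate for this illustration

def add_voice_watermark(audio: np.ndarray, rng: np.random.Generator) -> tuple[np.ndarray, float]:
    """Mix a faint tone at a randomly chosen near-inaudible frequency into a recording.

    The frequency changes with every recording, so a cloned or replayed copy
    can be flagged later when the expected mark is missing or does not match.
    Returns the marked audio and the frequency that was used.
    """
    freq = float(rng.uniform(17_000, 19_000))      # near-inaudible for most listeners
    t = np.arange(len(audio)) / SAMPLE_RATE
    mark = 0.01 * np.sin(2 * np.pi * freq * t)     # very low amplitude relative to speech
    return audio + mark, freq

def has_watermark(audio: np.ndarray, freq: float, tolerance_hz: float = 20.0) -> bool:
    """Check for the expected tone with a simple FFT peak search around `freq`."""
    spectrum = np.abs(np.fft.rfft(audio))
    freqs = np.fft.rfftfreq(len(audio), d=1 / SAMPLE_RATE)
    band = (freqs > freq - tolerance_hz) & (freqs < freq + tolerance_hz)
    return spectrum[band].max() > 5 * np.median(spectrum)

# Example: mark a 3-second stand-in signal and verify the mark.
rng = np.random.default_rng()
voice = rng.normal(0.0, 0.1, SAMPLE_RATE * 3)      # placeholder for recorded speech
marked, freq = add_voice_watermark(voice, rng)
print(has_watermark(marked, freq))                 # expected: True
print(has_watermark(voice, freq))                  # expected: False
```

A production system would use a far more robust watermark (spread-spectrum, resistant to compression and re-recording), but the sketch shows the core trade: the mark must be easy for the defender to verify and hard for a scammer to strip or imitate.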

In addition, real-time AI will be developed that cross-references call data with known scam patterns and flags suspicious calls, making it possible to catch potential scammers before they can carry out their scheme. Kaspersky's products will integrate deeper social media monitoring and contextual analysis to pre-emptively warn users of potential risks.
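
As a rough illustration of cross-referencing call data with known scam patterns, the sketch below scores an incoming call against a few heuristic signals: a hypothetical list of reported scam number prefixes, a country mismatch for a caller claiming to be the victim's bank, and a synthetic-voice score from a voice-analysis model. Everything here, including the field names and weights, is an assumption for the example; a real screening service would rely on trained models and a live reputation database rather than hand-written rules.

```python
from dataclasses import dataclass

@dataclass
class CallMetadata:
    caller_number: str
    caller_country: str
    callee_country: str
    ring_duration_s: float
    claims_to_be_bank: bool
    synthetic_voice_score: float  # 0..1, from a hypothetical voice-analysis model

# Hypothetical list of number prefixes reported in earlier scams.
KNOWN_SCAM_PREFIXES = {"+1473", "+881", "+882"}

def scam_risk_score(call: CallMetadata) -> float:
    """Combine a few heuristic signals into a 0..1 risk score (illustrative only)."""
    score = 0.0
    if any(call.caller_number.startswith(p) for p in KNOWN_SCAM_PREFIXES):
        score += 0.4                                  # number matches a reported range
    if call.caller_country != call.callee_country and call.claims_to_be_bank:
        score += 0.3                                  # "your bank" calling from abroad
    if call.ring_duration_s < 2.0:
        score += 0.1                                  # classic one-ring callback bait
    score += 0.2 * call.synthetic_voice_score         # weight the voice-analysis result
    return min(score, 1.0)

# Example: a short foreign "bank" call from a reported prefix with a cloned-sounding voice.
call = CallMetadata("+14735550101", "GD", "DE", 1.2, True, 0.9)
if scam_risk_score(call) >= 0.5:
    print("Flag the call as suspicious before the user answers")
```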
