Artificial intelligence (AI) can believably imitate the voice of a well-known person, an expert or a family member. A short recording taken from a social network or a video is enough. Fraudsters exploit this in phone scams, so citizens should be cautious, the police warned in a post on social networks.
“The scenario is usually similar. The caller offers a profitable investment, a quick appreciation of money, or claims that the situation is urgent. He uses pressure, emotion and time constraints. The voice sounds trustworthy, calm and professional. That is the biggest risk – people believe what they hear. The money is then transferred to foreign accounts or to unknown investment platforms,” the police said.
In other cases, the person’s voice is real and the fraudster uses a so-called false legend. “The caller introduces himself as an employee of a bank, the police or an investment company. He claims that immediate action is necessary to prevent the loss of funds,” the police explained.
Recently in particular, there have been cases in which the fraudster poses as an investigator and sends the victim a fake police ID card. The police warned that their officers never ask for money and never send copies of their service ID cards.