Real doctors, fake sales: AI scams deceive patients into buying products

Dr. Robert H. Lustig is an endocrinologist, emeritus professor of pediatrics at the University of California, San Francisco, and author of bestselling books on obesity.

He is definitely not, despite what you may see and hear on Facebook, selling “liquid pearls” with dubious weight-loss promises. “No injections, no surgery, just results,” one post appears to say.

In fact, someone used artificial intelligence to create a video that mimics him and his voice – all without his knowledge, much less consent.

These posts are part of a global wave of fraud that hijacks the online image of renowned medical professionals to sell dubious health products or simply fool unsuspecting consumers, according to doctors, government authorities and researchers tracking the problem.

AI tools developed by large technology companies are allowing those behind these imitations to reach millions of people on the internet, and to profit from it. The result is the spread of misinformation, a loss of trust in the medical profession and potential risks to patient health.

Even when the products are not dangerous, selling useless supplements can create false hope in people who should be receiving proper medical treatment.

“There are so many things wrong with all of this,” the real Lustig said in an interview when told about the imitations. It was the first he had heard of them.

The FDA (the U.S. counterpart to Brazil’s Anvisa) and other government agencies, along with advocacy groups and private watchdogs, have stepped up warnings about counterfeit or fraudulent health products on the internet, but seem to have done little to contain their spread.

Advances in AI tools have made it easier to create convincing content and spread it on social networks and e-commerce sites, which often fail to enforce their own antifraud policies.

Today there are hundreds of tools capable of recreating someone’s image and voice, said Vijay Balasubramaniyan, CEO of Pindrop, a company that monitors deceptive uses of AI. The technology is so advanced that scammers can create convincing impostors from just a few videos or photos.

“I can create an AI bot that looks like you and has complete conversations based only on your LinkedIn profile,” he said.

Dr. Gemma Newman, a family doctor in England and the author of two books on nutrition and health, took to Instagram in April to warn her followers about a TikTok video that had been altered to make it look as if she were promoting vitamin B12 capsules and 9,000-milligram beet supplements marketed as pure and nutrient-rich.

Newman was horrified: her image was being used to promote a supplement that, in high doses, could be harmful, exploiting women’s insecurities by suggesting that the pills could make them “feel desirable, admired and confident.”

The video was so realistic that even her own mother believed it was her.

“It’s a double betrayal, because my image is there, supporting something I don’t believe in,” she said.

The imitation of health professionals goes beyond unproven supplements.

Dr. Eric Topol, a cardiologist and the founder of the Scripps Research Translational Institute in San Diego, discovered dozens of apparent AI-generated versions of his latest book on Amazon. One of his patients unknowingly bought a fake autobiography, with an AI-generated portrait of Topol on the cover.

Christopher Gardner, a nutrition scientist at Stanford, recently found he was the involuntary face of at least six YouTube channels, including a so-called “Nutrient Nerd.”

Together, these channels host hundreds of videos, many narrated by an AI-generated version of Gardner’s voice. Most are aimed at older people and offer advice he does not endorse, addressing problems such as arthritis pain and muscle loss. The imitations may be an attempt to build enough of a following to join programs that share ad revenue.

The proliferation of this fake content has made traditional advice on how to find reliable health information online seem outdated, said Dr. Eleonora Teplinsky, an oncologist who is active on Facebook, Instagram and TikTok.

“This undermines everything we tell people about how to identify online misinformation: Are they real people? Do they have a hospital page? How do you know it’s me?” she asked.

Gardner said he was concerned about the amount of nutrition misinformation on the internet. He has been active on social media and has appeared on podcasts to set the record straight.

Now Gardner wonders whether those efforts are being used against him, providing a library of recordings from which to imitate him. Experiences like his can discourage other experts from joining online conversations, he said. “That way, the trustworthy voices will be drowned out even further.”

Gardner and a Stanford representative spent hours reporting the videos to YouTube and to federal authorities. They also left comments on the videos warning that they were fakes, but most of the comments were deleted in less than a minute.

YouTube spokesman Jack Malon told The New York Times that the platform had removed “various channels” for violating its spam policies. TikTok said in a statement that it does not allow most forms of impersonation, but did not comment on the Newman video.

Other doctors, after complaints went nowhere, have resorted to more desperate measures to take down fake pages.

After several reports and a threatening legal letter were ignored by Meta, Dr. Tiffany Troso-Sandoval, an oncologist who runs educational pages about women’s cancers, paid about $260 to someone on Fiverr, the online services marketplace, who promised to take down a fake Facebook page. It didn’t work.

The supplement brand that used Lustig’s image on Facebook also created fake posts in several other countries, including ones featuring real doctors in Australia and Ireland.

The geographic reach shows that it is a large, sophisticated operation that has become a threat to brands worldwide, said Yoav Keren, CEO of BrandShield, the Israeli cybersecurity company that discovered the scheme.

The campaign, which appears to have begun late last year, took advantage of the popularity of a class of medicines called GLP-1s, which have revolutionized the treatment of obesity. The advertised product was called Peaka and appeared to be liquid capsules. (The only approved forms of GLP-1 available today are injections.)

The capsules are sold on temporary websites registered in Hong Kong, Keren said, but who is behind the operation remains unknown.

Despite its uncertain origin and fake advertising, the product was available for purchase on major e-commerce platforms such as Amazon and Walmart, and appeared in Google searches as a sponsored product. (Amazon began removing it after being contacted by The Times. Walmart said the product is not sold in its stores, but that third-party sellers, in violation of the retailer’s policies, had marketed it on the site; those sellers were removed.)

In addition to impersonating doctors, the campaign invoked regulatory agencies and advocacy groups from various countries, including Mexico, Norway, the United Kingdom, Canada and New Zealand, giving the false impression of official approval, according to BrandShield. In the U.S., those groups included the Obesity Society, whose website now displays a warning about what it calls the “e-commerce scam.”

Dr. Caroline Apovian, co-director of the Center for Weight Management and Wellness at Brigham and Women’s Hospital and a professor at Harvard Medical School, found she had become an involuntary Peaka promoter when patients began sending messages asking about her apparent “endorsement” on Facebook.

Apovian and her colleagues found 20 accounts that impersonated her, with posts and ads assembled from real details and photos taken from her Facebook and LinkedIn profiles. She called the campaign “treacherous and dangerous.”

Facebook users raised questions about the product in the comments on fake posts or in groups dedicated to weight loss. Some warned that the product was not what it claimed to be. One woman who bought it asked the fake Apovian about the dosage, which was not clear on the packaging.

Cam Carter, who lives near Victoria, Australia, said Peaka claimed to be manufactured in Australia, but the three boxes he bought for about 45 Australian dollars arrived from China. He was also overcharged through PayPal and had to chase down a partial refund.

“It does nothing it promises,” he wrote in recounting his experience.

Meta, the company that owns Facebook, prohibits impersonation on its platforms, but said it was unaware of the fake accounts until it was contacted by The Times. Last week it began removing the accounts impersonating several of the doctors involved, the company said.

“We know there may be cases that slip through,” the company said in a statement, adding that it is rolling out new efforts to detect impersonations of public figures and celebrities.

Michael Horowitz, a renowned gastroenterology researcher at the University of Adelaide and the Royal Adelaide Hospital in Australia, said the fraud preys on people who have struggled for years with weight loss or diabetes and may not be able to afford genuine, approved GLP-1 medicines.

“They identify the vulnerable, knowing that what they offer is worthless,” he said. “I consider this reprehensible behavior.”

c.2025 The New York Times Company
