AI is not your friend, scientists warn. Millions of people are building emotional bonds with chatbots

Tens of millions of people around the world use artificial intelligence as a companion. While it helps many ward off boredom or satisfy curiosity, scientists are sounding the alarm. In the latest AI safety assessment report, they warn that emotional bonds with machines can deepen loneliness, and urge users not to ignore the problem and to apply critical thinking when using these tools.

A new report, published on Tuesday by a group of leading experts and academics led by Professor Yoshua Bengio, warns of a surge in the popularity of so-called “AI companions”.

Expert and professor Mária Bieliková has also commented on artificial intelligence. She says AI does not think like a human, and she dispels various myths about what people believe AI to be. For example, a fundamental property of most artificial intelligence algorithms is that they are ultimately non-deterministic and ambiguous. The expert concludes that students should learn to write, read, count, and communicate with a computer at school, above all because of generative artificial intelligence.

False support and “sycophancy”

Applications specialized in simulating relationships, such as Replika or Character.ai, already have tens of millions of users. However, people are also looking for comfort and companionship in common universal tools such as OpenAI’s ChatGPT, Google’s Gemini, and Anthropic’s Claude.

“Even ordinary chatbots can become companions,” points out Yoshua Bengio, one of the world’s most respected artificial intelligence experts. “In the right context and with enough interactions, a relationship can develop between the user and the AI,” he explains.

The risk lies in the nature of these systems. Chatbots are designed to please their users as much as possible – a trait experts call “sycophancy”.

“AI is trying to please us in the moment, to make us feel good, but that’s not always in our best interest,” warns Bengio. In his view, the technology harbors traps similar to those of social networks.

The EU is considering regulation

Although many people seek AI precisely to alleviate the feeling of loneliness, the effect can be exactly the opposite. The report notes that while the evidence on psychological impacts is so far mixed, some studies are already showing worrying patterns.

Frequent users of AI companions have reported increased loneliness and fewer social interactions with real people.

The scientists’ warning comes just two weeks after dozens of MEPs called on the European Commission to examine the possibility of restricting these services under the AI Act. The reason is concern about the mental health of citizens.

Bengio expects new regulation to be necessary. Rather than specific rules just for “digital friends”, however, he advocates broader legislation that would address multiple risks at once.

The AI safety report was released ahead of the upcoming global summit, which begins on February 16 in India. Beyond emotional risks, the document also highlights other threats that world leaders will have to confront – from AI-driven cyberattacks and sexually explicit deepfake videos to systems that could provide instructions for producing biological weapons.
