ChatGPT and company caught collecting and sharing confidential data

by Andrea

Artificial intelligence (AI) assistants are collecting and sharing users' confidential data, such as medical and banking records, without proper safeguards.

A new study from University College London in the United Kingdom and the University of Reggio Calabria in Italy, to be presented and published at a conference running between Wednesday and Friday in the US, analyzed ten AI assistants.

The study's key finding, the researchers emphasize, is that these applications improperly collect personal data from users' activity on the internet.

The analysis revealed that several assistants transmitted the full content of web pages, including information visible on the screen, to their servers.

According to the study, one of the AI assistants, Merlin, captured the contents of forms, such as banking or health data, while ChatGPT, Copilot, Monica and Sider were able to infer user attributes such as age, gender, income and interests, and used this information to personalize answers even across different browsing sessions.

"This collection and sharing of data is not trivial. Beyond the sale or sharing of data with third parties, in a world where mass cyberattacks are frequent, there is no way to know what happens to browsing data after it is collected," warned Anna Maria Mandalari, one of the study's authors, in a statement from University College London quoted by Lusa.

For the study, the researchers simulated real-world browsing scenarios by creating the persona of a "rich Generation Y [digital native] man from California," who interacted with the assistants while completing ordinary internet tasks, including reading news, shopping on Amazon, and watching videos on YouTube.

The activities also included tasks in private spaces, such as accessing a university health portal, logging into a dating service or accessing pornography, which the researchers assumed users would not want to have tracked, as the data is personal and sensitive.

According to the study, some AI assistants did not stop recording activity, as they should have, when the user moved into a private space, and violated US data protection legislation by collecting confidential health and education information.
