A new study by university researchers in the US suggests that artificial intelligence applications available in OpenAI's GPT Store could be dangerous. It found that apps in the store, which lets users deploy their own GPTs, collect user data, raising privacy concerns.
Giant vulnerability in GPT Store
Over a four-month period, researchers analyzed more than 120,000 artificial intelligence systems, called GPTs, and the 2,500 services they had access to. They found that many services were collecting sensitive user data such as passwords, web browsing histories and personal details without permission.
Another worrying finding is that GPTs have "unlimited access" to each other's data: information entered in one chat can be shared with other chats, so the data you provide in ChatGPT could end up in the hands of third parties.
According to the analysis, only 6 percent of services provided an explicit privacy policy. The researchers also found that many GPTs were collecting more data than necessary; some, for example, were found to store user passwords.
OpenAI has removed more than 2,800 GPTs for violating its privacy policies. However, the researchers' findings reveal how much data third-party developers can still collect, and they are calling on OpenAI to exercise tighter oversight.
Users need no coding knowledge to build their own custom GPTs, and developers have already created thousands of custom chatbots for everything from website creation to tax consulting.