Data privacy concerns are mounting over the ChatGPT Store, where many apps fail to adequately protect user information.
A study found that many apps on the ChatGPT Store disclose too little about how they collect data and fail to comply with privacy rules. Over four months, researchers from the University of Washington examined more than 120,000 GPT apps and over 2,500 “Actions” apps, and discovered that a large share of them do not clearly state in their privacy policies what kinds of data they collect.
The study showed that GPT apps gather a great deal of sensitive information, including passwords and personal details. “Actions” apps, in particular, collect data for purposes such as ad tracking and analytics, a major privacy problem also seen in mobile and web apps. The researchers further noted that many of these apps do not comply with OpenAI’s own data privacy standards.
The researchers also observed that OpenAI does not do enough to monitor how apps on its platform collect data, and that current practices are insufficient to protect user privacy. This lack of proper data protection puts user security at risk.