Private conversations become public data
According to a report from Fast Company, thousands of private ChatGPT conversations have been indexed by Google, and this may be only the "tip of the iceberg," with millions more conversations at risk of exposure.
While these conversations did not include directly identifying information, many users shared extremely sensitive personal details, from relationships to traumatic experiences, making identification possible.
It all started when security researchers discovered that a simple Google query was enough to surface publicly shared ChatGPT chats.
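The article does not reproduce the exact query, but searches of this kind typically rely on Google's site: operator to restrict results to ChatGPT's public share URLs. An illustrative form, assuming the shared links live under chatgpt.com/share, would be:

    site:chatgpt.com/share "keyword"

Any shared conversation that Google had crawled and that contained the keyword would then appear as an ordinary search result.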
The exposed content ranges from drafted text and technical questions to personal information, sensitive work material, and even private confessions.
The cause was traced to the "Share chat" feature, a utility OpenAI provides so users can share chat content with others. When a user ticks "Make this chat discoverable", the system creates a public link that search engines can index.
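The article does not describe OpenAI's implementation in detail, but the general web mechanism is straightforward: a page that is publicly reachable and not explicitly excluded from indexing can eventually be crawled. Exclusion is normally signalled with a standard robots meta tag, shown here only for illustration:

    <meta name="robots" content="noindex">

A "discoverable" share page presumably omits this tag (or an equivalent robots.txt rule), which is why those conversations surfaced in ordinary Google results.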
The feature's interface, however, was not clear enough, leading many people to believe they were sharing content only with friends or colleagues rather than making it public on the open web.
In response to the backlash, OpenAI took immediate action: it paused the sharing feature and worked with Google to remove the relevant links.
According to Fast Company, more than 4,500 such links have been found so far, a notable figure given that ChatGPT serves hundreds of millions of users worldwide.
More worrying still, many people have used ChatGPT to draft emails, discuss work matters, look up medical information, or even confide personal psychological problems, believing it to be a private space.
Alarm bells over privacy in the age of AI
The incident raises big questions about the responsibility of AI companies for user data. Is OpenAI providing enough transparency for users to understand their privacy? Is the level of data protection users are receiving really commensurate with the sensitivity of the content?
OpenAI CEO Sam Altman has warned users not to share sensitive personal details with ChatGPT, acknowledging that the company could be compelled to hand over such information under a court order.
Notably, however, Altman made no mention in those statements that conversations users voluntarily share could end up publicly indexed by search engines.
Moreover, this is not the first time ChatGPT has faced questions over data leaks. Researchers have warned that large language models such as GPT can inadvertently regurgitate data from their training sets when prompted in clever ways.
There is no denying that tools like ChatGPT have transformed the way people search for information and create content. But along with that convenience, users need to understand that nothing is truly private without strong technical safeguards and careful habits of their own.
Source: https://baovanhoa.vn/nhip-song-so/hang-nghin-cuoc-tro-chuyen-chatgpt-bi-ro-ri-cong-khai-tren-google-158723.html