Data you share with ChatGPT is not private: So be careful with what you reveal

When you ask ChatGPT about its privacy policy, it says that ‘OpenAI collects personal information such as name, email address, and payment information when necessary for business purposes’. Yes, you read that right.

A few days ago, a bug allowed ChatGPT users to see descriptions of conversations that other users were having with the system. The bug was later fixed and CEO Sam Altman apologized for it.

While not a full-blown privacy breach, the incident was alarming nevertheless. ChatGPT rival Bard also reportedly claimed that it was trained on users' Gmail data, though Google denied it. This brings us to the question: is my personal data safe with ChatGPT or other AI tools?

AI platforms can pose a threat to privacy

When using AI tools, it is important to know about the privacy risks they pose. Mukesh Choudhary, founder and CEO of Cyberops, an information security organization, says there are several ways AI platforms can threaten your privacy.

The first is user profiling: by analyzing user behaviour, AI platforms can build a profile that can then be used for targeted advertising or other purposes. The second is data sharing: AI platforms may share user data with other businesses or organizations for a variety of purposes, including research or advertising.

The next is third-party access, where AI systems might give outside developers access to user data, which they could exploit for their own purposes. Finally, there is the risk of data breaches. AI platforms are not immune to breaches, which can result in unauthorized individuals accessing user information. Such breaches can arise from flaws in a platform's security practices, such as weak passwords or unencrypted user data.

“ChatGPT goes and reads everything on the Internet. But it has got no consent from us. And they could be selling off this data,” says Ritesh Bhatia of V4WEB Cybersecurity, a cybercrime investigations and digital forensics firm.

What the law says

India already has a law governing the protection of sensitive personal information, classified under Section 43A of the Information Technology Act.

“Under section 43A any organization is under a responsibility to protect personal information shared with the organization and follow due diligence or reasonable security practices for protection of the data. If there is a data leak or breach in security, that organization is liable to compensate the victim. However, if the victim has shared the personal data willingly as per the privacy policy of the body corporate and given consent for sharing such data, then that person does not have any right against the organization when the data is shared with the third party,” says Vaishali Bhagwat, Advocate, a practicing civil and cyber lawyer.

Source: https://www.businessinsider.in/tech/news/data-you-share-with-chatgpt-is-not-private-so-be-careful-with-what-you-reveal/articleshow/99116335.cms
