
Google warns that Gemini conversations are not private

Google’s latest Gemini privacy notice serves as a stark reminder of the potential privacy risks associated with AI applications. The tech giant explicitly warns users against sharing personal information with its Gemini app, formerly known as Bard, highlighting the type of data it collects during interactions. This development underscores the delicate balance between technological convenience and user privacy in the age of artificial intelligence.

Google admits to collecting conversations, location data, user feedback, and usage information through Gemini. This comprehensive data collection is intended to refine and enhance Google’s products, services, and machine learning technologies.

Despite assurances that it takes privacy seriously and does not sell personal information, Google retains a subset of conversations to improve Gemini, anonymizing user-identifying details, which raises questions about who exactly has access to this data.

The policy reveals that conversations reviewed by human reviewers are retained separately from users’ Google Accounts for up to three years, even if the user deletes their Gemini Apps activity. This long retention period, coupled with the fact that conversations can be saved with the user’s account for up to 72 hours even when Gemini Apps Activity is turned off, further complicates the privacy landscape.

Google’s emphatic warning not to share confidential information via Gemini interactions reflects a growing concern over the privacy implications of AI technologies. Users are advised to be cautious about the information they divulge in these digital exchanges.

In essence, while AI applications like Gemini promise to revolutionize how we interact with technology, they also introduce significant privacy considerations. Users must navigate these waters carefully, balancing the benefits of AI-enhanced convenience against the potential for privacy intrusion. Google’s latest warning is a clear signal to users to exercise caution and discretion in their interactions with AI technologies.

Sources include: ZDNet
