How Google's Gemini AI Accessed Private Tax Data

Unpacking the Latest AI Privacy Controversy: Google Gemini's Unsolicited Document Analysis

Jul 19, 2024

In a recent X/Twitter thread, Georgetown Law Professor Kevin Bankston shared a concerning experience with Google's new AI tool, Gemini. The incident was shocking even to long-time Google critics like us, but it sparked a necessary discussion about privacy, data security, and transparency issues with Big Tech AI tools.

Welcome to another issue of Secrets of Privacy, where we discuss personal privacy-related topics and provide practical tips you can use immediately to shield your personal privacy.

If you’re reading this but haven’t yet signed up, join the growing Secrets of Privacy community for free and get our newsletter delivered to your inbox by subscribing here 👇

The Incident

Bankston recounted how he accessed his tax return document stored in Google Docs. To his surprise, Gemini, Google's AI, automatically generated a summary of his tax return without any prompt or request from him. This unrequested action by Gemini raised immediate red flags about the extent of data access and processing capabilities embedded in Google's AI systems.

Privacy Implications

Bankston's experience highlights several critical privacy issues with AI tools and cloud services:

  1. Unsolicited Data Processing. The AI's ability to autonomously process and summarize sensitive documents without express user consent poses a significant threat to privacy. Google has effectively opted users into AI analysis of their personal documents by default.

  2. Data Security Risks. If AI systems can access and process private documents without explicit permission, it raises questions about the security measures in place to protect user data from unauthorized access.

  3. Transparency. The incident underscores the need for clear guidelines and transparency in AI tools. Users must be informed about how their data is being used and have control over how AI interacts with their personal information. Burying terms in click-through agreements is not acceptable.


Google's Response
