What McDonald’s Hot Coffee Can Teach You About Protecting Your Data
How Businesses Decide When to Protect Your Privacy (and When Not To)
Travel back in time to a McDonald's drive-thru in the 1990s.
You grab your morning coffee, take a sip, and suddenly…burning pain. The coffee is scalding hot, dangerously so. People were regularly burned by McDonald’s coffee back then, though the company kept it quiet.
Until one woman, Stella Liebeck, sued in 1994 and a jury awarded her nearly $3 million (an amount later reduced by the judge). Interestingly, public opinion was overwhelmingly against the injured woman. The lawsuit even became part of pop culture.
But here’s the thing most people didn’t know about this story (likely thanks to an effective PR campaign by McDonald’s and its Big Business allies):
McDonald’s knew its coffee was served at temperatures (around 180 to 190 degrees Fahrenheit) hot enough to cause third-degree burns in seconds. But it made a business decision to serve it that way anyway.
Why?
The benefits of keeping it that hot outweighed the risks of lawsuits and bad press.
That’s how corporate decision-making works.
And companies don’t make these calculated trade-offs only in the fast-food industry. Companies are companies, and incentives are incentives, regardless of industry.
The same thinking that led McDonald’s to serve scalding coffee despite knowing the risks applies to how companies treat your personal data.
The Corporate Approach to Privacy: A Risk-Based Decision
Most people assume companies protect their data because of privacy laws or ethical responsibility. But that’s not how it works.
Sure, some companies do have legitimate privacy practices that come from a genuine desire to do the right thing. They are the exceptions.
Most companies don’t protect your privacy because they care. They protect it when the cost of violating it outweighs the benefits of exploiting it.
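To make that calculus concrete, here’s a toy back-of-the-envelope sketch in Python. Every number in it is hypothetical, and real decisions are messier, but the shape of the math is the point: a rare, capped penalty barely dents a large, near-certain upside.

```python
# Toy sketch of the corporate privacy calculus described above.
# All figures are hypothetical; this is an illustration, not a real model.

def expected_payoff(upside, downside, enforcement_odds):
    """Expected value of exploiting user data: the near-certain upside
    minus the downside, discounted by the odds it ever materializes."""
    return upside - downside * enforcement_odds

# Hypothetical figures, in millions of dollars:
data_revenue = 500.0      # extra revenue from collecting and monetizing more data
fines_and_suits = 100.0   # worst-case fines, settlements, and bad press
enforcement_odds = 0.10   # chance regulators or plaintiffs ever make it stick

if expected_payoff(data_revenue, fines_and_suits, enforcement_odds) > 0:
    print("Business decision: exploit the data")  # 500 - 100 * 0.10 = 490
else:
    print("Business decision: protect the data")
```

On those made-up numbers, even a tenfold increase in fines wouldn’t flip the decision; only penalties that are both large and likely change the math.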
Take Amazon’s recent decision to eliminate local processing on Alexa devices. Previously, some Alexa requests could be processed on the device itself. Now, every command you give must go through Amazon’s cloud.
Why?
Because Amazon knows that controlling more user data gives them a massive advantage in the AI race. Whatever legal risks or backlash may come from this move, they’ve determined that the benefits far outweigh the downsides.
The same applies to OpenAI’s ChatGPT, which recently launched a feature that lets users upload personal photos and generate anime-style cartoon versions of themselves.
Innocent fun?
Maybe. But also an incredibly effective way to collect millions of user images for AI training.
Bonus legal exposure: by allowing prompts that specifically invoke the term “Ghibli,” OpenAI may also have opened itself to an IP infringement claim from Studio Ghibli, the Japanese animation studio. OpenAI would have known the risk, but made a business decision to proceed anyway.
Companies don’t always tell you the full story about how your data is used. Privacy notices are carefully crafted legal documents, often vague enough to grant companies broad rights while maintaining flexibility. And that’s assuming the privacy notice is accurate, not out of date or outright wrong. We discussed this in a prior post.
The LinkedIn Playbook: Ask for Forgiveness, Not Permission
Tech companies have a history of ethically questionable, and sometimes illegal, data collection practices.
Remember when LinkedIn used its access to users’ email contacts to aggressively grow its network? It made a business decision: harvesting user contacts would supercharge the platform’s growth, and it bet it could grow quickly enough to make the cost of any lawsuits and regulatory fines a rounding error. In LinkedIn’s case, the bet paid off.
This isn’t paranoia; it’s a pattern.
Companies collect your data, sometimes without explicit consent, then justify it as an acceptable business risk. If caught, they pay a fine, tweak their policies, and move on. Meanwhile, the damage to user privacy is already done.
Small Wins: How to Push Back
The good news? You don’t have to accept this dynamic passively. There are a few practical steps you can take.