Understanding Data Privacy Policies for AI Tools
What you need to know about GDPR, customer data, and staying compliant in the UK.
Why It Matters
AI tools like ChatGPT, Claude, and Gemini are increasingly used at work. But many people don’t realise that these tools process data – and often, that includes customer or personal information.
If you use AI at work, you’re responsible for how that data is handled. Understanding an AI tool’s privacy policy helps you avoid breaching UK GDPR and maintain customer trust.
What Is a Data Privacy Policy?
A privacy policy outlines how a company collects, uses, stores, and shares data. When it comes to AI tools, the privacy policy tells you:
- What data is collected (e.g. your prompts, documents, uploads)
- Whether your data is stored or deleted
- If your data is used to train models
- Where the data is processed (UK, EU, or internationally)
- What options you have to opt out
What UK GDPR Requires
Under the UK GDPR, you must protect personal data – even when using third-party tools. Key requirements include:
- Lawfulness, fairness, and transparency
- Purpose limitation
- Data minimisation
- Accuracy
- Storage limitation
- Security and confidentiality
- Accountability (you must prove you’ve followed the rules)
If you paste identifiable customer data into a free AI tool, that could be a breach.
Key Questions to Ask Before Using an AI Tool
- Is my data stored?
- Is it used for training the model?
- Can I opt out of data use?
- Where is the data processed?
- Can I get a Data Processing Agreement (DPA)?
What to Look for in the Privacy Policy
Search for:
- "retention"
- "data use"
- "training"
- "improve our model"
- "data processing"
- "international transfer"
- "DPA"
If any of these terms are missing, or the wording around them is vague or hard to find, treat that as a red flag.
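As a quick first pass, the search above can be automated. This is a minimal sketch, assuming you have saved the policy text locally; the term list mirrors the keywords listed in this section, and absence of a term is only a prompt for closer reading, not a verdict.

```python
# Illustrative sketch: check a privacy policy for the key terms above.
KEY_TERMS = [
    "retention", "data use", "training", "improve our model",
    "data processing", "international transfer", "dpa",
]

def find_terms(policy_text: str) -> dict[str, bool]:
    """Return which key terms appear anywhere in the policy text."""
    text = policy_text.lower()
    return {term: term in text for term in KEY_TERMS}

# Example usage with a made-up policy snippet.
policy = "We may use your prompts for training and data processing."
results = find_terms(policy)
missing = [term for term, found in results.items() if not found]
print("Terms not mentioned:", missing)
```

A keyword hit (or miss) is only a starting point: a policy can mention "retention" and still be vague about it, so read the surrounding clauses yourself.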
Real-World Examples (as of 2024)
| Tool | Default Policy | Can You Opt Out? |
|---|---|---|
| ChatGPT (free) | Stores data and may use it for training | ✅ Yes (disable "improving the model" in settings) |
| ChatGPT (Team/Enterprise) | No training, data not stored | ✅ Yes |
| Claude | No training on input data | ✅ Yes |
| Gemini | Data may be used unless settings are changed | ✅ For business accounts |
Disclaimer: Check the latest policies yourself – these change regularly.
Safe Practices
- Use dummy data when testing
- For sensitive data, only use tools that offer a DPA
- Stick to enterprise-grade tools with clear policies
- Don’t assume “free” means “safe”
- Ask your DPO or legal team if unsure
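The "use dummy data" practice above can also mean redacting obvious identifiers before pasting text into a tool. This is an illustrative sketch only, with hypothetical regex patterns for e-mail addresses and UK phone numbers; it is not a complete PII scrubber, and your DPO should sign off on any real redaction process.

```python
import re

# Illustrative patterns only -- real PII detection needs far more coverage
# (names, addresses, account numbers, etc.).
EMAIL = re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b")
UK_PHONE = re.compile(r"\b(?:\+44\s?|0)\d{4}\s?\d{6}\b")

def redact(text: str) -> str:
    """Replace e-mail addresses and UK phone numbers with placeholders."""
    text = EMAIL.sub("[EMAIL]", text)
    return UK_PHONE.sub("[PHONE]", text)

print(redact("Contact jane.doe@example.com or 07700 900123."))
# -> Contact [EMAIL] or [PHONE].
```

Even with redaction, prefer enterprise-grade tools for anything customer-related: stripping identifiers reduces risk but does not make data anonymous under UK GDPR if individuals remain identifiable from context.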
Remember
Understanding a tool’s privacy policy is key to using AI responsibly at work. If in doubt, stop and check. Data privacy is everyone’s job – not just IT.
Disclaimer
This guide is for general informational purposes only. It is not legal advice. Always consult your organisation’s legal or data protection officer when evaluating AI tools or processing personal data.
Want More Resources Like This?
Sign up for our Thoughts by Humans newsletter to receive the latest AI and data resources directly to your inbox.