If you create content, sell digital products, or advise clients — and you use AI to help produce your work — there are compliance questions you need to have answered. Not hypothetical future questions. Questions you should be able to answer today, in writing, if a client or regulator asks.
The Questions You Need to Answer
Start here. Can you confidently answer the following?
- What AI tools do you use, and what data do those tools' providers collect?
- What client or customer information (if any) goes into AI tools, and under what terms?
- Do the terms of service for your AI tools permit you to sell the outputs commercially?
- Is there an attribution or disclosure obligation for AI-assisted work in your field or jurisdiction?
- If your work is challenged as low-quality or inaccurate, can you demonstrate human review?
If there's any question here you can't answer confidently, that's where to start.
Privacy and GDPR
If you're in the UK or EU (or serving customers there), GDPR applies to personal data you process — including data you enter into third-party AI tools. Entering client names, emails, or any identifiable information into a commercial AI tool is a data processing activity. The tool's provider becomes a data processor, and you remain the data controller. You need either:
- A Data Processing Agreement (DPA) with the AI provider, or
- A process for anonymising data before it enters the tool
Most major AI providers (OpenAI, Anthropic, Google) offer DPAs under their business/enterprise plans. Check your current plan's terms. Consumer-tier accounts typically don't include the DPA needed for processing client personal data.
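The anonymisation route can start as a scripted redaction pass run before any text leaves your machine. Here's a minimal sketch; the regex and the `[EMAIL]`/`[CLIENT]` placeholders are illustrative only, and a pattern this simple is not a substitute for proper PII detection:

```python
import re

# Masks email addresses and a supplied list of client names before
# the text is pasted into a third-party AI tool.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

def anonymise(text: str, client_names: list[str]) -> str:
    text = EMAIL_RE.sub("[EMAIL]", text)
    for name in client_names:
        # Case-insensitive literal match on each known client identifier
        text = re.sub(re.escape(name), "[CLIENT]", text, flags=re.IGNORECASE)
    return text

print(anonymise("Email jane@acme.example about the Acme Ltd renewal",
                ["Jane Doe", "Acme Ltd"]))
# prints: Email [EMAIL] about the [CLIENT] renewal
```

Even a crude pass like this, applied consistently, is a documented process you can point to — which is the point of the GDPR question above.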
Terms of Service: What You Can and Can't Sell
Read the commercial use clause in the terms of service for every AI tool you use. Most major AI tools permit commercial use of outputs, but there are variations:
- Some tools restrict use of outputs for training competing AI systems
- Some tools' free tiers retain rights to use your inputs for model training
- Some tools explicitly disclaim any warranty on the outputs (relevant if you're selling deliverables)
- Image generation tools have specific and widely-varying rules about commercial licensing
Check the current terms — not a summary, not an article from 2023. Terms change.
Attribution and Disclosure
Legal disclosure requirements vary by jurisdiction and sector. In most jurisdictions, there's currently no general legal requirement to disclose AI involvement in written content — but this is changing, particularly in advertising, journalism, and some regulated industries. Check current requirements for your specific field.
Separately from legal requirements: professional ethics. If a client is paying for your expertise and receives AI output with light editing, they may have a reasonable expectation of knowing that. The ethical answer is to be transparent about your process, especially with ongoing client relationships.
Demonstrating Human Review
If you're challenged on the quality or accuracy of AI-assisted work, your defence is your review process. This means having a documented, consistent review step: how you check facts, how you verify claims, how you ensure the output meets your standard. Keep records — even informal notes — of your review process for significant deliverables. "I reviewed it" is weaker than "I checked all specific claims against primary sources and made the following edits before delivery."
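Those records don't need to be elaborate. One workable sketch is an append-only log with one structured entry per deliverable; the field names and file format here are assumptions, not a standard:

```python
import datetime
import json

# Appends one JSON line per deliverable recording what was checked and
# what was changed, so human review is demonstrable after the fact.
def log_review(deliverable: str, checks: list[str], edits: list[str],
               path: str = "review_log.jsonl") -> dict:
    record = {
        "deliverable": deliverable,
        "reviewed_at": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "checks": checks,   # e.g. "verified pricing claims against vendor docs"
        "edits": edits,     # substantive changes made before delivery
    }
    with open(path, "a", encoding="utf-8") as f:
        f.write(json.dumps(record) + "\n")
    return record
```

A one-line call after each review ("checked all statistics against the cited reports; rewrote section 2") turns "I reviewed it" into the stronger, dated answer the paragraph above describes.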
The One-Page Policy
For any creator or consultant using AI in client work: write a one-page internal policy that covers the four areas above — data handling, commercial use, disclosure, and review. Share it with clients who ask. Update it when your tools or practices change. Having it written doesn't just protect you — it forces you to think through the questions before a client or regulator forces you to think through them under pressure.
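A skeleton for that one-page policy might look like the following; the angle-bracket fields are placeholders to fill in for your own tools and practice:

```
AI Use Policy: <your business name>          Last reviewed: <date>

1. Data handling
   - Tools in use: <tool, plan tier, DPA in place? yes/no>
   - Client personal data is <never entered / anonymised before entry>.
2. Commercial use
   - Outputs are delivered under <tool>'s commercial use terms,
     last checked on <date>.
3. Disclosure
   - AI assistance is disclosed <always / on request / where required
     by law or sector rules>.
4. Review
   - Every deliverable passes a documented human review:
     <describe the checks and where records are kept>.
```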