
How does Figma AI affect your copyright and NDAs? I asked a lawyer.
Most likely, like me, you hate dealing with the legal aspects of being a designer. Yet AI brings new complexities to both our work and its legal side.
For example, Figma is going to train an AI on your work that’s under an NDA. Does it mean you’ve breached it? They will train their AI model on your copyrighted work. Who owns the copyright now? Do you own the copyright for the designs Figma AI is going to generate for you?
To answer these questions, I hired a lawyer specializing in contracts, privacy policies, and intellectual property for 10 hours of research (thanks, Maryna). I asked her to review Figma’s new AI terms and the last three contracts with NDAs I signed as a freelancer.
The full results are here; a summary and my interpretations are below.
And yes, you can still opt out of Figma AI training. However, training AI on our work is likely to become even more widespread and eventually the norm. Over time, the opt-out option may become less common.
The information in this article is for informational purposes only and does not constitute legal advice.
Allowing AI training could breach your NDA
If you’re a full-time employee, you can skip this part. You’re likely using your corporate Figma account, so the legal complexity falls on your company, not on you.
As a freelancer or an agency, the NDA you signed with your clients most likely forbids you from sharing their data with third parties. Allowing Figma to train on your files could fall into this category.
Yes, Figma says they will only “use data in a de-identified and aggregated way to protect your privacy”. If they do that, no one will see your client’s data, and your clients may never even know the AI was trained on it.
However, allowing training can still be considered sharing clients’ data with a third party.
NDAs often allow designers to use clients’ information solely for providing services. AI training is not related to providing services to the client, especially if the designer can opt out.
That said, there have been no legal cases yet in which a company sued a contractor for allowing AI to train on its data — most likely because the practice is still in its early stages.
What you can do
To reduce risks:
- Review contracts for any AI-specific requirements to ensure you comply. They aren’t common today but are likely to appear soon.
- If you can, opt out of AI training on client files to reduce legal exposure.
- If you can’t opt out, add a disclosure to the contract. State that, in order to perform your duties, you may need to allow the software you use to train on the client’s data.
It becomes hard not to violate others’ copyright
You own the copyright for your work even if Figma trains AI on it. You also own the copyright for the designs you generate with Figma AI (unlike Midjourney for example).
Here is where it becomes tricky. What if your files include someone else’s work that you don’t own a copyright for and Figma uses it to train their model?
Let’s say you create a mood board with screenshots of other people’s designs. You don’t own the copyright for those designs, yet you’ve allowed Figma to train its AI on them. You’ve now violated the original owner’s copyright.
Figma could be held co-responsible for such a violation. However, Figma’s terms state that “users will hold Figma harmless against any claims concerning customer content.”
So if someone sues Figma for using copyrighted work that you uploaded, you can be held accountable. Of course, that’s only if Figma can prove you’re the one who had those files.
What you can (not) do
It seems convenient for Figma: their users supply them with copyrighted work for training, and if a dispute arises, the responsibility for the copyright infringement is transferred to the users.
This is unlikely to be an intentional move by Figma; it’s more a side effect of their AI training program. And I’m not sure how it could be avoided, short of shutting the whole training program down.
The legal recommendation is to avoid using any copyrighted work in your files that could be used to train an AI model.
However, using others’ designs in our Figma files, especially in the exploration and research phases, is an essential part of our workflow. We probably won’t be able to avoid it, so I’m not sure we can follow that advice.
The reality seems to be that no matter how hard we try to protect our work from AI training, if it’s on the internet (especially if it’s public), it will eventually be used to train AI.
If you’d like to increase your impact as a designer and build a better career, check out my new book The Effective Designer.