
January 5, 2026 | Law, GDPR

Is uploading contracts to public AI legal?

Many companies, fascinated by the capabilities of generative AI, overlook one key question: where does the data entered into the chat window actually go?

The Problem of Model “Training”

Most free, publicly available AI tools offered by global cloud operators reserve the right in their terms of service to use user conversations to “improve service quality.” In practice, this means training future versions of models on your data.

Example: If you paste a fragment of an NDA or a client's personal data into the chat, that information is sent to servers outside the European Economic Area (usually to the USA) and may be incorporated into the training data of future model versions.

What about GDPR?

Under GDPR, the data controller must know where personal data is processed and who has access to it. Article 28 GDPR requires a written Data Processing Agreement (DPA) with any processor handling personal data on your behalf; using public language models without one is risky under European law and may result in fines imposed by Data Protection Authorities.

The Solution: Private Instances

An alternative for companies is a solution such as the PrivatAI.pl private model, which operates on local, isolated servers or in a private cloud. Under this model, the service provider has no right to use client data to train its models, and the data remains fully under your control.

Want to use AI securely in your company?

Don’t risk data leaks to the public cloud. Test a secure, GDPR-compliant AI environment.

Check PrivatAI Pricing