The Dutch Data Protection Authority (Autoriteit Persoonsgegevens, or AP) is concerned about how organizations that use so-called generative artificial intelligence (AI), such as ChatGPT, handle personal data. The AP is therefore taking several actions in the coming period. First, the AP has asked software developer OpenAI for clarification about its chatbot ChatGPT. Among other things, the AP wants to know how OpenAI handles personal data when training the underlying system.

ChatGPT is a chatbot that can provide seemingly convincing answers to many different questions. For example, users can ask ChatGPT to do their math homework or write computer code, or consult the chatbot for advice on relationship problems or medical issues. In the Netherlands alone, 1.5 million people used the chatbot in the first four months after its launch.

Trained with personal data

ChatGPT is based on an advanced language model (GPT) that is trained with data. This can be done by collecting data that is already available on the Internet, but also by storing and using the questions that people ask. That data can contain sensitive and very personal information, such as when someone asks for advice about a marital dispute or medical issues.

The AP wants to know from OpenAI whether people’s questions are used to train the algorithm, and if so, in what way. The AP also has questions about how OpenAI collects and uses personal data from the Internet. Further, the AP has concerns about information about people that GPT “generates” by providing answers to questions. The generated content may be inaccurate, outdated, inappropriate, offensive, or objectionable, and may take on a life of its own. Whether, and if so how, OpenAI can rectify or delete that data is unclear.

Other privacy regulators in Europe are also concerned about ChatGPT and believe a joint approach is necessary. Consequently, privacy regulators in Europe have set up a ChatGPT task force within their collaborative framework, the European Data Protection Board (EDPB), to exchange information and coordinate actions.

Personal data is often used in algorithms and AI. When it is, the use of those algorithms must comply with the General Data Protection Regulation (GDPR). The AP oversees this compliance.

Because algorithms – with or without the use of personal data – can be found in all sectors, it is important to keep an overall eye on their risks and effects, and for regulators, inspectorates, and other organizations that deal with them to share knowledge and strengthen existing collaborations. The AP coordinates and promotes this joint supervision.

Does your organization use ChatGPT or any other form of AI? If you have any questions or need assistance, contact us, the Experts in Data Privacy, at