Board of German data protection authorities (“DSK”) publishes first guidelines on data protection for AI

The DSK guidance document "Artificial intelligence and data protection" (available in German here) primarily addresses controllers using AI, but also indirectly developers, manufacturers and providers of AI solutions. It provides an overview of relevant criteria from the perspective of the authorities, but should not be understood as an exhaustive list of requirements. Nevertheless, the document references a large number of different legal requirements. It is noticeable, however, that many statements are not substantiated in much detail and read more like initial general information on a topic that is currently highly relevant in practice. The focus lies on large language models offered in the form of chatbots or used as the basis for other applications. The document is divided into three sections: 1) Conception of the use and selection of AI applications, 2) Implementation of AI applications and 3) Use of AI applications. Below, we summarize the key statements for each section; at the end, you will find a brief critical analysis of selected statements made in the guidelines.

1. Key statements on the conception of the use and selection of AI applications

The guidance essentially contains the following key statements in the first section:

  • The field of application and the purpose of the data processing should be defined in advance, as this is the only way for controllers to check the criterion of necessity of the relevant legal basis - in the case of public bodies, the performance of their tasks is also relevant (para. 1 and 2).
  • Some types of use of AI are completely prohibited or only permitted in very narrow exceptional cases. In this context, the DSK refers to the AI Act and, as examples, to social scoring and biometric real-time monitoring of public spaces (para. 3).
  • The DSK considers it possible to define areas of application without personal data. As an example, reference is made to analyses with geological maps without any connection to a residential area. When using a chatbot to check non-personal code sequences, it should be checked whether the AI model itself may nevertheless involve the processing of personal data (for example, in relation to users of the service) (para. 4 to 6).
  • Controllers which have not developed an AI themselves must consider whether the AI has been trained with personal data and in accordance with the GDPR. Even without influence on the development process, such controllers must ensure that any training of AI that does not comply with data protection requirements does not affect the lawfulness of the data processing carried out under their own responsibility (para. 7 and 8).
  • A legal basis is required for each processing step - the DSK keeps the relevant parts very general here and also refers to the discussion paper "Legal bases in data protection when using artificial intelligence" (available in German here) by the LfDI Baden-Württemberg (para. 9 to 11).
  • In the context of Art. 22 (1) GDPR, the DSK takes the view that where AI merely prepares a decision and the final decision is made by a human, that human must not be involved only formally. As an example, the DSK mentions an AI which independently evaluates applications and sends invitations to job interviews; in the DSK's opinion, such an approach violates Art. 22 (1) GDPR. The guidelines also address automated administrative decisions (para. 12 to 14).
  • In the opinion of the DSK, closed AI systems are preferable to open systems. This is justified, among other things, by the fact that open systems pose a risk of further processing for purposes other than those intended and of data being made available to third parties. It should also be considered whether the AI itself has access to the internet and whether a personal reference can be established as a result (para. 15 to 20).
  • If input data and output data are used to train the AI, this must be transparent. In addition, the DSK believes that it should be possible to switch off the use of data for training purposes (para. 24).
  • In particular, if several employees use a shared account and the previous history is displayed there, this must be made transparent. In addition, there should be an option for users to individually determine whether the input history is saved (para. 25).
  • Data subjects' rights must also be implemented in the context of data processing carried out with AI. In the opinion of the DSK, the right to rectification is particularly relevant here because some providers of AI applications themselves state that the results must be checked for accuracy. The DSK emphasizes that although suppression of output results does not result in deletion, filters used for this purpose are nevertheless helpful for guaranteeing the rights of data subjects (para. 26 to 30).
  • The data protection officer should be involved, especially if the AI is used to prepare later decisions with legal effect (para. 31).

2. Key statements on the implementation of AI applications

The guidelines essentially contain the following key statements on the implementation of AI applications:

  • It should be clarified in advance who the controller is and which other actors are acting in which data protection roles. If the AI application is operated by one entity exclusively for its own purposes on its own servers, this entity is generally also the sole controller in the opinion of the DSK. If an entity uses an AI offered by another entity for its own purposes, then this entity using the AI generally acts as the controller and the provider as the processor. In the opinion of the German authorities, joint controllership may apply, for example, if an AI application is fed or trained with different data sets in the case of cooperation between several entities. The same should apply if an entity's AI application is further developed on its platform into new AI applications by other entities (para. 32 to 34).
  • The DSK points out that in the case of joint controllership, each controller requires its own legal basis and the exchange of data between joint controllers also requires a legal basis that is not contained in Art. 26 GDPR (para. 35).
  • There should be clear internal rules and instructions on the use of AI. In this context, the DSK mentions that many companies are currently experiencing uncontrolled use of AI by employees. In the opinion of the German authorities, it should be regulated, for example, under which conditions and for which purposes AI may be used and which services may be used. Employees should also receive general training and be made aware of whether and under what conditions AI may be used (para. 36 to 37 and para. 46).
  • In the opinion of the DSK, a data protection impact assessment should often be carried out. If AI from a third-party provider is used, it should already be considered during the purchasing process whether the provider can provide the company using the AI with sufficient meaningful information for this purpose (para. 38 to 40).
  • The DSK points out that employees should use AI applications only on company devices. In the opinion of the authorities, it is even necessary that a general company email address (e.g. info@) is used for account registration and that the account names used do not reveal the names of the employees (para. 42).
  • Existing requirements in the area of data security and data protection by design and by default must be implemented. For example, privacy-friendly default settings should be chosen regarding the use of input and output data for training and the display of the history in the account. With regard to data security, reference is also made to publications of the German Federal Office for Information Security, which are available in German here (para. 43 to 45).
  • The DSK points out that additional requirements from the AI Act must be observed and that technical and legal developments should continue to be monitored (para. 47).

3. Key statements on the use of AI applications

The guidelines essentially contain the following key statements on the use of AI applications:

  • When using AI, attention should be paid to whether personal data is processed upon input of information and whether the result provided by the AI may be personal data. If this is the case, there must be a legal basis and, if necessary, data subjects must be informed in accordance with Art. 13 / 14 GDPR. The DSK gives various examples of inputs and outputs with and without personal reference (para. 48 to 61).
  • The DSK points out increased admissibility requirements for the processing of special categories of personal data. Such processing occurs, for example, if information about medication use or regular visits to a church is processed. In the example provided by the authorities, it is pointed out that AI applications are widely used in the context of cancer diagnostics and that their use can also be based on Art. 9 (2) h GDPR in conjunction with the treatment contract (Art. 6 (1) b GDPR) if they are approved as a medical device. According to the DSK, if there is no such approval, the only remaining option is consent (para. 62 and 63).
  • The DSK once again draws attention to the requirements for accuracy of the processed data and advises reviewing the results produced by an AI and the associated consequences for the permissibility of data processing (para. 64).
  • In the opinion of the DSK, accurate personal data can also have a discriminatory effect. In these cases, the authorities believe that there may be no legal basis for processing such data. As an example, reference is made to the fact that the balancing of interests pursuant to Art. 6 (1) f GDPR cannot result in favor of the controller if the processing aims to carry out discrimination prohibited under the German Federal General Equal Treatment Act (para. 66 to 69).

4. Brief critical analysis

First of all, it is positive that the authorities have positioned themselves on the highly topical and practically relevant subject of AI in terms of data protection law. However, data controllers should not expect the guidance to contain in-depth analyses or differing views on data protection issues. In many places, the authorities make statements without further justification or a detailed explanation of how they reached their conclusions. The document itself mentions that the guidelines will be supplemented in the future, which means that statements could still be further substantiated or specified.

Some parts of the guidelines frankly seem a little small-scale and ignore the major data protection challenges. For example, when the DSK emphasizes in para. 42 that accounts should not contain the names of individual employees and that a functional company email address should be used, this hardly addresses the important data protection issues around the use of AI. The same applies to the references to using self-hosted servers for AI data processing. The suggestion that a closed AI system is preferable to an open one also seems to ignore the reality of average companies: considering the current market situation in the AI sector, such companies often have no option other than to use open systems from a third-party provider. In addition, in the context of the accuracy of personal data, the question arises as to how companies should deal with the fact that AI - just like humans - does not always provide 100% correct answers. The relevant parts of the guidelines could be understood to mean that in such cases, the use of potentially incorrect results is always associated with data protection issues.

Even if the guidelines certainly do not answer the biggest data protection questions around the use of AI, it is good to get a sense of which aspects the German authorities pay the most attention to. Data controllers using AI can use the requirements addressed by the DSK to check how far their own AI use deviates from the authorities' views. It may make sense to document arguments that differ from the DSK's view in order to be better prepared in case of an audit by an authority. At the same time, it seems rather unlikely that the data protection authorities will examine the use of AI by companies on a large scale in the near future.

Business Lawyer, Counsel
Philipp Quiel, LL.M.
