Newsletter data protection

Dear readers,

On February 7, 2024, the State Commissioner for Data Protection and Freedom of Information Rhineland-Palatinate (LfDI) celebrated the 50th anniversary of the Rhineland-Palatinate State Data Protection Act, which came into force on January 25, 1974, with a ceremony in Mainz (press release dated 07.02.2024). After Hesse and Sweden, Rhineland-Palatinate was the third state in the world to have its own data protection law. Various provisions of the Rhineland-Palatinate State Data Protection Act can also be found in the General Data Protection Regulation (GDPR), such as the requirement of a legal basis, independent supervisory bodies and the safeguarding of data processing through technical and organizational measures. The original Data Protection Act only referred to the processing of data by public bodies. Regulations for non-public bodies, in particular companies, were only introduced with the Federal Data Protection Act (BDSG).

"Half a century of data protection in Rhineland-Palatinate: this is an occasion to pay tribute to a great pioneering achievement. The Rhineland-Palatinate Data Protection Act has proven to be consistently modern and relevant - perhaps more modern than could have been expected 50 years ago. Many of the principles laid down back then still apply today. In the face of rapid technological developments, we must, want to and can show that answers can be provided to the challenges of our time that are compatible with data protection and the fundamental right to informational self-determination. Open and solution-oriented: The fact that so many players from politics, society, administration and business are celebrating the anniversary of the State Data Protection Act with us shows me that we are on the right track," said LfDI Prof. Dr. Dieter Kugelmann.

For feedback on this newsletter or questions related to the newsletter topics, please email us at datenschutz@brandi.net. You can also find the other contact details on our homepage.

Dr. Sebastian Meyer and the BRANDI data protection team

Topic of the month: The record of processing activities - What is a procedure and how many procedures must be documented?

Under the General Data Protection Regulation (GDPR), companies are subject to the so-called accountability obligation pursuant to Article 5 (2) GDPR: they must be able to positively demonstrate that they comply with data protection law. This requires comprehensive documentation of all matters relevant to data protection, which includes the creation of a record of processing activities. According to Article 30 (1) (1) GDPR, every controller and, where applicable, their representative is obliged to keep a record of all processing activities for which they are responsible. All data processing operations in which personal data is processed must be listed in the record, insofar as they fall within the responsibility of the controller. The record has the task of summarizing and documenting the essential information on the individual processing operations; at the same time, the documentation serves the purpose of self-monitoring. The mandatory content of the record follows from Article 30 (1) (2) GDPR. This includes the name and contact details of the controller and its data protection officer, the purposes of the processing, a description of the categories of data subjects and of the personal data processed, the categories of recipients to whom the data is disclosed, any transfers of data to third countries and the envisaged duration of storage. The record must be maintained on an ongoing basis so that it always reflects the current status of data processing in the company.
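For illustration only, the mandatory fields listed above can be modelled as a simple data structure. The following is a minimal, hypothetical Python sketch; the field names and the completeness check are illustrative and do not represent an official schema:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ProcessingActivity:
    """Hypothetical sketch of one entry in a record of processing
    activities, modelling the mandatory fields of Article 30 (1) (2) GDPR."""
    name: str                           # internal name of the processing operation
    controller_contact: str             # name and contact details of the controller
    dpo_contact: Optional[str]          # data protection officer, if appointed
    purposes: List[str]                 # purposes of the processing
    data_subject_categories: List[str]  # e.g. "employees", "customers"
    data_categories: List[str]          # e.g. "contact data", "payroll data"
    recipient_categories: List[str]     # recipients to whom data is disclosed
    third_country_transfers: List[str] = field(default_factory=list)  # incl. safeguards
    erasure_period: str = ""            # envisaged time limit for erasure

    def is_complete(self) -> bool:
        """Rudimentary check that the core fields are filled in before
        the record is handed to a supervisory authority on request."""
        return bool(self.purposes and self.data_subject_categories
                    and self.data_categories and self.erasure_period)

# Example entry (all values are fictitious)
entry = ProcessingActivity(
    name="Newsletter dispatch",
    controller_contact="Example GmbH, Example Street 1",
    dpo_contact="dpo@example.com",
    purposes=["sending the newsletter"],
    data_subject_categories=["subscribers"],
    data_categories=["email address"],
    recipient_categories=["email service provider"],
    erasure_period="until unsubscription",
)
```

Keeping the record as structured data of this kind makes the required ongoing maintenance easier, since entries can be checked for completeness automatically.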

According to the concept of the GDPR, the creation and maintenance of the record is the responsibility of the data controller, i.e. the company. However, the data protection officer provides support and advice in creating and maintaining the record, particularly in the context of risk assessments and legal evaluation. The record must be made available to the supervisory authorities upon request. The processing operations must be documented in such a way that the supervisory authority can gain an initial impression of whether the controller is complying with its data protection obligations by requesting the record or individual process descriptions. With the help of the record of processing activities, the company can at the same time prove that the required examination of the data protection requirements has taken place for each data processing operation.

To the complete main topic

Digital Services Act fully applicable since February 17, 2024

The Digital Services Act (DSA) came into force on November 16, 2022 and has been fully applicable since February 17, 2024. The DSA is intended to ensure a safe and responsible online environment and to protect the fundamental rights of users on the internet more comprehensively. Above all, the new law is intended to make it easier to take action against illegal content online, such as hate speech and counterfeit products. The DSA applies to all digital services that provide consumers with goods, services or content. These include internet providers, hosting services, cloud services, online marketplaces, app stores and social media platforms. The DSA creates various new obligations for online companies, including the obligation to set up a central contact point for authorities and users, explanatory obligations in the general terms and conditions and the obligation to publish annual transparency reports. Hosting providers and online platforms are also obliged to set up a notice-and-action procedure for illegal content and are subject to reporting obligations in the event of suspicion of certain criminal offenses. In addition, online platforms in particular are subject to various further obligations, for example concerning protection against misuse, transparency, advertising and the protection of minors. Very large online platforms and search engines are also subject to special additional rules under the DSA because they entail particular risks.

In Germany, the Federal Network Agency is responsible for monitoring compliance with the provisions of the DSA and can take the necessary measures in the event of violations. In the event of a breach of the DSA, the competent authority can impose fines of up to 6% of annual global turnover. In addition, the law enforcement and market surveillance authorities as well as the state media authorities can also take action against illegal content.

(Christina Prowald)

OLG Nürnberg: Excessive assertion of the right of access not an abuse of rights

If an employee asserts their right of access under Article 15 GDPR against their former employer in an excessive manner and the employer incurs considerable expense as a result, this does not automatically amount to an abuse of rights in the opinion of the OLG Nürnberg (OLG Nürnberg, decision dated 29.11.2023 - Ref. 4 U 347/21).

The plaintiff was a former employee of the defendant and asserted a comprehensive claim for access against it. The defendant initially provided information about the data stored in the company's master system. The plaintiff then demanded that the defendant provide him with all data available to it, such as minutes of board meetings and e-mail correspondence, and not just the data stored in the master system. The defendant subsequently argued that the claim asserted by the plaintiff was excessive within the meaning of Article 12 (5) (2) GDPR due to the disproportionate effort involved in fulfilling it.

The court stated that the term "personal data" is to be understood broadly and that the right of access is unconditional. The ECJ had also recently ruled that the controller must make the data available even if the request in question is based on a purpose unrelated to data protection. Nor was the plaintiff's claim excluded under Article 12 (5) (2) GDPR: according to the wording and purpose of the provision, there is no abuse of rights where a data subject uses the right of access for purposes unrelated to data protection. This also applies if the assertion of the right causes considerable effort on the part of the controller or if the data subject asserts multiple access requests. The request in the present case was not excessive, as it was the first such request. Nor was its assertion an abuse of rights: the fact that the defendant holds a large amount of data owing to the duration and nature of the plaintiff's employment does not preclude the assertion of his rights.

(Christina Prowald)

OLG Köln on the data protection-compliant design of a cookie banner

On January 19, 2024, the OLG Köln ruled that the buttons for agreeing and for refusing in a cookie banner must have an equivalent design (OLG Köln, decision dated January 19, 2024 - Ref. 6 U 80/23).

The decision concerned the cookie banner on the WetterOnline website. This was designed in such a way that only the "Accept" and "Settings" buttons could be found on the first page of the banner. The option to reject cookies could only be accessed after clicking on the settings option on the second page of the cookie banner. In the top corner of the cookie banner there was also a button with the title "Accept & Close X".

The court ruled that the design of the cookie banner did not offer the data subject an equivalent alternative to consent on either the first or the second page. What would have been required was a simple option to refuse, based on clear and comprehensive information. Instead, the chosen design steered the data subject towards consent and deterred them from refusing. The consent obtained with the help of the cookie banner could therefore not be regarded as voluntary and informed within the meaning of Section 25 (1) TTDSG and Article 4 No. 11 GDPR. Given the chosen design and labeling of the buttons, it is also not clear to the average user which functions lie behind the respective buttons and how technically non-essential cookies can be rejected. There is no real choice.

The lettering in the top corner of the banner likewise violated the principles of transparency and the voluntary nature of consent and rendered the consent ineffective. Users know the X symbol as a way to close windows, not as a way to give consent; the average user is not aware that clicking the button constitutes consent. The combination of the X symbol with the accompanying label is misleading and non-transparent. In this respect, the consent is neither unambiguous and clearly affirmative nor voluntary within the meaning of Section 25 (1) TTDSG and Article 4 No. 11 GDPR.

(Christina Prowald)

LDI NRW grants authorization for first German certification body

The Bonn-based EuroPriSe Cert GmbH is the first body in Germany to be authorized by the State Commissioner for Data Protection and Freedom of Information of North Rhine-Westphalia (LDI NRW) to certify data processing procedures of processors (notification dated 02.02.2024). The "European Privacy Seal" (EuroPriSe) certificate issued by EuroPriSe Cert GmbH is intended to certify to processors in future that their data processing procedures meet the requirements of European data protection law.

At the end of 2022, the LDI NRW had already approved the certification criteria that form the basis of the certification body's activities. The LDI NRW and Deutsche Akkreditierungsstelle GmbH then jointly carried out the necessary accreditation procedure. Accreditation is granted for a maximum period of five years; subsequent re-accreditation is possible.

LDI NRW, Bettina Gayk, explained: "Certificates are an important instrument for ensuring a high level of data protection. Anyone who selects a certified processor is making a good choice and can be sure that their data protection compliance is checked and monitored in a transparent process."

(Christina Prowald)

BayLDA: Checklist for AI tools

In view of the increasing relevance of the topic of artificial intelligence (AI), the Bavarian State Office for Data Protection Supervision (BayLDA) has published a checklist for the use of AI tools with test criteria in accordance with the GDPR under the title "Data protection-compliant artificial intelligence". The checklist sets out requirements for the development and use of applications in the AI category. With this publication, the BayLDA aims to fulfill its awareness-raising mandate. The document is to be regularly adapted to developments at German and European level.

The checklist first explains how AI applications work and how the AI models on which the applications are based are trained; as a rule, a large amount of training data is required for this. A detailed checklist then sets out which data protection requirements must be complied with during training and which aspects controllers should examine. In particular, it addresses the documentation obligations under the GDPR, specifically the inclusion of the processing in the record of processing activities and the carrying out of a data protection impact assessment, as well as the principle of data minimization, the legal basis for the data processing in question, the fulfillment of information obligations and the implementation of data subjects' rights.

The fact that the use of AI tools is associated with various risks - including for the rights and freedoms of those affected by the data processing - is then addressed. It is pointed out that the risks must be mitigated with effective measures (Articles 25, 32, 35 GDPR). To this end, the first step is to identify and document the risks associated with the specific application. Subsequently, the respective risks must be addressed as part of the preparation of a data protection impact assessment. Finally, the use of AI applications is discussed. The points relevant to data protection law are again presented using a checklist. Various aspects already relevant to the training of AI models are taken up again and placed in the context of the use of AI tools.

(Christina Prowald)

BlnBDI: Publication of membership lists

The Berlin Commissioner for Data Protection and Freedom of Information (BlnBDI) has commented on the disclosure of an association's or party's membership lists to individual members and emphasized the importance of protecting special categories of personal data (notification dated 13.02.2024). From a data protection perspective, such disclosure is particularly sensitive, and a transfer is regularly prohibited, if the information about membership allows conclusions to be drawn about particularly sensitive personal data, such as a political stance, trade union membership or health.

However, in order to justify the transfer of data, the association can obtain the members' consent to the disclosure of their data. This would have to meet all conditions of validity and, in particular, be voluntary for the data subjects. Alternatively, the disclosure to a trustee could be considered. In this case, a legal obligation and technical qualifications could ensure that the data is processed in compliance with data protection regulations.

BlnBDI, Meike Kamp, says: "If people support associations with their membership that, for example, advocate the free development of sexual identity or stand up for women's rights with strong opinions, it can be concluded that they share the goals of the association and the associated political positions. Nevertheless, members may wish to keep their membership and contact details secret because they fear for their safety or because their personal situation does not allow them to publicly declare their support for the aims of the association. Members of politically active associations and parties therefore have a right to ensure that membership information is treated confidentially and is not disclosed carelessly - not even to other members."

(Christina Prowald)

Safer Internet Day 2024: TLfDI clears up data protection misconceptions

The Thuringian State Commissioner for Data Protection and Freedom of Information (TLfDI), Dr. Lutz Hasse, has used Safer Internet Day to clarify some data protection misconceptions (notification dated 06.02.2024). In his opening remarks, he makes it clear that data protection is not intended to prevent digitalization, but rather to make it legally compliant. The GDPR has largely harmonized data protection law within the EU, which has significantly simplified data traffic within Europe.

He goes on to explain that the provisions of the GDPR must be complied with not only when personal data is processed electronically; the scope of application also covers organized, structured paper files and written records. He also makes it clear that pseudonymization does not remove personal data from the scope of the GDPR, as pseudonymized data can be re-attributed to the original data with additional means. The situation is different with anonymization, where it is no longer possible to restore the original data. Furthermore, he clarifies that the consent of the data subject does not permit just any data processing: consent is always tied to a specific purpose, and the data may not be processed beyond the stated purpose on the basis of that consent.

With regard to the notification obligation pursuant to Article 33 (1) (1) GDPR, he explains that not only "major data breaches" must be reported to the supervisory authority. The obligation to notify is waived only if it can be ruled out that the personal data breach has led to a risk to rights and freedoms. As cases in which there is genuinely no risk are very rare, notifying the supervisory authority is generally unavoidable even for minor incidents. In this respect, he also makes it clear that the technical and organizational measures implemented by the controller must be reviewed regularly with regard to their appropriateness, which is always measured against the state of the art. A data protection incident in particular means that existing processes and measures must be reviewed again.

Finally, he also points out that the legislator cannot simply abolish data protection: account must be taken of data subjects' right to informational self-determination, derived from the general right of personality under Article 2 (1) in conjunction with Article 1 (1) GG, and of the right to the protection of personal data concerning them under Article 8 of the Charter of Fundamental Rights of the European Union.

(Christina Prowald)

Netherlands: Fine of 10 million euros imposed on Uber

On December 11, 2023, the Dutch Data Protection Authority (AP) imposed a fine of 10 million euros on Uber Technologies, Inc. and Uber B.V. for violating information obligations and the principle of transparency (notification dated 31.01.2024). The complaint concerned, first, the company's failure to disclose the full details of the retention periods for European drivers' data and, second, the lack of information on the third countries to which users' data is transferred. In particular, the privacy statements did not specify how long the data is stored and what measures are taken to safeguard transfers to third countries. The authority also found that Uber hindered drivers' efforts to exercise their data protection rights: the company made it unnecessarily complicated for drivers to submit access requests, the form to be used for such requests was hidden in the app and spread across various menus, and the responses drivers received were unclear and difficult to interpret.

The chairman of the AP, Aleid Wolfsen, said: "Drivers have the right to know how Uber handles their personal data. However, Uber has not explained this clearly enough. Uber should have informed its drivers better and more carefully in this regard. Transparency is an essential part of personal data protection. If they do not know how their personal data is being handled, they cannot determine whether they are being disadvantaged or treated unfairly. And they cannot stand up for their rights."

The fine was imposed after more than 170 drivers complained to the French human rights organization Ligue des droits de l'Homme et du citoyen, which in turn lodged a complaint with the French data protection supervisory authority. The latter forwarded the complaint to the competent Dutch authority. When setting the fine, particular account was taken of the size of the company and the severity and scope of the infringements. Uber has already lodged an appeal against the AP's decision. The AP has noted that Uber has already improved its processes in the meantime. 

(Christina Prowald)

France: Fine of 10 million euros imposed on Yahoo EMEA Ltd.

On December 29, 2023, the French supervisory authority (CNIL) imposed a fine of 10 million euros on Yahoo EMEA Limited (notification dated 18.01.2024). The company had neither implemented the choice of users who refused the setting of technically unnecessary cookies on its website nor allowed users of its e-mail service to freely withdraw their consent to the setting of cookies.

CNIL's decision was in response to complaints from 27 Yahoo users. Based on the complaints, the supervisory authority carried out several online inspections of the website "Yahoo.com" and the email service "Yahoo! Mail" and came to the conclusion that the company did not comply with its obligations under Article 82 of the French Data Protection Act (loi Informatique et Libertés). As part of its review, CNIL found that the cookie banner directed users to a page with numerous buttons for obtaining consent to the use of cookies, and that around 20 technically unnecessary cookies were stored on the user's device even if consent was not given. CNIL also criticized the fact that users could no longer use their email inbox if they withdrew their consent to the placement of technically unnecessary cookies. Linking a service to the setting of non-essential cookies is permissible in principle, but only if consent is given voluntarily; no disadvantages for users may be attached to their consent or its withdrawal. However, the company did not provide users with an alternative way to continue using the mailbox. CNIL therefore took the view that users could not freely withdraw their consent under the given circumstances.

When setting the fine, it was taken into account that the company did not respect the user's decision and took measures to prevent users from withdrawing their consent.

(Christina Prowald)

France: Fine of 100,000 euros imposed on PAP

On January 31, 2024, the CNIL imposed a further fine of 100,000 euros on the company PAP for non-compliance with data retention periods and deficiencies in data security (notification dated 13.02.2024).

The company PAP is the publisher of the website pap.fr, which allows private individuals to view and publish real estate advertisements. In the course of an investigation, the supervisory authority found violations concerning data retention periods, the provision of information, the contractual basis of PAP's cooperation with a processor and data security. Specifically, it criticized the fact that the company had set a retention period of ten years for certain customer accounts without this period being justified by the provisions of the Consumer Code to which the company referred. The company's privacy policy was also incomplete: it lacked explanations of the legal bases, the processors used, the retention periods and the right to lodge a complaint with the supervisory authority. CNIL further found that the data processing agreement between PAP and one of its service providers did not meet the requirements of Article 28 GDPR. Due to various security deficiencies, the supervisory authority also considered the data of data subjects to be exposed to an increased risk of attacks and data leaks.

(Christina Prowald)

On our own account: 5th BRANDI Data Protection Law Day 2024

We cordially invite you to our 5th BRANDI Data Protection Law Day on May 24, 2024. The face-to-face event will take place this year in Paderborn at the Heinz Nixdorf Museum Forum on the premises of the University of Applied Sciences. In addition, you will also have the opportunity to follow this year's Data Protection Law Day online.

We have once again been able to attract a renowned expert for the event. This year, we will be discussing with Dr. Thilo Weichert, a long-standing jury member who helps decide the winners of the annual Big Brother Awards. Dr. Weichert is one of Germany's best-known data protection experts and the former head of the data protection supervisory authority in Schleswig-Holstein (ULD).

You can look forward to an exciting keynote speech by Dr. Weichert as well as discussions on current and practice-relevant topics. This year will also see the return of our BRANDI Young Talents round, in which prospective lawyers will give short presentations on interesting data protection topics.

You can already register for the event using our registration form under the following link: Registration. We will inform you about further details and the content of the event as soon as possible.

(Christina Prowald)