Problems displaying the newsletter? Then click here for the web view

Newsletter data protection

Dear readers,

The Hessian Commissioner for Data Protection and Freedom of Information (HBDI) Prof. Dr. Alexander Roßnagel, who was a guest at our BRANDI-Data Protection Law Day 2023, will take over the chairmanship of the Conference of Independent Federal and State Data Protection Supervisory Authorities (DSK) from May 16 to December 31, 2024.

Roßnagel commented on assuming the chairmanship as follows: "I am delighted to take on this special task. The data protection supervisory authorities are facing major challenges. They must help to ensure that the digital transformation of the economy, administration and society takes place in a future-oriented and humane manner and offers more advantages than disadvantages for freedom and democracy. The future European AI Regulation assigns them important tasks, particularly with regard to the latest developments in the field of artificial intelligence. Their fulfillment requires intensive cooperation and coordination between the independent data protection supervisory authorities."

For feedback on this newsletter or questions related to the newsletter topics, please email us at datenschutz@brandi.net. You can also find the other contact details on our homepage.

Dr. Sebastian Meyer and the BRANDI data protection team

Topic of the month: BRANDI-Data Protection Law Day on the topic of "Security begins with Data Protection"

On May 24, 2024, Dr. Thilo Weichert, former head of the data protection supervisory authority in Schleswig-Holstein (ULD), and Prof. Dr. Eckhard Koch, Vice President for Research, Development and Transfer at FHDW Paderborn, were guests at BRANDI. As part of this year’s Data Protection Law Day on the topic of “Security begins with Data Protection”, our guests gave exciting insights into various data protection and IT security law topics, current procedures and their daily work in discussions with experts from BRANDI, including Dr. Sebastian Meyer, Dr. Christoph Rempe, Johanna Schmale, Dr. Carina Thull and Dr. Daniel Wittig.

To the complete main topic

EU Council adopts AI regulation

On May 21, 2024, the Council of the EU adopted the Regulation laying down harmonized rules on artificial intelligence (Artificial Intelligence Act) (communication of 21.05.2024). The AI Regulation is the first of its kind in the world and could set international standards.

The new regulation takes a risk-based approach: the higher the risk of harm to society, the stricter the rules. While only comparatively light transparency obligations apply to systems with limited risk, high-risk AI systems must meet certain requirements and obligations in order to gain access to the market. Certain applications, such as cognitive behavioral manipulation, social credit systems and predictive policing, are even prohibited outright. This is intended to promote the development and use of safe and trustworthy AI systems. The rules are also intended to ensure respect for the fundamental rights of EU citizens.

With the introduction of the AI Regulation, various new bodies will be set up at the same time. The Office for Artificial Intelligence is responsible for enforcing the regulations, the Scientific Panel of Independent Experts supports the enforcement measures, the Committee on Artificial Intelligence primarily takes on an advisory role and the Stakeholder Advisory Forum provides technical expertise.

Mathieu Michel, Belgian Secretary of State for Digitalization, commented on the new regulation: "The adoption of the AI law is a significant milestone for the European Union. This groundbreaking regulation - the first of its kind in the world - addresses a global technological challenge that will also create opportunities for our societies and economies. With the AI Act, Europe is demonstrating the importance of trust, transparency and accountability when dealing with new technologies. At the same time, it ensures that this rapidly changing technology can flourish and give a boost to European innovation."

After being signed by the President of the European Parliament and the President of the Council, the AI Regulation will be published in the Official Journal of the European Union and will enter into force 20 days after publication. With the exception of individual provisions, the rules of the Regulation must be complied with after a transitional period of two years following its entry into force.

(Christina Prowald)

ECJ rules again on the claim for damages

In two decisions of June 20, 2024, the ECJ again ruled on the claim for damages under Article 82 GDPR.

In the first case, documents were sent by a tax consultancy firm to an outdated address following a change of address (ECJ, decision dated 20.06.2024 - Ref. C-590/22). In the main proceedings, it was not possible to clarify which documents were in the envelope and whether the new residents were aware of the contents of the incorrectly addressed letter. As a result, a claim for damages in the amount of 15,000 euros was asserted against the tax consulting firm due to the disclosure of personal data to third parties.

The second case is based on two initial proceedings with largely identical facts, in which a trading app was attacked and personal data as well as data on the plaintiffs' securities accounts were captured by third parties whose identity is unknown (ECJ, decision dated 20.06.2024 - Ref. C-182/22 and C-189/22). According to the app provider, the data has not yet been used fraudulently.

Referring to its previous case law, the ECJ reiterated in both proceedings that Article 82 GDPR requires a breach of the GDPR, damage and a causal link between the breach and the damage. A mere infringement therefore does not in itself justify a claim for damages; on the other hand, the damage does not have to exceed a materiality threshold. The affected party must, however, prove the damage, and the court of the Member State is free to award even minor damages. The fear that personal data has been passed on to third parties, without this being capable of proof, is sufficient in principle, provided that the fear and its negative consequences can themselves be properly proven.

With regard to the amount of damages, the ECJ further stated that the criteria of Article 83 GDPR cannot be used for the assessment. Defining the criteria for determining the amount of compensation is a matter for the law of the individual Member State, subject to the principles of equivalence and effectiveness. The right to compensation does not have a punitive function, however, which is why the amount may not exceed full compensation for the damage. An additional breach of national provisions and the degree of seriousness or possible intentionality of the breach are not to be taken into account in the assessment.

(Christina Prowald)

OLG Hamm on claim for damages against X due to API bug

The Higher Regional Court of Hamm has now also ruled that a person affected by the API bug at X (formerly Twitter) is not entitled to claim damages from the company (OLG Hamm, decision dated 14.05.2024 - Ref. 7 U 14/24). The Regional Court of Stuttgart and the Regional Court of Freiburg had previously ruled in different ways on data protection claims in connection with an API bug at X (we reported on this in May 2024).

In the court's view, immaterial damage attributable to the alleged infringements had not been sufficiently demonstrated. A loss of control could not be assumed: the plaintiff merely suspected such an occurrence but did not prove it. Nor did he provide any evidence as to why he assumed he was affected by the API bug. It is undisputed that the defendant informed all users affected by the incident; the plaintiff, however, did not belong to this group and did not explain why he nevertheless believed he was affected. Finally, the plaintiff could not claim damages for a failure to provide information in accordance with the statutory requirements: irrespective of whether the information was provided properly, no damage could be identified in this respect.

(Christina Prowald)

LG Gießen: Transmission of positive data to SCHUFA is not a violation of data protection law

The Regional Court of Gießen has ruled that the transmission of positive data to SCHUFA, specifically the notification of the conclusion of a contract by a telecommunications company, does not justify a claim for damages under Article 82 GDPR (LG Gießen, decision dated 03.04.2024 - Ref. 9 O 523/23, GRUR-RS 2024, 7986).

Even the infringement of Article 6 (1) (1) (f) GDPR alleged by the plaintiff does not exist. It is fundamentally disputed whether the defendant's interests, specifically fraud prevention, prevention of over-indebtedness, more precise default risk forecasts and validation of the data, outweigh the plaintiff's right to informational self-determination. In the court's view, however, the better arguments favored the defendant, as no more suitable means of achieving its legitimate interests could be identified.

Furthermore, there was also a lack of damage. Although the concept of immaterial damage is to be understood broadly, a formulaic, non-individualized submission is not sufficient to establish this element. The alleged feeling of loss of control and great concern about his creditworthiness due to the positive notification, rather than because of various other negative entries, was so obviously false that it could only be an intentional false allegation: the plaintiff's credit rating was rightly poor because he had not serviced his debts, and the defendant's notification was not capable of worsening it further.

The decision thus contradicts various proceedings (e.g. LG Munich I, decision dated 25.04.2023 - Ref. 33 O 5976/22 and LG Frankfurt, decision dated 26.05.2023 - Ref. 2-24 O 156/21) that were conducted against companies transmitting positive data to SCHUFA and resulted in prohibitions, a practice that SCHUFA has largely discontinued anyway. In the present case, by contrast, a data subject asserted claims for damages, which remained unsuccessful.

(Christina Prowald)

LG Kiel: Cyber insurance does not have to pay out for false statements

If a customer makes false statements when taking out cyber insurance, the insurance company does not have to pay in the event of a claim and can contest the contract due to fraudulent misrepresentation, according to the Regional Court of Kiel (LG Kiel, decision dated 23.05.2024 - Ref. 5 O 128/21).

In the case in question, when queried about the risk circumstances upon taking out the cyber insurance, the company had stated, among other things, that all available security updates were installed, that only software products for which the manufacturer provided security updates were used, and that up-to-date malware detection software was in use on all work computers. In fact, the company was operating a web SQL server running the Windows 2008 operating system in its web store, for which no software or security updates had been made available since January 2020. The server also had no anti-virus software, and the company used outdated systems elsewhere as well.

After it was discovered that an external attacker had gained access to the company's systems and data had been leaked, a damage report was submitted. The cause of the data outflow was malware introduced by the attack via the Windows 2008 computer. When it emerged during the claims settlement process that the company had answered some of the risk questions incorrectly, the insurance company declared that it was rescinding the contract and refused to pay benefits. The Regional Court of Kiel ultimately found that the insurance contract was void as a result of rescission on the grounds of fraudulent misrepresentation of risks relevant to the contract through the incorrect answers to the risk questions. Consequently, the insurance company was not obliged to provide the agreed insurance benefits.

(Christina Prowald)

LAG Niedersachsen: No right of a works council member to receive pay slips from his colleagues

On April 26, 2024, the Lower Saxony Higher Labor Court ruled that an exempted works council member is not entitled to the submission of his colleagues' pay slips in preparation for a claim pursuant to Section 37 (2) BetrVG (LAG Niedersachsen, decision dated 26.04.2024 - Ref. 14 Sa 736/23; BeckRS 2024, 12675).

In the course of a dispute regarding the legally compliant remuneration of a works council member who had been released from his duties for many years, the member (defendant) asserted a claim for information against his employer (plaintiff). The defendant was of the opinion that the plaintiff should also provide him with information about the remuneration of colleagues in his reference group and its composition so that he could verify the plaintiff's calculation of his own remuneration.

The court found that the defendant was not entitled to information regarding his colleagues' pay slips. The request was to be understood as demanding not only itemized information about the remuneration but also the submission of the pay slips as proof. In the opinion of the court, such disclosure is not lawful and therefore cannot be demanded. The disclosure constitutes data processing and must therefore rest on a legal basis. Article 6 (1) (1) (f) GDPR cannot serve as such a basis, as the requested documents would contain very sensitive data of the colleagues, which carries a high interest in protection, while there is no recognizable legitimate interest on the part of the defendant in obtaining this data.

(Christina Prowald)

AG Gelnhausen: Video surveillance using a swivel camera is not permitted

On March 4, 2024, the Gelnhausen District Court ruled that the installation of a surveillance camera is already inadmissible if it can be electronically panned onto the neighboring property (AG Gelnhausen, decision dated 04.03.2024 - Ref. 52 C 76/24). For a claim for injunctive relief, it is necessary, but also sufficient, that so-called surveillance pressure exists. The decisive factor is that third parties must seriously fear surveillance. For this, it is sufficient that the neighborly relationship is tense and that the camera can be directed towards the neighboring property by means of an electronic control mechanism; whether the function is actually used is irrelevant. Protecting one's own property by means of video surveillance is certainly a legitimate interest. In the present case, however, it is accompanied by a disproportionate impairment of the neighbors.

(Christina Prowald)

EDPB: Report on the work of the "ChatGPT" task force

The "ChatGPT" task force published a report on its work on May 23, 2024. As the work of the task force has not yet been fully completed, the contents of the report are, according to the task force, only preliminary results. The background to the investigation is the increasing spread of large language models (LLM). In view of the fact that the data processing procedures associated with such models must comply with data protection requirements, various supervisory authorities have already initiated investigation proceedings against OpenAI, the operator of ChatGPT, among others. As OpenAI did not have an establishment in the EU until February 15, 2024, the EDPB decided to set up a task force in April 2023 to facilitate cooperation and the exchange of information on possible measures in the context of ChatGPT between the supervisory authorities.

According to the task force, a distinction must be made between different phases when assessing the legality of the data processing operations in question: the collection of training data, pre-processing of the data, training, ChatGPT input (prompts entered in ChatGPT) and ChatGPT output, as well as training of ChatGPT with prompts. The first three phases are associated with particular risks to the fundamental rights and freedoms of natural persons in view of the "web scraping" carried out in this context, i.e. the collection and extraction of information from publicly accessible sources. The information collected could contain not only personal data but also particularly sensitive data, for whose processing a special legal basis is required. To secure these operations, it is also necessary to take appropriate technical measures and to provide processes for deleting and anonymizing data. A final assessment of their legality is still pending. With regard to the further phases, it is important to inform users transparently about which of the data they enter is used for training purposes. Furthermore, a company may not shift its own risks onto data subjects; in particular, responsibility for compliance with data protection regulations may not be transferred to them. A clause in the general terms and conditions stating that data subjects are responsible for their chat entries is therefore inadmissible. With regard to the results provided by ChatGPT, the principle of data accuracy must also be observed, and users must be informed about the reliability of the results provided.

Over the course of several meetings, the task force also developed a questionnaire that could serve as a basis for exchange with OpenAI. The document was developed with the aim of facilitating standardized investigations. Among other things, the questionnaire contains questions on legal bases and the principles of data processing, risk management and data protection impact assessments, information obligations and data subject rights as well as data transfers to other companies and third countries.

(Christina Prowald)

SDTB checks 30,000 websites

In May 2024, the Saxon Data Protection and Transparency Commissioner (SDTB) examined around 30,000 Saxon websites for data protection violations (notification of 13.06.2024). In particular, the SDTB looked at the use of the Google Analytics service. If tracking tools are to be used on websites, effective user consent must be obtained in advance. In its review, the SDTB found that in 2,300 cases website operators did not comply with the applicable requirements to the necessary extent. The affected companies, associations and public bodies have now been requested by the authority to remedy the data protection breach and delete all unlawfully collected data. If the responsible bodies do not comply with the request, they face formal administrative proceedings following a further review.

SDTB Dr. Juliane Hundert commented: "Tracking services such as Google Analytics provide in-depth insights into the behavior and privacy of website visitors. Under data protection law, the interests of the operators therefore take second place. This means that if data controllers want to use Google Analytics, they are obliged to obtain consent from users."

(Christina Prowald)

HmbBfDI: Position paper on applicant data protection and recruiting

In view of the increasing relevance of digitalization and artificial intelligence in the application process, the Hamburg Commissioner for Data Protection and Freedom of Information (HmbBfDI) published a position paper on applicant data protection and recruiting on June 6, 2024.

First of all, the HmbBfDI emphasizes that application documents contain a large amount of sensitive data, which is why data protection-compliant handling is of the utmost importance. The data of applicants must be treated with the same care and discretion as that of employees. After an introduction to the principles of the GDPR, a discussion of the relevant terminology and an overview of the different phases of the recruiting process, the paper addresses some specific questions.

Inclusion in a talent pool is only possible, for example, if the applicant has given their express consent. It is particularly important that the consent has been given transparently and voluntarily in compliance with the information obligations. This requires comprehensive information about the data processing procedures that take place in connection with data storage in the talent pool. It is also advisable to obtain consent for a specific period of time and to ask for consent again if necessary.

If information about an applicant is to be obtained via internet research or the search function in social networks, this is possible in principle. However, information that goes beyond the employer's right to ask questions may have to be disregarded. Likewise, information from private networks that has been disclosed for private purposes may not be used for background checks. Applicants must also be informed about the collection of information in accordance with Article 14 GDPR.

The HmbBfDI then addresses the use of AI tools as part of the application process and the resulting data protection issues. The use of tools for reading application documents and for the structured transfer of data to an application management system is generally permissible as long as the principle of data accuracy is observed. However, if additional data analyses are to be carried out after parsing, the requirements of Article 22 GDPR for automated individual case decisions must be met. This means, among other things, that decisions with legal effect may only be made by humans. In this respect, merely formal human involvement is not sufficient. Rather, the decision-maker must have a genuine scope for decision-making. Emotion analyses, in contrast, are generally inadmissible. The main problem in this respect is that the analyses are generally not necessary and the aspect of voluntariness is doubtful.

In conclusion, the data protection officer notes that the integration of data protection-compliant automated analyses can have a positive impact on the entire recruiting process by enabling personalized, fair and non-discriminatory application procedures. In this respect, however, companies are required to comply with data protection regulations and implement transparent procedures. The challenge is to find a balance between technological progress and data protection for applicants.

(Christina Prowald)

LfDI BW: Checklist for the use of TikTok

The State Commissioner for Data Protection and Freedom of Information of Baden-Württemberg (LfDI) has commented on the use of TikTok by public bodies (communication of 28.05.2024).

He first points out the general problem that, when using social media platforms, there are often only limited possibilities to influence the data processing operations and the terms of use. Against this background, it is doubtful whether the use of TikTok can meet the requirements of data protection law at all. Specifically, it is questionable whether the data processing carried out in the course of using TikTok can comply with the principles of Article 5 GDPR (processing in good faith, data minimization, and integrity and confidentiality) and Article 25 GDPR (data protection by design), whether it rests on a valid legal basis, and whether the requirements of Article 8 GDPR for processing minors' data and the requirements for transparent user information under Articles 13 and 14 GDPR can be complied with.

In addition, a checklist for data protection-compliant use by public bodies is provided. In this respect, the type of TikTok account, the configurations made to protect personal data, the scope of the tools used, the allocation of roles under data protection law, the legal basis, the security measures taken and alternative communication options are particularly important. In addition, the use of TikTok must be included in the record of processing activities.

(Christina Prowald)

Lower Saxony: Data protection supervisory authority publishes annual report 2023

On June 6, 2024, the State Commissioner for Data Protection of Lower Saxony (LfD), Denis Lehmkemper, presented his authority's 29th activity report for the year 2023.

Among other things, the report shows that the number of reports has increased compared to 2022. This applies both to the number of complaints (+7%) and to the number of reported data breaches (+13%); since the introduction of the GDPR, the number of reported data breaches has risen continuously. In the reporting period, 51 fines totaling around 5.3 million euros were also imposed. Many of these fines related to video surveillance, with violations ranging from unauthorized dashcam recordings to video surveillance in the workplace to the surveillance of an event area.

The report also addresses the topic of "artificial intelligence", which is becoming increasingly relevant, and looks at the draft European AI regulation. Among other things, the authority provides an overview of data protection challenges in the training and use of AI systems. A newly appointed committee of experts will also provide new impetus for the use of AI and develop framework conditions for data protection-compliant use.

Overall, Lehmkemper draws a positive conclusion: "Most companies and public bodies in Lower Saxony take data protection seriously and have adapted their processes to the requirements of the General Data Protection Regulation." Nevertheless, there is still a lot to do in view of digitalization and new challenges such as artificial intelligence.

(Christina Prowald)

Italy: Fine for continued use of a business e-mail address

On January 24, 2024, the Italian supervisory authority (GPDP) imposed a fine of 15,000 euros on MP1 srl because the company had kept an employee's email address active for months after the employee left the company and continued to use it (notification of 24.01.2024). The decision was made following a complaint by the former employee.

In response to the complaint, the company explained that it had forwarded the incoming emails to another mailbox in order to be able to process incoming inquiries, and that senders had been informed that the employee had left the company. Following a further complaint, the company deactivated the email address. Furthermore, the company did not respond to the formal requests of the person concerned, whereupon he turned to the supervisory authority.

In its investigation, the supervisory authority then found that the company had violated data protection regulations. The company's failure to respond to the data subject's requests violated Articles 5 (1) (c), 12 and 17 GDPR; the company should have responded to the deletion request within one month at the latest. The GPDP also pointed out that even if deletion did not have to be carried out because of one of the exceptions, a response to the request would still have been necessary: the data subject should have been informed of the reasons why the request was not granted and of the data subject rights available to them. The supervisory authority further stated that, although temporarily forwarding incoming emails is possible, the processing in question was not proportionate, particularly in view of the duration of the redirection measure. Taking into account the principle of data minimization, it is necessary to deactivate the mailbox after the employee leaves, to inform third parties of this fact and to provide them with alternative contact addresses; this also serves to protect the senders. As a result, the supervisory authority imposed a fine of 15,000 euros for the violations.

(Christina Prowald)

Italy: Fine for advertising despite objection

On February 22, 2024, the Italian supervisory authority (GPDP) imposed a further fine of 90,000 euros on Coop Italia Società Cooperativa because the company sent advertising communications despite the objection of the person concerned (notification of 22.02.2024).

The proceedings were based on the complaint of a data subject who informed the supervisory authority that he had contacted the company to object to the processing of his data for advertising purposes and to assert his right of access. The company then merely informed him that it had registered his objection to advertising. The person concerned then asked again for information and also for his data to be deleted. Nevertheless, the company continued to send him advertising messages by SMS. During the investigation by the GPDP, the company stated that it had not complied with the data subject's requests due to an internal misunderstanding.

The supervisory authority found that the company used various customer data for advertising purposes, among other things, and passed it on to third parties without the users' effective consent.

The supervisory authority stated that although the breach was not systematic in nature, it was nevertheless of particular importance and constituted a breach of Article 5 (1) (e), 12 (3), 15 and 21 (2) GDPR. The supervisory authority subsequently imposed a fine of 90,000 euros on the company and also ordered it to amend its consent text.

(Christina Prowald)