Newsletter data protection 10/2025

Regulation (EU) 2023/2854 on harmonised rules on fair access to and use of data (Data Act) has applied since September 12, 2025. The Data Act is intended to enable greater and better use of data: to promote innovation, prevent distortions of competition and strengthen data sovereignty. Data generated by connected products and related services should no longer remain solely with manufacturers and providers, but should also be accessible to users, third-party providers and public authorities, giving users more control over their data. Companies are therefore faced not only with the task of protecting personal data, but now also with that of sharing non-personal data.

Last month also saw several interesting rulings by the Court of Justice of the European Union, which are presented in this newsletter among other topics. These include a ruling on the concept of "personal data" in the context of pseudonymised transfers, a rejected claim for injunctive relief, and a judgment upholding the European Commission's adequacy decision of July 10, 2023 for data transfers to the United States.

If you have any feedback on this newsletter or questions relating to the topics covered, please send us an email at datenschutz@brandi.net. You can also find further contact details on our homepage.

Dr Sebastian Meyer and the data protection team at BRANDI

 

Topic of the month: Employee excess

Companies rely on comprehensive compliance measures such as training programmes and the development of data protection concepts to ensure that personal data is processed in accordance with data protection regulations, especially by their employees. Nevertheless, it can happen that employees access personal data without any work-related reason. For example, it is conceivable that employees might look up a neighbour's purchasing behaviour in internal systems or view a family member's account information out of pure curiosity. Such actions regularly violate the principle of purpose limitation under Art. 5 (1) (b) GDPR. If employees process the data for their own purposes, the question arises as to when the unauthorised action is attributable to the company and who, if anyone, is liable for the data protection breach.

To the complete main topic

 

ECJ: Pseudonymisation alone is not always sufficient

In its ruling of September 4, 2025, the ECJ clarified the concept of personal data with regard to the transfer of pseudonymised data to third parties (ECJ, decision dated 04.09.2025 - Ref. C-413/23 P).

In the original case, the Single Resolution Board (SRB) – a body of the European Banking Union responsible for resolving financial institutions threatened with insolvency – initiated resolution proceedings against Banco Popular Español SA. Shareholders and creditors affected by the resolution were able to submit statements ahead of the final decision on whether they should be compensated. These statements were transmitted in pseudonymised form to Deloitte GmbH, an auditing and consulting firm. The parties concerned were not informed about the data transfer.

First, the ECJ clarified that a statement containing a personal opinion is, as an expression of its author's thoughts, inevitably closely linked to that person and therefore always constitutes personal data. The content and purpose of the statement do not need to be examined for it to qualify as personal data. If the statements are transferred as pseudonymised data, the personal reference is not necessarily removed. Pseudonymisation is a technical and organisational measure aimed at preventing data subjects from being identified without additional information. The controller regularly retains additional information that enables the statements to be assigned to a specific person. Conversely, non-personal data can become personal data if the controller transmits it to a person who has the means or information to enable identification. Consequently, whether pseudonymised data qualifies as personal data in the hands of a recipient depends on whether that recipient is actually able to reverse the pseudonymisation and establish a personal reference.
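The mechanism the court describes – a controller that keeps the "additional information" needed for re-identification while the recipient holds only tokens – can be sketched in a few lines of Python. This is a purely illustrative lookup-table approach; all names and data are invented and nothing here reflects the actual systems used in the case.

```python
import secrets

def pseudonymise(records, key_table):
    """Replace each name with a random token; the mapping stays in key_table."""
    output = []
    for record in records:
        token = secrets.token_hex(8)
        key_table[token] = record["name"]  # "additional information" kept by the controller
        output.append({"id": token, "statement": record["statement"]})
    return output

def reidentify(pseudonymised, key_table):
    """Only a party holding key_table can restore the personal reference."""
    return [
        {"name": key_table[r["id"]], "statement": r["statement"]}
        for r in pseudonymised
    ]

key_table = {}  # retained by the controller, never shared with the recipient
records = [{"name": "A. Example", "statement": "I object to the valuation."}]
shared = pseudonymise(records, key_table)  # this is all the recipient sees
```

The recipient holds only random tokens; without `key_table` the statements cannot be traced back. Whether the data is still "personal" for the recipient then turns on the legal question the ECJ addressed: can the recipient realistically obtain the key?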

Finally, the ECJ stated that for the purposes of applying the information obligations regarding potential recipients of personal data under Article 13 (1) (e) GDPR, the identifiability of the data subject must be assessed from the perspective of the controller and at the time of data collection. The decisive factor is therefore whether the statements were to be classified as personal data from the SRB's point of view at the time of collection and thus before pseudonymisation. In the opinion of the ECJ, the SRB should have informed the data subjects before transferring the data to Deloitte GmbH.

(Mira Husemann)

 

ECJ: No right to injunctive relief under the GDPR

On September 4, 2025, the ECJ ruled that the GDPR does not give rise to a right to injunctive relief in cases where the person affected by unlawful data processing does not request the deletion of their data (ECJ, decision dated 04.09.2025 - Ref. C-655/23). The decision also concerned certain aspects of non-material damages.

The plaintiff had applied to Quirin Privatbank via an online career network. During the application process, a bank employee sent a message via the messaging service of the career platform Xing – actually intended for the plaintiff, informing him that his salary expectations had been rejected and offering him a different remuneration – to an uninvolved third party who knew the plaintiff professionally. The plaintiff was therefore concerned that at least one third party from the same industry, known to him personally, was now in a position to pass on confidential data to former and potential employers and thereby gain an advantage in competitive situations. In addition, others could learn about what he perceived as a humiliating defeat in the salary negotiations. He therefore demanded damages and an injunction against any further unauthorised disclosure. The Federal Court of Justice (BGH), hearing the appeal, referred several questions to the ECJ, in particular with regard to a possible claim for injunctive relief under EU law.

The ECJ found that the GDPR contains no provision that expressly or implicitly allows the data subject to preventively demand that the controller refrain from future violations of the GDPR. Nor does such a right arise from the rights of data subjects under Articles 17 and 18 GDPR (right to erasure and right to restriction of processing) if the data subject does not request the erasure of their data. However, it follows in particular from the right to an effective judicial remedy under Article 79 (1) GDPR that Member States are not prevented from granting such preventive injunctive relief under national law. Even in the absence of an opening clause in the relevant chapter of the GDPR, the regulation does not seek comprehensive harmonisation of legal remedies for GDPR violations, and a national injunctive remedy could also promote the objectives of the GDPR.

In addition, the ECJ confirmed its case law according to which non-material damage within the meaning of Art. 82 (1) GDPR can already consist in the mere loss of control over the data, or in the fear of future misuse, provided that either is duly proven. Even if worries or annoyance, as the BGH points out, may be part of the general risk of life, they can constitute non-material damage if they are causally attributable to the GDPR violation in question. This is in line with the wording and the recitals of the provision.

(Gesche Kracht)

 

EU General Court dismisses action against the adequacy decision for the USA

In its judgment of September 3, 2025, the General Court of the European Union dismissed an action for annulment against the European Commission's adequacy decision of July 10, 2023 for data transfers to the US, thereby confirming that the US provided an adequate level of data protection at the time the contested decision was adopted (press release of 03.09.2025).

Under the General Data Protection Regulation (GDPR), the transfer of personal data to third countries – i.e. countries outside the European Union or the European Economic Area – is only permitted if an equivalent level of data protection can be ensured in the third country. To this end, the GDPR defines various protection mechanisms, including the European Commission's so-called adequacy decision. Such an adequacy decision has been in place for the United States since July 10, 2023. Two previous adequacy decisions were declared invalid by the ECJ in its judgments of October 6, 2015 and July 16, 2020; the ECJ attributed the lack of an equivalent level of protection to the far-reaching powers of the US intelligence agencies (ECJ, decision dated 06.10.2015 - Ref. C-362/14; decision dated 16.07.2020 - Ref. C-311/18). The renewed adoption of an adequacy decision was prompted by an executive order of October 7, 2022, which limited the surveillance measures of the US intelligence agencies and provided for the US Data Protection Review Court (DPRC) as a redress mechanism.

The action for annulment challenged the impartiality and independence of the DPRC and the approach taken by the US intelligence services to the collection of personal data. In its judgment, the General Court first found that the independence of the DPRC's members was ensured by several guarantees and conditions relating to the functioning of the DPRC and the appointment of its judges. With regard to the practices of the US intelligence services, the court held that the collection of personal data is subject to ex post review by the DPRC and that this is sufficient in light of ECJ case law.

The ruling relates to the legal situation at the time the adequacy decision was adopted. The European Commission must continuously monitor developments in the US legal situation and, if necessary, suspend, amend or revoke the decision.

(Mira Husemann)

 

Federal Court of Justice: No damages for hypothetical risk of misuse of data

In a ruling dated May 13, 2025, the Federal Court of Justice (BGH) ruled that a purely hypothetical risk of misuse of personal data by an unauthorised third party cannot give rise to a claim for damages under Article 82 GDPR (Federal Court of Justice, decision dated 13.05.2025 - Ref. VI ZR 186/22).

The plaintiff had objected to the defendant city transmitting his personal data in unencrypted form. Nevertheless, the city sent several letters by unencrypted fax, which included the plaintiff's surname, among other things.

A claim for damages under Art. 82 GDPR was rejected by the Federal Court of Justice (BGH) due to a lack of evidence of non-material damage. The mere violation of the GDPR is not sufficient to justify a claim for damages under Art. 82 GDPR; rather, material or non-material damage and a causal link between the violation and the damage are also required. The claimant must prove that he has suffered damage as a result of the violation of the GDPR. In this context, the fear of a data subject triggered by a violation that his personal data could be misused by third parties could in itself constitute non-material damage; however, a mere assertion – without proven negative consequences – is not sufficient. Without proof from the claimant, there is no loss of control giving rise to liability, but rather a purely hypothetical risk. Even the fact that the claimant is exposed to danger to life and limb due to his professional activity does not necessarily indicate a concrete threat to his person.

(Mira Husemann)

 

Munich Higher Regional Court: Email hosting provider does not have to disclose user data

In a ruling dated August 26, 2025, the Munich Higher Regional Court decided that an email hosting provider does not have to disclose user inventory data. In the court's view, an email hosting service does not constitute a digital service within the meaning of Section 21 of the Telecommunications Digital Services Data Protection Act (TDDDG), but rather an interpersonal communications service that falls exclusively within the regulatory scope of the Telecommunications Act (TKG), under which no such (private-law) right to information exists (Munich Higher Regional Court, decision dated 26.08.2025 – 18 W 677/25 Pre e).

The legal dispute arose from two extremely negative and unlawful reviews that had been published by their author on a review platform and against which the plaintiff – an automotive company – sought to defend itself. Although the two reviews had already been removed by the operator of the review platform, the automotive company also intended to take legal action against the author. In the legal proceedings against the operator of the review platform, the plaintiff initially succeeded in obtaining information about the author's email address. In subsequent proceedings against the email hosting provider, it then attempted to obtain the author's name, address and date of birth. After the plaintiff prevailed with this claim before the Munich I Regional Court, the email hosting provider successfully appealed against this first-instance decision before the Munich Higher Regional Court.

In the court's view, Section 21 (2) TDDDG does allow providers of digital services to disclose existing inventory data in individual cases. However, the defendant email hosting provider is not a provider of digital services in this sense. Rather, the court held, such email hosting services are electronic communications services, for which the Telecommunications Act (TKG) is the relevant regulatory framework. The email hosting provider could therefore at most be obliged to provide information under Section 174 TKG, a provision that neither private individuals nor legal entities – such as the plaintiff – can invoke. Contrary to the plaintiff's assertion, Section 21 TDDDG is also not to be understood as meaning that the requested "chain information" can be demanded all the way back to the last provider in the chain that stores the name and address of the person to be held liable for the infringement. This does result in a gap in protection for the plaintiff; closing it, however, is a task for the legislature. In particular, the email hosting provider cannot be required to comply with the rules governing digital service providers in addition to the telecommunications data protection rules to which it is subject as an operator of an interpersonal communications service under the TKG, since the TDDDG system stipulates that different requirements apply to the two areas of telecommunications and digital services.

(Habib Majuno)

 

Lübeck Regional Court refers questions to the ECJ regarding the transfer of data to SCHUFA

The Regional Court of Lübeck dealt with a case involving the transfer of data to SCHUFA and stayed the proceedings to refer questions to the ECJ for a preliminary ruling (Regional Court of Lübeck, decision dated 04.09.2025 – Ref. 15 O 12/24).

After concluding a mobile phone contract, a telecommunications company transferred personal data of the contractual partner to SCHUFA Holding AG, the operator of a credit rating system, without the contractual partner's consent. The information in question did not relate to negative payment experiences or other contractual behaviour, but rather to the commissioning, execution and termination of a contract (so-called positive data).

The Regional Court doubts whether the transfer of positive data can be based on the legal basis of legitimate interest under Art. 6 (1) (f) GDPR: in its view, the concept of legitimate interest is not sufficiently defined with regard to mass transfers of positive data. In addition, the Regional Court considers the use of positive data to create a personality profile in the form of a score to be problematic. Because of the largely opaque linking of data points, the score is likely to create a feeling of continuous monitoring that the data subject has no way of influencing. Finally, it is unclear to the Regional Court whether the contested data transfer and storage constitutes a loss of control and thus damage under Art. 82 GDPR.

(Mira Husemann)

 

State data protection commissioners criticise weakening of fundamental rights protection in AI control

The Independent Data Protection Supervisory Authorities of the federal states have criticised a current legislative proposal by the Federal Ministry of Digital and Public Service (press release dated 04.09.2025). The proposal aims to transfer market surveillance for certain types of artificial intelligence relevant to fundamental rights to the Federal Network Agency, even though this task is already assigned to the Data Protection Supervisory Authorities under the AI Regulation. According to the draft, this is intended to remove barriers to innovation. This particularly affects high-risk AI systems used for law enforcement, border management, justice and democracy.

Meike Kamp, Berlin's Commissioner for Data Protection and Freedom of Information, commented as follows: "The Data Protection Supervisory Authorities, which protect and strengthen the fundamental rights of citizens, are thus to play no role in supervision in the future – and this in a highly sensitive area. This reveals a strange understanding of the importance of fundamental rights and of the tasks of data protection supervision." She went on to note: "The protection of fundamental rights is not a flaw, but a democratic necessity. Especially in view of the far-reaching impact that the use of artificial intelligence can have on society, the fundamental rights of citizens must be respected."

The State Data Protection Authorities also made it clear that the distribution of competences between the Federal Network Agency and the Data Protection Supervisory Authorities provided for in the AI Regulation was sensible and that the federal government should follow this path.

(Christina Prowald)

 

Microsoft provides new data protection documents for MS365

The technology group Microsoft has made new data protection documents available for the use of the cloud-based Microsoft 365 platform in its Service Trust Portal.

The data protection-compliant use of Microsoft 365 is viewed critically above all because of data transfers to third countries and a lack of transparency regarding data processing. To increase legal certainty when using Microsoft 365 and to help small and medium-sized enterprises use the platform in a manner that complies with data protection regulations, Microsoft has published new data protection documents in collaboration with the Data Protection Supervisory Authorities in Bavaria and Hesse. These include a cover sheet for an M365 toolkit, sample entries for a record of processing activities, threshold analyses, legal bases for typical Microsoft 365 usage scenarios, a sample privacy policy and explanations on data anonymisation. None of this eliminates the need for an individual review and risk assessment when using Microsoft 365.

(Mira Husemann)

 

The EU Data Act

The Data Act, which promotes access to, exchange of and use of data, has applied since September 12, 2025. It affects manufacturers, data holders and users of connected products and related services. It regulates rights of access to and provision of data, as well as the right to transfer data to third parties. Furthermore, the Data Act defines new types of contracts that must be concluded, including the data use contract for the processing of non-personal data (Art. 4 (13) Data Act), the provision contract between data holders and data recipients (Art. 8 et seq. Data Act) and the contract to facilitate switching between data processing services (Art. 23 et seq. Data Act). As a first step, companies should check whether they fall within the scope of application; exceptions apply in particular to small businesses. If they do, companies may have to identify trade secrets and implement protection mechanisms for them, set up data access (via interfaces), and prepare contracts and information texts. In Germany, the accompanying national legislation has not yet been adopted; only a draft bill is available. Concrete rules on official responsibilities and sanctions therefore remain to be seen. The European Commission has already published guidelines to support implementation of the Data Act.
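The obligation to set up data access could, purely as an illustration, look like the following Python sketch: a data holder releases connected-product data only to the product's user or to a third party the user has authorised. All class and field names here are hypothetical; the Data Act does not prescribe any particular technical design.

```python
from dataclasses import dataclass, field

@dataclass
class DeviceDataStore:
    """Hypothetical data-holder store for data from one connected product."""
    owner: str                                   # the product's user
    readings: list = field(default_factory=list)
    authorised_third_parties: set = field(default_factory=set)

    def grant_access(self, third_party: str) -> None:
        # The user authorises a third party to receive the data
        # (the Art. 5 Data Act scenario).
        self.authorised_third_parties.add(third_party)

    def get_data(self, requester: str) -> list:
        # Release data only to the user or an authorised third party.
        if requester == self.owner or requester in self.authorised_third_parties:
            return list(self.readings)
        raise PermissionError(f"{requester} is not entitled to this data")

store = DeviceDataStore(owner="user-1",
                        readings=[{"ts": "2025-09-12", "temp": 21.5}])
store.grant_access("repair-service")   # e.g. an independent repair provider
data = store.get_data("repair-service")
```

In practice such access would sit behind an authenticated interface, and trade secrets would be filtered out before release; the sketch only shows the core entitlement check.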

(Mira Husemann)

 

France: Fine imposed on Google for cookies and inbox advertising

The French Data Protection Authority (CNIL) has imposed a fine of 325 million euros on Google for displaying advertisements between emails in the inboxes of Gmail users without their consent and for setting cookies, likewise without consent, when its French users created new Google accounts (press release dated 03.09.2025).

The CNIL's investigation into Google's email service and the registration process for creating a Google user account was prompted by a complaint filed by the Austrian organisation "NOYB - European Centre for Digital Rights" on August 24, 2022. The CNIL concluded that Google was in breach of French law by displaying advertising in the form of emails in Gmail users' mailboxes. Under Article L.34-5 of the French Postal and Electronic Communications Code (CPCE), this requires the consent of users, which was lacking here. Furthermore, when a Google account was created, Google set cookies that could be used to display personalised advertising, without the effective consent of users. For consent to be valid, users would have had to be clearly informed that they could not use Google services without allowing Google to set cookies for advertising purposes. French users did not receive this information, so the consent obtained in this context was invalid. Google thereby also violated Article 82 of the French Data Protection Act. The CNIL imposed a fine totalling 325 million euros and ordered Google to remedy the violations within six months, subject to a penalty payment of 100,000 euros per day of infringement.

(Habib Majuno)

 

France: Fine for using cookies without user consent

On September 1, 2025, the French Data Protection Authority CNIL imposed a fine of 150 million euros on Infinite Styles Services Co. Limited, the Irish subsidiary of the Shein Group, for violating the requirements that must be observed when using cookies on websites (announcement dated 03.09.2025). The CNIL reviewed the website "shein.com" in August 2023 and found that the requirements of Article 82 of the French Data Protection Act had not been complied with. In particular, the necessary user consent for the use of cookies had not been obtained. Various cookies that were not technically necessary had already been set before the user had the opportunity to interact with the cookie banner. The cookie banners were also incomplete, as they did not contain all the information to be provided to the user, such as the purpose of data processing. The data protection authority also criticised the lack of information about third-party providers used and the inadequate mechanism for refusing and revoking consent.

In calculating the fine, the Data Protection Authority took into account, among other things, the large number of violations and the enormous scale of the data processing in question. The Shein Group's website is visited by an average of 12 million people living in France every month. The responsible committee also pointed out that companies had already been repeatedly sanctioned for similar violations in 2020 and that the relevant decisions had been published.

(Christina Prowald)