Information on Data Protection

Annual Review 2025 and Outlook for 2026

Introduction

Data protection law in 2025 was characterised by a variety of court and authority decisions concerning the interpretation and application of the provisions of the GDPR. Central issues included claims for damages under Article 82 GDPR, as well as requests for information under Article 15 GDPR. Judgements were also handed down relating to data protection responsibility and the concept of pseudonymisation.

On May 16, 2025, our BRANDI Data Protection Law Day was held for the sixth time. This year, Professor Ulrich Kelber, former Federal Commissioner for Data Protection, was our guest at BRANDI. Together with Professor Kelber, we discussed current questions and developments on the topic of “Data Protection and Digitalisation”. In conversation with BRANDI’s lawyers, the guest speaker provided valuable insights into current issues and developments in the area of AI, as well as discussing his experience as Federal Commissioner for Data Protection.

To mark the beginning of the new year, we would like to use our traditional annual review to reflect on the events and key topics of the past year, before looking ahead to 2026 and potential developments.

Main topics of the BRANDI Data Protection Newsletter

Our Data Protection Newsletter reports monthly on current developments in data protection law. In addition, each issue features an in-depth examination of a selected data protection topic, in which we explore legal particularities and offer our readers practical guidance. The choice of topics is shaped by recent cases from our legal practice, or prompted by court decisions and statements from supervisory authorities. Over the past year, our newsletters have addressed the following main topics:

Case law

In the following, we address several particularly significant court decisions from the year 2025.

This year, the German Federal Court of Justice (BGH) dealt on multiple occasions with claims for damages arising from data protection violations. Reflecting the established case law of the European Court of Justice (ECJ), the BGH commented in concrete terms for the first time in March 2025 on the quantification of such claims (BGH, decision dated 28.01.2025 – VI ZR 183/22). Under the compensatory principle, the actual consequences of the data protection breach are decisive for the amount of damages to be awarded. In the case at hand, the BGH considered the amount of € 500 granted by the appellate court to be reasonable in light of the actual consequences, although the claimant had initially sought € 6,000. Furthermore, the BGH referred to the ECJ’s case law, according to which mere loss of control may already constitute compensable non-material damage, and ruled that no further infringement of personality rights or harm of a certain gravity is required (BGH, decision dated 11.02.2025 – VI ZR 365/22). As there is no materiality threshold to be met, the BGH stated that impairment of creditworthiness resulting from an unjustified SCHUFA entry in the form of a credit card block can indeed constitute non-material damage (BGH, decision dated 13.05.2025 – VI ZR 67/23). In May 2025, the BGH further decided that a purely hypothetical risk of misuse of personal data by an unauthorised third party does not give rise to a claim for damages (BGH, decision dated 13.05.2025 – VI ZR 186/22). Merely breaching the GDPR is not sufficient to establish a claim for damages under Article 82 GDPR; instead, material or non-material damage must have occurred, together with a causal link between the breach and the damage. The claimant must demonstrate that they have suffered damage as a result of the breach of the GDPR.
In this regard, the mere apprehension on the part of the data subject that their personal data could be misused by third parties as a consequence of a breach may, in itself, constitute non-material damage; a mere assertion without proven adverse effects, however, is not sufficient. Absent proof from the claimant, there is no actionable loss of control, only a purely hypothetical risk. On this issue, the BGH had already ruled in January that an unsolicited marketing email alone does not establish a claim for non-material damages under Article 82 (1) GDPR (BGH, decision dated 28.01.2025 – VI ZR 109/23). While sending such an email may qualify as a breach of the GDPR, this in itself does not suffice to justify a claim for non-material damages within the meaning of Article 82 (1) GDPR. In the case in question, such a claim would at least have required that the defendant, by sending the marketing email, had also made the claimant’s data accessible to third parties, or that the claimant had been able to substantiate their asserted apprehension of loss of control.

The Federal Fiscal Court (BFH) addressed the scope of the right of access and ruled in January 2025 that a public authority is not entitled to refuse a data subject’s request for access under Article 15 GDPR on the grounds of disproportionate effort by merely granting inspection of files instead (BFH, decision dated 14.01.2025 – IX R 25/22). While, under Article 14 (5) (b) alternative 2 GDPR, the controller is not obliged to fulfil its information obligations if this would involve disproportionate effort, this exception applies solely to the information duties pursuant to Article 14 GDPR and cannot be analogously extended to access requests. The right of access is, in particular, not subject to a general test of proportionality and cannot be rejected as excessive merely on the grounds that the data subject is requesting information about their personal data without having limited the request in terms of scope or time.

In a judgment from September 2025, the ECJ clarified the concept of personal data in relation to the transmission of pseudonymised data to third parties (ECJ, decision dated 04.09.2025 – C-413/23 P). Pseudonymisation is deemed a technical and organisational measure aimed at preventing data subjects from being identified without additional information. The data controller will generally possess supplementary information that enables the attribution of data to a specific individual. Conversely, data that are not considered personal data may become personal data when the controller transmits such data to a person who possesses the means or information necessary to effect identification. Consequently, in the context of pseudonymisation, whether data qualify as personal data depends on whether the third party is actually in a position to reverse the pseudonymisation and establish a link to an individual.
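The practical point can be illustrated with a short sketch (our own illustrative example, not part of the judgment; the key, email address, and function names are hypothetical): a keyed hash replaces a direct identifier with an opaque token, so only a party holding the key can re-derive the link to the individual, while a recipient without the key cannot.

```python
import hmac
import hashlib

# Hypothetical secret retained solely by the controller; without it,
# a third party cannot reverse the pseudonymisation.
SECRET_KEY = b"held-only-by-the-controller"

def pseudonymise(identifier: str, key: bytes) -> str:
    """Derive a stable pseudonym from a direct identifier via a keyed hash.

    The mapping is deterministic for a given key, so the key holder can
    always re-establish the link; the token alone reveals nothing.
    """
    return hmac.new(key, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

record = {"customer_id": "alice@example.com", "purchase_total": 42.50}

# Version shared with a third party: the identifier is replaced by a token.
shared = {**record, "customer_id": pseudonymise(record["customer_id"], SECRET_KEY)}

# The controller can re-derive the same token and thus re-identify:
assert shared["customer_id"] == pseudonymise("alice@example.com", SECRET_KEY)
```

Whether the shared record still counts as personal data in the recipient's hands then turns, as the ECJ held, on whether that recipient realistically has the means (here: the key) to effect identification.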

The issue of data protection responsibility was also the subject of a decision by the BGH this year. In its judgment from October 2025, the BGH held that, as a rule, employees are not to be regarded as controllers within the meaning of the GDPR (BGH, decision dated 07.10.2025 – VI ZR 294/24). In reaching this decision, the BGH referred to the established case law of the ECJ, according to which employees of a controller ordinarily act on the instructions of that controller and are therefore subordinate to it.

Finally, in September 2025, the General Court of the European Union (GC) dismissed an action for annulment against the adequacy decision issued by the European Commission on July 10, 2023 concerning data transfers to the USA, thereby confirming that the level of data protection in the USA was adequate at the time the contested decision was issued (press release of 3 September 2025). In its judgment, the GC first found that the independence of the members of the US Data Protection Review Court (DPRC) was ensured through a range of guarantees and rules governing both the operation of the DPRC and the appointment of its judges. With regard to the practices of US intelligence agencies, the authorisation for bulk collection of personal data is subject to subsequent review by the DPRC, which the GC deemed sufficient.

Developments in Legislation

On June 28, 2025, the Accessibility Enhancement Act (Barrierefreiheitsstärkungsgesetz, BFSG) entered into force in Germany. The BFSG aims to promote the equal and non-discriminatory participation of people with disabilities, impairments, and older people in relation to certain products and services, with particular focus on offerings in e-commerce. To this end, the Act stipulates various requirements regarding the design of specified products and services. The BFSG implements the EU Directive on the accessibility requirements for products and services of 17 April 2019 (Directive 2019/882), which harmonises technical accessibility standards and the requirements for accessible information.

Since August 1, 2025, the provisions of the second stage of the AI Act have applied. These regulations primarily concern the use of general-purpose AI models (General Purpose AI – GPAI), including widely used systems such as Mistral or ChatGPT. Additionally, as of August 2025, the provisions regarding sanctions and the organisation of the European AI Office have also come into effect. In future, the AI Office is to monitor and enforce compliance with the AI Act’s requirements for GPAI models across national borders and coordinate this with the supervisory authorities in the EU Member States (in Germany, the Federal Network Agency).

Since September 12, 2025, the Data Act has applied, promoting access to, sharing, and use of data. The Data Act applies to manufacturers, data holders, and users of connected products and related services. It governs the right of access to and provision of data, as well as the right to share data with third parties. Furthermore, the Data Act introduces new mandatory contract types, including the data usage agreement for the processing of non-personal data (Article 4 (13) Data Act), the provision agreement between data holder and data recipient (Article 8 et seq. Data Act), and the contract to facilitate switching between data processing services (Article 23 et seq. Data Act).

Activities of Supervisory Authorities  

In 2025, as in previous years, the data protection supervisory authorities of the EU Member States addressed a variety of data protection issues. Particular focus was placed on the imposition of administrative fines for data protection violations as well as the publication of opinions and guidance on selected topics.

Fines

The Spanish data protection authority (AEPD) imposed a fine of 4 million euros on the insurance company Generali España. The penalty was prompted by a cyberattack that had persisted since September 19, 2022 and was only discovered on October 5, 2022. Generali España did not report the attack on its systems until November 2022. During the attack, hackers gained access to the systems and thereby to the personal data of former customers. In reaching its decision, the supervisory authority took into account not only the data protection violations themselves, but also the way Generali España handled the data protection incident.

The Irish Data Protection Commission (DPC) imposed a fine of 530 million euros on TikTok Technology Limited (TikTok) on May 2, 2025 for the unlawful transfer of European users’ data to China. The DPC found that data belonging to European users could be accessed from Chinese servers. Initially, TikTok had stated that no user data were stored on servers in China. Later, the company revised its position after discovering a problem that meant some EU user data were in fact stored, albeit to a limited extent, on servers in China. According to the supervisory authority, TikTok had failed to adequately assess the level of data protection and, as a result, to implement appropriate safeguards. Furthermore, the transfer of data to third countries was not mentioned in the privacy policy.

The Federal Commissioner for Data Protection and Freedom of Information (BfDI) imposed two fines totalling 45 million euros on Vodafone GmbH. A fine of 15 million euros was imposed because the company failed to adequately fulfil its obligation to monitor its commissioned data processors for data protection compliance. Employees of partner agencies were able — at the expense of the affected customers — to create unauthorised new contracts or manipulate existing ones. Due to severe security flaws in the authentication process when using the "MeinVodafone" online portal in combination with the customer hotline, a further fine of 30 million euros was imposed. The inadequate security measures enabled unauthorised persons, for example, to retrieve eSIM profiles.

The French data protection authority (CNIL) imposed a fine of 325 million euros on Google. Google displayed advertising in the form of emails to Gmail users without their consent and thereby, in the view of the authority, breached Article L.34-5 of the French Postal and Electronic Communications Code (CPCE). Furthermore, when creating a Google account, Google set cookies that enabled personalised advertising to be shown. According to the CNIL, the consent obtained was invalid because Google did not sufficiently inform users that Google services could not be used without allowing cookies for advertising purposes.

The Hamburg Commissioner for Data Protection and Freedom of Information (HmbBfDI) imposed a fine of 492,000 euros on a company in the financial sector for automated decisions on credit card applications. The financial institution rejected the credit card applications of several customers by way of automated decisions, despite their good creditworthiness. When the affected customers requested an explanation for the rejection, the company failed to adequately fulfil its data protection information and disclosure obligations.

The Croatian data protection authority imposed a fine of 4.5 million euros on a telecommunications company for the unlawful transfer of personal data to a third country. Between April 2022 and December 2022, the data was still transferred to the processor in Serbia on the basis of Standard Contractual Clauses (SCCs). After this period, however, no such clauses were in place and the EU Commission had not issued an adequacy decision for Serbia. No risk assessment was carried out by the company and the data protection officer’s comments were ignored. The data transfer violated Article 44 in conjunction with Article 46 (1) GDPR, which require the implementation of appropriate safeguards for international transfers. Furthermore, customers were not informed about the transfer of their data to a third country.

Guidance and Opinions

The State Commissioner for Data Protection in Lower Saxony (LfD) has pointed out the data protection risks associated with the use of the Chinese AI tool DeepSeek. It is currently assumed that the tool does not meet the requirements of the GDPR or the AI Act. According to the provider’s privacy policy, all information and documents entered into the tool are comprehensively recorded, transmitted, stored, and analysed without restriction. The provider also states that it is obligated to pass the data on to Chinese intelligence services and security authorities.

With regard to the obligation to offer a guest checkout option in online retail, the HmbBfDI achieved improvements through a proactive review that was not prompted by complaints. In January 2025, in view of a resolution by the Conference of the German Data Protection Authorities (DSK), according to which online retailers must not generally require customers to set up a customer account but must instead allow guest checkout without permanent registration, the HmbBfDI reviewed relevant Hamburg-based online shops. The majority of the websites examined already offered a corresponding option. In one case, the shop provided the guest checkout facility following a request from the HmbBfDI.

The State Commissioner for Data Protection in North Rhine-Westphalia (LDI) published a statement for businesses in July 2025, clarifying when the processing of health data is permitted and the extent to which employers may access information regarding the health status of their employees. According to the LDI NRW, the principal legal basis consists, in particular, of Section 26 (3) of the German Federal Data Protection Act (BDSG) in conjunction with Article 9 (2) (b) of the GDPR and the provisions of the German Continued Remuneration Act (Entgeltfortzahlungsgesetz, EFZG), as well as Article 6 (1) (b) of the GDPR in conjunction with the employment contract. Due to the prevailing power imbalance, consent is generally not considered a valid basis. Furthermore, health data must be stored separately from personnel files and the usual medical certificates of incapacity for work.

The data protection supervisory authorities of the Federal States have criticised a recent legislative initiative by the Federal Ministry for Digital Affairs and State Modernisation. The initiative seeks to transfer market surveillance of certain AI systems relevant to fundamental rights to the Federal Network Agency, even though this responsibility has already been assigned to the data protection supervisory authorities under the AI Act. According to the draft, these changes are intended to remove obstacles to innovation. The systems affected are, in particular, high-risk AI applications deployed for the purposes of law enforcement, border management, the judiciary, and democratic processes.

Outlook for 2026

Certain data protection issues from previous years will remain significant in 2026 – for example, questions relating to claims for damages under Article 82 GDPR, as well as the use of various AI tools. In addition to these, it is expected that new data protection topics will also emerge.

The ECJ currently has two preliminary references pending. In one proceeding concerning a “Google Fonts data protection violation”, the BGH submitted, among other things, questions to the ECJ in August 2025 regarding whether a claim for damages under Article 82 (1) GDPR may exist even if the data subject has deliberately and solely brought about a data protection breach with the express purpose of asserting the violation, and – should the first question be answered in the affirmative – whether this applies even in cases where similar breaches are provoked in large quantities by automated means (BGH, decision dated 28.08.2025 – VI ZR 258/24). In September 2025, the BGH suspended proceedings concerning Facebook’s advertising of its service as “free of charge” in order to refer a question to the ECJ on the interpretation of the term “costs” within the meaning of the Directive on Unfair Commercial Practices (BGH, decision dated 25.09.2025 – I ZR 11/20). The ECJ’s decisions in both matters remain to be seen.

In November 2025, the European Commission published a digital package comprising a so-called Digital Omnibus, a strategy for the Data Union, and the introduction of European Business Wallets. The aim of the Digital Omnibus is to simplify existing regulations on artificial intelligence, cybersecurity, and data in the future. Additionally, the Data Union strategy includes measures to unlock high-quality data for AI by expanding access to such data. Finally, the Commission proposed the introduction of the European Business Wallet as a digital tool designed to digitise business processes, which is expected to reduce administrative burdens by 150 million euros per year.

The BRANDI data protection law team will, of course, continue to keep you updated on data protection events and challenges arising in 2026 throughout the new year.

Mira Husemann

Research Associate