
Data Protection Newsletter 01/2026

Every year brings new developments and challenges in data protection law. Traditionally, we use the January edition of our newsletter to review the key data protection events of the previous year. In the main topic of this first issue, we therefore look back at the data protection year 2025 and venture an outlook on the coming year, 2026.

As ever, we are pleased to keep you informed about current legal developments in data protection throughout the new year. In this edition, for example, we discuss the decision by the European Court of Justice (ECJ) regarding the data protection responsibility of online marketplace operators, a ruling by the German Federal Court of Justice (BGH) on the retention of personal data by data processors, reform proposals from the German Data Protection Conference (DSK) concerning the GDPR, as well as fines handed down by the Spanish and Polish data protection authorities.

Should you have any feedback on this newsletter or questions about the topics covered, please send us an email at datenschutz@brandi.net. Further contact details can also be found on our website.

Dr. Sebastian Meyer, LL.M.

Lawyer and Notary in and for Bielefeld
Certified Specialized Attorney in Information Technology Law (IT-Recht)
Data Protection Auditor (TÜV)

Information and contact

Topic of the Month / January 2026

Annual Review 2025 and Outlook for 2026

Data protection law in 2025 was characterised by a variety of court and authority decisions concerning the interpretation and application of the provisions of the GDPR. Central issues included claims for damages under Article 82 GDPR and requests for information under Article 15 GDPR. Judgments were also handed down on data protection responsibility and the concept of pseudonymisation.

On May 16, 2025, our BRANDI Data Protection Law Day was held for the sixth time. This year, Professor Ulrich Kelber, former Federal Commissioner for Data Protection, was our guest at BRANDI. Together with Professor Kelber, we discussed current questions and developments on the topic of “Data Protection and Digitalisation”. In conversation with BRANDI’s lawyers, the guest speaker provided valuable insights into current issues and developments in the area of AI, as well as discussing his experience as Federal Commissioner for Data Protection.

To mark the beginning of the new year, we would like to use our traditional annual review to reflect on the events of the past year, as well as our key topics. Finally, we would like to look ahead to 2026 and potential developments.

Other topics in this newsletter

ECJ

Data Protection Responsibility of Online Marketplace Operators

In its judgment of December 2, 2025, the European Court of Justice (ECJ) ruled that operators of online marketplaces qualify as data controllers within the meaning of Article 4 (7) GDPR for advertisements published on their platforms (ECJ, decision dated 02.12.2025 – C-492/23). Consequently, they are required to implement technical and organisational measures to prevent adverts from unlawfully disclosing sensitive data.

The judgment was preceded by a legal dispute in which a claimant took action against the platform operator after being wrongly portrayed in an advert on the marketplace as providing sexual services.

The ECJ reasoned that the operator publishes adverts in its own commercial interest and thus participates in determining the purposes of the data processing. By providing the online marketplace and setting the parameters for the dissemination of adverts, the operator also plays a part in deciding the means of such publication.

This classification as a data controller entails comprehensive obligations under the GDPR. In light of this case law, operators of online marketplaces should now review whether they too qualify as controllers for the advertisements published on their websites and, where appropriate, implement protective measures.
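
What counts as sufficient "technical and organisational measures" will depend on the individual platform. Purely by way of illustration, and not as a statement of what the ECJ requires, one conceivable building block is an automated pre-publication check that holds back adverts containing apparently sensitive content for manual review. The following TypeScript sketch is our own simplified assumption: the interfaces, function names and keyword list are hypothetical and would not, on their own, constitute an adequate compliance measure.

```typescript
// Illustrative pre-publication screening for marketplace adverts.
// All interfaces, names and patterns below are hypothetical; a real
// deployment would need more robust detection and a documented
// manual-review process.

interface AdvertDraft {
  id: string;
  title: string;
  body: string;
}

type ScreeningResult =
  | { status: "publish" }
  | { status: "manual-review"; reasons: string[] };

// Rough indicators of special-category data (Art. 9 GDPR); purely
// illustrative and far from exhaustive.
const SENSITIVE_PATTERNS: { label: string; pattern: RegExp }[] = [
  { label: "health data", pattern: /\b(diagnosis|therapy|medication)\b/i },
  { label: "sexual content", pattern: /\b(escort|sexual services)\b/i },
  { label: "political or religious views", pattern: /\b(party membership|religion)\b/i },
];

function screenAdvert(draft: AdvertDraft): ScreeningResult {
  const text = `${draft.title} ${draft.body}`;
  const reasons = SENSITIVE_PATTERNS
    .filter(({ pattern }) => pattern.test(text))
    .map(({ label }) => label);

  return reasons.length > 0
    ? { status: "manual-review", reasons }
    : { status: "publish" };
}

// Example: this draft would be held back for human review before publication.
console.log(
  screenAdvert({
    id: "ad-123",
    title: "Private listing",
    body: "Offering sexual services, contact via phone",
  })
); // -> { status: "manual-review", reasons: ["sexual content"] }
```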

BGH

Retention of data by a processor constitutes a breach of duty

In its judgment of November 11, 2025, the German Federal Court of Justice (BGH) held that the data controller remains responsible for protecting the rights of data subjects even after the termination of a processing contract. The controller must ensure that, as a general rule, no personal data remain with the processor once the contractual relationship has ended (BGH, decision dated 11.11.2025 – VI ZR 396/24).

The claimant was a user of a French online music streaming service. The defendant, as the service operator, had engaged an external company as a data processor for technical operations until December 2019. After the contract ended, the users' personal data were not deleted by the processor, as had previously been indicated, but were instead initially moved to a test environment. In 2022 it emerged that these data, including names, email addresses and further personal details, had been put up for sale on the dark web. The data are believed to have been circulated either as a result of a hacking attack or of unauthorised disclosure by the service provider. The claimant subsequently claimed non-material damages from the defendant pursuant to Article 82 GDPR.

The German Federal Court of Justice found that the defendant had infringed data protection law in the present case. In the court’s opinion, it is not sufficient simply to conclude a contract with the processor which imposes obligations to delete the data and provide evidence of deletion at the end of processing. Rather, the controller must take all measures reasonably required in the circumstances to ensure that the processor fulfils its contractual obligations. In this instance, the defendant failed in that duty, as it did not obtain the necessary confirmation that the data had been deleted following the end of the contract.

OVG Koblenz

The GDPR generally protects only the data of living individuals

In a judgment dated November 28, 2025, the Higher Administrative Court (OVG) of Koblenz addressed the question of whether the right to lodge a data protection complaint under Article 77 GDPR constitutes a transmissible legal position within the meaning of Section 1922(1) of the German Civil Code (BGB) (OVG Koblenz, decision dated 28.11.2025).

The claimant had lodged a complaint with a data protection supervisory authority regarding the processing of data relating to her now-deceased spouse. Following an examination, the authority closed the proceedings without finding any unlawful data processing. The claimant challenged this decision by seeking judicial review.

The court established, as a starting point, that the right to lodge a complaint under Article 77 GDPR is vested exclusively in the data subject. The processing at issue did not concern the claimant herself, but rather her deceased spouse. The right to complain about data protection infringements is not inheritable, as the protection granted by the GDPR generally applies only to living natural persons. First, the right to informational self-determination, a core interest protected by the GDPR, is fundamentally designed to ensure the individual's control over their own data, which argues against extending it to deceased persons. Furthermore, Recital 27 of the GDPR makes it clear that the Regulation does not apply to the personal data of deceased persons. The right to lodge a complaint therefore ceases upon the data subject's death.

Although the Higher Administrative Court in this case dealt solely with the right to lodge a complaint, its reasoning can, in principle, be extended to other data subject rights. Consequently, it is generally to be assumed that other data subject rights also lapse upon the death of the individual concerned.

LG Lübeck

Processing of personal data using Meta Business Tools

The Regional Court of Lübeck, in a judgment dated November 27, 2025, ruled against Meta in connection with its use of the Meta Business Tools, issuing a far-reaching prohibition and awarding damages of € 5,000 to a user (LG Lübeck, decision dated 27.11.2025 – 15 O 15/24).

Meta provides what are known as Meta Business Tools, which numerous third-party providers integrate into their websites and apps to record user behaviour for advertising purposes. When users visit such third-party sites or use third-party apps, data may be transferred to Meta. The court first established that Meta is also to be regarded as a controller within the meaning of Article 4 (7) GDPR for the data collected via the Business Tools it designed, since, according to the case law of the ECJ, it is sufficient that a party enables the collection of data and has influence over the categories of data collected.

The court found no justification under Article 6 GDPR for the data processing in question; in particular, the data subject's consent had not been obtained. In the court's view, the settings available within Instagram do not alter this assessment. Although users can, for example, influence how Meta further uses data from activities outside its platforms, such as for personalised advertising, these settings cannot substitute for prior consent to the collection and storage of the data concerned. The court therefore held that Meta was not permitted to process the claimant's personal data.
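
For website and app operators who embed such tools, the practical consequence of the judgment is that tracking code must not collect or transmit any data before valid consent has been obtained. The following TypeScript sketch shows one common pattern for this, a consent gate that only loads a third-party tracking script after the user has opted in. The script URL, storage key and function names are hypothetical placeholders and do not reflect Meta's actual integration; they merely illustrate the principle of consent before collection.

```typescript
// Minimal consent gate for a third-party tracking script (browser code).
// The script URL and storage key are hypothetical placeholders.

const TRACKING_SCRIPT_URL = "https://tracking.example.com/pixel.js";
const CONSENT_STORAGE_KEY = "marketing-consent";

// In practice, consent state would come from a consent management
// platform that also documents when and how consent was given.
function hasMarketingConsent(): boolean {
  return window.localStorage.getItem(CONSENT_STORAGE_KEY) === "granted";
}

function loadTrackingScript(): void {
  const script = document.createElement("script");
  script.src = TRACKING_SCRIPT_URL;
  script.async = true;
  document.head.appendChild(script);
}

// Called on page load: before consent is recorded, no request is made
// to the third party at all, so no usage data leaves the site.
export function initTracking(): void {
  if (hasMarketingConsent()) {
    loadTrackingScript();
  }
}

// Called by the consent banner when the user actively opts in.
export function onConsentGranted(): void {
  window.localStorage.setItem(CONSENT_STORAGE_KEY, "granted");
  loadTrackingScript();
}
```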

DSK

Proposals for reforming the GDPR

The German Data Protection Conference (DSK), a body made up of the independent federal and state data protection authorities in Germany, issued statements at its 110th conference regarding the European Commission's draft Digital Package (the "Digital Omnibus"). The DSK first advocated that manufacturers and providers of standard solutions should in future bear a greater share of data protection responsibility (resolution dated 12.12.2025). According to the DSK, this would make it easier for small and medium-sized enterprises to use such standard solutions.

A reform of the GDPR should serve to extend the principle of manufacturer responsibility and align it with other digital regulatory acts, such as the Cyber Resilience Act or the AI Regulation. Although the GDPR already enshrines corresponding principles through Data Protection by Design and by Default (Art. 25 GDPR), to date only the users themselves have borne responsibility under data protection law, even where they have no influence over a product's design. In the DSK's view, manufacturers should in future be held accountable to a greater degree for data protection-friendly product design and compliance, thereby reducing users' liability risks.

Beyond this, the DSK highlights the need for new, specific provisions governing the operation of AI models and systems (resolution dated 12.12.2025).

The greatest need for improvement, it contends, lies in the creation of AI-specific legal bases. The DSK names three illustrative applications: for the training of AI models, web scraping and the further processing of data originally collected for other purposes; and, for the operation of AI models, the processing of personal data memorised by the model.

The DSK also sees a need for action in protecting data subject rights, as practical difficulties frequently arise here in connection with the use of AI systems. It therefore advocates the introduction of functionally equivalent or compensatory protections. For example, the general duty to inform and the right of access for data subjects should be extended to include information on whether personal data are processed within AI systems.

EDPB

Dealing with Mandatory User Accounts

The European Data Protection Board (EDPB) has issued recommendations on the legal basis for mandatory creation of user accounts on e-commerce websites (recommendations dated 03.12.2025).

Users are often required to create an online account on e-commerce websites before being able to purchase goods or access services. While the controller may have a legitimate commercial interest in requiring a user account, the EDPB considers that such account creation may also pose risks to the rights and freedoms of the data subjects. In its recommendations, the EDPB examines potential legal bases under Article 6 GDPR with regard to their suitability for justifying the mandatory creation of accounts. It concludes that an obligation to create an account is justified only in very limited circumstances, such as where a subscription service is being offered. Ultimately, the EDPB considers it most appropriate to give users the choice of whether to create an account or continue as guests.

Spain

Fine imposed for planned use of AI-based facial recognition

The Spanish data protection authority (AEPD) has imposed a fine of € 750,000 on the Valencian International University for its planned use of AI-based facial recognition software (press release dated 21.11.2025).

The university intended to use AI-based facial recognition software during examinations to detect potential attempts at cheating. Continuous video recordings via webcam were to be made throughout the entire duration of the exam in order to verify students’ identities based on biometric characteristics and to identify suspicious behaviour. The students were informed of this in advance. However, the university did not provide an alternative solution that would not involve the processing of biometric data.

The data protection authority found that valid consent was lacking, given the imbalance of power between the university and the students and the absence of any genuine alternative. Moreover, less intrusive measures would have been available for the intended purposes of identity verification and the prevention of cheating.

Spain

Fine imposed for the loss of documents

The AEPD has imposed a fine of € 80,000 on a courier service after documents sent via the company were lost.

As part of a bank registration process, the data subject sent the necessary documents — such as a registration form and a copy of their identity card — using a courier appointed by the bank. However, the documents never reached their intended recipient, as they were misplaced by the courier. It was also revealed that the courier company had not established a valid data processing agreement with its sub-processor.

The data protection supervisory authority took direct action against the courier, rather than the actual data controller, the bank. In determining the amount of the fine, it emphasised that, although only one person was directly affected, the breaches could have potentially impacted all customers of the bank.

This decision by the AEPD demonstrates that, when calculating fines, not only actual damage and risk are taken into account, but also the potential dangers posed by such infringements.