
Data Protection Newsletter 08/2025
A survey conducted by the TÜV Association on cybersecurity in German companies has revealed that 73% of the 506 companies surveyed consider cybersecurity to be important. Large companies with more than 250 employees in particular feel threatened by organized criminals (73%) and state-sponsored hackers (64%), but also by their own employees (52%). The survey also shows that the number of successful cyberattacks is rising. While 11% of the companies surveyed were affected by an IT security incident in 2023, 15% of all respondents have now experienced at least one successful cyberattack in the last 12 months. Almost one in seven companies has thus recently fallen victim to cybercriminals, and the source of an attack usually cannot be identified. Phishing and spear-phishing attacks (84%), password attacks (12%), and ransomware attacks (12%) are the most common. The damage caused by successful cyberattacks is usually limited: around two-thirds of respondents (65%) stated that there were no consequences. In 6% of cases, however, the attacks led to serious damage, and in 1% of cases to damage that threatened the company's existence. More than nine out of ten companies rate their own cybersecurity as very good in the survey. This positive self-assessment is encouraging, but it also carries the risk that companies overestimate their resilience and underestimate the capabilities of attackers. To increase cybersecurity in companies, the TÜV Association concludes by recommending that companies take cyber risks seriously, develop cybersecurity strategies, and draw up an action plan.
If you have any feedback on this newsletter or questions in connection with the topics covered in the newsletter, please send an email to datenschutz@brandi.net. Further contact details can also be found on our homepage.
Dr. Sebastian Meyer and the BRANDI data protection team
Topic of the month: Data protection at trade fairs
Trade fairs offer companies the opportunity to present their products and services to a wider audience, increase awareness of their brand(s), and position themselves positively in the relevant markets. At the company's booth, trade fair visitors can learn about the latest products and services and get in touch with company representatives. This alone usually does not involve the processing of personal data that is relevant under data protection law. The assessment may differ if trade fair visitors are persuaded to purchase a product on site or are encouraged to subscribe to a newsletter: both contract performance and newsletter distribution require the collection, storage, and use of personal data. Personal data should always be handled properly and professionally. This article therefore explains how personal data can be processed at trade fairs in compliance with data protection law.
EU Commission sticks to AI Regulation timetable
The staggered applicability of the AI Regulation remains in place without delay. The second-stage provisions have applied since August 1, 2025. They primarily concern the use of general-purpose AI (GPAI) models, which include widely used systems such as Mistral and ChatGPT. In addition, the provisions on sanctions and on the organization of the European Artificial Intelligence Office (AI Office) now also apply. In the future, the AI Office will monitor and enforce compliance with the requirements of the AI Regulation for GPAI models across the member states and coordinate for this purpose with the competent national supervisory authorities (in Germany, the Federal Network Agency).
The Artificial Intelligence Regulation (AI Regulation) entered into force on August 1, 2024. The regulation contains comprehensive rules for artificial intelligence and aims to safeguard fundamental rights and promote innovation. We reported on the AI Regulation, among other issues, in our April 2024 newsletter. The regulation provides for gradual implementation in four phases. The first provisions have applied since February 2025. This initial implementation phase included the ban on AI systems posing an unacceptable risk under Article 5 of the AI Regulation, as well as the application of the general provisions in Articles 1–4 of the AI Regulation. According to a media report, there had been calls to postpone important deadlines in the AI Regulation (report dated 08.07.2025). In particular, the delay in drafting the Code of Practice that the European Commission is developing for general-purpose AI was cited as an argument for a possible postponement. Regardless of this, however, the Commission is sticking to the original schedule. This was recently confirmed once again by the Commission's spokesperson for digital affairs, Thomas Regnier. Incidentally, the European Commission has now received the final version of the Code of Practice, which is awaiting evaluation by the AI Office.
(Geraldine Paus)
EU Commission’s online dispute resolution platform has been shut down
The European Online Dispute Resolution (ODR) platform was shut down on July 20, 2025 (Regulation (EU) 2024/3228). The platform had been made available to consumers and businesses as a central point of contact for the out-of-court settlement of disputes arising from online purchase and online service contracts. It was used to identify the appropriate national body for alternative dispute resolution.
The reasons given for shutting down the platform are its low usage for complaints and the low number of positive responses from companies, as a result of which requests often could not be forwarded via the platform to a listed alternative dispute resolution body.
On the Online Dispute Resolution website, the European Commission provides information about the shutdown of the platform and further information on the resolution of consumer disputes and national contact points.
With the shutdown of the platform, online retailers must adapt their websites and remove information about the ODR platform and links to it.
(Geraldine Paus)
LG Nuremberg-Fürth on the obligation of social networks to provide information
Data subjects have a far-reaching right of access regarding the processing of their personal data, which also includes personal data obtained by an online network via third-party websites and apps.
The Regional Court of Nuremberg-Fürth had to deal with the scope of the right of access regarding personal data after an online network refused to provide information about the data it had obtained via third-party websites and apps. The Regional Court ordered the defendant to provide the information in accordance with Art. 15 (1) (a), (c), (g), and (h) GDPR, to completely delete the data after providing the information because of the unlawful processing, and to pay the plaintiff 500.00 euros for the resulting loss of control over the data (LG Nuremberg-Fürth, final judgment of February 20, 2025 – 6 O 1485/24). In the court's view, the far-reaching right of access under Art. 15 GDPR also covers personal data obtained by an online network on third-party websites and apps. The court was convinced that the defendant processed the plaintiff's personal data via the network, in particular through tools on third-party websites and apps, and was therefore a controller within the meaning of Art. 4 No. 7 GDPR and obliged to provide information to the plaintiff. For the lawful processing of personal data via third-party websites and apps, effective consent must be obtained. Taking into account the scope of the data processing in question and the necessary transparency, effective consent can be ensured if it is obtained separately for internal and external data. However, the defendant failed to obtain effective consent for the processing of data collected on third-party websites and apps. The defendant is the sole controller, and obtaining the relevant consent cannot be “outsourced” to the website operators.
In addition, a claim for damages pursuant to Art. 82 (1) GDPR is justified, as the defendant processed the plaintiff's personal data unlawfully, thereby causing damage that can be causally attributed to the data protection violation. Furthermore, the unlawful processing of personal data in online networks gives rise to a right to erasure pursuant to Art. 17 (1) (d) GDPR.
(Geraldine Paus)
Austrian BVerwG: Fine of 15,000 euros for non-functioning data protection email address
The Austrian Federal Administrative Court imposed a fine of 15,000.00 euros on a company for several data protection violations (BVerwG Austria, decision dated 27.03.2025 - W2982285480-1/10E).
In fulfillment of its data protection obligations, the company, as the controller, listed an email address on its website through which data subjects could exercise their rights under Art. 15 GDPR. However, this email address did not actually exist or was not assigned to a valid domain, so a customer's deletion request sent to this address never reached the company and was not processed. The customer concerned therefore filed a complaint with the competent data protection authority. The company deleted the customer's personal data only after repeated requests and the involvement of the data protection authority. In the view of the Federal Administrative Court, a controller violates its obligation under Art. 12 (2) GDPR if its published email address does not work for an extended period.
The company also failed to respond to further letters from the data protection authority and did not amend its privacy policy. It was therefore additionally found to have violated the obligation to provide information under Art. 12 (3) in conjunction with Art. 17 GDPR and the obligation to cooperate with the data protection authority under Art. 31 GDPR.
(Mira Husemann)
VG Bremen: Use of customer data after contract termination
In its ruling of April 23, 2025, the Administrative Court of Bremen decided that post-contractual use of customer data for recovery purposes is permissible under data protection law for up to 24 months (VG Bremen, decision dated 23.04.2025 - 4 K 2873/23).
The underlying dispute concerned an energy supply company that conducted door-to-door sales visits up to 24 months after the termination of a contract in order to win back former customers. The personal data processed for this purpose included the address and meter reading information as well as, in individual cases, the title and name of the former customers.
This data processing constitutes processing for a secondary purpose pursuant to Art. 6 (1) (f) in conjunction with Art. 6 (4) GDPR: the data was originally collected for a different purpose, namely advertising by mail during the term of the contract, and was later also used for post-contractual door-to-door advertising. The secondary purpose is compatible with the primary purpose of the original data collection. This requires a close connection between the data collection and the further processing; in particular, the further processing must be covered by the legitimate expectations of the data subjects. Here, the close connection follows from the purpose of winning back customers. Moreover, even though the further processing was not foreseeable on the basis of the company's privacy notice, advertising through personal contact and temporary use of the data after the end of the contract is reasonably to be expected and is even desired by some customers. It should also be noted that the processing neither concerns particularly sensitive data nor has serious consequences for the customer.
In addition, storage for up to 24 months is necessary within the meaning of Art. 6 (1) (f) GDPR. Energy supply contracts usually run for 12 to 24 months, so reusing the data after this period is the most economically sensible means of winning back customers.
(Mira Husemann)
Damages for unauthorized use of Meta Business Tools
The Regional Court of Stuttgart ordered Meta to pay damages in the amount of 300 euros for the unlawful storage of off-site data (LG Stuttgart, decision dated 05.02.2025 – 27 O 190/23).
The contested “off-site data” is data from third-party websites and apps that is merged with user accounts on Meta services. When a third-party provider integrates Meta Business Tools into its website or app, this off-site data is transmitted to Meta for the purpose of displaying personalized advertising.
Meta cannot rely, for the storage of the off-site data, on the consent obtained by the third-party providers for the transfer of the data. Rather, storing the data for further processing or use constitutes independent data processing pursuant to Art. 4 No. 2 GDPR and requires a separate legal basis. The defendant alone is responsible for the data storage. Since the defendant was unable to present any justification, the storage is unlawful and gives rise to a right to erasure under Art. 17 (1) GDPR.
In addition, the plaintiff suffered non-material damage within the meaning of Art. 82 GDPR due to a loss of control over the off-site data. Although the defendant makes it possible to disconnect the data collected and transmitted via the Meta Business Tools from the user account so that it can no longer be attributed, it is not possible to delete the data through the account settings, and the purpose of the data storage is likewise not transparent.
(Mira Husemann)
OLG Frankfurt a.M.: Necessity of data for contract fulfillment
In its ruling of July 10, 2025, the Higher Regional Court of Frankfurt am Main decided that the mandatory provision of an email address or mobile phone number when purchasing a travel ticket constitutes unlawful data processing due to a lack of legal basis and violates Art. 5 (1) (a) GDPR (OLG Frankfurt a.M., decision dated 10.07.2025 - 6 UKl 14/24).
Until December 15, 2024, the defendant, DB Fernverkehr AG, offered its “Spar” and “Super-Spar” tickets exclusively as digital tickets under its own conditions of carriage. To purchase these tickets, customers were required to provide an email address or mobile phone number.
The data processing violates the prohibition on tying under Art. 7 (4) GDPR and, for lack of voluntariness, cannot be based on the customers' consent pursuant to Art. 6 (1) (a) GDPR. In particular, the defendant did not succeed in proving that the contract, i.e. the provision of the transport service, cannot be performed without the data processing. Only the collection of the customer's identity data, not the collection of their email address, is necessary to protect against unauthorized access, reproduction, misuse, and transfer. In addition, according to the case law of the ECJ, the controller's dominant position in the long-distance rail transport market must be taken into account as an indication of a lack of voluntariness. In the absence of other reasonable access to equivalent services on the market, customers are not in a position to refuse consent without suffering disadvantages. They are consequently placed under pressure that negates their freedom of choice.
Since the sole purpose of the digital ticket is to prove the conclusion of the transport contract and the payment of the fare, the collection and processing of the email address is necessary neither for the performance of the contract pursuant to Art. 6 (1) (b) GDPR nor for the pursuit of legitimate interests pursuant to Art. 6 (1) (f) GDPR. That the processing of the main service is thereby facilitated and made more efficient is not sufficient to establish necessity.
(Mira Husemann)
OLG Nuremberg: Burden of proof and presentation in deletion claims
In its decision of June 11, 2025, the Higher Regional Court of Nuremberg had to deal with the burden of proof and presentation in data processing pursuant to Art. 6 (1) (f) GDPR (OLG Nuremberg, decision dated 11.06.2025 - 3 U 383/25). According to general principles of civil law, the burden of proof and presentation for an overriding interest in the protection of personal data lies with the data subject themselves.
For the purpose of providing business credit information in the credit sector, the defendant stores two entries relating to the plaintiff's payment history. The entries concern claims of 100 euros and 201.30 euros, which the plaintiff, as the debtor of these claims, has left unpaid for four years. The plaintiff demanded the deletion of the entries, the correction of the score value, and that the entries no longer be stored in the future.
In this specific case, the data processing serves the socio-economic interests of the credit sector and therefore constitutes a legitimate interest pursuant to Art. 6 (1) (f) GDPR for the defendant and its contractual partners. The wording “[...] unless [...]” establishes a rule-exception relationship for the lawfulness of the data processing. Consequently, the lawfulness can be rebutted by the data subject if their interests outweigh those of the defendant; merely equivalent interests do not suffice. To demonstrate that their interests prevail, the data subject must set out the negative effects in concrete terms. In particular, blanket references to an interest in economic participation or to the feeling of being unjustifiably portrayed in a negative light are not sufficient.
Nor does past case law support a different allocation of the burden of proof. According to the case law of the ECJ, the burden of proof for compliance under the accountability principle pursuant to Art. 5 (2) and Art. 24 GDPR lies with the controller (ECJ, decision dated 14.12.2023 - C-340/21). However, such matters of public supervisory law do not give rise to a general burden of proof on the part of the controller. The case law on freedom-of-expression disputes, in which the evidentiary rule of Section 186 of the German Criminal Code (StGB) is applied via Section 823 (2) of the German Civil Code (BGB), is likewise not transferable to this case, as there is neither defamation nor the dissemination of a factual claim.
(Mira Husemann)
Regulatory authorities advocate standardization of reporting channels
The data protection supervisory authorities of the German federal states have spoken out in favor of simplifying the reporting requirements under the NIS 2 Directive (press release of 04.07.2025). Under the proposal, companies would be able to report IT security incidents under the new NIS 2 Directive and data protection incidents under the GDPR in a uniform procedure and submit the relevant documents within a single process. From the authorities' perspective, this standardization would reduce bureaucracy, noticeably ease the burden on companies, and accelerate administrative procedures. The data protection authorities submitted a corresponding proposal for a legislative amendment to the Federal Ministry of the Interior as part of the consultation of the federal states and associations.
The Berlin State Data Protection Commissioner, Meike Kamp, emphasized: “Instead of having to submit reports twice, companies should be able to complete all reports in one step.” Bettina Gayk, State Commissioner for Data Protection and Freedom of Information in North Rhine-Westphalia, also commented: “Reducing bureaucratic hurdles for companies makes perfect sense. We data protectionists don't want unnecessary bureaucracy either. Our work aims to ensure that people's privacy and self-determination remain secure.”
(Christina Prowald)
Berlin data protection commissioner wants to ban AI app DeepSeek from German app stores
On June 27, 2025, the AI app DeepSeek was reported as illegal content to the app platforms Google and Apple by Meike Kamp, Berlin's Commissioner for Data Protection and Freedom of Information (press release of 27.06.2025). The operators have not yet decided whether to block the app.
DeepSeek is an AI-powered multifunctional chatbot operated by Hangzhou DeepSeek Artificial Intelligence Co., Ltd., based in Beijing, China. According to its own information, the service processes personal data on a large scale and transmits it to processors in China. In addition, the data is stored on Chinese servers.
Although the company has no establishment in the European Union, the GDPR nevertheless applies pursuant to Art. 3 (2) (a) GDPR, and the high European data protection standards must also be met when personal data is transferred to third countries. This is because the service can be used in German and is offered to German users via app platforms such as the Google Play Store and the Apple App Store with a German-language description. For China, there is currently neither an adequacy decision nor appropriate safeguards within the meaning of Art. 46 (1) GDPR, which means that the data processing is considered unlawful.
On May 6, 2025, the supervisory authority unsuccessfully requested the company to cease the unlawful transfer of data to China or to fulfill the legal requirements for lawful third-country transfers, as well as to remove the apps from the app stores for Germany. As a consequence, Ms. Kamp took measures pursuant to Article 16 of the Digital Services Act (DSA): “DeepSeek has not been able to convincingly demonstrate to my authority that German users' data is protected in China at a level equivalent to that of the European Union. [...] I have therefore informed Google and Apple, as operators of the largest app platforms, of the violations and expect a prompt review of a ban.”
(Mira Husemann)