Newsletter data protection

Dear readers,

In January 2022, a ruling by the Munich Regional Court I on the data protection-compliant use of Google Fonts led to a veritable “wave of warning letters” and provided a pretext for targeted searches for violations in the integration of Google Fonts and for damage claims against companies. Now, there have been house searches and account seizures in this same context in Berlin (press release dated 21.12.2022). According to the police report, the accused lawyer and his client are suspected of (attempted) warning letter fraud and (attempted) extortion in at least 2,418 cases. The two defendants are accused of issuing warnings to private individuals and small businesses throughout Germany who had embedded Google Fonts on their websites, and of offering to help them avoid civil proceedings in return for a settlement amount of 170 euros. They knew that there were no grounds for the alleged claims for compensation for pain and suffering, and that there was no justification for a settlement. Furthermore, the defendants had used specially programmed software to identify websites on which Google Fonts were embedded. The visits to these websites, which were likewise faked and logged using software, were then used as the basis for the warning letters. The prosecutor’s office had previously received 420 complaints in this matter. In addition, an analysis of the seized account records revealed that approximately 2,000 further individuals had made payments to the defendants out of concern for further consequences. In the December issue of the journal “Recht der Datenverarbeitung” (RDV 2022, 300), we most recently dealt with the topic in more detail in an essay and already pointed out the criminal law dimension of the wave of warning letters.
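The defendants thus automated a check that website operators can also run on their own pages. Purely as an illustration (this sketch is ours, not part of the proceedings, and the URL is a placeholder), the following short Python snippet shows how an operator might test their own site for externally loaded Google Fonts by fetching the HTML and looking for references to Google’s font servers:

import urllib.request

# Hostnames that indicate fonts are loaded from Google's servers
# rather than being hosted locally on the website itself.
GOOGLE_FONT_HOSTS = ("fonts.googleapis.com", "fonts.gstatic.com")

def uses_google_fonts(url: str) -> bool:
    """Return True if the page's HTML references Google's font servers."""
    with urllib.request.urlopen(url, timeout=10) as response:
        html = response.read().decode("utf-8", errors="replace")
    # Note: this only inspects the delivered HTML; fonts pulled in via
    # separate CSS files would require fetching those files as well.
    return any(host in html for host in GOOGLE_FONT_HOSTS)

# Placeholder URL - operators would check their own domain here.
print(uses_google_fonts("https://example.com"))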

Traditionally, the January issue of our newsletter offers a summary of the data protection developments, events and challenges of the past year. In keeping with this tradition, the main topic of this year’s first issue looks back at the year 2022 in terms of data protection law and ventures an outlook on the new year 2023. In 2023, BRANDI’s data protection team will continue to keep you up to date on the latest data protection developments and happenings in the usual way. In the current issue, we report, for example, on the new “Trusted Data Processors” code of conduct, the current fines against Microsoft and Clubhouse, and the decision of the European Court of Justice on the right to erasure of false information.

For feedback on this newsletter or questions related to the newsletter topics, please email us at datenschutz@brandi.net. You can also find the other contact details on our homepage.

Dr. Sebastian Meyer and the BRANDI data protection team

Topic of the month: Review of the year 2022 and outlook 2023

In 2022, data protection law continued to be shaped by various decisions of the supervisory authorities and courts, while the underlying legal framework remained unchanged. Many of these decisions relate to the increasing processing of personal data in the course of our society’s digital transformation. The coronavirus situation eased somewhat over the past year, but developments triggered by the pandemic, such as working from home and the increased use of online tools for video conferencing or for collaborating on documents and projects, remained highly relevant for companies.

On September 15, 2022, our BRANDI Data Protection Law Day took place for the third time. This year, Mr. Carl Christoph Möller, in-house lawyer and consultant for data protection & data security at the consumer advice center NRW, was a guest at BRANDI in Bielefeld. We talked to Mr. Möller about the topic of “Data Protection Incidents – Stakeholders, Consequences and Safeguards”. We gained exciting insights into various data protection topics, current procedures and the daily work of the consumer advice center. Since February 2022, we have also made our data protection newsletter available in English.

We have taken the start of a new year as an opportunity to bring together the main debates and particularly relevant developments and events of 2022 in our traditional annual review, while also venturing an outlook on the year ahead.

CJEU: Direct action against decisions of the European Data Protection Board

On December 7, 2022, the Court of Justice of the European Union (CJEU) declared direct actions against decisions of the European Data Protection Board (EDPB) inadmissible (CJEU, decision of 07.12.2022 – Ref. T-709/21; see also press release dated 07.12.2022).

The Irish supervisory authority DPC imposed a fine totaling € 225 million on WhatsApp in August 2021 for not informing data subjects transparently enough about the messaging service’s data processing (we reported in August 2021). The DPC had already initiated the proceedings against WhatsApp in 2018 and had originally intended to impose a more moderate fine. Several other supervisory authorities then raised concerns about the Irish authority’s approach, since the DPC, as lead supervisory authority, was responsible for assessing WhatsApp’s overall operations in Europe. After no agreement could be reached, the EDPB specified in a binding manner for all supervisory authorities that a significantly larger volume of data protection violations had to be assumed and that the sanctions had to be significantly stricter, which ultimately resulted in the fine that was imposed. WhatsApp defended itself against the two decisions and, in particular, requested that the EDPB’s binding decision be declared null and void.

The CJEU dismissed WhatsApp’s action as inadmissible on the grounds that the action was not directed against a legal act that can be challenged under Article 263 TFEU and that WhatsApp did not have standing to bring an action because it was not directly affected. Rather, it is up to the national courts to review the EDPB decision challenged by WhatsApp. In proceedings before the national courts, it may then be necessary to clarify whether the legality of the EDPB’s decision must be reviewed by the CJEU by way of preliminary ruling proceedings.

(Christina Prowald)

ECJ: Right to erasure of false information

The European Court of Justice (ECJ) ruled in its judgment of December 8, 2022 that search engine operators must delete links to false information even without a corresponding judgment (ECJ, judgment of 08.12.2022 – Ref. C-460/20). Anyone wishing to have an entry on Google removed would only have to prove that the information in question was incorrect. A court decision would not necessarily be required to provide this proof.

The background to the decision is a case in which two individuals from the financial services industry found themselves discredited by various online articles because, they alleged, the reports made inaccurate claims about their investment model. The two parties concerned asked Google to remove the links to the articles, but Google refused. The Federal Court of Justice (BGH), which was hearing the legal dispute, ultimately referred the question of the interpretation of the right to erasure to the ECJ.

In its judgment, the ECJ stated that the protection of personal data is not an unlimited right. Rather, the right to privacy and the right to informational self-determination must be considered in relation to their function in society and weighed against other fundamental rights. The GDPR explicitly provides that there is no right to erasure if the data processed are necessary for the exercise of the right to freedom of information. However, the right to freedom of expression and information is not to be taken into account where the information is inaccurate. With regard to the evidence to be provided by the data subject, the ECJ pointed out that data subjects only have to provide such evidence as can reasonably be required of them; being required to obtain a court decision prior to filing the request does not fall into this category. If the data subject is able to provide other relevant and sufficient evidence, the search engine operator is obliged to comply with the request for erasure. If, however, the inaccuracy of the information in question is not clear from the evidence provided, the search engine operator does not have to grant the request without further information, such as a court decision.

(Christina Prowald)

LG Gießen: No compensation for damages due to data scraping

On November 3, 2022, the Regional Court Gießen ruled that Facebook users do not have a claim for damages against Facebook if the data provided during registration and made publicly accessible on the respective Facebook page is collected by third parties using automated processes (so-called scraping) (LG Gießen, judgment of 03.11.2022 – Ref. 5 O 195/22).

The court pointed out that the plaintiff had failed to prove the damage required under Article 82 (1) GDPR. It is true that the concept of damage in Article 82 GDPR is to be interpreted broadly and in principle also includes immaterial damage. However, this does not mean that a mere violation of the provisions of the GDPR is sufficient; rather, concrete (immaterial) damage must also be demonstrated. The plaintiff was unable to provide any such evidence. The fact that most of the data collected was publicly accessible anyway also speaks against the existence of damage.

(Christina Prowald)

LG Nuremberg-Fürth: Exemption under Section 7 (3) UWG requires an effective order

The Regional Court Nuremberg-Fürth ruled on September 21, 2022, that the exception in Section 7 (3) of the German Unfair Competition Act (UWG), which allows the sending of e-mail advertising if certain conditions are met, is only relevant if an order has also resulted in an actual sale of the goods (LG Nuremberg-Fürth, judgment of 21.09.2022 – Ref. 4 HK O 655/21).

The plaintiff had placed an order with the defendant, which the defendant then canceled. Following this, the defendant began sending advertising e-mails to the plaintiff without his express consent. In this respect, the defendant relied on Section 7 (3) UWG and was of the opinion that the advertising concerned goods similar to those in the order placed by the plaintiff.

The court stated that Section 7 (3) UWG requires, first of all, a valid contract, which had not come into existence as a result of the cancellation of the order. Furthermore, the requirement of advertising similar goods is not fulfilled if, in addition to these or associated goods such as accessories or supplements, other categories of products or, as was the case in this dispute, even the entire product range of the company is advertised. Since the defendant neither met the requirements of Section 7 (3) UWG nor obtained the express consent of the plaintiff, the sending of the advertising e-mails was ruled unlawful.

(Christina Prowald)

LAG Hamm: Protection against dismissal does not apply to voluntarily appointed data protection officer pursuant to Sections 38 (2), 6 (4) BDSG

On October 6, 2022, the LAG Hamm ruled that an internal data protection officer who was voluntarily appointed by a company does not enjoy the special protection against dismissal provided by Section 6 (4) BDSG (LAG Hamm, judgment of 06.10.2022 – Ref. 18 Sa 271/22). A company that, as part of a group of companies, is responsible for payroll accounting and personnel administration for around 80 employees is not obliged to appoint a data protection officer.

In the relevant case, the plaintiff objected to his termination on the grounds that he had been appointed as the company data protection officer and was therefore subject to special protection against termination under Sections 38, 6 (4) BDSG. In contrast, the company was of the opinion that the special protection against dismissal was not applicable in cases where the BDSG did not require the appointment of a data protection officer.

The LAG dismissed the plaintiff’s appeal and confirmed the lower court’s dismissal of the action for protection against dismissal. It explained that Section 6 (4) sentence 2 BDSG only applies to non-public bodies pursuant to Section 38 (2) BDSG if the appointment of a data protection officer is mandatory. However, no such obligation existed for the defendant in the specific case, either under the GDPR or under the BDSG.

(Christina Prowald)

LfDI: “Trusted Data Processors” code of conduct

In order to create more clarity and legal certainty in the use of processors, the State Commissioner for Data Protection and Freedom of Information of Baden-Württemberg (LfDI), Dr. Stefan Brink, has approved the new national code of conduct “Requirements for Processors under Article 28 GDPR – Trusted Data Processors” (press release dated 18.11.2022). With the approval of the code of conduct, DSZ Datenschutz Zertifizierungsgesellschaft mbH was also accredited as a new monitoring body within the meaning of Article 41 GDPR.

Codes of conduct within the meaning of Article 40 GDPR are binding requirements drawn up by an association or other body that define the data protection conduct of the respective members, taking into account the specifics of the individual processing areas, and are intended to contribute to proper compliance with the GDPR. According to the LfDI, by making a voluntary commitment to the “Trusted Data Processors” code of conduct, processors can make it clear to the outside world that they follow the standard set out in the code and submit to scrutiny by a monitoring body accredited by the LfDI. Experts from the “Berufsverband der Datenschutzbeauftragten Deutschlands e.V.” (BvD) and the “Gesellschaft für Datenschutz und Datensicherheit e.V.” (GDD) were involved in developing the code of conduct.

(Christina Prowald)

France: Microsoft fined millions

Because there was no simple way to reject technically unnecessary cookies when searching with Bing, the French data protection supervisory authority CNIL imposed a fine of € 60 million on Microsoft and ordered the company to adjust its processes within three months (press release dated 22.12.2022). The supervisory authority justified the amount of the fine primarily on the basis of the extent of the data processing that took place, the number of data subjects, and the profits that the company made from the revenue generated indirectly from the data collected with the help of cookies.

The CNIL had already conducted checks on Bing in September 2020 and in May 2021. In the process, it discovered that marketing cookies were being set without users having given their prior consent. In addition, there was no button by means of which the use of cookies could be rejected as easily as it could be accepted: rejecting the cookies required several clicks by the user, whereas accepting them required only one.

The CNIL did not impose the fine on the basis of Article 83 GDPR, but relied on Article 82 of the French law on information technology and freedoms (loi Informatique et Libertés), which transposes the e-Privacy Directive in France. The CNIL has already imposed comparable penalties on Google and Facebook (we reported in February 2022).

(Christina Prowald)

Italy: Clubhouse fined for multiple data protection violations

The Italian regulator (Garante per la protezione dei dati personali, GPDP) imposed a fine of € 2 million on Alpha Exploration, the operator of the social media app Clubhouse, on December 5, 2022 (press release dated 05.12.2022).

The social media app Clubhouse is based on voice interactions that take place in chat rooms. Users can open such a chat room or access another user’s chat room as a listener. In addition, conversations can be recorded, saved on the platform, and shared.

In the course of its investigation, the supervisory authority identified several violations of data protection law on the part of Clubhouse. On the one hand, the supervisory authority criticized the lack of transparency in data processing and the unclear and indefinite storage periods. On the other hand, the possibility of storing and sharing audio recordings without the consent of other data subjects, as well as the profiling and sharing of account information without a corresponding legal basis, were criticized. In addition, the GPDP prohibited the company from any further data processing for marketing and profiling purposes without explicit consent.

In addition to the fine, the Italian regulator also imposed various remedies on Clubhouse. In particular, the company would have to introduce a feature that allows users to know whether the chat is being recorded before they enter a chat room. Furthermore, persons affected by the data processing would have to be informed about the relevant legal basis in each case as well as the applicable retention periods. A data protection impact assessment was also required for the processing operations carried out in the context of the Clubhouse platform.

(Christina Prowald)