
Data Protection Newsletter

Dear readers,

Our BRANDI Data Protection Law Day on May 12, 2023, is approaching and we are looking forward to exciting discussions on the topic of “Data Protection in the Cloud and Cybersecurity”. In this newsletter you will find more information on how to register for the event.

As usual, we also report on current events and developments in data privacy law, including the European Data Protection Board’s opinion on the Data Privacy Framework, the Advocate General’s opinion on Schufa scoring and Schufa data storage, and a fine imposed on Meta Group by the Irish supervisory authority.

For feedback on this newsletter or questions about its topics, please email us at datenschutz@brandi.net. You can also find our other contact details on our website.

Dr. Sebastian Meyer and the BRANDI data protection team

Topic of the month: Data protection requirements for the use of AI tools

The use of artificial intelligence (AI) is becoming increasingly important for most companies. Applications range from translation tools and services such as ChatGPT to various products in the HR area that support recruiting or resource planning, for example. If personal data, such as that of employees, applicants, customers or business partners, is processed when AI-based software applications are used, the processing must comply not only with the requirements of labor law but also with those of data protection law, in particular the General Data Protection Regulation (GDPR) and the Federal Data Protection Act (BDSG). In addition, the European Parliament and the Council of the European Union are working on a regulation laying down harmonized rules for artificial intelligence. Although these new AI rules are still in the legislative process, they will have to be observed in the future.

To the complete article on the main topic

On our own account: BRANDI Data Protection Law Day

In our data protection newsletters of the past few months, you have already received an invitation from us as well as information about our Data Protection Law Day on May 12, 2023. Together with you and external experts, we would like to discuss the topic of “Data Protection in the Cloud and Cybersecurity” at the event.

In the meantime, the registration form for the event has been published on our website. You can register via the following link: https://www.brandi.net/en/news/detail/4-brandi-datenschutzrechtstag-praesenzveranstaltung-am-12052023/.

We will be happy to answer any organizational questions you may have in advance of the event. You can also send content-related questions that you would like to see discussed at the event in advance to the following e-mail address: WissMit-DatenschutzBI@brandi.net. There will also be the opportunity to ask questions during the event and to take an active part in the discussion.

We look forward to a large attendance at the event!

(Christina Prowald)

Advocate General: Schufa scoring and Schufa data storage

In the preliminary ruling proceedings OQ v. State of Hesse and UF and AB v. State of Hesse, in which Schufa Holding AG supports the State of Hesse as an intervener in each case, the Advocate General of the European Court of Justice (ECJ), Priit Pikamäe, published his opinions on March 16, 2023 (Opinions of 16.03.2023 – Ref. C-634/21 and C-26/22, C-64/22). The opinions concern, on the one hand, the automated creation of a probability value regarding a person’s ability to service a loan (scoring) and, on the other hand, the storage of data relating to a discharge of residual debt.

The background to case C-634/21 is a legal dispute between a citizen and the State of Hesse, represented by the Hessian Commissioner for Data Protection and Freedom of Information, concerning the protection of personal data and arising from the inaction of the Hessian supervisory authority. Schufa had provided a credit institution with a score value relating to this citizen, on the basis of which the citizen was denied a loan he had applied for. The citizen then requested that Schufa delete the incorrect entries and grant him access to the relevant data. Schufa, however, only informed the citizen in general terms about how the score value is calculated. The citizen did not receive any information about what specific data went into the calculation, as Schufa considers this to be a trade secret. The citizen subsequently filed a complaint with the Hessian data protection supervisory authority against Schufa’s rejection of his request. The Hessian data protection commissioner refused to take further action because, in his view, Schufa’s actions did not violate the Federal Data Protection Act, whereupon the case went to court.

The Advocate General stated that the automated generation of a probability value about a data subject’s future ability to service a loan already constitutes a decision based solely on automated processing, including profiling, which produces legal effects concerning the data subject, provided that the value is transmitted to a third party and the third party bases its decision on the establishment, performance or termination of a contractual relationship on that value. He points out that data subjects have the right to obtain information about the existence of automated decision-making as well as about the logic involved and the scope and effects of the processing. This obligation to provide information also encompasses detailed explanations of how the score value is calculated and of the reasons for a particular result.

The background to cases C-26/22 and C-64/22 is likewise formed by legal disputes between two citizens and the State of Hesse, concerning the citizens’ applications to the Hessian data protection authority for deletion of the entry of a discharge of residual debt at Schufa. In the insolvency proceedings affecting them, the citizens had been granted a discharge of residual debt. This information was officially published on the Internet and deleted after six months. Schufa, by contrast, stores information about a discharge of residual debt for three years. The citizens considered this storage period inadmissible.

In this respect, the Advocate General concluded that the practice of storing personal data from public registers for a period of three years was not in line with the principles enshrined in the GDPR. The storage by a credit reporting agency could not be lawful if the personal data on an insolvency had already been deleted from the public registers. It should be emphasized that the granted discharge of residual debt should enable the person concerned to participate in economic life again. However, this goal would be thwarted if private credit agencies were entitled to store corresponding data beyond publication in public registers. Data subjects would have the right to have such data deleted without delay.

It remains to be seen to what extent the ECJ will follow the Advocate General’s opinion. As a rule, however, the court will follow the Advocate General’s reasoning.

(Christina Prowald)

VG Hannover: Employee data collection by Amazon permissible

On February 9, 2023, the Administrative Court (VG) Hannover ruled that the use of hand-held scanners to record certain work steps and the use of employee data collected in this way as a basis for evaluating qualification measures, gathering feedback and making personnel decisions is permissible (VG Hannover, decision dated 9.2.2023 – Ref. 10 A 6199/20). The court allowed the appeal.

In a decision issued in October 2020, the data protection supervisory authority had prohibited Amazon, in the context of a data protection control procedure, from continuously collecting quantity and quality data on its employees and using this data for performance profiles, process analyses and feedback discussions, as in the authority’s opinion the continuous collection of performance data violated data protection regulations. Amazon objected that the data was needed to respond to fluctuations in individual process paths through shifts. In addition, the data was needed to take employees’ strengths and weaknesses into account when planning assignments and to provide objective, individual performance-related feedback. In this respect, the company argued, it had a legitimate interest in the data processing.

The VG Hannover takes the view that there is no violation of data protection law, in particular not of Section 26 BDSG. The data processing was necessary for controlling logistics processes, managing qualification and creating the basis for individual feedback and personnel decisions. The interference with the employees’ right to informational self-determination is also not disproportionate to the company’s legitimate interests and is therefore appropriate. In this respect, the court points out that the data collection does not take place secretly, that it merely amounts to performance monitoring and that the data is required in particular for controlling logistics processes. The possibility of objective feedback had also been viewed positively by many employees.

(Christina Prowald)

ArbG Oldenburg: Compensation for delayed information

The Oldenburg Labor Court (ArbG) ruled on February 9, 2023 that a late response to a request for information gives rise to a claim for damages under Art. 82 GDPR in the amount of 10,000 euros (ArbG Oldenburg, decision dated 9.2.2023 – Ref. 3 Ca 150/21, BeckRS 2023, 3950).

In the underlying case, an employee had asserted his right to information under Art. 15 GDPR against his former employer and requested a copy of the data processed about him. The employer initially refused to provide the information and only submitted individual documents to the plaintiff 20 months later in the course of labor court proceedings. The plaintiff then asserted a claim for non-material damages due to the failure to fulfill the duty to provide information, without specifying what the damage consisted of.

The Labor Court awarded the plaintiff non-material damages because the defendant had not fulfilled its duty to provide information within one month. Due to the plaintiff’s strong interest in the information and the long period of non-compliance, the court considered damages of 10,000 euros to be appropriate. In the court’s opinion, the plaintiff did not have to specify the damage: a violation of the GDPR in itself gives rise to compensable non-material damage. The claim under Art. 82 GDPR has a preventive character and serves as a deterrent.

With regard to the amount of damages awarded, the decision is an outlier. The court’s view that a mere violation of a provision of the GDPR is sufficient to justify a claim for damages is difficult to follow. The Advocate General of the European Court of Justice stated in October 2022 that the mere violation of a norm is not sufficient as such if no damage is associated with it (as we reported in our data protection newsletter in December 2022). Without a legal injury, damages would no longer fulfill the function of compensating for adverse consequences and would instead take on the legal nature of a sanction. He further explained that non-material damage must exceed a certain materiality threshold.

(Christina Prowald)

Google Analytics 4

On June 30, 2023, Google will discontinue its analysis tool Universal Analytics, also known as Google Analytics 3. The successor Google Analytics 4 has already been on the market since October 14, 2020, and will be Google’s only analytics tool in the future. Google Analytics 4 is an analytics tool that website operators can use to evaluate user activity.

The latest version works with machine learning, i.e. algorithms for better data analysis and for completing missing data records through so-called “modeling”. Beyond the necessity arising from the discontinuation of Universal Analytics, switching to the current offering is also advisable from a data protection perspective. With Google Analytics 4, Google responded to the concerns of European data protection advocates and made the automatic anonymization of internet protocol (IP) addresses the standard default setting. With the predecessor, this step still had to be taken manually. Another significant advance is that IP addresses are now shortened on servers within the European Union; previously, they were shortened on servers in the United States, a circumstance that was a particular thorn in the side of privacy advocates. Google aims to work without cookies and to enable data analysis through machine learning. Specifically, this is to be implemented by forming small groups of users whose group behavior can then be determined by machine learning, without reference to the individual user and without the use of cookies. It remains to be seen whether this approach will mean that individualization is no longer possible and thus that no personal data is processed, which would render the GDPR inapplicable.
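
For illustration, a minimal gtag.js sketch with placeholder measurement IDs (the IDs and the surrounding page setup are assumptions): under Universal Analytics, IP anonymization had to be requested explicitly per property, whereas a plain Google Analytics 4 configuration applies it by default.

```typescript
// Minimal sketch with placeholder IDs; gtag is the global helper provided
// by the Google tag (gtag.js) snippet on the page.
declare function gtag(...args: unknown[]): void;

// Universal Analytics (discontinued June 30, 2023): IP anonymization had to
// be enabled manually per property.
gtag("config", "UA-XXXXXXX-Y", { anonymize_ip: true });

// Google Analytics 4: IP addresses are anonymized by default, so a plain
// configuration call is sufficient.
gtag("config", "G-XXXXXXXXXX");
```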

However, these changes do not mean that consent no longer needs to be obtained. The website operator is still required to obtain the user’s consent by means of a so-called “consent tool”. In addition, it is recommended to conclude a data processing agreement within the meaning of Art. 28 GDPR; a template of such a contract can be found in your Google account. Finally, it is advisable to reflect the changes in your own privacy policy.
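
As a minimal sketch of how such consent gating can look in practice, the snippet below uses Google’s Consent Mode calls via gtag.js; the callback function name is a hypothetical placeholder for whatever hook your consent tool provides.

```typescript
// Minimal Consent Mode sketch. The gtag helper is provided by the Google tag
// snippet on the page; onUserAcceptedAnalytics is a hypothetical callback
// wired to the consent tool.
declare function gtag(...args: unknown[]): void;

// Deny analytics storage by default, before any measurement runs.
gtag("consent", "default", { analytics_storage: "denied" });

// Once the user accepts analytics in the consent tool, update the consent
// state; Google Analytics 4 then collects data with storage enabled.
function onUserAcceptedAnalytics(): void {
  gtag("consent", "update", { analytics_storage: "granted" });
}
```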

(Lukas Ingold)

EU: Plans to do away with cookie banners

At the level of the European Union, initial consideration is being given to creating an option in the future to dispense with the cookie notices that a large part of the population finds annoying. One concept under discussion would allow users to save their preferences, with the stored information replacing the individual query about the use of cookies. A comparable model already exists in the German Telecommunications Telemedia Data Protection Act (TTDSG), although it has not yet gained acceptance in practice. In anticipation of the still pending ePrivacy Regulation, the TTDSG refers, in Section 26, to “recognized consent management services”, although these would still have to be defined in more detail in a national regulation. If such a concept could be established at the European level, it would certainly be easier to implement than a purely national approach. So far, however, there are only initial ideas on this issue, reported on by Euractiv in its article “EU consumer department to present voluntary pledge over ‘cookie fatigue’” and reflected in the proposal by the German Federal Ministry for Digital and Transport (BMDV) with its draft bill for a Consent Management Regulation (EinwVO).

(Dr. Sebastian Meyer)

EDPB: Opinion on the Data Privacy Framework

On February 28, 2023, the European Data Protection Board (EDPB) issued its opinion on the draft adequacy decision on the EU-U.S. Data Privacy Framework.

U.S. President Joe Biden had signed an executive order on October 7, 2022, creating the legal basis on the U.S. side for a new legal framework for data transfers to the U.S. (we reported in our data protection newsletter in November 2022). In December 2022, the European Commission then submitted a draft adequacy decision for the U.S. and initiated the procedure for adopting the adequacy decision (as we reported in our Annual Outlook 2023).

In its opinion, the EDPB welcomes the improvements over the previous regulations, including the introduction of requirements regarding the necessity and proportionality of intelligence data collection and the newly created redress mechanism for EU data subjects with regard to access by U.S. security authorities. The EDPB was critical of the possible bulk collection of data. Since it sees a continued need for clarification on several points, the EDPB asked the European Commission for further investigation and clarification of the points mentioned. These include, for example, certain rights of data subjects, the onward transfer of personal data, and the practical functioning of the redress mechanism.

The chairwoman of the German Data Protection Conference (DSK), Marit Hansen, commented on the opinion as follows: “The data of many EU citizens are transferred to the USA. For comprehensive protection of fundamental rights, it is important that the level of protection in these cases is also equivalent to the level of data protection guaranteed in the EU. The European Data Protection Board, with the participation of German supervisory authorities, has carefully examined the level of protection described in the EU-U.S. Data Privacy Framework. I welcome the progress made and hope that the remaining open points we have jointly identified will now also be clarified.”

The EDPB’s opinion is a necessary procedural step on the way to adopting the adequacy decision. However, the European Commission is not obliged to take the points criticized by the EDPB into account. It remains to be seen to what extent the Commission will now amend the draft decision on the basis of the EDPB’s opinion.

(Christina Prowald)

BfDI: 31st Activity Report for Data Protection and Freedom of Information 2022

On March 15, 2023, the Federal Commissioner for Data Protection and Freedom of Information (BfDI), Ulrich Kelber, published his 31st Activity Report for Data Protection and Freedom of Information 2022. Mr. Kelber began by reporting that the year 2022 had been particularly eventful and that national as well as international cooperation between data protection authorities was becoming increasingly important. The authority had focused on employee data protection, the transfer of data to third countries, and the handling of fines and the right to information. According to the BfDI, advising and monitoring public authorities and companies was another focus of its activities in 2022. In addition, the authority received 10,658 reports of data protection violations and 6,619 inquiries and complaints from citizens. This roughly corresponds to the figures from previous years.

In the reporting period, the European Union also adopted various legal acts in the area of digitization, which BfDI dealt with intensively. These include the AI Regulation, the Data Governance Act and the Data Act. The Federal Commissioner also dealt with numerous other individual topics such as the Corona warning app, operational integration management, the 2022 census, new ways of personalized advertising and video conferencing services.

As part of the activity report, Mr. Kelber also makes various data protection policy recommendations on individual topics. These include the enactment of an Employee Data Protection Act that clearly regulates, for example, the use of AI in the employment context, the limits of behavioral and performance monitoring, and typical data processing in the application and selection process, as well as the shutdown of the federal government’s Facebook fan page.

(Christina Prowald)

BfDI: Assessment of TrustPID

Since May 2022, Vodafone and Telekom have been running a trial project in Germany on the technical feasibility of a venture called TrustPID. The platform, which was developed by several major telecommunications providers and is still being set up, is intended to offer an alternative to the currently widespread personalized advertising based on third-party cookies. As part of the service, users are to be recognized on the basis of IP addresses and mobile phone numbers.

When a user accesses a partner website, they are asked - separately from the cookie banner - to consent to the transmission of their IP address to their mobile network provider. The provider determines the user's phone number from the IP address and creates a unique, pseudonymous network identifier for TrustPID. The TrustPID provider in turn uses this pseudonym to generate marketing identifiers (tokens) for the partner websites. These tokens can ultimately be used for personalized online marketing.
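
To make this two-step flow more concrete, here is a purely illustrative sketch in TypeScript; the function names and the HMAC-based derivation are assumptions chosen for illustration and do not describe TrustPID's actual implementation.

```typescript
// Purely illustrative sketch of a pseudonymous-identifier flow as described
// above. Names and the HMAC-based derivation are assumptions, not TrustPID's
// actual design.
import { createHmac } from "node:crypto";

// Carrier side: derive a stable network pseudonym from the subscriber's
// phone number using a secret known only to the mobile network operator.
function deriveNetworkPseudonym(phoneNumber: string, carrierSecret: string): string {
  return createHmac("sha256", carrierSecret).update(phoneNumber).digest("hex");
}

// Platform side: derive a per-website marketing token from the pseudonym, so
// that different partner sites receive different, non-linkable identifiers.
function deriveMarketingToken(networkPseudonym: string, partnerSiteId: string, platformSecret: string): string {
  return createHmac("sha256", platformSecret).update(`${networkPseudonym}:${partnerSiteId}`).digest("hex");
}

// Example: the carrier resolves the IP address to the subscriber's number
// (not shown here), then the two derivation steps yield the token a partner
// website can use for personalized advertising.
const pseudonym = deriveNetworkPseudonym("+491701234567", "carrier-secret");
const token = deriveMarketingToken(pseudonym, "partner-shop.example", "platform-secret");
console.log(token);
```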

The BfDI advised Vodafone and Telekom as part of the project and was able to achieve various data protection improvements in this way. In particular, a more transparent design of the user consent was achieved. Contrary to various reports in the press, TrustPID is not a “super cookie”, according to the BfDI, since the aim is to create an alternative to cookie-based personalized advertising.

However, the BfDI also said that the new service could be viewed ambivalently. Telecommunications providers have a special position of trust that is difficult to reconcile with user tracking. In addition, the token must be prevented from being merged with other user data, such as log-in data for services from providers on the web, which would enable re-personalization and subsequent detailed tracking.

On February 10, 2023, the European Commission approved the plan of Deutsche Telekom, Orange, Vodafone and Telefónica to establish a joint advertising tracking platform from an antitrust perspective. Following the approval, the competent European data protection authorities will now evaluate the project from a data protection perspective.

(Christina Prowald)

United Kingdom: Data protection reform

On March 8, 2023, the Secretary of State for Science, Innovation and Technology, Michelle Donelan, introduced a new bill in the UK House of Commons. The new Data Protection and Digital Information Bill (No. 2) is currently in its second reading. The original bill, introduced on July 18, 2022, was withdrawn on the same day the new bill was introduced.

According to Michelle Donelan, the bill is intended to create a simple, clear and business-friendly framework that will not be difficult or costly to implement, while retaining the elements of the GDPR that are considered particularly useful and giving companies more flexibility. The net result, however, is that assessments made for the EU will no longer be automatically transferable to the UK in the future, which may in some cases create barriers to trade.

Michelle Donelan commented as follows: “This new bill, developed in partnership with business from the outset, ensures that a vitally important data protection regime is tailored to the needs of the UK and our habits.”

(Christina Prowald)

Ireland: Fine of 17 million euros against Meta (Facebook)

On March 15, 2023, the Irish Data Protection Authority (DPC) imposed a fine of 17 million euros on Meta Platforms Ireland Limited (formerly Facebook Ireland Limited) for inadequate technical and organizational measures related to the protection of EU users' data (press release dated 15.03.2023).

The DPC's decision is the result of an inquiry into twelve data breach notifications from 2018. The inquiry identified infringements of Art. 5 (2) and Art. 24 (1) GDPR. In particular, Meta had failed to implement appropriate technical and organizational measures that would have allowed the company to demonstrate in practice the security measures it had taken to protect the data of EU users.

Since the cases at issue were cross-border cases, the decision was subject to the cooperation procedure pursuant to Art. 60 GDPR. Two of the European supervisory authorities involved raised objections to the DPC's draft decision. However, a consensus was ultimately reached in the course of further discussions.

(Christina Prowald)