Live blog

DPO Consultancy

Welcome to our live blog

\

CJEU confirms no minimum threshold for non-monetary GDPR damage claims

On 4 May 2023, the Court of Justice of the European Union (“CJEU”) published its decision in Case C-300/21, where it ruled that not every infringement of the General Data Protection Regulation (“GDPR”) triggers the right to compensation provided by Article 82 of the GDPR. The Court did, however, rule that there is no minimum threshold for damage claims.

The right to compensation as set out in Article 82 of the GDPR has three requirements, namely an infringement of the GDPR, damage suffered by the affected individual, and a causal link between the infringement and the damage.

Background of the case:

During 2017, Austrian Post (Österreichische Post), a company trading in addresses, collected information on the political affinities of individuals by using an algorithm that considered various social and demographic characteristics. The data generated in this way was sold to various organizations for targeted advertising.

While conducting these activities, Austrian Post attributed to the plaintiff, with a high degree of likelihood, an affinity with a certain Austrian political party, based on statistical deductions from the collected data. This information was not transmitted to third parties.

The plaintiff had not consented to the processing of his personal data and felt offended by the fact that an association to a certain political party had been attributed to him. The plaintiff further argued that the storage of this data, which was based on a presumption, caused him great upset, a loss of confidence and a feeling of being exposed. The plaintiff brought an action against the Austrian Post for an injunction to stop the disputed data processing and payment in the amount of €1,000.00 as compensation for non-material damage.

Implications of the Ruling:

  • Any company that is established in the European Union (“EU”) or otherwise subject to the GDPR can face a damage claim under the GDPR. The risk of damage claims may be particularly high in cases of data breaches and where data subjects exercise their rights under the GDPR, especially the right of access.
  • The ruling clarifies that no materiality threshold is required, but also that a violation of the GDPR alone does not trigger a damage claim.
  • A fragmented approach is expected to arise across the EU on the question of the conditions under which non-material damage can be established.
  • A claim for (non-material) damages does not require that a threshold of seriousness be met; however, individuals must demonstrate that the infringement of the GDPR caused them (non-material) damage.
  • Each Member State is to prescribe the rules governing actions for damage claims under Article 82 GDPR, including the criteria for determining the extent of compensation payable. In doing so, Member States must respect the principles of equivalence (procedures for remedies under Union law must not be less favourable than those governing similar domestic actions) and effectiveness (the procedures must not make it excessively difficult or impossible in practice for individuals to exercise their rights under Union law).

 

It will be very interesting to see how this judgment impacts future cases, including Case C-340/21, in which the Advocate General has delivered an Opinion regarding compensation for non-material damage for data breaches.

Does your organization have any questions about how this ruling may impact your compliance with the GDPR? Contact us, the Experts in Data Privacy at info@dpoconsultancy.nl, for assistance.

 

 Sources:

https://www.orrick.com/en/Insights/2023/05/CJEU-Denies-a-Minimum-Threshold-for-Raising-Non-Monetary-GDPR-Damage-Claims

https://noyb.eu/en/court-justice-confirmed-there-no-threshold-gdpr-damages

\

European Parliament: EU-US Data Privacy Framework fails to create essentially equivalent level of protection

On 11 May 2023, the European Parliament issued a resolution regarding the adequacy of the protection afforded by the EU-US Data Privacy Framework. In summary, the European Parliament has concluded that the EU-US Data Privacy Framework fails to create an essentially equivalent level of protection.

The resolution contains numerous points, the most relevant of which include:

In its resolution of 20 May 2021, the European Parliament called on the European Commission (“Commission”) not to adopt a new adequacy decision in relation to the US unless meaningful reforms were introduced. The Parliament does not consider Executive Order 14086 (“EO 14086”) sufficiently meaningful and states that the Commission should not leave the task of protecting the fundamental rights of EU citizens to the Court of Justice of the European Union (“CJEU”);

The Data Privacy Framework principles have not been sufficiently amended, when compared to those under the Privacy Shield, to provide essentially equivalent protection to that provided under the GDPR;

The Commission was not in a position to assess the effectiveness of the proposed remedies and of the rules on data processing by public authorities in the US, as the US Intelligence Community has until October 2023 to update its policies and practices to align with the commitments of EO 14086. Furthermore, the US Attorney General must still designate the EU and its Member States as qualifying countries to be eligible for the remedy avenue available under the Data Protection Review Court (“DPRC”). Most importantly, the Commission should only proceed with the next step towards an adequacy decision after these deadlines have been met by the US, to ensure that the commitments have been delivered in practice;

The Commission and its US counterparts have been called upon to continue negotiations with the aim of creating a mechanism that would ensure such equivalence and which would provide an adequate level of protection as required by Union data protection law and the Charter;

Calls on the Commission to act in the interest of EU businesses and citizens by ensuring that the proposed framework provides a solid, sufficient and future-oriented legal basis for EU-US data transfers; and

Expects any adequacy decision, if adopted, to be challenged before the CJEU and highlights the Commission’s responsibility for the failure to protect EU citizens’ rights in the scenario where the adequacy decision is once again invalidated by the CJEU.

This week, MEPs of the Civil Liberties Committee will travel to the US to meet with members of the House of Representatives and the Senate to discuss a wide range of current policy issues, including security and child protection, data transfers and privacy, amongst others.

Until there is more clarity surrounding the EU-US Data Privacy Framework, organizations must ensure they adhere to the available transfer mechanisms and that the supplementary measures are implemented.

Does your organization have any questions about transferring personal data internationally to the US? Contact us, the Experts in Data Privacy at info@dpoconsultancy.nl for assistance.

 

 

Sources:

https://www.europarl.europa.eu/doceo/document/TA-9-2023-0204_EN.pdf

https://iapp.org/news/a/meps-head-to-us-for-eu-us-data-privacy-framework-talks/

\

The CJEU rules on data subject rights: the right to obtain a copy of personal data  

The Court of Justice of the European Union (“CJEU”) ruled that the right to obtain a copy of personal data entails the right to obtain copies of extracts from documents or even entire documents/extracts from databases which contain those data, if that is essential for the data subject to effectively exercise their rights under the GDPR.  


The case concerns an individual who requested access to his personal data from a business consulting agency that provides information on creditworthiness. The individual requested a copy of emails and database extracts containing his personal data in a standard technical format, but the agency only provided a summary of the list of his personal data undergoing processing. The Austrian Data Protection Authority rejected the individual’s complaint, leading to a court case on the scope of the obligation to provide a copy of personal data under Article 15(3) of the GDPR. 

During the proceedings, the competent Court in Austria asked the CJEU whether the said obligation is fulfilled where the controller transmits the personal data in the form of a summary table, or whether that obligation also entails the transmission of extracts from documents or even entire documents, as well as extracts from databases, in which those data are reproduced.

Here are the main conclusions reached by the CJEU:  

– The CJEU clarified that the right to obtain a “copy” of personal data means that the data subject must be given a faithful and intelligible reproduction of all of their data. This means that providing a general description of the data undergoing processing will not be sufficient to comply with the data subject’s request.  

– That right includes the right to obtain copies of extracts from documents, entire documents or database extracts if that is essential for the data subject to exercise their rights under the GDPR effectively, while taking into account the rights and freedoms of others.

– The CJEU also noted that the term “copy” does not refer to the document itself, but rather relates to the personal data it contains and that this must be complete. Therefore, the copy should include all the personal data that is being processed.  

– Finally, if there is a conflict between granting complete access to personal data and protecting the rights and freedoms of others, the Court suggests finding a balance between the two. It is preferable to choose methods of sharing personal data that do not violate the rights or freedoms of others. However, it is important to note that the end result should not be a complete refusal to provide information to the data subject. 

How does your organization handle data subject rights? Contact us, the Experts in Data Privacy, at info@dpoconsultancy.nl if you want to learn more.

https://curia.europa.eu/jcms/upload/docs/application/pdf/2023-05/cp230071en.pdf  

 

\

Latest on EU AI Act

The advancement of artificial intelligence is expected to be the big technological revolution of this century.

As AI technologies continue to advance rapidly and many existing laws across the globe have not been able to keep up with the potential negative outcomes of certain deployments of automated systems, there are uncertainties about when more stringent regulations may be enacted and how they will affect compliance in the future.

The EU is possibly the most advanced globally in creating comprehensive legislation governing the development and commercial use of AI through the proposed AI Act.

According to Brando Benifei, a co-rapporteur of the AI Act and a Member of the European Parliament, the European Union faces a challenge in implementing a comprehensive AI regulation because machine learning systems have already been utilized for various commercial and governmental purposes throughout the continent. Any AI that is already being used must adhere to the current norms and regulations, said Benifei, and he added that he was hoping the AI Act would not pose “further regulatory burden” on them.

Furthermore, Benifei recognizes the broad range of present applications for AI systems and expressed his desire that the AI Act would not impede the progress of machine learning technologies, but would instead ensure that high-risk systems are appropriately regulated to prevent any discriminatory or undemocratic outcomes.

EU member states have already taken steps to ensure that companies producing and using AI technology comply with the EU General Data Protection Regulation, which aims to safeguard personal data and privacy.

Italy’s data protection authority, the Garante, placed a temporary ban on ChatGPT at the end of March, due to concerns that the AI application allegedly does not verify the age of users and does not have any legal basis for collecting and storing such massive amounts of personal data to train its algorithms.

After the Garante issued the ban, OpenAI, the company behind ChatGPT, promised to collaborate with the authorities and to provide clarification regarding its legal basis, in addition to its plans for more transparency. Moreover, the European Data Protection Board (EDPB) set up a ChatGPT task force to encourage collaboration and information-sharing among data protection authorities regarding potential enforcement actions.

AI has huge potential to solve various problems and address many challenges within our society; however, it could also pose a threat, as there are many unknowns and uncertainties surrounding the technology and the laws regulating the field. Training algorithms quickly and properly requires a lot of data, which can lead to challenges with data privacy and in particular with the GDPR. It remains to be seen how regulators will tackle these issues.

Do you have any questions about developments within privacy and data protection with regards to AI or other emerging technologies? Contact us, the Experts in Data Privacy, at info@dpoconsultancy.nl for more information.

Source: https://iapp.org/news/a/iapp-gps-2023-whats-next-for-potential-global-ai-regulations-best-practices-for-governing-automated-systems/

https://edpb.europa.eu/news/news/2023/edpb-resolves-dispute-transfers-meta-and-creates-task-force-chat-gpt_en

\

Meta fined €1.2 billion over EU-US data transfers

On Monday, a decade-long legal case, spanning from 2013 to 2023, regarding Meta’s role in mass surveillance in the United States reached a significant milestone. The outcome of this case obliges Meta to cease any additional transfer of personal data from Europe to the United States due to its obligation to comply with US surveillance legislation, such as FISA 702. The European Data Protection Board (EDPB) had largely overturned the decision made by the Irish Data Protection Commission (DPC), insisting on a record fine and on the return of previously transferred data to the EU.

Following Edward Snowden’s disclosures regarding the involvement of major US tech companies in the NSA’s widespread surveillance program, Facebook, now known as Meta, became subject to legal proceedings in Ireland. Over the course of a decade, Meta failed to implement any significant measures and instead disregarded directives from the Court of Justice of the European Union (CJEU) and the European Data Protection Board (EDPB). As a consequence, Meta is now obligated not only to pay a historic fine amounting to €1.2 billion but also to return all personal data to its data centers located within the EU.

The reauthorization of FISA 702 is currently being discussed. The ongoing conflict between EU privacy regulations and US surveillance laws also poses challenges for other prominent US cloud service providers like Microsoft, Google, and Amazon. The underlying US surveillance legislation, FISA 702, needs to undergo reauthorization before December 2023. Considering the recent substantial fine imposed by EU data protection authorities, there is a growing demand for significant changes among US tech giants. Several rulings from France, Italy, and Austria have declared the use of US services unlawful, although they did not involve fines comparable to the recent case.

Although Meta is likely to submit an appeal to both the Irish and potentially the European Courts, the likelihood of the decision being significantly overturned is low. This is because the CJEU has already ruled twice, in 2015 and 2020, that there was no valid legal framework for transferring data between the EU and the US. Additionally, there is no possibility of a new agreement being established to retroactively legalize prior breaches of the law.

According to a recent ruling by the CJEU, individuals may now have the ability to seek emotional damages for minor infringements of their data protection rights, including instances where their data is subject to US mass surveillance. This could result in claims that surpass the current penalties imposed. For instance, the Dutch consumer rights organization Consumentenbond is currently enlisting Dutch Facebook users to assert their claims regarding EU-US data transfers. Without users demanding fair compensation, substantial change is unlikely. As regulatory authorities are currently not very proactive in enforcing the GDPR, it falls upon consumer rights organizations and users to take action. Therefore, Noyb encourages every Facebook user in the Netherlands to register their claims for potential damages. Additionally, the implementation of the EU’s Collective Redress Directive this summer will allow European users to initiate collective actions for GDPR violations for the first time.

What does this mean for other companies? This is the first fine issued for unlawful transfers, setting a high standard for compliance. Controllers need to ensure that, for international transfers in the absence of an adequacy decision, appropriate safeguards are in place and that enforceable rights and effective legal remedies are available to individuals. Such appropriate safeguards include Standard Contractual Clauses (SCCs) combined with Transfer Impact Assessments (TIAs).

Does your organization have any questions about transferring personal data internationally to the US? Contact us, the Experts in Data Privacy at info@dpoconsultancy.nl for assistance.

Source:

https://noyb.eu/en/edpb-decision-facebooks-eu-us-data-transfers-stop-transfers-fine-and-repatriation

\

EDPB Launches Data Protection Guide for Small Businesses

The European Data Protection Board (EDPB) has launched a new guide to help small businesses comply with the General Data Protection Regulation (GDPR). The guide provides practical advice and guidance on how to protect personal data and ensure compliance with the GDPR.  

The guide is aimed at small and medium-sized enterprises (SMEs) that may not have the resources or expertise to navigate the complex data protection landscape. It covers a range of topics, including data processing, data security, data breaches, and data subject rights. SMEs play a vital role in the European economy, but they often face challenges in complying with the GDPR. 

“In this guide, SMEs will find various tools and practical tips to help them comply with the GDPR. It includes concrete examples gathered during our 5 years of experience with the GDPR,” said Andrea Jelinek, the Chair of the EDPB.

The guide includes a range of resources such as videos, infographics, interactive flowcharts, and other practical materials designed to provide guidance on data protection practices for SMEs. These resources are aimed at helping SMEs understand their obligations under the GDPR and providing practical advice on how to implement data protection measures. 

Additionally, the guide provides an overview of other handy materials developed by national Data Protection Authorities (DPAs) to further assist SMEs in implementing data protection practices. It offers, among others, guidance on conducting a Data Protection Impact Assessment (DPIA), ensuring the security of personal data, and how to handle data breaches, such as notifying the relevant supervisory authority and data subjects. 

The guide is available on the EDPB’s website at https://edpb.europa.eu/sme-data-protection-guide/home_en#home-title  

 

Do you have any questions about developments within privacy and data protection? Contact us, the Experts in Data Privacy, at info@dpoconsultancy.nl for more information.  

\

Advocate General publishes Opinion regarding compensation for non-material damage for data breaches

On April 27, 2023, the Advocate General published its Opinion in Case C-340/21 regarding the compensation that can be awarded for non-material damage, and the liability and presumed fault of the data controller.

In order to be exempt from liability, a data controller must demonstrate that it is not in any way responsible for the event giving rise to the damage. Fear of a possible misuse of the personal data in the future can constitute non-material damage. This will give rise to a right to compensation only if it is actual and certain emotional damage and not simply trouble or inconvenience for the data subject.

The Opinion of the Advocate General states that the data controller is obliged to implement appropriate technical and organizational measures to ensure that processing of personal data is performed in accordance with the General Data Protection Regulation (‘GDPR’).

The most important points of the Opinion are that:

  • The occurrence of a personal data breach is not sufficient in itself to conclude that the technical and organizational measures implemented by the data controller were not ‘appropriate’ to ensure data protection. The data controller’s choice of measures is subject to possible judicial review of compliance, but the data controller must consider the ‘state of the art’, limited to what is reasonably possible at the time of implementation, as well as the implementation costs. The assessment of the appropriateness of the measures is based on a balancing exercise between the interests of the data subject and the economic interests and technological capacity of the controller, in compliance with the general principle of proportionality;
  • The burden of proving that the measures are appropriate is on the data controller and not the data subject;
  • The fact that the infringement has been committed by a third party does not in itself constitute a ground for exempting the data controller. In order to be exempted from liability, the data controller must demonstrate, to a high standard of proof, that it is not in any way responsible for the event giving rise to the damage;
  • Detriment consisting in the fear of a potential misuse of one’s personal data in the future, the existence of which the data subject has demonstrated, may constitute non-material damage giving rise to a right to compensation, provided that it is a matter of actual and certain emotional damage and not simply trouble or inconvenience.

While the Advocate General’s Opinion is not binding on the Court of Justice of the European Union (‘CJEU’), the Opinion provides an independent legal proposal to the CJEU which can be considered by the judges when deliberating the case. The outcome of Case C-340/21 is a development that will be closely monitored.

Does your organization have a question about data breaches or how to respond to data subjects regarding data breaches? Contact us, the Experts in Data Privacy, at info@dpoconsultancy.nl for more information.

Source:

https://curia.europa.eu/jcms/upload/docs/application/pdf/2023-04/cp230067en.pdf

\

European Data Protection Board adopted three sets of Guidelines  

During its March plenary, the European Data Protection Board (“EDPB”) adopted the final versions of three Guidelines.

The EDPB has finalized the “Guidelines on data subject rights – Right of access” after a public consultation; these provide more precise guidance on the implementation of the right of access in different situations. The Guidelines cover various aspects such as:

    • the scope of the right of access,  
    • the information to be provided to data subjects,  
    • the format of access requests,  
    • the main modalities for providing access,  
    • the notion of manifestly unfounded or excessive requests, and  
    • the limitations of the right of access. 

 

The EDPB also adopted final versions of the targeted updates of the “Guidelines for identifying a controller or processor’s lead supervisory authority” and the “Guidelines on data breach notification”. Both are targeted updates of the earlier Article 29 Working Party Guidelines on the same subjects.

Regarding data breach notification, the new version clarifies that the responsibility for notification lies with the controller. However, stakeholders raised concerns about operational issues when notifying multiple DPAs. While the targeted update aligns with the GDPR, which does not provide for a one-stop-shop mechanism for controllers established outside the EEA, the EDPB considered stakeholders’ feedback. Accordingly, the EDPB will publish on its website a contact list for data breach notification with relevant links and accepted languages for all EEA DPAs. This will help controllers identify the contact points and requirements of each DPA.

 

Do you have any questions about developments within privacy and data protection? Contact us, the Experts in Data Privacy, at info@dpoconsultancy.nl for more information.  

 Source:  

https://edpb.europa.eu/news/news/2023/edpb-adopts-final-version-guidelines-data-subject-rights-right-access_en  

\

MEPs are not in favor of allowing the transfer of personal data under the current EU-U.S. Data Privacy Framework

The European Parliament Committee on Civil Liberties, Justice and Home Affairs adopted a resolution on Thursday rejecting the proposed EU-U.S. Data Privacy Framework. Members of the European Parliament (MEPs) commented that while the framework was an improvement, it did not meet the requirements to justify an adequacy decision.

The European Commission initiated the process of adopting an adequacy decision for the EU-U.S. Data Privacy Framework on December 13th. This decision aims to facilitate trans-Atlantic data transfers and would resolve the concerns of the Court of Justice of the European Union resulting from its Schrems II decision of July 2020.

This draft adequacy decision entails that personal data transferred from the EU to the U.S. would be adequately protected by the United States. The decision was made after a thorough analysis and assessment of the Data Privacy Framework, including the obligations it imposes on companies, as well as the limitations and safeguards applicable to U.S. public authorities’ access to EU citizens’ personal data, specifically for the purposes of criminal law enforcement and national security.

MEPs noted that although the framework has some improvements, it still permits the bulk gathering of personal data under specific circumstances. Additionally, it does not require independent authorization before bulk data collection and does not establish explicit guidelines for data retention.

MEPs have pointed out that the Data Privacy Framework introduces a Data Protection Review Court (DPRC) to address compensation and facilitate complaints for EU data subjects. However, its decisions would be classified, which would violate data subjects’ rights of access to and rectification of their personal data. Furthermore, the U.S. President could potentially dismiss the DPRC judges and strike down its decisions. Therefore, MEPs believe that the Review Court lacks independence. This concern was previously also raised in their draft opinion on this matter in February.

MEPs argue in the resolution that the framework has to be future-proof and that the adequacy assessment by the Commission must depend on how the U.S. privacy rules are implemented in practice. The U.S. Intelligence Community is still adapting its approach and methods pursuant to the Data Privacy Framework; therefore, according to the MEPs, an evaluation of its effects and consequences is not yet possible.

Since the prior data transfer frameworks between the EU and the U.S. were struck down by the Court of Justice of the European Union, most recently in its decision in the Schrems II case, MEPs are urging the Commission to ensure that the future framework is capable of withstanding legal challenges and of offering legal certainty to EU citizens and companies. To accomplish this, MEPs advise the Commission not to grant an adequacy decision based on the current framework, but instead to work towards a framework that would be likely to be upheld in court.

MEPs adopted the resolution with 37 votes in favor, no votes against, and 21 abstentions. The resolution will now be put to a vote in an upcoming plenary session of the European Parliament.

Organizations must ensure that they follow the existing transfer mechanisms, such as SCCs and TIAs, and implement any additional measures until there is more clarity regarding the EU-US Data Privacy Framework.

Does your organization have any questions about transferring personal data internationally to the U.S.? Contact us, the Experts in Data Privacy at info@dpoconsultancy.nl for assistance.

Sources:

https://www.europarl.europa.eu/news/en/press-room/20230411IPR79501/meps-against-greenlighting-data-transfers-with-the-u-s-under-current-rules

https://ec.europa.eu/commission/presscorner/detail/en/qanda_22_7632

https://iapp.org/news/a/meps-urge-european-commission-to-reject-eu-u-s-adequacy/

\

Updated EDPB Guidelines regarding personal data breach notification under the GDPR

Recently, the European Data Protection Board (“EDPB”) updated its Guidelines on personal data breach notification under the GDPR. This update follows a targeted public consultation on the subject of data breach notification for controllers not established in the EEA.

The EDPB noticed that there was a need to clarify the notification requirements concerning personal data breaches at non-EU establishments. The paragraph concerning this matter has been revised and updated, while the rest of the document was left unchanged, except for editorial changes.

Where a controller not established in the EU is subject to the GDPR (Articles 3(2) and 3(3)) and experiences a data breach, it is still bound by the notification obligations under the GDPR (Articles 33 and 34).

Although the GDPR requires controllers (and processors) that are not established in the European Economic Area (“EEA”) but are subject to the GDPR to designate a data protection representative (“DPR”) in the EEA, these guidelines make clear that the mere presence of a DPR in a Member State does not trigger the one-stop-shop mechanism. For this reason, a data breach must be notified by the controller to every data protection authority (“DPA”) of each EEA Member State in which affected individuals reside. This means that when a controller that is not established in the EEA experiences a data breach affecting individuals in multiple Member States, it is obliged to notify the DPA of every Member State in which individuals are affected by the breach.

Similarly, where a processor is subject to the GDPR, it will be bound by the obligations on processors, most relevant here being the duty to notify a data breach to the controller (Article 33(2) GDPR).

Although this means that more effort is required from controllers under the GDPR that are not established in the EEA, this update provides more clarity about how to act in a GDPR compliant manner in case of a cross-border data breach.  

Does this affect your organization’s data breach policy and procedure, or do you want to learn more about data breaches under the GDPR? Contact us, DPO Consultancy, the Experts in Data Privacy, via info@dpoconsultancy.nl.

https://edpb.europa.eu/our-work-tools/our-documents/guidelines/guidelines-92022-personal-data-breach-notification-under_en  

\

ChatGPT: Recent developments regarding privacy compliance 

Since ChatGPT is increasing in popularity, regulators are spending more time assessing its level of compliance with different laws and regulations, including the GDPR but also, for instance, the FTC Act.


ChatGPT is an Artificial Intelligence (AI) chatbot developed by OpenAI and launched in November 2022. OpenAI is a research laboratory headquartered in San Francisco (United States). The core function of ChatGPT is to mimic a human conversationalist, which makes it versatile: you can ask the chatbot to answer test questions, write song lyrics, and even debug computer programs. OpenAI collects data retrieved via ChatGPT from users to further train and fine-tune the service.

Recent developments in the US have shown that the ethics group Center for AI and Digital Policy asked the US Federal Trade Commission to stop new AI chatbot releases by OpenAI, because the GPT-4 version is “biased, deceptive, and a risk to privacy and public safety”. The group’s formal complaint stated that GPT-4 fails to meet FTC standards of being “transparent, explainable, fair and empirically sound while fostering accountability”. One example cited by the group is that private chat histories can easily be exposed to other users without the user’s knowledge. The group therefore aims to ensure the establishment of the necessary guardrails to protect consumers, businesses, and the commercial marketplace in the US.

In Europe, the Italian DPA (Garante) ordered a “temporary limitation of the processing of data of Italian users” by OpenAI and has started an investigation into the vendor. The Garante found that the GDPR is violated, since there is a lack of information provided to users about the data collection by OpenAI, as well as a lack of a lawful processing ground justifying the mass collection and storage of personal data for the purpose of ‘training’ the chatbot’s algorithms.

OpenAI now has to respond to the Garante’s request within 20 days; otherwise, it might face a fine of up to 20 million euros or up to 4% of its annual global turnover (whichever is higher).

Although the booming development of ChatGPT has so far been beneficial for OpenAI, it has not been beneficial when it comes to the chatbot’s compliance with multiple laws and regulations. Critical assessments of the chatbot are only just getting started, and it is to be expected that more complaints and requests will follow.

Sources: 

https://www.reuters.com/technology/us-advocacy-group-asks-ftc-stop-new-openai-gpt-releases-2023-03-30/ 

https://www.gpdp.it/web/guest/home/docweb/-/docweb-display/docweb/9870847 

\

Meta tracking tools declared illegal by Austrian Data Protection Authority

About a month after the Court of Justice of the European Union (‘CJEU’) issued its ruling in the Schrems II case, noyb filed 101 complaints regarding data transfers from European Economic Area (‘EEA’)-based websites to Google LLC and Facebook Inc, located in the United States (‘US’).

In order to coordinate the work of the various data protection authorities (‘DPA’), the European Data Protection Board (‘EDPB’) created a special task force. This decision by the Austrian DPA arises from one of those complaints.

Facts of the complaint:
In August 2020, the data subject visited a website that was hosted by an Austrian media company, while he was logged into his personal Facebook  account. At the time, the company made use of the Facebook login tool. This tool facilitates a user’s access to services not offered by Facebook without the creation of additional accounts. The company also made use of the Facebook Pixel tool, which was able to track the visitor’s activities on its website.

The data subject argued that mere access to the website triggered an unlawful transfer of personal data to the US and that this was in contravention of the General Data Protection Regulation (‘GDPR’). Therefore, he filed a complaint against the media company for using these tracking tools and further argued that Meta infringed various provisions of the GDPR.
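To see why merely opening the page can already send personal data to the US, it helps to look at what an embedded tracking pixel technically does. Below is a minimal, hypothetical sketch in TypeScript; the endpoint, function name and parameters are illustrative assumptions, not Meta’s actual code. The key point is that the page instructs the visitor’s own browser to load a tiny image from the tracking provider’s servers, and that request carries the visitor’s IP address, the page visited and any cookies previously set for the provider’s domain.

```typescript
// Minimal, hypothetical sketch of a third-party tracking pixel.
// Not Meta's actual code: the endpoint and identifiers are illustrative only.
function fireTrackingPixel(trackerEndpoint: string, eventName: string): void {
  // Details about the visit are encoded as query parameters.
  const params = new URLSearchParams({
    event: eventName,                // e.g. "PageView"
    page: window.location.href,      // the article the visitor is reading
    referrer: document.referrer,     // where the visitor came from
    ts: Date.now().toString(),
  });

  // Loading a 1x1 image makes the *browser* contact the tracker's server
  // directly; that request discloses the visitor's IP address and any cookies
  // already set for the tracker's domain (for example after a Facebook login).
  const pixel = new Image(1, 1);
  pixel.src = `${trackerEndpoint}?${params.toString()}`;
}

// Illustrative usage on page load:
fireTrackingPixel("https://tracker.example.com/collect", "PageView");
```

Because the request goes straight from the visitor’s browser to the provider’s servers, the disclosure happens as soon as the page loads, which is why deactivating the tools at a later point could not undo the transfer at issue in the complaint.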

While the Austrian DPA was investigating the complaint, the media company deactivated these tracking tools and argued that it had concluded an agreement with Meta Platforms Ireland Limited and therefore any transfers, including transfers outside the EU, were compliant and justified in the light of media privilege.

Outcome of the complaint:
The Austrian DPA held that Meta Platforms Inc. did not violate Article 44 GDPR, as it was a data importer falling outside the scope of Chapter V GDPR. The Austrian DPA did not extensively elaborate on the qualification of Meta under the GDPR, as it did not have enough evidence to qualify it as a controller.

The Austrian DPA upheld the complaint against the Austrian media company for three reasons. Firstly, merely deactivating the tracking tools after receiving the complaint did not prevent an infringement of Article 44 GDPR, as the violation had already occurred. Secondly, the media privilege claim did not fall within the scope of the journalistic exception, as the tracking tools were implemented for tracking purposes and to facilitate the login procedure, not for the sole purpose of disseminating information, opinions or ideas to the public. Furthermore, once the data was transferred to Meta Ireland, it could be used for further purposes. Thirdly, there was no lawful basis for the international data transfer and thus a violation of Chapter V GDPR had occurred.

Consequences of the complaint:
The decision by the Austrian DPA is a clear indication that the use of Facebook’s tracking pixel directly violates the GDPR and the decision of the CJEU in the Schrems II ruling.

In light of this decision, numerous websites that use the Facebook tracking tools to track users and display personalized advertisements may need to reconsider their approach in this regard and keep track of other DPAs following the example of the Austrian DPA.

Does your organization make use of tracking tools or do you have questions about the impacts of this decision? Contact us, the Experts in Data Privacy at info@dpoconsultancy.nl for more information.

Sources:

https://noyb.eu/en/austrian-dsb-meta-tracking-tools-illegal

https://gdprhub.eu/index.php?title=DSB_(Austria)_-_2022-0.726.643

\

The reform of the UK GDPR: what can we expect?

The second iteration of a set of proposals to reform the UK GDPR has been published: the Data Protection and Digital Information (No.2) Bill.


Based on this publication there are a couple of interesting findings: 

  • The fundamental principles of the current UK GDPR, the obligations of controllers and processors, the data subject rights, and the wider constitutional and regulatory environment for privacy would remain unaffected by the proposals.  
  • There are targeted changes that reflect the experience with the GDPR so far and are proposed in pursuit of the government’s policy objective of reducing compliance costs by introducing a more ‘common-sense-led’ version of the GDPR.
  • There are targeted changes being made to current law. These changes may give businesses more flexibility and empower future rule-making or the issuing of guidance.
  • Organizations that are compliant with the current UK GDPR will not be obliged to make changes to comply with the proposed UK GDPR, although the proposed reforms will offer organizations the opportunity to make use of new compliance efficiencies. Conflicting requirements between the two versions are not to be expected.

 

Some new proposals in the second iteration of the Bill include: 

  • A list of activities that may be regarded as being in a controller’s legitimate interest to process data is proposed, although controllers are still required to ensure their interests are not outweighed by the data subject’s rights. It is also clarified that any legitimate commercial activity can fall under the legitimate interests lawful processing ground, provided the processing is necessary and the balancing test is carried out.
  • The proposal also contains an illustrative and non-exhaustive list of types of scientific research, previously found in the recitals and now moved to the operative part of the UK GDPR, e.g. innovative research into technological development or applied or fundamental research. It also clarifies that research into public health only falls under scientific research if it is in the public interest, and it exempts controllers from providing notice where personal data has been collected directly from the data subject in certain instances.
  • Regarding international data transfer mechanisms, alternatives are proposed.
  • The proposals also include a requirement to maintain records of processing activities for processing that is likely to result in a high risk to the rights and freedoms of data subjects; this also applies to organizations with fewer than 250 employees.
  • The proposals create an obligation on providers of public electronic communication services and networks to report suspicious activity relating to unlawful direct marketing to the ICO. It is expected that there will be guidance published on what constitutes reasonable suspicion.

 

At DPO Consultancy we will keep monitoring any new developments regarding the UK GDPR. Do you want to learn more about the consequences of these proposals for your organization? Then contact us, DPO Consultancy, the Experts in Data Privacy, via info@dpoconsultancy.nl.

Source: https://bills.parliament.uk/bills/3430

\

European Commission bans TikTok from corporate devices

The European Commission announced on Thursday that it had asked all employees to uninstall TikTok from their corporate devices, as well as from personal devices on which corporate apps are used, over data protection concerns. The Council of the EU followed later in the day with a similar measure, banning its staff from using the Chinese app.

In an email sent to EU officials, it is said that “To protect the Commission’s data and increase its cybersecurity, the EC Corporate Management Board has decided to suspend the TikTok application on corporate devices and personal devices enrolled in the Commission mobile device services.”

Officials are required to uninstall the app at their earliest convenience and before March 15. Alternatively, they can delete work-related apps from their personal devices if they want to keep TikTok.

It is said that this is the first time the Commission has suspended the use of an app for its staff. The aim of the relevant measure is to protect Commission data and systems from potential cybersecurity threats.

In November 2022, TikTok admitted that European TikTok user data could be accessed by staff in China. The app was also scrutinized in the US after a Forbes report that it had been used to spy on journalists covering TikTok.

The US has recently banned TikTok on government devices at the state and federal levels due to fears about potential spying by China. Although most EU countries have not yet made a move about the app, Dutch government officials were told to suspend the use of TikTok.

Do you have any questions about developments within privacy and data protection? Contact us, the Experts in Data Privacy, at info@dpoconsultancy.nl for more information.

 

https://www.politico.eu/article/european-commission-to-staff-dont-use-tiktok/

https://www.euractiv.com/section/cybersecurity/news/european-commission-bans-tiktok-from-corporate-devices/

\

New law to ‘fix’ GDPR

Before the summer of 2023, the European Commission (EC) will propose a new law aimed at improving how the privacy regulators of the various European Union (EU) countries enforce the General Data Protection Regulation (GDPR).


A watershed moment in global tech regulation occurred in 2016, when the GDPR was adopted, as companies were forced to abide by new privacy standards which included asking for consent to collect people’s data online. Also, the possibility of receiving substantial fines for noncompliance was introduced.

Five years down the road, activists, experts and some national privacy watchdogs have become frustrated at what they see as an inefficient system to tackle major cases, especially in instances when Big Tech companies are involved.

Another aspect to receive attention is the powerful role that the Irish Data Protection Commission has under the one-stop-shop rule, under which most major investigations are handled by the Irish authority, as tech companies like Meta, Google, Apple and others have established their European headquarters in Ireland. The Irish and – to a lesser extent – Luxembourg authorities have received frequent and mounting criticism in recent years.

The EC is set to introduce a new EU regulation in the second quarter of 2023, which will set clear procedural rules for national data protection authorities dealing with cross-border investigations and infringements. The EC has been quoted as saying that the law “will harmonize some aspects of the administrative procedure in cross-border cases and support a smooth functioning of the GDPR cooperation and dispute resolution mechanism.”

Last year, the European Data Protection Board (EDPB) sent the EC a “wishlist” of procedural law changes to improve enforcement. Some ideas included setting deadlines for different procedural steps in handling cases and harmonizing the rights of the different parties involved in EU-wide investigations.

The EC wants to keep the new regulation very targeted and limited, partly because it expects that the new regulation will prompt tense discussions with data privacy watchdogs, campaigners and Big Tech lobbyists.

Furthermore, it is not only lobby groups that are expected to jump on the EC’s new privacy regulation. Regulators have also clashed over how to interpret and enforce the GDPR, which has often triggered dispute resolution procedures and delayed cases. The EC has said that “nobody will be happy with the Commission proposal as usual, because the data protection authorities agree on the problem but they do not agree on the solutions.”

The EU executive only has a short window of time to push this new text through the EU legislative process, as European elections are set for spring 2024.

While this new law is expected to solve the enforcement flaws of the GDPR, it could open a Pandora’s box of lobbying and regulator infighting. This is a development that will definitely be closely monitored.

Do you have any questions about developments within privacy and data protection? Contact us, the Experts in Data Privacy, at info@dpoconsultancy.nl for more information.

 

Source:

https://www.politico.eu/article/brussels-plans-new-privacy-enforcement-law-by-summer/

 

\

European Commission urged to reject EU-US adequacy

The European Parliament Committee on Civil Liberties, Justice and Home Affairs (LIBE committee) does not want the European Commission (EC) to extend an adequacy decision to the United States (US) based on the proposed EU-US Data Privacy Framework (DPF). This was made clear in its draft opinion published on February 14.

In their opinion, the committee members have concluded that the proposed DPF “fails to create actual equivalence in the level of protection offered under the EU General Data Protection Regulation” and have urged the EC to only adopt a decision when “meaningful reforms were introduced, in particular for national security and intelligence purposes” on the part of the US.

The LIBE committee’s opinion is viewed by others as a noteworthy step towards accountability and warrants serious consideration before an adequacy decision is granted to the US.

Parliament and the European Data Protection Board (EDPB) each have the opportunity to present opinions, which are nonbinding, to the EC to consider towards its adequacy decision.

Maastricht University European Centre on Privacy and Cybersecurity Senior Visiting Fellow Paul Breitbarth has been quoted as saying “with the EDPB’s opinion on the framework weeks away, I think the Parliament’s resolution is very timely, and might also influence the (data protection authorities’) debate on whether or not the framework is indeed essentially equivalent.”

The LIBE committee has highlighted the key discrepancies with the DPF, with specific focus on the subject of equivalent protection and the proposed redress system the US plans to establish for EU citizens. Furthermore, MEPs have said it is worrisome that key data protection concepts, such as the principles of necessity and proportionality, are defined differently, that comprehensive federal privacy legislation is absent in the US, and that the framework relies on adherence to a US Executive Order.

Another concern of MEPs is the proposed Data Protection Review Court, due to a perceived lack of independence and general transparency. The committee’s view is that the Data Protection Review Court’s process is “based on secrecy and does not set up an obligation to notify the complainant that their personal data has been processed.” Furthermore, it neither allows for appeals to federal courts nor offers the opportunity to claim damages where appropriate.

This draft opinion will be formally presented in the LIBE committee at the beginning of March with the hope of finalization at the end of March. Thereafter, the opinion will go to a full parliament vote in April. The EC has maintained that it hopes to finalize the adequacy decision by July at the earliest.

Until there is more clarity surrounding the EU-US Data Privacy Framework, organizations must ensure they adhere to the available transfer mechanisms and that the supplementary measures are implemented.

Does your organization have any questions about transferring personal data internationally to the US? Contact us, the Experts in Data Privacy at info@dpoconsultancy.nl for assistance.

Source:

https://iapp.org/news/a/meps-urge-european-commission-to-reject-eu-u-s-adequacy/

 

\

New Swiss Federal Act on Data Protection to introduce stricter requirements than the GDPR

The longstanding Swiss Federal Act on Data Protection of 1992 will be replaced with new legislation, the revised Federal Act on Data Protection (revFADP), to better protect Swiss citizens’ data. The revFADP modernizes the rules on the processing of personal data and grants Swiss citizens new rights consistent with other comprehensive laws, such as the General Data Protection Regulation (GDPR).

This change also introduces various increased obligations for companies doing business in Switzerland. Furthermore, while the revised legislation has many similarities to the GDPR, there are stark differences companies must be aware of.

The most important aspects companies must be aware of include:

  • No compliance grace period: the revFADP takes effect on September 1, 2023 and there is no grace period for companies to become compliant.
  • Expanded definition of sensitive data: the definition of sensitive personal data will be expanded to include genetic and biometric data that unequivocally identifies a natural person. The explicit consent of the data subject is required when processing sensitive personal data.
  • Emphasized importance of an “independent” DPO: the Swiss Federal Data Protection and Information Commissioner (FDPIC) strongly emphasizes the importance of an independent DPO. The DPO’s activities should therefore remain separate from the company’s other business activities. Furthermore, it is recommended that DPOs speak at least one of the languages of Switzerland, for instance French, German, Italian or Romansh.
  • Breach notification for serious breaches only and no clear notification timeframe: under the revFADP, the controller must notify the FDPIC of certain serious personal data breaches ‘as soon as possible’. Notification is only required for breaches that pose an ‘imminent danger’ to data subjects.
  • Penalties: the revFADP does not provide for civil penalties, but intentional violations can result in criminal fines of up to 250,000 Swiss francs imposed on individuals, which could potentially include DPOs and C-suite executives, rather than on the entity.

It is expected that the FDPIC will issue updates and guidance on the revFADP. These updates and guidance will continue to be monitored.

Does your company do business in Switzerland? Contact us, the Experts in Data Privacy at info@dpoconsultancy.nl, to ensure your company is compliant with the changes the revFADP will introduce.

 

 Source:

https://www.privacyanddatasecurityinsight.com/2023/02/nothing-neutral-about-the-new-swiss-federal-act-on-data-protection/

\

DPOs and conflict of interest: CJEU issues ruling

The Court of Justice of the European Union (CJEU) has affirmed that Data Protection Officers (DPOs) can maintain other tasks and duties within their role, provided it does not result in a conflict of interest.

The ruling of 9 February centered around Article 38 of the GDPR. The CJEU has stated that DPOs should “be in a position to perform their duties and tasks in an independent manner” but “cannot be entrusted with tasks or duties which would result in him or her determining the objectives and methods of processing personal data on the part of the controller or its processor.”

The CJEU has further stated that this is “a matter for the national court to determine, case by case, on the basis of an assessment of all the relevant circumstances.”

The Federal Labour Court of Germany (Bundesarbeitsgericht) submitted a request to the CJEU for a preliminary ruling regarding proceedings between X-Fab Dresden and its former DPO. The former DPO, who was also the chair of the works council, was dismissed from the role of DPO in December 2017. When the GDPR came into force in May 2018, X-Fab argued that the former DPO’s dismissal was justified due to “a risk of a conflict of interests” in performing both functions, as these posts were incompatible.

The CJEU found that Article 38 GDPR provides that DPOs cannot be dismissed or penalized for performing their tasks; however, this does not prevent national laws from establishing additional protections against dismissing DPOs. These additional protections should not “compromise the principal objectives of the GDPR to maintain high levels of data protection.” The example provided by the CJEU was that national laws cannot protect a DPO from dismissal if the DPO is unable, or no longer able, to carry out his or her duties in complete independence due to the existence of a conflict of interest.

This ruling comes ahead of the European Data Protection Board’s upcoming coordinated enforcement action focusing on the designations of DPOs.

Does your organization have questions about the role of a DPO? Contact us, the Experts in Data Privacy, at info@dpoconsultancy.nl, for more information.

 

Source:

https://iapp.org/news/a/cjeu-issues-ruling-on-dpos-and-conflict-of-interest/

\

Privacy by Design to become an ISO standard on 8 February

In January, it was reported that on February 8, 2023, the International Organization for Standardization (ISO) will adopt Privacy by Design (PbD) as ISO 31700.

PbD was introduced 14 years ago by Ann Cavoukian, a Canadian privacy commissioner, and this will now become an international privacy standard for the protection of consumer products and services.

PbD is a set of principles that calls for privacy to be taken into account throughout an organization’s data management process. Since its introduction, it has been adopted by the International Assembly of Privacy Commissioners and Data Protection Authorities and was incorporated into the General Data Protection Regulation (GDPR).

Cavoukian has been quoted as saying “the ISO standard is designed to be utilized by a whole range of companies – startups, multinational enterprises, and organizations of all sizes. With any product, you can make this standard work because it’s easy to adopt. We’re hoping privacy will be pro-actively embedded in the design of [an organization’s] operations and it will complement data protection laws.”

The proposed introduction notes that PbD refers to several methodologies for product, process, system, software and service development. The proposed bibliography that comes with the document refers to other standards with more detailed requirements on identifying personal information, access controls, consumer consent, and corporate governance amongst other topics. A separate document will outline possible use cases as well.

Adopting PbD can be regarded as a competitive advantage for businesses: by implementing its principles, both privacy and business interests are addressed, resulting in a ‘win-win’ situation.

Does your organization have any questions about PbD? Read this white paper to find out more.

Source:

https://www.itworldcanada.com/article/privacy-by-design-to-become-an-iso-standard-next-month/521415

\

Hackers stole customers’ backups says LastPass owner GoTo

LastPass’ parent company GoTo – formerly known as LogMeIn – has confirmed that hackers stole customers’ encrypted backups during a recent breach of its systems.

The breach was first confirmed by LastPass on November 30. At the time, LastPass said that an ‘unauthorized party’ had gained access to some customers’ information stored in a third-party cloud service shared by LastPass and GoTo. The hackers used information stolen from an earlier breach of LastPass systems in August to further compromise the companies’ shared cloud data.

Now, almost two months later, GoTo has said in an updated statement that the cyberattack impacted several of its products, including the business communications tool Central, the online meetings service Join.me, the hosted VPN service Hamachi, and its RemotelyAnywhere remote access tool.

The hackers exfiltrated customers’ encrypted backups from these services, along with the company’s encryption key for securing the data.
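For illustration, the core problem is that an encryption key stored next to the data it protects gives an attacker everything needed to decrypt that data. The sketch below, which assumes the third-party Python cryptography package and is in no way a description of GoTo's actual setup, shows the basic idea of keeping backups and their keys in separate places:

```python
# Illustrative sketch only (not GoTo's actual architecture): encrypt a backup
# with a key that is stored and managed separately from the backup itself.
# Requires the third-party "cryptography" package: pip install cryptography
from pathlib import Path

from cryptography.fernet import Fernet


def create_backup(plaintext: bytes, backup_path: Path, key_path: Path) -> None:
    """Encrypt plaintext into backup_path and write the key to a separate store."""
    key = Fernet.generate_key()                      # symmetric key for this backup
    backup_path.write_bytes(Fernet(key).encrypt(plaintext))
    # In a real deployment the key would live in a KMS/HSM or other key store
    # that an attacker with access to the backup storage cannot also read.
    key_path.write_bytes(key)


def restore_backup(backup_path: Path, key_path: Path) -> bytes:
    """Decrypt the backup using the separately stored key."""
    return Fernet(key_path.read_bytes()).decrypt(backup_path.read_bytes())


if __name__ == "__main__":
    create_backup(b"customer data", Path("backup.enc"), Path("backup.key"))
    print(restore_backup(Path("backup.enc"), Path("backup.key")))
```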

GoTo has said that the company does not store customers’ credit card or bank details or collect personal information, such as date of birth, home address or Social Security Numbers. This, however, is in clear contrast to the hack affecting its subsidiary LastPass. During the attack at LastPass, hackers stole the contents of customers’ encrypted password vaults, along with customers’ names, email addresses, phone numbers and some billing information.

GoTo has 800,000 customers, including enterprises, but has not indicated how many of them are affected. Furthermore, despite the delay, GoTo has not provided any remediation guidance or advice for affected customers.

Does your organization make use of a program where passwords or other sensitive information is saved? It may be time to review the Information Security Policy to ensure technical and organizational measures are in place to adequately protect personal data. Contact us, the Experts in Data Privacy, at info@dpoconsultancy.nl for assistance.

Source:

LastPass owner GoTo says hackers stole customers’ backups

 

\

Location data is not personal data… or is it?

The Spanish data protection authority (AEPD) had previously taken the position that location data held by a telecommunications provider is not personal data under the GDPR. The organization ‘noyb’ (None of Your Business) appealed the AEPD’s decision before the Spanish courts, and the court sided with noyb, ruling that location data is personal data.

It all started when a Spanish customer requested access to his location data from his telecommunications provider. The provider denied the request, arguing that location data does not qualify as personal data under the GDPR and that it therefore did not have to grant access to it. The customer then filed a complaint with the AEPD, and after the AEPD rejected it, noyb appealed that decision in June 2022.

The AEPD’s decision is a hard one to comprehend, since the GDPR contains a very broad definition of personal data:

“…any information relating to an identified or identifiable natural person (‘data subject’); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person…”

The definition even explicitly lists location data as an example of an identifier.

Furthermore, the right of access to personal data should not be taken lightly. Individuals have a right under the GDPR to access the data organizations store about them. The telecommunications provider could therefore not lawfully deny access to location data.

How does your organization handle data subject requests (DSRs)? A decent DSR procedure will help your organization to assess and manage DSRs in a GDPR compliant manner. Contact us if you want to learn more about DSRs via: info@dpoconsultancy.nl.

https://noyb.eu/en/location-data-personal-data-noyb-wins-appeal-against-spanish-dpa

\

Cookie banners to be consistent across the EEA soon

On the 18th of January 2023, the European Data Protection Board (“EDPB”) adopted a report by the Cookie Banner Task Force for the work carried out in ensuring a consistent approach to cookie banners across the European Economic Area (“EEA”).

The Cookie Banner Task Force was established in September 2021 to coordinate the response to the complaints about cookie banners that NOYB had filed with numerous EEA Data Protection Authorities (“DPAs”).

The Task Force aimed to promote cooperation, information sharing and best practices amongst the DPAs, which has been instrumental in ensuring a consistent approach to cookie banners across the EEA.

The report notes that the various DPAs agreed upon a common denominator in their interpretation of the applicable provisions of the ePrivacy Directive and the General Data Protection Regulation (“GDPR”) on issues such as reject buttons, pre-ticked boxes, banner designs and withdrawal icons.

Cookies are found on all websites and therefore it is important that your organization remains up to date with these developments. Do you have questions about how this report will affect your organization or which changes must be implemented to your existing cookie policy? Contact us, the Experts in Data Privacy, at info@dpoconsultancy.nl

 

Source:

https://edpb.europa.eu/our-work-tools/our-documents/report/report-work-undertaken-cookie-banner-taskforce_en

\

Court of Justice of the European Union: Everyone has the right to know to whom his/her personal data have been disclosed

On January 12, 2023, the Court of Justice of the European Union (“CJEU”) issued a preliminary ruling on the interpretation of the right of access to personal data (Article 15 GDPR). The CJEU ruled that the right of access entails the right to know the specific identity of recipients of the personal data.  


The CJEU was asked whether the GDPR leaves the data controller the choice to disclose either the specific identity of the recipients or only the categories of recipient, or whether it gives the data subject the right to know their specific identity.  

The highlights of the judgment are:  

  • The data subject has the right to know the specific identity of the recipients of his personal data as part of his/her right of access under Article 15(1)(c) of the GDPR. Informing the data subjects only about the categories of recipients, e.g., advertisers, IT companies, mailing list providers, is not sufficient.  
  • The right of access is necessary to enable the data subject to exercise other rights conferred by the GDPR, namely the right to rectification, right to erasure, right to restriction of processing, right to object to processing or right of action where he/she suffers damage. If the data subject is not informed about the identity of the recipients, the exercise of these rights vis-à-vis these recipients would be restricted.
  • Therefore, where the personal data have been or will be disclosed to recipients, the data controller is obliged to provide the data subject, on request, with the actual identity of those recipients, unless it is not (yet) possible to identify those recipients, or the controller demonstrates that the request is manifestly unfounded or excessive.  

The information obligations under Articles 13 and 14 of the GDPR allow the data controller to inform data subjects of either the recipients or only the categories of recipients. The importance of the judgment lies in the fact that the data subject’s right to know the specific identity of recipients prevails over the data controller’s option to disclose only the categories of recipients.

This ruling highlights the importance of maintaining a clear record of processing activities. Data controllers should keep track of the actual identity of recipients for each disclosure and duly reflect the names of specific recipients in their record of processing activities, as recording only the categories of recipients in the register would not be sufficient to fully comply with Article 15(1)(c) of the GDPR.
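As a rough illustration of what such a record could look like in practice, the sketch below models a single processing activity with named recipients alongside their categories. The field names and structure are our own illustrative assumptions, not a format prescribed by the GDPR:

```python
# Illustrative sketch of a record-of-processing-activities entry that tracks
# named recipients, not only recipient categories (field names are our own).
from dataclasses import dataclass, field


@dataclass
class Recipient:
    name: str          # specific identity, e.g. the vendor's legal name
    category: str      # e.g. "IT service provider", "mailing list provider"
    country: str       # useful for mapping international transfers


@dataclass
class ProcessingActivity:
    purpose: str
    legal_basis: str
    data_categories: list[str] = field(default_factory=list)
    recipients: list[Recipient] = field(default_factory=list)

    def recipients_for_access_request(self) -> list[str]:
        """Names to disclose under Article 15(1)(c) when a data subject asks."""
        return [r.name for r in self.recipients]


activity = ProcessingActivity(
    purpose="Newsletter distribution",
    legal_basis="Consent (Article 6(1)(a) GDPR)",
    data_categories=["name", "email address"],
    recipients=[Recipient("ExampleMail B.V.", "mailing list provider", "NL")],
)
print(activity.recipients_for_access_request())
```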

 

How does your organisation handle data subject rights or keep the record of processing activities? Contact us, experts in data privacy, if you want to learn more via info@dpoconsultancy.nl.   

https://curia.europa.eu/jcms/upload/docs/application/pdf/2023-01/cp230004en.pdf  

\

Meta fined €390 million by the Irish DPC over legal basis for targeted advertising

The Irish Data Protection Commission (“DPC”) announced its final decisions on two inquiries into Meta’s data processing operations on Facebook and Instagram. The DPC fined Meta a total of €390 million for breaches of the GDPR and invalidated its reliance on a “contract” legal basis for personal data processing for targeted advertising purposes.  

The decision came after complaints made by the privacy rights organisation NOYB on May 25, 2018, the day the GDPR came into operation. Before 25 May 2018, Meta had changed the Terms of Service for its Facebook and Instagram services, indicating a change in the legal basis on which Meta relies to legitimise its processing of users’ personal data. Accordingly, Meta would rely on the contract legal basis for certain of its data processing activities in the context of the delivery of the Facebook and Instagram services, including behavioural advertising, rather than on consent, as it had previously done.

The users were asked to accept the new Terms of Service if they wished to continue to access the Facebook and Instagram services. The DPC stated that Meta considered that “a contract was entered into” between Meta and users and that “processing of users’ data in connection with the delivery of its Facebook and Instagram services was necessary for the performance of that contract, to include the provision of personalised services and behavioural advertising, so that such processing operations were lawful by reference to Article 6(1)(b) of the GDPR (the “contract” legal basis for processing).”

The complainants argued that by making the use of its services conditional on the users’ acceptance of the Terms of Service, Meta was forcing them to agree to Meta’s use of personal data for behavioural advertising purposes, which implied that Meta was in fact relying on consent as a legal basis for this processing and that there was no real choice for users in this regard.

Following the European Data Protection Board’s binding determinations on the matter, the DPC adopted its final decisions in which it stated that:

  • Meta is not entitled to rely on the contract legal basis for the delivery of behavioural advertising services on Facebook and Instagram and therefore, the processing of personal user data based on contract legal basis constitutes a violation of Article 6 of the GDPR, and
  • Since users were not clearly informed about the legal basis relied on by Meta and what processing operations were being performed on their data, Meta violated its transparency obligations. It is considered that such a lack of transparency constituted breaches of the lawfulness, fairness and transparency principle enshrined in Article 5(1)(a), and Articles 12 and 13(1)(c) of the GDPR.

Furthermore, the decision requires Meta to bring its data processing operations into compliance with the GDPR within a period of three months. This means that Meta has to find a new legal basis for its processing activities relating to targeted advertising.

NOYB Founder Max Schrems embraced the decision and said users “must have a ‘yes or no’ option and can change their mind at any time. The decision also ensures a level playing field with other advertisers that also need to get opt-in consent.”

Although the decision concerns the activities of Meta on Facebook and Instagram, it is surely expected to exert influence over other online platforms in the digital advertising industry.

In June 2022, TikTok amended its privacy policy to reflect its intended switch from relying on users’ consent to legitimate interest as the legal basis for the processing of personal data for personalised advertising. This was followed by the Italian Supervisory Authority’s warning to TikTok that processing personal data for personalised advertising on the basis of legitimate interest would violate the GDPR and the ePrivacy Directive. TikTok eventually agreed to pause the controversial privacy policy update in the EU after engagement with the DPC.

As Schrems argued, many platforms currently make access to their service conditional on the user’s consent. The decision leaves open the question of whether Meta and other online businesses must allow users to refuse targeted advertising without preventing them from accessing their services.

 

Does your organization make use of GDPR compliant targeted advertising? Contact us, experts in data privacy, if you want to learn more via: info@dpoconsultancy.nl.  

https://www.dataprotection.ie/en/news-media/data-protection-commission-announces-conclusion-two-inquiries-meta-ireland  

https://www.politico.eu/article/meta-fina-ad-business-model/  

\

France fines Microsoft €60 million over cookies

The French data protection authority, the CNIL, sanctioned Microsoft with a fine of 60 million euros for not having put in place a mechanism for users to refuse cookies as easily as to accept them.

The sanction came after the CNIL carried out several investigations following a complaint relating to the conditions for depositing cookies on “bing.com”. The CNIL’s decision was based on breaches of Article 82 of the French Data Protection Act. In particular, it found that:

– When a user visited the search engine “bing.com”, a cookie serving several purposes, including the fight against advertising fraud, was automatically placed on their terminal without any action on their part.

– Moreover, when a user continued to browse on the search engine, a cookie with an advertising purpose was deposited on their terminal without their consent having been obtained. This constitutes a violation of the Act, which requires that this type of cookie be placed only after users have consented.

The decision also highlighted the absence of a compliant means of obtaining consent for the deposit of cookies. Although the search engine offered a button to accept cookies immediately, it did not provide an equivalent solution (e.g. a refuse button) allowing the user to refuse them as easily as accepting them: Bing offered a single button to accept all cookies, while two clicks were needed to refuse them all.

It was noted that complex refusal mechanisms are likely to discourage users from refusing cookies and instead encourage them to prefer the ease of the consent button appearing in the first window, which ultimately undermines the freedom of consent of internet users.

In deciding the amount of the fine, the CNIL took into account the scope of the processing, the number of data subjects concerned, and the benefits the company derives from the advertising profits indirectly generated from the data collected via cookies.

 

How does your organisation handle cookies? Contact us, experts in data privacy, if you want to learn more via: info@dpoconsultancy.nl.

https://www.cnil.fr/en/cookies-microsoft-ireland-operations-limited-fined-60-million-euros

\

The European Commission releases the EU-US Draft Adequacy Decision

On December 13, the European Commission published its draft adequacy decision for the EU-US Data Privacy Framework.

The draft decision came after the signature of the Executive Order by US President Joe Biden in October. The Executive Order will introduce safeguards for EU residents’ personal data, in particular by limiting the US intelligence services’ access to data and introducing an independent redress mechanism: Data Protection Review Court.  

 

With the draft decision, the Commission considers the new legal framework based on the Executive Order as having an adequate level of data protection comparable with European data protection standards. This means that the EU residents’ personal data can be safely and legally transferred to the United States. 

 

There are still two steps that need to be taken for the finalization of the data adequacy process. Following the draft decision, the European Data Protection Board, which gathers all EU data protection authorities, will issue an opinion. The decision will then need to have the approval of a committee formed by member states’ national representatives before the formal adoption. The Commission expects to obtain the approval by the summer of 2023.  

 

If approved, the adequacy decision would be subject to regular review to ensure the full implementation of the relevant elements from the US legal framework and their effective functioning in practice. The reviews will start one year after the adoption of the decision.  

 

The draft decision might be subject to legal challenges, as happened to its predecessors in the two landmark Schrems rulings. Max Schrems has been quoted as saying “As the draft decision is based on the known Executive Order, I can’t see how this would survive a challenge before the Court of Justice. It seems that the European Commission just issues similar decisions over and over again, which [is] in flagrant breach of our fundamental rights.”

 

It is important to note that other tools for international transfers remain available to companies in the meantime. Standard Contractual Clauses, which companies can include in their commercial contracts, are the most used mechanism to transfer data from the EU to the United States. As updated by the European Commission in June 2021, the modernized Standard Contractual Clauses must be accompanied by a Transfer Impact Assessment, which aims to identify and mitigate privacy risks before personal data leaves the European Economic Area. Until there is certainty from the European Commission regarding this draft adequacy decision, companies must ensure they adhere to the requirements for international data transfers.

 

Does your organisation have questions about international data transfers to the US? Contact us, experts in data privacy, if you want to learn more via info@dpoconsultancy.nl.   

https://ec.europa.eu/commission/presscorner/detail/en/ip_22_7631  

https://www.euractiv.com/section/data-privacy/news/european-commission-publishes-draft-adequacy-decision-on-eu-us-data-flows/  

\

The CJEU: Google must delete search result content if it concerns manifestly inaccurate data

The Court of Justice of the European Union (“CJEU”) ruled that the operator of a search engine must delete search results if the person requesting deletion proves that the information about them is “manifestly inaccurate”.

The case concerns a right to erasure request made by two managers of an investment company. They requested Google to de-reference the results of a search made on the basis of their names, which they asserted contained inaccurate claims. Google refused the request, referring to the professional context in which the content of the search results was set and arguing that it did not know whether the information contained in that content was accurate. The German Federal Court of Justice asked the CJEU to interpret the GDPR provisions governing the right to erasure.

 

In its press release, the CJEU first noted that the right to protection of personal data is not an absolute right and that the right to erasure may be excluded where processing is necessary, in particular, for the exercise of the right to freedom of expression and information. In this regard, a balance must be struck between the data subject’s rights to protection of personal data and private life and the legitimate interest of internet users who may wish to access the information in question.

 

The CJEU then highlighted that where at least a part of the information contained in the relevant content, and not a part of minor importance, proves to be inaccurate, the right to freedom of expression and information cannot be taken into account. In this regard, the CJEU stated that:

– the person requesting de-referencing on the ground of inaccurate content must prove the manifest inaccuracy of the information, or of a part of that information which is not of minor importance. However, this should not place an excessive burden on that person which could undermine the practical effect of the right to erasure; the person is therefore only required to provide evidence that can reasonably be required of him or her.

– the search engine operator, upon a request for de-referencing, must take into account all the rights and interests involved and all the circumstances of the case when assessing whether content may continue to be included in the search results.

 

Having considered the above, the CJEU concluded that where a person who has made a request for de-referencing provides relevant and sufficient evidence capable of substantiating their request and of establishing the manifest inaccuracy of the information contained in the content of the search result, the search engine operator is obliged to accede to that request.

 

How does your organisation handle data subject rights? Contact us, experts in data privacy, if you want to learn more via info@dpoconsultancy.nl.

 

https://curia.europa.eu/jcms/upload/docs/application/pdf/2022-12/cp220197en.pdf

\

Data scraping practices cost Meta a €265 million fine

On 28 November, the Irish Data Protection Commission issued a €265 million fine against Meta, the owner of Facebook and Instagram, over data scraping on its platforms and ordered remedial actions.

This inquiry by the Irish Data Protection Commission arose from the massive leak of Facebook personal data that was dumped online on a hacker forum in April 2021. The data included sensitive information such as full names, locations, birthdates, phone numbers and email addresses.

The data leak concerned 553 million people across 106 countries. In the EU alone, approximately 86 million people were affected. At the time, Facebook said that the leaked data was old since the mass data scraping occurred because of a vulnerability that the company had patched in August 2019.

A few days after the leak, the Irish Data Protection Authority announced that it was investigating the matter and would examine if Facebook’s data harvesting practices complied with the GDPR principles of privacy by design and by default.

The investigation concluded that, between 25 May 2018 and September 2019, the social networks violated European privacy rules; the DPC therefore imposed a set of specific remedial actions and issued a fine of €265 million.

A Meta spokesperson has been quoted as saying “unauthorized data scraping is unacceptable and against our rules, and we will continue working with our peers on this industry challenge. We are reviewing this decision carefully.” Meta can appeal the decision in court.

This is the second largest fine against Meta, following a €405 million fine against Instagram for breaching the privacy of children.

Does your company make use of data scraping practices? Contact us, the Experts in Data Privacy at info@dpoconsultancy.nl, to ensure that this is done in a manner that ensures privacy by design and by default are adhered to at all times.

 

Source:

https://www.euractiv.com/section/data-protection/news/meta-sanctioned-with-e265m-over-data-scraping-practices/

\

The French Data Protection Authority Fines Utility Company EDF €600,000 for Violations of Data Security and the Rights of Individuals

France’s data protection authority, the CNIL, sanctioned the company EDF with a fine of 600,000 euros, in particular for data security violations and for not having respected its obligations in terms of commercial prospecting and the rights of individuals.

The sanction came after the CNIL received several complaints regarding the difficulties people encountered in having their rights taken into account by EDF, the leading electricity supplier in France. The CNIL’s fine was based on three core violations of the GDPR.

– EDF failed to demonstrate to the CNIL that it had obtained prior valid consent from the individuals as part of its commercial prospecting campaign by electronic means (Article 7 of the GDPR).

– The sanction decision also highlighted breaches of the obligation to inform (Articles 13 and 14 of the GDPR) and to respect the exercise of rights (Articles 12, 15 and 21 of the GDPR). It was noted that the personal data protection statement on EDF’s website did not specify the legal basis corresponding to each use of data, was not clear on the duration of storage, and did not clearly indicate the source of the data. Moreover, the CNIL stated that EDF failed to respond to certain complaints within the one-month period provided for by the texts, that individuals were given inaccurate information on the source of the data collected, so that their right of access was not respected, and that EDF did not honour individuals’ right to object to receiving commercial prospecting.

– The violations lastly included failing to ensure the security of personal data (Article 32 of the GDPR). It was found that, until July 2022, the passwords of more than 25,000 accounts for accessing the customer portal used to receive an energy bonus were stored in an insecure manner: the passwords were only hashed, without being salted (the addition of random data to each password before hashing, so that identical passwords do not produce identical hashes and cannot be recovered by comparing against precomputed tables), which put them at risk.
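For illustration only, salting and stretching a password can be done with Python’s standard library as shown below; the algorithm and parameters are generic examples, not EDF’s actual configuration:

```python
# Illustrative example of salted password hashing using only the standard
# library; parameters are generic examples, not EDF's actual configuration.
import hashlib
import hmac
import os


def hash_password(password: str, *, iterations: int = 600_000) -> tuple[bytes, bytes]:
    """Return (salt, hash). A fresh random salt per password means two users
    with the same password get different hashes, defeating precomputed tables."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return salt, digest


def verify_password(password: str, salt: bytes, expected: bytes,
                    *, iterations: int = 600_000) -> bool:
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return hmac.compare_digest(digest, expected)   # constant-time comparison


salt, stored = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, stored))  # True
print(verify_password("wrong guess", salt, stored))                   # False
```

In practice a dedicated password-hashing scheme such as bcrypt, scrypt or Argon2 is generally preferred; the point of the example is simply that a per-password salt makes identical passwords produce different, non-comparable hashes.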

In deciding the amount of the fine, the CNIL took into account the breaches identified, as well as the company’s cooperation and all the measures it took during the procedure to reach compliance on all the breaches with which it was charged.

How does your organisation secure the processing of personal data or handle data subject rights? Contact us, experts in data privacy, if you want to learn more via: info@dpoconsultancy.nl.

 

https://www.cnil.fr/en/commercial-prospecting-and-rights-individuals-edf-fined-600-000-euros

\

Largest Attorney General Consumer Privacy Settlement in US History with Google

An historic $391.5 million settlement with Google in the US has been finalized over its location tracking practices. 


“For years Google has prioritized profit over their users’ privacy. They have been crafty and deceptive. Consumers thought they had turned off their location tracking features on Google, but the company continued to secretly record their movements and use that information for advertisers”, said Attorney General Rosenblum. 

Google has misled users into thinking that they had turned off location tracking, while Google continued to collect their location information. For Google, location data is a key part of its digital advertising business. It uses personal and behavioral data to build detailed user profiles and target ads.  

As part of the multimillion-dollar settlement, Google has agreed to significantly improve its location tracking disclosures and user controls starting in 2023. The settlement furthermore requires that Google will be more transparent about its practices by: 

  1. Showing additional information to users whenever they turn a location-related account setting “on” or “off”;
  2. Making key information about location tracking unavoidable for users (i.e., not hidden); and
  3. Giving users detailed information about the types of location data Google collects and how it is used on an enhanced “Location Technologies” webpage.

In a blog post, Google outlined it will “provide a new control that allows users to easily turn off their Location History and Web & App Activity settings and delete their past data in one simple flow.” The company also plans to add additional disclosures to its Activity controls and Data & Privacy pages. 

This settlement is an important milestone, since location data is among the most sensitive and valuable data that Google collects: it can expose a person’s identity and routines and be used to infer personal details. The settlement is another example of how individuals would benefit from a comprehensive US federal privacy law that prevents large amounts of personal data from being used for marketing purposes with only minimal controls.

How does your organization safeguard the processing of location data or the processing of user profiles? Contact us, experts in data privacy, if you want to learn more via: info@dpoconsultancy.nl.  

https://www.doj.state.or.us/media-home/news-media-releases/largest-ag-consumer-privacy-settlement-in-u-s-history/   

https://blog.google/outreach-initiatives/public-policy/managing-your-location-data/  

https://myaccount.google.com/intro/activitycontrols?hl=en-US  

https://myaccount.google.com/intro/data-and-personalization  

\

EDPB update on Controller Binding Corporate Rules

On November 14, 2022, the European Data Protection Board (EDPB) adopted Recommendations on the application for approval and on the elements and principles to be found in Controller Binding Corporate Rules (BCR-C).

What are the BCRs?

BCRs are a data transfer tool that can be used by a group of undertakings, or a group of enterprises, engaged in a joint economic activity, for its international transfers of personal data from the European Union (EU) to controllers or processors within the same group of undertakings or enterprises outside of the EU. In this sense, BCRs are suitable for multi-national organisations making frequent transfers of personal data between group entities.

According to Article 46(2)(b) of the GDPR, BCRs are permitted safeguards for transfers of personal data to third countries. BCRs create a framework of policies and procedures implemented throughout the entities of the organizations, which include enforceable rights and commitments to establish an essentially equivalent level of protection to that guaranteed by the GDPR. BCR applications should meet the requirements set out in Article 47 GDPR for the relevant supervisory authority to approve the BCRs.

What do the recommendations bring?

The recommendations provide additional guidance on BCR-C applications, update the existing application form for the approval of BCR-Cs, and clarify the content required for a BCR-C as stated in Article 47 GDPR. The recommendations also distinguish between the information that must be included in the BCR-C itself and the information that must be presented to the relevant data protection authority in the application.

The EDPB considers that the recommendations aim to level the playing field for all BCR applicants and align the current guidance with the requirements of the CJEU’s judgment in the Schrems II case.

The EDPB notes that recommendations for processor BCR are currently being worked on.

Stakeholders can contribute their comments on the recommendations until January 10, 2023.

Although this is a great development, it would also be important to improve the BCR approval process with data protection authorities (DPAs), since there are significant discrepancies between the various DPAs in the EU.

Does your organization have questions about binding corporate rules or international data transfers? Contact us, the Experts in Data Privacy at info@dpoconsultancy.nl for more information.

Source:

https://edpb.europa.eu/news/news/2022/edpb-adopts-recommendations-application-approval-and-elements-and-principles-be_en

 

\

NIS2 adopted to strengthen EU-wide resilience for cyber security

MEP Bart Groothuis has been quoted as saying “this is the best cyber security legislation this continent has yet seen, because it will transform Europe to handling cyber incidents pro-actively and service orientated.”

Differing national cybersecurity measures make the European Union (EU) more vulnerable. The new legislation sets tougher requirements for businesses, administrations and infrastructure. Last week, the rules requiring EU countries to meet stricter supervisory and enforcement measures and harmonize their sanctions were approved by the Members of the European Parliament (MEPs).

The Network and Information Security (NIS) Directive was the first EU-wide legislation on cybersecurity and its aim was to achieve a high common level of cybersecurity across the Member States. While increasing Member States’ cybersecurity capabilities, its implementation proved difficult with the result of fragmentation at various levels across the internal market.

To address the growing threats posed by digitalization and the surge in cyberattacks, the European Commission submitted a proposal to replace the NIS Directive. This has led to the introduction and adoption of the NIS2 Directive.

This legislation sets stricter cyber security obligations for risk management, reporting obligations and information sharing. The requirements also cover incident response, supply chain security, vulnerability disclosure and encryption, amongst others.

A result of this new Directive is that more entities and sectors will have to take measures to protect themselves. “Essential sectors” have been defined, which includes energy, transport, banking, health, digital infrastructure, public administration and space sectors. Furthermore, numerous “important sectors” such as manufacturing of medical devices and digital providers will be protected by the new rules. All medium-sized and large companies in selected sectors would fall under the legislation.

Furthermore, a framework for better cooperation and information sharing between different authorities and Member States has been created, as well as a European vulnerability database.

The only step remaining is for the European Council to formally adopt the law before it is published in the EU’s Official Journal.

Does your organization have questions about cyber security or how the NIS2 Directive may impact you? Contact us, the Experts in Data Privacy at info@dpoconsultancy.nl for more information.

 Source:

Cybersecurity: Parliament adopts new law to strengthen EU-wide resilience

 

 

\

The right to privacy vs. journalistic interests: A balancing exercise

An individual who was referred to in news articles as a master swindler filed suit, arguing that Article 8 of the European Convention on Human Rights (ECHR), the right to respect for private and family life, had been violated. He claimed €40,000 in damages from the journalist and also requested a ban on any future publications about him.


A journalist had published online articles about the individual on his website, and the articles were subsequently shared by national newspapers. The articles referred to the individual as a master swindler, accusing him of selling phone cards to detainees and scamming them, and revealed his full name, date of birth and a photograph.

The Court of First Instance dismissed the individual’s claims and considered that the journalist processed his data for journalistic purposes. Additionally, the court mentioned that the press has a watchdog function, in this case by warning the public about fraud, and that therefore, the journalist’s freedom of expression outweighed the protection of the individual’s privacy.  

The individual then appealed the decision of the Court of First Instance, claiming that the journalist’s purpose was to make him look bad and that there was therefore no legal ground for processing his data under Article 6 GDPR. According to the individual, Article 43 UAVG (the Dutch GDPR Implementation Act), which provides that the UAVG does not apply to the processing of personal data for exclusively journalistic purposes, was consequently not applicable either.

The Court of Appeal then explained that the term ‘journalistic’ within the meaning of Article 43 UAVG should be interpreted broadly. The journalist’s website was not deemed to be aimed at making people look bad, but at informing the public about fraudulent activities. The processing of the individual’s data was therefore deemed to fall within the scope of Article 43 UAVG. The Court did note that processing criminal data is prohibited under Article 10 GDPR, unless the exception of Article 43(3) UAVG applies, that is, insofar as the processing is necessary for journalistic purposes. The Court deemed this processing necessary because the publications concerned reporting on, and warning about, fraudulent practices. As to the applicable legal basis, the Court held that journalistic activities constitute a legitimate interest for processing data under Article 6(1)(f) GDPR. It therefore dismissed the individual’s appeal.

Although many people think the right to privacy is an absolute right, it is important to note that this is not the case. It requires a balancing exercise to decide what right prevails in particular circumstances. It is a positive development that more case law is appearing regarding journalistic purposes, since this is still a grey area under relevant laws like the GDPR. 

 Contact us, experts in data privacy, if you want to learn more about data privacy via: info@dpoconsultancy.nl. 

 Source:

https://uitspraken.rechtspraak.nl/inziendocument?id=ECLI:NL:GHAMS:2022:3023  

 

\

Endangering encryption? The European Commission’s gross violation of privacy

Strong end-to-end encryption is an essential part of a secure and trustworthy internet. This protects citizens every time an online transaction is made, when medical information is shared or when citizens interact with family and friends.   

Strong encryption also helps protect children as it allows them to communicate with family and friends in confidence and allows others to report online abuse and harassment in a confidential manner. Encryption ensures personal data, and citizens’ private conversations, are kept private.

The EU’s proposed regulation intended to fight child sexual abuse online would require internet platforms – including end-to-end encrypted messaging applications like Signal and WhatsApp – to “detect, report and remove” images of child sexual abuse shared on their platforms. To do this, however, platforms would have to automatically scan every single message, a process known as “client-side scanning.”

Not only would this be a gross violation of privacy; there is also no evidence that the technology exists to do this effectively and safely without undermining the security provided by end-to-end encryption. While the proposed regulation is well-intentioned, it would weaken encryption and make the internet less secure.

This proposal has already been criticized by privacy watchdogs – the European Data Protection Board (EDPB) and the European Data Protection Supervisor (EDPS) – which issued a joint statement calling for the regulations to be amended. The proposals were described as “highly intrusive and disproportionate”, arguing that by requiring platforms to weaken encryption, the regulations violate Articles 7 and 8 of the Charter of Fundamental Rights of the European Union – namely the right to respect for private and family life and the right to protection of personal data.

The EU has fallen for the myth that it is possible to keep EU citizens safer by weakening the very thing that protects them. If backdoors are created for law enforcement, weaknesses in the system are created for everyone, and those weaknesses could be exploited by criminal gangs or other malicious actors.

Furthermore, it is impossible for platforms to weaken encryption only for users located within the EU – any reduction in security would affect users of those platforms across the globe. In the United Kingdom (UK), similar legislation has been proposed for WhatsApp. WhatsApp has indicated that it is willing to withdraw from the UK market if they are required to weaken encryption. The same could occur across Europe.

The EU relies on encryption to protect the security of its member countries and the bloc as a whole, but by proposing to weaken encryption it risks making all citizens more vulnerable.

If implemented correctly, encryption can be a very powerful tool in protecting personal data that your organization processes. Do you have any questions about encryption and how to keep personal data safe and secure? Contact us, the Experts in Data Privacy, at info@dpoconsultancy.nl

 

Source:

The Commission’s gross violation of privacy – endangering encryption

\

Luxembourg delivers the first GDPR accreditation

The national commission for data protection has become the first data protection authority in Europe to accredit a GDPR certification body.

On 12 October, the national commission for data protection in Luxembourg accredited an entity via its certification mechanism, GDPR-CARPA (General Data Protection Regulation-Certified Assurance Report-Based Processing Activities). This is the first mechanism to be adopted on a national and international level under the GDPR.

The European Data Protection Board’s Opinion states that certification mechanisms should enable controllers and processors to demonstrate compliance with the GDPR. The criteria should, therefore, properly reflect the requirements and principles concerning the protection of personal data as laid down in the GDPR and contribute to its consistent application.

The GDPR is a law that regulates how organizations target or collect data related to people in the European Union. Furthermore, it outlines how organizations must protect and handle data in a secure manner and details privacy rights which give individuals more control over their personal data.

With a GDPR certification, companies, public authorities, associations and other organizations can show that their data processing activities are complying with the GDPR. The implementation of the certification mechanism can promote transparency and compliance as it allows businesses and individuals to evaluate the level of protection offered by products, services, processes or systems used or offered by organizations that process personal data. Entities can, therefore, benefit from an independent certificate to demonstrate that their data processing activities comply with EU regulations. It will be interesting to see how this develops with other data protection authorities in the EU.

Does your organization have questions about accreditation? Contact us, the Experts in Data Privacy at info@dpoconsultancy.nl for more information.

 

Sources:

Luxembourg delivers first GDPR accreditation

Opinion 28/2022 on the European criteria of certification regarding their approval by the Board as European Data Protection Seal pursuant to Article 42.5 (GDPR)

 

 

\

Companies not established in the EU: take note

The European Data Protection Board (EDPB) has published an updated version of the Guidelines on Personal Data Breach Notifications under the General Data Protection Regulation (GDPR).

The previous Guidelines provided that “notification should be made to the supervisory authority in the Member State where the controller’s representative in the EU is established.” The revised Guidelines, however, state that “[…] the mere presence of a representative in a Member State does not trigger the one-stop-shop system. For this reason, the breach will need to be notified to every single authority for which affected subjects reside in their Member State. This notification shall be done in compliance with the mandate given by the controller to its representative and under the responsibility of the controller.”

 

Should the EDPB finalize the procedure suggested above, it could have significant implications for companies that are not established in the European Union (EU) but process the personal data of individuals in the EU. It is, therefore, important for companies to keep up to date with developments in privacy and data protection.
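To make the operational impact concrete, the sketch below shows the notification fan-out such a controller might face under the revised wording. The mapping of Member States to authorities is a truncated, illustrative placeholder, not an official register:

```python
# Hedged sketch: under the revised guidance, a controller without an EU
# establishment notifies the authority of every Member State where affected
# data subjects reside, rather than a single lead authority. The mapping
# below is an illustrative placeholder, not an official list of DPA contacts.
DPA_BY_MEMBER_STATE = {
    "NL": "Autoriteit Persoonsgegevens",
    "FR": "CNIL",
    "IE": "Data Protection Commission",
    # ... remaining EEA states would be listed here ...
}


def authorities_to_notify(affected_subject_states: list[str]) -> list[str]:
    """Return the distinct DPAs to notify for a breach affecting data subjects
    residing in the given Member States."""
    states = sorted(set(affected_subject_states))
    return [DPA_BY_MEMBER_STATE[s] for s in states if s in DPA_BY_MEMBER_STATE]


# A single breach touching Dutch, French and Irish residents now means three
# notifications instead of one to a lead supervisory authority.
print(authorities_to_notify(["NL", "FR", "FR", "IE"]))
```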

The revised Guidelines are open for public consultation until 29 November 2022.

Does your company have questions about how this may impact your operations within the EU, or does your company require the services of a representative? Contact us, the Experts in Data Privacy, at info@dpoconsultancy.nl.

Source:

Guidelines 9/2022 on personal data breach notification under GDPR 

 

\

Dutch employee fired by U.S. firm for shutting off webcam awarded €75,000 in court

A Dutch employee of a U.S. firm with a branch in the Netherlands refused to be monitored during the full workday. As a consequence, the employee was fired for ‘refusal to work’ and ‘insubordination’. In the ensuing lawsuit, a Dutch court awarded the employee €75,000 for wrongful termination.

The employee began working for the U.S. firm in 2019. A year and a half later, he was ordered to take part in a virtual training period called the ‘Corrective Action Program’. During this program he was told to remain logged in for the entire day, with screen-sharing turned on and his webcam activated.

The employee did not feel comfortable being monitored for a full workday and claimed it was an invasion of his privacy. He agreed to having the activities on his laptop monitored and to sharing his screen, but he refused to activate the webcam. He was subsequently fired.

The court concluded that the termination was not legally valid, because the reasons for dismissal were not made sufficiently clear, nor were the instructions reasonable. According to the court, the instructions certainly did not respect the employee’s right to respect for his private life (Article 8 of the European Convention on Human Rights).

Many organizations do not realize that the ‘right to privacy’ is a fundamental right in the European Economic Area (EEA). When conducting business operations in the EEA, it is essential to respect this right at all times, which may conflict with, for example, more permissive workplace monitoring practices in the U.S.

It is therefore important to make sure your organization is aware and informed when doing business in the EEA. Contact us for all of your privacy related questions via: info@dpoconsultancy.nl 

https://nltimes.nl/2022/10/09/dutch-employee-fired-us-firm-shutting-webcam-awarded-eu75000-court  

\

President Biden signs executive order on EU-US data privacy management

On 7 October, President Biden signed an executive order that would limit the ability of American national security agencies to access people’s personal information as part of the transatlantic data sharing agreement with the European Union (EU).

This executive order follows lengthy negotiations between the United States (US) and the EU after the Court of Justice of the European Union (CJEU) ruled in 2020 that the US did not sufficiently protect Europeans’ data when it was transferred across the Atlantic. The judges’ concerns focused on the fact that US surveillance programmes did not give European citizens proper means of redress regarding how the government collected their data.

What is new?

The Data Privacy Framework (DPF) includes three components:

  • Commercial data protection principles to which US organizations may self-certify: the Privacy Shield will be updated to refer to the GDPR directly and US organizations must ensure they remain abreast of these developments.
  • A presidential executive order: this requires US intelligence authorities to limit US signals intelligence activities to what is necessary and proportionate. The executive order imposes necessity and proportionality limits first by mandating them explicitly, then by explaining what that mandate means, and finally by prescribing oversight mechanisms to verify that intelligence agencies follow the new rules. What this will mean in practice is also included.
  • DOJ regulations: a two-step redress system will be introduced, which includes a new Data Protection Review Court. The first tier is an evaluation of a submitted claim by the Civil Liberties Protection Officer (CLPO) and the second tier is the Data Protection Review Court. This court is meant to process complaints concerning the legality of US signals intelligence activities transmitted from “qualifying states” for covered violations of US law.

Max Schrems’ response

The day the executive order was issued, Max Schrems and noyb.eu published their first reaction, indicating that the executive order is unlikely to satisfy EU law. According to Schrems, the executive order does not offer any real solution: there will be continued “bulk surveillance” and a “court” that is not an actual court.

Bulk surveillance will continue by means of two types of ‘proportionality’. The words “necessity” and “proportionate” have been included, but the EU and US did not agree that they will have the same legal meaning. In the end, the CJEU’s definition will prevail, likely invalidating any EU decision again.

The “court” will not be a court but rather a body within the US government’s executive branch. The new system is an upgrade of the previous “Ombudsperson” system, which was rejected by the CJEU and it appears that it will not amount to “judicial redress” as required under the EU Charter.

noyb.eu has indicated that they are in the process of working on an in-depth analysis, which will be published soon. If the Commission’s decision is not in line with EU law and the relevant CJEU judgments, noyb.eu will likely bring another challenge before the CJEU.

What is next?

The European Commission will now launch its adequacy assessment. This process requires the Commission to put forward a draft adequacy determination, the EDPB to issue a nonbinding opinion, EU Member States to vote to approve the decision and the European Commission College of Commissioners to formally adopt it. The European Parliament may also weigh in with a non-binding resolution at any stage.

Previously, this process has taken four to five months once the Commission finalized its draft. Should this trend be followed, further clarification is expected in March 2023. Until then, however, organizations should not be under the impression that this relieves them of their obligation to conclude Standard Contractual Clauses (SCCs), including the performance of a Transfer Impact Assessment (TIA), by 27 December 2022.

Does your organization have any questions about transferring personal data internationally to the US? Contact us, the Experts in Data Privacy at info@dpoconsultancy.nl for assistance.

 

Sources:

Biden signs executive order on EU-U.S. data privacy management 

The EU-US Data Privacy Framework: A new era for data transfers?

New US Executive Order unlikely to satisfy EU law 

 

\

Britain to scrap GDPR rules

Earlier this week Britain vowed to tear up the European Union’s General Data Protection Regulation (GDPR) rules and implement an independent approach to data privacy.

Brexit and the GDPR

This announcement was made by the British Secretary of State for Digital, Culture, Media and Sport Michelle Donelan, who pledged to remove red tape during an address to the Conservative’s party annual conference, although the precise constitution of the new regime remains vague.

Donelan said: “We will be replacing GDPR with our own business-and consumer-friendly British data protection system. I can promise…that it will be simpler, it will be clearer for businesses to navigate. No longer will our businesses be shackled by lots of unnecessary red tape.”

Vowing to build new privacy protections founded on ‘common sense’, Donelan claimed that the British economy could be boosted by dispensing with needless bureaucracy while retaining protections for internet users.

It is not clear when this new regime will come into force, what exactly it will entail and what implications it will have for the adequacy decision that was adopted by the European Commission in June 2021. This is an interesting development to keep in mind when operating in the United Kingdom.

Do you have questions about the UK GDPR? Contact us, the Experts in Data Privacy, at info@dpoconsultancy.nl

Source:

Britain to scrap GDPR rules and go it alone on data privacy 

\

£27 million fine possibly facing TikTok

In September, Instagram was fined over the protection of children’s data. TikTok may now be facing a similar fate.

Investigations by the Information Commissioners Office (ICO) in the United Kingdom (UK) have found that TikTok, the video-sharing App, may have breached the UK data protection law between 2018 and 2020.

The ICO issued TikTok with a “notice of intent”, a precursor to handing down a potential fine, which could be up to £27 million. If TikTok were to be fined this amount, it would be the largest in the ICO’s history, exceeding the record of £20 million which was handed to British Airways two years ago after an incident in 2018 that saw the personal details of more than 400 000 customers compromised by hackers.

The ICO’s provisional view is that TikTok may have processed the data of children under the age of 13 without parental consent and failed to provide proper information to its users in a “concise, transparent and easily understood way.” Furthermore, TikTok may have processed special categories of personal data without legal grounds to do so.

The information commissioner said that “companies providing digital services have a legal duty to put those protections in place but our provisional view is that TikTok fell short of meeting that requirement.” TikTok said it disagreed with the ICO’s provisional finding and would make a formal response challenging the findings of the investigations. Upon receipt of these representations, the ICO will reach a conclusion.

Do you have questions about complying with the (UK) GDPR? Contact us, the Experts in Data Privacy, at info@dpoconsultancy.nl for assistance.

 

Source:

TikTok could face £27m fine for failing to protect children’s privacy

 

\

Our key takeaways from PrivSec Amsterdam

One of the things we learnt yesterday is that, according to Max Schrems, the potential new data transfer agreement between the US and the EU is meant to buy time rather than to solve the true problem with international data transfers from the EU to the US.

Our colleagues Jelmer and Dounia attended the PrivSec Amsterdam event that included a range of speakers from world renowned companies and industries to allow privacy professionals from across different fields to share case studies and their experiences.  It included keynote speeches, presentations, panel discussions, and enough time to meet up with other privacy professionals.

Topics of yesterday’s program included Google Analytics, the cooperation between data protection and security teams, data retention, consumer trust and transparency, and DPIAs amongst others. Furthermore, Max Schrems provided a presentation on The Future of Online Privacy, GDPR Enforcement and The Battle Against Surveillance.

In his keynote speech, Schrems explained that even though a new Executive Order including a proportionality assessment is planned, he believes the new international data transfer agreement would not change anything and is meant to buy time, because FISA and the PRISM program would still apply. The best solution, in his eyes, would be to level up data protection in the US by, among other things, introducing a federal privacy law.

Furthermore, he warned organizations of the risk of lawsuits by individuals if they continue their data transfers in violation of the GDPR, which could ultimately lead to higher costs than the fines imposed by data protection authorities.

For us, the Schrems keynote in particular was very interesting, because it confirmed that we should continue helping our clients with the work needed on Standard Contractual Clauses (SCCs) and Transfer Impact Assessments (TIAs). Chances are that a new standard will again be invalidated, and your organization should be able to fall back on at least a GDPR-compliant TIA where SCCs are used as an international data transfer mechanism.

\

Will the next data transfer agreement smoothen cross-border data flows?

G-7 leaders gathered recently, in June, to discuss the legal deals underpinning the bilateral data flows that exist among most of the G-7’s members. The G-7 consists of the United States, United Kingdom, Germany, France, Italy, Japan, and Canada.

The goal of the meeting was to align the regulators’ approaches to privacy and to better understand the domestic rules in each jurisdiction. The participants discussed methods for moving data and for giving businesses a choice of cross-border transfer tools that suit their needs. Examples include techniques to properly anonymize data by stripping information of the details that identify an individual (a minimal sketch of this stripping step follows below), and the trend toward closer cooperation between antitrust and privacy regulators.
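
To make this concrete, here is a minimal, hypothetical sketch of the “stripping” technique mentioned above: removing direct identifiers from a record before it is shared. The record shape and field names are our own illustration, and dropping direct identifiers is only a first step towards de-identification; it does not by itself amount to anonymization in the GDPR sense, because indirect identifiers may still allow re-identification.

```typescript
// Hypothetical example: drop direct identifiers before sharing a record.
// The CustomerRecord shape and its field names are illustrative only.

interface CustomerRecord {
  name: string;
  email: string;
  phone: string;
  countryCode: string;       // retained: useful for aggregate analysis
  purchaseCategory: string;  // retained: useful for aggregate analysis
}

// Only the non-identifying fields survive the stripping step.
type StrippedRecord = Pick<CustomerRecord, "countryCode" | "purchaseCategory">;

function stripDirectIdentifiers(record: CustomerRecord): StrippedRecord {
  const { countryCode, purchaseCategory } = record;
  return { countryCode, purchaseCategory };
}

// Usage: share only the stripped records with a third party.
const shared: StrippedRecord[] = [
  {
    name: "Jane Doe",
    email: "jane@example.com",
    phone: "+31 6 0000 0000",
    countryCode: "NL",
    purchaseCategory: "books",
  },
].map(stripDirectIdentifiers);
```

Whether the remaining fields are truly anonymous depends on context (for example, how many people share the same combination of values), which is exactly the kind of assessment regulators expect businesses to make.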

An important conclusion of the meeting was that countries need legislation ensuring that an individual’s personal data is accessed only where strictly necessary for national security purposes. That is easier said than done, however, especially after the Schrems II ruling and the criticism of the United States’ intelligence practices by European lawmakers. The lack of a federal privacy law guaranteeing privacy rights also plays a big role in this discussion.

The regulators of the G-7 countries wish to make international business smoother for many companies. To enable this, it is imperative to create a better understanding of how domestic rules currently affect certain kinds of information, and how that information can be used and sold. Mr. Wiewiórowski, the European Data Protection Supervisor, does not believe the G-7 countries are ready to create such a market for cross-border data flows.

All in all, the next solution for smoothing cross-border data flows should address the complex points above, to make sure that the next data transfer agreement is not invalidated like Safe Harbor and the Privacy Shield before it. There is a lot of work to be done. Until then, organizations must fall back on the GDPR’s international data transfer mechanisms, including Standard Contractual Clauses (SCCs), and perform Transfer Impact Assessments (TIAs).

Source:

https://www.wsj.com/articles/g-7-privacy-regulators-aim-to-ease-turbulent-international-data-flows-11662730512?mod=djemCybersecruityPro&tpl=cy

\

Focus on designation of Data Protection Officers

The European Data Protection Board (EDPB) has announced that its next coordinated enforcement action will focus on the designation of data protection officers (DPOs).

The 22 Supervisory Authorities across the European Economic Area, along with the European Data Protection Supervisor, will launch investigations into the yet-to-be-determined aspects of DPO requirements under the General Data Protection Regulation.

In a co-ordinated action, the EDPB prioritises a certain topic for data protection authorities to work on at a national level. The EDPB has further said that the individual actions will be “bundled and analysed, generating deeper insight into the topic and allowing for targeted follow-up on both national and EU level.”

What this will entail for the designation of DPOs is still unclear, but once further information is made available, we will provide an update. If you have any questions about the designation of a DPO or the advantages of DPO-as-a-Service, contact us: info@dpoconsultancy.nl

Sources:

EDPB to focus coordinated enforcement on DPO appointments

EDPB adopts statement on European Police Cooperation Code & picks topic for next coordinated action

 

\

Interpretation of special categories of personal data extended by CJEU

In August this year, the Court of Justice of the European Union (CJEU) issued a preliminary ruling in OT v Vyriausioji tarnybinės etikos komisija (Chief Official Ethics Commission, Lithuania) (Case C-184/20), which was referred by the Regional Administrative Court of Lithuania. In this ruling, the CJEU elected to interpret the GDPR very broadly.

The CJEU clarified that the indirect disclosure of sexual orientation data is protected under Article 9 of the General Data Protection Regulation (GDPR). Such disclosures therefore fall under the special categories of personal data.

Background

This case arose from a question concerning the application of Lithuanian law, which required people in receipt of public funds to file declarations of interest. The declarations, which included information about the interests of the individual’s “spouse, cohabitee or partner”, were published online. The applicant failed to file a declaration and was sanctioned as a result. The CJEU found that the underlying law did not strike a proper balance between the public interest in preventing corruption and the rights of affected individuals.

The CJEU went on to note that because it is possible to deduce information about an individual’s sex life or sexual orientation from the name of their partner, publishing that information online involves processing special categories of personal data as defined in Article 9 GDPR.

The CJEU found that the processing of any personal data that are “liable indirectly to reveal sensitive information concerning a natural person” (for instance, information that may reveal a person’s racial or ethnic origin, religious or philosophical beliefs, political views, trade union membership, health status or sexual orientation) is subject to the processing prohibition in Article 9(1) GDPR unless an exception under Article 9(2) applies.

Possible Implications

The implications of this ruling could be significant. It is possible that common processing operations, such as publishing a photograph on a corporate social media page, could reveal information that is protected under Article 9. Controllers may need to review their processing operations through a contextual lens to assess whether the data being processed, and the manner of processing, are liable to reveal any sensitive information.

It has even been suggested that this ruling could have implications in all contexts where Article 9 is applicable, including online advertising, dating apps, location data indicating places of worship or clinics visited, and meal choices on airline flights, amongst others.

The way forward?

The judgment does not make clear how far controllers need to go in making this assessment. One possible option is to argue that if the controller does not make personal data public, and it implements policies that prohibit employees from making inferences, then the information is not liable to reveal special category data. An alternative would be for regulatory guidance to be issued indicating how controllers can comply with the ruling and the existing guidelines.

 

Sources:

C-184/20

Sensitive data ruling by Europe’s top court could force broad privacy reboot 

 

\

€405 million: second highest fine ever issued under the GDPR

Ireland’s Data Protection Authority has fined Instagram €405 million over the lack of protection of children’s data. The fine was issued after an investigation found that Instagram, the social media platform, had mishandled teenagers’ personal information in violation of the European Union’s strict data privacy laws.

The investigation, which commenced in 2020, focused on how Instagram displayed the personal details of users aged 13 to 17, including email addresses and phone numbers. It began after a data scientist found that users, including those under the age of 18, were switching to business accounts and had their contact information displayed on their profiles. It is alleged that users were doing this to see statistics on how many likes their posts were getting, after Instagram started removing that feature from personal accounts in some countries to help with mental health.

Instagram said it updated its settings over a year ago and has since released new features to keep teenagers safe and their information private. Instagram disagrees with the calculation of the fine and plans to appeal this decision.

This investigation forms part of more than a dozen investigations into Meta companies, including Facebook and WhatsApp, opened by Ireland’s Data Protection Authority. Will the largest (or third-largest) fine be issued against a Meta company in the near future? All we can do is watch this space…

Source:

Ireland fines Instagram €405 million over protection of children’s data

 

\

Sephora – cosmetics and privacy?

The California Attorney General has dropped a bombshell: the first enforcement settlement under the California Consumer Privacy Act (CCPA). Sephora, a French cosmetics brand, must pay $1.2 million in fines and abide by a set of compliance obligations.

It is alleged that Sephora failed to disclose to consumers that it was selling their personal information, failed to honour user requests to opt out of the sale via user-enabled global privacy controls, and did not cure these violations within the 30-day period allowed by the law.

At issue in this case was Sephora’s sharing of information with third-party advertising networks and analytics providers. The case marks a considerable uptick in risk for companies doing business in California and preparing for the California Privacy Rights Act, which takes effect in January 2023. It is clear that the Attorney General is focusing on online tracking and on the implementation of, and compliance with, global opt-out signals such as the Global Privacy Control.

Sephora has said that it uses cookies “strictly for Sephora experiences” and that the CCPA does not define the word “sale” in its traditional sense but uses it to describe the common practice of using cookies. Under the settlement, Sephora must let customers know that it sells their data and give them a way to opt out.

Take-aways from this:

  • A French cosmetics brand was fined, not a large technology company.
  • The Global Privacy Control (GPC) is a browser-level signal, typically set through a browser setting or extension, that automatically communicates a consumer’s privacy preferences to every website they visit, without the consumer having to click opt-out links one by one (a minimal sketch of honouring this signal follows after this list).
  • The Attorney General has been quoted as saying, “My office is watching, and we will hold you accountable…There are no more excuses.”
  • The CCPA’s notice-and-cure provision expires at the end of this year, which means that businesses must comply from the outset.
  • The American Data Privacy Bill, currently before Congress, may further alter the rules; in future, such enforcement actions might no longer be possible.
  • These are interesting developments within the US privacy sphere.
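
As a sketch of what honouring the signal in the second take-away could look like, the TypeScript below checks the GPC signal both server-side (the Sec-GPC request header) and client-side (the navigator.globalPrivacyControl property, as defined in the GPC proposal). This is our own minimal illustration under those assumptions, not Sephora’s implementation and not a complete CCPA compliance mechanism.

```typescript
// Minimal, hypothetical sketch of honouring the Global Privacy Control (GPC).

// Server side: the GPC proposal sends a "Sec-GPC: 1" request header.
// Treat it as a "do not sell or share" opt-out for this request.
function gpcOptOutRequested(headers: Record<string, string | undefined>): boolean {
  return headers["sec-gpc"] === "1";
}

// Client side: the GPC proposal also exposes the signal to scripts.
declare global {
  interface Navigator {
    globalPrivacyControl?: boolean;
  }
}

// Skip third-party advertising/analytics tags when the browser signals GPC.
function shouldLoadThirdPartyTrackers(): boolean {
  return navigator.globalPrivacyControl !== true;
}

export { gpcOptOutRequested, shouldLoadThirdPartyTrackers };
```

The design point is that the opt-out is honoured automatically, per request or per page load, rather than only when the consumer finds and clicks a “Do Not Sell” link.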

Sources:

The Sephora case: Do not sell – But are you selling?  

California fines Sephora $1.2 million for selling consumer data

 

\

EU plans to cut unneeded medical tests with data health plan

Being able to easily access your patient health records, under strict rules to protect your privacy?

If it were up to the European Commission (EC), this will be arranged by 2025 for patients, medics, regulators and researchers. The EC believes it will improve diagnosis, boost medicine research, and cut unnecessary costs from the duplication of medical tests. The EC has offered a binding proposal to EU governments and lawmakers, which includes the EU executive’s plans for an improved health data space and which it says would lead to savings and economic gains of over 10 billion euros in 10 years.

EU plans to cut unneeded medical tests with data health plan

\

Webinar: what does effective vendor risk management look like?

How do you ensure that you cover all aspects of the GDPR in your vendor risk management? In this webinar, you’ll learn about the specific privacy management activities you need to perform at each stage of working with a vendor in order to comply with the GDPR.

\

Class-Action Lawsuit Targets Company that Harvests Location Data from 50 Million Cars

Data being sold that reveals your whereabouts, for example where you live, where you work or where you go to church? This happened to more than 50 million car owners around the world.

A California-based data broker is facing a class-action lawsuit after being accused of secretly collecting and selling real-time GPS location information from more than 50 million cars around the world, including those of California-based consumers. The lawsuit claims that the company never requests consent from drivers before tracking their location.

Class-Action Lawsuit Targets Company that Harvests Location Data from 50 Million Cars

\

This is what your organization needs to know about the Digital Services Act (DSA)

The EU legislators reached an agreement on the DSA, which will govern the digital sphere and increase the fight against illegal content and disinformation. It builds on the eCommerce Directive and provides clear rules for content moderation, platform accountability, illegal products and systemic risks.

EU institutions reach agreement on Digital Services Act

\

Announcement on New Trans-Atlantic Data Privacy Framework

16 July 2020 is a very significant date in the world of privacy and data protection: on that date, the Court of Justice of the European Union (CJEU) handed down the judgment[1] invalidating the Privacy Shield framework. This ruling has become known as the Schrems II ruling.