Live blog

DPO Consultancy

Welcome to our live blog


Wednesday 19 Jun - 3:19 PM

AP: More Clarity Needed on Approaching People Entitled to Benefits or Allowances

The draft law allowing government agencies to proactively approach individuals eligible for benefits or allowances requires further modifications, according to the Dutch Data Protection Authority (AP). Specifically, individuals should receive clear information in advance about which personal data will be exchanged between agencies.

The AP’s findings come after reviewing the proposed Law on Proactive Service Provision by the Ministry of Social Affairs and Employment (SZW), which amends the Implementation Structure Act on Work and Income (SUWI).

Currently, within the existing system for work, income, and social security, entities such as the Employee Insurance Agency (UWV), the Social Insurance Bank (SVB), and municipalities already exchange certain personal data. The purpose of the new bill is to allow these entities to also exchange data to identify individuals who may be missing out on benefits or social provisions and to contact them accordingly.

Need for Clarification

“The AP understands the intention, as it is beneficial if people receive what they are rightfully entitled to,” says AP board member Katja Mur. “However, it should be properly organized from the start, with clear agreements about which personal information will be used so that individuals are not taken by surprise.”

Which Benefits and Provisions?

It is not yet clear which specific benefits and social provisions the new approach will apply to. These details will be outlined in lower regulations that are not subject to parliamentary approval. This could include benefits like the supplementary income provision for the elderly, certain forms of social assistance, or the child-related budget.

Types of Personal Data?

The draft law does not specify which personal data the UWV, SVB, and municipalities will process. For example, will it involve sensitive information such as financial data? Without this clarity, individuals cannot make informed decisions about their willingness to share their data. The proposal must explicitly state which categories of personal data will be used in this new approach. Additionally, a retention period for the personal data should be established.


In practice, some individuals may not want their data used for ‘proactive service provision,’ for various reasons. They should be able to opt out. However, the draft law does not require the UWV, SVB, and municipalities to maintain a joint opt-out registry, which means individuals would need to opt out separately with each agency. The AP insists that a collective opt-out provision should be included in the law.
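To make the AP’s point concrete, a joint registry could be as simple as a single shared lookup that every agency consults before approaching someone. The sketch below is purely illustrative; the class and identifiers are our own, not part of the draft law.

```python
# Hypothetical sketch of the joint opt-out registry the AP recommends:
# one shared register consulted by every agency, instead of a separate
# opt-out list per agency. All names here are illustrative.

class OptOutRegistry:
    """Shared register of citizens who declined proactive service provision."""

    def __init__(self):
        self._opted_out = set()  # citizen identifiers, stored once

    def opt_out(self, citizen_id: str) -> None:
        # A single request covers UWV, SVB and all municipalities.
        self._opted_out.add(citizen_id)

    def may_contact(self, citizen_id: str) -> bool:
        # Every agency checks the same registry before approaching someone.
        return citizen_id not in self._opted_out


registry = OptOutRegistry()
registry.opt_out("123456789")
print(registry.may_contact("123456789"))  # False: no proactive contact
print(registry.may_contact("987654321"))  # True
```

With one shared register, a citizen opts out once; without it, the same person must repeat the request at every agency, which is exactly the burden the AP objects to.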



Thursday 13 Jun - 10:41 AM

🚨 More AI Guidance and Recommendations published by Data Protection Authorities

AI Systems: DPAs advice on GDPR Compliance

Since the introduction of the European Artificial Intelligence Act (“AI Act”) in March this year, various Data Protection Authorities (“DPAs”) have published guidance and recommendations. The most recent recommendations, from the French Data Protection Authority (“CNIL”), are no different.

The recommendations published by the CNIL focus on General Data Protection Regulation (“GDPR”) compliance in the context of AI system development. Designers and developers of AI systems have often reported to the CNIL that applying the GDPR is challenging for them, particularly when training models.

The recommendations address the development of AI systems involving the processing of personal data, which regularly requires the use of large volumes of information on natural persons. This concerns systems based on machine learning; systems whose operational use is defined from the development phase, as well as general-purpose systems that can be used for various applications (“general-purpose AI”); and systems whose learning is done “once and for all” or continuously, e.g., using usage data for improvement. It is also important to note that the recommendations concern the development phase of AI systems, not the deployment phase.

The 7 steps for AI Systems GDPR Compliance

The recommendations also indicate how they relate to the AI Act. They set out seven steps:

  • Step 1: Define an objective (purpose) for the AI system
  • Step 2: Determine your responsibilities
  • Step 3: Define the “legal basis” that allows you to process personal data
  • Step 4: Check if you can re-use certain personal data
  • Step 5: Minimize the personal data you use
  • Step 6: Set a retention period
  • Step 7: Carry out a Data Protection Impact Assessment (“DPIA”).
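For organizations tracking their progress, the seven steps can be treated as a simple checklist. The sketch below is our own illustration; the helper function is not part of the CNIL’s recommendations.

```python
# The CNIL's seven steps, represented as a simple compliance checklist.
# The step texts follow the recommendations; the tracking logic is our
# own illustrative sketch, not part of the CNIL's publication.

CNIL_STEPS = [
    "Define an objective (purpose) for the AI system",
    "Determine your responsibilities",
    "Define the legal basis that allows you to process personal data",
    "Check if you can re-use certain personal data",
    "Minimize the personal data you use",
    "Set a retention period",
    "Carry out a Data Protection Impact Assessment (DPIA)",
]

def outstanding_steps(completed: set) -> list:
    """Return the steps (1-based) that still need to be addressed."""
    return [f"Step {i}: {text}"
            for i, text in enumerate(CNIL_STEPS, start=1)
            if i not in completed]

# Example: an organization that has only defined its purpose and roles.
for step in outstanding_steps({1, 2}):
    print(step)
```

Such a checklist is deliberately sequential: the purpose (Step 1) drives the legal basis (Step 3), which in turn shapes minimization and retention (Steps 5 and 6).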

The CNIL has also indicated that it will soon publish new how-to sheets explaining how to design and train models in compliance with the GDPR, covering: the retrieval of data from the internet (web scraping) and the use of legitimate interest as a legal basis; the exercise of the rights of access, rectification, and erasure; and whether or not to use open licences.

Stay up to date with GDPR Compliance and AI Systems

Furthermore, the German Data Protection Conference (“DSK”) and the United Kingdom’s Information Commissioner’s Office (“ICO”), to name but a few, have also published guidance and recommendations on AI. The European Data Protection Supervisor (“EDPS”) has also published guidelines on generative AI.

While more DPAs are expected to publish guidance and recommendations on AI in the coming months, it is imperative that any organization dealing with or using AI is aware of these developments. We at DPO Consultancy are monitoring these developments and will communicate further updates. If your organization has any questions, please contact us, the Experts in Data Privacy, for assistance.


Thursday 30 May - 8:29 AM

Approved European Health Data Space (EHDS) Regulation

In April, the European Parliament approved the European Health Data Space (EHDS) regulation, which is expected to be ratified by EU member states soon. The aim of these data spaces is to unlock extensive repositories of existing data and facilitate their accessibility for research, innovation, and development, while ensuring compliance with pertinent data protection regulations.

The European Health Data Space constitutes a specialized environment defined by regulations, shared standards, operational practices, infrastructure, and a governance framework. It aims to establish a unified structure across European Union Member States for the sharing and transfer of high-quality health data, including electronic health records, patient registries, and genomic data, throughout the EU.

Patients: primary use of health data

Individuals will gain greater control over the digital use of their electronic health data within their home country or across other Member States, granting them greater autonomy over their personal health information. The European Health Data Space, together with the GDPR, will provide individuals with several rights. It ensures immediate, free, and easy access to electronic health data. People can easily share these data in electronic form with other health professionals, both within their own Member State and across borders, without hindrance from previous healthcare providers or manufacturers. They will retain full control and will be able to add information to their electronic health record, correct wrong data, limit access, and obtain information on how their data are used and for which purpose.

Data users: secondary use of health data

The secondary use of data involves making it available for research and innovation. To facilitate this, data holders, which include both public and private entities, must identify the categories of data they possess and share them with health data access bodies. These bodies then provide the data to users, such as scientific researchers, through a secure platform.

Data users typically access anonymous data, but pseudonymized data is permitted if needed. Data holders are required to provide various data types, including health records, clinical trial data, genomic data, and nonpersonal data related to health determinants. They must organize and distribute this data to health data access bodies, who then grant access to users after approving permits for specific purposes like scientific research and product development.

Individuals have the right to opt out of data sharing, although member states can override this for public data use. There are uncertainties regarding anonymization standards and responsibilities, as overly strict anonymization may limit data usefulness. Legal provisions ensure compliance with data protection regulations, but implementing these provisions, particularly concerning pseudonymous data, is complex.
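To illustrate what pseudonymization means in practice, the sketch below replaces a patient identifier with a keyed hash: data users see only the pseudonym, while the key holder could still re-link records if legally required. This is purely illustrative; the key, identifiers, and record fields are our own assumptions, not part of the EHDS.

```python
# Minimal illustration of pseudonymization: a keyed hash (HMAC-SHA256)
# replaces the patient identifier, so the shared record no longer
# contains the direct identifier, but the key holder can reproduce the
# mapping. All identifiers below are invented for this example.

import hashlib
import hmac

SECRET_KEY = b"held-only-by-the-health-data-access-body"  # assumption

def pseudonymize(patient_id: str) -> str:
    """Replace a direct identifier with a stable, keyed pseudonym."""
    return hmac.new(SECRET_KEY, patient_id.encode(), hashlib.sha256).hexdigest()

record = {"patient_id": "NL-PAT-0042", "diagnosis": "E11.9"}
shared = {"pseudonym": pseudonymize(record["patient_id"]),
          "diagnosis": record["diagnosis"]}

# The same patient always maps to the same pseudonym, so records can be
# linked across datasets without revealing who the patient is.
assert pseudonymize("NL-PAT-0042") == shared["pseudonym"]
```

The complexity noted above comes from exactly this property: because re-linking remains possible for whoever holds the key, pseudonymized data is still personal data under the GDPR, unlike properly anonymized data.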

Electronic health records

The European Health Data Space (EHDS) underscores the significance of consistency and interoperability in electronic health records (EHRs) to facilitate cross-border healthcare, protect patient rights, and enable data sharing for secondary use. The EHDS features a section outlining technical criteria for EHRs, including conformity assessment, common specifications, documentation prerequisites, and a CE marking. These specifications aim to facilitate the sharing of health data across the EU for healthcare delivery and research purposes. Medical devices and high-risk AI systems asserting interoperability with EHRs must adhere to these standards. Additionally, wellness apps must attain a compliance label if they claim compatibility with EHRs and meet the essential requirements.

Data localization

Originally, the European Health Data Space (EHDS) proposal from the Commission did not include specifications for data localization. However, prompted by apprehensions voiced by data protection authorities, proposed revisions in Parliament aimed to introduce extensive data localization duties. Ultimately, the EHDS’s final iteration mandated data localization solely for data managed by health data access bodies, leaving out data managed by data holders.

International transfers

The European Health Data Space (EHDS) proposal upholds existing GDPR regulations for international transfers of personal health data. However, it introduces novel considerations for nonpersonal data, particularly anonymous data, which falls outside the scope of the GDPR. According to the EHDS, certain categories of nonpersonal data provided by health data access bodies to third countries are classified as “highly sensitive,” especially if there is a risk of re-identification using sophisticated methods. The Commission is responsible for establishing special conditions for such transfers, as outlined in the Data Governance Act. Access to nonpersonal data by foreign governments is limited under the EHDS, unless permitted by Union law or national legislation. Specific conditions must be satisfied for such access, including the presence of an international agreement.


The European Health Data Space (EHDS) will be enacted 20 days after its publication in the Official Journal of the European Union. However, most of its provisions won’t take effect until four to ten years later due to the extensive preparatory work required by stakeholders. Several practical issues, such as the need for secondary legislation and updates to electronic health records systems, still need resolution. Nonetheless, if member states and the Commission can streamline implementation without introducing unnecessary complexities, the EHDS has the potential to revolutionize healthcare and scientific research in the EU.

Source: European Health Data Space: Revolutionizing health care, scientific research in the EU | IAPP


Wednesday 22 May - 10:44 AM

Governments and Facebook in the EU

Governments and Facebook in the EU: a complicated relationship

New problems arise for Facebook (and its owner Meta) in the EU. After receiving a fine of €390 million from the Irish Data Protection Commission (DPC) over the legal basis for targeted advertising, the Dutch Data Protection Authority questioned the use of Facebook by governmental bodies.

The problem of Government Organizations’ Facebook page

In theory, any governmental body should guarantee that its processing of citizens’ personal data complies with the law. The Dutch Data Protection Authority (AP) observes that this is not the case if it is unclear what happens to the personal data of visitors to a government organization’s Facebook page.

This decision does not come out of the blue.

  • Already in 2021, the Dutch Ministry of the Interior (BZK) itself investigated the possible privacy risks of the government’s use of Facebook and concluded (not surprisingly) that it does not have a clear picture of what Facebook does with the personal data of people who visit Dutch government pages.
  • In 2023, the BZK asked the AP whether the government could use Facebook. The AP’s advice to the BZK is now not to do this as long as there is uncertainty about privacy.
  • Eventually, in March 2024, the AP released the decision under analysis, observing that the government should not rely on online platforms such as Facebook.

The rationale for the decision

To better understand the rationale of the decision, it is useful to look at the statement released by Aleid Wolfsen, chairman of the AP.

He stated: “Because people rely on the government for many things, they will quickly be inclined to use a platform like Facebook to communicate with the government. But government organizations should not use such platforms as a communication channel if they conclude that they cannot be sure what is happening with people’s data. It must be crystal clear what happens to your data. That is the norm”.

He also added “If a platform threatens to black out your online account as soon as you do not agree to be followed online, that is not a free choice. Tech companies cannot force you to agree to track your behavior on the internet. For example, to sell your data to advertising companies. People sometimes depend on information on certain online platforms. Especially when it comes to government information, people have the right to access this information. Without having to pay for it and without having to sacrifice their privacy”.

Some final observations

The decision is comprehensive and does not raise any particular legal issue. It goes in the right direction, as it intends to protect citizens’ privacy rights against large tech companies. Earlier this week, another decision, this time from the EDPB, stated that social media platforms may not force users to be tracked. However, one question remains unanswered: why should EU governments rely on a foreign private online platform to interact with their citizens?

Do you have more questions about the GDPR or privacy in the EU? Please feel free to reach out to us.


Friday 10 May - 1:41 PM

Dutch Data Protection Authority releases facial recognition guidance

In response to the many questions it has received about facial recognition, the Dutch Data Protection Authority, Autoriteit Persoonsgegevens (“AP”), has released guidance on its use (available here in Dutch).


Facial recognition is prohibited in most cases, but there are exceptions. These include:

Security target: hazardous substances

One exception to facial recognition being prohibited is when it is necessary for authentication or security purposes. The Dutch General Data Protection Regulation Implementing Act (“UAVG”) contains the example that facial recognition may be used for the security of a nuclear power plant.

The guidance issued by the AP now contains the new example of securing hazardous substances that could be used in the production of bombs. In 2023, the AP approved the code of conduct of port companies handling international shipping. Securing such hazardous substances with facial recognition is allowed under strict conditions, such as a Data Protection Impact Assessment (“DPIA”) carried out before facial recognition is deployed, showing that there is a need and a significant public interest to do so.

Personal use

The AP has also defined the conditions under which facial recognition may be regarded as ‘personal’ or ‘domestic’ use. If the household exemption for personal, domestic use applies, the General Data Protection Regulation (“GDPR”) is not applicable. The guidance contains the example of unlocking a mobile phone with facial recognition. This is allowed, but only if the biometric data is stored on the phone itself and the user decides what happens to that data. The user must also have the choice to unlock the phone with a PIN code instead of facial recognition.

The guidance further confirms that the prohibition on facial recognition also applies to confirming identity. The uncertainty about whether the processing ban on special categories of personal data applies to facial recognition aimed at confirming someone’s identity has been removed: the AP has concluded that the processing ban does indeed apply in this case. Therefore, using facial recognition to catch thieves in Dutch supermarkets, for example, is not permissible.

With the further development and use of AI, we believe that more Data Protection Authorities will follow the AP’s example and publish guidance on the use of AI and facial recognition. Does your organization have any questions about the use of facial recognition or the conducting of a DPIA? Contact us, the Experts in Data Privacy, for assistance.


Wednesday 17 Apr - 3:48 PM

Privacy Concerns Surrounding Tracking Traffic Lights: An Urgent Call for Action

In recent years, the deployment of “tracking traffic lights” in the Netherlands has raised significant privacy concerns among both policymakers and citizens. These innovative traffic lights, designed to communicate with mobile phones of road users, have the capability to gather vast amounts of personal data, prompting intervention from the Dutch Data Protection Authority (AP). With the AP sounding the alarm once again, it is imperative for the Ministry of Infrastructure and Water Management (IenW) to take decisive action to address these privacy risks. 

The concept behind tracking traffic lights is simple: they are utilized to measure traffic flow by connecting with the apps on road users’ mobile phones. However, what sets them apart from traditional detection loops is their ability to collect personal data, raising serious questions about privacy infringement. Unbeknownst to many, these traffic lights establish contact with mobile phone apps, enabling them to gather a wealth of personal information. From tracking complete travel routes, including date, time, and speed, to enabling road authorities to monitor individuals’ movements, the implications for privacy are profound. 


One of the most troubling aspects is the apparent lack of consideration given by road authorities to the privacy risks associated with these tracking devices. Furthermore, there is often a lack of clarity regarding the sharing and responsibility of collected data, as mandated by the GDPR. These oversights underscore the urgent need for thorough assessment and regulation of tracking traffic lights to ensure compliance with data protection laws and safeguard individuals’ privacy rights. 


The AP’s renewed call for action underscores the gravity of the situation. Having previously alerted the Ministry of IenW to the risks associated with tracking traffic lights in 2021, the AP’s recent communication emphasizes the need for immediate investigation into the design and usage of these devices to ascertain their compliance with the GDPR. Additionally, the AP urges the Ministry to engage in dialogue with all road authorities involved to address and mitigate the identified privacy risks effectively. 


As concerns surrounding privacy continue to mount in an increasingly digitized world, it is incumbent upon regulatory bodies and policymakers to prioritize the protection of individuals’ personal data. In the case of tracking traffic lights, proactive measures must be taken to strike a balance between traffic management objectives and privacy considerations, ensuring that innovation does not come at the expense of fundamental rights. Only through collaborative efforts between governmental bodies, regulatory agencies, and stakeholders can effective solutions be devised to address the pressing privacy challenges posed by emerging technologies like tracking traffic lights. 




Monday 8 Apr - 11:36 AM

🚨BREAKING NEWS: US unveils new draft federal privacy bill

The American Privacy Rights Act (“APRA”) has been unveiled. This comprehensive draft legislation sets clear, national data privacy rights and protections for Americans, eliminates the existing patchwork of state data privacy laws and establishes robust enforcement mechanisms to hold violators accountable, including the private right of action for individuals.

One of the Senators involved has been quoted as saying, “what we see is a patchwork of state laws developing, and this draft that Sen. Cantwell and I have agreed to will establish privacy protections that are stronger than any state law on the books.”

The APRA incorporates parts of other state laws, including those of California, Illinois, and Washington, and establishes foundational, uniform national data privacy rights for Americans while also providing them with the ability to enforce those rights.

While the timing of this proposed draft legislation is unclear, policymakers have, in recent weeks, floated a number of draft bills ranging from children’s privacy to the reauthorization of Section 702 and new obligations for data brokers.

DPO Consultancy will continue to monitor this important development for organizations and will communicate further updates. In the event your organization has any questions, please contact us, the Experts in Data Privacy, for assistance.




Wednesday 3 Apr - 5:06 PM

Employee Monitoring – Italian DPA fines companies for using facial recognition in employee attendance control

Monitoring at work encompasses a broad range of activities employers engage in to oversee the performance and conduct of their employees during work hours, and regardless of the work location. The use of diverse technologies for monitoring, including camera surveillance, email monitoring systems, keystroke logging, tracking of internet activity, GPS tracking, biometric systems, and productivity tracking software, reflects the evolving nature of workplace oversight. While technology advances, adherence to data protection laws to ensure lawful and fair monitoring remains paramount. 

The GDPR allows the monitoring of employees, as long as it complies with specific data protection requirements. A recent ruling by the Italian Data Protection Authority (“Garante”) showcases an instance of employee monitoring that breached data protection and privacy rules.  

The Garante has sanctioned five companies with fines ranging from 2,000 to 70,000 euros for the unauthorized use of biometric data through facial recognition to monitor employee attendance. This practice was deemed a violation of employee privacy rights, as the GDPR does not allow the use of biometric data, which constitutes a special category of personal data, for such purposes without a legal exception under Article 9 of the GDPR. The sanctions were issued against companies operating at the same waste disposal site after the Garante received complaints from numerous employees.

The Garante’s investigation revealed the companies’ failure to comply with both national and EU laws regarding employee rights and data protection. Notably, it was discovered that three of the companies had shared a biometric detection system for over a year without implementing the necessary technical and security measures. This same system, found to be illegal, was also used across nine additional offices by one of the penalized companies. Moreover, the companies failed to provide employees with comprehensive information about the data collection, in violation of the transparency principle, and did not conduct the mandatory data protection impact assessment.

The Garante underlined that the companies should have considered less intrusive methods of tracking employee attendance, such as a badge. As part of the investigation, the DPA has also mandated the deletion of all illegally collected data, emphasizing the need for adherence to privacy laws and the protection of employee rights against invasive monitoring technologies. 

If your organization has questions about the privacy implications of employee monitoring, contact us for assistance.



Friday 15 Mar - 4:34 PM

The AI Act and the GDPR: what does it mean for companies?

What is the EU AI Act and what does it mean for companies who use AI Solutions?

On the 13th of March 2024, the AI Act passed the scrutiny of the European Parliament and is ready to become a law of the Union. This comprehensive regulatory framework aims to govern the development and use of artificial intelligence (AI) across the European Union (EU).

The AI Act’s primary aim is to ensure that AI technologies are developed and used in a manner that is ethical, transparent, and respects fundamental rights, and covers a wide range of AI systems used in various sectors, including healthcare, transport, and finance.

In particular,

  • it subjects high-risk AI systems (such as those used in critical infrastructure, law enforcement, and employment) to strict regulatory requirements;
  • it prohibits certain AI practices that pose significant risks to individuals or society, such as the manipulation of human behaviour, the exploitation of vulnerabilities, or social scoring;
  • it exposes non-compliant organizations to significant fines of up to €30 million or 6% of their global annual turnover, whichever is higher;
  • it introduces a conformity assessment for AI systems, designed to foster accountability, which only applies to AI systems classified as ‘high-risk’.

Where does the AI Act Apply? Also outside the EU?

Like the GDPR, the AI Act carries significant extraterritorial implications. It extends its jurisdiction to providers who introduce AI systems in the EU market, regardless of their geographical location. Additionally, it encompasses providers or deployers situated outside the European Union whose AI systems are utilized within the EU.

The AI Act contains significant exclusions, including AI systems developed solely for scientific research and development purposes. Furthermore, the Act exempts research, testing, and development activities related to AI before market placement or deployment, with the exception of real-world testing.

When will the AI Act come into force?

The Act is currently undergoing a final review by lawyer-linguists and is anticipated to be formally adopted before the conclusion of the legislative session. Additionally, the law necessitates formal endorsement by the Council.

Upon publication in the Official Journal, the regulation will become effective after twenty days. It will become fully enforceable twenty-four months after it enters into force, with certain exceptions:

  • prohibitions on restricted practices will take effect six months following the entry into force;
  • codes of practice will be applicable nine months after entry into force;
  • general-purpose AI regulations, including governance, will come into effect twelve months after entry into force;
  • obligations for high-risk systems will be enforced thirty-six months after entry into force.
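The staggered timeline above can be computed from the entry-into-force date. The sketch below uses a placeholder date, since the actual date depends on publication in the Official Journal.

```python
# Compute the AI Act's staggered enforcement dates from a given
# entry-into-force date. The date below is a placeholder assumption;
# the real date follows publication in the Official Journal plus
# twenty days.

from datetime import date

def add_months(start: date, months: int) -> date:
    """Shift a date forward by whole calendar months."""
    month_index = start.month - 1 + months
    year = start.year + month_index // 12
    month = month_index % 12 + 1
    return date(year, month, min(start.day, 28))  # clamp for short months

entry_into_force = date(2024, 8, 1)  # placeholder, not the official date

milestones = {
    "Prohibitions on restricted practices": 6,
    "Codes of practice": 9,
    "General-purpose AI rules (incl. governance)": 12,
    "Full applicability (default)": 24,
    "High-risk system obligations": 36,
}

for label, months in milestones.items():
    print(f"{label}: {add_months(entry_into_force, months)}")
```

Reading the output against the list above makes the compliance planning concrete: prohibited practices bite first, while high-risk obligations leave the longest runway.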

The AI Act and the GDPR: how to efficiently combine these regulations?

The AI Act does not affect or amend the GDPR or the ePrivacy Directive. However, the deployment of AI solutions must remain consistent with GDPR principles, because these activities still involve the processing of personal data. Here are some examples.

  • Record of Processing Activities (RoPA): AI-related processes are processing activities that must be mapped in this important document.
  • Privacy Impact Assessment (PIA) and/or Data Protection Impact Assessment (DPIA): a new processing activity with AI technology usually requires one of these risk assessment procedures.
  • Technical and Organizational Security Measures (and other accountability measures): new policies (e.g., a comprehensive policy on AI use), procedures, and assessments (such as the Fundamental Rights Impact Assessment for high-risk AI systems or the conformity assessment for AI systems) are important to guarantee a safe and GDPR-compliant use of AI technology.

We are well-equipped and ready to help you handle the interactions between AI governance and data privacy! If you want to include AI compliance in your privacy journey, please contact us.





Wednesday 13 Mar - 10:36 AM

EDPS finds European Commission’s use of Microsoft 365 infringes EU data protection law

After its inquiry, the European Data Protection Supervisor (EDPS) found that the European Commission breached numerous essential data protection rules while using Microsoft 365. As a consequence, the EDPS has mandated that the Commission implement specific corrective actions. 

The EDPS discovered that the Commission breached several aspects of the EU Regulation 2018/1725, relating to EU data protection rules for EU institutions and entities, notably those regarding data transfers outside the European Economic Area (EEA). Specifically, it did not ensure that transferred personal data received equivalent protection as within the EEA. Additionally, the Commission’s contract with Microsoft lacked detailed specifications on the collection and intended use of personal data via Microsoft 365. The Commission’s breaches involved inadequate data processing carried out on its behalf, including transfers of personal data. 

As a result, the EDPS has mandated that, starting 9 December 2024, the Commission must halt all data transfers from its use of Microsoft 365 to Microsoft and its related entities in non-EEA countries lacking adequacy decisions. Additionally, the Commission is required to bring its Microsoft 365-related data handling practices into line with Regulation 2018/1725, ensuring all operations are compliant by that date.

In its press release, the EDPS notes that it recognizes the importance of not hindering the Commission’s ability to carry out its tasks in the public interest or to exercise official authority vested in the Commission while providing sufficient time for it to adjust data flows and align its data processing practices with the Regulation 2018/1725 requirements.  

Background information 

In May 2021, the EDPS launched an investigation into the Commission’s usage of Microsoft 365, following the Schrems II decision. The purpose was to assess whether the Commission adhered to the EDPS’s earlier recommendations concerning Microsoft’s products and services within EU institutions. This inquiry is a component of the EDPS’s involvement in the 2022 Coordinated Enforcement Action led by the EDPB. 

If your organization has questions about international data transfers, contact us for assistance.



Thursday 29 Feb - 2:11 PM

HIPAA: Safeguarding Health Data in the Data Protection Landscape

In an era where data breaches are not just a possibility but an ever-present threat, the Health Insurance Portability and Accountability Act (HIPAA) stands as a beacon of hope and security for the healthcare industry. HIPAA is more than just a regulatory requirement.

Since its enactment in 1996, HIPAA has come to be associated with safeguarding private patient health information, adapting over the years to address the challenges and opportunities presented by the digital age.

The three rules of HIPAA are:

  • the HIPAA Privacy Rule,  
  • the HIPAA Security Rule, and 
  • the HIPAA Breach Notification Rule. 

The Privacy Rule and the Security Rule are the cornerstones of HIPAA, ensuring the confidentiality, integrity and availability of Protected Health Information (PHI), the regulation’s central concern. 

The information protected by HIPAA

The HIPAA Privacy Rule establishes national standards for the protection of PHI by covered entities and their business associates. The rule emphasizes patients’ rights over their health information, in particular the ability to see, receive a copy of, and request corrections to their medical records. Under the rule, PHI may only be used and disclosed for treatment, payment, and healthcare operations; any other use requires the patient’s consent. 

On the other hand, the Security Rule sets standards for protecting PHI that is held or transferred in electronic form. It describes technical, administrative, and physical security measures to guarantee the security of electronic PHI (ePHI). To prevent unauthorized access or breaches, these measures comprise safeguards including transmission security, integrity controls, audit controls, and access controls. 

HIPAA and digital health 

As healthcare evolves with the integration of digital health technologies, HIPAA has become more important than ever. These technologies bring new concerns for data security while also increasing accessibility and efficiency in the delivery of healthcare. HIPAA’s requirement for regular risk assessments aids in identifying weaknesses and putting in place suitable security measures. 

Achieving HIPAA compliance is not just a regulatory requirement but a commitment to patient privacy and trust. It entails an ongoing process of staff training, privacy awareness-building, and security measure evaluation and improvement inside healthcare companies. 

The Future of HIPAA Compliance 

The journey of HIPAA compliance is ongoing, with future amendments expected to address emerging technologies and privacy challenges. This dynamic framework will continue to shape healthcare data protection, addressing the need to protect patients’ rights and the privacy of their medical records in the digital age. 

If you want to learn more about navigating HIPAA, check out our whitepaper about HIPAA or contact us at 


Wednesday 21 Feb - 3:40 PM

Embracing the Google Consent Mode V2

The digital advertising landscape has seen significant changes due to the enforcement of various regulations affecting user consent and the handling, security and use of personal data. Notably, the Digital Markets Act (DMA) (March 2024) imposes new requirements on major companies like Google, Amazon, Meta and Microsoft, designating them as ‘gatekeepers’. At the same time, the legislation places a clear obligation on such gatekeepers: they are responsible for obtaining user consent for their core platform services. Consequently, Google sees a clear need to implement Google Consent Mode V2 to ensure that it adheres to privacy legislation.

Consent Mode, developed by Google, enables the transmission of consent signals from websites’ cookie banners directly to Google, ensuring that users’ consent preferences are in fact honored. In practice, this tool provides a direct line of communication between the website, where the user has expressed whether they agree to share personal data, and Google, for advertising and personalization purposes. It is an effective and efficient tool that streamlines procedures while giving users more control over their personal data. When the user opts to provide consent, Google can use these tools for detailed analytics; conversely, if the user chooses not to consent, Google restricts the use of cookies and identifiers accordingly. 

The key enhancement lies in the addition of two new consent states: ad_user_data and ad_personalization, which reflect a user’s consent status.

  • ad_user_data: indicates whether the user has consented to sharing their data with Google for advertising activities. It requires a proactive act: the user must actively agree via a consent banner interaction on the business’s website.  
  • ad_personalization: governs the use of data for personalized advertising, such as remarketing. This likewise requires a proactive choice to consent via a cookie banner interaction on the business’s website.  
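In practice, these signals are sent with the gtag consent API: a ‘default’ command sets the states before any tags fire, and an ‘update’ command is sent once the user has made a choice in the banner. The sketch below is illustrative only: the gtag bootstrap mirrors Google’s standard snippet, the consent parameter names are Google’s, but the `onBannerChoice` handler is a hypothetical stand-in for a real cookie banner integration.

```javascript
// Consent Mode works by pushing consent commands onto the gtag dataLayer.
const dataLayer = [];
function gtag() { dataLayer.push(arguments); }

// 1. Defaults: deny everything until the user interacts with the banner.
gtag('consent', 'default', {
  ad_storage: 'denied',
  analytics_storage: 'denied',
  ad_user_data: 'denied',        // new consent state in v2
  ad_personalization: 'denied',  // new consent state in v2
});

// 2. Update: called from the cookie banner once the user makes a choice
//    (hypothetical handler; wire it to your banner's buttons).
function onBannerChoice(accepted) {
  const state = accepted ? 'granted' : 'denied';
  gtag('consent', 'update', {
    ad_storage: state,
    analytics_storage: state,
    ad_user_data: state,
    ad_personalization: state,
  });
}

onBannerChoice(true); // e.g. the user clicked "Accept all"
```

Because the default is ‘denied’, no advertising cookies are set before the banner interaction; only an explicit ‘granted’ update unlocks them.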

For businesses that wish to keep using Google Ads in European Economic Area (EEA) markets, embracing or upgrading to Consent Mode v2 is simply necessary. It ensures accurate conversion tracking and efficient optimization of advertising expenditure. Failure to implement Google Consent Mode v2 by March 2024 means missing out on valuable insights: businesses would be analyzing website performance and optimizing Google Ads campaigns based on incomplete and inaccurate data, a clear problem for anyone relying on that information. Moreover, without Consent Mode v2, advertisers cannot add new EEA users to their target lists post-March 2024, which severely impacts the effectiveness of remarketing and engagement campaigns, as well as all metrics pertaining to EEA users. In short, lacking Consent Mode v2 renders measurement, reporting, audience list management, and remarketing efforts in the EEA largely ineffective. 

To conclude, to continue to enjoy the full functionality of Google Ads, it is necessary to embrace Google Consent Mode V2. For more information, do not hesitate to contact us at  

Source: Google announces Consent Mode v2 – What does it mean?  


Tuesday 13 Feb - 11:28 AM

Data Subject Access Requests (DSARs) from Employees: Failing to Respond Causes Serious Fines, Even if the Information Is Easily Accessible Online

In a recent development, the Italian DPA has taken decisive action against Autostrade per l’Italia and Amazon Italia, fining them €100,000 and €40,000 respectively for mishandling Data Subject Access Requests (DSARs) from (former) employees. Article 15 GDPR outlines the data subject’s right of access, whose pivotal role has also been acknowledged by the European Data Protection Board (EDPB) Guidelines 01/2022 on the right of access, as updated on 28 March 2023. In particular, this right allows individuals to confirm the processing of their data, access personal information, and obtain details about the processing, including:

  • purposes,
  • categories of data,
  • recipients,
  • storage period.

The EDPB emphasizes the interconnectedness of the right of access with other GDPR provisions and underscores that limitations should only be placed by other GDPR Articles (such as Articles 15(4) and 12(5), aimed at preventing infringement on others’ rights and addressing manifestly unfounded or excessive requests).

Employees’ Request for Access to Work-related Information (Autostrade per l’Italia)

Autostrade faced complaints from 50 employees. They requested access to their:

  • personal files,
  • pay slips,
  • information relating to the processing of data for the calculation of their pay slips,

without receiving any reply. Autostrade argued that it did not reply for the following reasons:

  • To safeguard its right to defend itself in several lawsuits that involved the data subjects.
  • The employees could have easily retrieved the information on an online platform.
  • Employees had been informed about their privacy rights in the privacy policy according to Article 13 GDPR.

Notwithstanding these arguments, the Italian DPA stated that Autostrade should have replied to the employees’ requests anyway and informed them of the reason for the denial of access. Failing to do so led to a fine of €100,000.

Former Employee’s Request for Access to Personnel File (Amazon Italia)

A former Amazon employee requested a copy of their personnel file. Amazon did not reply to the DSAR, arguing that it was too broad and generic. After a request for information from the Italian DPA (and, significantly, more than six months after the original request), Amazon sent the employee a copy of the personnel file. Notwithstanding this, the Italian DPA stated that Amazon:

  • should have replied to the employee within one month, as provided by the GDPR, and
  • should have informed the employee that more information was required to specify the request.

In other words, the fact that the DSAR was too broad and generic did not exclude the Company’s duty under the GDPR to answer the DSAR.


In both cases, the companies failed to provide timely and adequate responses, violating the data subjects’ right of access. Autostrade should have replied even though the information was easily retrievable elsewhere, and Amazon should have replied by asking the data subject to send a more specific request. The decision to stay silent cost the companies tens of thousands of euros in fines that could easily have been avoided.

These recent regulatory actions highlight the critical role of the right of access in ensuring individuals’ control over their personal data. The Italian DPA stresses the necessity of providing motivated responses even in denial cases, informing individuals about the right to appeal. Additionally, it emphasizes that broad and generic access requests should not excuse delayed responses; instead, companies should seek clarification promptly.

In conclusion, organizations must recognize the centrality of access rights within privacy policies, adopting a proactive approach to address requests in a timely and transparent manner. These fines underscore once again the importance of compliance and the responsibility organizations bear in upholding individuals’ privacy rights. If you would like help structuring your DSAR policy or verifying that it is up to date, please contact us; we are happy to help.


Wednesday 7 Feb - 4:03 PM

The Dutch Data Protection Authority is going to monitor cookie banners more closely

This year, the Dutch Data Protection Authority (AP) plans to increase its scrutiny of cookie consent practices to ensure compliance with regulations. Practice has shown that organizations quite often use misleading cookie banners, such as hidden rejection buttons or designs requiring the consumer to click through several steps before being able to reject cookies.

Aleid Wolfsen, chairman of the Dutch DPA, stated: “With tracking software or tracking cookies, organizations can look at your internet behavior. You can’t just do that, because what you do on the internet is very personal. An organization is only allowed to keep track of that if you explicitly agree to it. And you should have the option to refuse this tracking software, without it being detrimental to you.” This clearly highlights the importance of requesting consent in the correct way; it ensures that consumers gain, and remain in, control over what they do and do not share.

Targeted Advertising
Websites make use of functional, analytical, and tracking cookies. These cookies frequently handle personal data, such as the consumer’s website entry point, visited websites or apps, duration spent on specific pages, clicked links, and even search queries. This facilitates organizations in crafting user profiles and delivering personalized advertisements. However, when employing these practices, organizations must adhere to the GDPR and the ePrivacy Directive.

Misleading Cookie Banners
It’s essential to retain control over personal data while browsing. A critical step in this direction is offering transparent information about cookie usage, enabling consumers to make informed consent decisions. Organizations should implement legally compliant cookie banners, steering clear of deceptive practices like concealing decline options in a separate layer of the banner (which necessitates additional clicks before the option to reject is available) or automatically selecting ‘accept’ checkboxes.

Correct Cookie Banners
The AP specifies important elements of compliant cookie banners, such as furnishing purpose information, abstaining from automatic checkmarks, employing transparent language, and offering consumers a genuine, clear choice within the initial layer of the cookie banner. It emphasizes not requiring consumers to repeatedly click to decline cookies. The AP hereby appears to align with many other EU member states, which have also indicated that the consumer should be presented with both an ‘accept all’ and a ‘reject all’ option in the first layer.
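As an illustration only (the category names, purposes, and function names below are our own, not the AP’s), the criteria above can be modeled as a first-layer banner state with no pre-ticked non-essential categories and one-click “accept all” / “reject all” choices:

```javascript
// Hypothetical first-layer banner model: purposes shown up front,
// no automatic checkmarks, and symmetrical one-click choices.
const COOKIE_CATEGORIES = [
  { id: 'functional', essential: true,  purpose: 'Keep the site working (no consent needed)' },
  { id: 'analytics',  essential: false, purpose: 'Measure site usage' },
  { id: 'tracking',   essential: false, purpose: 'Personalized advertising' },
];

// Initial state: no non-essential category is pre-selected.
function initialConsent() {
  return Object.fromEntries(
    COOKIE_CATEGORIES.map(c => [c.id, c.essential]) // only essential cookies on
  );
}

// "Accept all" and "Reject all" both live in the first layer: one click each,
// so rejecting is exactly as easy as accepting.
function acceptAll() {
  return Object.fromEntries(COOKIE_CATEGORIES.map(c => [c.id, true]));
}
function rejectAll() {
  return initialConsent(); // back to essential-only
}
```

The design point is the symmetry: neither choice requires navigating to a second layer, which is what the AP and several other EU authorities object to.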

Dutch Data Protection Authority Investigation
If organizations don’t obtain appropriate consent for cookie banners and tracking software, the AP has the authority to examine and ensure adherence, potentially imposing fines.

For assistance in ensuring GDPR and ePrivacy Directive compliance with your organization’s cookie banner, please contact us at 

Source: AP pakt misleidende cookiebanners aan | Autoriteit Persoonsgegevens



Wednesday 31 Jan - 3:13 PM

Dutch Data Protection Authority Initiates European Procedure on Privacy and Personalized Ads

The Dutch Data Protection Authority (Autoriteit Persoonsgegevens or AP), in collaboration with the privacy watchdogs of Norway and Germany, is set to launch a European procedure addressing privacy concerns related to personalized advertisements. The regulators aim to present a clear stance, in conjunction with their EU counterparts, on how online platforms obtain user consent for displaying personalized ads.

Some online platforms assert that users can only continue to use their services for free if they agree to the utilization of their personal data for targeted advertising. For the AP, it is crucial that privacy protection is not exclusive to those who can afford it. Privacy is deemed a fundamental right that should be equally safeguarded for everyone.

EDPB’s Timely Decision

Within 8 weeks, the European Data Protection Board (EDPB), which includes European privacy watchdogs such as the AP, will issue a position on this matter.

Aleid Wolfsen, Chair of the AP, emphasizes the significance of maintaining control over personal data, especially in the era of extensive online tracking. He questions whether privacy is becoming a luxury reserved for the affluent and expects the EDPB’s forthcoming position to significantly impact how tech companies handle user privacy.

‘Pay or Okay’ Model

Online platforms are only allowed to display personalized ads with user consent, as per previous European rulings. However, some platforms employ the ‘pay or okay’ model, where users must pay a monthly fee if they do not agree to the use of their personal data for targeted ads.

The privacy authorities, working collectively within the EDPB, aim to swiftly determine whether the ‘pay or okay’ model complies with the General Data Protection Regulation (GDPR).

Free Consent

The GDPR mandates that companies processing personal data must have a legal basis, such as consent. Consent should be freely given without coercion, and individuals should have the option to refuse the processing of their personal data without facing adverse consequences.

The ‘pay or okay’ policy poses challenges, particularly concerning major online platforms with a large user base. Users might feel dependent on these platforms due to social connections or the presence of essential information and popular content.

Crucial questions include whether consent for data processing is given under duress, whether the pricing is fair, and whether refusal results in adverse consequences, especially for individuals with lower incomes. The lack of a uniform European approach among national regulators currently complicates the matter. DPO Consultancy will monitor developments in this regard and provide an update in due course.



Wednesday 24 Jan - 9:40 AM

Amazon France fined €32 million for unlawful employee monitoring


On 23 January 2024, the French Data Protection Authority (“CNIL”) published its decision, which was issued on 27 December 2023, regarding the fine it imposed upon Amazon France for numerous violations of the General Data Protection Regulation (“GDPR”) following an investigation. The fine imposed amounts to €32 million.


The CNIL investigated Amazon France after press articles were published on the practices implemented by Amazon France and after receiving numerous complaints from employees.

The Amazon warehouses, which are located in France, are managed by Amazon France. As part of this, each warehouse employee is equipped with a scanner, which documents the execution of certain tasks. Every time an employee makes use of the scan function, it results in the recording of data that can be used to calculate values relating to the quality, productivity, and periods of inactivity of each employee.

Findings of the CNIL

The data collected from employees was collected in real time, and all reported data was kept for 31 days. The smallest details of an employee’s productivity were available to supervisors. The CNIL determined that such detailed retention was not necessary: supervisors could rely on data reported in real time to identify difficulties encountered by employees, and a selection of aggregated data, already collected for other purposes, should be sufficient. Thus, Amazon France was found to be in violation of Article 5(1)(c) GDPR, as it did not process personal data in a manner that is adequate, relevant and limited to what is necessary in relation to the purposes of the processing.

Regarding the monitoring of employees, the CNIL indicated that the three methods of monitoring – when an employee scans an item too quickly, the idle time indicator indicating interruptions of 10 minutes or more and latency times of less than 10 minutes – cannot be based upon legitimate interest as the methods are excessively intrusive. Therefore, in the absence of a lawful basis, Amazon France was found to have violated Article 6 GDPR.

Numerous employees were contracted on a temporary basis, and Amazon France’s confidentiality and privacy policy was not provided to them before the collection of their personal data. The CNIL held that the information provided to temporary workers on the company’s intranet was insufficient, because temporary workers were not asked to read it and the intranet was not an appropriate channel for informing workers who had no access to an office computer during working hours.

Lastly, the information about the video surveillance was not properly communicated to employees and external visitors, and access to the video surveillance was found to be insufficiently secured: the access password was not robust and access accounts were shared between multiple employees. Given the characteristics of the processing and the risks involved, the CNIL found that Amazon France was in violation of Article 32 GDPR for failing to guarantee a level of security appropriate to the risk of the processing.

To ensure that your organization does not receive a fine similar to the one discussed, contact us, the Experts in Data Privacy at, for further assistance.




Wednesday 17 Jan - 1:08 PM

Importance of DPIAs: Dutch Authority Imposes Fine on ICS for Compliance Lapse

The Dutch Data Protection Authority (AP) has fined International Card Services B.V. (ICS) 150,000 euros for not conducting a required Data Protection Impact Assessment (DPIA), as mandated by the General Data Protection Regulation (GDPR). DPIAs are crucial for organizations to systematically identify and mitigate privacy risks associated with processing personal data.

DPIAs are essential for legal compliance, ensuring organizations meet GDPR requirements. They offer a proactive approach to privacy management, allowing organizations to identify, evaluate, and address potential risks before they escalate. This not only helps in preventing costly data breaches but also showcases an organization’s commitment to transparency, accountability, and responsible data processing.

Here are several reasons why conducting DPIAs is so important:

  1. Risk Identification and Mitigation:
    • DPIAs help organizations systematically identify and evaluate the risks that may arise from their data processing activities. This includes assessing the likelihood and severity of potential negative impacts on individuals’ privacy.
    • By recognizing and understanding these risks, organizations can implement appropriate measures to mitigate them, reducing the likelihood of data breaches, identity theft, or other privacy-related issues.
  2. Legal Compliance:
    • GDPR requires organizations to conduct DPIAs for processing activities that are likely to result in high risks to individuals’ rights and freedoms. Failing to conduct a DPIA when required can lead to legal consequences, including fines and penalties.
  3. Transparency and Accountability:
    • Performing DPIAs demonstrates an organization’s commitment to transparency and accountability in its data processing practices. It shows that the organization is proactively assessing and addressing potential privacy risks, promoting a culture of responsibility.
  4. Building Trust with Stakeholders:
    • Individuals are becoming increasingly aware of the importance of privacy, and they expect organizations to handle their personal data responsibly. Conducting DPIAs helps build trust with customers, employees, and other stakeholders by showcasing a commitment to protecting their privacy.
  5. Proactive Privacy Management:
    • DPIAs are a proactive tool for privacy management. Instead of reacting to privacy issues after they occur, organizations can anticipate and address potential problems in advance, minimizing the negative impact on individuals and the organization’s reputation.
  6. Avoiding Costly Data Breaches:
    • Identifying and mitigating risks through DPIAs can help organizations avoid costly data breaches. The financial and reputational damage caused by a data breach can be significant, and DPIAs serve as a preventive measure to minimize such risks.
  7. Continuous Improvement:
    • DPIAs are not a one-time activity; they contribute to a continuous improvement cycle. Organizations should regularly revisit and update DPIAs to account for changes in processing activities, technology, or regulations, ensuring ongoing compliance and effectiveness.

In summary, conducting DPIAs is a proactive and essential practice for organizations to meet legal requirements, protect individuals’ privacy, build trust, and avoid the negative consequences associated with privacy breaches. It is a foundational element of a robust privacy management framework in today’s data-driven landscape.

If your organization needs to conduct any DPIAs, or if you have any questions on this topic, contact us, the Experts in Data Privacy at for assistance.



Thursday 11 Jan - 3:10 PM

Data Controller Liability and its Limits according to the CJEU

In December 2023, the Court of Justice of the European Union (CJEU) ruled on the question of a data controller’s liability for processing activities carried out by its processor. In case C-683/21, the Court stated that this liability has limits: the controller-processor relationship is not by itself sufficient to hold the controller liable if:

  • the processor processes personal data for its own purposes
  • the processor acts in a manner that is incompatible with the arrangements set by the controller
  • it is reasonable to conclude that the controller didn’t agree to the processing

Case C-683/21 Background

During the outbreak of COVID-19, the National Public Health Centre of the Lithuanian Ministry of Health (NVSC) commissioned an app from UAB “IT sprendimai sėkmei” (UAB), an IT service provider. The app was made available on Google Play, and its privacy policy referenced both the NVSC and the service provider as controllers. However, the NVSC and the service provider had never concluded or signed any contract, and the NVSC eventually terminated the procurement of the app due to a lack of funds.

As a consequence of these events, the Lithuanian Data Protection Authority imposed administrative fines on the NVSC and UAB as joint controllers. On the one hand, the NVSC challenged the decision on these grounds:

  • it was not a controller for the processing in question
  • the service provider built the app, although there was no contract between the parties
  • it had not consented to, or authorized, making the app available to the public.

On the other hand, UAB claimed it was merely a processor.

CJEU decision

Firstly, the Court reaffirmed the broad scope of controllership:

  • Facts over formalities: someone can be a controller even without a specific contract. On the other hand, the fact that a person is referenced as a controller in a privacy notice is not in itself sufficient to make that person a controller, unless that person has consented — explicitly or implicitly — to this.
  • In this case, the NVSC commissioned the app for its own objectives (COVID-19 management). In doing so, the NVSC had foreseen the data processing that would be carried out and had participated in determining the parameters of that app. Therefore, NVSC should be regarded as a controller.
  • The fact the NVSC did not acquire the app and did not authorize its dissemination to the public is not relevant. It would have been different, however, if the NVSC expressly forbade UAB to make the app available to the public.

Secondly, the Court reaffirmed that joint controllership does not imply equal responsibility; the level of responsibility depends on the circumstances of the case. Moreover, formal arrangements between the parties are not necessary: parties can be joint controllers nevertheless. In other words, such an arrangement is a consequence of the parties being joint controllers, not a precondition for the existence of joint control.

Controller Wrongful Behaviour and Liability of a Controller for Acts of the Processor

In conclusion, the Court ruled that:

  • according to Article 83 GDPR, a controller can receive an administrative fine only for an intentional or negligent infringement of the GDPR (i.e. wrongful behaviour); at the same time, the Court confirmed that the fined controller need not have been aware that its conduct infringed the GDPR
  • according to Recital 74 GDPR, the controller is responsible for processing carried out on its behalf by the processor
  • the controller would not be liable in the following situations, because in these cases the processor itself becomes a controller (Article 28(10) GDPR):
    • where the processor has acted for its own purposes
    • where the processor has processed data in a manner that is incompatible with the arrangements for the processing set by the controller
    • where it cannot be reasonably considered that the controller consented to such processing



Thursday 4 Jan - 3:56 PM

The need for a harmonious approach: rejecting cookies should be as easy as accepting them

Numerous websites utilize cookies, which are generally divided into ‘essential’ cookies (needed for a functioning website) and ‘non-essential’ cookies (for example, cookies that store user preferences or other information). The regulation of cookies falls under the ePrivacy Directive, which is transposed into the national laws of EU Member States. The ePrivacy Directive mandates that websites offer transparent information about cookie usage and seek consent before placing non-essential cookies.

Currently, there are different approaches within the European Union (EU) with regards to accepting and denying consent for the placement of cookies. The first approach includes both “Accept All” and “Reject All” options displayed in the initial layer of a cookie consent management solution. The second approach features only the “Accept All” option in the first layer, accompanied by a link to the second layer of the cookie consent management solution where the visitor can reject the use of non-essential cookies.[1]

The prevailing approach seems to lean towards the first method: both an “accept all” and a “reject all” option in the first layer. Specifically, the Belgian, Austrian and Spanish data protection authorities are in favor of presenting both options in the first layer, and the French data protection authority has even enforced this.[2]  The German[3] data protection authorities do not require a “reject all” option when, for example, the consent option is also not displayed in the first layer. The Irish[4] data protection authority indicates that it is sufficient for there to be a consent button in the first layer and a link to further, more detailed information in the second layer.

Those who view the first approach as the GDPR-compliant one regard having to click several times to reject non-essential cookies as a harmful nudge: it reduces control over personal data, discourages visitors from refusing consent, and steers them toward consenting because that is the easier path. Although the GDPR and the ePrivacy Directive do not explicitly dictate that rejecting consent must be as easy as giving it, it is argued that this follows implicitly from the GDPR, which requires consent to be freely given, informed, specific and unambiguous.[5] The data protection authorities in favor of the first approach question whether requiring multiple clicks to reject consent aligns with these requirements, casting doubt on whether consent obtained this way is valid: is consent truly freely given when nudging techniques steer the consumer in a certain direction? Additionally, the GDPR includes a fairness principle; the same authorities consider that offering an equally easy way to reject cookies as to accept them is what fairness requires.

This is also a hot topic within the Netherlands. Minister Alexandra van Huffelen of Digital Affairs has written a letter to the Second Chamber, informing it that websites often use dark patterns or nudging techniques to influence the visitor’s choice, citing as an example the difficulty of rejecting non-essential cookies.[6]

The Dutch Authority for Consumers and Markets (ACM) supervises the Dutch Telecommunications Act (the implementation of the ePrivacy Directive), and the Dutch Data Protection Authority (DPA) supervises the GDPR; both can regulate such matters. The CJEU has already ruled that consent cannot be given by a pre-ticked box, rendering such consent invalid.[7] Under the European Data Protection Board (EDPB), a Cookie Banner Task Force has been established; in its January 2023 report, it discusses how different practices, such as dark patterns, relate to the ePrivacy Directive and the GDPR.[8] The EDPB has also issued guidelines on dark patterns on social media, which are potentially relevant here as well.[9]

Evidently, efforts are being made at the European level to provide more clarity on this issue. What is clear is that the majority of the Member States find that offering both an “accept all cookies” and a “reject non-essential cookies” option in the first layer of the cookie banner is the GDPR-compliant route. It is important that the approach to rejecting or accepting consent is streamlined and harmonized across the European Union, thereby enhancing consumer protection and control over personal data.

[1] “Reject All” button in cookie consent banners – An update from the UK and the EU | Technology Law Dispatch

[2] Belgium Checklist on Cookies; Austrian Guidance on Cookies; Spanish Guidance on Cookies; CNIL on refusal and acceptance of cookies

[3] German Guidance on Cookies

[4] Irish Guidance Note: Cookies and other tracking technologies

[5] Article 4(11) GDPR.

[6] Letter from Minister Alexandra van Huffelen of Digital Affairs to the Second Chamber concerning cookies

[7] CJEU 1 October 2019, C-673/17, ECLI:EU:C:2019:801

[8] Report cookie banner taskforce | European Data Protection Board

[9] Guidelines 03/2022 on deceptive design patterns in social media platform interfaces: how to recognize and avoid them | European Data Protection Board


Tuesday 2 Jan - 10:13 AM

Landmark CJEU Ruling: SCHUFA Case Redefines Automated Decision-Making Responsibilities

The recent judgment (C-634/21) by the Court of Justice of the European Union (CJEU) in the SCHUFA case has significant ramifications, particularly for entities engaged in automated decision-making processes. The case specifically addresses credit reference agencies, establishing that when creating credit repayment probability scores, these agencies are involved in automated individual decision-making. This responsibility extends to both the credit reference agency and the lenders relying on these scores, placing them under the purview of Article 22 of the GDPR.


Article 22 of the GDPR restricts automated individual decision-making, allowing it only under specific circumstances such as contractual necessity, legal justifications, or explicit consent. The CJEU’s ruling emphasizes the need for robust safeguards when engaging in such automated decision-making processes, including the provision for human intervention, the right to express views, and mechanisms to challenge decisions. While the decision pertains to credit reference agencies, its broader implications are felt across industries employing predictive AI tools and automated decision-making services.

On the same day, the CJEU also addressed insolvency data retention in Cases C-26/22 and C-64/22. The cases involved individuals (UF and AB) who had undergone insolvency proceedings in Germany and sought the deletion of their data retained by SCHUFA, a credit reference agency. The CJEU ruled on the duration of data retention, emphasizing that German law, which allows public information on insolvencies to be published for six months, should take precedence over private-sector interests.

Moreover, the decision underscored the right to erasure under Article 17 of the GDPR, asserting that it applies when personal data has been unlawfully processed. The right to object under Article 21 was reiterated, with the CJEU emphasizing the need for controllers to cease processing personal data if a data subject objects, unless compelling legitimate grounds override the data subject’s interests.

Importantly, the CJEU affirmed individuals’ right to a full judicial review of decisions made by Data Protection Authorities (DPAs). This rejects a more limited interpretation and ensures that data subjects have a comprehensive mechanism to challenge decisions related to their data.

In conclusion, these CJEU rulings accentuate the critical importance of transparency, accountability, and robust data protection measures in the evolving landscape of automated decision-making and data retention. Organizations are urged to align their practices with these legal developments to ensure compliance and safeguard individuals’ rights in the digital age.

Does your organization have questions about automated decision-making? Contact us, the Experts in Data Privacy, for assistance.



Thursday 21 Dec - 10:47 AM

Fear over possible misuse of personal data resulting from a data breach regarded as non-material damage

The Court of Justice of the European Union (“CJEU”) has ruled that the fear of a data subject over the possible misuse of their personal data from a data breach is regarded as non-material damage and can lead to financial compensation from the data controller.


The Bulgarian Tax Authority – the data controller – suffered a data breach and as a result, more than 6 million data subjects’ personal data was leaked online. The complainant was one of the 6 million data subjects affected.

The complainant instituted legal proceedings against the data controller under Article 82 General Data Protection Regulation (“GDPR”). Part of her claim included approximately €510 as compensation for the non-material damage resulting from the data breach. She argued that the data controller had caused the damage due to their failure to implement adequate technical and organizational measures in breach of Articles 5, 24 and 32 GDPR. Her non-material damage was the fear that her personal data might be misused in the future and that she could be threatened as a consequence.

Her claims were dismissed: the court held that the data controller had not caused the data breach, as it was caused by the actions of third parties, and that the complainant had failed to prove that the data controller had failed to implement security measures. Furthermore, the court was of the opinion that the complainant had not suffered any non-material damage, as her fear was only hypothetical.

The complainant appealed this decision, and various questions were referred to the CJEU by the appellate court.


The CJEU held that the burden of proving that the technical and organizational measures are appropriate lies with the data controller, which paves the way for the complainant to claim damages for the data breach.

The fact that a third party breached the data controller's systems does not automatically mean that the controller's technical and organizational measures were inadequate. The CJEU further held that Articles 24 and 32 GDPR merely require the data controller to implement technical and organizational measures in order to avoid any personal data breach, if at all possible. It cannot be inferred from the language of the GDPR that a breach alone is sufficient to conclude that the measures were not appropriate, without allowing the data controller to argue otherwise.

The appropriateness of the technical and organizational measures, and any expert reports regarding them, must be assessed by the national courts. This assessment has two stages: first, the court must identify the risks of a breach and the potential consequences of those risks; second, the court must determine whether the data controller's technical and organizational measures are appropriate to those risks. The substance of the measures must be examined in light of the criteria set out in Article 32 GDPR, and the investigation must not be confined to how the data controller aimed to comply with that provision.

The interpretation of non-material damage and compensation relied upon by the CJEU is supported by the Österreichische Post AG case, in which the CJEU stated that the concept of damage has to be interpreted broadly. The national courts must ensure that the fear over the misuse of personal data is not unfounded and that it relates to the specific circumstances of the data subject at issue.

Does your organization have questions about appropriate technical and organizational measures? Contact us, the Experts in Data Privacy, for assistance.






Tuesday 19 Dec - 11:57 AM

EDPB’s Urgent Binding Decision Regarding META

The Irish Data Protection Authority has adopted its final decision implementing the urgent binding decision of 27 October 2023, which concerns the processing of personal data for the purpose of behavioral advertising on the legal bases of contract and legitimate interest within the European Economic Area (“EEA”).

The decision prohibits the processing of personal data for the purpose of behavioral advertising on the legal bases of contract and legitimate interest. The decision followed a request from the Norwegian Data Protection Authority for the European Data Protection Board (“EDPB”) to order measures concerning the entire EEA.

The chair of the EDPB explained that the decision directs the Irish DPA to prohibit the processing throughout the entire EEA. The chair also highlighted that, based on the December 2022 decision, the legal basis of contract cannot justify the processing carried out by Meta for behavioral advertising. It was also recalled that Meta had not demonstrated compliance with that decision, which raises the question of the applicability of the urgency procedure under Article 66 GDPR.

In the middle of this year, the Norwegian DPA, acting under Article 66 GDPR, temporarily prohibited the processing of Norwegian data subjects' personal data by Facebook Norway for behavioral advertising on the bases of contract and legitimate interest. This ban was limited in both duration and geographic scope: it lasted three months and applied only in Norway. Following this, in September, the Norwegian DPA requested the EDPB to issue an urgent binding decision on the adoption of final measures applicable to the entire EEA.

Given the continuing concerns regarding the infringement of the GDPR, the EDPB concluded that there is an urgent need to act, since the rights and freedoms of data subjects are at risk. After examining the evidence provided, the EDPB established that there is an ongoing infringement of Article 6(1) GDPR, highlighting the inappropriate use of the legal bases of contract and legitimate interest for behavioral advertising by Meta Ireland.

Since this is a matter of urgency, the EDPB concluded that the regular cooperation mechanisms cannot be applied and that final measures therefore need to be ordered urgently, given the possible risks and irreparable harm.

In addition, the Irish DPA did not respond to the Norwegian DPA's request for mutual assistance within the timeframe set by the GDPR. Article 61(8) GDPR establishes a presumption of urgency in such cases, underlining the need to derogate from the regular cooperation mechanisms.

The EDPB ordered the Irish DPA to impose the final measures. In conclusion, the EDPB found that imposing a ban on Meta Ireland's processing of personal data for behavioral advertising on the bases of contract and legitimate interest is proportionate, necessary and appropriate.

This urgent binding decision was addressed to the Irish DPA, the Norwegian DPA and the other concerned DPAs. 

Do you have any questions about developments within privacy and data protection? Contact us, the Experts in Data Privacy, for more information.


Tuesday 12 Dec - 3:22 PM

How much can a DPA fine a Controller who is Part of a Group of Companies?

The Court of Justice of the European Union (CJEU) has clarified the conditions under which a national supervisory authority (DPA) may fine one or more controllers for violations of the GDPR. In particular, it points out that the imposition of such fines requires wrongful conduct: the violation must have been committed intentionally or negligently. Furthermore, if the recipient of the fine is part of a corporate group, the calculation of the fine must be based on the turnover of the group as a whole.

Judgments of the Court in Cases C-683/21 | Nacionalinis visuomenės sveikatos centras and C-807/21 | Deutsche Wohnen

A Lithuanian court and a German court asked the CJEU for guidance on the possibility for national DPAs to impose fines on data controllers that infringe the Regulation's provisions.

In the Lithuanian case, the National Public Health Centre under the Ministry of Health contested a fine of € 12 000 imposed on it in the context of the creation, with the assistance of a private undertaking, of a mobile application for registering and monitoring the data of persons exposed to Covid-19. 

In the German case, a real estate company and its group of undertakings (which held approximately 166,000 buildings) contested a fine of over € 14 million. The fine was imposed because the company stored the personal data of tenants for longer than necessary. 

The CJEU decision 

The Court holds that a data controller should not receive an administrative fine unless it is proved that the infringement was committed wrongfully, that is to say, intentionally or negligently. According to the CJEU, this happens when the controller could not have been unaware of the infringing nature of its conduct, regardless of whether or not it was aware of the infringement. In particular, it is not necessary that: 

  • the infringement was committed by the controller’s management body 
  • the management body was aware of the infringement 

On the contrary, if the data controller is a legal person, it is liable for infringements committed by: 

  • its representatives, directors, or managers 
  • any other person acting in the course of the business of that legal person and on its behalf (it is irrelevant if the infringement was committed by an identified natural person) 
  • a processor performing data processing activities on its behalf 

Finally, the Court clarifies that Joint-Controllership arises solely from the fact that those entities have participated in the determination of the purposes and means of processing. There is then no need for a formal arrangement between the entities in question. A common decision, or converging decisions, are sufficient. However, when there is a Joint Controllership, the parties involved must determine their respective responsibilities by means of an arrangement between them. 

Final observation 

Regarding the calculation of the fine where the addressee is, or forms part of, an undertaking, the Court states that the competent DPA must:

  • take into account the concept of an “undertaking” under competition law
  • calculate the fine on the basis of a percentage of the total worldwide annual turnover of the undertaking concerned, taken as a whole, in the preceding business year.



Thursday 30 Nov - 8:25 AM

Suffer a ransomware attack and a fine by the competent DPA? Sometimes it happens!

Ransomware is malicious software designed to block access to a computer system until a sum of money is paid. However, some forms of ransomware penetrate deeper into the internal resources of the targeted entity and may allow the attackers to:

  • steal sensitive data, or
  • view sensitive information before encrypting it selectively.

These circumstances imply unlawful access to personal data – in other words, a data breach. Depending on the circumstances, the data breach may need to be reported to the competent DPA and/or to the data subjects involved. Notwithstanding the fact that the entity that suffered the ransomware attack is itself a victim, the DPA can sometimes decide to further punish the entity with a fine. This is what the Italian DPA did regarding a local health board.

The Lack of Privacy by Design measures that led to the Data Breach 

The ransomware attack consisted of a virus that blocked access to the board's database, combined with a ransom request to restore the database functions. In particular, the attack jeopardized the personal data of almost a million data subjects (842,118). After suffering the attack, the local health board diligently reported the data breach to the DPA, which immediately launched a full-scale investigation. The investigation detected the following critical issues:

  • lack of privacy by design measures (i.e. failure to implement adequate measures to ensure the security of the internal networks, both in relation to their segmentation and segregation)
  • the absence of a detailed Data Breach Policy
  • poor technical and organizational security measures in place (i.e. the local health board's VPN authentication procedure involved only single-factor authentication by username and password)

In particular, the lack of segmentation of the internal networks allowed the virus to spread into the whole network from the first point of entrance.

After the Ransomware and the Implementation of New Security Measures… the Fine!

After the breach, the local health board, among others:

  • acted to mitigate the damages suffered by the data subjects
  • implemented new technical and organizational security measures (i.e.: a new VPN authentication procedure accessible through username and password)

Notwithstanding all these new measures, the Italian DPA issued a fine of €30,000 on account of the previous extensive lack of security measures that led to the data breach.

In conclusion, implementing the correct technical and organizational security measures and a detailed Data Breach Policy is important both to prevent data breaches and to avoid DPA fines! If you want to learn more about these and other measures to protect personal data, please feel free to contact us for further information.


Tuesday 21 Nov - 1:32 PM

Fine of 30,000 Euros for Municipality of Voorschoten

The Dutch Data Protection Authority (Autoriteit Persoonsgegevens – AP) has imposed a fine of 30,000 euros on the municipality of Voorschoten. The reason behind this penalty is the municipality’s prolonged retention of information about waste from individual households, along with inadequate communication to residents.

In 2018 and 2019, the municipality of Voorschoten replaced household bins and underground containers for apartments. These containers and tokens for the underground containers contain a chip with a unique number linked to a residential address. The aim is to encourage more segregated waste disposal by limiting the amount of residual waste residents can dispose of.

Excessive Retention of “Dumping Data”

If a household puts out a bin more frequently than once a week, the garbage truck refuses to empty it. Similarly, access to an underground container is blocked for the rest of the day after 5 bags of residual waste have been deposited in a single day. These systems require access to the “dumping data” of the respective household for a certain period.

While this is part of the municipality's public task, the issue arose because the municipality retained the data for too long. The data from bins was stored for as long as the bins were in use, and data from tokens was kept for 5 years. This is far longer than necessary to check whether a household exceeded the allowed amount.

Inadequate Information to Residents

Furthermore, the municipality did not properly inform its residents. Although the municipality sent letters about the new containers and tokens, they were not sufficiently clear about the use of personal data in waste collection.

Cessation of Violations

Both violations have now been addressed. The municipality has reduced the retention period to 14 days, and residents have received a new letter, which the municipality first submitted to the AP.

Does your organization have any questions about data privacy? Contact us, the Experts in Data Privacy, for assistance; we are happy to help you!



Tuesday 14 Nov - 8:42 AM

Employee Geolocation and Video Surveillance: Is it Always Illicit?

Illicit employee geolocation and video surveillance: a French tale from the CNIL

From September to October 2023, the French DPA (CNIL) dealt with several cases of illicit employee geolocation and video surveillance. Overall, the CNIL fined both private and public-sector entities for a total amount of 97,000 euros. In particular, the fined entities contravened:

  • The data minimization principle because of geolocation and continuous video surveillance of employees;
  • The duty to inform employees about the processing carried out and its purposes;
  • The obligation to respect the rights of individuals, and in particular to respond to a request for objection

Employee geolocation and video surveillance are not always illicit

When it comes to employee surveillance, it is important to consider the imbalance of power between the employer and the employees. In particular, Article 88 GDPR encourages Member States to:

  • Protect the rights and freedom of the employees when their personal data are processed by the employer
  • Safeguard the employees’ human dignity, legitimate interests, and fundamental rights when monitoring systems are deployed at the workplace

Employee geolocation

Regarding the monitoring of the employees’ geolocation, the main issues addressed by the CNIL were the following:

  • the continuous recording of geolocation data
  • the impossibility for employees to stop or suspend the tracking system during break times

However, the French DPA also stated that these activities are not always illicit and subject to sanctions. The CNIL reaffirmed the principle that, where there is a special justification, such activities can be carried out without excessively infringing employees' freedom and their right to privacy.

Employee video surveillance

About the video surveillance of employees, the main issue was the implementation of video surveillance systems that constantly film employees at their workstations, without any particular reason. In deciding against the employers, the French DPA stated that the prevention of accidents in the workplace and the gathering of evidence do not per se justify the surveillance measures.

However, the DPA conclusions do not exclude employee surveillance measures, as long as:

  • the employer sets out a clear and reasonable purpose for implementing the measure
  • the measures are not disproportionate to the aims pursued

What to do to implement employee geolocation and video surveillance

An analysis of the CNIL's decisions shows that there is room for employers to implement employee geolocation and video surveillance. Companies may need these measures for very different reasons:

  • to prevent misconduct or improve the company's security
  • to prevent accidents and protect employees' safety
  • to optimize resources and reduce costs

However, choosing a legitimate purpose and proportionate measures is not sufficient: the processing must also be properly assessed and documented, for instance through a (data protection) impact assessment.

Does your organization have any questions about employee surveillance, or do you already have such measures in place but have not conducted PIAs or DPIAs? Contact us, the Experts in Data Privacy, for assistance; we are happy to help you!


Wednesday 8 Nov - 8:00 AM

GDPR certification is no longer unimaginable!

General Data Protection Regulation (GDPR) certification is in the process of being established. The Dutch Supervisory Authority (DPA) has recently approved criteria that, when met, enable institutions and organizations to issue official GDPR certificates. As such, official GDPR certificates are no longer unimaginable! The approval from the Dutch Supervisory Authority allows the company Brand Compliance to move forward with its application to the Council for Accreditation (RvA). Upon accreditation from the RvA, Brand Compliance will be officially authorized to issue GDPR certificates.

The GDPR certificate is a relatively recent tool in the realm of personal data protection oversight. It serves as documented confirmation that a product, process or service's handling of personal data complies with specific requirements outlined in the GDPR. With a GDPR certificate, organizations can demonstrate to their intended audience that they handle and safeguard personal data with care. It is important to note that obtaining a GDPR certificate is not mandatory.

When it comes to accreditation, certification bodies must adhere to legal requirements. Institutions seeking the authority to issue GDPR certificates must undergo an assessment to evaluate their compliance with these requirements, a process officially known as accreditation. After an initial assessment by the RvA, the DPA reviews whether the criteria align with the relevant requirements of the GDPR. Once the DPA has approved the criteria, the RvA is responsible for accrediting the institution; the DPA itself does not grant accreditation to certification bodies. As such, if you wish to become a certification body authorized to issue GDPR certificates, you must submit an application for accreditation to the RvA.

The GDPR certificate is a very welcome addition to the privacy sphere. Though not mandatory, it is strongly advised: it establishes greater trust between the organization and its clients and demonstrates that the organization considers privacy a top priority.

AVG-certificering komt stap dichterbij | Autoriteit Persoonsgegevens 



Thursday 2 Nov - 10:55 AM

Ban on Personalized Advertising on Facebook and Instagram

Meta must cease the unlawful provision of personalized advertisements on Facebook and Instagram within Europe. This determination has been made by European privacy regulators, including the Dutch Data Protection Authority (Autoriteit Persoonsgegevens or AP), through the European Data Protection Board (EDPB).

Aleid Wolfsen, the chair of the AP and vice-chair of the EDPB, stated: “Meta tracks what you post, click on, or like on Facebook and Instagram and uses that information for offering personalized ads. Unlawfully processing the personal information of millions of people on Facebook is a revenue model for Meta. By putting an end to this, people’s privacy is better protected.”

Processing Ban for Meta

The ban on offering personalized ads is a consequence of an expedited procedure initiated within the EDPB with support from the Norwegian data protection authority and the AP. Following Norway’s prior findings, this ban is now applicable throughout Europe.

The EDPB has directed the Irish data protection authority, the lead supervisory body, to take definitive actions against Meta Ireland within a two-week period. This directive stems from the EDPB’s view that the Irish authority has not acted promptly enough.

No Valid Legal Basis

The EDPB has found that Meta engages in the unlawful processing of personal data from Facebook and Instagram users, including details like users’ locations, ages, education, and their online activity. Meta builds user profiles from this information, which advertisers find valuable, contributing to Meta’s revenue.

According to the EDPB, Meta lacks a valid legal basis for handling this personal data. The benefits claimed by Meta do not provide sufficient justification for processing users’ personal data, and users are often unaware that they are essentially trading their personal data for personalized ads when engaging with Meta.

Does your organization have any questions about privacy and advertising? Contact us, the Experts in Data Privacy, for assistance.




Tuesday 31 Oct - 8:49 AM

Dutch Data Protection Authority to receive more funding for cookies and online tracking


The Autoriteit Persoonsgegevens (“AP”), also known as the Dutch Data Protection Authority, will receive €500,000 extra funding in the coming years that must be used specifically to monitor cookies and online tracking.

In a letter to the House of Representatives, the State Secretary wrote that the Dutch Data Protection Authority will be awarded €500,000 per year commencing in 2024, with the funding also provided for 2025 and 2026.

This budget is in addition to the extra budget promised to the Dutch Data Protection Authority on Prinsjesdag (Budget Day) in September this year. While the Dutch Data Protection Authority already supervises cookies and online tracking from a General Data Protection Regulation (“GDPR”) perspective, the Dutch government has long believed that the independent regulator should pay more attention to them.

The extra funding is valid until 2027; thereafter, the Dutch Data Protection Authority will receive a structural budget of €350,000 intended for publishing guidance and developing tools to make investigations easier. In doing so, the Dutch Data Protection Authority will engage and cooperate with other regulators, such as the Autoriteit Consument en Markt.

This is the second time that funding has been made available to the Dutch Data Protection Authority for an explicit task. The first time concerned the supervision of algorithms.  

Considering that cookies will receive more attention from Supervisory Authorities and the European Data Protection Board (“EDPB”), it is imperative that organizations using cookies ensure that this is done in a compliant manner.

Does your organization have any questions about cookies? Contact us, the Experts in Data Privacy, for assistance.




Thursday 26 Oct - 12:59 PM

Notes of Caution on DPF certified companies

What does it mean that a company is Data Privacy Framework certified?

In July 2023, the European Commission adopted a decision enacting the EU-US Data Privacy Framework (DPF). Following this decision, US companies can become DPF-certified. Certification is voluntary, and a company may well decide not to be certified. Each company will decide whether to become DPF-certified depending on its:

  • type of business operations
  • risk appetite
  • global privacy program

It is important to remember that, in a data processing activity involving an international transfer of personal data, a DPF-certified company may well be:

  • a data controller
  • a joint-controller
  • a data processor

Regardless of the role, after the full entry into force of the Data Privacy Framework, DPF-certified companies are required to provide an adequate level of protection for personal data received from (or sent to) the EEA in accordance with the GDPR provisions. Therefore, the key advantage is that the legal basis for the international transfer is equivalent to an adequacy decision.

However, being DPF-certified is not a free-for-all situation for international data transfers concerning the US because:

  • the certification might not cover all the products and/or services of the certified company
  • a Data Processing Agreement (DPA) is still required according to Article 28 GDPR.

What does it mean that a company is not DPF-certified?

It is important to remember that there is no adequacy decision in place for the US as a country but only for US companies that are DPF-certified. This means that if a company is not DPF-certified the legal basis for the international data transfer cannot be an adequacy decision, but instead:

  • Standard Contractual Clauses (SCCs) – Article 46 GDPR
  • Binding Corporate Rules, if applicable – Article 47 GDPR
  • Derogations for specific situations – Article 49 GDPR

Moreover, a Transfer Impact Assessment (TIA) would be required.

Final considerations and two notes of caution

First consideration: the Data Privacy Framework rests on Executive Order 14086 of 7 October 2022 (EO), which was specifically designed to meet EU standards of proportionality, necessity, and redress. An Executive Order is a rule issued by the President of the United States, which means that a future president may decide to revoke the order, thus putting the DPF in jeopardy. Although this is a remote possibility, it is not a zero-chance scenario.

Second consideration: it is not unlikely that the DPF will be declared invalid by the Court of Justice of the European Union (CJEU), as has already happened with its predecessors, the Privacy Shield and the Safe Harbor. Cases have already been brought before the CJEU, and a decision on the DPF's validity (potentially a “Schrems III”) may follow.

Due to this high level of uncertainty, it is advisable not to rely solely on the DPF as a lawful transfer mechanism and to seek professional advice to implement the best strategy for your international data transfers with the US. Does your organization deal with US companies or have any questions about the DPF? Contact us, the Experts in Data Privacy, for assistance.


Tuesday 24 Oct - 1:12 PM

Saudi Arabia: Personal Data Protection Law

The Regulations of the Personal Data Protection Law, namely the Implementing Regulations and the Regulations on Personal Data Transfer outside the Geographical Boundaries of the Kingdom of Saudi Arabia (KSA), were made available on 7 September 2023. Collectively, they are referred to as ‘the Regulations’. Enforcement of the law will start from 14 September 2024 onwards. As such, organizations and businesses that fall under the Kingdom’s PDPL (Personal Data Protection Law) have a one-year timeframe to align their processing activities with the requirements of the PDPL and its implementing regulations to ensure compliance.

The Regulations represent a notable advancement in the Kingdom’s data protection framework, as they provide valuable elucidation and essential particulars that complement the KSA PDPL. The clauses and stipulations within the legislation show a strong convergence not only with other Middle Eastern nations but also with international data protection standards, notably the EU GDPR. In particular, the PDPL governs the handling of an individual’s personal data within the KSA, even when such processing is conducted by entities located outside the country. The primary objective of the PDPL is to safeguard the privacy of personal data, oversee data sharing practices, and thwart any misuse of personal information. The Implementing Regulations serve the purpose of further clarifying and detailing the practical application of the PDPL, thereby complementing it.

Implementing Regulations

The Implementing Regulations clarify key elements, such as information on data subject rights, the specifications for the register of processing activities, the obligations of the involved parties in case of a data breach – when to report to the competent authority and to data subjects – and the information such a notification should contain. Notably, controllers must undertake a data protection impact assessment in nine different situations to assess and address the associated risks. Other core elements of the PDPL are purpose limitation and data minimization, conditions for when to appoint a data protection officer, rules concerning the disclosure of personal data to others (third parties, such as the authorities), the obligations and responsibilities of controllers, processors, and sub-processors, and material requirements for the agreements to be concluded between the parties (data processing agreements). As demonstrated, it is a rather comprehensive and extensive piece of legislation. The practical application of these elements shows great resemblance to the GDPR, a welcome approach.

The Data Transfer Regulations

The PDPL also addresses the transfer of personal data between different countries. The Data Transfer Regulations provide further practical clarification and describe the situations when the controller is permitted to transfer or disclose personal data to an organization located outside the boundaries of the KSA. Once again, the requirements and obligations demonstrate close alignment with the GDPR.

What can DPO Consultancy do for you?

Does your organization have any questions about the obligations and requirements set forth by the PDPL and The Regulations? Contact us for assistance.

Source: Kingdom of Saudi Arabia’s New Personal Data Protection Law and Implementing Regulations—Key Obligations, Responsibilities and Rights | Akin Gump Strauss Hauer & Feld LLP

Saudi Arabia publishes final Personal Data Protection Law


Thursday 19 Oct - 2:08 PM

EDPB 2024 Coordinated Enforcement Action on the Implementation of the Right of Access by Data Controllers

The European Data Protection Board (“EDPB”) has chosen the topic for its third coordinated enforcement action. Throughout the year, 26 Data Protection Authorities (“DPAs”) across the European Economic Area (“EEA”) including the European Data Protection Supervisor (“EDPS”) will participate in the Coordinated Enforcement Framework 2024 (“CEF”) on the implementation of the right of access by controllers. This initiative is important because to date many companies still struggle when they receive a data subject access request.

The right of access of data subjects has been a part of the European data protection legal framework since its beginning and it is now further developed by more specified and precise rules in Article 15 General Data Protection Regulation (“GDPR”).

Data subjects have the right to obtain from the controller confirmation as to whether or not personal data concerning them are being processed. If this is the case, they can access the personal data and the following related information:

  • the purposes of the processing;
  • the categories of personal data concerned;
  • the recipients or categories of recipients of the data subject’s personal data;
  • the storage period or the criteria used to determine that period;
  • the existence of the right to request from the controller rectification or erasure of personal data or restriction of processing of personal data concerning the data subject or to object to such processing;
  • the right to lodge a complaint with a supervisory authority;
  • where the personal data are not collected from the data subject, any available information as to their source;
  • the existence of automated decision-making, including profiling.

How to Manage Data Subject Access Requests Efficiently and Effectively

Despite the GDPR’s provisions on the data subjects’ right of access, in 2022 the EDPB noted that the GDPR itself is not very prescriptive as to how controllers have to provide access. The EDPB Guidelines 01/2022 on data subject right of access shed some light, stating that data controllers:

  • should find a balance between appropriate ways to inform the data subject and the privacy of others (e.g. the controller’s employees) when retrieving the requested data;
  • must document their approach to be able to demonstrate how the means chosen to provide the necessary information are appropriate;
  • can provide access to the data subject through other ways than providing a copy;
  • in the case of a processing activity that involves a large amount of personal data, can use a layered approach (e.g. a data controller that analyses big data sets to place customers in different segments depending on their online behaviour); this layered approach, however, should not create an extra burden for the data subject;
  • must carefully decide upon the format in which the copy of the personal data and the information should be provided;
  • can extend the time to respond in certain cases.

To be ready to properly manage data subject access requests, controllers should draft a data subject rights (DSR) request policy and procedure in advance. This documentation allows the data controller to maintain procedures for responding to access requests and to manage them in a timely and effective manner. Moreover, a DSR request policy and procedure also makes it possible to maintain procedures to:

  • address complaints;
  • respond to requests to rectify personal data;
  • erase personal data;
  • restrict the processing of personal data;
  • respond to requests to opt-out;
  • use the right to data portability;
  • object to the processing of personal data;
  • not be subjected to automated individual decision-making, including profiling.

In addition, it would also be possible to implement:

  • customer Frequently Asked Questions;
  • escalation procedures for serious complaints or complex access requests;
  • procedures to investigate root causes of data protection complaints;
  • metrics for data protection complaints (e.g. number, root cause).

What can DPO Consultancy do for you?

Does your organization require assistance in handling data subject requests? Does your organization already have a DSR request policy and procedure in place? Contact us, the Experts in Data Privacy, for assistance.


Tuesday 17 Oct - 7:45 AM

California Enacts New Privacy Law, ‘Delete Act,’ Affecting Data Brokers and Personal Data Protection

California has added another privacy law to its existing regulations. Governor Gavin Newsom signed Senate Bill 362, often referred to as the Delete Act, into law just before the October 14th deadline. This new law builds on the California Consumer Privacy Act and the California Privacy Rights Act. It mandates that data brokers must register with the California Privacy Protection Agency (CPPA), which will enforce the law.

A significant aspect of the Delete Act is the requirement for the CPPA to develop a mechanism, by January 1, 2026, that allows verified consumers to easily request, and track the status of, the deletion of their personal data in one central place. Starting from August 1, 2026, data brokers will need to process deletion requests within 45 days of receiving a verified request.

Previously, consumers had to contact each data broker individually to request data deletion. California has around 500 such data brokers operating in the state.

State Senator Josh Becker, the bill’s author, hailed Newsom’s signature as making California a leader in consumer privacy. He emphasized that data brokers possess vast amounts of personal information and often sell sensitive data like reproductive healthcare and geolocation information to the highest bidder. The Delete Act aims to protect such sensitive information.

The new law transfers data broker registration from the California Department of Justice to the CPPA. Data brokers, as defined in the Delete Act, are companies that collect, use, and sell personal data without the consumer’s knowledge. The law also establishes a “do not track” list, preventing data brokers from collecting user data in the future.

The law also imposes transparency requirements on data brokers, including disclosing whether they collect precise geolocation data, reproductive healthcare data, and personal information about minors. Reproductive healthcare data became a significant concern following the U.S. Supreme Court’s decision in Dobbs, which overturned Roe v. Wade.

The law has received mixed reactions. Supporters believe it will empower individuals to regain control of their personal data, while opponents, such as Gravy Analytics Chief Privacy Officer Jason Sarfati, are concerned about its impact on the data industry and California’s digital economy.

The Delete Act is a state law, but given California’s large population, some argue that it effectively functions as a federal law.

Implementing the single form deletion mechanism required by the Delete Act is a complex task. It raises questions about the verification of deletion requests, data collection and storage, and the potential for misuse in social engineering attacks.

While details need to be worked out, the Delete Act is now law and demands attention from those it applies to.

Does your organization have any questions about the Delete Act or other U.S. privacy laws? Contact us, the Experts in Data Privacy, for assistance.



Tuesday 26 Sep - 7:54 AM

The UK-US Data Bridge and the Future of Data Transfers

On September 21st, Michelle Donelan, the UK Secretary of State for Science, Innovation, and Technology, introduced regulations in the UK Parliament to establish a UK-US Data Bridge. This decision was based on her assessment that the UK-US Data Bridge “upholds high standards of privacy for UK personal data.”

These regulations will come into effect on October 12th. The UK government has also released supporting documents, including an explanation, fact sheet, and over 130 pages of in-depth analysis of US privacy safeguards relevant to the UK-US Data Bridge.

Through the Data Bridge, UK organizations can transfer personal data to US entities certified under the “U.K. Extension to the EU-US Data Privacy Framework” without the need for additional safeguards, such as international data transfer agreements (equivalent to the EU’s standard contractual clauses or binding corporate rules). Both UK and US organizations are required to meet certain criteria to implement the Data Bridge, including updating privacy policies and adhering to the Data Privacy Framework List.

The Data Bridge also brings an additional, perhaps more significant though indirect, benefit. Thousands of UK organizations have used, and may continue to use, alternative transfer mechanisms for transferring personal data from the UK to the US. In doing so, they have been obligated to conduct a transfer risk assessment to evaluate whether the chosen alternative transfer mechanism, in the context of the transfer, would undermine the protections for individuals under the UK data protection regime due to the laws and practices of the third country.

Starting from October 12th, there are strong arguments to suggest that UK organizations may no longer need to perform such assessments concerning US surveillance laws and practices.

The UK government has meticulously analyzed relevant US laws and practices related to accessing and using personal data for national security and law enforcement purposes. This analysis played a significant role in the UK’s determination that, as a matter of UK law, these US laws and practices do not compromise data protection for UK data subjects when their data is transferred to the US.

This assessment applies to transfers made through the UK Extension to the EU-US Data Privacy Framework as well as those made through alternative transfer mechanisms like UK international data transfer agreements or BCRs.

The UK-US-EU triangle stands as a vital element in the global data transfer puzzle. Each side of this triangle is likely to face testing moments and mounting pressure. Challenges have already emerged, such as the EU adequacy decision for the EU-US Data Privacy Framework facing legal challenges.

While this newly established triangle confronts internal pressures, it also encounters opportunities and challenges on the global stage. Initiatives by various organizations exemplify the potential and prevalence of multilateralism and more scalable frameworks for data transfers.

With bridges constructed over previously troubled waters, privacy professionals can anticipate heightened focus and momentum in crafting a new framework for data transfers.




Tuesday 19 Sep - 9:05 AM

The Dutch Data Protection Authority demands clarity from Tech company on AI targeting children

The Dutch Data Protection Authority (Autoriteit Persoonsgegevens – AP) has expressed concerns regarding the handling of personal data by organizations employing generative artificial intelligence (AI). Special attention is directed towards apps designed for young children that utilize this form of AI.

As a result, the AP is taking various actions in the coming period. One such action involves the AP seeking clarity from a tech company concerning the functioning of the chatbot integrated into their app, which is popular among children. The AP seeks to ascertain the level of transparency exhibited by the chatbot regarding the handling of user data.

Given the rapid pace of developments in AI deployment, particularly in generative AI applications, there is a growing concern. The AP perceives significant risks associated with offering applications like AI chatbots to children, who may not be adequately aware of these risks. It is crucial for them to understand what happens to their chat messages and whether they are interacting with a chatbot or a real person.

The AP has raised several questions with the tech company, including inquiries about the transparency of the app regarding data usage and whether adequate consideration has been given to the retention period for collected data. Based on the responses, the AP will determine whether further actions are necessary.

Various Actions

Previously, the AP announced its intention to take several actions targeting organizations employing generative AI. During the summer, the AP sought clarification from software developer OpenAI regarding the ChatGPT chatbot and inquired about the company’s handling of personal data during the system’s training.

Additionally, the AP is part of the Taskforce ChatGPT within the collaborative framework of European privacy supervisory authorities, the European Data Protection Board (EDPB).

Generative AI

Generative AI can produce text or images based on a given prompt. These models are trained using extensive datasets typically sourced from the internet, often collected through scraping.

The questions posed by users in such tools can also serve as a source of data, occasionally including highly personal information. For example, someone seeking advice on marital disputes or medical matters through such a tool may share sensitive data. Consequently, generative AI tools often involve processing personal data, and developers must adhere to privacy regulations during their development.

The AP plans to release guidelines on scraping for businesses in the near future.




Tuesday 12 Sep - 8:29 AM

Legal challenge against the EU-US Data Privacy Framework has been filed

A French lawmaker and Member of the European Parliament, Philippe Latombe, has submitted challenges to the European Union General Court against the EU-US Data Privacy Framework (“DPF”). The DPF allows companies to freely transfer data between the European Union (“EU”) and the United States (“US”).

This legal challenge comes less than two months after the European Commission and the US government allegedly ended years of legal limbo for companies.

In a statement released by Latombe, he states that “the text resulting from these negotiations violates the Union’s Charter of Fundamental Rights, due to insufficient guarantees of respect for private and family life with regard to bulk collection of personal data, and the General Data Protection Regulation (GDPR).”

Two legal challenges were filed. The first legal challenge is to suspend the agreement between the EU and the US immediately and the second legal challenge concerns the content of the text of the DPF. This is because the concerns about US mass surveillance are still present and the DPF was notified in EU countries in English only and was not published in the EU’s Official Journal. This could fall short of procedural rules.

This legal challenge and the expected legal challenge of Max Schrems and noyb have the potential of opening the doors to years of legal wrangling. Until there is further legal clarity regarding the DPF, it is recommended to ensure compliance with the applicable requirements for international data transfers.

Are you certain that your organization complies with the applicable requirements for international data transfers? Contact us, the Experts in Data Privacy, for assistance.




Wednesday 23 Aug - 7:36 AM

Rejecting cookies: Spanish Regulator follows other Supervisory Authorities’ approach

The Spanish Data Protection Agency (“AEPD”) recently updated its cookie guidance, which now requires a “reject” button in the first layer of cookie banners.

Previously, the AEPD did not require a “reject” option in the first layer of information. This is commonly referred to as the cookie banner. The option to reject any cookies being installed on devices could be provided in the second layer of information or cookie settings.

The majority of European Union (“EU”) Supervisory Authorities consider the approach of a cookie banner without a “reject” option alongside an “accept” option as an infringement of the ePrivacy Directive. This was also reflected in the report issued by the European Data Protection Board’s (“EDPB”) Cookie Banner Taskforce in January 2023.

The updated guidance aligns with the Cookie Banner Taskforce report and the EDPB’s Guidelines on deceptive patterns, which was adopted in February 2023. This change introduces a stricter criterion, requiring the inclusion of a “reject” button in cookie banners.

The guidance includes examples of adequate cookie banners and sets out that both options (that is, “accept all” and “reject all”) must be presented to the user at the same time, at the same level, and with the same visibility. The colour or contrast of the text and buttons must not lead users to provide consent involuntarily.

Personalized cookies and cookie walls are also addressed in the AEPD’s updated cookie guidance. If a website editor personalizes cookies based on information obtained from the user, the user must be informed and offered the option to accept or reject these cookies. Furthermore, while cookie walls are not allowed, there may be instances where not accepting the use of cookies prevents access to the website or service, provided that the user is informed and offered an alternative way to access the services without having to accept the use of cookies. This alternative does not necessarily have to be free of charge.

This updated cookie guidance from the Spanish regulator follows the approach of the French and Belgian Supervisory Authorities and the British Information Commissioner’s Office (“ICO”). Looking ahead, updates from other EU Supervisory Authorities will be closely monitored to determine whether a similar approach is adopted in other EU Member States.

Does your organization have any questions about cookies? Contact us, the Experts in Data Privacy, for assistance.



Tuesday 8 Aug - 7:40 AM

EDPB launches coordinated enforcement on the role of Data Protection Officers

The European Data Protection Board (“EDPB”) has launched its 2023 coordinated enforcement action. Throughout the year, 26 Data Protection Authorities (“DPAs”) across the European Economic Area (“EEA”) including the European Data Protection Supervisor (“EDPS”) will participate in the Coordinated Enforcement Framework 2023 (“CEF”) on the designation and position of Data Protection Officers (“DPOs”).

DPOs, who act as intermediaries between DPAs, individuals, and the business units of an organization, play an essential role in ensuring compliance with data protection law and promoting effective protection of data subject rights.

In order to determine whether DPOs have the position in their organization required by Articles 37–39 of the General Data Protection Regulation (“GDPR”) and the resources needed to carry out their tasks, participating DPAs will implement the CEF at national level in a number of ways, including:

  • sending DPOs questionnaires to assist in fact-finding exercises or to identify whether a formal investigation is warranted;
  • commencing formal investigations;
  • following up on ongoing formal investigations.

The results of this joint initiative will be analysed in a coordinated manner and the DPAs will decide on possible further national supervision and enforcement actions. Furthermore, the results will be aggregated, generating deeper insight into the topic and allowing targeted follow-up action at the European Union (“EU”) level. The outcome of this analysis upon conclusion of the actions will be published by the EDPB.

This is the second initiative under the CEF. The results of the first initiative highlighted certain key aspects of the role of DPOs, including:

  • in addition to notifying the competent supervisory authority of the DPO’s appointment, an organization should also notify the supervisory authorities of other Member States where it has branches, notwithstanding the consistency mechanism;
  • DPOs are not allowed to act as the controller’s representative before the supervisory authority, as this could jeopardize the autonomy or independence of the DPO;
  • it is not possible to hire a company as an outsourced DPO and have it in turn hire an external individual to perform this role;
  • having a data protection committee does not replace an organization’s obligation to appoint a DPO; and
  • if the DPO also serves as head of compliance, audit and risk management, the autonomy or independence of the role may be compromised; this should be analysed on a case-by-case basis.

Decisions by DPAs have provided valuable insights into the role and responsibilities of DPOs under the GDPR. These decisions emphasize the importance of autonomy and independence, conflict of interest management, direct access to top management and appropriate measures for external DPOs.

Does your organization require the assistance of a DPO, or do you have questions in this regard? Contact us, the Experts in Data Privacy, for assistance.




Tuesday 1 Aug - 8:09 AM

Update: Irish Data Protection Commission Passes GDPR ‘Gag’ Law

In June we reported about the Irish Government’s ‘last minute’ amendment to the Courts and Civil Law (Miscellaneous Provisions) Bill 2022, which was seen by some as an effort to ‘muzzle’ critics of the Data Protection Commission (“DPC”).

The Bill was debated, and the Irish Government has indeed passed a new law governing complaints brought under the General Data Protection Regulation (“GDPR”), which allows the DPC to declare cases ‘confidential’ and to criminalize anyone sharing information about GDPR procedures. This is seen by some as a gag order introduced by the DPC.

Privacy activists, such as Max Schrems and noyb, have been a thorn in the DPC’s side for a number of years and have successfully challenged several of the DPC’s GDPR decisions. The new Irish GDPR gag law changes that, and gives the DPC wide-ranging powers to stop anyone from sharing details of GDPR cases.

Silencing Privacy Activists

The European Digital Rights group, another group of privacy activists, issued a statement of condemnation when the Irish Government first indicated that the Bill would come up for debate towards the end of June and noted that there was no pre-legislative scrutiny.

Furthermore, the Irish Council for Civil Liberties and Amnesty International were also troubled by the DPC’s new powers. They described it as “a blatant attempt not only to shield Big Tech from scrutiny but also to silence individuals and organizations that stand up for the right to privacy and data protection.”

EDPB and its powers

The European Data Protection Board (“EDPB”) ensures that the GDPR is applied consistently across the European Union (“EU”) and promotes cooperation – including on enforcement –  between national data protection authorities (“DPAs”).

There are concerns that the EDPB might be adversely affected by the new Irish law, which will likely create further tensions between the DPC and the EDPB. Presently, the EDPB has not released a statement on this development, so its position is not yet clear.

Effect on the rest of the EU

Previously, the DPC has taken the view that its orders and rules would apply in other EU Member States. This contradicts Article 55(1) GDPR, which clarifies that each DPA only has jurisdiction in its own Member State. Furthermore, EU law prevents the DPC from applying Section 26A outside of Ireland.

The law has been viewed by some as unclear and likely unconstitutional. Combined with the EU’s plans to modify some aspects of the GDPR, it is expected that the privacy world may be facing some interesting times ahead.

Does your organization require any assistance regarding the GDPR? Contact us, the Experts in Data Privacy, for assistance.




Tuesday 18 Jul - 2:45 PM

Google’s AI chatbot launched in the EU

Bard, the artificial intelligence chatbot of Google, is rolling out in the European Union (“EU”) after resolving privacy concerns raised by the bloc’s key privacy regulator, the Irish Data Protection Commission (“DPC”).

In June, Google delayed the release of Bard after the Irish Data Protection Commission said that Google had provided insufficient information about how Bard respects the General Data Protection Regulation (“GDPR”). The Irish DPC is Google’s lead regulator in the EU, as Google’s European headquarters are located in Ireland.

Google met with the Irish Data Protection Commission and other watchdogs to provide reassurance on issues relating to transparency, control, and choice. Google has been quoted as saying “we expect it to be an ongoing dialogue” between itself, the Irish DPC, and other privacy regulators in Europe.

Google has made numerous changes ahead of the launch, including features to enhance users’ transparency, control, and choice. Users will be able to see how their information is being used, to opt out of some uses, and to control which conversations with Bard are saved or deleted by Google. Furthermore, the user interface has been changed to integrate a new Bard ‘privacy hub’ and will also feature contextual information.

The decision to postpone the launch of Bard in the EU is a recent example of US tech firms holding off on rolling out products in the bloc. ChatGPT, Bard’s main competitor, did not follow this approach and was temporarily banned in Italy in March this year over concerns that it could violate privacy standards. ChatGPT is under investigation in various countries, such as Spain and Germany.

Data protection authorities are scrutinizing various privacy issues raised by generative AI tools under the umbrella of the European Data Protection Board (“EDPB”). Further guidance in this regard is expected in due course.

Does your organization make use of generative AI tools, or does it have questions about these tools? Contact us, the Experts in Data Privacy, for more information.




Tuesday 11 Jul - 2:23 PM

The USA is adequate again: what we know so far

Yesterday, 10 July 2023, the European Commission (“the Commission”) adopted the adequacy decision for the EU-US Data Privacy Framework (“the Framework”).

As a result, the United States (“US”) is now regarded as a country that ensures an adequate level of protection for personal data transferred from the European Union (“the EU”) to US companies participating in the Framework. Personal data can flow safely from the EU to the US without additional data protection safeguards having to be put in place.

The Framework introduces new binding safeguards to address the concerns raised by the Court of Justice of the European Union (“CJEU”), including:

  • limiting access to EU data by US intelligence services to what is necessary and proportionate to protect national security; and
  • the establishment of a Data Protection Review Court (“DPRC”) to which EU individuals have access. The DPRC will independently investigate and resolve complaints, including by adopting binding remedial measures.

US companies may join the Framework by committing to comply with a detailed set of privacy obligations. These include adhering to the principles of purpose limitation, data minimization and data retention and more specific obligations such as the requirement to delete personal data when it is no longer necessary for the purpose for which it was collected and to ensure the continuity of protection when personal data are shared with third parties.

The self-certification program will be administered by the US Department of Commerce, which will process applications for certification and monitor whether participating companies continue to meet the certification requirements. The US Federal Trade Commission will enforce US companies’ compliance with their obligations under the EU-US Data Privacy Framework.

The Commission has indicated that it will continuously monitor relevant developments in the US and regularly review the adequacy decision. The first review will take place within one year after the entry into force of the adequacy decision to verify whether all relevant elements of the Framework are functioning effectively in practice.

The NGO noyb has already announced that it will challenge the decision since, it argues, the new agreement is essentially a copy of the Privacy Shield and does not protect the rights of data subjects in a compelling way.

In the interim, the main points for organizations to consider are:

  • The adequacy decision entered into force upon its adoption on 10 July 2023;
  • The safeguards apply to transfers of personal data to the US regardless of the transfer mechanism relied upon. They therefore also facilitate the use of other tools, such as the Standard Contractual Clauses and Binding Corporate Rules;
  • Transfer Impact Assessments (“TIAs”) for EU-US data transfers are a bit more nuanced. Technically, TIAs are not needed for transfers covered by the Framework, as the Framework adequacy decision replaces the adequacy assessment in the TIA. Existing TIAs should be revisited to account for the changes to US surveillance laws. TIAs are still required for transfers not covered by the Framework (read: to companies that are not certified), whether to the US or to other third countries;
  • Monitor developments to ensure your organization uses the correct contractual template;
  • The question that remains is whether to self-certify to the EU-US Data Privacy Framework or to continue using Standard Contractual Clauses.





Thursday 6 Jul - 9:26 AM

EU-US Data Privacy Framework awaiting adequacy decision

A lawyer for Ireland’s Data Protection Commissioner (“DPC”) has said that the European Commission (“the Commission”) is due to finalize a new data transfer pact with the United States (“US”) by mid-July.

On 3 July, the US Secretary of Commerce issued a statement regarding the EU-US Data Privacy Framework, indicating that the US has fulfilled its commitments for implementing the Framework.

One of these commitments is the Department of Justice’s designation of the European Union Member States, Iceland, Liechtenstein and Norway as ‘qualifying states’ whose citizens are able to file for redress through the proposed Data Protection Review Court while obtaining enhanced US privacy protections as established under Executive Order 14086 (“EO 14086”). This designation will, however, only take effect upon finalization of the Commission’s adequacy decision with the US.

Another fulfilled commitment is the Office of the Director of National Intelligence’s confirmation that the US Intelligence Community has adopted its policies and procedures pursuant to EO 14086.

The US believes that, taken together, the strengthened safeguards for signals intelligence activities established in EO 14086, the designation of the EU/EEA states as qualifying states, the adoption of the Intelligence Community’s implementing procedures and the updated EU-US Data Privacy Framework principles will enable the EU to move forward with the adoption of an adequacy decision for the US. It remains to be seen whether the new data transfer pact, which will be presented to the Commission’s collective decision-making body in mid-July, will be acceptable.

The Commission has indicated that it is currently finalizing the framework and repeated that it expects it to be in place this summer. Until there is more clarity surrounding the EU-US Data Privacy Framework, organizations must ensure they adhere to the available transfer mechanisms and that the supplementary measures are implemented. 

Does your organization have any questions about transferring personal data internationally to the US? Contact us, the Experts in Data Privacy, for assistance.



Tuesday 4 Jul - 11:53 AM

The concept of “anonymization” and its meaning in practice

The concept of anonymization can be complex and vary across jurisdictions. Data that meets the criteria for anonymization is generally exempt from privacy and data protection laws. As the adoption of artificial intelligence increases, which relies on large datasets, the need for clear guidelines and standardization has become more pressing. In the EU, where data regulations have set the benchmark for data usage, achieving accurate anonymization is particularly significant. Compliance with EU regulations is a key aspect of global data strategies aimed at responsible data collection and utilization.

However, implementing EU legal standards for anonymization can be challenging and has faced criticism due to its inherent ambiguities. This article aims to shed light on the confusion surrounding these standards and their evolving nature. A notable recent development is the decision of the EU General Court in Single Resolution Board v EDPS of 26 April.

The SRB v EDPS Case

This case involves the handling of personal comments provided by shareholders and creditors of Banco Popular, who were affected by a resolution decision made by the Single Resolution Board (SRB), the central resolution authority in the Banking Union. Some of these comments were shared with a third party, Deloitte, but without including identifying data used for registration purposes. The comments shared with third parties were filtered, categorized, and aggregated, and assigned a unique alphanumeric code.

The European Data Protection Supervisor (EDPS) claimed that the SRB violated its data protection obligations under Article 15 of Regulation (EU) 2018/1725 (the GDPR’s counterpart for EU institutions) by not informing the data subjects that their personal data would be shared with third parties. In assessing the SRB’s request to overturn the EDPS’s revised decision, the General Court had to determine whether the comments shared with Deloitte qualified as personal data under Article 3(1) of that Regulation and whether the data subjects remained identifiable. Essentially, the General Court had to ascertain whether the data had been properly anonymized. In this regard, the General Court disagreed with the EDPS, holding that a risk-based approach indeed meets the legal requirements for anonymization in the EU.

The General Court made reference to a previous case known as the Patrick Breyer case from the Court of Justice of the European Union and considered two factors to assess whether adequate standards for anonymization were met:

  1. Controlled environment: This refers to the contextual controls implemented in the data environment, such as access controls. The General Court noted that Deloitte did not have access to the identification data collected during the registration phase, which would have allowed the participants to be linked to their comments through the alphanumeric code.
  2. Data itself: This refers to the controls applied to the data to transform its appearance, such as masking. The Court emphasized that the alphanumeric code alone did not enable the authors of the comments to be identified. It stated that personal views or opinions may potentially constitute personal data, but this conclusion cannot be based on presumption alone. Instead, it must be determined whether a view is connected to a specific person based on its content, purpose, or effect.

The General Court concluded that it was essential to assess whether the information transmitted to Deloitte pertained to “identifiable persons” from Deloitte’s perspective. It found that the EDPS failed to consider whether Deloitte could re-identify the authors of the comments or if such re-identification was reasonably possible by combining the transmitted information with additional data held by the SRB. As a result, the General Court deemed the EDPS’s stance as incorrect, emphasizing that it was the EDPS’s responsibility to evaluate whether Deloitte had the means to reasonably identify the authors of the comments.

In practice, the Court’s decision demonstrates that the EU does not strictly adhere to an absolutist approach to anonymization, which demands complete elimination of any chance of re-identifying data, even if statistically improbable. Instead, the court adopts a risk-based approach, considering both data controls and contextual factors. This approach allows a residual probability of identification to remain acceptable, as long as identification is not reasonably likely.

However, some objections have been raised regarding the General Court’s opinion. For instance, Recital 26 of the GDPR emphasizes the need to consider the intended data recipient and potential attackers to determine identifiability. Some regulators, like the Irish Supervisory Authority, default to viewing data controllers as potential attackers. However, when appropriate context controls are implemented, data controllers can also be considered trustworthy, as acknowledged by the U.K. Information Commissioner’s Office.

Additionally, it remains unclear whether the purpose for sharing the data played a significant role in the General Court’s analysis. Purpose restrictions are crucial in a risk-based approach to anonymization and typically exclude individual decision making. The importance of purpose restrictions has been emphasized by the EDPS and is also evident in the European Data Space regulation.

What does this mean for your organization?

The European Data Protection Board (EDPB) is currently working on finalizing guidance on anonymization, and it is uncertain whether the shift towards a more practical approach to anonymization in the EU will receive official endorsement from the key regulatory body overseeing personal data. There is a growing trend towards adopting a risk-based approach in the EU, but we must nonetheless be cautious and follow a strict approach until more clarity and guidance is provided. If you have questions regarding this matter, contact us, the Experts in Data Privacy, for assistance.



Tuesday 27 Jun - 9:58 AM

Critics of the Data Protection Commission possibly “muzzled”

Yesterday, 26 June 2023, the Irish Council for Civil Liberties (“ICCL”) reported that the Irish Government’s “last minute” amendment to the Courts and Civil Law (Miscellaneous Provisions) Bill 2022 is an effort to “muzzle” critics of the Data Protection Commission (“DPC”), which would render the DPC’s decision-making “even more opaque”.

The amendment provides that the DPC may designate information it deems confidential and direct that it not be disclosed. Failure to comply with such a non-disclosure notice issued by the DPC will be an offence liable on summary conviction to a €5,000 fine.

If the amendment to the Bill is approved, it is alleged that it will essentially silence people from speaking about how the DPC handles their complaints and about how Big Tech companies or public companies misuse their personal data.

The concerns of the ICCL include that there is no explanation of what information may be deemed confidential or why. The possibility exists that even information that is not regarded as ‘commercially sensitive’ may no longer be discussed publicly. Furthermore, it would become impossible for journalists to report on Ireland’s General Data Protection Regulation (“GDPR”) supervision of Big Tech companies that have their European headquarters there, including Google, Meta, Apple, Microsoft and TikTok.

Critics of the proposed amendment have argued that if enacted, the amendment may further damage the proper flow of information between the DPC and its peer Data Protection Authorities across the European Union, because the DPC, the lead data protection enforcer across Europe, will have the power to designate as confidential material that should be shared with other European Data Protection Authorities.

Furthermore, this amendment creates the risk of a conflict with imminent European law as the European Commission is currently formulating a new regulation to harmonise the conduct of cross-border GDPR cases.

Max Schrems and noyb have also published an article about the proposed amendment on noyb’s website, stating that the last form of accountability is gone. Judicial action against the DPC has always been very costly, and public accountability was therefore often the only way to ensure that the DPC does its job. The proposed amendment, if passed, would criminalize this last form of accountability. Furthermore, according to Schrems, “no proper public debate or consultation of people that will be affected by such ‘gag orders’” has taken place and “the way that this ‘gag order’ law is passed is highly problematic too.”

It is not clear what other European Data Protection Authorities or the European Data Protection Board (“EDPB”) think of this proposed amendment, or how it would impact the cooperation between the DPC and other Data Protection Authorities when dealing with cross-border GDPR cases.

This Bill is up for final debate tomorrow, 28 June 2023, and the ICCL and Schrems have urged all parties in the Dáil, the lower house of the Irish parliament, to challenge the proposed amendment.

This development will be closely monitored and updates will be provided as soon as further information is available.

Does your organization have questions about an investigation from a Data Protection Authority? Contact us, the Experts in Data Privacy, for assistance.



Thursday 22 Jun - 7:30 AM

Data Subjects’ Complaint Form adopted by EDPB

During its last plenary session of 20 June 2023, the European Data Protection Board (“EDPB”) adopted a template complaint form for data subjects.

The complaint form is intended to facilitate the submission of complaints by individuals and the subsequent handling of those complaints by Data Protection Authorities (DPAs) throughout the European Union in cross-border cases.

In March 2022, various civil society organizations sent a letter to the European Commissioner for Justice, the EDPB and Members of the Civil Liberties, Justice and Home Affairs (LIBE) Committee of the European Parliament, calling on DPAs to make better use of existing tools for enforcement and cooperation under the GDPR. One example was that DPAs should be required to perform joint investigations and coordinate their action through task forces where a data controller is subject to complaints in more than one Member State raising similar compliance issues.

During a high-level meeting in April 2022 in Vienna, the EDPB made a commitment to address this. After this meeting, members of the EDPB stressed that its duty is to ensure that the General Data Protection Regulation (“GDPR”) is enforced effectively and consistently; for this, the EDPB would collect best practices regarding the interpretation of national procedural law. Furthermore, DPAs reiterated their commitment to close cross-border cooperation and agreed to various measures to ensure enhanced cooperation.

The complaint form takes into account the existing differences between national laws and practices and it will be used on a voluntary basis by DPAs. DPAs can also adapt the complaint form to their respective national requirements.

It can also be used for cases where the complaint is filed by the individual personally and for cases where the complaint is filed by someone else, such as a legal representative, an entity acting on behalf of the individual or an entity acting on its own initiative.

The EDPB has also developed a template response acknowledging receipt, which aims to provide the complainant with general information on the next steps following the submission of their complaint and highlights the right to an effective judicial remedy against a legally binding decision of a DPA.

While awaiting publication of the data subjects’ complaint form by the EDPB, it is imperative that your organization attends to data subjects’ requests and complaints in accordance with the GDPR, as failing to comply with data subjects’ requests or complaints is a short route to fines.

Does your organization have questions about data subjects’ requests or complaints? Contact us, the Experts in Data Privacy, for assistance.




Thursday 15 Jun - 3:13 PM

Dutch Data Protection Authority calls for clarification on ChatGPT

The Dutch Data Protection Authority (Autoriteit Persoonsgegevens or AP) is concerned about how organizations that use so-called generative artificial intelligence (AI), such as ChatGPT, handle personal data. The AP is therefore taking several actions in the coming period. First, the AP has asked software developer OpenAI for clarification about its chatbot ChatGPT. Among other things, the AP wants to know how OpenAI handles personal data when training the underlying system.

ChatGPT is a chatbot that can provide seemingly convincing answers to many different questions. For example, users can ask ChatGPT to do their math homework or write computer code, or can use the chatbot to ask for advice on relationship problems or medical issues. In the Netherlands alone, 1.5 million people used the chatbot in the first 4 months after its launch.

Trained with personal data

ChatGPT is based on an advanced language model (GPT) that is trained with data. This can be done by collecting data already on the Internet, but also by storing and using the questions people ask. That data can contain sensitive and very personal information, such as when someone asks for advice about a marital dispute or medical issues.

The AP wants to know from OpenAI whether people’s questions are used to train the algorithm, and if so, in what way. The AP also has questions about how OpenAI collects and uses personal data from the Internet. Further, the AP has concerns about information about people that GPT “generates” when answering questions. The generated content may be inaccurate, outdated, inappropriate, offensive, or objectionable and may take on a life of its own. Whether, and if so how, OpenAI can rectify or delete that data is unclear.

Other privacy regulators in Europe are also concerned about ChatGPT and believe a joint approach is necessary. Consequently, privacy regulators in Europe have set up a ChatGPT task force within their collaborative framework, the European Data Protection Board (EDPB), to exchange information and coordinate actions.


Personal data is often used in algorithms and AI. This means that the use of algorithms must comply with the General Data Protection Regulation (GDPR). The AP oversees this.

Because algorithms – with or without the use of personal data – can be found in all sectors, it is important to keep an overall eye on the risks and effects, and for the regulators, inspectorates and other organizations that deal with them to share knowledge and strengthen existing collaborations. The AP coordinates and promotes this joint supervision.

Does your organization use ChatGPT or any other form of AI? If you have any questions or need assistance, contact us, the Experts in Data Privacy.




Friday 9 Jun - 2:14 PM

EDPB’s New Guidelines on GDPR Fines

The European Data Protection Board (‘EDPB’) recently released its Guidelines on the calculation of administrative fines under the General Data Protection Regulation (‘GDPR’). The new Guidelines provide a clear roadmap for how Supervisory Authorities will calculate GDPR fines moving forward.

These Guidelines aim to harmonize the methodology Supervisory Authorities use to calculate fines and include harmonized ‘starting points.’ In this regard, three elements are considered, namely the categorization of infringements by nature, the seriousness of the infringement, and the turnover of a business.

In addition, the Guidelines set out a five-step methodology for calculating administrative fines: identifying the processing operations, determining the starting point for the calculation of the fine, evaluating mitigating and aggravating circumstances, identifying the legal maximums of fines and, lastly, assessing the requirements of effectiveness, dissuasiveness and proportionality.
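By way of illustration only, the arithmetic core of the five steps above can be sketched as follows. All figures and factors in this sketch are hypothetical, are not taken from the EDPB’s Guidelines, and the final amount in a real case always depends on the circumstances rather than on a formula.

```python
# Illustrative sketch of the EDPB's five-step fine methodology.
# All numbers are hypothetical; a real fine is never a mere calculation.

def legal_maximum(turnover_eur: float) -> float:
    """Step 4: the legal maximum under Art. 83(5) GDPR --
    EUR 20 million or 4% of worldwide annual turnover, whichever is higher."""
    return max(20_000_000.0, 0.04 * turnover_eur)

def sketch_fine(turnover_eur: float,
                seriousness_fraction: float,
                circumstances_factor: float) -> float:
    """Steps 2-5 in miniature (step 1, identifying the processing
    operations, is a legal analysis, not arithmetic)."""
    cap = legal_maximum(turnover_eur)
    starting_point = seriousness_fraction * cap       # step 2: starting point
    adjusted = starting_point * circumstances_factor  # step 3: mitigating/aggravating
    return min(adjusted, cap)                         # step 4: never exceed the cap
                                                      # step 5: proportionality is a
                                                      # qualitative check, not shown

# Hypothetical example: EUR 1bn turnover, moderate seriousness, some aggravation
fine = sketch_fine(1_000_000_000, 0.10, 1.5)
```

Note how the legal maximum acts both as the scale for the starting point and as a hard ceiling: however severe the aggravating circumstances, the fine can never exceed the cap.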

Key Takeaways:
The EDPB emphasizes that calculating a fine is not a mere mathematical exercise. The circumstances of the specific case are the determining factors leading to the final amount, which can be any amount up to and including the legal maximum.

The Guidelines also underscore the importance of effectiveness, proportionality, and dissuasiveness when determining fines. This means that fines should not only be suitable for the infringement but also act as a deterrent for future violations.

The degree of cooperation with the Supervisory Authorities is regarded as a mitigating factor when determining the fine, which means that organizations should have clear procedures in place for responding to data breaches and working with Supervisory Authorities.

The Guidelines also take the turnover of organizations into account when calculating fines. This means that organizations with lower turnovers could potentially face lower fines, although the impact will still be significant.

Repeated infringements can lead to higher fines. Organizations should therefore have procedures in place to learn from any infringements and prevent them from happening again.

Does your organization require assistance with its data breach procedure? Contact us, the Experts in Data Privacy.





Tuesday 6 Jun - 1:58 PM

Irish DPC intends to fine LinkedIn $425m

Microsoft, which owns LinkedIn, released a statement to investors in which it vowed to dispute the draft decision and fine by the Irish Data Protection Commission (“Irish DPC”). Furthermore, Microsoft indicated that it would ‘defend itself vigorously’ in case the decision becomes final.

In the statement of 1 June, Microsoft explains that the Irish DPC sent the company a draft decision with a large fine for alleged violations of the General Data Protection Regulation (“GDPR”) through LinkedIn’s targeted advertising practices. The Irish DPC sent this non-public draft decision in April with a proposed fine of approximately $425 million (approximately €396 million).

Microsoft has indicated that it intends to respond to the draft decision, that it intends to dispute the legal basis for and the amount of the proposed fine, and that it will continue to defend its compliance with the GDPR.

The Irish DPC opened its inquiry following a complaint against LinkedIn and other companies about their alleged violations of the GDPR through their targeted advertising practices. Microsoft has said that it has cooperated with the Irish DPC “throughout the period of inquiry.”

While there is no set timeline as to when the Irish DPC will issue a final decision, Microsoft will “consider all legal options and intends to defend itself vigorously in this matter.”

Does your organization have questions about advertising practices in compliance with the GDPR? Contact us, the Experts in Data Privacy, for assistance.





Thursday 25 May - 8:42 AM

Meta fined €1.2 billion over EU-US data transfers

On Monday, a decade-long legal case, spanning from 2013 to 2023, regarding Meta’s role in mass surveillance in the United States reached a significant milestone. The outcome of this case obliges Meta to cease any further transfer of personal data from Europe to the United States, because Meta is obliged to comply with US surveillance legislation such as FISA 702. The European Data Protection Board (EDPB) had largely overturned the decision made by the Irish Data Protection Commission (DPC), insisting on a record fine and on the return of previously transferred data to the EU.

Following the disclosure of Edward Snowden regarding the involvement of major US tech companies in assisting the NSA’s widespread surveillance program, Facebook, now known as Meta, became subject to legal proceedings in Ireland. Over the course of a decade, Meta failed to implement any significant measures and instead disregarded directives from the Court of Justice of the European Union (CJEU) and the European Data Protection Board (EDPB). As a consequence, Meta is now obligated to not only pay a historic fine amounting to €1.2 billion but also return all personal data to its data centers located within the EU.

The reauthorization of FISA 702 is currently being discussed. The ongoing conflict between EU privacy regulations and US surveillance laws also poses challenges for other prominent US cloud service providers like Microsoft, Google, and Amazon. The underlying US surveillance legislation, FISA 702, needs to undergo reauthorization before December 2023. Considering the recent substantial fine imposed by EU data protection authorities, there is a growing demand for significant changes among US tech giants. Several rulings from France, Italy, and Austria have declared the utilization of US services as unlawful, although they did not entail significant fines similar to the recent case.

Although Meta is likely to submit an appeal to both the Irish and potentially the European Courts, the likelihood of the decision being significantly overturned is low. This is because the CJEU has previously ruled in two cases (Schrems I and Schrems II) that there was no legitimate legal framework for transferring data between the EU and the US. Additionally, there is no possibility of a new agreement retroactively legalizing prior breaches of the law.

According to a recent ruling by the CJEU, individuals may now have the ability to seek emotional damages for minor infringements of their data protection rights, including instances where their data is subject to US mass surveillance. This could result in claims that surpass the current penalties imposed. For instance, the Dutch consumer rights organization Consumentenbond is currently enlisting Dutch Facebook users to assert their claims regarding EU-US data transfers. Without users demanding fair compensation, substantial change is unlikely. As regulatory authorities are currently not very proactive in enforcing the GDPR, it falls upon consumer rights organizations and users to take action. Therefore, noyb encourages every Facebook user in the Netherlands to register their claims for potential damages. Additionally, the implementation of the EU’s Collective Redress Directive this summer will allow European users to initiate collective actions for GDPR violations for the first time.

What does this mean for other companies? This is the first fine issued for unlawful transfers, setting a high standard for compliance. Controllers need to ensure that, in the absence of an adequacy decision, international transfers are subject to appropriate safeguards and that enforceable rights and effective legal remedies are available to individuals. Such appropriate safeguards include SCCs, supported by TIAs.

Does your organization have any questions about transferring personal data internationally to the US? Contact us, the Experts in Data Privacy, for assistance.



Tuesday 23 May - 7:57 AM

CJEU confirms no minimum threshold for non-monetary GDPR damage claims

On 4 May 2023, the Court of Justice of the European Union (“CJEU”) published its decision in Case C-300/21, ruling that not every infringement of the General Data Protection Regulation (“GDPR”) triggers the right to compensation provided by Article 82 of the GDPR. The Court did, however, rule that there is no minimum threshold for damage claims.

The right to compensation as set out in Article 82 of the GDPR has three requirements, namely an infringement of the GDPR, damage caused to the affected individual and a causal link between the infringement and the damage.

Background of the case:

During 2017, Austrian Post (Österreichische Post), a company trading in addresses, collected information on the political associations of individuals by using an algorithm that considered various social and demographic characteristics. The data generated from this was sold to various organizations for targeted advertising.

In the course of these activities, Austrian Post attributed to the plaintiff, with high probability, an affinity with a certain Austrian political party on the basis of statistical deductions from the collected data. This information was not transmitted to third parties.

The plaintiff had not consented to the processing of his personal data and felt offended by the fact that an association to a certain political party had been attributed to him. The plaintiff further argued that the storage of this data, which was based on a presumption, caused him great upset, a loss of confidence and a feeling of being exposed. The plaintiff brought an action against the Austrian Post for an injunction to stop the disputed data processing and payment in the amount of €1,000.00 as compensation for non-material damage.

Implications of the Ruling:

- Any company that is established in the European Union (“EU”) or otherwise subject to the GDPR can face a damage claim under the GDPR. The risk of damage claims may be particularly high in cases of data breaches and where data subjects exercise their rights under the GDPR, especially the right of access.

- The ruling clarifies that a materiality threshold is not required, but that a violation of the GDPR alone does not trigger a damage claim.

- A scattered approach is expected to arise across the EU on the question of the conditions under which non-material damage may be established.

- A claim for (non-material) damages does not require that a threshold of seriousness be met; however, individuals must demonstrate that the infringement of the GDPR caused them (non-material) damage.

- Each Member State is to prescribe the rules governing actions for damage claims under Article 82 GDPR, including the criteria for determining the extent of compensation payable. These rules must comply with the principles of equivalence (procedures for remedies under Union law must not be less favourable than those governing similar domestic matters) and effectiveness (procedures must not make it excessively difficult or impossible in practice for individuals to exercise their rights under Union law).


It will be very interesting to see how this judgment impacts future cases, including Case C-340/21, in which the Advocate General has delivered an Opinion regarding compensation for non-material damage for data breaches.

Does your organization have any questions about how this ruling may impact your compliance with the GDPR? Contact us, the Experts in Data Privacy, for assistance.




Tuesday 16 May - 7:49 AM

European Parliament: EU-US Data Privacy Framework fails to create essentially equivalent level of protection

On 11 May 2023, the European Parliament issued a resolution regarding the adequacy of the protection afforded by the EU-US Data Privacy Framework. In summary, the European Parliament has concluded that the EU-US Data Privacy Framework fails to create an essentially equivalent level of protection.

The resolution contains numerous points but the most relevant points include:

In its resolution of 20 May 2021, the European Parliament called on the European Commission (“Commission”) not to adopt a new adequacy decision in relation to the US unless meaningful reforms were introduced. The Parliament does not consider Executive Order 14086 (“EO 14086”) sufficiently meaningful and argues that the Commission should not leave the task of protecting the fundamental rights of EU citizens to the Court of Justice of the European Union (“CJEU”);

The Data Privacy Framework principles have not been sufficiently amended, when compared to those under the Privacy Shield, to provide essentially equivalent protection to that provided under the GDPR;

The Commission was not in a position to assess the effectiveness of the proposed remedies and rules on data processing by public authorities in the US, as the US Intelligence Community has until October 2023 to update its policies and practices to align with the commitments of EO 14086. Furthermore, the US Attorney General must still designate the EU and its Member States as qualifying states eligible to access the redress avenue available under the Data Protection Review Court (“DPRC”). Most importantly, the Commission can only proceed with the next step of an adequacy decision after these deadlines have been met by the US, to ensure that the commitments have been delivered in practice;

The Commission and its US counterparts have been called upon to continue negotiations with the aim of creating a mechanism that would ensure such equivalence and which would provide an adequate level of protection as required by Union data protection law and the Charter;

The Parliament calls on the Commission to act in the interest of EU businesses and citizens by ensuring that the proposed framework provides a solid, sufficient and future-oriented legal basis for EU-US data transfers; and

The Parliament expects any adequacy decision, if adopted, to be challenged before the CJEU and highlights the Commission’s responsibility for the failure to protect EU citizens’ rights should the adequacy decision once again be invalidated by the CJEU.

This week, MEPs of the Civil Liberties Committee will travel to the US to meet with members of the House of Representatives and the Senate to discuss a wide range of current policy issues, including security and child protection, as well as data transfers and privacy.

Until there is more clarity surrounding the EU-US Data Privacy Framework, organizations must ensure they adhere to the available transfer mechanisms and that the supplementary measures are implemented.

Does your organization have any questions about transferring personal data internationally to the US? Contact us, the Experts in Data Privacy, for assistance.



Thursday 11 May - 1:31 PM

The CJEU rules on data subject rights: the right to obtain a copy of personal data  

The Court of Justice of the European Union (“CJEU”) ruled that the right to obtain a copy of personal data entails the right to obtain copies of extracts from documents or even entire documents/extracts from databases which contain those data, if that is essential for the data subject to effectively exercise their rights under the GDPR.  

The case concerns an individual who requested access to his personal data from a business consulting agency that provides information on creditworthiness. The individual requested a copy of emails and database extracts containing his personal data in a standard technical format, but the agency only provided a summary of the list of his personal data undergoing processing. The Austrian Data Protection Authority rejected the individual’s complaint, leading to a court case on the scope of the obligation to provide a copy of personal data under Article 15(3) of the GDPR. 

During the proceedings, the competent Court in Austria asked the CJEU whether the said obligation is fulfilled where the controller transmits the personal data in the form of a summary table, or whether that obligation also entails the transmission of document extracts or even entire documents, as well as extracts from databases, in which those data are reproduced.

Here are the main conclusions reached by the CJEU:  

– The CJEU clarified that the right to obtain a “copy” of personal data means that the data subject must be given a faithful and intelligible reproduction of all of their data. This means that providing a general description of the data undergoing processing will not be sufficient to comply with the data subject’s request.  

– That right includes the right to obtain copies of extracts from documents, entire documents, or database extracts if that is essential for the data subject to exercise their rights effectively under the GDPR, while taking into account the rights and freedoms of others.

– The CJEU also noted that the term “copy” does not refer to the document itself, but rather relates to the personal data it contains and that this must be complete. Therefore, the copy should include all the personal data that is being processed.  

– Finally, if there is a conflict between granting complete access to personal data and protecting the rights and freedoms of others, the Court suggests finding a balance between the two. It is preferable to choose methods of sharing personal data that do not violate the rights or freedoms of others. However, it is important to note that the end result should not be a complete refusal to provide information to the data subject. 

How does your organisation handle data subject rights? Contact us, experts in data privacy, if you want to learn more.



Thursday 4 May - 1:38 PM

Latest on EU AI Act

The advancement of artificial intelligence is expected to be the big technological revolution of this century.

As AI technologies continue to advance rapidly and many existing laws across the globe have not kept up with the potential negative outcomes of certain deployments of automated systems, there is uncertainty about when more stringent regulations may be enacted and how they will affect compliance in the future.

The EU is possibly the most advanced globally in creating comprehensive legislation governing the development and commercial usage of AI through the proposed AI Act.

According to Brando Benifei, a co-rapporteur of the AI Act and a Member of the European Parliament, the European Union faces a challenge in implementing a comprehensive AI regulation because machine learning systems have already been utilized for various commercial and governmental purposes throughout the continent. Any AI that is already being used must adhere to the current norms and regulations, said Benifei, and he added that he was hoping the AI Act would not pose “further regulatory burden” on them.

Furthermore, Benifei recognized the broad range of present applications for AI systems and expressed his desire that the AI Act would not impede the progress of machine learning technologies, but would instead ensure that high-risk systems are appropriately regulated to prevent discriminatory or undemocratic outcomes.

EU member states have already taken steps to ensure that companies producing and using AI technology comply with the EU General Data Protection Regulation, which aims to safeguard personal data and privacy.

Italy’s data protection authority, the Garante, placed a temporary ban on ChatGPT at the end of March, due to concerns that the AI application allegedly does not verify the age of its users and lacks a legal basis for collecting and storing such massive amounts of personal data to train its algorithm.

After the Garante issued the ban, OpenAI, the company behind ChatGPT, promised to collaborate with the authorities and to clarify its legal basis, in addition to its plans for more transparency. Moreover, the European Data Protection Board (EDPB) set up a ChatGPT task force to encourage collaboration and information-sharing among data protection authorities regarding potential enforcement actions.

AI has huge potential to solve various problems and address many challenges within our society; however, it could also pose a threat, as there are many unknowns and uncertainties surrounding both the technology and the laws regulating the field. Training an algorithm quickly and properly requires a lot of data, which can create data privacy challenges, particularly under the GDPR. It remains to be seen how regulators will tackle these issues.

Do you have any questions about developments within privacy and data protection regarding AI or other emerging technologies? Contact us, the Experts in Data Privacy, for more information.



Wednesday 3 May - 8:08 AM

EDPB Launches Data Protection Guide for Small Businesses

The European Data Protection Board (EDPB) has launched a new guide to help small businesses comply with the General Data Protection Regulation (GDPR). The guide provides practical advice and guidance on how to protect personal data and ensure compliance with the GDPR.  

The guide is aimed at small and medium-sized enterprises (SMEs) that may not have the resources or expertise to navigate the complex data protection landscape. It covers a range of topics, including data processing, data security, data breaches, and data subject rights. SMEs play a vital role in the European economy, but they often face challenges in complying with the GDPR. 

“In this guide, SMEs will find various tools and practical tips to help them comply with the GDPR. It includes concrete examples gathered during our 5 years of experience with the GDPR,” said Andrea Jelinek, the Chair of the EDPB.

The guide includes a range of resources such as videos, infographics, interactive flowcharts, and other practical materials designed to provide guidance on data protection practices for SMEs. These resources are aimed at helping SMEs understand their obligations under the GDPR and providing practical advice on how to implement data protection measures. 

Additionally, the guide provides an overview of other handy materials developed by national Data Protection Authorities (DPAs) to further assist SMEs in implementing data protection practices. It offers, among others, guidance on conducting a Data Protection Impact Assessment (DPIA), ensuring the security of personal data, and how to handle data breaches, such as notifying the relevant supervisory authority and data subjects. 

The guide is available on the EDPB’s website.


Do you have any questions about developments within privacy and data protection? Contact us, the Experts in Data Privacy, for more information.


Friday 28 Apr - 11:30 AM

Advocate General publishes Opinion regarding compensation for non-material damage for data breaches

On April 27, 2023, the Advocate General published its Opinion in Case C-340/21 regarding the compensation that can be awarded for non-material damage arising from liability and presumed fault on the part of the data controller.

In order to be exempt from liability, a data controller must demonstrate that it is not in any way responsible for the event giving rise to the damage. Fear of a possible misuse of the personal data in the future can constitute non-material damage. This will give rise to a right to compensation only if it is actual and certain emotional damage and not simply trouble or inconvenience for the data subject.

The Opinion of the Advocate General states that the data controller is obliged to implement appropriate technical and organizational measures to ensure that processing of personal data is performed in accordance with the General Data Protection Regulation (‘GDPR’).

The most important points of the Opinion are that:

  • The occurrence of a personal data breach is not sufficient in itself to conclude that the technical and organizational measures implemented by the data controller were not ‘appropriate’ to ensure data protection. The data controller’s decision is subject to possible judicial review of compliance, but the data controller must consider the ‘state of the art’, i.e. what is reasonably possible at the time of implementation, as well as the implementation costs. The assessment of the appropriateness of the measures will be based on a balancing exercise between the interests of the data subject and the economic interests and technological capacity of the controller, in compliance with the general principle of proportionality;
  • The burden of proving that the measures are appropriate is on the data controller and not the data subject;
  • The fact that the infringement has been committed by a third party does not in itself constitute a ground for exempting the data controller. In order to be exempted from liability, the data controller must demonstrate, to a high standard of proof, that it is not in any way responsible for the event giving rise to the damage;
  • Detriment consisting in the fear of a potential misuse of one’s personal data in the future, the existence of which the data subject has demonstrated, may constitute non-material damage giving rise to a right to compensation, provided that it is a matter of actual and certain emotional damage and not simply trouble or inconvenience.

While the Advocate General’s Opinion is not binding on the Court of Justice of the European Union (‘CJEU’), it provides an independent legal analysis which the judges can consider when deliberating the case. The outcome of Case C-340/21 is a development that will be closely monitored.

Does your organization have a question about data breaches or how to respond to data subjects regarding data breaches? Contact us, the Experts in Data Privacy, for more information.



Thursday 20 Apr - 11:44 AM

European Data Protection Board adopted three sets of Guidelines  

During its March plenary, the European Data Protection Board (“EDPB”) adopted the final versions of three Guidelines.

Following a public consultation, the EDPB has finalized the “Guidelines on data subject rights – Right of access”, which provide more precise guidance on implementing the right of access in different situations. The Guidelines cover various aspects such as

    • the scope of the right of access,  
    • the information to be provided to data subjects,  
    • the format of access requests,  
    • the main modalities for providing access,  
    • the notion of manifestly unfounded or excessive requests, and  
    • the limitations of the right of access. 


The EDPB also adopted final versions of the targeted updates of “Guidelines for identifying a controller or processor’s lead supervisory authority” and the “Guidelines on data breach notification”. These updates concern the Article 29 Working Party Guidelines on the same subject.  

Regarding data breach notification, the new version clarifies that the responsibility of notification lies with the controller. However, stakeholders raised concerns about operational issues when notifying multiple DPAs. While the targeted update aligns with the GDPR, which does not provide for a one-stop-shop mechanism for controllers established outside EEA, the EDPB considered stakeholders’ feedback. Accordingly, the EDPB will publish a contact list for data breach notification with relevant links and accepted languages for all EEA DPAs on its website. This will facilitate controllers in identifying contact points and requirements per DPA. 


Do you have any questions about developments within privacy and data protection? Contact us, the Experts in Data Privacy, for more information.



Friday 14 Apr - 11:20 AM

MEPs are not in favor of allowing the transfer of personal data under the current EU-U.S. Data Privacy Framework

The European Parliament Committee on Civil Liberties, Justice and Home Affairs adopted a resolution on Thursday rejecting the proposed EU-U.S. Data Privacy Framework. Members of the European Parliament (MEPs) commented that while the framework is an improvement, it does not meet the requirements to justify an adequacy decision.

The European Commission initiated the process of adopting an adequacy decision for the EU-U.S. Data Privacy Framework on December 13th. This decision aims to facilitate trans-Atlantic data transfers and would resolve the concerns raised by the Court of Justice of the European Union in its Schrems II decision of July 2020.

This draft adequacy decision entails that personal data transferred from the EU to the U.S. would be adequately protected by the United States. The decision was made after a thorough analysis and assessment of the Data Privacy Framework, including the obligations it imposes on companies, as well as the limitations and safeguards that apply when U.S. public authorities access EU citizens’ personal data, specifically for the purposes of criminal law enforcement and national security.

MEPs noted that although the framework has some improvements, it still permits the bulk gathering of personal data under specific circumstances. Additionally, it does not require independent authorization before bulk data collection and does not establish explicit guidelines for data retention.

MEPs have pointed out that the Data Privacy Framework introduces a Data Protection Review Court (DPRC) to handle complaints from EU data subjects and provide redress. However, the DPRC’s decisions would be classified, which would violate data subjects’ right to access and rectification of their personal data. Furthermore, the U.S. President could potentially dismiss the DPRC judges and strike down its decisions. Therefore, MEPs believe that the Review Court lacks independence, a concern also raised in their draft opinion on this matter in February.

MEPs argue in the resolution that the framework has to be future-proof and that the Commission’s adequacy assessment must depend on how the U.S. privacy rules are implemented in practice. The U.S. Intelligence Community is still adapting its approach and methods pursuant to the Data Privacy Framework; therefore, according to the MEPs, an evaluation of its effects and consequences is not yet possible.

Since the prior data transfer frameworks between the EU and the U.S. were struck down by the Court of Justice of the European Union, most recently in its Schrems II decision, MEPs are urging the Commission to ensure that the future framework can withstand legal challenges and offer legal certainty to EU citizens and companies. To accomplish this, MEPs advise the Commission not to grant an adequacy decision based on the current framework, but instead to work out a framework that would be likely to be upheld in court.

MEPs adopted the resolution with 37 votes in favor, no votes against, and 21 abstentions. The resolution will now be put forward for a vote at a future plenary session of the European Parliament.

Organizations must ensure that they follow the existing transfer mechanisms, such as SCCs and TIAs, and implement any additional measures until there is more clarity regarding the EU-US Data Privacy Framework.

Does your organization have any questions about transferring personal data internationally to the U.S.? Contact us, the Experts in Data Privacy, for assistance.



Tuesday 11 Apr - 11:35 AM

Updated EDPB Guidelines regarding personal data breach notification under the GDPR

Recently, the European Data Protection Board (“EDPB”) updated its Guidelines on personal data breach notification under the GDPR. This update follows a targeted public consultation on data breach notification for controllers not established in the EEA.

The EDPB identified a need to clarify the notification requirements concerning personal data breaches at non-EU establishments. The paragraph on this matter has been revised and updated, while the rest of the document was left unchanged, apart from editorial changes.

Where a controller not established in the EU is subject to the GDPR (Articles 3(2) and 3(3)) and experiences a data breach, it is still bound by the notification obligations under the GDPR (Articles 33 and 34).

Although the GDPR requires a controller (and a processor) not established in the European Economic Area (“EEA”) but subject to the GDPR to designate a data protection representative (“DPR”) in the EEA, these guidelines make clear that the mere presence of a DPR in a Member State does not trigger the one-stop-shop mechanism. For this reason, a data breach will need to be notified by the controller to the data protection authority (“DPA”) of every EEA Member State in which affected individuals reside. This means that when a controller not established in the EEA experiences a data breach affecting individuals in multiple Member States, it is obliged to notify the DPA of each of those Member States.
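As a rough illustration of the rule described above, a non-EEA controller's notification targets can be derived mechanically from the Member States in which affected individuals reside. This is only a sketch, not an official procedure or API: the `DPA_BY_MEMBER_STATE` mapping below is abridged and hypothetical in scope, although the authority names themselves are real.

```python
# Sketch only: without the one-stop-shop mechanism, a non-EEA controller
# must notify the DPA of every EEA Member State where affected individuals
# reside. This mapping is abridged for illustration, not a complete list.
DPA_BY_MEMBER_STATE = {
    "AT": "Datenschutzbehörde (Austria)",
    "IE": "Data Protection Commission (Ireland)",
    "NL": "Autoriteit Persoonsgegevens (Netherlands)",
}

def dpas_to_notify(affected_member_states):
    """Return the DPAs a non-EEA controller must notify, one per affected state."""
    return sorted(DPA_BY_MEMBER_STATE[ms] for ms in set(affected_member_states))

# A breach affecting individuals in Austria and the Netherlands means two
# separate notifications:
print(dpas_to_notify(["AT", "NL", "NL"]))
```

In practice each DPA has its own notification form, deadlines and language requirements, so the lookup is only the first step.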

Similarly, where a processor is subject to the GDPR, it will be bound by the obligations on processors, of particular relevance here the duty to notify a data breach to the controller (Article 33(2)).

Although this means that more effort is required from controllers that are subject to the GDPR but not established in the EEA, the update provides more clarity on how to act in a GDPR-compliant manner in the event of a cross-border data breach.

Does this affect your organization’s data breach policy and procedure, or do you want to learn more about data breaches under the GDPR? Contact us, DPO Consultancy, experts in data privacy.


Tuesday 4 Apr - 1:53 PM

ChatGPT: Recent developments regarding privacy compliance 

Since ChatGPT is increasing in popularity, regulators are spending more time assessing its level of compliance with different laws and regulations, including the GDPR but also, for instance, the FTC Act.


ChatGPT is an Artificial Intelligence (AI) chatbot developed by OpenAI and launched in November 2022. OpenAI is a research laboratory headquartered in San Francisco (United States). The core function of ChatGPT is to mimic a human conversationalist, which makes it versatile: you can ask the chatbot to answer test questions, write song lyrics, and even debug computer programs. OpenAI collects data retrieved via ChatGPT from users to further train and fine-tune the service.

Recent developments in the US have shown that the ethics group Center for AI and Digital Policy has asked the US Federal Trade Commission to halt new AI chatbot releases by OpenAI, because the GPT-4 version is “biased, deceptive, and a risk to privacy and public safety”. The group’s formal complaint states that GPT-4 fails to meet FTC standards of being “transparent, explainable, fair and empirically sound while fostering accountability”. One example the group cites is that private chat histories could be exposed to other users without the user’s knowledge. The group therefore aims to ensure the establishment of the necessary guardrails to protect consumers, businesses, and the commercial marketplace in the US.

In Europe, the Italian DPA (Garante) ordered a “temporary limitation of the processing of data of Italian users” by OpenAI and started an investigation into the vendor. The Garante found that the GDPR is violated, both because of a lack of information provided to users about OpenAI’s data collection and because of a lack of a lawful processing ground justifying the mass collection and storage of personal data for the purpose of ‘training’ the chatbot’s algorithms.

OpenAI now has to respond to the Garante’s request within 20 days; otherwise, it may face a fine of up to 20 million euros or up to 4% of its annual global turnover, whichever is higher.
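The fine cap described above follows the higher GDPR tier (Article 83(5)): the maximum is whichever is greater of EUR 20 million or 4% of annual global turnover. A minimal sketch of that arithmetic, using hypothetical turnover figures purely for illustration:

```python
# Illustrative only (not legal advice): the maximum fine under the higher
# GDPR tier is the greater of EUR 20 million or 4% of annual global turnover.

def max_gdpr_fine(annual_global_turnover_eur: float) -> float:
    """Return the theoretical maximum fine in euros for a given turnover."""
    return max(20_000_000.0, 0.04 * annual_global_turnover_eur)

# Hypothetical examples: for EUR 100M turnover, 4% is EUR 4M, so the
# EUR 20M floor applies; for EUR 1B turnover, 4% is EUR 40M and prevails.
print(max_gdpr_fine(100_000_000))    # 20000000.0
print(max_gdpr_fine(1_000_000_000))  # 40000000.0
```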

Although ChatGPT’s booming development has been commercially beneficial for OpenAI so far, it has not been beneficial for the chatbot’s compliance with multiple laws and regulations. Critical assessments of the chatbot are just getting started, and more complaints and requests are to be expected.



Friday 24 Mar - 8:56 AM

Meta tracking tools declared illegal by Austrian Data Protection Authority

About a month after the Court of Justice of the European Union (‘CJEU’) issued its ruling in the Schrems II case, noyb filed 101 complaints regarding data transfers from websites based in the European Economic Area (‘EEA’) to Google LLC and Facebook Inc., located in the United States (‘US’).

In order to coordinate the work of the various data protection authorities (‘DPA’), the European Data Protection Board (‘EDPB’) created a special task force. This decision by the Austrian DPA arises from one of those complaints.

Facts of the complaint:
In August 2020, the data subject visited a website hosted by an Austrian media company while logged into his personal Facebook account. At the time, the company made use of the Facebook Login tool, which facilitates a user’s access to services not offered by Facebook without the creation of additional accounts. The company also used the Facebook Pixel tool, which was able to track the visitor’s activities on its website.

The data subject argued that mere access to the website triggered an unlawful transfer of personal data to the US in contravention of the General Data Protection Regulation (‘GDPR’). He therefore filed a complaint against the media company for using these tracking tools and further argued that Meta infringed various provisions of the GDPR.

While the Austrian DPA was investigating the complaint, the media company deactivated the tracking tools and argued that it had concluded an agreement with Meta Platforms Ireland Limited, and that any transfers, including transfers outside the EU, were therefore compliant and justified in light of the media privilege.

Outcome of the complaint:
The Austrian DPA held that Meta Platforms, Inc. did not violate Article 44 GDPR, as it was a data importer falling outside the scope of Chapter V GDPR. The Austrian DPA did not elaborate extensively on the qualification of Meta under the GDPR, as it did not have enough evidence to qualify it as a controller.

The Austrian DPA upheld the complaint against the Austrian media company for three reasons. Firstly, merely deactivating the tracking tools after receiving the complaint did not prevent an infringement of Article 44 GDPR, as the violation had already occurred. Secondly, the media privilege claim did not fall within the scope of the journalistic exception, as the tracking tools were implemented for tracking purposes and to facilitate the login procedure, not solely for the purpose of disseminating information, opinions or ideas to the public. Furthermore, once the data was transferred to Meta Ireland, it could be used for further purposes. Thirdly, there was no lawful basis for the international data transfer, and thus a violation of Chapter V GDPR had occurred.

Consequences of the complaint:
The decision by the Austrian DPA is a clear indication that the use of Facebook’s tracking pixel directly violates the GDPR as interpreted by the CJEU in its Schrems II ruling.

In light of this decision, numerous websites that use the Facebook tracking tools to track users and display personalized advertisements may need to reconsider their approach in this regard and keep track of other DPAs following the example of the Austrian DPA.

Does your organization make use of tracking tools, or do you have questions about the impact of this decision? Contact us, the Experts in Data Privacy, for more information.



Friday 10 Mar - 2:39 PM

The reform of the UK GDPR: what can we expect?

The second iteration of a set of proposals to reform the UK GDPR has been published: the Data Protection and Digital Information (No.2) Bill.


Based on this publication there are a couple of interesting findings: 

  • The fundamental principles of the current UK GDPR, the obligations of controllers and processors, the data subject rights, and the wider constitutional and regulatory environment for privacy would remain unaffected by the proposals.  
  • There are targeted changes that reflect the experience with the GDPR so far, proposed in pursuit of the government’s policy objective to reduce compliance costs by introducing a more ‘common-sense-led’ version of the GDPR.
  • There are targeted changes being made to current law. These changes may give businesses more flexibility and empower future rule-making or guidance-issuing.
  • Organizations that are compliant with the current UK GDPR will not be obliged to make changes to comply with the proposed UK GDPR, although the reforms will offer them the opportunity to make use of new compliance efficiencies. Conflicting requirements between the two versions are not expected.


Some new proposals in the second iteration of the Bill include: 

  • It is proposed to include a list of activities that may be regarded as within a controller’s legitimate interest to process data, although controllers are still required to ensure their interests are not outweighed by the data subject’s rights. It is also clarified that any legitimate commercial activity can fall under the legitimate interest lawful processing ground, provided the processing is necessary and the balancing test is carried out.
  • The proposal also contains an illustrative, non-exhaustive list of types of scientific research, previously found in the recitals and now moved to the operative parts of the UK GDPR, e.g. innovative research into technological development or applied or fundamental research. It also clarifies that research into public health falls under scientific research only if it is in the public interest, and it exempts controllers from providing notice where personal data has been collected directly from the data subject in certain instances.
  • Regarding international data transfer mechanisms, alternatives are proposed.
  • The proposals also include a requirement to maintain a record of processing activities for processing that is likely to result in a high risk to the rights and freedoms of data subjects, including for organizations with fewer than 250 employees.
  • The proposals create an obligation on providers of public electronic communication services and networks to report suspicious activity relating to unlawful direct marketing to the ICO. It is expected that there will be guidance published on what constitutes reasonable suspicion.


At DPO Consultancy we will keep monitoring any new developments regarding the UK GDPR. Do you want to learn more about the consequences of these proposals for your organization? Then contact us, DPO Consultancy, experts in data privacy.



Tuesday 28 Feb - 9:36 AM

European Commission bans TikTok from corporate devices

The European Commission announced on Thursday that it had asked all employees to uninstall TikTok from their corporate devices, as well as from personal devices that use corporate apps, over data protection concerns. The Council of the EU followed later in the day with a similar measure, banning its staff from using the Chinese app.

An email sent to EU officials states: “To protect the Commission’s data and increase its cybersecurity, the EC Corporate Management Board has decided to suspend the TikTok application on corporate devices and personal devices enrolled in the Commission mobile device services.”

Officials are required to uninstall the app at their earliest convenience and before March 15. Alternatively, they can delete work-related apps from their personal devices if they want to keep TikTok.

This is reportedly the first time the Commission has suspended the use of an app for its staff. The measure aims to protect Commission data and systems from potential cybersecurity threats.

In November 2022, TikTok admitted that European TikTok user data could be accessed by staff at its Chinese headquarters. Scrutiny also intensified in the US following a Forbes report that the app had been used to spy on journalists covering TikTok.

The US has recently banned TikTok on government devices at the state and federal levels due to fears about potential spying by China. Although most EU countries have not yet made a move about the app, Dutch government officials were told to suspend the use of TikTok.

Do you have any questions about developments within privacy and data protection? Contact us, the Experts in Data Privacy, for more information.


Tuesday 21 Feb - 2:19 PM

New law to ‘fix’ GDPR

Before the summer of 2023, the European Commission (EC) will propose a new law aimed at improving how the privacy regulators of the various European Union (EU) countries enforce the General Data Protection Regulation (GDPR).


A watershed moment in global tech regulation occurred in 2016, when the GDPR was adopted, as companies were forced to abide by new privacy standards which included asking for consent to collect people’s data online. Also, the possibility of receiving substantial fines for noncompliance was introduced.

Five years down the road, activists, experts and some national privacy watchdogs have become frustrated at what they see as an inefficient system to tackle major cases, especially in instances when Big Tech companies are involved.

Another aspect receiving attention is the powerful role that the Irish Data Protection Commission has under the one-stop-shop rule, under which most major investigations are handled by the Irish authority, as tech companies like Meta, Google, Apple and others have established their European headquarters in Ireland. The Irish and, to a lesser extent, Luxembourg authorities have received frequent and mounting criticism in recent years.

The EC is set to introduce a new EU regulation in the second quarter of 2023, which will set clear procedural rules for national data protection authorities dealing with cross-border investigations and infringements. The EC has been quoted as saying that the law “will harmonize some aspects of the administrative procedure in cross-border cases and support a smooth functioning of the GDPR cooperation and dispute resolution mechanism.”

Last year, the European Data Protection Board (EDPB) sent the EC a “wishlist” of procedural law changes to improve the enforcement. Some ideas included setting deadlines for different procedural steps in handling cases and harmonizing the rights of the different parties involved in EU-wide investigations.

The EC wants to keep the new regulation very targeted and limited. This is partly because it expects the new regulation to trigger tense discussions with data privacy watchdogs, campaigners and Big Tech lobbyists.

Furthermore, it is not only lobby groups that are expected to jump on the EC’s new privacy regulation. Regulators themselves have clashed over how to interpret and enforce the GDPR, which has often triggered dispute resolution procedures and delayed cases. The EC has said that “nobody will be happy with the Commission proposal as usual, because the data protection authorities agree on the problem but they do not agree on the solutions.”

The EU executive has only a short window to push the new text through the EU legislative process, as European elections are set for spring 2024.

While this new law is expected to solve the enforcement flaws of the GDPR, it could open a Pandora’s box of lobbying and regulator infighting. This is a development that will definitely be closely monitored.

Do you have any questions about developments within privacy and data protection? Contact us, the Experts in Data Privacy, for more information.





Friday 17 Feb - 8:51 AM

European Commission urged to reject EU-US adequacy

The European Parliament Committee on Civil Liberties, Justice and Home Affairs (LIBE committee) does not want the European Commission (EC) to extend an adequacy decision to the United States (US) based on the proposed EU-US Data Privacy Framework (DPF). This was made clear in its draft opinion published on February 14.

In their opinion, the committee members have concluded that the proposed DPF “fails to create actual equivalence in the level of protection offered under the EU General Data Protection Regulation” and have urged the EC to only adopt a decision when “meaningful reforms were introduced, in particular for national security and intelligence purposes” on the part of the US.

The LIBE committee’s opinion is viewed by others as a noteworthy step towards accountability and warrants serious consideration before an adequacy decision is granted to the US.

Parliament and the European Data Protection Board (EDPB) each have the opportunity to present opinions, which are nonbinding, to the EC to consider towards its adequacy decision.

Maastricht University European Centre on Privacy and Cybersecurity Senior Visiting Fellow Paul Breitbarth has been quoted as saying “with the EDPB’s opinion on the framework weeks away, I think the Parliament’s resolution is very timely, and might also influence the (data protection authorities’) debate on whether or not the framework is indeed essentially equivalent.”

The LIBE committee highlighted the key discrepancies in the DPF, focusing in particular on equivalent protection and the proposed redress system the US plans to establish for EU citizens. MEPs also find it worrisome that key data protection concepts, such as the principles of necessity and proportionality, are defined differently, that the US lacks comprehensive federal privacy legislation, and that the framework rests on a US Executive Order.

MEPs are also concerned about the proposed Data Protection Review Court, citing a perceived lack of independence and general transparency. The committee’s view is that the Data Protection Review Court’s process is “based on secrecy and does not set up an obligation to notify the complainant that their personal data has been processed.” Furthermore, it neither allows for appeals to federal courts nor offers the opportunity to claim damages where appropriate.

This draft opinion will be formally presented in the LIBE committee at the beginning of March with the hope of finalization at the end of March. Thereafter, the opinion will go to a full parliament vote in April. The EC has maintained that it hopes to finalize the adequacy decision by July at the earliest.

Until there is more clarity surrounding the EU-US Data Privacy Framework, organizations must ensure they adhere to the available transfer mechanisms and that the supplementary measures are implemented.

Does your organization have any questions about transferring personal data internationally to the US? Contact us, the Experts in Data Privacy, for assistance.




Tuesday 14 Feb - 11:28 AM

New Swiss Federal Act on Data Protection to introduce stricter requirements than the GDPR

The longstanding Swiss Federal Act on Data Protection of 1992 will be replaced with new legislation (the revFADP) to better protect Swiss citizens’ data. The revFADP will improve the processing of personal data and grant Swiss citizens new rights consistent with other comprehensive laws, such as the General Data Protection Regulation (GDPR).

This change also introduces various increased obligations for companies doing business in Switzerland. Furthermore, while the revised legislation has many similarities to the GDPR, there are stark differences companies must be aware of.

The most important aspects companies must be aware of include:

  • No compliance grace period: the revFADP takes effect on September 1, 2023, and there is no grace period for companies to become compliant.

  • Expanded definition of sensitive data: the definition of sensitive personal data will be expanded to include genetic and biometric data that unequivocally identifies a natural person. The explicit consent of the data subject is required when processing sensitive personal data.

  • Emphasized importance of an “independent” DPO: the Swiss Federal Data Protection and Information Commissioner (FDPIC) strongly emphasizes the importance of an independent DPO. The DPO’s activities should therefore remain separate from the company’s other business activities. It is also recommended that DPOs speak at least one of the languages of Switzerland, such as French, German, Italian or Romansh.

  • Breach notice for serious attacks only, with no clear notice timeframe: under the revFADP, the controller must notify the FDPIC of certain serious personal data breaches ‘as soon as possible.’ Notice is only required for breaches that pose an ‘imminent danger’ to data subjects.

  • Penalties: the revFADP does not impose civil penalties, but intentional violations can result in criminal sanctions of up to 250,000 Swiss francs against individuals, which could potentially include DPOs and C-suite executives, rather than the entity.

It is expected that the FDPIC will issue updates and guidance on the revFADP. These updates and guidance will continue to be monitored.

Does your company do business in Switzerland? Contact us, the Experts in Data Privacy, to ensure your company is compliant with the changes the revFADP will introduce.




Friday 10 Feb - 1:57 PM

DPOs and conflict of interest: CJEU issues ruling

The Court of Justice of the European Union (CJEU) has affirmed that Data Protection Officers (DPOs) can maintain other tasks and duties within their role, provided it does not result in a conflict of interest.

The ruling of 9 February centered around Article 38 of the GDPR. The CJEU has stated that DPOs should “be in a position to perform their duties and tasks in an independent manner” but “cannot be entrusted with tasks or duties which would result in him or her determining the objectives and methods of processing personal data on the part of the controller or its processor.”

The CJEU has further stated that this is “a matter for the national court to determine, case by case, on the basis of an assessment of all the relevant circumstances.”

The Federal Labour Court of Germany (Bundesarbeitsgericht) submitted a request to the CJEU for a preliminary ruling in proceedings between X-Fab Dresden and its former DPO. The former DPO, who was also chair of the works council, was dismissed from the role of DPO in December 2017. When the GDPR came into force in May 2018, X-Fab argued that the former DPO’s dismissal was justified due to “a risk of a conflict of interests” in performing both functions, as these posts were incompatible.

The CJEU found that Article 38 GDPR provides that DPOs cannot be dismissed or penalized for performing their tasks; however, this does not prevent national laws from establishing additional protections against dismissing DPOs. These additional protections should not “compromise the principal objectives of the GDPR to maintain high levels of data protection.” The example provided by the CJEU was that national laws cannot protect a DPO from dismissal if the DPO is unable, or no longer able, to carry out his or her duties in complete independence due to the existence of a conflict of interest.

This ruling comes ahead of the European Data Protection Board’s upcoming coordinated enforcement action focusing on the designations of DPOs.

Does your organization have questions about the role of a DPO? Contact us, the Experts in Data Privacy, for more information.




Tuesday 7 Feb - 8:57 AM

Privacy by Design to become an ISO standard on 8 February

In January, it was reported that on February 8, 2023, the International Organization for Standardization (ISO) will adopt Privacy by Design (PbD) as ISO 31700.

PbD was introduced 14 years ago by Ann Cavoukian, then the Information and Privacy Commissioner of Ontario, Canada, and will now become an international privacy standard for the protection of consumer products and services.

PbD is a set of principles that calls for privacy to be taken into account throughout an organization’s data management process. Since its introduction, it has been adopted by the International Assembly of Privacy Commissioners and Data Protection Authorities and was incorporated into the General Data Protection Regulation (GDPR).

Cavoukian has been quoted as saying “the ISO standard is designed to be utilized by a whole range of companies – startups, multinational enterprises, and organizations of all sizes. With any product, you can make this standard work because it’s easy to adopt. We’re hoping privacy will be pro-actively embedded in the design of [an organization’s] operations and it will complement data protection laws.”

The proposed introduction notes that PbD refers to several methodologies for product, process, system, software and service development. The proposed bibliography that comes with the document refers to other standards with more detailed requirements on identifying personal information, access controls, consumer consent, and corporate governance amongst other topics. A separate document will outline possible use cases as well.

Adopting privacy by design can be regarded as a competitive advantage for businesses: by implementing PbD principles, both privacy and business interests are addressed, resulting in a ‘win-win’ situation.

Does your organization have any questions about PbD? Read this white paper to find out more.



Wednesday 1 Feb - 8:30 AM

Hackers stole customers’ backups says LastPass owner GoTo

LastPass’ parent company GoTo – formerly known as LogMeIn – has confirmed that hackers stole customers’ encrypted backups during a recent breach of its systems.

The breach was first confirmed by LastPass on November 30. At the time, LastPass said that an ‘unauthorized party’ had gained access to some customers’ information stored in a third-party cloud service shared by LastPass and GoTo. The hackers used information stolen from an earlier breach of LastPass systems in August to further compromise the companies’ shared cloud data.

Now, almost two months later, GoTo has said in an updated statement that the cyberattack impacted several of its products, including the business communications tool Central, an online meetings service, the hosted VPN service Hamachi, and the RemotelyAnywhere remote access tool.

The hackers exfiltrated customers’ encrypted backups from these services, along with an encryption key for at least part of that data.

GoTo has said that the company does not store customers’ credit card or bank details or collect personal information, such as date of birth, home address or Social Security Numbers. This, however, is in clear contrast to the hack affecting its subsidiary LastPass. During the attack at LastPass, hackers stole the contents of customers’ encrypted password vaults, along with customers’ names, email addresses, phone numbers and some billing information.

GoTo has 800,000 customers, including enterprises, but has not indicated how many customers are affected. Furthermore, despite the delay, GoTo has not provided any remediation guidance or advice for affected customers.

Does your organization make use of a program where passwords or other sensitive information is saved? It may be time to review your Information Security Policy to ensure technical and organizational measures are in place to adequately protect personal data. Contact us, the Experts in Data Privacy, for assistance.





Friday 27 Jan - 2:18 PM

Location data is not personal data… or is it?

The Spanish data protection authority (AEPD) had previously held that the location data of a telecommunications provider is not personal data under the GDPR. The organization noyb (none of your business) appealed the AEPD’s decision before the Spanish courts, which sided with noyb and confirmed that location data is personal data.

It all started when the telecommunications provider denied its customers access to their location data because, in its view, the data did not qualify as personal data under the GDPR, so it believed it did not have to grant access rights to users. A Spanish customer requested access to his location data from the provider and was refused, after which he filed a complaint with the AEPD. When the AEPD sided with the provider, noyb appealed the decision in June 2022.

The AEPD’s decision is hard to comprehend, since the GDPR contains a very broad definition of personal data:

“…any information relating to an identified or identifiable natural person (‘data subject’); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person…”

The definition even explicitly mentions location data as an example of an identifier.

Furthermore, the right of access to personal data should not be taken lightly. Under the GDPR, individuals have the right to access the data organizations store about them. The telecommunications provider therefore could not lawfully deny access to location data.
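In practice, a controller must also respond to such access requests on time. As a purely illustrative sketch (the function names are our own, not part of any law), the GDPR’s Article 12(3) deadlines, one month from receipt, extendable by two further months for complex or numerous requests, can be computed like this:

```python
from datetime import date
import calendar

def add_months(d: date, months: int) -> date:
    """Add calendar months, clamping the day (e.g. 31 Jan + 1 month -> 28/29 Feb)."""
    month_index = d.month - 1 + months
    year = d.year + month_index // 12
    month = month_index % 12 + 1
    day = min(d.day, calendar.monthrange(year, month)[1])
    return date(year, month, day)

def dsr_deadline(received: date, extended: bool = False) -> date:
    """Article 12(3) GDPR: respond within one month of receipt; the period
    may be extended by two further months for complex or numerous requests."""
    return add_months(received, 3 if extended else 1)

print(dsr_deadline(date(2023, 1, 31)))        # clamped to the end of February
print(dsr_deadline(date(2023, 1, 15), True))  # extended deadline
```

A deadline tracker along these lines is only one small piece of a DSR procedure, but it makes the statutory response window concrete.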

How does your organization handle data subject requests (DSRs)? A decent DSR procedure will help your organization assess and manage DSRs in a GDPR-compliant manner. Contact us if you want to learn more about DSRs.


Friday 20 Jan - 11:30 AM

Cookie banners to be consistent across the EEA soon

On the 18th of January 2023, the European Data Protection Board (“EDPB”) adopted a report by the Cookie Banner Task Force for the work carried out in ensuring a consistent approach to cookie banners across the European Economic Area (“EEA”).

The Cookie Banner Task Force was established in September 2021 to coordinate the response to complaints about cookie banners filed with numerous EEA Data Protection Authorities (“DPAs”) by noyb.

The Task Force aimed to promote cooperation, information sharing and best practices amongst the DPAs, which has been instrumental in ensuring a consistent approach to cookie banners across the EEA.

The report notes that the various DPAs agreed on a common denominator in their interpretation of the applicable provisions of the ePrivacy Directive and the General Data Protection Regulation (“GDPR”) on issues such as reject buttons, pre-ticked boxes, banner design and withdrawal icons.

Cookies are used on most websites, so it is important that your organization stays up to date with these developments. Do you have questions about how this report will affect your organization or which changes must be made to your existing cookie policy? Contact us, the Experts in Data Privacy, for more information.




Tuesday 17 Jan - 12:12 PM

Court of Justice of the European Union: Everyone has the right to know to whom his/her personal data have been disclosed

On January 12, 2023, the Court of Justice of the European Union (“CJEU”) issued a preliminary ruling on the interpretation of the right of access to personal data (Article 15 GDPR). The CJEU ruled that the right of access entails the right to know the specific identity of recipients of the personal data.  

The CJEU was asked whether the GDPR leaves the data controller the choice to disclose either the specific identity of the recipients or only the categories of recipient, or whether it gives the data subject the right to know their specific identity.  

The highlights of the judgment are:  

  • The data subject has the right to know the specific identity of the recipients of his personal data as part of his/her right of access under Article 15(1)(c) of the GDPR. Informing the data subjects only about the categories of recipients, e.g., advertisers, IT companies, mailing list providers, is not sufficient.  
  • The right of access is necessary to enable the data subject to exercise other rights conferred by the GDPR, namely the right to rectification, right to erasure, right to restriction of processing, right to object to processing or right of action where he/she suffers damage. If the data subject is not informed about the identity of the recipients, the exercise of these rights vis-á-vis these recipients would be restricted.  
  • Therefore, where the personal data have been or will be disclosed to recipients, the data controller is obliged to provide the data subject, on request, with the actual identity of those recipients, unless it is not (yet) possible to identify those recipients, or the controller demonstrates that the request is manifestly unfounded or excessive.  

The information obligation under Articles 13 and 14 of the GDPR allows the data controller to inform data subjects only about the categories of recipients. The importance of the judgment lies in the fact that the data subject’s right to know the identity of recipients seems to prevail over the data controller’s option to disclose only the categories of recipients.

This ruling highlights the importance of maintaining a clear record of processing activities. Controllers should keep track of the actual identity of recipients for each data transfer and duly record the names of specific recipients in their record of processing activities, as recording only the categories of recipients would not be sufficient to fully comply with Article 15(1)(c) of the GDPR.
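As an illustration only, a record of processing activities that tracks named recipients per disclosure might be sketched as follows. The field names are hypothetical, nothing in the GDPR prescribes this structure; the point is that each disclosure records a specific identity alongside its category:

```python
from dataclasses import dataclass, field

@dataclass
class Disclosure:
    recipient_name: str      # specific identity, e.g. "Acme Mailing B.V." (hypothetical)
    recipient_category: str  # e.g. "mailing list provider"
    date: str                # ISO date of the transfer

@dataclass
class ProcessingActivity:
    purpose: str
    disclosures: list = field(default_factory=list)

    def recipients_for_access_request(self) -> list:
        """Return the specific identities of recipients, as the CJEU read
        Article 15(1)(c) to require on request; categories alone would not do."""
        return sorted({d.recipient_name for d in self.disclosures})

# Hypothetical recipients, for illustration only.
activity = ProcessingActivity(purpose="newsletter")
activity.disclosures.append(Disclosure("Acme Mailing B.V.", "mailing list provider", "2023-01-10"))
activity.disclosures.append(Disclosure("AdTech GmbH", "advertiser", "2023-01-12"))
print(activity.recipients_for_access_request())
```

With the names on file, answering an access request with specific identities becomes a simple lookup rather than a scramble through contracts.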


How does your organisation handle data subject rights or keep its record of processing activities? Contact us, the Experts in Data Privacy, if you want to learn more.


Thursday 5 Jan - 2:15 PM

Meta fined €390 million by the Irish DPC over legal basis for targeted advertising

The Irish Data Protection Commission (“DPC”) announced its final decisions on two inquiries into Meta’s data processing operations on Facebook and Instagram. The DPC fined Meta a total of €390 million for breaches of the GDPR and invalidated its reliance on a “contract” legal basis for personal data processing for targeted advertising purposes.  

The decision came after complaints made by the privacy rights organisation NOYB on May 25, 2018, the day the GDPR came into operation. Before 25 May 2018, Meta had changed the Terms of Service for its Facebook and Instagram services, indicating a change in the legal basis on which Meta relies to legitimise its processing of personal user data. Meta would rely on the contract legal basis for certain data processing activities in the context of delivering the Facebook and Instagram services, including behavioural advertising, rather than on consent, as it had previously done.

The users were asked to accept the new Terms of Service if they wished to continue to access Facebook and Instagram services. The DPC stated that Meta considered that “a contract was entered into” between Meta and users and that “processing of users’ data in connection with the delivery of its Facebook and Instagram services was necessary for the performance of that contract, to include the provision of personalised services and behavioural advertising, so that such processing operations were lawful by reference to Article 6(1)(b) of the GDPR (the “contract” legal basis or processing).”

The complainants argued that by making the use of its services conditional on the users’ acceptance of the Terms of Service, Meta was forcing them to agree to Meta’s use of personal data for behavioural advertising purposes, which implied that Meta was in fact relying on consent as a legal basis for this processing and that there was no real choice for users in this regard.

Following the European Data Protection Board’s binding determinations on the matter, the DPC adopted its final decisions in which it stated that:

  • Meta is not entitled to rely on the contract legal basis for the delivery of behavioural advertising services on Facebook and Instagram and therefore, the processing of personal user data based on contract legal basis constitutes a violation of Article 6 of the GDPR, and
  • Since users were not clearly informed about the legal basis relied on by Meta and what processing operations were being performed on their data, Meta violated its transparency obligations. It is considered that such a lack of transparency constituted breaches of the lawfulness, fairness and transparency principle enshrined in Article 5(1)(a), and Articles 12 and 13(1)(c) of the GDPR.

Furthermore, the decision requires Meta to bring its data processing operations into compliance with the GDPR within three months. This means that Meta has to find a new legal basis for its processing activities in relation to targeted advertising.

NOYB Founder Max Schrems embraced the decision and said users “must have a ‘yes or no’ option and can change their mind at any time. The decision also ensures a level playing field with other advertisers that also need to get opt-in consent.”

Although the decision concerns the activities of Meta on Facebook and Instagram, it is surely expected to exert influence over other online platforms in the digital advertising industry.

In June 2022, TikTok amended its privacy policy to reflect its intended switch from relying on users’ consent to legitimate interest as the legal basis for processing personal data for personalised advertising. This was followed by the Italian Supervisory Authority’s warning to TikTok that processing personal data for personalised advertising on the basis of legitimate interest would violate the GDPR and the ePrivacy Directive. TikTok eventually agreed to pause the controversial privacy policy update in the EU after engagement with the DPC.

As Schrems argued, many platforms currently make access to their service conditional on the user’s consent. The decision leaves open the question of whether Meta or other online businesses must allow users to refuse targeted advertising without denying them access to their services.


Does your organization make use of GDPR-compliant targeted advertising? Contact us, the Experts in Data Privacy, if you want to learn more.


Friday 30 Dec - 10:13 AM

France fines Microsoft €60 million over cookies

The French data protection authority, the CNIL, sanctioned Microsoft with a fine of 60 million euros for not putting in place a mechanism allowing users to refuse cookies as easily as accepting them.

The sanction came after the CNIL carried out several investigations over a complaint relating to the conditions for depositing cookies on “”. The CNIL’s decision was based on breaches of Article 82 of the French Data Protection Act. In particular, it found that:

– When a user visited the search engine “”, a cookie serving several purposes, including the fight against advertising fraud, was automatically placed on their terminal without any action on their part.

– Moreover, when a user continued to browse on the search engine, a cookie with an advertising purpose was deposited on their terminal without their consent having been obtained. This constitutes a violation of the Act, which requires that this type of cookie be placed only after users have consented.

The decision also highlighted the absence of a compliant means of obtaining consent for the deposit of cookies. Although the search engine offered a button to accept cookies immediately, it did not provide an equivalent solution (e.g. a refuse button) allowing the user to refuse them as easily as accepting them: Bing offered one button to accept all cookies, while two clicks were needed to refuse them all.

It was noted that complex refusal mechanisms are likely to discourage users from refusing cookies and instead encourage them to prefer the ease of the consent button in the first window, which ultimately violates the freedom of consent of internet users.

In deciding the amount of the fine, the CNIL considered the scope of the processing, the number of data subjects, and the benefits the company derives from the advertising profits indirectly generated from the data collected via cookies.


How does your organisation handle cookies? Contact us, the Experts in Data Privacy, if you want to learn more.


Friday 16 Dec - 1:14 PM

The European Commission releases the EU-US Draft Adequacy Decision

On December 13, the European Commission published its draft adequacy decision for the EU-US Data Privacy Framework.

The draft decision follows the signature of an Executive Order by US President Joe Biden in October. The Executive Order introduces safeguards for EU residents’ personal data, in particular by limiting US intelligence services’ access to data and introducing an independent redress mechanism, the Data Protection Review Court.


With the draft decision, the Commission considers that the new legal framework based on the Executive Order provides an adequate level of data protection comparable with European data protection standards. This would mean that EU residents’ personal data can be safely and legally transferred to the United States.


There are still two steps that need to be taken for the finalization of the data adequacy process. Following the draft decision, the European Data Protection Board, which gathers all EU data protection authorities, will issue an opinion. The decision will then need to have the approval of a committee formed by member states’ national representatives before the formal adoption. The Commission expects to obtain the approval by the summer of 2023.  


If approved, the adequacy decision would be subject to regular review to ensure the full implementation of the relevant elements from the US legal framework and their effective functioning in practice. The reviews will start one year after the adoption of the decision.  


The draft decision might be subject to legal challenges, as happened to its predecessors in the two landmark Schrems rulings. Max Schrems has been quoted as saying “As the draft decision is based on the known Executive Order, I can’t see how this would survive a challenge before the Court of Justice. It seems that the European Commission just issues similar decisions over and over again, which is in flagrant breach of our fundamental rights.”


It is important to note that other tools for international transfers remain available to companies in the meantime. Standard Contractual Clauses, which companies can include in their commercial contracts, are the most used mechanism to transfer data from the EU to the United States. As updated by the European Commission in June 2021, the modernized Standard Contractual Clauses must be accompanied by a Transfer Impact Assessment, which aims to identify and mitigate privacy risks before personal data leaves the European Economic Area. Until there is certainty from the European Commission regarding this draft adequacy decision, companies must ensure they adhere to the requirements for international data transfers.


Does your organisation have questions about international data transfers to the US? Contact us, the Experts in Data Privacy, if you want to learn more.


Friday 9 Dec - 3:57 PM

The CJEU: Google must delete search result content if it concerns manifestly inaccurate data

The Court of Justice of the European Union (“CJEU”) ruled that the operator of a search engine must delete search results if the person requesting deletion proves that the information about them is “manifestly inaccurate”.

The case concerns a right to erasure request made by two managers of an investment company. They requested that Google de-reference the results of a search made on the basis of their names, which they asserted produced inaccurate claims. Google denied the request, referring to the professional context in which the content of the search results was set and arguing that it did not know whether the information contained in the content was accurate. The German Federal Court of Justice asked the CJEU for a preliminary ruling on the interpretation of the GDPR provision governing the right to erasure.


In its press release, the CJEU firstly noted that the right to protection of personal data is not an absolute right and the right to erasure could be excluded where processing is necessary, in particular, for the exercise of the right of information. In this regard, one must strike a balance between the data subject’s rights to protection of personal data and protection of private life and the legitimate interest of internet users who may be interested in accessing the information in question.


The CJEU then highlighted that if at least a part of the information contained in the relevant content, and not one of minor importance, proves to be inaccurate, the right to freedom of expression and information cannot justify keeping that content in the search results. In this regard, the CJEU stated that:

– the person requesting de-referencing due to inaccurate content must prove the manifest inaccuracy of the information, or of a part of it that is not of minor importance. Yet this should not place an excessive burden on the person that could undermine the practical effect of the right to erasure. The person is therefore only expected to provide evidence that can reasonably be required of him or her.

– the search engine operator, upon a request for de-referencing, must take into account all the rights and interests involved and all the circumstances of the case when assessing whether content may continue to be included in the search results.


Having considered the above, the CJEU concluded that where a person who has made a request for de-referencing provides relevant and sufficient evidence capable of substantiating their request and of establishing the manifest inaccuracy of the information contained in the content of the search result, the search engine operator is obliged to accede to that request.


How does your organisation handle data subject rights? Contact us, experts in data privacy, if you want to learn more via


Tuesday 6 Dec - 1:07 PM

Data scraping practices cost Meta a €265 million fine

On 28 November, the Irish Data Protection Commissioner imposed a €265 million fine on Meta-owned Facebook and Instagram over their data scraping practices and ordered remedial actions.

This inquiry by the Irish Data Protection Commissioner arose from the massive leak of Facebook personal data that was dumped online on a hacker forum in April 2021. The data included sensitive information such as full names, locations, birthdates, phone numbers and email addresses.

The data leak concerned 533 million people across 106 countries. In the EU alone, approximately 86 million people were affected. At the time, Facebook said that the leaked data was old, since the mass data scraping occurred because of a vulnerability that the company had patched in August 2019.

A few days after the leak, the Irish Data Protection Authority announced that it was investigating the matter and would examine if Facebook’s data harvesting practices complied with the GDPR principles of privacy by design and by default.

The investigation concluded that between 25 May 2018 and September 2019, the social networks violated European privacy rules. The Commissioner consequently imposed a set of specific remedial actions and issued a fine of €265 million.

A Meta spokesperson has been quoted as saying “unauthorized data scraping is unacceptable and against our rules, and we will continue working with our peers on this industry challenge. We are reviewing this decision carefully.” Meta can appeal the decision in court.

This is the second largest fine against Meta, following a €405 million fine against Instagram for breaching the privacy of children.

Does your company make use of data scraping practices? Contact us, the Experts in Data Privacy at, to ensure that this is done in a manner in which privacy by design and by default are adhered to at all times.




Friday 2 Dec - 3:56 PM

The French Data Protection Authority Fines Utility Company EDF €600,000 for Violations of Data Security and the Rights of Individuals

France’s data protection authority, the CNIL, sanctioned the company EDF with a fine of 600,000 euros, in particular for data security violations and for not having respected its obligations in terms of commercial prospecting and the rights of individuals.

The sanction came after the CNIL received several complaints regarding the difficulties people encountered in having their rights taken into account by EDF, the largest electricity supplier in France. The CNIL’s fine was based on three core violations of the GDPR.

– EDF failed to demonstrate to the CNIL that it had obtained prior valid consent from the individuals as part of its commercial prospecting campaign by electronic means (Article 7 of the GDPR).

– The sanction decision also highlighted breaches of the obligation to inform (Articles 13 and 14 of the GDPR) and to respect the exercise of rights (Articles 12, 15 and 21 of the GDPR). The personal data protection statement that appeared on EDF’s website did not specify the legal basis corresponding to each case of data use, was not clear on the duration of storage, and did not clearly indicate the source of the data. Moreover, the CNIL found that EDF failed to respond to certain complaints within the one-month period provided for by the texts, that individuals were given inaccurate information on the source of the data collected, so that their right of access was not respected, and that EDF did not honour individuals’ right to object to receiving commercial prospecting.

– The violations lastly included failing to ensure the security of personal data (Article 32 of the GDPR). It was found that the passwords for accessing the customer portal used to receive an energy bonus were, for more than 25,000 accounts, stored in an insecure manner until July 2022: the passwords were only hashed, without being salted (the addition of random characters before hashing, which prevents a password from being recovered by comparing hashes), which put them at risk.

In deciding the amount of the fine, the CNIL considered the breaches identified, as well as the cooperation of the company and all the measures it took during the procedure to reach compliance on all the alleged breaches with which it was charged.

How does your organisation secure the processing of personal data or handle data subject rights? Contact us, experts in data privacy, if you want to learn more via:


Wednesday 23 Nov - 8:09 AM

Largest Attorney General Consumer Privacy Settlement in US History with Google

An historic $391.5 million settlement with Google in the US has been finalized over its location tracking practices. 


“For years Google has prioritized profit over their users’ privacy. They have been crafty and deceptive. Consumers thought they had turned off their location tracking features on Google, but the company continued to secretly record their movements and use that information for advertisers”, said Attorney General Rosenblum. 

Google has misled users into thinking that they had turned off location tracking, while Google continued to collect their location information. For Google, location data is a key part of its digital advertising business. It uses personal and behavioral data to build detailed user profiles and target ads.  

As part of the multimillion-dollar settlement, Google has agreed to significantly improve its location tracking disclosures and user controls starting in 2023. The settlement furthermore requires Google to be more transparent about its practices by:

  1. Showing additional information to users whenever they turn a location-related account setting “on” or “off”;
  2. Making key information about location tracking unavoidable for users (i.e., not hidden); and
  3. Giving users detailed information about the types of location data Google collects and how it is used at an enhanced “Location Technologies” webpage.

In a blog post, Google outlined it will “provide a new control that allows users to easily turn off their Location History and Web & App Activity settings and delete their past data in one simple flow.” The company also plans to add additional disclosures to its Activity controls and Data & Privacy pages. 

This settlement is an important milestone, since location data is among the most sensitive and valuable data that Google collects: it can expose a person’s identity and routines and be used to infer personal details. The settlement is another example of how individuals would benefit from a comprehensive US federal privacy law that prevents the use of large amounts of personal data for marketing purposes with only a few controls in place.

How does your organization safeguard the processing of location data or the processing of user profiles? Contact us, experts in data privacy, if you want to learn more via:  


Friday 18 Nov - 1:57 PM

EDPB update on Controller Binding Corporate Rules

On November 14, 2022, the European Data Protection Board (EDPB) adopted Recommendations on the application for approval and on the elements and principles to be found in Controller Binding Corporate Rules (BCR-C).

What are the BCRs?

BCRs are a data transfer tool that can be used by a group of undertakings, or a group of enterprises, engaged in a joint economic activity, for its international transfers of personal data from the European Union (EU) to controllers or processors within the same group of undertakings or enterprises outside of the EU. In this sense, BCRs are suitable for multi-national organisations making frequent transfers of personal data between group entities.

According to Article 46(2)(b) of the GDPR, BCRs are permitted safeguards for transfers of personal data to third countries. BCRs create a framework of policies and procedures implemented throughout the entities of the organizations, which include enforceable rights and commitments to establish an essentially equivalent level of protection to that guaranteed by the GDPR. BCR applications should meet the requirements set out in Article 47 GDPR for the relevant supervisory authority to approve the BCRs.

What do the recommendations bring?

The recommendations provide additional guidance on BCR-C applications, update the existing application form for the approval of BCR-Cs, and clarify the content required for BCR-C as stated in Article 47 GDPR. The recommendations also distinguish between information that must be included in a BCR-C form and that must be presented to the relevant data protection authority.

The EDPB considers that the recommendations aim to level the playing field for all BCR applicants and align the current guidance with the requirements of the CJEU’s judgment in the Schrems II case.

The EDPB notes that recommendations for processor BCR are currently being worked on.

Stakeholders can contribute their comments on the recommendations until January 10, 2023.

Although this is a great development, it would also be important to improve the BCR approval process with data protection authorities (DPAs), since there is a great discrepancy between the various DPAs in the EU.

Does your organization have questions about binding corporate rules or international data transfers? Contact us, the Experts in Data Privacy at for more information.




Tuesday 15 Nov - 2:25 PM

NIS2 adopted to strengthen EU-wide resilience for cyber security

MEP Bart Groothuis has been quoted as saying “this is the best cyber security legislation this continent has yet seen, because it will transform Europe to handling cyber incidents pro-actively and service orientated.”

Differing national cybersecurity measures make the European Union (EU) more vulnerable. The new legislation sets tougher requirements for businesses, administrations and infrastructure. Last week, the rules requiring EU countries to meet stricter supervisory and enforcement measures and to harmonize their sanctions were approved by the Members of the European Parliament (MEPs).

The Network and Information Security (NIS) Directive was the first EU-wide legislation on cybersecurity and its aim was to achieve a high common level of cybersecurity across the Member States. While increasing Member States’ cybersecurity capabilities, its implementation proved difficult with the result of fragmentation at various levels across the internal market.

To address the growing threats posed by digitalization and the surge in cyberattacks, the European Commission submitted a proposal to replace the NIS Directive. This has led to the introduction and adoption of the NIS2 Directive.

This legislation sets stricter cyber security obligations for risk management, reporting obligations and information sharing. The requirements also cover incident response, supply chain security, vulnerability disclosure and encryption, amongst others.

A result of this new Directive is that more entities and sectors will have to take measures to protect themselves. “Essential sectors” have been defined, including energy, transport, banking, health, digital infrastructure, public administration and space. Furthermore, numerous “important sectors”, such as manufacturing of medical devices and digital providers, will be covered by the new rules. All medium-sized and large companies in the selected sectors would fall under the legislation.

Furthermore, a framework for better cooperation and information sharing between different authorities and Member States has been created, as well as a European vulnerability database.

The only step remaining is for the European Council to formally adopt the law before it is published in the EU’s Official Journal.

Does your organization have questions about cyber security or how the NIS2 Directive may impact you? Contact us, the Experts in Data Privacy at for more information.






Friday 11 Nov - 10:54 AM

The right to privacy vs. journalistic interests: A balancing exercise

An individual who was referred to in a news article as a master swindler filed suit, arguing that Article 8 of the European Convention on Human Rights (ECHR), the right to respect for private and family life, had been violated. He claimed the journalist was liable for €40,000 in damages and also requested a ban on any future publications about him.


The journalist had, via his website, published online articles about the individual, which were thereafter shared by national newspapers. The articles referred to the individual as a master swindler because they accused him of selling phone cards to detainees and of scamming them. The articles revealed the individual’s full name, date of birth and a photograph.

The Court of First Instance dismissed the individual’s claims and considered that the journalist processed his data for journalistic purposes. Additionally, the court mentioned that the press has a watchdog function, in this case by warning the public about fraud, and that therefore, the journalist’s freedom of expression outweighed the protection of the individual’s privacy.  

The individual then appealed the decision of the Court of First Instance, claiming that the journalist’s purpose was to make him look bad and that there was therefore no legal ground for processing his data under Article 6 GDPR. Consequently, according to the individual, Article 43 UAVG (the Dutch Implementation Act), which sets forth that the UAVG does not apply to the processing of personal data for exclusively journalistic purposes, was not applicable either.

The Court of Appeal then explained that the term ‘journalistic’ within the meaning of Article 43 UAVG should be interpreted broadly. The journalist’s website was not deemed to exist to make people look bad, but rather to inform the public about fraudulent activities. The processing of the individual’s data was therefore not deemed to fall outside the scope of Article 43 UAVG. The Court did note that processing criminal data is prohibited under Article 10 GDPR, unless the exception of Article 43(3) UAVG applies, that is, insofar as the processing is necessary for journalistic purposes. The Court deemed this processing necessary because the publications concerned reporting on, and warning about, fraudulent practices. Regarding the applicable legal basis, the court held that journalistic activities constitute a legitimate interest for processing data under Article 6(1)(f) GDPR. It therefore dismissed the individual’s appeal.

Although many people think the right to privacy is an absolute right, it is important to note that this is not the case. It requires a balancing exercise to decide what right prevails in particular circumstances. It is a positive development that more case law is appearing regarding journalistic purposes, since this is still a grey area under relevant laws like the GDPR. 

 Contact us, experts in data privacy, if you want to learn more about data privacy via: 




Wednesday 2 Nov - 2:07 PM

Endangering encryption? The European Commission’s gross violation of privacy

Strong end-to-end encryption is an essential part of a secure and trustworthy internet. This protects citizens every time an online transaction is made, when medical information is shared or when citizens interact with family and friends.   

Strong encryption also helps protect children as it allows them to communicate with family and friends in confidence and allows others to report online abuse and harassment in a confidential manner. Encryption ensures personal data, and citizens’ private conversations, are kept private.
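As a rough illustration of what encryption with integrity protection buys, here is a toy sketch using only Python’s standard library. It is emphatically not the end-to-end protocol used by Signal or WhatsApp, and real systems should use vetted primitives such as AES-GCM or ChaCha20-Poly1305; the point is only that the message can be read solely with the right key, and any tampering with the ciphertext is detected before decryption.

```python
import hashlib
import hmac
import secrets


def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Derive a pseudo-random keystream from the key and a fresh nonce.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]


def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # XOR the plaintext with the keystream, then append an HMAC tag
    # over nonce + ciphertext so tampering is detectable (encrypt-then-MAC).
    nonce = secrets.token_bytes(16)
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, _keystream(key, nonce, len(plaintext))))
    tag = hmac.new(key, nonce + ciphertext, hashlib.sha256).digest()
    return nonce + ciphertext + tag


def decrypt(key: bytes, blob: bytes) -> bytes:
    # Verify the HMAC tag in constant time before decrypting anything.
    nonce, ciphertext, tag = blob[:16], blob[16:-32], blob[-32:]
    expected = hmac.new(key, nonce + ciphertext, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("authentication failed: wrong key or tampered message")
    return bytes(c ^ k for c, k in zip(ciphertext, _keystream(key, nonce, len(ciphertext))))


key = secrets.token_bytes(32)
message = b"my medical results"
blob = encrypt(key, message)
assert decrypt(key, blob) == message  # only the key holder recovers the message
```

A “backdoor” of the kind the proposal implies would amount to giving a third party the key, or scanning the plaintext before it is encrypted, and either weakens the guarantee above for every user, not just the targeted ones.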

The EU’s new regulation intending to fight child sexual abuse online will require internet platforms – including end-to-end encrypted messaging applications like Signal and WhatsApp – to “detect, report and remove” images of child sexual abuse shared on their platforms. In order to do this, however, platforms would have to automatically scan every single message. This process is known as “client-side scanning.”

Not only is this a gross violation of privacy, but there is also no evidence that the technology exists to do this effectively and safely without undermining the security provided by end-to-end encryption. While the proposed regulation is well-intentioned, it would weaken encryption and make the internet less secure.

This proposal has already been criticized by privacy watchdogs – the European Data Protection Board (EDPB) and the European Data Protection Supervisor (EDPS) – which issued a joint statement calling for the regulation to be amended. They described the proposals as “highly intrusive and disproportionate”, arguing that by requiring platforms to weaken encryption, the regulation would violate Articles 7 and 8 of the Charter of Fundamental Rights of the European Union – namely the right to respect for private and family life and the right to protection of personal data.

The EU has fallen for the myth that it is possible to keep EU citizens safer by weakening the very thing that protects them. If backdoors are created for law enforcement, weaknesses in the system are created for everyone. These weaknesses could be exploited by criminal gangs or other malicious actors.

Furthermore, it is impossible for platforms to weaken encryption only for users located within the EU – any reduction in security would affect users of those platforms across the globe. In the United Kingdom (UK), similar legislation has been proposed for WhatsApp. WhatsApp has indicated that it is willing to withdraw from the UK market if they are required to weaken encryption. The same could occur across Europe.

The EU relies on encryption to protect the security of its member countries and the bloc as a whole but by proposing to weaken encryption, it may result in making all citizens more vulnerable.

If implemented correctly, encryption can be a very powerful tool in protecting personal data that your organization processes. Do you have any questions about encryption and how to keep personal data safe and secure? Contact us, the Experts in Data Privacy, at





Friday 28 Oct - 11:41 AM

Luxembourg delivers the first GDPR accreditation

Luxembourg’s national commission for data protection (CNPD) has become the first data protection authority in Europe to accredit a GDPR certification body.

On 12 October, the national commission for data protection in Luxembourg accredited an entity via its certification mechanism, GDPR-CARPA (General Data Protection Regulation-Certified Assurance Report-Based Processing Activities). This is the first mechanism to be adopted on a national and international level under the GDPR.

The European Data Protection Board’s Opinion states that certification mechanisms should enable controllers and processors to demonstrate compliance with the GDPR. The criteria should, therefore, properly reflect the requirements and principles concerning the protection of personal data as laid down in the GDPR and contribute to its consistent application.

The GDPR is a law that regulates how organizations target or collect data related to people in the European Union. Furthermore, it outlines how organizations must protect and handle data in a secure manner and details privacy rights which give individuals more control over their personal data.

With a GDPR certification, companies, public authorities, associations and other organizations can show that their data processing activities are complying with the GDPR. The implementation of the certification mechanism can promote transparency and compliance as it allows businesses and individuals to evaluate the level of protection offered by products, services, processes or systems used or offered by organizations that process personal data. Entities can, therefore, benefit from an independent certificate to demonstrate that their data processing activities comply with EU regulations. It will be interesting to see how this develops with other data protection authorities in the EU.

Does your organization have questions about accreditation? Contact us, the Experts in Data Privacy at for more information.




Opinion 28/2022 on the European criteria of certification regarding their approval by the Board as European Data Protection Seal pursuant to Article 42.5 (GDPR)




Friday 21 Oct - 9:29 AM

Companies not established in the EU: take note

The European Data Protection Board (EDPB) has published an updated version of the Guidelines on Personal Data Breach Notifications under the General Data Protection Regulation (GDPR).

The previous Guidelines provided that “notification should be made to the supervisory authority in the Member State where the controller’s representative in the EU is established.” The revised Guidelines, however, state that “[…] the mere presence of a representative in a Member State does not trigger the one-stop-shop system. For this reason, the breach will need to be notified to every single authority for which affected subjects reside in their Member State. This notification shall be done in compliance with the mandate given by the controller to its representative and under the responsibility of the controller.”


Should the EDPB finalize the procedure suggested above, it could have immense implications for companies that are not established in the European Union (EU) but process the personal data of EU citizens. It is, therefore, important for companies to keep up to date with developments in privacy and data protection.

The revised Guidelines are open for public consultation until 29 November 2022.

Does your company have questions about how this may impact your operations within the EU, or does your company require the services of a representative? Contact us, the Experts in Data Privacy, at .


Guidelines 9/2022 on personal data breach notification under GDPR 



Friday 14 Oct - 8:43 AM

Dutch employee fired by U.S. firm for shutting off webcam awarded €75,000 in court

A Dutch employee of a U.S. firm with a branch in the Netherlands refused to be monitored during a full workday. As a consequence, the employee was fired for ‘refusal to work’ and ‘insubordination’. In the ensuing lawsuit, a Dutch court awarded the employee €75,000 for wrongful termination.

The employee began working for the U.S. firm in 2019. A year and a half later, he was ordered to take part in a virtual training period called the ‘Corrective Action Program’. During this program he was told to remain logged in for the entire day, with screen-sharing turned on and his webcam activated.

The employee did not feel comfortable with being monitored for a full workday and claimed it was an invasion of privacy. He agreed to have the activities on his laptop monitored and to share his screen, but he refused to activate the webcam. Thereafter, he was fired.

The court concluded that the termination was not legally valid, because the reasons for dismissal were not made sufficiently clear, nor were the instructions reasonable. According to the court, the instructions certainly did not respect the employee’s right to respect for his private life (Article 8 of the European Convention on Human Rights).

Many organizations do not realize that the ‘right to privacy’ is a fundamental right in the European Economic Area (EEA). When conducting business operations in the EEA, it is essential to respect this at all times, which may conflict with, for example, the more flexible workplace privacy practices in the U.S.

It is therefore important to make sure your organization is aware and informed when doing business in the EEA. Contact us for all of your privacy related questions via:  


Wednesday 12 Oct - 8:17 AM

President Biden signs executive order on EU-US data privacy management

On 7 October, President Biden signed an executive order that would limit the ability of American national security agencies to access people’s personal information as part of the transatlantic data sharing agreement with the European Union (EU).

This executive order follows lengthy negotiations between the United States (US) and the EU after the Court of Justice of the European Union (CJEU) ruled in 2020 that the US did not sufficiently protect Europeans’ data when it was transferred across the Atlantic. The judges’ concerns focused on the absence of proper means for European citizens to seek redress regarding how US surveillance programmes collected their data.

What is new?

The Data Privacy Framework (DPF) includes three components:

  • Commercial data protection principles to which US organizations may self-certify: the Privacy Shield will be updated to refer to the GDPR directly and US organizations must ensure they remain abreast of these developments.
  • A presidential executive order: this requires US intelligence authorities to limit US signals intelligence activities to what is necessary and proportionate. The executive order imposes necessity and proportionality limits first by mandating them explicitly, then by explaining what that mandate means, and finally by prescribing oversight mechanisms to verify that intelligence agencies follow the new rules. What this will mean in practice is also included.
  • DOJ regulations: a two-step redress system will be introduced, which includes a new Data Protection Review Court. The first tier is an evaluation of a submitted claim by the Civil Liberties Protection Officer (CLPO) and the second tier is the Data Protection Review Court. This court is meant to process complaints concerning the legality of US signals intelligence activities transmitted from “qualifying states” for covered violations of US law.

Max Schrems’ response

The day the executive order was issued, Max Schrems and his organization published their first reaction, indicating that the executive order is unlikely to satisfy EU law. According to Schrems, the executive order does not offer any solution: there will be continuous “bulk surveillance” and a “court” that is not an actual court.

Bulk surveillance will continue by means of two diverging notions of ‘proportionality’. The words “necessary” and “proportionate” have been included, but the EU and US did not agree that they will carry the same legal meaning. In the end, the CJEU’s definition will prevail, likely invalidating any EU adequacy decision again.

The “court” will not be a court but rather a body within the US government’s executive branch. The new system is an upgrade of the previous “Ombudsperson” system, which was rejected by the CJEU, and it appears that it will not amount to “judicial redress” as required under the EU Charter. Schrems’ organization has indicated that it is working on an in-depth analysis, to be published soon. If the Commission’s decision is not in line with EU law and the relevant CJEU judgments, it will likely bring another challenge before the CJEU.

What is next?

The European Commission will now launch its adequacy assessment. This process requires the Commission to put forward a draft adequacy determination, the EDPB to issue a nonbinding opinion, EU Member States to vote to approve the decision and the European Commission College of Commissioners to formally adopt it. The European Parliament may also weigh in with a non-binding resolution at any stage.

Previously, this process has taken four to five months once the Commission finalized its draft. Should this trend be followed, further clarification is expected around March 2023. Until then, however, organizations should not be under the impression that this development relieves them of their obligation to conclude Standard Contractual Clauses (SCCs), including the performance of a Transfer Impact Assessment (TIA), by 27 December 2022.

Does your organization have any questions about transferring personal data internationally to the US? Contact us, the Experts in Data Privacy at for assistance.



Biden signs executive order on EU-U.S. data privacy management 

The EU-US Data Privacy Framework: A new era for data transfers?

New US Executive Order unlikely to satisfy EU law 



Friday 7 Oct - 11:46 AM

Britain to scrap GDPR rules

Earlier this week Britain vowed to tear up the European Union’s General Data Protection Regulation (GDPR) rules and implement an independent approach to data privacy.

Brexit and the GDPR

This announcement was made by the British Secretary of State for Digital, Culture, Media and Sport, Michelle Donelan, who pledged to remove red tape during an address to the Conservative Party’s annual conference, although the precise constitution of the new regime remains vague.

Donelan said: “We will be replacing GDPR with our own business-and consumer-friendly British data protection system. I can promise…that it will be simpler, it will be clearer for businesses to navigate. No longer will our businesses be shackled by lots of unnecessary red tape.”

Vowing to build new privacy protections founded on ‘common sense’, Donelan claimed that the British economy could be boosted by dispensing with needless bureaucracy while retaining protections for internet users.

It is not clear when this new regime will come into force, what exactly it will entail, or what implications it will have for the adequacy decision adopted by the European Commission in June 2021. This is an interesting development to keep in mind when operating in the United Kingdom.

Do you have questions about the UK GDPR? Contact us, the Experts in Data Privacy, at




Tuesday 4 Oct - 3:40 PM

TikTok possibly facing £27 million fine

In September, Instagram was fined over the protection of children’s data. TikTok may now be facing a similar fate.

Investigations by the Information Commissioner’s Office (ICO) in the United Kingdom (UK) have found that TikTok, the video-sharing app, may have breached UK data protection law between 2018 and 2020.

The ICO issued TikTok with a “notice of intent”, a precursor to handing down a potential fine of up to £27 million. If TikTok were fined this amount, it would be the largest fine in the ICO’s history, exceeding the record £20 million handed to British Airways two years ago after a 2018 incident in which hackers compromised the personal details of more than 400,000 customers.

The ICO’s provisional view is that TikTok may have processed the data of children under the age of 13 without parental consent and failed to provide proper information to its users in a “concise, transparent and easily understood way.” Furthermore, TikTok may have processed special categories of personal data without legal grounds to do so.

The information commissioner said that “companies providing digital services have a legal duty to put those protections in place but our provisional view is that TikTok fell short of meeting that requirement.” TikTok said it disagreed with the ICO’s provisional finding and would make a formal response challenging the findings of the investigations. Upon receipt of these representations, the ICO will reach a conclusion.

Do you have questions about complying with the (UK) GDPR? Contact us, the Experts in Data Privacy, at for assistance.



TikTok could face £27m fine for failing to protect children’s privacy



Wednesday 28 Sep - 12:08 PM

Our key takeaways from PrivSec Amsterdam

One of the things we learnt yesterday is that the potential new data transfer agreement between the US and the EU is meant to buy time rather than solve the underlying problem with international data transfers from the EU to the US, according to Max Schrems.

Our colleagues Jelmer and Dounia attended the PrivSec Amsterdam event, which included a range of speakers from world-renowned companies and industries, allowing privacy professionals from different fields to share case studies and their experiences. It included keynote speeches, presentations, panel discussions, and plenty of time to meet other privacy professionals.

Topics of yesterday’s program included Google Analytics, the cooperation between data protection and security teams, data retention, consumer trust and transparency, and DPIAs amongst others. Furthermore, Max Schrems provided a presentation on The Future of Online Privacy, GDPR Enforcement and The Battle Against Surveillance.

In his keynote speech, Schrems explained why he believes the new international data transfer agreement would not change anything and is merely meant to buy time: even though the US is planning to introduce a new Executive Order that includes a proportionality assessment, FISA and the PRISM program would still apply. The best solution, in his eyes, would be to raise the level of data protection in the US, among other things by introducing a federal privacy law.

Furthermore, he warned organizations of the risk of lawsuits by individuals if they continue data transfers in violation of the GDPR, which could ultimately prove more costly than the fines imposed by data protection authorities.

For us, the Schrems keynote was especially interesting because it confirmed that we should continue helping our clients with all the work needed on Standard Contractual Clauses (SCCs) and Transfer Impact Assessments (TIAs). It is likely that a new standard will again be invalidated, and your organization should be able to fall back on at least a GDPR-compliant TIA where SCCs are used as an international data transfer mechanism.


Friday 23 Sep - 11:42 AM

Will the next data transfer agreement smoothen cross-border data flows?

G-7 leaders gathered in June to discuss the legal deals underpinning the bilateral data flows that exist among most G-7 members. The G-7 consists of the United States, United Kingdom, Germany, France, Italy, Japan, and Canada.

The goal of the meeting was to align the regulators’ approaches to privacy and to better understand the domestic rules in each jurisdiction. Methods were discussed for moving data and for giving businesses options to choose cross-border transfer tools suited to their needs. Examples include techniques to properly anonymize data, to strip information of details that identify an individual, and the trend toward closer cooperation between antitrust and privacy regulators.

An important conclusion of the meeting was that countries need legislation ensuring that an individual’s personal data is only accessed when strictly necessary for national security purposes. However, that is easier said than done, especially after the Schrems II ruling and the criticism by European lawmakers of the United States’ intelligence practices. The lack of a federal privacy law guaranteeing privacy rights also plays a big role in this discussion.

The regulators of the G-7 countries wish to smoothen international business for many companies. To enable this, it is imperative to create a better understanding of how domestic rules currently affect certain kinds of information, and how that information can be used and sold. Mr. Wiewiórowski, the European Data Protection Supervisor, does not believe the G-7 countries are ready to create such a market for cross-border data flows.

All in all, the next solution for smoothening cross-border data flows should address the complex points above, so that the next data transfer agreement is not invalidated like Safe Harbor and the Privacy Shield before it. There is a lot of work to be done. Until then, organizations must fall back on the GDPR’s international data transfer mechanisms, including Standard Contractual Clauses (SCCs), and perform Transfer Impact Assessments (TIAs).



Wednesday 21 Sep - 6:03 AM

Focus on designation of Data Protection Officers

The European Data Protection Board (EDPB) has announced that its next coordinated enforcement action will focus on the data protection officer (DPO) designations.

The 22 Supervisory Authorities across the European Economic Area, along with the European Data Protection Supervisor, will launch investigations into the yet-to-be-determined aspects of DPO requirements under the General Data Protection Regulation.

In a coordinated action, the EDPB prioritises a certain topic for data protection authorities to work on at a national level. The EDPB has further said that the individual actions will be “bundled and analysed, generating deeper insight into the topic and allowing for targeted follow-up on both national and EU level.”

What this will entail for the designation of DPOs is still unclear, but once further information is made available, we will provide an update. If you have any questions about the designation of a DPO and the advantages of DPO-as-a-Service, contact us:


EDPB to focus coordinated enforcement on DPO appointments

EDPB adopts statement on European Police Cooperation Code & picks topic for next coordinated action



Tuesday 13 Sep - 6:22 PM

Interpretation of special categories of personal data extended by CJEU

In August this year, the Court of Justice of the European Union (CJEU) issued a preliminary ruling in OT v Vyriausioji tarnybinės etikos komisija (Chief Official Ethics Commission, Lithuania) (Case C-184/20), which was referred by the Regional Administrative Court of Lithuania. In this ruling, the CJEU elected to interpret the GDPR very broadly.

The CJEU clarified that the indirect disclosure of sexual orientation data is protected under Article 9 of the General Data Protection Regulation (GDPR). Therefore, such disclosures fall under the special categories of personal data.


This case arose from a question concerning the application of Lithuanian law, which required people in receipt of public funds to file declarations of interest. The declarations, which included information about the interests of the individual’s “spouse, cohabitee or partner”, were published online. The applicant failed to file a declaration and was sanctioned as a result. The CJEU found that the underlying law did not strike a proper balance between the public interest in preventing corruption and the rights of affected individuals.

The CJEU went on to note that because it is possible to deduce information about an individual’s sex life or sexual orientation from the name of their partner, publishing that information online involves processing special categories of personal data as defined in Article 9 GDPR.

It was found by the CJEU that the processing of any personal data that are “liable indirectly to reveal sensitive information concerning a natural person”, for instance, any information that may reveal a person’s racial or ethnic origin, religious or philosophical beliefs, political views, trade union membership, health status or sexual orientation, is subject to the prohibition from processing under Article 9(1) GDPR unless an exception under Article 9(2) applies.

Possible Implications

The implications of this ruling could be significant. It is possible that common processing operations, such as publishing a photograph on a corporate social media page, could reveal information that is protected under Article 9. Controllers may need to review their processing operations through a contextual lens to assess whether the data being processed, and the manner of processing, are liable to reveal any sensitive information.

It has even been suggested that this ruling could have implications in all contexts where Article 9 is applicable, including online advertising, dating apps, location data indicating places of worship or clinics visited, food choices for airplane flights amongst others.

The way forward?

The judgment does not make clear how far controllers need to go in making this assessment. One option may be to argue that if the controller does not make personal data public, and it implements policies that prohibit employees from making inferences, then the information is not liable to reveal special category data. An alternative would be for regulatory guidance to be issued indicating how controllers can comply with the ruling and the existing guidelines.




Sensitive data ruling by Europe’s top court could force broad privacy reboot 





Friday 9 Sep - 2:37 PM

€405 million: second highest fine ever issued under the GDPR

Ireland’s Data Protection Authority has fined Instagram €405 million over the lack of protection of children’s data. This fine was issued after an investigation found that Instagram, the social media platform, had mishandled teenagers’ personal information in violation of strict European Union data privacy laws.

The investigation, which commenced in 2020, focused on how Instagram displayed the personal details of users in the age range of 13 – 17 years, including email addresses and phone numbers. The investigation began after a data scientist found that users, including those under the age of 18 years, were switching to business accounts and had their contact information displayed on their profiles. It is alleged that users were doing this to see statistics on how many likes their posts were getting after Instagram started removing the feature from personal accounts in some countries to help with mental health.

Instagram said it updated its settings over a year ago and has since released new features to keep teenagers safe and their information private. Instagram disagrees with the calculation of the fine and plans to appeal this decision.

This investigation forms part of over a dozen investigations into Meta companies, including Facebook and WhatsApp, opened by Ireland’s Data Protection Authority. Will the largest (or third largest) fine be issued against a Meta company in the near future? All we can do is watch this space…


Ireland fines Instagram €405 million over protection of children’s data



Monday 5 Sep - 9:23 AM

Sephora – cosmetics and privacy?

The California Attorney General has dropped a bombshell: the first enforcement settlement under the California Consumer Privacy Act (CCPA). Sephora, a French cosmetics brand, must pay $1.2 million in fines and abide by a set of compliance obligations.

It is alleged that Sephora failed to disclose to consumers it was selling their personal information; failed to honour user requests to opt out of sale via user-enabled global privacy controls and did not cure these violations within the 30-day period allowed by the law.

At issue in this case was Sephora’s sharing of information with third-party advertising networks and analytics providers. This case marks a considerable uptick in risk for companies doing business in California and preparing for the California Privacy Rights Act activation in January 2023. It is clear that the Attorney General is focusing on online tracking and the implementation of and compliance with global opt-out signals, such as the Global Privacy Control.

Sephora has said that it uses cookies “strictly for Sephora experiences” and that the CCPA does not use the word “sale” in its traditional sense, but rather to describe the common practice of using cookies. Under the settlement, Sephora must let customers know that it sells their data and give them a way to opt out.

Take-aways from this:

  • A French cosmetics brand was fined, not a large technological company.
  • A Global Privacy Control is a browser extension that automatically signals a consumer’s privacy preferences to all websites they visit without having to click on opt-out links one by one.
  • The Attorney General has been quoted as saying, “My office is watching, and we will hold you accountable…There are no more excuses.”
  • The CCPA’s notice-and-cure provision expires at the end of this year, which means that businesses must comply from the outset.
  • The American Data Privacy Bill, currently before Congress, may further alter the rules, so such enforcement actions might not be possible in future.
  • These are interesting developments within the privacy sphere of the USA.
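As a minimal sketch of the technical side of the second take-away: a browser with Global Privacy Control enabled attaches the `Sec-GPC: 1` header to its HTTP requests, so a website can detect the opt-out signal with a simple header check. The function name below is our own illustration, not from any particular framework:

```python
def honors_gpc_opt_out(headers: dict) -> bool:
    """Return True if the request carries a Global Privacy Control signal.

    Per the GPC proposal, a user agent with the control enabled sends the
    header `Sec-GPC: 1` with its requests.
    """
    # HTTP header names are case-insensitive, so normalize keys first.
    normalized = {k.lower(): v.strip() for k, v in headers.items()}
    return normalized.get("sec-gpc") == "1"


# Example: a request from a browser with GPC enabled vs. one without
print(honors_gpc_opt_out({"Sec-GPC": "1"}))   # True
print(honors_gpc_opt_out({}))                 # False
```

A site that treats this signal as a valid opt-out of sale would then suppress its third-party advertising and analytics tags for such requests, rather than relying on users clicking opt-out links one by one.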


The Sephora case: Do not sell – But are you selling?  

California fines Sephora $1.2 million for selling consumer data



Thursday 14 Jul - 2:17 PM

Greek DPA fined Clearview AI for €20 million

After decisions by the French and Italian DPAs, the Greek DPA has now also fined Clearview AI €20 million. Clearview AI sells facial recognition software to US law enforcement agencies, but it is no longer permitted to share biometric data of individuals in Greece, following the example set in France and Italy.


Greek DPA fined Clearview AI for €20 million

Clearview claims to have:

the largest database of more than 10 billion facial images

and it aims to reach 100 billion facial images by next year in order to identify every person worldwide. This includes the use of Clearview AI’s software to monitor the behavior of people in Greece, even though the company is based in the US and does not offer any services in Greece or the European Economic Area. One might think that the GDPR does not apply in this case, but the ruling was very clear: the GDPR applies because the processing falls within its territorial scope.

The images are collected by Clearview from social media accounts and other online sources. However, the DPA’s ruling explains that collecting images for a biometric search engine is illegal, meaning that both the images and the derived biometric information must be deleted. Biometric data is a special category of personal data under the GDPR, which means that processing it is in principle prohibited unless an exception applies under the GDPR or specific Member State law.

The Greek DPA further ordered Clearview to appoint a data protection representative to enable EU individuals to exercise their rights more easily, and to provide a contact point in the EU for regulators. This is a requirement under the GDPR when a non-EU company is processing EU personal data without any establishment in the European Economic Area. This is further explained in our white paper.

Since complaints have been filed by an alliance of organizations with multiple data protection authorities, it is likely that, after France, Italy, and Greece, similar decisions will follow in Austria and the UK. Each data protection authority may impose a fine of up to €20 million (or, under the GDPR, up to 4% of annual worldwide turnover, whichever is higher).

Clearview clearly interpreted the territorial scope of the GDPR incorrectly and has to face millions in fines. It is therefore recommended that non-EU companies ensure that they comply with the GDPR and carefully assess whether the territorial and material scope of the GDPR applies to the company’s business operations.

Is your company complying with the GDPR? Or do you have any questions about GDPR compliance? Contact us, experts in data privacy, to learn more about the GDPR via:


Friday 13 May - 1:38 PM

EU plans to cut unneeded medical tests with data health plan

Easy access to your patient health records, under strict rules to protect your privacy?

If it were up to the European Commission (EC), this will be arranged by 2025 for patients, medics, regulators and researchers. The EC believes this will improve diagnosis, boost medicine research, and cut unnecessary costs from duplicated medical tests. The EC has submitted a binding proposal to EU governments and lawmakers, which includes the EU executive’s plans for an improved health data space, expected to lead to savings and economic gains of over 10 billion euros over 10 years.

EU plans to cut unneeded medical tests with data health plan


Friday 29 Apr - 11:31 AM

Webinar: what does effective vendor risk management look like?

How do you ensure that you include all aspects of the GDPR in your vendor risk management? In this webinar, you’ll learn about the specific privacy management activities you need to perform with a vendor at each stage to comply with the GDPR.


Friday 29 Apr - 11:16 AM

Class-Action Lawsuit Targets Company that Harvests Location Data from 50 Million Cars

Data being sold that reveals your whereabouts, for example where you live, work or go to church? This happened to more than 50 million car owners around the world.

A California-based data broker is involved in a class-action lawsuit because it has been accused of secretly collecting and selling real-time GPS location information from more than 50 million cars around the world, including California-based consumers. The claim in the lawsuit is that the company never requests consent from drivers before tracking their location.

Class-Action Lawsuit Targets Company that Harvests Location Data from 50 Million Cars


Friday 29 Apr - 9:40 AM

This is what your organization needs to know about the Digital Services Act (DSA)

The EU legislators reached an agreement on the DSA, which will govern the digital sphere and increase the fight against illegal content and disinformation. It builds on the eCommerce Directive and provides clear rules for content moderation, platform accountability, illegal products and systemic risks.

EU institutions reach agreement on Digital Services Act


Monday 25 Apr - 11:52 AM

Announcement on New Trans-Atlantic Data Privacy Framework

16 July 2020 is a very significant date in the world of privacy and data protection. The reason for this is that on this date, the Court of Justice of the European Union (CJEU) handed down the judgment[1] invalidating the Privacy Shield framework. This ruling has become known as the Schrems II ruling.