Privacy by Design to become an ISO standard on 8 February
In January, it was reported that on February 8, 2023, the International Organization for Standardization (ISO) will adopt Privacy by Design (PbD) as ISO 31700.
PbD was introduced 14 years ago by Ann Cavoukian, then the Information and Privacy Commissioner of Ontario, Canada, and will now become an international privacy standard for the protection of consumer products and services.
PbD is a set of principles that calls for privacy to be taken into account throughout an organization’s data management process. Since its introduction, it has been adopted by the International Assembly of Privacy Commissioners and Data Protection Authorities and was incorporated into the General Data Protection Regulation (GDPR).
Cavoukian has been quoted as saying “the ISO standard is designed to be utilized by a whole range of companies – startups, multinational enterprises, and organizations of all sizes. With any product, you can make this standard work because it’s easy to adopt. We’re hoping privacy will be pro-actively embedded in the design of [an organization’s] operations and it will complement data protection laws.”
The introduction to the proposed standard notes that PbD refers to several methodologies for product, process, system, software and service development. The proposed bibliography accompanying the document refers to other standards with more detailed requirements on identifying personal information, access controls, consumer consent, and corporate governance, amongst other topics. A separate document will outline possible use cases as well.
Adopting privacy can be a competitive advantage for businesses: by implementing PbD principles, both privacy and business interests are addressed, resulting in a ‘win-win’ situation.
Does your organization have any questions about PbD? Read this white paper to find out more.
GoTo confirms hackers stole customers’ encrypted backups
LastPass’ parent company GoTo – formerly known as LogMeIn – has confirmed that hackers stole customers’ encrypted backups during a recent breach of its systems.
The breach was first confirmed by LastPass on November 30. At the time, LastPass said that an ‘unauthorized party’ had gained access to some customers’ information stored in a third-party cloud service shared by LastPass and GoTo. The hackers used information stolen from an earlier breach of LastPass systems in August to further compromise the companies’ shared cloud data.
Now, almost two months later, GoTo has said in an updated statement that the cyberattack impacted several of its products, including the business communications tool Central, the online meetings service join.me, the hosted VPN service Hamachi, and its RemotelyAnywhere remote access tool.
The hackers exfiltrated customers’ encrypted backups from these services, along with an encryption key used to secure a portion of that data.
GoTo has said that the company does not store customers’ credit card or bank details or collect personal information, such as date of birth, home address or Social Security Numbers. This, however, is in clear contrast to the hack affecting its subsidiary LastPass. During the attack at LastPass, hackers stole the contents of customers’ encrypted password vaults, along with customers’ names, email addresses, phone numbers and some billing information.
GoTo has 800,000 customers, including enterprises, but has not indicated how many of them are affected. Furthermore, despite the delay, GoTo has not provided any remediation guidance or advice for affected customers.
Does your organization use software in which passwords or other sensitive information are saved? It may be time to review your Information Security Policy to ensure technical and organizational measures are in place to adequately protect personal data. Contact us, the Experts in Data Privacy, at email@example.com for assistance.
Spanish court: location data is personal data under the GDPR
The Spanish data protection authority (AEPD) had previously held that location data processed by a telecommunications provider is not personal data under the GDPR. The privacy organization noyb (None of Your Business) appealed the AEPD’s decision before the Spanish court, which sided with noyb and ruled that location data is personal data.
It all started when a telecommunications provider denied its customers access to their location data, arguing that such data did not qualify as personal data under the GDPR and that it therefore did not have to grant access rights to users. A Spanish customer requested access to his location data, was denied, and filed a complaint with the AEPD. After the AEPD sided with the provider, noyb appealed the decision in June 2022.
The AEPD’s decision is a hard one to comprehend, since the GDPR contains a very broad definition of personal data:
“…any information relating to an identified or identifiable natural person (‘data subject’); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person…”
It even includes the example of location data being personal data.
Furthermore, the right of access to personal data should not be taken lightly. Individuals have a right under the GDPR to access the data organizations store on them. The telecommunications provider therefore could not lawfully deny access to location data.
How does your organization handle data subject requests (DSRs)? A decent DSR procedure will help your organization to assess and manage DSRs in a GDPR compliant manner. Contact us if you want to learn more about DSRs via: firstname.lastname@example.org.
Cookie banners to be consistent across the EEA soon
On the 18th of January 2023, the European Data Protection Board (“EDPB”) adopted a report by the Cookie Banner Task Force for the work carried out in ensuring a consistent approach to cookie banners across the European Economic Area (“EEA”).
The Cookie Banner Task Force was established in September 2021 to coordinate the response to complaints about cookie banners filed with numerous EEA Data Protection Authorities (“DPAs”) by NOYB.
The Task Force aimed to promote cooperation, information sharing and best practices amongst the DPAs, which has been instrumental in ensuring a consistent approach to cookie banners across the EEA.
The report notes that the various DPAs agreed upon a common denominator in their interpretation of the applicable provisions of the ePrivacy Directive and the General Data Protection Regulation (“GDPR”) on issues such as reject buttons, pre-ticked boxes, banner design and withdrawal icons.
Court of Justice of the European Union: Everyone has the right to know to whom his/her personal data have been disclosed
On January 12, 2023, the Court of Justice of the European Union (“CJEU”) issued a preliminary ruling on the interpretation of the right of access to personal data (Article 15 GDPR). The CJEU ruled that the right of access entails the right to know the specific identity of recipients of the personal data.
The CJEU was asked whether the GDPR leaves the data controller the choice to disclose either the specific identity of the recipients or only the categories of recipients, or whether it gives the data subject the right to know their specific identity.
The highlights of the judgment are:
The data subject has the right to know the specific identity of the recipients of his/her personal data as part of his/her right of access under Article 15(1)(c) of the GDPR. Informing data subjects only about the categories of recipients, e.g., advertisers, IT companies, mailing list providers, is not sufficient.
The right of access is necessary to enable the data subject to exercise other rights conferred by the GDPR, namely the right to rectification, right to erasure, right to restriction of processing, right to object to processing or right of action where he/she suffers damage. If the data subject is not informed about the identity of the recipients, the exercise of these rights vis-à-vis these recipients would be restricted.
Therefore, where the personal data have been or will be disclosed to recipients, the data controller is obliged to provide the data subject, on request, with the actual identity of those recipients, unless it is not (yet) possible to identify those recipients, or the controller demonstrates that the request is manifestly unfounded or excessive.
The information obligation under Articles 13 and 14 of the GDPR permits the data controller to inform data subjects only about the categories of recipients. The importance of the judgment lies in the fact that the data subject’s right to know the identity of recipients seems to prevail over the data controller’s right to disclose only the categories of recipients.
This ruling highlights the importance of maintaining a clear record of processing activities. Accordingly, data controllers should keep track of the actual identity of recipients for each data transfer and duly reflect the names of specific recipients in their record of processing activities; recording only the categories of recipients in the registry would not be sufficient to fully comply with Article 15(1)(c) of the GDPR.
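As a purely illustrative sketch (the structure, field names and the recipient name below are invented for the example, not prescribed by the GDPR), a record of processing activities could capture both the categories and the specifically named recipients, so that an Article 15(1)(c) access response can return names and fall back on categories only where no recipient has yet been identified:

```python
from dataclasses import dataclass, field

@dataclass
class ProcessingActivity:
    """One entry in a record of processing activities (hypothetical schema)."""
    name: str
    purpose: str
    recipient_categories: list[str] = field(default_factory=list)
    recipients: list[str] = field(default_factory=list)  # actual named recipients

def access_response_recipients(activity: ProcessingActivity) -> list[str]:
    # Per the CJEU ruling, named recipients take precedence; categories are
    # acceptable only when specific recipients cannot (yet) be identified.
    return activity.recipients or activity.recipient_categories

# "ExampleMail Ltd." is a hypothetical recipient used for illustration.
newsletter = ProcessingActivity(
    name="Newsletter dispatch",
    purpose="Direct marketing",
    recipient_categories=["mailing list providers"],
    recipients=["ExampleMail Ltd."],
)
assert access_response_recipients(newsletter) == ["ExampleMail Ltd."]
```

The design choice to keep both fields mirrors the judgment: Articles 13 and 14 transparency notices may name only categories, but an Article 15 access request must be answered with names where they exist.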
How does your organisation handle data subject rights or keep the record of processing activities? Contact us, experts in data privacy, if you want to learn more via email@example.com.
Meta fined €390 million by the Irish DPC over legal basis for targeted advertising
The Irish Data Protection Commission (“DPC”) announced its final decisions on two inquiries into Meta’s data processing operations on Facebook and Instagram. The DPC fined Meta a total of €390 million for breaches of the GDPR and invalidated its reliance on a “contract” legal basis for personal data processing for targeted advertising purposes.
The decision came after complaints made by the privacy rights organisation NOYB on May 25, 2018, the day the GDPR came into operation. Before 25 May 2018, Meta had changed the Terms of Service for its Facebook and Instagram services, indicating a change in the legal basis on which Meta relies to legitimize its processing of personal user data. Accordingly, Meta would rely on the contract legal basis for certain of its data processing activities in the context of the delivery of the Facebook and Instagram services, including behavioural advertising, rather than on consent, as it had previously done.
The users were asked to accept the new Terms of Service if they wished to continue to access Facebook and Instagram services. The DPC stated that Meta considered that “a contract was entered into” between Meta and users and that “processing of users’ data in connection with the delivery of its Facebook and Instagram services was necessary for the performance of that contract, to include the provision of personalised services and behavioural advertising, so that such processing operations were lawful by reference to Article 6(1)(b) of the GDPR (the “contract” legal basis for processing).”
The complainants argued that by making the use of its services conditional on the users’ acceptance of the Terms of Service, Meta was forcing them to agree to Meta’s use of personal data for behavioural advertising purposes, which implied that Meta was in fact relying on consent as a legal basis for this processing and that there was no real choice for users in this regard.
Following the European Data Protection Board’s binding determinations on the matter, the DPC adopted its final decisions in which it stated that:
Meta is not entitled to rely on the contract legal basis for the delivery of behavioural advertising services on Facebook and Instagram and therefore, the processing of personal user data based on contract legal basis constitutes a violation of Article 6 of the GDPR, and
Since users were not clearly informed about the legal basis relied on by Meta and what processing operations were being performed on their data, Meta violated its transparency obligations. It is considered that such a lack of transparency constituted breaches of the lawfulness, fairness and transparency principle enshrined in Article 5(1)(a), and Articles 12 and 13(1)(c) of the GDPR.
Furthermore, the decision requires Meta to bring its data processing operations into compliance with the GDPR within three months. This means that Meta has to find a new legal basis for its processing activities in relation to targeted advertising.
NOYB Founder Max Schrems embraced the decision and said users “must have a ‘yes or no’ option and can change their mind at any time. The decision also ensures a level playing field with other advertisers that also need to get opt-in consent.”
Although the decision concerns Meta’s activities on Facebook and Instagram, it is expected to influence other online platforms in the digital advertising industry.
As Schrems argued, many platforms currently make access to their services conditional on the user’s consent. The decision leaves open the question of whether Meta and other online businesses must allow users to refuse targeted advertising without blocking their access to those services.
Does your organization make use of GDPR compliant targeted advertising? Contact us, experts in data privacy, if you want to learn more via: firstname.lastname@example.org.
Microsoft fined €60 million by the CNIL over advertising cookies on Bing
The sanction came after the CNIL carried out several investigations over a complaint relating to the conditions for depositing cookies on “bing.com”. The CNIL’s decision was based on breaches of Article 82 of the French Data Protection Act. In particular, it found that:
– When a user visited the search engine “bing.com”, a cookie serving several purposes, including the fight against advertising fraud, was automatically placed on their terminal without any action on their part.
– Moreover, when a user continued to browse on the search engine, a cookie with an advertising purpose was deposited on their terminal without their consent having been obtained. This constitutes a violation of the Act, which requires that cookies of this type be placed only after users have consented.
The decision also highlighted the absence of a compliant means of obtaining consent for the deposit of cookies. It was stated that although the search engine offered a button to accept cookies immediately, it did not provide an equivalent solution (e.g. button to refuse) to allow the user to refuse them as easily as accepting them. This is because Bing offered one button for the users to accept all cookies, while two clicks were needed to refuse all cookies.
It was noted that complex refusal mechanisms are likely to discourage users from refusing cookies and instead encourage them to prefer the ease of the consent button in the first window, which ultimately undermines the freedom of consent of Internet users.
In deciding the amount of the fine, the CNIL considered the scope of the processing, the number of data subjects, and the benefits the company derives from the advertising profits indirectly generated from the data collected via cookies.
How does your organisation handle cookies? Contact us, experts in data privacy, if you want to learn more via: email@example.com.
The European Commission releases the EU-US Draft Adequacy Decision
On December 13, 2022, the European Commission published its draft adequacy decision for the EU-US Data Privacy Framework.
The draft decision came after the signature of the Executive Order by US President Joe Biden in October. The Executive Order will introduce safeguards for EU residents’ personal data, in particular by limiting the US intelligence services’ access to data and introducing an independent redress mechanism: Data Protection Review Court.
With the draft decision, the Commission considers the new legal framework based on the Executive Order to provide a level of data protection comparable with European data protection standards. This means that EU residents’ personal data can be safely and legally transferred to the United States.
There are still two steps that need to be taken for the finalization of the data adequacy process. Following the draft decision, the European Data Protection Board, which gathers all EU data protection authorities, will issue an opinion. The decision will then need to have the approval of a committee formed by member states’ national representatives before the formal adoption. The Commission expects to obtain the approval by the summer of 2023.
If approved, the adequacy decision would be subject to regular review to ensure the full implementation of the relevant elements from the US legal framework and their effective functioning in practice. The reviews will start one year after the adoption of the decision.
The draft decision might be subject to legal challenges, as happened to its predecessors in the two landmark Schrems rulings. Max Schrems has been quoted as saying “As the draft decision is based on the known Executive Order, I can’t see how this would survive a challenge before the Court of Justice. It seems that the European Commission just issues similar decisions over and over again, which is in flagrant breach of our fundamental rights.”
It is important to note that other tools for international transfers remain available to companies in the meantime. Standard Contractual Clauses, which companies can add to their commercial contracts, are the most used mechanism to transfer data from the EU to the United States. As updated by the European Commission in June 2021, the modernized Standard Contractual Clauses must be accompanied by a Transfer Impact Assessment, which is aimed at identifying and mitigating privacy risks before personal data can leave the European Economic Area. Until there is certainty from the European Commission regarding this draft adequacy decision, companies must ensure they adhere to the requirements for international data transfers.
Does your organisation have questions about international data transfers to the US? Contact us, experts in data privacy, if you want to learn more via firstname.lastname@example.org.
The CJEU: Google must delete search result content if it concerns manifestly inaccurate data
The Court of Justice of the European Union (“CJEU”) ruled that the operator of a search engine must delete search results if the person requesting de-referencing proves that the information about them is “manifestly inaccurate”.
The case concerns a right to erasure request made by two managers of an investment company. They requested that Google de-reference results of a search made on the basis of their names, which they asserted produced inaccurate claims. Google denied the request, referring to the professional context in which the content of the search result was set and arguing that it did not know whether the information contained in that content was accurate. The German Federal Court of Justice asked the CJEU for its interpretation of the GDPR provisions governing the right to erasure.
In its press release, the CJEU first noted that the right to protection of personal data is not an absolute right and that the right to erasure may be excluded where processing is necessary, in particular, for exercising the right of freedom of expression and information. In this regard, a balance must be struck between the data subject’s rights to protection of personal data and private life and the legitimate interest of internet users who may wish to access the information in question.
The CJEU then highlighted that if at least a part of the information contained in the relevant content, and one which is not of minor importance, proves to be inaccurate, the right to freedom of expression and information cannot be taken into account. In this regard, the CJEU stated that:
– the person requesting de-referencing due to inaccurate content must prove the manifest inaccuracy of the information, or of a part of that information which is not of minor importance. Yet this should not amount to an excessive burden that could undermine the practical effect of the right to erasure. Therefore, the person is only expected to provide evidence that can reasonably be required of him or her.
– the search engine operator, upon a request for de-referencing, must take into account all the rights and interests involved and all the circumstances of the case when assessing whether content may continue to be included in the search results.
Having considered the above, the CJEU concluded that where a person who has made a request for de-referencing provides relevant and sufficient evidence capable of substantiating their request and of establishing the manifest inaccuracy of the information contained in the content of the search result, the search engine operator is obliged to accede to that request.
How does your organisation handle data subject rights? Contact us, experts in data privacy, if you want to learn more via email@example.com.
Data scraping practices cost Meta a €265 million fine
On 28 November, the Irish Data Protection Commissioner imposed a €265 million fine on Meta-owned Facebook and Instagram over their data scraping practices and ordered remedial actions.
This inquiry by the Irish Data Protection Commissioner arose from the massive leak of Facebook personal data that was dumped online on a hacker forum in April 2021. The leaked data included personal information such as full names, locations, birthdates, phone numbers and email addresses.
The data leak concerned 533 million people across 106 countries. In the EU alone, approximately 86 million people were affected. At the time, Facebook said that the leaked data was old, since the mass data scraping exploited a vulnerability that the company had patched in August 2019.
A few days after the leak, the Irish Data Protection Authority announced that it was investigating the matter and would examine if Facebook’s data harvesting practices complied with the GDPR principles of privacy by design and by default.
The investigation concluded that between 25 May 2018 and September 2019 the social networks violated European privacy rules; the Commissioner imposed a set of specific remedial actions and issued a fine of €265 million.
A Meta spokesperson has been quoted as saying “unauthorized data scraping is unacceptable and against our rules, and we will continue working with our peers on this industry challenge. We are reviewing this decision carefully.” Meta can appeal the decision in court.
This is the second largest fine against Meta, following a €405 million fine against Instagram for breaching the privacy of children.
Does your company make use of data scraping practices? Contact us, the Experts in Data Privacy at firstname.lastname@example.org, to ensure that this is done in a manner that ensures privacy by design and by default are adhered to at all times.
The French Data Protection Authority Fines Utility Company EDF €600,000 for Violations of Data Security and the Rights of Individuals
France’s data protection authority, the CNIL, fined the utility company EDF €600,000, in particular for data security violations and for failing to respect its obligations in relation to commercial prospecting and the rights of individuals.
The sanction came after the CNIL received several complaints regarding the difficulties people encountered in having their rights respected by EDF, France’s largest electricity supplier. The CNIL’s fine was based on three core violations of the GDPR.
– EDF failed to demonstrate to the CNIL that it had obtained prior valid consent from the individuals as part of its commercial prospecting campaign by electronic means (Article 7 of the GDPR).
– The sanction decision also highlighted breaches of the obligation to inform (Articles 13 and 14 of the GDPR) and to respect the exercise of rights (Articles 12, 15 and 21 of the GDPR). It was noted that the personal data protection statement appearing on EDF’s website did not specify the legal basis corresponding to each use of data, was not clear on the duration of storage, and did not clearly indicate the source of the data. Moreover, the CNIL stated that EDF failed to respond to certain complaints within the one-month period provided for by the texts, that individuals were given inaccurate information on the source of the data collected, so that their right of access was not respected, and that EDF did not honour individuals’ right to object to receiving commercial prospecting.
– The violations lastly included failing to ensure the security of personal data (Article 32 of the GDPR). It was found that the passwords for accessing the customer portal to receive an energy bonus were, for more than 25,000 accounts, stored in an unsecured manner until July 2022, and that the passwords were only hashed without being salted (the addition of random data to each password before hashing, which prevents an attacker from recovering passwords by comparing stolen hashes against precomputed tables), putting them at risk.
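EDF’s actual implementation has not been published; as an illustration of what salting adds, here is a minimal Python sketch using the standard library’s PBKDF2 key-derivation function, in which each password is hashed with its own random salt and verified in constant time (the function names and iteration count are illustrative choices, not EDF’s):

```python
import hashlib
import hmac
import os

ITERATIONS = 100_000  # illustrative work factor for PBKDF2-HMAC-SHA256

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Return (salt, digest): a fresh random salt and the salted hash."""
    salt = os.urandom(16)  # unique per password, stored alongside the digest
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return hmac.compare_digest(candidate, digest)  # constant-time comparison

# Identical passwords produce different digests because each gets its own salt,
# which is exactly what defeats the hash-comparison attack the CNIL described.
salt_a, digest_a = hash_password("hunter2")
salt_b, digest_b = hash_password("hunter2")
assert digest_a != digest_b
assert verify_password("hunter2", salt_a, digest_a)
assert not verify_password("wrong-guess", salt_a, digest_a)
```

Without the per-password salt, two users sharing a password would store identical hashes, and a single precomputed table could crack every account at once.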
In deciding the amount of the fine, the CNIL considered the breaches identified, as well as the cooperation of the company and all the measures it took during the procedure to reach compliance on all of the breaches with which it was charged.
How does your organisation secure the processing of personal data or handle data subject rights? Contact us, experts in data privacy, if you want to learn more via: email@example.com.
Largest Attorney General Consumer Privacy Settlement in US History with Google
An historic $391.5 million settlement with Google in the US has been finalized over its location tracking practices.
“For years Google has prioritized profit over their users’ privacy. They have been crafty and deceptive. Consumers thought they had turned off their location tracking features on Google, but the company continued to secretly record their movements and use that information for advertisers”, said Attorney General Rosenblum.
Google has misled users into thinking that they had turned off location tracking, while Google continued to collect their location information. For Google, location data is a key part of its digital advertising business. It uses personal and behavioral data to build detailed user profiles and target ads.
As part of the multimillion-dollar settlement, Google has agreed to significantly improve its location tracking disclosures and user controls starting in 2023. The settlement furthermore requires that Google will be more transparent about its practices by:
Showing additional information to users whenever they turn a location-related account setting “on” or “off”;
Making key information about location tracking unavoidable for users (i.e., not hidden); and
Giving users detailed information about the types of location data Google collects and how it is used at an enhanced “Location Technologies” webpage.
In a blog post, Google outlined it will “provide a new control that allows users to easily turn off their Location History and Web & App Activity settings and delete their past data in one simple flow.” The company also plans to add additional disclosures to its Activity controls and Data & Privacy pages.
This settlement is an important milestone, since location data is among the most sensitive and valuable data that Google collects: it can expose a person’s identity and routines and be used to infer personal details. The settlement is another example of how individuals would benefit from a comprehensive US federal privacy law that prevents the use of large amounts of personal data for marketing purposes with only a few controls in place.
How does your organization safeguard the processing of location data or the processing of user profiles? Contact us, experts in data privacy, if you want to learn more via: firstname.lastname@example.org.
EDPB adopts recommendations on Controller Binding Corporate Rules
On November 14, 2022, the European Data Protection Board (EDPB) adopted Recommendations on the application for approval of, and on the elements and principles to be found in, Controller Binding Corporate Rules (BCR-C).
What are the BCRs?
BCRs are a data transfer tool that can be used by a group of undertakings, or a group of enterprises, engaged in a joint economic activity, for its international transfers of personal data from the European Union (EU) to controllers or processors within the same group of undertakings or enterprises outside of the EU. In this sense, BCRs are suitable for multi-national organisations making frequent transfers of personal data between group entities.
According to Article 46(2)(b) of the GDPR, BCRs are permitted safeguards for transfers of personal data to third countries. BCRs create a framework of policies and procedures implemented throughout the entities of the organizations, which include enforceable rights and commitments to establish an essentially equivalent level of protection to that guaranteed by the GDPR. BCR applications should meet the requirements set out in Article 47 GDPR for the relevant supervisory authority to approve the BCRs.
What do the recommendations bring?
The recommendations provide additional guidance on BCR-C applications, update the existing application form for the approval of BCR-Cs, and clarify the content required for a BCR-C as stated in Article 47 GDPR. The recommendations also distinguish between information that must be included in the BCR-C itself and information that must be presented to the relevant data protection authority in the application.
The EDPB considers that the recommendations aim to level the playing field for all BCR applicants and align the current guidance with the requirements of the CJEU’s judgment in the Schrems II case.
The EDPB notes that recommendations for processor BCR are currently being worked on.
Stakeholders can contribute their comments on the recommendations until January 10, 2023.
Although this is a great development, it would also be important to improve the BCR approval process with data protection authorities (DPAs), since there are great discrepancies between the various DPAs in the EU.
Does your organization have questions about binding corporate rules or international data transfers? Contact us, the Experts in Data Privacy at email@example.com for more information.
NIS2 adopted to strengthen EU-wide resilience for cyber security
MEP Bart Groothuis has been quoted as saying “this is the best cyber security legislation this continent has yet seen, because it will transform Europe to handling cyber incidents pro-actively and service orientated.”
Differing national cybersecurity measures make the European Union (EU) more vulnerable. The new legislation sets tougher requirements for businesses, administrations and infrastructure. Last week, rules requiring EU countries to meet stricter supervisory and enforcement measures and harmonize their sanctions were approved by Members of the European Parliament (MEPs).
The Network and Information Security (NIS) Directive was the first EU-wide legislation on cybersecurity and its aim was to achieve a high common level of cybersecurity across the Member States. While increasing Member States’ cybersecurity capabilities, its implementation proved difficult with the result of fragmentation at various levels across the internal market.
To address the growing threats posed by digitalization and the surge in cyberattacks, the European Commission submitted a proposal to replace the NIS Directive. This has led to the introduction and adoption of the NIS2 Directive.
This legislation sets stricter cyber security obligations for risk management, reporting obligations and information sharing. The requirements also cover incident response, supply chain security, vulnerability disclosure and encryption, amongst others.
A result of this new Directive is that more entities and sectors will have to take measures to protect themselves. “Essential sectors” have been defined, which include the energy, transport, banking, health, digital infrastructure, public administration and space sectors. Furthermore, numerous “important sectors”, such as the manufacturing of medical devices and digital providers, will be protected by the new rules. All medium-sized and large companies in the selected sectors would fall under the legislation.
Furthermore, a framework for better cooperation and information sharing between different authorities and Member States has been created, as well as a European vulnerability database.
The only step remaining is for the European Council to formally adopt the law, after which it will be published in the EU’s Official Journal.
Does your organization have questions about cyber security or how the NIS2 Directive may impact you? Contact us, the Experts in Data Privacy at firstname.lastname@example.org for more information.
The right to privacy vs. journalistic interests: A balancing exercise
An individual who was referred to in a news article as a master swindler filed suit, arguing that article 8 of the European Convention on Human Rights (ECHR), the ‘Right to respect for private and family life’, had been violated. He claimed €40,000 in damages from the journalist and also requested a ban on any future publications about him.
A journalist had published articles about the individual online, via its website, which were thereafter shared by national newspapers. The articles referred to the individual as a master swindler, accusing him of selling phone cards to detainees and of scamming them. The articles revealed the individual’s full name, date of birth and a photograph.
The Court of First Instance dismissed the individual’s claims and considered that the journalist processed his data for journalistic purposes. Additionally, the court mentioned that the press has a watchdog function, in this case by warning the public about fraud, and that therefore, the journalist’s freedom of expression outweighed the protection of the individual’s privacy.
The individual then appealed the decision of the Court of First Instance, claiming that the journalist’s purpose was to make him look bad and that there was therefore no legal ground for processing his data under article 6 GDPR. Accordingly, the individual argued, article 43 UAVG (the Dutch Implementation Act), which provides that the UAVG does not apply to the processing of personal data for exclusively journalistic purposes, was not applicable either.
The Court of Appeal then explained that the term ‘journalistic’ within the meaning of article 43 UAVG should be interpreted broadly. The journalist’s website was not deemed to make people look bad, but rather to inform the public about fraudulent activities. The processing of the individual’s data was therefore not deemed to fall outside the scope of article 43 UAVG. The Court did note that the processing of criminal data is prohibited under article 10 GDPR, unless the exception under article 43(3) UAVG applies, that is, insofar as the processing is necessary for journalistic purposes. The Court deemed this processing of personal data necessary because the publications concerned reporting on and warning about fraudulent practices. Regarding the applicable legal basis, the court held that journalistic activities are a legitimate interest for processing data under article 6(1)(f) GDPR. It therefore dismissed the individual’s appeal.
Although many people think the right to privacy is an absolute right, it is important to note that this is not the case. A balancing exercise is required to decide which right prevails in particular circumstances. It is a positive development that more case law is appearing regarding journalistic purposes, since this is still a grey area under relevant laws like the GDPR.
Contact us, experts in data privacy, if you want to learn more about data privacy via: email@example.com.
Endangering encryption? The European Commission’s gross violation of privacy
Strong end-to-end encryption is an essential part of a secure and trustworthy internet. This protects citizens every time an online transaction is made, when medical information is shared or when citizens interact with family and friends.
Strong encryption also helps protect children as it allows them to communicate with family and friends in confidence and allows others to report online abuse and harassment in a confidential manner. Encryption ensures personal data, and citizens’ private conversations, are kept private.
The EU’s new regulation intending to fight child sexual abuse online will require internet platforms – including end-to-end encrypted messaging applications like Signal and WhatsApp – to “detect, report and remove” images of child sexual abuse shared on their platforms. In order to do this, however, platforms would have to automatically scan every single message. This process is known as “client-side scanning.”
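To make the concern concrete, here is a much-simplified sketch of the hash-based matching technique that client-side scanning proposals rely on. It assumes exact SHA-256 comparison against a hypothetical blocklist; real deployments use perceptual hashes (such as PhotoDNA) that survive small image edits, and nothing here reflects any platform's actual implementation.

```python
import hashlib

# Hypothetical blocklist of digests of known prohibited images.
# The placeholder digest below is not a real hash of anything.
KNOWN_HASHES = {"0" * 64}

def scan_attachment(data: bytes) -> bool:
    """Return True if the attachment's digest matches a known hash.

    Under client-side scanning, a check like this would run on the
    user's device against every message BEFORE encryption, which is
    precisely the privacy concern raised by critics.
    """
    return hashlib.sha256(data).hexdigest() in KNOWN_HASHES

print(scan_attachment(b"holiday photo"))  # ordinary content -> False
```

Note the fragility of exact matching: changing a single byte of a flagged file produces an entirely different digest, which is why real systems resort to fuzzier perceptual hashing, with its own false-positive risks.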
Not only is this a gross violation of privacy, there is no evidence that the technology exists to do this effectively and safely without undermining the security provided by end-to-end encryption. While the proposed regulation is well-intentioned, it will result in weakening encryption and making the internet less secure.
This proposal has already been criticized by privacy watchdogs – the European Data Protection Board (EDPB) and the European Data Protection Supervisor (EDPS) – which issued a joint statement calling for the regulations to be amended. The proposals were described as “highly intrusive and disproportionate”, arguing that by requiring platforms to weaken encryption, the regulations violate Articles 7 and 8 of the Charter of Fundamental Rights of the European Union – namely the right to respect for private and family life and the right to protection of personal data.
The EU has fallen for the myth that it is possible to keep EU citizens safer by weakening the very thing that protects them. If backdoors are created for law enforcement, weaknesses in the system are created for everyone. These weaknesses could be exploited by criminal gangs or other malicious actors.
Furthermore, it is impossible for platforms to weaken encryption only for users located within the EU – any reduction in security would affect users of those platforms across the globe. In the United Kingdom (UK), similar legislation has been proposed for WhatsApp. WhatsApp has indicated that it is willing to withdraw from the UK market if they are required to weaken encryption. The same could occur across Europe.
The EU relies on encryption to protect the security of its member countries and the bloc as a whole but by proposing to weaken encryption, it may result in making all citizens more vulnerable.
If implemented correctly, encryption can be a very powerful tool in protecting personal data that your organization processes. Do you have any questions about encryption and how to keep personal data safe and secure? Contact us, the Experts in Data Privacy, at firstname.lastname@example.org
The national commission for data protection has become the first data protection authority in Europe to accredit a GDPR certification body.
On 12 October, the national commission for data protection in Luxembourg accredited an entity via its certification mechanism, GDPR-CARPA (General Data Protection Regulation-Certified Assurance Report-Based Processing Activities). This is the first mechanism to be adopted on a national and international level under the GDPR.
The European Data Protection Board’s Opinion states that certification mechanisms should enable controllers and processors to demonstrate compliance with the GDPR. The criteria should, therefore, properly reflect the requirements and principles concerning the protection of personal data as laid down in the GDPR and contribute to its consistent application.
The GDPR is a law that regulates how organizations target or collect data related to people in the European Union. Furthermore, it outlines how organizations must protect and handle data in a secure manner and details privacy rights which give individuals more control over their personal data.
With a GDPR certification, companies, public authorities, associations and other organizations can show that their data processing activities are complying with the GDPR. The implementation of the certification mechanism can promote transparency and compliance as it allows businesses and individuals to evaluate the level of protection offered by products, services, processes or systems used or offered by organizations that process personal data. Entities can, therefore, benefit from an independent certificate to demonstrate that their data processing activities comply with EU regulations. It will be interesting to see how this develops with other data protection authorities in the EU.
Does your organization have questions about accreditation? Contact us, the Experts in Data Privacy at email@example.com for more information.
The European Data Protection Board (EDPB) has published an updated version of the Guidelines on Personal Data Breach Notifications under the General Data Protection Regulation (GDPR).
The previous Guidelines provided that “notification should be made to the supervisory authority in the Member State where the controller’s representative in the EU is established.” The revised Guidelines, however, state that “[…] the mere presence of a representative in a Member State does not trigger the one-stop-shop system. For this reason, the breach will need to be notified to every single authority for which affected subjects reside in their Member State. This notification shall be done in compliance with the mandate given by the controller to its representative and under the responsibility of the controller.”
Should the EDPB finalize the above suggested procedure, it can have immense implications for companies not established in the European Union (EU) and processing the personal data of EU citizens. It is, therefore, important for companies to keep up to date with developments within privacy and data protection.
The revised Guidelines are open for public consultation until 29 November 2022.
Does your company have questions about how this may impact your operations within the EU, or does your company require the services of a representative? Contact us, the Experts in Data Privacy, at firstname.lastname@example.org.
Dutch employee fired by U.S. firm for shutting off webcam awarded €75,000 in court
A Dutch employee of a U.S. firm with a branch in the Netherlands refused to be monitored during a full workday. As a consequence, the employee was fired for ‘refusal to work’ and ‘insubordination’. In the ensuing lawsuit, a Dutch court awarded the employee €75,000 for wrongful termination.
The employee began working for the U.S. firm in 2019. A year and a half later, he was ordered to take part in a virtual training period called the ‘Corrective Action Program’. During this program he was told to remain logged in for the entire day, with screen sharing turned on and his webcam activated.
The employee did not feel comfortable being monitored for a full workday and claimed it was an invasion of privacy. He agreed to having the activities on his laptop monitored and to sharing his screen, but he refused to activate the webcam. Thereafter, he was fired.
The court concluded that the termination was not legally valid, because the reasons for dismissal were not made sufficiently clear, nor were the instructions reasonable. According to the court, the instructions certainly did not respect the employee’s right to respect for his private life (article 8 of the European Convention on Human Rights).
Many organizations do not realize that the ‘right to privacy’ is a fundamental right in the European Economic Area (EEA). When conducting business operations in the EEA, it is essential to respect this at all times. This may conflict with, for example, the more flexible workplace privacy practices in the U.S.
It is therefore important to make sure your organization is aware and informed when doing business in the EEA. Contact us for all of your privacy related questions via: email@example.com.
President Biden signs executive order on EU-US data privacy management
On 7 October, President Biden signed an executive order that would limit the ability of American national security agencies to access people’s personal information as part of the transatlantic data sharing agreement with the European Union (EU).
This executive order follows lengthy negotiations between the United States (US) and EU after the Court of Justice of the European Union (CJEU) ruled in 2020 that the US did not sufficiently protect Europe’s data when it was transferred across the Atlantic. The judges’ concerns focused on how US surveillance programmes did not have proper measures for European citizens to address how the government collected their data.
What is new?
The Data Privacy Framework (DPF) includes three components:
Commercial data protection principles to which US organizations may self-certify: the Privacy Shield will be updated to refer to the GDPR directly and US organizations must ensure they remain abreast of these developments.
A presidential executive order: this requires US intelligence authorities to limit US signals intelligence activities to what is necessary and proportionate. The executive order imposes necessity and proportionality limits first by mandating them explicitly, then by explaining what that mandate means and finally by prescribing oversight mechanisms to verify that intelligence agencies follow the new rules. What this will mean in practice is also included.
DOJ regulations: a two-step redress system will be introduced, which includes a new Data Protection Review Court. The first tier is an evaluation of a submitted claim by the Civil Liberties Protection Officer (CLPO) and the second tier is the Data Protection Review Court. This court is meant to process complaints concerning the legality of US signals intelligence activities transmitted from “qualifying states” for covered violations of US law.
Max Schrems’ response
The day the executive order was issued, Max Schrems and noyb.eu published their first reaction indicating that the executive order is unlikely to satisfy EU law. According to Schrems the executive order does not offer any solution and that there will be continuous “bulk surveillance” and a “court” that is not an actual court.
Bulk surveillance will continue by means of two types of ‘proportionality’. The words “necessary” and “proportionate” have been included, but the EU and US did not agree that they will carry the same legal meaning. In the end, the CJEU’s definition will prevail, again calling any EU adequacy decision into question.
The “court” will not be a court but rather a body within the US government’s executive branch. The new system is an upgrade of the previous “Ombudsperson” system, which was rejected by the CJEU and it appears that it will not amount to “judicial redress” as required under the EU Charter.
noyb.eu has indicated that they are in the process of working on an in-depth analysis, which will be published soon. If the Commission’s decision is not in line with EU law and the relevant CJEU judgments, noyb.eu will likely bring another challenge before the CJEU.
What is next?
The European Commission will now launch its adequacy assessment. This process requires the Commission to put forward a draft adequacy determination, the EDPB to issue a nonbinding opinion, EU Member States to vote to approve the decision and the European Commission College of Commissioners to formally adopt it. The European Parliament may also weigh in with a non-binding resolution at any stage.
Previously, this process has taken four to five months once the Commission finalizes its draft. Should this trend be followed, further clarification is expected in March 2023. Until then, however, organizations should not be under the impression that this relieves them of their obligation to conclude the Standard Contractual Clauses (SCCs), including the performance of a Transfer Impact Assessment (TIA), by 27 December 2022.
Does your organization have any questions about transferring personal data internationally to the US? Contact us, the Experts in Data Privacy at firstname.lastname@example.org for assistance.
Earlier this week Britain vowed to tear up the European Union’s General Data Protection Regulation (GDPR) rules and implement an independent approach to data privacy.
This announcement was made by the British Secretary of State for Digital, Culture, Media and Sport Michelle Donelan, who pledged to remove red tape during an address to the Conservative’s party annual conference, although the precise constitution of the new regime remains vague.
Donelan said: “We will be replacing GDPR with our own business-and consumer-friendly British data protection system. I can promise…that it will be simpler, it will be clearer for businesses to navigate. No longer will our businesses be shackled by lots of unnecessary red tape.”
Vowing to build new privacy protections founded on ‘common sense’, Donelan claimed that the British economy could be boosted by dispensing with needless bureaucracy while retaining protections for internet users.
It is not clear when this new regime will come into force, what exactly it will entail, and what implications it will have for the adequacy decision that was adopted by the European Commission in June 2021. This is an interesting development to keep in mind when operating in the United Kingdom.
In September, Instagram was fined over the protection of children’s data. TikTok may now be facing a similar fate.
Investigations by the Information Commissioner’s Office (ICO) in the United Kingdom (UK) have found that TikTok, the video-sharing app, may have breached UK data protection law between 2018 and 2020.
The ICO issued TikTok with a “notice of intent”, a precursor to handing down a potential fine, which could be up to £27 million. If TikTok were to be fined this amount, it would be the largest fine in the ICO’s history, exceeding the record of £20 million handed to British Airways two years ago after an incident in 2018 that saw the personal details of more than 400,000 customers compromised by hackers.
The ICO’s provisional view is that TikTok may have processed the data of children under the age of 13 without parental consent and failed to provide proper information to its users in a “concise, transparent and easily understood way.” Furthermore, TikTok may have processed special categories of personal data without legal grounds to do so.
The information commissioner said that “companies providing digital services have a legal duty to put those protections in place but our provisional view is that TikTok fell short of meeting that requirement.” TikTok said it disagreed with the ICO’s provisional finding and would make a formal response challenging the findings of the investigations. Upon receipt of these representations, the ICO will reach a conclusion.
Do you have questions about complying with the (UK) GDPR? Contact us, the Experts in Data Privacy, at email@example.com for assistance.
One of the things we learned yesterday is that, according to Max Schrems, the potential new data transfer agreement between the US and the EU is meant to win time, not to solve the true problem regarding international data transfers from the EU to the US.
Our colleagues Jelmer and Dounia attended the PrivSec Amsterdam event that included a range of speakers from world renowned companies and industries to allow privacy professionals from across different fields to share case studies and their experiences. It included keynote speeches, presentations, panel discussions, and enough time to meet up with other privacy professionals.
Topics of yesterday’s program included Google Analytics, the cooperation between data protection and security teams, data retention, consumer trust and transparency, and DPIAs amongst others. Furthermore, Max Schrems provided a presentation on The Future of Online Privacy, GDPR Enforcement and The Battle Against Surveillance.
In his keynote speech, Schrems elaborated on his belief that, even though a new Executive Order including a proportionality assessment is planned, the new international data transfer agreement would not change anything and is merely meant to win time, because FISA and the PRISM program would still apply. The best solution, in his eyes, would be to raise the level of data protection in the US, among other things by introducing a federal privacy law.
Furthermore, he also warned organizations of the risk of lawsuits by individuals if they continue their data transfers in violation of the GDPR, which could ultimately lead to higher costs than the fines imposed by data protection authorities.
For us, the Schrems keynote in particular was very interesting, because it confirmed that we should continue helping our clients with all the work needed on Standard Contractual Clauses (SCCs) and Transfer Impact Assessments (TIAs). Chances are that a new standard will again be invalidated, and your organization should be able to fall back on at least a GDPR-compliant TIA in case SCCs are used as an international data transfer mechanism.
Will the next data transfer agreement smoothen cross-border data flows?
G-7 leaders have gathered just recently in June to discuss legal deals underpinning bilateral data flows that exist among most of G-7’s members. G-7 consists of the United States, United Kingdom, Germany, France, Italy, Japan, and Canada.
The goal of the meeting was to align the regulators’ approaches to privacy and to better understand the domestic rules in each jurisdiction. Methods were discussed to move data and to create options for businesses to choose cross-border transfer tools suitable for their business needs. One can think of techniques to properly anonymize data by stripping information of details that identify an individual, and of the trend toward closer cooperation between antitrust and privacy regulators.
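As an illustration of the "stripping identifying details" technique mentioned above, the sketch below replaces direct identifiers in a record with keyed hashes. The field names and key handling are assumptions for illustration only; note that under the GDPR keyed hashing is pseudonymization rather than true anonymization, since whoever holds the key can still re-identify the individual.

```python
import hashlib
import hmac
import secrets

# The key must be stored separately from the pseudonymized data;
# destroying it moves the data closer to (but not all the way to)
# anonymization.
SECRET_KEY = secrets.token_bytes(32)

def pseudonymize(record: dict, identifier_fields=("name", "email")) -> dict:
    """Return a copy of the record with direct identifiers replaced
    by stable keyed-hash pseudonyms; other fields are left intact."""
    out = dict(record)
    for field in identifier_fields:
        if field in out:
            digest = hmac.new(SECRET_KEY, out[field].encode(), hashlib.sha256)
            out[field] = digest.hexdigest()[:16]  # stable pseudonym
    return out

record = {"name": "Jane Doe", "email": "jane@example.org", "country": "NL"}
safe = pseudonymize(record)
print(safe["country"])             # non-identifying field survives -> NL
print(safe["name"] != "Jane Doe")  # direct identifier replaced -> True
```

Because the pseudonym is stable for a given key, records about the same person can still be linked for analysis, which is exactly the trade-off regulators weigh when discussing such techniques.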
An important conclusion of the meeting was that countries need legislation ensuring that an individual’s personal data is only accessed where strictly necessary for national security purposes. However, that is easier said than done, especially after the Schrems II ruling and the criticism from European lawmakers of the United States’ intelligence practices. The lack of a federal privacy law that guarantees privacy rights also plays a big role in this discussion.
The regulators of the G-7 countries wish to smoothen international business for many companies. To enable this, it is imperative to create a better understanding of how domestic rules currently affect certain kinds of information, and how that information can be used and sold. Mr. Wiewiórowski, the European Data Protection Supervisor, does not believe the G-7 countries are ready to create such a market for cross-border data flows.
All in all, the next solution for smoothening cross-border data flows should address the complex points above, to make sure that the next data transfer agreement is not invalidated just like Safe Harbor and the Privacy Shield. There is a lot of work to be done. Until then, it is mandatory to fall back on the GDPR’s international data transfer mechanisms, including Standard Contractual Clauses (SCCs), and to perform Transfer Impact Assessments (TIAs).
The European Data Protection Board (EDPB) has announced that its next coordinated enforcement action will focus on the data protection officer (DPO) designations.
The 22 Supervisory Authorities across the European Economic Area, along with the European Data Protection Supervisor, will launch investigations into the yet-to-be-determined aspects of DPO requirements under the General Data Protection Regulation.
In a co-ordinated action, the EDPB prioritises a certain topic for data protection authorities to work on at a national level. The EDPB has further said that the individual actions will be “bundled and analysed, generating deeper insight into the topic and allowing for targeted follow-up on both national and EU level.”
What this will entail for the designation of DPOs is still unclear, but once further information is made available, we will provide an update. If you have any questions about the designation of a DPO and the advantages of DPO-as-a-Service, contact us: firstname.lastname@example.org
Interpretation of special categories of personal data extended by CJEU
In August this year, the Court of Justice of the European Union (CJEU) issued a preliminary ruling in OT v Vyriausioji tarnybinės etikos komisija (Chief Official Ethics Commission, Lithuania) (Case C-184/20), which was referred by the Regional Administrative Court of Lithuania. In this ruling, the CJEU elected to interpret the GDPR very broadly.
The CJEU clarified that the indirect disclosure of sexual orientation data is protected under Article 9 of the General Data Protection Regulation (GDPR). Such disclosures therefore fall under the special categories of personal data.
This case arose from a question concerning the application of Lithuanian law, which required people in receipt of public funds to file declarations of interest. The declarations, which included information about the interests of the individual’s “spouse, cohabitee or partner”, were published online. The applicant failed to file a declaration and was sanctioned as a result. The CJEU found that the underlying law did not strike a proper balance between the public interest in preventing corruption and the rights of affected individuals.
The CJEU went on to note that because it is possible to deduce information about an individual’s sex life or sexual orientation from the name of their partner, publishing that information online involves processing special categories of personal data as defined in Article 9 GDPR.
It was found by the CJEU that the processing of any personal data that are “liable indirectly to reveal sensitive information concerning a natural person”, for instance, any information that may reveal a person’s racial or ethnic origin, religious or philosophical beliefs, political views, trade union membership, health status or sexual orientation, is subject to the prohibition from processing under Article 9(1) GDPR unless an exception under Article 9(2) applies.
The implications of this ruling could be significant. It is possible that common processing operations, such as publishing a photograph on a corporate social media page, could reveal some information that is protected under Article 9. Controllers may need to review their processing operations through a contextual lens to assess whether the data being processed and the manner of processing is liable or able to reveal any sensitive information.
It has even been suggested that this ruling could have implications in all contexts where Article 9 is applicable, including online advertising, dating apps, location data indicating places of worship or clinics visited, food choices for airplane flights amongst others.
The way forward?
The judgment does not make clear how far controllers need to go in making this assessment. One possible option is to argue that if the controller does not make personal data public, and implements policies that prohibit employees from making inferences, then the information is not liable to reveal special category data. An alternative would be for regulatory guidance to be issued indicating how controllers can comply with the ruling and the existing guidelines.
€405 million: second highest fine ever issued under the GDPR
Ireland’s Data Protection Authority has fined Instagram €405 million over the lack of protection of children’s data. The fine was issued after an investigation found that Instagram, the social media platform, had mishandled teenagers’ personal information in violation of the European Union’s strict data privacy laws.
The investigation, which commenced in 2020, focused on how Instagram displayed the personal details of users in the age range of 13 – 17 years, including email addresses and phone numbers. The investigation began after a data scientist found that users, including those under the age of 18, were switching to business accounts and had their contact information displayed on their profiles. It is alleged that users were doing this to see statistics on how many likes their posts were getting, after Instagram started removing the feature from personal accounts in some countries to help with mental health.
Instagram said it updated its settings over a year ago and has since released new features to keep teenagers safe and their information private. Instagram disagrees with the calculation of the fine and plans to appeal this decision.
This investigation forms part of over a dozen investigations into Meta companies, including Facebook and WhatsApp, opened by Ireland’s Data Protection Authority. Will the largest (or third-largest) fine be issued against a Meta company in the near future? All we can do is watch this space…
The California Attorney General has dropped a bombshell: the first enforcement settlement under the California Consumer Privacy Act (CCPA). Sephora, a French cosmetics brand, must pay $1.2 million in fines and abide by a set of compliance obligations.
It is alleged that Sephora failed to disclose to consumers it was selling their personal information; failed to honour user requests to opt out of sale via user-enabled global privacy controls and did not cure these violations within the 30-day period allowed by the law.
At issue in this case was Sephora’s sharing of information with third-party advertising networks and analytics providers. This case marks a considerable uptick in risk for companies doing business in California and preparing for the California Privacy Rights Act activation in January 2023. It is clear that the Attorney General is focusing on online tracking and the implementation of and compliance with global opt-out signals, such as the Global Privacy Control.
Take-aways from this:
A French cosmetics brand was fined, not a large technological company.
A Global Privacy Control is a browser extension that automatically signals a consumer’s privacy preferences to all websites they visit without having to click on opt-out links one by one.
The Attorney General has been quoted as saying, “My office is watching, and we will hold you accountable…There are no more excuses.”
The CCPA’s notice-and-cure provision is expiring at the end of this year, which means that businesses must comply from the outset.
The American Data Privacy Bill, currently before Congress, may further alter the rules, and such enforcement actions might not occur in the future.
Interesting developments within the privacy sphere of the USA
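Since the Sephora settlement turned on honouring global opt-out signals, a minimal sketch of how a website backend might detect the Global Privacy Control may be useful. Per the GPC proposal, participating browsers send the request header `Sec-GPC: 1`; the function name and surrounding logic here are hypothetical, not Sephora's or any vendor's implementation.

```python
def opted_out_of_sale(headers: dict) -> bool:
    """Treat the request as an opt-out of sale if the Global Privacy
    Control header is present and set to "1", as the GPC spec defines.

    HTTP header names are case-insensitive, so normalise them before
    looking up Sec-GPC.
    """
    normalised = {k.lower(): v.strip() for k, v in headers.items()}
    return normalised.get("sec-gpc") == "1"

print(opted_out_of_sale({"Sec-GPC": "1"}))       # -> True
print(opted_out_of_sale({"User-Agent": "Moz"}))  # -> False
```

A real deployment would then have to propagate this signal to every downstream advertising and analytics integration, which is where the Sephora case shows compliance tends to break down.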
EU plans to cut unneeded medical tests with data health plan
Being able to easily access your patient health records under strict rules that protect your privacy?
If it were up to the European Commission (EC), this will be arranged by 2025 for patients, medics, regulators and researchers. The EC believes this will improve diagnosis, boost medicine research, and cut unnecessary costs from the duplication of medical tests. The EC has offered a binding proposal to EU governments and lawmakers, which includes the EU executive’s plans for an improved health data space that would lead to savings and economic gains of over 10 billion euros in 10 years.
Webinar: what does effective vendor risk management look like?
How do you ensure that you include all aspects of the GDPR in your vendor risk management? In this webinar, you’ll learn about the specific privacy management activities you need to perform with a vendor at each stage to comply with the GDPR.
Class-Action Lawsuit Targets Company that Harvests Location Data from 50 Million Cars
Data being sold that reveals your whereabouts, for example where you live, work or go to church? This happened to more than 50 million car owners around the world.
A California-based data broker is involved in a class-action lawsuit because it has been accused of secretly collecting and selling real-time GPS location information from more than 50 million cars around the world, including California-based consumers. The claim in the lawsuit is that the company never requests consent from drivers before tracking their location.
This is what your organization needs to know about the Digital Services Act (DSA)
The EU legislators reached an agreement on the DSA, which will govern the digital sphere and increase the fight against illegal content and disinformation. It builds on the eCommerce Directive and provides clear rules for content moderation, platform accountability, illegal products and systemic risks.
Announcement on New Trans-Atlantic Data Privacy Framework
16 July 2020 is a very significant date in the world of privacy and data protection. The reason for this is that on this date, the Court of Justice of the European Union (CJEU) handed down the judgment invalidating the Privacy Shield framework. This ruling has become known as the Schrems II ruling.