DPAs’ guidance on surviving in the post-‘Schrems II’ world

The IAPP has set up a valuable resource collecting the guidance and statements issued by national DPAs in response to the recent CJEU ruling in the so-called ‘Schrems II’ case. The IAPP aims to update the register on an ongoing basis.

The link is below:

https://iapp.org/resources/article/dpa-and-government-guidance-on-schrems-ii-2/

While privacy pros advise putting SCCs in place as a substitute for the invalidated Privacy Shield, it should be noted that SCCs are themselves a safeguard with a limited scope of application: (i) they still do not cover many processing scenarios (e.g., processor-to-controller, processor-to-sub-processor); (ii) they are quite outdated (issued in 2001, 2004 and 2010, in the pre-GDPR world); (iii) their validity has been made subject to several conditions by the ‘Schrems II’ decision.

The ambiguous status of SCCs under the ‘Schrems II’ decision

As the entire privacy community already knows, the CJEU has today struck down the EU-US Privacy Shield scheme while confirming the validity of SCCs.

The arguments against Privacy Shield have changed little since the ‘Schrems I’ decision that invalidated Safe Harbour: governmental intrusion, lack of proportionality and the ineffective role of the ombudsperson.

What is really new is that an EU-based data controller relying upon SCCs is now expected to assess how public authorities in third countries obtain access to personal data and how the legal systems in those countries work.

Two questions still remain:

1. How are the controllers in question expected to conduct such an evaluation? Is there any methodology for this? It may seem somewhat similar to Article 45(2), which lists the factors the Commission shall evaluate when issuing adequacy decisions. However, a private entity relying on SCCs is not an EU body and often lacks the resources and the understanding needed to conduct the research and put the necessary safeguards in place.

2. Enforcement. With DPAs already facing a lack of financial resources and manpower, the CJEU’s decision places an extra burden on them. Thus, this newly invented (by the CJEU) requirement may easily end up unviable, with no practical effect, due to insufficient oversight.

Bonus question: taking into account the ‘accountability’ principle, how should exporting controllers demonstrate their compliance with the new obligation?

Hopefully, the answers are still to come.

An interesting twist in the ‘cookie walls’ saga.

France’s Council of State has ordered the CNIL (the French data protection watchdog) to cancel parts of its guidelines on cookies, as the ban on cookie walls was not valid. The court explained that the CNIL had exceeded its specific mandate under so-called ‘flexible law’ (‘droit souple’), which refers to instruments, such as regulatory authorities’ guidelines, that do not create legal rights or obligations.

Although a recent update of the EDPB Guidelines on consent invalidated ‘cookie walls’, our patient may still be very much alive: similar court decisions could follow in other Member States.

Recently, the BfDI (the German federal watchdog) said that “cookie-walls are permitted if a comparable service is also offered without tracking, for example, as a paid service”. This came right after the update of the EDPB Guidelines on consent was published.

The original text of the decision is in French:

https://www.conseil-etat.fr/actualites/actualites/le-conseil-d-etat-annule-partiellement-les-lignes-directrices-de-la-cnil-relatives-aux-cookies-et-autres-traceurs-de-connexion

Breaking news: the EDPB has published its “one-stop-shop” decision register.

Besides being a great tool for privacy pros to keep up to date with the extensive case law, it also increases overall awareness of how data protection laws are applied in cooperation between the lead DPA and the other DPAs concerned (Article 60 GDPR).

As I expect more comments on this in the days and weeks to come, for now just two interesting points:

– most of the cases published so far relate to data subject rights and the lawfulness of processing;

– so far, lead DPAs have issued more compliance orders and reprimands than fines.

To read more, see below.

https://edpb.europa.eu/news/news/2020/edpb-publishes-new-register-containing-one-stop-shop-decisions_en

The interplay between GDPR Articles 25 (‘Data protection by design’, DPbD) and 35 (DPIA).

One is not a ‘special case’ of the other, as it may seem prima facie. The KEY consideration here is that a DPIA is conducted prior to rolling out new projects involving high-risk processing operations and is thus tailored specifically to them. In contrast, DPbD comes into play at the very earliest stage of a data controller’s lifecycle and applies to every processing activity (not only those posing a high risk), including core ones.

Similarly, a DPIA may indicate whether particular processing is in line with the controller’s privacy policy in the context of the project at issue, but it will not evaluate the content of that policy itself.

This makes clear that a DPIA is not a substitute for DPbD and, hence, cannot be the answer on its own.

Further to this, it should also be noted that DPbD has recently received increased attention from the EDPB (see Guidelines 4/2019) and from national watchdogs in Romania, Greece and Germany, which have issued fines for non-compliance with Article 25.

More to read on this in an article from IAPP authors (see below):

https://iapp.org/news/a/privacy-by-design-gdprs-sleeping-giant/

The status of non-EU processors under Article 3(2) GDPR

A thorough analysis of the clear points and grey zones of the EDPB Guidelines 3/2018 on territorial scope.

My attention was drawn, in particular, by a friendly reminder that the status of a non-EU processor under Article 3(2) is dual:

  • it is indirectly influenced by the GDPR if it carries out processing on behalf of an EU controller (through the data processing agreement under Article 28 and the Chapter V obligations);
  • it is directly caught by the GDPR if the processing activities carried out on behalf of a controller meet the ‘targeting criterion’ within the meaning of Articles 3(2)(a) and 3(2)(b).

More to read – see below.

The “purpose” element: what is inside the controller’s mind?

In ‘Opinion 4/2007’ on the concept of personal data, the Article 29 Working Party (‘WP29’) identified four building blocks in the definition of personal data: ‘any information’, ‘relating to’, ‘identified or identifiable’, ‘natural person’. These remained the same in the GDPR, thus keeping ‘Opinion 4/2007’ relevant for understanding the concept of personal data.

However, WP29, instead of eliminating all subjectivity to the extent possible, seemed to add ambiguity to its explanation of what ‘relating to’ means.

WP29 sets out that ‘in order to consider that the data “relate” to an individual, a “content” element OR a “purpose” element OR a “result” element should be present’. In turn, ‘“purpose” element can be considered to exist when the data are used or are likely to be used, taking into account all the circumstances surrounding the precise case, with the purpose to evaluate, treat in a certain way or influence the status or behaviour of an individual’.

By itself, the idea of deciding whether data are personal or not through the interpretation of the “purpose” element is quite controversial, due to the subjective (rather than objective) nature of the notion of purpose.

An example given by WP29 brings this problem front and center:

Passenger vehicles owned by a transportation company suffer repeated damage when they are dirtied with graffiti. In order to evaluate the damage and to facilitate the exercise of legal claims against their authors, the company organises a register containing information about the circumstances of the damage, as well as images of the damaged items and of the “tags” or “signature” of the author. At the moment of entering the information into the register, the authors of the damage are not known nor to whom the “signature” corresponds. It may well happen that it will never be known. However, the purpose of the processing is precisely to identify individuals to whom the information relates as the authors of the damage, so as to be able to exercise legal claims against them. Such processing makes sense if the data controller expects as “reasonably likely” that there will one day be means to identify the individual. The information contained in the pictures should be considered as relating to “identifiable” individuals, the information in the register as “personal data”, and the processing should be subject to the data protection rules, which allow such processing as legitimate under certain circumstances and subject to certain safeguards.

Most likely, only common sense can lead to the conclusion that the purpose is precisely to identify the authors of the graffiti. However, the controller could potentially argue that it keeps the register and images for other internal purposes not connected with future identification. As a result, we may end up debating the true intentions of the controller, which may not be easy to establish due to a lack of factual grounds.

The issue described above may prima facie seem purely theoretical. Moreover, the language of the GDPR contains various ‘floating’ criteria implying case-by-case evaluation. However, one should not overlook that, by applying the concept of purpose as described above, we decide whether data are personal or not, and a positive answer inevitably triggers the set of responsibilities vested in the controller under the GDPR and Member State laws. Arguably, more certainty is needed when addressing such a fundamental issue, one which may (or may not) trigger the application of data protection legislation in general.

Interestingly, the GDPR suffers from the same flaw as the WP29 ‘Opinion 4/2007’. Under Article 9(1), the processing of biometric data for the purpose of uniquely identifying a natural person is prohibited (unless one of the exemptions under Article 9(2) applies). This brings us back to the issue of identifying the controller’s intention. Ironically enough, Recital 51 applies more objective criteria to the same issue:

“The processing of photographs should not systematically be considered to be processing of special categories of personal data as they are covered by the definition of biometric data only when processed through a specific technical means allowing the unique identification or authentication of a natural person”.

In other words, under Recital 51, it is the ability of the technical means to identify individuals that plays the key role (and not just the purposes pursued by the controller). Unfortunately, this wording was changed in Article 9(1), which requires identifying subjective purposes (instead of objective abilities).

Sweden is going to have fun with the new Data Protection Regulation

There’s starting to be a bit of a flurry here in Sweden over the upcoming new Regulation.

One of the communications I received last week concerned the fact that here in Sweden our personal data, including our ID numbers, are considered public information. This will not be the case once the Regulation comes into effect. What I find funny (you know, the funny-not-so-funny British humour ;-)) is that those I talk to here think this is new in the Regulation, but it’s not. It is already in today’s Directive, just not implemented as law here in Sweden.

This is going to require significant work to achieve compliance in Sweden, especially given the way our personal data are sold under an ‘utgivningsbevis’ (a publishing certificate) without the consent of the data subject. In fact, it is impossible for data subjects in Sweden to remove their personal data from public viewing!

Hurry up new Regulation so I can get my personal data removed from ratsit.se, birthdays.se and hitta.se… just to name a few!