Anonymisation – who is to blame and what to do?

The IAPP article provides a retrospective of the #anonymisation issue in the European data protection landscape and reiterates that there is still no ‘one-size-fits-all’ approach to rendering personal data #anonymised.

The current state of affairs can be described as confusion and vacillation, and the main culprit is probably #WP29, which took contradictory stances on anonymisation in 2007 and 2014, followed by silence on the #EDPB side and, again, contradictory positions of national DPAs leaning towards either the 2007 or the 2014 approach.

The first simple truth is that straightforward ‘disguising of identity’ (e.g. by one-way cryptography), as WP29 suggested in 2007, can no longer be accepted as anonymisation (unless, of course, stated otherwise by a national DPA). The second simple truth is that there is no industry standard describing step-by-step anonymisation algorithms and techniques.
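To illustrate why one-way cryptography alone falls short (a minimal hypothetical sketch, not anything prescribed by WP29): when identifiers come from a small, guessable input space – names, phone numbers, e-mail addresses – anyone can hash every candidate value and match the digests, recovering the identity behind the hash.

```python
# Sketch: why one-way hashing is pseudonymisation, not anonymisation.
# The e-mail addresses here are purely illustrative.
import hashlib

def pseudonymise(value: str) -> str:
    """'Disguise' an identity with a one-way hash (the 2007-style approach)."""
    return hashlib.sha256(value.encode()).hexdigest()

# A "protected" record as it might appear in a shared dataset.
protected = pseudonymise("alice@example.com")

# Dictionary attack: hash every candidate from the guessable input space.
candidates = ["alice@example.com", "bob@example.com", "carol@example.com"]
rainbow = {pseudonymise(c): c for c in candidates}

print(rainbow.get(protected))  # the identity is trivially recovered
```

The hash is reversible in practice not because the cryptography is weak, but because the attacker can enumerate the inputs – which is exactly why the data remains personal data.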

From a practical standpoint, this calls for a case-by-case assessment by the anonymising entity. The recent AEPD-EDPS joint paper ‘on 10 misunderstandings related to anonymisation’ (the ‘joint paper’) specifically mentions that ‘anonymisation processes need to be tailored to the nature, scope, context and purposes of processing as well as the risks of varying likelihood and severity for the rights and freedoms of natural persons’.

Of the options suggested by the article, the most practical and realistic is probably arguing that the risk of re-identification is sufficiently remote in every single case where anonymisation is relied on. In fact, this will require an ‘Anonymisation Impact Assessment’ (I have just coined this term), which must include an assessment of re-identification risks. The joint paper acknowledges that such risks are ‘never zero’ (‘except for specific cases where data is highly generalised’) and that ‘a residual risk of re-identification must be considered’.
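One way such an assessment could quantify residual risk (a sketch of my own, not a method endorsed by the joint paper) is the classic k-anonymity metric: records sharing the same combination of quasi-identifiers form an equivalence class, and the smaller the smallest class, the higher the re-identification risk.

```python
# Sketch: k-anonymity as one possible re-identification risk metric.
# The dataset and quasi-identifiers below are invented for illustration.
from collections import Counter

def k_anonymity(records, quasi_identifiers):
    """Return the size of the smallest equivalence class (the 'k')."""
    classes = Counter(
        tuple(r[q] for q in quasi_identifiers) for r in records
    )
    return min(classes.values())

dataset = [
    {"age_band": "30-40", "postcode": "SE1", "diagnosis": "A"},
    {"age_band": "30-40", "postcode": "SE1", "diagnosis": "B"},
    {"age_band": "40-50", "postcode": "SE2", "diagnosis": "C"},
]

k = k_anonymity(dataset, ["age_band", "postcode"])
print(k)  # k = 1: at least one individual is unique on these attributes
```

A k of 1 means at least one record is unique on the chosen attributes, so the ‘sufficiently remote’ argument would be hard to sustain without further generalisation or suppression.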

To date, although addressed by WP29 and adopted by the GDPR, the notion of anonymisation and its application remains ‘terra incognita’.

EDPB Recommendations 01/2020 – softening without being too soft?

Right after the final version of the Recommendations 01/2020 was issued, many of us (myself included) started to believe that life would finally get easier!

The reference to the inability to rely on “subjective factors such as the likelihood of public authorities’ access to your data” is gone; data exporters may now assess how the laws are applied in practice, and even the importer’s previous experience.

In fact, this may turn out to be nothing more than initial euphoria. Let’s be honest: we were happy because we understood that in the majority of cases the legislation of a third country will end up in the cohort of “problematic legislation” (para 43.3).

Okay, para 43.3 says that “you may decide to proceed with the transfer without being required to implement supplementary measures, if you consider that you have no reason to believe that relevant and problematic legislation will be applied, in practice, to your transferred data and/or importer”. That’s the exit, isn’t it? Let’s find some practice showing that the “problematic legislation” does not apply to our transfer, and there is no need to think about supplementary measures. Everyone’s happy.

Not really. The EDPB imposes significant requirements on the “sources of information” confirming such conclusions.

A non-exhaustive list of them is contained in Annex 3 (various reports from various credible organisations, warrants from other entities…); they must be “relevant, objective, reliable, verifiable and publicly available or otherwise accessible” (para 46). “Documented practical experience of the importer with relevant prior instances of requests” alone cannot be relied on (para 47).

The question here is: do you know a third country with “problematic legislation” but, at the same time, with “relevant, objective, reliable, verifiable and publicly available or otherwise accessible” practice confirming that there is no real problem for the transferred data?

In any event, it is clear: supplementary measures are here to stay.

Some fresh thoughts and updates on new #SCC

1. The SCC cover data transfers to importers (i) established in third countries AND (ii) NOT subject to the #GDPR through Article 3(2). This is not clearly articulated in the implementing decision or the SCC themselves, as the recitals and articles of both seem to contain contradictory information. From confidential sources it has become known that the Directorate-General for Justice and Consumers will soon publish an FAQ clarifying these issues. The European Commission is not taking any position on the definition of the concept of international data transfers, though.

2. It is not sufficiently clear to what extent negotiating parties may “add other clauses” to the SCC. An example I have seen in one of the #IAPP articles: would clauses limiting liability between the parties (not towards data subjects, of course) contradict the SCC?

3. As the SCC are built on a modular principle, one very formal issue is still unclear: when assembling the SCC, should the labels (“Module One: …” etc.) continue to appear in the clauses? It is also not perfectly clear what to do with insertions in the middle of the text (especially for Module Three) when other clauses are used at the same time.

4. In terms of assessment, the new SCC say that parties, when assessing how the law and practice of a third country impact the importer’s ability to comply with the SCC, are encouraged to take into account “reliable information on the application of the law in practice (such as case law and reports by independent oversight bodies), the existence or absence of requests in the same sector and, under strict conditions, the documented practical experience of the data exporter and/or data importer”. This is a clear shift from the strict position taken in the EDPB Recommendations 01/2020 that parties should take into account “objective factors, and not rely on subjective factors such as the likelihood of public authorities’ access”.

The final version of the #EDPB Recommendations 01/2020 is in the pipeline, and perhaps some important things will change compared to the version published for public consultation.

Result of the 1177 (Sweden) audit is final

This is a super interesting case. 1177 is the number used in Sweden to call your healthcare provider. A personal data breach was reported in 2020 whereby 2.7 million recorded calls were publicly available. Apparently, the voice data was not encrypted.

The audit by the Swedish supervisory authority resulted in fines of SEK 12 million (approx. €1.2 million) for the data controller (Med Help), SEK 650k (approx. €65k) for Voice Integrate, SEK 500k (approx. €50k) for the county of Stockholm, and SEK 250k (approx. €25k) each for the counties of Värmland and Sörmland.

An interesting case from Austria touches on Article 85 GDPR and refers to the balancing test developed by the ECtHR for determining whether the processing at issue was carried out for journalistic purposes.

The case involved personal data mentioned by an ex-convict in his Facebook post. Ironically enough, the Austrian DPA and the court came to different conclusions using the same balancing test. Contrary to the DPA’s decision, the court found a violation of the data subject’s rights.

While the case is interesting in itself, it also raises questions about the consistency with which the various tests and methodologies developed by #WP29 and #EDPB in their multiple papers are applied to practical situations. In a nutshell: “two enforcers – three opinions”.

#dataprotection #lawandlegislation #privacy #politicsandlaw #compliance #dataprivacy #gdpr

Risk-based approach

I brushed up on one important thing – the risk-based approach (RBA), which WP29 touched upon in 2014 in its “Statement on the role of a risk-based approach in data protection legal frameworks” (WP 218).

So what is the RBA really about? The key phrase from the Statement that suggests an answer: “… a data controller whose processing is relatively low risk may not have to do as much to comply with its legal obligations as a data controller whose processing is high-risk”.

What does it mean?

The RBA does NOT mean that a controller may ignore some of its obligations if it processes low-risk data. That would not lead to compliance, and it is a common misconception I have seen in my practice many times. Instead, the RBA means that in such a case a controller may do less to be compliant.

In fact, it is not correct to say that the whole GDPR implies the RBA. Only particular articles do so in particular cases (Art. 25, 30(5), 32-35 and some others).

Recent cases on the use of Facial Recognition Technology (FRT)

Enforcement actions against a Dutch supermarket and the Swedish police remind us that the bar for using FRT is very high.

Even if FRT is used for security purposes inside the supermarket, merely giving customers a privacy notice that FRT is in use is not enough. Notably, the Dutch DPA issued no fine (only a warning).

In another case, the Swedish police unlawfully processed biometric data for facial recognition without conducting a DPIA, which resulted in a fine of approximately EUR 250,000. In addition, the Swedish police were ordered to delete all data transferred to an external database (Clearview AI) and to notify data subjects that their personal data had been transferred.

CLICK here to learn more on that

Data Retention Guidance from DPN

Hi world!

I came across a very good #Data Retention Guidance from Data Protection Network Associates, issued in July 2020 (LINK).

Based on #UK and #EU laws, it outlines starting tips for conducting a data retention review (it all, however, begins with a data mapping exercise), provides advice on how to decide on retention periods, advises on creating a data retention policy and schedule, and much more.

Attention: the Guidance refers to #anonymization as an acceptable way of handling data when the retention period comes to an end. It should be noted that ‘true’ anonymization is very hard (if at all possible) to achieve, especially given that there is no industry standard on a strict sequence of steps to render data anonymized. In addition, amid the constant development of #bigdata and #AI algorithms, data we consider truly anonymized today may not have the same status tomorrow.
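The fragility of ‘true’ anonymization is easiest to see in a linkage attack (a toy sketch with invented data): a dataset stripped of direct identifiers can still be re-identified by joining it with auxiliary public data on shared quasi-identifiers.

```python
# Sketch of a linkage attack: the "anonymised" records keep quasi-identifiers
# (birth year, postcode) that also appear in a public register.
# All names and values here are invented for illustration.
anonymised = [
    {"birth_year": 1975, "postcode": "SE1", "diagnosis": "A"},
    {"birth_year": 1989, "postcode": "SE2", "diagnosis": "B"},
]
public_register = [
    {"name": "Alice", "birth_year": 1975, "postcode": "SE1"},
    {"name": "Bob", "birth_year": 1989, "postcode": "SE2"},
]

def link(anon, aux, keys=("birth_year", "postcode")):
    """Join the two datasets on the shared quasi-identifiers."""
    index = {tuple(p[k] for k in keys): p["name"] for p in aux}
    return [
        {**r, "name": index[tuple(r[k] for k in keys)]}
        for r in anon
        if tuple(r[k] for k in keys) in index
    ]

for row in link(anonymised, public_register):
    print(row["name"], row["diagnosis"])  # sensitive data re-attached to names
```

As richer auxiliary datasets become available, more such joins become possible – which is exactly why data considered anonymized today may not stay that way.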

Virtualshadows blog is back!

This blog has been resurrected. It was closed down in November last year over compliance concerns: the number of cookies the blog was using (WordPress was a cloud service based in the US), the Schrems II ruling, and the fact that proper cookie consent banners were too expensive – after all, this is a private blog, even if it did get quite a few visitors each month. I guess if this blog had been about my dog, or anything else, maybe I wouldn’t have bothered with all the GDPR stuff; but since I am professionally a ‘privacy guy’, the blog had to go.

So what happened to my blog was something I call ‘GDPR paralysis’: everything comes to a stop, and GDPR is the cause. I remember when my business (Privasee), which I founded in 2015, came into a state of GDPR paralysis in 2017 – the privacy purists versus myself as CEO, insisting that ‘business has to function’. There needed to be a compromise, otherwise Privasee would cease to exist; making money is necessary for the business to survive and to achieve what it set out to do, i.e. ‘make privacy compliance accessible’.

One could claim that a blog falls under the ‘household exemption’, which was how I was thinking – maybe misguidedly, but you know how we human beings can be, believing in whatever is easiest: what harm can it do to the ‘rights and freedoms of the natural person’? Still, all those cookies made the blog a privacy risk to visitors, and today something popped up in my LinkedIn feed that the Danish data protection authority has ruled that a so-called ‘private website’ was not exempt under Article 2(1). I can’t find the case now.

When reading this blog post, you should have only 7 cookies downloaded, and they are all session cookies, except one with a lifetime of a single day. The WordPress site is based in the EU, so there are no international transfers.

Enjoy reading the blog again, and welcome back!