Being part of a dilemma can be confusing. You may not even realise that a decision you must take presents a personal or professional dilemma. As a data protection officer, this becomes very evident, and problematic, requiring skills beyond what is considered normal for the role, i.e. legal, security, etc.

The most profound privacy dilemma is that in any democratic society, it is a human right to have freedom of speech, just as it is a human right to a private life. On a more personal level, we want a right to privacy for ourselves, yet for others, we want to know what our family, friends and acquaintances are up to!

Then there is the human right to feel safe, which can be provided by the installation of cameras, alarms, etc., in shops, petrol stations, metros, and even in our homes. But this conflicts with our human right to a private life!

Dilemmas come in many forms, even in daily life, especially as parents. I remember when I was offered a job at CERN in Geneva in 1996. At the time I was living in the UK, a single parent with a 16-year-old son. I was not immediately in a dilemma: I gave him the choice to join me in France, go to a French school, learn French, ski every weekend, etc., a wonderful life… which he refused. Funnily enough, looking back now, my immediate dilemma was not whether to respect my son’s wishes and then find a way so that he could stay and I could go; my dilemma was what others would say about me as a mother. Maybe I had to refuse the job offer?

Let us return to the role of the data protection officer. Often the DPO will advise on the best course of action when standing in the shoes of the data subject; it is then up to the business to make a decision. If this decision conflicts with the advice of the DPO, there is a business dilemma, e.g. a lack of transparency concerning a personal data breach (not yet confirmed) versus doing what is right as per the GDPR.

If the DPO role as defined in GDPR Articles 37-39 is respected by the business, then as long as the DPO is experienced and aware of the dilemma and the potential conflict with the business, a way forward can be found. There is another skillset required of a DPO, not mentioned in the GDPR, or even in any book I’ve read so far, and that is the ability to mediate between the needs of the business and the rights of data subjects. To be able to see the wood for the trees, to see that there is a dilemma, to see that it is not personal, it is life.

Just in case you are interested in the outcome of my dilemma as a mother: a year after the decision, my son told me, when visiting me in France, that it was in fact the best parenting decision I had made to leave him in the UK. He was freed from my mothering… which he’d had enough of. As unlikely as this outcome was, it was indeed the right decision for us both.

The box and automated decisions

It is said that the best way to explain anything so that it is not only understood but also retained as a piece of knowledge is to tell a story. So I am going to tell a story about what the GDPR calls ‘profiling’ and ‘automated decision making’.

The connection between the two is understandably difficult for many to comprehend, and for good reasons. It seems technical, after all, a consequence of technology: data warehousing, analytics, AI, etc. The question for many is why they are connected. So this is a story about me, a box, and automated decisions made before technology became a part of it. I say me first, because I came before the box did, although you may argue this point after reading my story 😉

In fact it is not just a single box but hundreds of boxes, each one unique and different and through some act of fate, or magic, the single instance of me happens to have resided in each of the boxes at some time in my life. Often I reside in multiple boxes at the same time, a physical impossibility I know, but it is true.

My first memory of the box was before I was aware of them, or their function. I was about 10 years old when, with my sister and brother, I was introduced to a new friend/acquaintance of our family.

My mother, speaking proudly: “This is Karen, she’s the intelligent one of the family.” Then she moved on to introduce my sister and brother as the tomboy and the trouble-maker. So I was profiled even before computers arrived on the scene in normal family or business life. It was even before store/loyalty cards, data warehousing and big data analytics. Looking back 10 years later, I realised that I had been put into a box.

So what, you may think? Well, the consequences are subtle and profound. For those introduced to our family, the automated decision made is that Karen is intelligent, which implies the inverse for the other two kids. Karen is the favoured one, according to the other two siblings. Clearly it didn’t make me popular; an automated decision is made on how I should be treated by those who are aware of this box. What is more, my siblings were inadvertently placed into another box, which influences how they are treated by others, and even how they perceive me.

I started to become aware of these boxes when I got pregnant at the age of 17 and was moved into a new box. The old box I didn’t mind so much, but this new one was not nice at all. I had done something really bad; my life, my dreams, it was over. I was in the ‘teenage pregnancy’ box, and there was a whole load more boxes waiting to grab me when I took the marry-the-father route: the teenage mother box… and let us not forget the box my son ended up in, as a statistic for social services. Decisions were made over which I had no control; no more smiling faces, but faceless bureaucrats, George Orwell 1984 style. Nothing digitally automated, just faceless human beings filling in checkboxes.

By the time I was 21 I had landed in a new box of my own making for the next 13 years of my life. I was a single mum with a young kid, living in an area occupied mainly by single mums like me, dysfunctional families, prostitutes and drug pushers. I didn’t choose to be there, even though this was the box I was in. I lived in a flat with no heating, no wall insulation, single-pane windows and concrete floors. For periods of time I didn’t even have hot water, due to an automated decision. At this stage, to have a choice in anything in life was a luxury. You survive. Which takes us to the most important word when it comes to human rights: choice. The right to human intervention in the GDPR, for automated decisions made about us, gives us a choice to challenge decisions which affect our rights and freedoms as data subjects.

Fast forward to today. I realise now that what I called a ‘box’ when I was young is actually a ‘profile’ in the GDPR. I had been profiled, and automated decisions were made about my life over which I had no control, and this was before profiling and automated decision making were digitally automated as we know them today.

This all happened in the UK. Now to place this into today’s context in Sweden.

By 2019 I’d been an entrepreneur for 6 years, and my second startup was suffering. I became personally liable for tax debts which I was unable to pay. I requested a ‘repayment plan’ from the Swedish tax authority, which I knew was technically possible, but was refused. The reason was that I’d had 3 parking fines over 3 years which I had been overdue in paying, hence I had crossed the threshold and would not be offered a repayment plan. I explained that I’d paid all my taxes for over 15 years, but it made no difference. I had been profiled on 3 parking fines, and an automated decision had been made which was going to devastate me and my family. This was bizarre considering I had significant capital in assets (our house), which I was unable to take a loan against because by then I was profiled as not creditworthy. Soon afterwards I received a fat letter in the post: a repossession order on our family home.

What was interesting were the similarities between when I was young and today, which are as scary as the differences. The main difference was that I knew what was happening this time. I knew I’d been profiled using technology, and that an automated decision had been made using technology. What was similar was that I felt just as vulnerable as I had done in the 1980s-90s. The problem was that even though I now had rights, and was aware of them, I wasn’t in a position to use them.

Article 22 of the GDPR is a really important data subject right that we can overlook because we get bogged down in the technical/legal stuff. It seems complicated. But in reality it is not complicated, nor even technical; it is very personal. I hope my story has succeeded in bringing this message across.

Who owns my personal data?

A question which is still creating something of a storm in some circles, mainly those that make money from intricately complex interpretations of legal text, further complicated by neighbouring laws such as Intellectual Property (IP).

Most legal guys, and even worse, my esteemed colleagues who started their careers in other domains such as cyber, just love to do this: make it difficult, and the more complex and intricate the better; some will even go as far as to condemn those who dare to state that my personal data belongs to me.

Well, condemn me as you like, because this is the message I give to my customers! And why? Because to do anything different would forge dependency instead of fostering empowerment.

Karen Lawrence Öqvist, Founder/CEO Privacy

Clearly, from a deep legal technical standpoint, I cannot claim to own my personal data. But this is irrelevant to 99.99% of the world’s population who (1) do not have a law degree, or (2) are not working in the domain of the GDPR, nor in the slightest bit interested in doing so.

If I were to start with a dive into the intricacies of IP law, there would be zero comprehension of an individual’s rights pertaining to the GDPR and their personal data. If this is where I, as a professional, were to start, my customer, who maybe installs alarms for a living, would be thinking: “…who cares, except for the music industry?”

For my customers, I have only 20-30 minutes of training for each employee to get the message across. And the message needs to be formulated in a way that achieves the desired result for the business, i.e. compliance as far as is practicable with the GDPR Rules. And yes, I even use the terminology ‘GDPR Rules’, because my customers understand what a rule is!

For the GDPR Administrator, the training is 3 hours. This is not long enough to dive into 250 pages of GDPR legal text, let alone IP law. It is long enough to ensure that they can do their job and understand data subject rights, and that these rights are not absolute. Some GDPR Admins I train even get excited about the subject matter. And only then can one take the next step.

I am sometimes concerned that I may simplify things too far, to a level whereby my skills become redundant. But in short, this approach costs less for my customers, and they know exactly what they are getting for every € they spend, which are the building blocks of trust.

Scope of the obligation to provide «a copy of the personal data undergoing processing» (Article 15(3) GDPR).

The #EDPB, in draft #Guidelines 01/2022, attempted to clarify the scope of the controller’s obligation to provide «a copy of the personal data undergoing processing» (Article 15(3) #GDPR).

What Article 15(3) implies by «a copy» has long been a controversial issue, and approaches have varied across #EU Member States. Below are some examples:

#Germany: there have been contradictory views as to how the term “copy” should be understood – i.e. whether it should literally be a copy or just a summary (https://lnkd.in/eRbC5gY7)

#Austria: GDPR (Article 15(3)) does not grant a right of access to files or documents. However, the content of documents may qualify as personal data. Providing copies of personal data stored within a document will often be the easiest option by redacting superfluous information and providing the document to the applicant (https://lnkd.in/eRgPcEDs)

More insights can be obtained from the #IAPP article – https://lnkd.in/e9g37p9v

Now, the EDPB seems to take a so-called ‘fit-for-purpose’ approach to how the notion of ‘copy’ should be understood.

Paras 23 and 25 of the draft Guidelines 01/2022 say that the right to obtain a copy refers ‘not necessarily to a reproduction of the original documents’ and ‘that the information on the personal data concerning the person who makes the request is provided to the data subject in a way which allows the data subject to retain all of the information and to come back to it’.

Further to this, para 150 stipulates that ‘an obligation to provide the data subject with a copy of the personal data undergoing processing […] does not mean that the data subject always has the right to obtain a copy of the documents containing the personal data, but an unaltered copy of the personal data being processed in these documents. Such copy of the personal data could be provided through a compilation containing all personal data covered by the right of access as long as the compilation makes it possible for the data subject to be made aware and verify the lawfulness of the processing’.

In other words, against this purpose, ‘it is the responsibility of the controller to decide upon the appropriate form in which the #personaldata will be provided’.
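One way to read this ‘compilation’ idea in practice: instead of handing over the original documents, the controller extracts the requester’s personal data into a structured compilation, keeping enough context (source, purpose) for the data subject to verify the lawfulness of the processing. Below is a minimal Python sketch of that idea; the document names, field names and identifiers are all invented for illustration, not taken from the Guidelines:

```python
def compile_access_copy(documents, subject_id):
    """Build an Article 15(3)-style compilation of one data subject's
    personal data, rather than reproducing the original documents.
    Each entry keeps the unaltered data plus provenance (source, purpose)
    so the data subject can verify the lawfulness of the processing."""
    compilation = []
    for doc in documents:
        for field, value in doc["personal_data"].get(subject_id, {}).items():
            compilation.append({
                "data": {field: value},     # the personal data, unaltered
                "source": doc["name"],      # where it is processed
                "purpose": doc["purpose"],  # why it is processed
            })
    return compilation

# Hypothetical stores holding personal data about data subject "ds-42":
docs = [
    {"name": "crm.db", "purpose": "customer support",
     "personal_data": {"ds-42": {"email": "anna@example.com"}}},
    {"name": "billing.db", "purpose": "invoicing",
     "personal_data": {"ds-42": {"address": "Storgatan 1"}}},
]

for entry in compile_access_copy(docs, "ds-42"):
    print(entry)
```

The point of the sketch is only the shape of the output: all personal data, unaltered, but organised for the data subject rather than mirrored from the controller’s files.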

EDPB Guidelines 05/2021 on the interplay of Article 3 and Chapter V: not a big deal at all

I have so far intentionally deterred myself from reading opinions and analysis of the newly issued Guidelines 05/2021, so that they do not inform my personal ‘first’ opinion.

For now, Guidelines 05/2021 do not appear to be a big deal at all, nor are they free from inconsistencies with the new SCCs and from casuistic examples.

1) The three criteria for transfers do not look like anything ‘surprising’. C’mon, there could scarcely be anyone who expected the existence of a ‘transfer’ between a controller/processor and a data subject. Maybe it is just me, but I can see few (if any) things that could be seen as significantly changing the landscape and adding value to the current understanding of things.

2) At the same time, the misalignment between the EU Commission and the EDPB continues. Recital 7 of the SCCs implementing decision notes that SCCs may be used for transfers “only to the extent that the processing by the importer does not fall within the scope of Regulation (EU) 2016/679”, and this is in clear contradiction with the EDPB’s transfer criterion #3. More on the conflict between the Commission’s and the EDPB’s approaches can be read here: https://iapp.org/news/a/why-it-is-unlikely-the-announced-supplemental-sccs-will-materialize/?mkt_tok=MTM4LUVaTS0wNDIAAAGAiv2DhonU2mSs-GNpYnvfsyMcmuYxz64LrNpH1YIA75K7-YZFEz3tT0a3i4wGnMiMXfBDlsr1mVDx_wDm-qJrSV0CybkgplN9HxJo5DkdpDW2

More interestingly, there is still no uniform definition of ‘data exporter’ and ‘data importer’. From the new Guidelines 05/2021 it is clear that only controllers, processors and joint controllers may qualify as ‘data exporter’ or ‘data importer’, and that a transfer may take place only between exporters and importers. More or less the same understanding (with some textual discrepancies) may be seen in Annex 1 of the EDPB Recommendations 01/2020. But a different approach is seen in Clause 1(b) of the SCCs, where the understanding of ‘exporter’ or ‘importer’ bears no relation to controllership issues.

3) Such details may become important in some scenarios. Let’s look at Example 5 (an employee of an EU-based company travelling to a third country). First of all, this example seems to be borrowed from the Norwegian DPA’s guidance – https://www.datatilsynet.no/rettigheter-og-plikter/virksomhetenes-plikter/overforing-av-personopplysninger-ut-av-eos/ . Second, what if, let’s say, the employee is not travelling to a third country but permanently sits there? Will this change the assessment, and why does the EDPB endorse such casuistic examples? Will this make the employee an ‘importer’, and will this give rise to a ‘transfer’? My answer is ‘No’, for many reasons. And if the EDPB agrees (does it?..), what would be the role of such an employee in the scheme? I tend to believe they would qualify as an ‘establishment’ of the employer (who, in turn, can be either a controller or a processor).

But never mind, it is just an example, and it does not really matter what I (or you) personally think. It is the EDPB (not us) who is here to give clear answers applicable in the vast majority of scenarios – as opposed to superficial and often self-evident explanations and casuistic examples that evade a deep dive into the heart of the issues.

Anonymisation – who is to blame and what to do?

The IAPP article provides a retrospective of the #anonymisation issue in the European data protection landscape and reiterates that there is still no ‘one-size-fits-all’ approach to making personal data #anonymised.

The current state can be described as confusion and vacillation, and probably the main culprit is #WP29, which took contradictory stances on anonymisation in 2007 and 2014, followed by silence from the #EDPB side and, again, contradictory stances of national DPAs leaning towards either the 2007 or the 2014 approach.

The simple thing is that straightforward ‘disguising of identity’ (e.g. by one-way cryptography), as WP29 suggested in 2007, can no longer be accepted as anonymisation (unless, of course, stated otherwise by a national DPA). And simple thing number two is that there is no industry standard describing step-by-step anonymisation algorithms and techniques.
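To make the point concrete, here is a minimal Python sketch (standard library only, with invented identifiers) of why hashing an identifier is pseudonymisation rather than anonymisation: anyone who can enumerate the identifier space can simply rebuild the link from hash back to person.

```python
import hashlib

def pseudonymise(national_id: str) -> str:
    """One-way hash of an identifier -- 'disguising of identity'."""
    return hashlib.sha256(national_id.encode()).hexdigest()

# An attacker who can enumerate the identifier space (e.g. all valid
# national ID numbers) rebuilds a lookup table from hash to identity:
candidate_ids = ["19800101-1234", "19800101-5678", "19900215-0001"]
lookup = {pseudonymise(c): c for c in candidate_ids}

hashed = pseudonymise("19900215-0001")
print(lookup[hashed])  # the 'anonymised' identity is recovered
```

Because the original identity remains recoverable with reasonably available means, the data is still personal data under the GDPR, merely pseudonymised.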

From a practical standpoint this calls for a case-by-case assessment by the anonymising entity. The recent AEPD-EDPS joint paper ‘on 10 misunderstandings related to anonymisation’ (‘joint paper’) specifically mentions that ‘anonymisation processes need to be tailored to the nature, scope, context and purposes of processing as well as the risks of varying likelihood and severity for the rights and freedoms of natural persons’.

Of the options suggested by the article, the most practical and realistic is probably arguing that the risk of re-identification is sufficiently remote in every single case where anonymisation is relied on. In fact, this will require an ‘Anonymisation Impact Assessment’ (I have just come up with this term) which must include an assessment of re-identification risks. The joint paper acknowledges that such risks are ‘never zero’ (‘except for specific cases where data is highly generalised’) and that ‘a residual risk of re-identification must be considered’.
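As a sketch of one measurement that such a (hypothetical) assessment could include, here is a minimal k-anonymity check in Python: a record that is unique on its quasi-identifiers carries an obvious residual re-identification risk. The dataset and column names are invented for illustration:

```python
from collections import Counter

def k_anonymity(records, quasi_identifiers):
    """Smallest equivalence-class size over the quasi-identifier columns.
    A low k means at least one person hides in a very small crowd,
    i.e. a high residual re-identification risk."""
    groups = Counter(
        tuple(r[q] for q in quasi_identifiers) for r in records
    )
    return min(groups.values())

dataset = [
    {"age_band": "30-39", "postcode": "SE-111", "diagnosis": "A"},
    {"age_band": "30-39", "postcode": "SE-111", "diagnosis": "B"},
    {"age_band": "40-49", "postcode": "SE-222", "diagnosis": "C"},
]

# The third record is unique on (age_band, postcode), so k = 1:
print(k_anonymity(dataset, ["age_band", "postcode"]))  # -> 1
```

A k of 1 would be a red flag in the assessment; generalising further (wider age bands, coarser postcodes) raises k and lowers, but never eliminates, the residual risk.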

To date, although addressed by WP29 and adopted by the GDPR, the notion of anonymisation and its application remains ‘terra incognita’.

EDPB Recommendations 01/2020 – softening without being too soft?

Right after the final version of the Recommendations 01/2020 was issued, we (myself included) started to believe that now, here, we would live the life!

The reference to the inability to rely on “subjective factors such as the likelihood of public authorities’ access to your data” is gone; data exporters may now assess how the laws are applied in practice, and even the importer’s previous experience.

In fact, it may be nothing more than initial euphoria. Let’s be honest, we were happy because we understood that in the majority of cases the legislation of a third country will end up in the cohort of “problematic legislation” (para 43.3).

Okay, para 43.3 says that “you may decide to proceed with the transfer without being required to implement supplementary measures, if you consider that you have no reason to believe that relevant and problematic legislation will be applied, in practice, to your transferred data and/or importer”. That’s the exit, isn’t it? Let’s find some practice showing that the “problematic legislation” does not apply to our transfer, and there is no need to think of supplementary measures. Everyone’s happy.

Not really. The EDPB places significant requirements on the “sources of information” confirming our conclusions.

A non-exhaustive list of them is contained in Annex 3 (various reports from various credible organisations, warrants from other entities…); they must be “relevant, objective, reliable, verifiable and publicly available or otherwise accessible” (para 46). “Documented practical experience of the importer with relevant prior instances of requests” alone cannot be relied on (para 47).

The question here is: do you know a third country with “problematic legislation” but at the same time with “relevant, objective, reliable, verifiable and publicly available or otherwise accessible” practice confirming that there is not really a problem for the transferred data?

In any event, it is clear: supplementary measures are here to stay.

Some fresh thoughts and updates on the new #SCC

1. The SCCs cover data transfers to importers (i) established in third countries AND (ii) NOT subject to the #GDPR through Article 3(2). This is not clearly articulated in the implementing decision or the SCCs themselves, as the recitals and articles of both seem to contain contradictory information. From confidential sources it has become known that the Directorate-General for Justice and Consumers will soon publish an FAQ clarifying these issues. The European Commission is not taking any position on the definition of the concept of international data transfers, though.

2. It is not sufficiently clear to what extent negotiating parties may “add other clauses” to the SCCs. An example I have seen in one of the #IAPP articles: would clauses limiting liability between the parties (not towards data subjects, of course) contradict the SCCs?

3. As the SCCs are based on a modular principle, one very formal issue is still unclear: when building the SCCs, should the labels (“Module One: …” etc.) continue to appear in the clauses? What to do with insertions in the middle of the text (especially for Module Three) when other clauses are used at the same time is also not perfectly clear.

4. In terms of assessment, the new SCCs say that parties, when assessing how the law and practice of a third country impact the importer’s ability to comply with the SCCs, are encouraged to take into account “reliable information on the application of the law in practice (such as case law and reports by independent oversight bodies), the existence or absence of requests in the same sector and, under strict conditions, the documented practical experience of the data exporter and/or data importer”. It is a clear shift from the strict position taken by the EDPB Recommendations 01/2020 that parties should take into account “objective factors, and not rely on subjective factors such as the likelihood of public authorities’ access”.

The final version of the #EDPB Recommendations 01/2020 is in the pipeline, and perhaps some important things will be changed compared to the version released for public consultation.

The result of the 1177 (Sweden) audit is final

This is a super interesting case. 1177 is the number used in Sweden to ring your healthcare provider. There was a slight personal data breach reported in 2020 whereby 2.7 million calls were publicly available. Apparently the voice data was not encrypted.

The audit by the Swedish supervisory authority resulted in fines of 12 million SEK (€1.2 million) for the data controller (Med Help), 650k SEK (€65k) for Voice Integrate, 500k SEK (€50k) for the county of Stockholm, and 250k SEK (€25k) for the counties of Värmland and Sörmland.

An interesting case from Austria touches on Article 85 GDPR and refers to the balancing test developed by the ECtHR for determining whether processing was carried out for journalistic purposes or not.

The case involved personal data mentioned by an ex-convict in his Facebook post. Ironically enough, the Austrian DPA and the court came to different conclusions using the same balancing test. Contrary to the DPA’s decision, the court found a violation of the data subject’s rights.

While the case is interesting in itself, it also raises questions about the consistency with which the various tests and methodologies developed by #WP29 and #EDPB in their many papers are applied to practical situations. In a nutshell: “two enforcers – three opinions”.

#dataprotection #lawandlegislation #privacy #politicsandlaw #compliance #dataprivacy #gdpr