Who owns my personal data?

A question which is still creating something of a storm in some circles, mainly those that make money from intricately complex interpretations of legal text, complicated further by neighbouring laws such as Intellectual Property (IP).

Most legal folk, and worse still my esteemed colleagues who started their careers in other domains such as cyber, just love to do this: make it difficult, and the more complex and intricate the better. They will even go as far as to condemn those who dare to state that my personal data belongs to me.

Well, condemn me as you like, because this is the message I give to my customers! And why? Because to do anything different would forge dependency instead of fostering empowerment.

Karen Lawrence Öqvist, Founder/CEO Privacy

Clearly, from a deep legal technical standpoint, I cannot claim to own my personal data. But this is irrelevant to 99.99% of the world's population, who (1) do not have a law degree, or (2) do not work in the domain of the GDPR, nor are in the slightest bit interested in doing so.

If I were to start with a dive into the intricacies of IP law, there would be zero comprehension of an individual's rights pertaining to the GDPR and their personal data. If this is where I, as a professional, were to start… my customer, who maybe installs alarms for a living, would be thinking: "…who cares, except for the music industry?"

For my customers, I have only 20-30 minutes of training for each employee to get the message across. And the message needs to be formulated in a way that gets the business the desired result, i.e. compliance, as far as is practicable, with the GDPR Rules. And yes, I even use the terminology 'GDPR Rules', because my customers understand what a rule is!

For the GDPR Administrator, the training is 3 hours. This is not long enough to dive into 250 pages of GDPR legal text, let alone IP law. It is long enough to ensure that they can do the job, and understand data subject rights and that these rights are not absolute. Some GDPR Admins I train even get excited about the subject matter. And only then can one take the next step.

I am sometimes concerned that I may simplify things too far, to a level where my skills become redundant. But in short, this approach costs less for my customers, and they know exactly what they are getting for every € they spend; these are the building blocks of trust.

Scope of the obligation to provide «a copy of the personal data undergoing processing» (Article 15(3) GDPR).

In draft #Guidelines 01/2022, the #EDPB attempted to clarify the scope of the controller's obligation to provide «a copy of the personal data undergoing processing» (Article 15(3) #GDPR).

What Article 15(3) means by «a copy» has long been a controversial issue, and approaches have varied across #EU Member States. Below are some examples:

#Germany: there have been contradictory views as to how the term "copy" should be understood, i.e. whether it should be literally a copy or just a summary (https://lnkd.in/eRbC5gY7)

#Austria: the GDPR (Article 15(3)) does not grant a right of access to files or documents. However, the content of documents may qualify as personal data. Providing copies of personal data stored within a document will often be the easiest option: redact the superfluous information and provide the document to the applicant (https://lnkd.in/eRgPcEDs)

More insights can be obtained from the #IAPP article – https://lnkd.in/e9g37p9v

Now, the EDPB seems to take a so-called 'fit-for-purpose' approach to how the notion of a 'copy' should be understood.

Paras 23 and 25 of the draft Guidelines 01/2022 say that the right to obtain a copy refers 'not necessarily to a reproduction of the original documents' and requires 'that the information on the personal data concerning the person who makes the request is provided to the data subject in a way which allows the data subject to retain all of the information and to come back to it'.

Further to this, para 150 stipulates that ‘an obligation to provide the data subject with a copy of the personal data undergoing processing […] does not mean that the data subject always has the right to obtain a copy of the documents containing the personal data, but an unaltered copy of the personal data being processed in these documents. Such copy of the personal data could be provided through a compilation containing all personal data covered by the right of access as long as the compilation makes it possible for the data subject to be made aware and verify the lawfulness of the processing’.

In other words, against this purpose, ‘it is the responsibility of the controller to decide upon the appropriate form in which the #personaldata will be provided’.

EDPB Guidelines 05/2021 on the interplay of Article 3 and Chapter V: not a big deal at all

I have so far intentionally refrained from reading opinions and analyses of the newly issued Guidelines 05/2021, so that they do not colour my personal 'first' opinion.

For now, Guidelines 05/2021 do not appear to be a big deal at all, nor are they free from inconsistencies with the new SCCs or from casuistic examples.

1) The three criteria for a transfer do not look like anything 'surprising'. C'mon, there could scarcely be anyone who expected a 'transfer' to exist between a controller/processor and a data subject. Maybe it is just me, but I can see few (if any) things here that significantly change the landscape or add value to the current understanding of things.

2) At the same time, the misalignment between the EU Commission and the EDPB continues. Recital 7 of the SCCs implementing decision notes that the SCCs may be used for transfers "only to the extent that the processing by the importer does not fall within the scope of Regulation (EU) 2016/679", and this is in clear contradiction with the EDPB's transfer criterion #3. More on the conflict between the Commission's and the EDPB's approaches can be read here: https://iapp.org/news/a/why-it-is-unlikely-the-announced-supplemental-sccs-will-materialize/

More interestingly, there is still no uniform definition of 'data exporter' and 'data importer'. From the new Guidelines 05/2021 it is clear that only controllers, processors and joint controllers may qualify as 'data exporter' or 'data importer', and that a transfer may take place only between exporters and importers. More or less the same understanding (with some textual discrepancies) may be seen in Annex 1 of the EDPB Recommendations 01/2020. But a different approach is seen in Clause 1(b) of the SCCs, where the understanding of 'exporter' and 'importer' bears no relation to controllership issues.

3) Such details may become important in some scenarios, so let's look at Example 5 (an employee of an EU-based company travelling to a third country). First of all, this example seems to be borrowed from the Norwegian DPA's guidance – https://www.datatilsynet.no/rettigheter-og-plikter/virksomhetenes-plikter/overforing-av-personopplysninger-ut-av-eos/ . Second, what if, let's say, an employee is not travelling to a third country but permanently sits there? Will this change the assessment, and why does the EDPB endorse such casuistic examples? Will this make the employee an 'importer', and will this give rise to a 'transfer'? My answer is 'No', for many reasons. And if the EDPB agrees (does it?..), what would be the role of such an employee in the scheme? I tend to believe they would qualify as an 'establishment' of the employer (which, in turn, can be either a controller or a processor).

But never mind, it is just an example, and it does not really matter what I (or you) personally think. It is the EDPB (not us) who is here to give clear answers applicable in the vast majority of scenarios – as opposed to superficial and often self-evident explanations and casuistic examples that evade a deep dive into the heart of the issues.

Apple on the road to hell?

Apple has always been the 'white sheep' of the corporate world when it comes to privacy. They actually build in privacy as a differentiator, woven into the DNA of their products. However, it's not easy being a privacy champion with all the conflicts out there.

There is, for example, the conflict of 'freedom of speech' vs. 'privacy': both are essential for a properly functioning democratic society, but they conflict with each other. The quote I love from David Brin's book 'The Transparent Society' is that 'we want privacy for ourselves' but 'we want transparency for others'. How the hell do we solve this one?

Then, moving back to the reason for this post, there is the conflict of 'protection of our kids' vs. 'privacy'. Apple has taken this bull by the horns and launched 2 new features in the latest updates for the iOS, iPadOS, and macOS operating systems.

  1. Protect our kids from online predators: a parental control for the Messages app. It detects nude images, and the child is presented with a message saying that this is the case. If they choose to view it anyhow, the parent will be notified.
  2. Our (backed-up) photo libraries will be scanned for matches against a table of hash values of known child abuse images maintained by the National Center for Missing and Exploited Children (NCMEC); a rough sketch of this matching follows below.
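In very simplified form, this hash matching is just a membership lookup in a table of known hashes. The sketch below is a toy illustration only: Apple's system reportedly uses a perceptual 'NeuralHash' with blinded on-device matching, not the naive exact file digest shown here, and `KNOWN_HASHES` is a hypothetical stand-in for the NCMEC table.

```python
import hashlib
from pathlib import Path

# Hypothetical table of known-image digests. In the real system this would
# hold NCMEC-supplied *perceptual* hashes, not plain SHA-256 file digests.
KNOWN_HASHES: set[str] = set()

def file_digest(path: Path) -> str:
    """Hash the raw bytes of an image (a crude stand-in for a perceptual hash)."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def scan_library(library_dir: str) -> list[Path]:
    """Return the image files whose digest appears in the known-hash table."""
    return [p for p in Path(library_dir).rglob("*.jpg")
            if file_digest(p) in KNOWN_HASHES]
```

Even this toy form shows the trade-off: an exact byte hash misses any resized or re-encoded copy, which is why a perceptual hash is used instead, and perceptual hashing in turn brings a false-positive risk.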

We all, I am sure, agree that our kids need to be protected; this is a no-brainer. But at what cost is the question, and how effective will this be in practice?

Both initiatives above seem to be logical measures to (1) protect our kids from harmful content, and (2) find the perverts. Unfortunately, however, this is not going to work, at least long term, and the cost to our privacy will eventually outweigh the short-term gains.

Why do I say this?

Online predators hang out together; they share tricks on how to 'groom' kids online and offline. One of these tricks will be to get the kid to use another messaging app; there are loads out there, including Telegram and Signal. So what? If parental controls are installed on the kid's digital devices (we have them installed), they can't download anything without my consent as a parent, right?

In theory, yes, but most parents wouldn't see any threat in downloading another messaging app, especially Signal, famously Snowden's preferred medium. This means that all of Apple's good intentions are quite useless in practice. Also, there are still a lot of parents who are not tech savvy… and this will probably be the case for another 5-10 years, or maybe much longer… read on…

So what do we mean by 'tech savvy parents'? I have been tech savvy for 30 years, since before kids were online, when I had a kid playing offline, SimCity and the like. I was tech savvy in those days. Now roll forward to 2021, and I have another kid, who is 12 years old, and this is where it gets strange. I am tech savvy, most definitely, but not in her world, and maybe I am deceiving myself in thinking I understand her world. In this way she is more tech savvy than I am… she knows what trolls are and how to deal with them, and I didn't teach her that! What it means is that our kids can run circles around us, even me, in their online world, and the online predators know this!

Then let us take the second feature, this time aimed at catching these online predators. Okay, it may catch the newbies and idiots out there, but as I've stated above, the smarter ones are the guys/gals who hang out together on the 'deep or dark' web, or whatever it's called. They are savvy, and know how to use Tor (The Onion Router) to protect themselves from the efforts of government authorities.

So I would say that of the 2 new features, the first is partially effective, and the second could catch online predators who are not part of this 'predator community', or who are just incredibly stupid. Of course, they will catch some perverts in the beginning; as the technology is rolled out, mistakes are made… I remember once when a picture of me, swimming naked in the Baltic, was accidentally uploaded to Apple iCloud… oops… panic, and then delete… so this will happen, and some guys will be caught… and a good thing too, but again…

In the long term the effectiveness of these 2 features will be minimal. If we project ourselves 10-20 years ahead, to see where this will take us, and then look back again, what I see is the proliferation of these practices… triggered by initial success on implementation, but then seen as just a step towards a society which has lost its right to a private life. The online predators will have migrated further underground and found other ways to fulfil their abnormal sexual desires… our kids will still be vulnerable… and we will be vulnerable to the whims of our governments, good or bad.

What will be the next step following this kind of functionality in our digital tools? As mentioned in this article, maybe following in the footsteps of North Korea, or whatever is happening in China?

So what will we be thinking in 2031, when we look back to the year 2021? That Apple started all this, giving governments an open door? Okay, in 2021 it was just a small window (though the function is omnipresent in Apple devices), but it was a start on the road to where we could be in 2031: the panopticon effect complete, and 'freedom of speech' vs. 'a right to a private life' perhaps no longer a concern for any of us.


Individual choice vs. the right to life of another individual

A super interesting situation in Italy concerning covid. Basically, an Italian business has rolled out covid vaccination for its 1,200 employees. It is not obligatory; there is a choice. Of the 1,200, 12 employees (1%) have refused.

The problem is that it has become known who refused. Maybe the employees stated their stance themselves; the article doesn't say. What is stated is that the vaccinated employees didn't feel safe working alongside the non-vaccinated employees. Management has now decided that all unvaccinated employees should take 6 months' leave with pay.

This brings to mind quite a few dilemmas, some not linked to GDPR compliance, for example how it feels for the 99% of employees who have not been offered paid leave… my perception is that it could be perceived as a 'reward'. How does this work with new employees?

There are many discussions on the freedom to choose versus the safety of the individual. Anyone who knows me knows that I am an advocate for 'choice'; it is a human right, and probably the most important word in the GDPR, IMHO. However, when the freedom to choose can cause harm to another individual, my stance changes somewhat. Our individual choices should not harm another individual; that impacts their rights as a human being, their right to live. We should care for each other, as a community.

I am in the age group 50-60, so I could be biased, except that I remember holding this opinion when I was 20 years old, and 30 years old… and so on. "We should not be judged on how we conduct our life, so long as it does not harm another individual." And this is from a woman who had her first child before she turned 18, and was hence damned to a life of purgatory given societal norms in the UK during the 1980s.

The conflict of public safety versus privacy, and our right to choice, is articulated somewhat, in a legal way, in GDPR Article 9(2)(b), on which there have been some discussions on LinkedIn. The Article itself is not clear, but if one reads the associated Recitals, a picture emerges which supports public safety over individual choice. Recital 52 refers to a derogation which can be made for health purposes where a serious threat is present.

I am still uncertain whether I would stand by this stance if a decision were made to vaccinate all children; I have a 12-year-old daughter. Due to the hasty rollout, these vaccines haven't been through the rigorous testing that is normal during clinical trials… but I guess I can append to this post later when the topic starts to gain some traction.

Anonymisation – who is to blame and what to do?

The IAPP article provides a retrospective of the #anonymisation issue in the European data protection landscape and reiterates that there is still no 'one-size-fits-all' approach to making personal data #anonymised.

The current state can be described as confusion and vacillation, and probably the main culprit is #WP29, which took contradictory stances on anonymisation in 2007 and 2014, the issue then being ignored by the #EDPB and, again, treated with contradictory stances by national DPAs leaning towards either the 2007 or the 2014 approach.

Simple thing number one is that straightforward 'disguising of identity' (e.g. by one-way cryptography), as WP29 suggested in 2007, can no longer be accepted as anonymisation (unless, of course, stated otherwise by a national DPA). And simple thing number two is that there is no industry standard describing step-by-step anonymisation algorithms and techniques.
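A minimal sketch of why the 2007-era approach fails in practice: when the input space is small and structured (a made-up 4-digit staff ID below), a one-way hash is reversed by simple enumeration, so it is pseudonymisation at best.

```python
import hashlib

def pseudonymise(identifier: str) -> str:
    """'Disguise' an identifier with a one-way hash (the 2007-era idea)."""
    return hashlib.sha256(identifier.encode()).hexdigest()

# An attacker who knows the input format can enumerate every possible
# value and reverse the 'anonymised' hashes with a single dictionary lookup.
rainbow = {pseudonymise(f"{n:04d}"): f"{n:04d}" for n in range(10_000)}

disguised = pseudonymise("0042")
print(rainbow[disguised])  # -> "0042": the identity is recovered
```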

From a practical standpoint this calls for a case-by-case assessment by the anonymising entity. The recent AEPD-EDPS joint paper on '10 misunderstandings related to anonymisation' (the 'joint paper') specifically mentions that 'anonymisation processes need to be tailored to the nature, scope, context and purposes of processing as well as the risks of varying likelihood and severity for the rights and freedoms of natural persons'.

Of the options suggested by the article, the most practical and realistic is probably arguing that the risk of re-identification is sufficiently remote in every single case where anonymisation is relied on. In fact, this will require an 'Anonymisation Impact Assessment' (I have just coined this term) which must include an assessment of re-identification risks. The joint paper acknowledges that such risks are 'never zero' ('except for specific cases where data is highly generalised') and that 'a residual risk of re-identification must be considered'.
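To make the 'Anonymisation Impact Assessment' idea concrete, here is a minimal sketch (my own illustration, not taken from the joint paper) of one common re-identification metric, k-anonymity: the size of the smallest group of records sharing the same quasi-identifiers.

```python
from collections import Counter

# Toy records: (age band, postcode prefix) are quasi-identifiers that
# could be linked with outside data to re-identify someone.
records = [
    ("30-39", "SE-11"),
    ("30-39", "SE-11"),
    ("40-49", "SE-11"),
    ("40-49", "SE-12"),
]

def k_anonymity(rows: list[tuple[str, str]]) -> int:
    """k = the size of the smallest group sharing the same quasi-identifiers."""
    return min(Counter(rows).values())

print(k_anonymity(records))  # -> 1: at least one record is unique, so the
                             # residual re-identification risk is clearly not remote
```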

To date, although addressed by WP29 and adopted by the GDPR, the notion of anonymisation and its application still remains 'terra incognita'.

EDPB Recommendations 01/2020 – softening without being too soft?

Right after the final version of the Recommendations 01/2020 was issued, we (myself included) started to believe that now, finally, we would live the life!

The reference to the inability to rely on "subjective factors such as the likelihood of public authorities' access to your data" is gone; data exporters may now assess how the laws are applied in practice, and even the importer's previous experience.

In fact, this may turn out to be nothing more than initial euphoria. Let's be honest: we were happy because we understood that in the majority of cases the legislation of a third country will end up in the cohort of "problematic legislation" (para 43.3).

Okay, para 43.3 says that "you may decide to proceed with the transfer without being required to implement supplementary measures, if you consider that you have no reason to believe that relevant and problematic legislation will be applied, in practice, to your transferred data and/or importer". That's the exit, isn't it? Let's find some practice showing that the "problematic legislation" does not apply to our transfer, and there is no need to think of supplementary measures. Everyone's happy.

Not really. The EDPB imposes significant requirements on the "sources of information" confirming our conclusions.

A non-exhaustive list of these is contained in Annex 3 (various reports from credible organisations, warrants from other entities…); they must be "relevant, objective, reliable, verifiable and publicly available or otherwise accessible" (para 46). "Documented practical experience of the importer with relevant prior instances of requests" alone cannot be relied on (para 47).

The question here is: do you know a third country with “problematic legislation” but at the same time with “relevant, objective, reliable, verifiable and publicly available or otherwise accessible” practice confirming that there is not really a problem for the transferred data?

In any event, it is clear: supplementary measures are here to stay.

Some fresh thoughts and updates on the new #SCC

1. The SCC cover data transfers to importers (i) established in third countries AND (ii) NOT subject to the #GDPR through Article 3(2). This is not clearly articulated in the implementing decision or the SCC themselves, as the recitals and articles of both seem to contain contradictory information. From confidential sources it has become known that the Directorate-General for Justice and Consumers will soon publish an FAQ clarifying these issues. The European Commission is not taking any position on the definition of the concept of international data transfers, though.

2. It is not sufficiently clear to what extent negotiating parties may "add other clauses" to the SCC. An example I have seen in one of the #IAPP articles: would clauses limiting liability between the parties (not towards data subjects, of course) contradict the SCC?

3. As the SCC are based on a modular principle, one very formal issue is still unclear: when building the SCC, should the labels ("Module One: …" etc.) continue to appear in the clauses? What to do with insertions in the middle of the text (especially for Module Three) when other clauses are used at the same time is also not perfectly clear.

4. In terms of assessment, the new SCC say that parties, when assessing how the law and practice of a third country impact the importer's ability to comply with the SCC, are encouraged to take into account "reliable information on the application of the law in practice (such as case law and reports by independent oversight bodies), the existence or absence of requests in the same sector and, under strict conditions, the documented practical experience of the data exporter and/or data importer". This is a clear shift from the strict position taken in EDPB Recommendations 01/2020, that parties should take into account "objective factors, and not rely on subjective factors such as the likelihood of public authorities' access".

The final version of the #EDPB Recommendations 01/2020 is in the pipeline, and perhaps some important things will change compared to the version published for public consultation.

Sensitive employee data made public in Finland

Okay, the business had only 7 employees, and the personal data breach investigated by the Finnish DPA concerned a single employee who was on sick leave.

What is super interesting about this case is that the employer (a family business) published the fact that the employee was on sick leave on the company website. It seems that because the employee's mailbox was sending an automated response saying that he/she was on sick leave, the employer got the idea that this was now public data.

The decision then digs into the Employment Act and secrecy concerning employee data, and the outcome was that sanctions would be imposed on the business, i.e. it was a personal data breach with an impact on 'rights and freedoms'.

Clearly I've cut out a load of detail here… but what is important is that even small family businesses are not immune to GDPR sanctions.

1177 (Sweden): audit result is final

This is a super interesting case. 1177 is the number used in Sweden to ring your healthcare provider. There was a 'slight' personal data breach reported in 2020 whereby 2.7 million recorded calls were publicly available. Apparently the voice data was not encrypted.

The audit by the Swedish Supervisory Authority resulted in fines of 12 million SEK (1.2 million €) for the data controller (Med Help), 650k SEK (65k €) for Voice Integrate, 500k SEK (50k €) for county Stockholm, and 250k SEK (25k €) each for counties Värmland and Sörmland.