Life in China… and…

Not many of us know each other personally, as human beings with a life outside of what we do in the working world. We are connected on LinkedIn, a sect of professionals. But behind each individual is a private life. In this private life we make decisions about how we live, exercising our ‘right to a private life’ concerning ourselves and our families.

I have had 2 children: a son, born in the UK when I was 17, and a daughter, born in Sweden when I was 46. This is about my son, who has no right to a private life. Some of the details have been modified in order to protect him and his family.

Funnily enough there is no way to connect me with my son, because the name on his passport is not the same as mine, nor even the name he was born with. The name on his passport is not his legal name. How this is possible is another interesting story. What it means is that, through some fluke, my son and I are unconnected from a paper-trail angle. So I can speak openly about his life in China, as the father of a half-Chinese child living with his Chinese wife, without risk to him or his family. He has lived in China for 20 years.

The situation in China has always been difficult. As a foreigner he is monitored 24/7. There is a threshold on how much money he is allowed to have, and he doesn’t get much. He is used to receiving visits from the Chinese police. He did use Tor, but even the fact that he uses Tor -that his data cannot be read by the Chinese nannies (police)- is suspicious behaviour, and warrants a home visit. He loves his wife, but she is under the influence of Chinese propaganda. She does not see a better life for her family outside of China.

Since the Russian invasion of Ukraine, life has become increasingly difficult for my son and my grandchild, because they are not Chinese, or only half Chinese. My grandchild is treated differently at school. My son has occasionally been stopped in the street and asked if he is Russian; when he replies “no”, the reaction is not agreeable.

My son suspects that he is one of very few Europeans/Brits/Americans left in the city where he lives. The rest have all left. The good news is that his wife is finally coming round; even her close friends are advising that she and her family leave China, for the sake of her family. Although this is also bad news, because if the Chinese themselves are advising it, it really is time to leave… My son is deeply concerned that they will be segregated, or worse. Time is running out.

I have so many stories, shared by my son with me over the years, about life in China and how it is changing. So if you are interested, have questions, etc., just add them in the Comments, and I will write more when I feel in the mood.

Dilemmas

A dilemma that you are a part of can be confusing. You may not even realise that a decision you must take presents a personal/professional dilemma for you. As a data protection officer, this becomes very evident, and problematic, requiring skills beyond what is considered normal, i.e. legal, security, etc.

The most profound privacy dilemma is that in any democratic society it is a human right to have freedom of speech, just as it is a human right to have a private life. On a more personal level, we want a right to privacy for ourselves, but for others we want to know what our family, friends and acquaintances are up to!

Then there is the human right to feel safe, which can be provided by the installation of cameras, alarms, etc., in shops, petrol stations, metros, and even in our homes. But this conflicts with our human right to a private life!

Dilemmas come in many forms, even in our daily life, especially as parents. I remember when I was offered a job at CERN in Geneva in 1996; at the time I was living in the UK, a single parent with a 16-year-old son. I was not immediately in a dilemma: I gave him the choice to join me in France, go to a French school, learn French, ski every weekend, etc., a wonderful life… which he refused. Funnily enough, looking back now, my immediate dilemma was not whether to respect my son’s wishes and then find a way so that he could stay and I could go; my dilemma was what others would say about me as a mother. Maybe I had to refuse the job offer?

If we return to the role of the data protection officer: often the DPO will advise on the best course of action when standing in the shoes of the data subject, and then it is up to the business to make a decision. If this decision conflicts with the advice from the DPO, there is a business dilemma, e.g. lack of transparency concerning a personal data breach (not yet confirmed) versus doing what is right as per the GDPR.

If the DPO role as defined in GDPR Articles 37-39 is respected by the business, then as long as the DPO is experienced and aware of the dilemma and the potential conflict with the business, a way forward can be found. There is another skillset required of a DPO, not mentioned in the GDPR, or even in any book I’ve read so far, and that is the ability to mediate between the business and data subject rights. To be able to see the wood for the trees, to see that there is a dilemma, to see that it is not personal; it is life.

Just in case you are interested in the outcome of my dilemma as a mother: a year after the decision, my son told me while visiting me in France that leaving him in the UK was in fact the best parenting decision I had made. He was freed from my mothering… which he’d had enough of. So, unlikely as this outcome was, it was indeed the right decision for us both.

Human choice and automated decisions

A super article from James Casey -who authored the piece linked below- on automated decision-making, and a nice follow-up to my last blog. The article is deep and insightful. Enjoy!

As I See It
Nudges, Algorithms, and Human Choice: What Does the Future Hold?

https://www.wisbar.org/NewsPublications/WisconsinLawyer/Pages/Article.aspx?Volume=95&Issue=3&ArticleID=28962

The box and automated decisions

It is said that the best way to explain anything, so that it can not only be understood but also retained as a piece of knowledge, is to tell a story. So I am going to tell a story about what the GDPR calls ‘profiling’ and ‘automated decision making’.

The connection between the two is understandably difficult for many to comprehend, for good reasons. It seems technical, after all, and a consequence of technology: data warehousing, analytics, AI, etc. The question for many is: why are they connected? So this is a story about me and a box, and automated decisions made before technology became a part of it. I say me first, because I came before the box did, although you may argue this point after reading my story 😉

In fact it is not just a single box but hundreds of boxes, each one unique and different, and through some act of fate, or magic, the single instance of me happens to have resided in each of the boxes at some time in my life. Often I reside in multiple boxes at the same time -a physical impossibility, I know, but it is true.

My first memory of the box was from before I was aware of boxes, or their function. I was about 10 years old, and together with my sister and brother I was introduced to a new friend/acquaintance of our family.

My mother, speaking proudly: “This is Karen, she’s the intelligent one of the family.” Then she moved on to introduce my sister and brother as the tomboy and the trouble-maker. So I was profiled even before computers arrived on the scene in normal family or business life. It was even before store/loyalty cards, data warehousing and big-data analytics. Looking back 10 years later, I internalised that I had been put into a box.

So what? you may think. Well, the consequences are subtle and profound. Clearly, for those introduced to our family, the automated decision made is that Karen is the intelligent one, which implies an inverse situation for the other 2 kids. Karen is the favoured one, according to the other 2 siblings. Clearly it didn’t make me popular; an automated decision was made on how I should be treated by those aware of this box. What is more, my siblings had inadvertently been placed into another box, which influenced how they were treated by others, and even how they perceived me.

I started to become aware of these boxes when I got pregnant at the age of 17 and was moved into a new box. The old box I didn’t mind so much, but this new one was not nice at all. I had done something really bad; my life, my dreams, it was over. I was in the ‘teenage pregnancy’ box, and there was a whole load more boxes waiting to grab me when I took the marry-the-father route: the ‘teenage mother’ box… and let us not forget the box that my son ended up in, as a statistic of social services. Decisions were made over which I had no control; no more smiling faces, but faceless bureaucrats, George Orwell 1984 style. Nothing digitally automated, just faceless human beings filling in checkboxes.

By the time I was 21 I had landed in a new box of my own making, for the next 13 years of my life. I was a single mum with a young kid, living in an area occupied mainly by single mums like me, dysfunctional families, prostitutes and drug pushers. I didn’t choose to be there, even though this was the box I was in. I lived in a flat with no heating, no wall insulation, single-pane windows and concrete floors. For periods of time I didn’t even have hot water, due to an automated decision. At this stage, to have a choice in anything in life was a luxury. You survive. Which takes us to the most important word when it comes to human rights, and that is the right to choice. The right in the GDPR to human intervention in automated decisions made about us gives us a choice: to challenge decisions which affect our rights and freedoms as data subjects.

Fast forward to today. I realise now that what I called a ‘box’ when I was young is actually a ‘profile’ in the GDPR. I had been profiled, and automated decisions were made about my life over which I had no control -and this was before profiling and automated decision-making were digitally automated as we know them today.

This all happened in the UK. Now to place this into today’s context in Sweden.

By 2019 I’d been an entrepreneur for 6 years, and my second startup was suffering. I became personally liable for tax debts which I was unable to pay. I requested a ‘repayment plan’ from the Swedish tax authority, which I knew was technically possible, but was refused. The reason was that I’d had 3 parking fines over 3 years which I had been overdue in paying; hence I had passed the threshold above which no repayment plan is offered. I explained to them that I’d paid all my taxes for over 15 years, but it didn’t make a difference. I had been profiled on 3 parking fines, and an automated decision had been made which was going to devastate me and my family. This was bizarre considering I had significant capital in assets (our house), against which I was unable to take a loan because I was by then profiled as not creditworthy. Soon afterwards I received a fat letter in the post: a repossession order on our family home.
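To make the mechanics concrete: the decision I faced was, in essence, nothing more than a threshold rule. Below is a purely hypothetical sketch -the field names and the cut-off are my own invention, not the tax authority’s actual system- of how such an automated refusal can be produced without any human ever weighing the context.

```python
# Hypothetical sketch of a threshold-based automated decision
# (the kind GDPR Article 22 lets us challenge).
# All field names and thresholds are invented for illustration.

from dataclasses import dataclass

@dataclass
class TaxpayerProfile:
    years_of_on_time_taxes: int
    overdue_fines_last_3_years: int
    asset_value: float

OVERDUE_FINES_THRESHOLD = 2  # invented cut-off

def repayment_plan_decision(profile: TaxpayerProfile) -> str:
    # The rule looks at a single signal and ignores everything else:
    # 15 years of paid taxes and substantial assets never enter the decision.
    if profile.overdue_fines_last_3_years > OVERDUE_FINES_THRESHOLD:
        return "REFUSED"
    return "OFFERED"

me = TaxpayerProfile(
    years_of_on_time_taxes=15,
    overdue_fines_last_3_years=3,
    asset_value=3_000_000.0,
)
print(repayment_plan_decision(me))  # -> REFUSED
```

The point of the right to human intervention is precisely that a human can be asked to look at everything the rule above ignores.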

What is interesting are the similarities between when I was young and today, which are as scary as the differences. The main difference is that I knew what was happening this time. I knew I’d been profiled using technology, and that an automated decision had been made using technology. What was similar was that I felt just as vulnerable as I had done in the 1980s-90s. The problem was that even though I had rights now, and was aware of my rights, I wasn’t in a position to use them.

Article 22 of the GDPR is a really important data subject right that we can overlook because we get bogged down with the technical/legal stuff. It seems complicated. But in reality it is not complicated, nor even technical; it is very personal. I hope my story has succeeded in bringing this message across.

Who owns my personal data?

A question which is still creating something of a storm in some circles, mainly those that make money from intricately complex interpretations of legal text, further complicated by neighbouring laws such as Intellectual Property (IP).

Most legal guys -and even worse, my esteemed colleagues who started their careers in other domains such as cyber- just love to do this: make it difficult, and the more complex and intricate the better. They will even go as far as to condemn those who dare to state that my personal data belongs to me.

Well, condemn me as you like, because this is the message I give to my customers! And why? Because to do anything different would forge dependency instead of fostering empowerment.

Karen Lawrence Öqvist, Founder/CEO Privacy

Clearly, from a deep legal-technical standpoint, I cannot claim to own my personal data. But this is irrelevant to 99.99% of the world’s population who (1) do not have a law degree, or (2) are not working in the domain of the GDPR, or in the slightest bit interested in doing so.

If I were to start with a dive into the intricacies of IP law, there would be zero comprehension of an individual’s rights pertaining to the GDPR and their personal data. If this is where I, as a professional, were to start… my customer, who maybe installs alarms for a living, will be thinking “…but who cares, except for the music industry?”

For my customers, I have only 20-30 minutes of training per employee to get the message across. And the message needs to be formulated in a way so that the business achieves the desired result, i.e. compliance as far as is practicable with the GDPR Rules. And yes, I even use the terminology ‘GDPR Rules’, because my customers understand what a rule is!

For the GDPR Administrator the training is 3 hours. This is not long enough to dive into 250 pages of GDPR legal text, let alone IP law. It is long enough to ensure that they can do their job, and understand data subject rights and that these rights are not absolute. Some GDPR Admins I train even get excited about the subject matter. And only then can one take the next step.

I am sometimes concerned that I may simplify things too far, to a level whereby my skills become redundant. But in short, this approach costs less for my customers, and they know exactly what they are getting for every € they spend -building blocks for trust.

Scope of the obligation to provide «a copy of the personal data undergoing processing» (Article 15(3) GDPR).

#EDPB in draft #Guidelines 01/2022 attempted to clarify the scope of the controller’s obligation to provide «a copy of the personal data undergoing processing» (Article 15(3) #GDPR).

What Article 15(3) implies by «a copy» has long been a controversial issue, and approaches have varied across #EU Member States. Below are some examples:

#Germany: there have been contradictory views as to how the term “copy” should be understood -i.e. whether it should be literally a copy or just a summary (https://lnkd.in/eRbC5gY7)

#Austria: the GDPR (Article 15(3)) does not grant a right of access to files or documents. However, the content of documents may qualify as personal data. Providing copies of personal data stored within a document -redacting superfluous information and providing the document to the applicant- will often be the easiest option (https://lnkd.in/eRgPcEDs)

More insights can be obtained from the #IAPP article – https://lnkd.in/e9g37p9v

Now, EDPB seems to take a so-called ‘fit-for-purpose’ approach to how the notion of ‘copy’ should be understood.

Paras 23 and 25 of the draft Guidelines 01/2022 say that the right to obtain a copy refers ‘not necessarily to a reproduction of the original documents’ and ‘that the information on the personal data concerning the person who makes the request is provided to the data subject in a way which allows the data subject to retain all of the information and to come back to it’.

Further to this, para 150 stipulates that ‘an obligation to provide the data subject with a copy of the personal data undergoing processing […] does not mean that the data subject always has the right to obtain a copy of the documents containing the personal data, but an unaltered copy of the personal data being processed in these documents. Such copy of the personal data could be provided through a compilation containing all personal data covered by the right of access as long as the compilation makes it possible for the data subject to be made aware and verify the lawfulness of the processing’.

In other words, against this purpose, ‘it is the responsibility of the controller to decide upon the appropriate form in which the #personaldata will be provided’.
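To illustrate what a ‘compilation’ (as opposed to copies of documents) might look like in practice, here is a minimal sketch. Everything in it -the record store, the document and field names- is my own invention; the only point it makes is that the controller extracts the requester’s personal data from the underlying documents, together with its source, and returns it in a retainable, structured form.

```python
# Minimal sketch of an Art. 15(3) "compilation": extract one data
# subject's personal data from several source documents and return it
# in a structured, retainable form. All documents and fields are invented,
# and the naive substring matching is a deliberate simplification.

import json

documents = [
    {"doc_id": "contract-001",
     "parties": {"customer": "Anna Svensson", "email": "anna@example.com"},
     "boilerplate": "..."},
    {"doc_id": "ticket-042",
     "reporter": "Anna Svensson",
     "issue": "address change",
     "internal_notes": "..."},
]

def compile_personal_data(subject_name: str) -> str:
    """Build a compilation: the subject's data plus its source,
    not a reproduction of the original documents."""
    compilation = []
    for doc in documents:
        if subject_name in json.dumps(doc):
            compilation.append({
                # Naming the source lets the subject verify lawfulness.
                "source": doc["doc_id"],
                "personal_data": {k: v for k, v in doc.items()
                                  if subject_name in json.dumps(v)},
            })
    return json.dumps(compilation, indent=2)

print(compile_personal_data("Anna Svensson"))
```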

EDPB Guidelines 05/2021 on the interplay of Article 3 and Chapter V: not a big deal at all

I have so far intentionally refrained from reading opinions and analysis of the newly issued Guidelines 05/2021, so that they do not inform my personal ‘first’ opinion.

For now, Guidelines 05/2021 do not appear to be a big deal at all, nor are they free from inconsistencies with the new SCCs, or from casuistic examples.

1) The three criteria for transfers do not look like anything ‘surprising’. C’mon, there could scarcely be anyone who expected the existence of a ‘transfer’ between a controller/processor and a data subject. Maybe it is just me, but I can see few (if any) things that could be seen as significantly changing the landscape and adding value to the current understanding of things.

2) At the same time, the misalignment between the EU Commission and the EDPB continues. Recital 7 of the SCCs implementing decision noted that SCCs may be used for transfers “only to the extent that the processing by the importer does not fall within the scope of Regulation (EU) 2016/679”, and this is in clear contradiction with the EDPB’s transfer criterion #3. More on the conflict between the Commission’s and the EDPB’s approaches can be read here: https://iapp.org/news/a/why-it-is-unlikely-the-announced-supplemental-sccs-will-materialize/?mkt_tok=MTM4LUVaTS0wNDIAAAGAiv2DhonU2mSs-GNpYnvfsyMcmuYxz64LrNpH1YIA75K7-YZFEz3tT0a3i4wGnMiMXfBDlsr1mVDx_wDm-qJrSV0CybkgplN9HxJo5DkdpDW2

More interestingly, there is still no uniform definition of ‘data exporter’ and ‘data importer’. From the new Guidelines 05/2021 it is clear that only controllers, processors and joint controllers may qualify as a ‘data exporter’ or ‘data importer’, and that a transfer may take place only between exporters and importers. More or less the same understanding (with some textual discrepancies) can be seen in Annex 1 of the EDPB Recommendations 01/2020. But a different approach is seen in Clause 1(b) of the SCCs, where the understanding of ‘exporter’ and ‘importer’ bears no relation to controllership issues.

3) Such details may become important in some scenarios -let’s look at Example 5 (an employee of an EU-based company travelling to a third country). First of all, this example seems to be borrowed from the Norwegian DPA’s guidance -https://www.datatilsynet.no/rettigheter-og-plikter/virksomhetenes-plikter/overforing-av-personopplysninger-ut-av-eos/ . Second, what if, let’s say, an employee is not travelling to a third country but permanently sits there? Will this change the assessment, and why does the EDPB endorse such casuistic examples? Will this make the employee an ‘importer’, and will this give rise to a ‘transfer’? My answer is ‘no’, for many reasons. And if the EDPB agrees (does it?..), what would be the role of such an employee in the scheme? I tend to believe they would qualify as an ‘establishment’ of the employer (who, in turn, can be either a controller or a processor).

But never mind, it is just an example, and it does not really matter what I (or you) personally think. It is the EDPB (not us) who is here to give clear answers applicable in the vast majority of scenarios -as opposed to superficial and often self-evident explanations and casuistic examples, evading a deep dive into the heart of the issues.

Apple on the road to hell?

Apple has always been the ‘white sheep’ of the corporate world when it comes to privacy. They actually build in privacy as a differentiator, woven into the DNA of their products. However, it’s not easy being a privacy champion with all the conflicts out there.

There is, for example, the conflict of ‘freedom of speech’ vs. ‘privacy’: both are essential for a properly functioning democratic society, but they conflict with each other. The quote that I love from David Brin’s book ‘The Transparent Society’ is that ‘we want privacy for ourselves’ but ‘we want transparency for others’. How the hell do we solve this one?

Then, moving back to the reason for this post, there is the conflict of ‘protection of our kids’ vs. ‘privacy’. And Apple have taken this bull by the horns and have now launched 2 new features in the latest updates of the iOS, iPadOS and macOS operating systems.

  1. Protect our kids from online predators: this is a parental control for the Messages app. It detects nude images, and the child is presented with a message telling them this is the case. If they choose to view the image anyhow, the parent will be notified.
  2. Our (backed-up) photo libraries will be scanned for matches against a table of hash values of known child-abuse images maintained by the National Center for Missing and Exploited Children (NCMEC). (A simplified sketch of this kind of hash matching follows below.)
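For readers who want to picture the mechanics, here is a much-simplified sketch of hash-based matching. To be clear about the assumptions: Apple’s actual system uses a perceptual hash (NeuralHash) together with cryptographic protocols, whereas the sketch below uses a plain SHA-256 over file bytes -which even a one-pixel edit would defeat- and every name and value in it is hypothetical.

```python
# A much-simplified illustration of hash-based image matching.
# NOT Apple's actual system: this uses exact SHA-256 hashes over file
# bytes, so any change to an image defeats it. All names are hypothetical.

import hashlib
from pathlib import Path

# Hypothetical table of hashes of known prohibited images.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def sha256_of(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's contents."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def scan_library(library_dir: str) -> list[Path]:
    """Flag files whose hash appears in the known-hash table."""
    return [
        p for p in Path(library_dir).glob("**/*.jpg")
        if sha256_of(p) in KNOWN_HASHES
    ]

if __name__ == "__main__":
    for match in scan_library("./photo_library"):
        print(f"Match found: {match}")
```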

We all, I am sure, agree that our kids need to be protected; this is a no-brainer. But at what cost is the question, and how effective will this be in practice?

Both initiatives above seem to be logical measures to (1) protect our kids from harmful content, and (2) find the perverts. However, unfortunately this is not going to work, at least long term, and the cost to our privacy will eventually outweigh the benefits of the short-term gain.

Why do I say this?

Online predators hang out together; they share tricks on how to ‘groom’ kids online and offline. One of these tricks will be to get the kid to use another messaging app -there are loads out there, including Telegram and Signal. So what? If parental controls are installed on the kid’s digital devices -we have them installed- they can’t download anything without my consent as a parent, right?

In theory, yes, but most parents wouldn’t see any threat in downloading another messaging app, especially Signal, famously the preferred medium of Snowden. This means that all the good intentions of Apple are quite useless in practice. Also, there are still a lot of parents who are not tech savvy… and this will probably be the case for another 5-10 years, or maybe much longer… read on…

So what do we mean by ‘tech-savvy parents’? I have been tech savvy for 30 years, from before kids were online, when I had a kid who was playing offline -SimCity and the like. I was tech savvy in those days. Now roll forward to 2021, and I have another kid who is 12 years old, and this is where it gets strange. I am tech savvy, most definitely, but not in her world, and maybe I am deceiving myself in thinking I understand her world. In this way she is more tech savvy than I am… she knows what trolls are and how to deal with them, and I didn’t teach her that! What it means is that our kids can run circles around us, even me, in their online world, and the online predators know this!

Then let us take the second feature, this time to catch these online predators. Okay, they may catch the newbies and idiots out there, but as I’ve stated above, the smarter ones are the guys/gals who hang out together on the ‘deep or dark’ web, or whatever it’s called. They are savvy, and know how to use Tor (The Onion Router) to protect themselves from the efforts of government authorities.

So I would say that of the 2 new features, the first is partially effective, and the second could catch online predators who are not part of this ‘predator community’, or are just incredibly stupid. Of course, they will catch some perverts in the beginning; as the technology is rolled out, mistakes are made… I remember once when a picture of me, swimming naked in the Baltic, was accidentally uploaded to Apple iCloud… ooops… panic and then delete… so this will happen, and some guys will be caught… and a good thing too, but again…

In the long term, the effectiveness of these 2 features will be minimal. If we project ourselves 10-20 years ahead, to see where this will take us, and then look back again: what I see is the proliferation of these practices, triggered by the initial success of implementation, but then seen as just a step in the direction of a society which has lost its right to a private life. The online predators will have migrated further underground and will have found other ways to fulfil their abnormal sexual desires… our kids will still be vulnerable… and we will be vulnerable to the whims of our governments, good or bad.

What will be the next step following this kind of functionality in our digital tools? As mentioned in this article, maybe following in the steps of North Korea, or whatever is happening in China?

So what will we be thinking in 2031 when we look back to the year 2021? That Apple started all this, giving governments an open door. Okay, it was just a small window in 2021 -the function is omnipresent in Apple devices- but it was a start on the way to where we could be in 2031, i.e. the panopticon effect will be complete, and eventually ‘freedom of speech’ vs. ‘a right to a private life’ may no longer be a concern for any of us.

Read more here.

Individual choice vs. a right to life -of another individual

A super interesting situation in Italy concerning covid. Basically, an Italian business has rolled out covid vaccination for its 1,200 employees. It is not obligatory; there is a choice. Of the 1,200, 12 employees (1%) have refused.

The problem is that it has become known who has refused. Maybe the employees stated their stance themselves; the article doesn’t say. What is stated is that the vaccinated employees didn’t feel safe working alongside the non-vaccinated employees. The management has now decided that all unvaccinated employees should take 6 months’ leave with pay.

This brings to mind quite some dilemmas, some not linked to GDPR compliance. For example, how does it feel for the 99% of employees who have not been offered 6 months’ paid leave… my perception is that it could be perceived as a ‘reward’? And how does this work with new employees?

There are many discussions on the freedom to choose versus the safety of the individual. As anyone who knows me will know, I am an advocate for ‘choice’; it is a human right, and probably the most important word in the GDPR, IMHO. However, when the freedom to choose can cause harm to another individual, my stance changes somewhat. Our individual choices should not harm another individual; that impacts their rights as a human being, the right to live. We should care for each other, as a community.

Although I am in the age group 50-60, so I could be biased, I remember having this opinion when I was 20 years old, and 30 years old… and so on: “We should not be judged on how we conduct our life, so long as it does not harm another individual.” And this is from a woman who had her first child before she turned 18, and was hence damned to a life of purgatory, given societal norms in the UK during the 1980s.

The conflict of public safety versus privacy, and our right to choice, is articulated somewhat -in a legal way- in GDPR Article 9.2(b), on which there have been some discussions on LinkedIn. The Article itself is not clear, but if one reads the associated Recitals, a picture emerges which supports public safety over individual choice. In Recital 52, it is referred to as a derogation which can be made for health purposes, where a serious threat is present.

I am still uncertain whether I would stand by this stance if a decision were made to vaccinate all children; I have a 12-year-old daughter. These vaccines haven’t been through the rigorous testing which is normal during clinical trials, due to the hasty rollout… but I guess I can append to this post later when this topic starts to gain some traction.

Anonymisation – who is to blame and what to do?

The IAPP article provides a retrospective of the #anonymisation issue in the European data protection landscape and reiterates that there is still no ‘one-size-fits-all’ approach to making personal data #anonymised.

The current state can be described as confusion and vacillation, and probably the main culprit for this is #WP29, which took contradictory stances on anonymisation in 2007 and 2014, followed by the issue being ignored by the #EDPB and, again, contradictory stances from national DPAs leaning towards either the 2007 or the 2014 approach.

The simple thing is that a straightforward ‘disguising of identity’ (e.g. by one-way cryptography), as WP29 suggested in 2007, can no longer be accepted as anonymisation (unless, of course, stated otherwise by a national DPA). And simple thing number two is that there is no industry standard describing step-by-step anonymisation algorithms and techniques.
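To see why one-way hashing alone does not anonymise, consider this minimal sketch (all identifiers and data are invented, and the 10-digit phone-number space with a fixed prefix is an assumption for illustration): when the set of possible inputs is small, anyone can hash every candidate and reverse the ‘disguise’.

```python
# Minimal sketch: one-way hashing of identifiers is pseudonymisation,
# not anonymisation, because a small input space can be brute-forced.
# All data below is invented for illustration.

import hashlib

def pseudonymise(identifier: str) -> str:
    """'Disguise' an identifier with a one-way hash (SHA-256)."""
    return hashlib.sha256(identifier.encode()).hexdigest()

# A "de-identified" record, as a controller might release it.
records = {pseudonymise("0701234567"): {"diagnosis": "flu"}}

# Brute force: hash every candidate in the (small) identifier space.
# We sweep only one hypothetical mobile prefix to keep the demo fast;
# the full 10-digit space is still only ~10^10 hashes, hours of compute.
for n in range(10_000_000):
    candidate = f"070{n:07d}"
    if pseudonymise(candidate) in records:
        print(f"Re-identified {candidate}: {records[pseudonymise(candidate)]}")
        break
```

The hash alone never severs the link to the person; it only makes the link one cheap computation away.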

From a practical standpoint, this calls for a case-by-case assessment by the anonymising entity. The recent AEPD-EDPS joint paper ‘on 10 misunderstandings related to anonymisation’ (the ‘joint paper’) specifically mentions that ‘anonymisation processes need to be tailored to the nature, scope, context and purposes of processing as well as the risks of varying likelihood and severity for the rights and freedoms of natural persons’.

Of the options suggested by the article, the most practical and realistic is probably arguing that the risk of re-identification is sufficiently remote in every single case where anonymisation is relied on. In fact, this will require an ‘Anonymisation Impact Assessment’ (I have just coined this term) which must include an assessment of re-identification risks. The joint paper acknowledges that such risks are ‘never zero’ (‘except for specific cases where data is highly generalised’) and that ‘a residual risk of re-identification must be considered’.

To date, although addressed by WP29 and adopted in the GDPR, the notion of anonymisation and its application still remains ‘terra incognita’.