Apple on the road to hell?

Apple has always been the ‘white sheep’ of the corporate world when it comes to privacy. They actually build in privacy as a differentiator, woven into the DNA of their products. However, it’s not easy being a privacy champion with all the conflicts out there.

There is, for example, the conflict of ‘freedom of speech’ vs. ‘privacy’. Both are essential for a properly functioning democratic society, but they conflict with each other. The quote that I love from David Brin’s book ‘The Transparent Society’ is that ‘we want privacy for ourselves’ but ‘we want transparency for others’. How the hell do we solve this one?

Then, moving back to the reason for this Post, there is the conflict of ‘protection of our kids’ vs. ‘privacy’. Apple have taken this bull by the horns and have now launched 2 new features in the latest updates to the iOS, iPadOS, and macOS operating systems.

  1. Protect our kids from online predators: this is a parental control for the Messages App. It detects nude images, and the child will be presented with a message that this is the case. If they choose to view the image anyway, the parent will be notified.
  2. Our (backed-up) photo libraries will be scanned for matches against a table of hash values of known child abuse images, maintained by the National Center for Missing and Exploited Children (NCMEC).
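To illustrate how the second feature works in principle, here is a minimal sketch in Python. It is purely illustrative: Apple’s real system uses a perceptual hash (‘NeuralHash’) that also matches near-duplicate images, whereas the cryptographic hash below only matches exact byte-for-byte copies, and the image bytes and hash table here are made up.

```python
import hashlib

# Hypothetical table of hash values (hex digests) of known images.
# In reality NCMEC maintains such a table; this entry is fabricated.
KNOWN_HASHES = {
    hashlib.sha256(b"known-bad-image-bytes").hexdigest(),
}

def is_known_image(image_bytes: bytes, known_hashes: set) -> bool:
    """Return True if the image's hash appears in the known-hash table."""
    return hashlib.sha256(image_bytes).hexdigest() in known_hashes

# Scan a (toy) photo library and flag any matches.
library = [b"known-bad-image-bytes", b"holiday-photo-bytes"]
flagged = [img for img in library if is_known_image(img, KNOWN_HASHES)]
```

Note that with an exact hash like this, a single changed pixel defeats the match. That is why perceptual hashing is used in practice, and also why false positives then become a privacy concern.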

We all, I am sure, agree that our kids need to be protected; this is a no-brainer. But at what cost? And how effective will this be in practice?

Both initiatives above seem to be logical measures to (1) protect our kids from harmful content, and (2) find the perverts. Unfortunately, however, this is not going to work, at least long term, and the cost to our privacy will eventually outweigh the benefits of the short-term gain.

Why do I say this?

Online predators hang out together; they share tricks on how to ‘groom’ kids online and offline. One of these will be to get the kid to use another messaging App, and there are loads out there, including Telegram and Signal. So what? If parental controls are installed on the kid’s digital devices -we have them installed- they can’t download anything without my consent as a parent, right?

In theory, yes, but most parents wouldn’t see any threat in downloading another messaging App, especially Signal, famously the preferred medium of Snowden. This means that all of Apple’s good intentions are quite useless in practice. Also, there are still a lot of parents who are not tech savvy… and this will probably be the case for another 5-10 years, or maybe much longer…. read on…

So what do we mean by ‘tech savvy parents’? I have been tech savvy for 30 years, since before kids were online, when I had a kid who was playing offline, SimCity and the like. I was tech savvy in those days. Now roll forward to 2021 and I have another kid, who is 12 years old, and this is where it gets strange. I am tech savvy, most definitely, but not in her world, and maybe I am deceiving myself in thinking I understand her world. In this way she is more tech savvy than I am…. she knows what trolls are and how to deal with them, and I didn’t teach her that! What it means is that our kids can run circles around us, even me, in their online world, and the online predators know this!

Then let us take the second feature, this time to catch these online predators. Okay, they may catch the newbies and idiots out there, but as I’ve stated above, the smarter ones are those guys/gals who hang out together on the ‘deep or dark’ web, or whatever it’s called. They are savvy, and know how to use Tor (The Onion Router) to protect themselves from the efforts of government authorities.

So I would say that of the 2 new features, the first is partially effective, and the second could catch online predators who are not part of this ‘predator community’, or are just incredibly stupid. Of course, they will catch some perverts in the beginning; as the technology is rolled out, mistakes are made…. I remember once when a picture of me, swimming naked in the Baltic, was uploaded to Apple iCloud accidentally…. oops… panic, and then delete… so this will happen, and some guys will be caught… and a good thing too, but again…

In the long term the effectiveness of these 2 features will be minimal. Try to project yourself 10-20 years ahead, to see where this will take us, and then look back again. What I see is the proliferation of these practices… triggered by the initial success of implementation, but then seen as just a step in the direction of a society which has lost its right to a private life. The online predators will have migrated further underground and will have other ways to fulfil their abnormal sexual desires… our kids will still be vulnerable… and we will be vulnerable to the whims of our governments, good or bad.

What will be the next step following this kind of functionality in our digital tools? As mentioned in this article, maybe following in the footsteps of North Korea, or whatever is happening in China?

So what will we be thinking when we look back at the year 2021 from 2031? That Apple started all this, giving governments an open door. Okay, it was just a small window in 2021 -although the function is omnipresent in Apple devices- but it was a start to where we could be in 2031, i.e. the panopticon effect will be complete, and eventually ‘freedom of speech’ vs. ‘a right to a private life’ may no longer be a concern for any of us.

Read more here.

Individual choice vs. a right to life -of another individual

A super interesting situation in Italy concerning covid. Basically, an Italian business has rolled out covid vaccination for its 1,200 employees. It is not obligatory; there is a choice. Of the 1,200, 12 employees (1%) have refused.

The problem is that it has become known who has refused. Maybe those employees stated their stance; the article doesn’t say. What is stated is that the vaccinated employees didn’t feel safe working alongside the non-vaccinated employees. The management has now decided that all unvaccinated employees should take 6 months’ leave with pay.

This brings to mind quite some dilemmas, some not linked to GDPR compliance. For example, how does it feel for the 99% of employees who have not been offered paid leave… could it be perceived as a ‘reward’ for refusing? And how does this work with new employees?

There are many discussions on the freedom to choose versus the safety of the individual. As anyone who knows me will know, I am an advocate for ‘choice’; it is a human right, and probably the most important word in the GDPR, IMHO. However, when the freedom to choose can cause harm to another individual, my stance changes somewhat. Our individual choices should not harm another individual; that impacts their rights as a human being, their right to live. We should care for each other, as a community.

I am in the age group 50-60, so I could be biased, except that I remember having this opinion when I was 20 years old, and 30 years old… and so on. “We should not be judged on how we conduct our life, so long as it does not harm another individual.” And this is from a woman who had her first child before she turned 18, and was hence damned to a life of purgatory given societal norms in the UK during the 1980s.

The conflict of public safety versus privacy, and our right to choose, is articulated somewhat in the GDPR -in a legal way- in Article 9.2(b), of which there have been some discussions on LinkedIn. The Article itself is not clear, but if one reads the associated Recitals, a picture emerges which supports public safety over individual choice. In Recital 52, it is referred to as a derogation which can be made for health purposes, where a serious threat is present.

I am still uncertain whether I would stand by this stance if a decision were made to vaccinate all children; I have a 12-year-old daughter. These vaccines haven’t been through the rigorous testing which is normal during clinical trials, due to the hasty rollout…. but I guess I can append to this Post later when this topic starts to gain some traction.

Sensitive employee data made public in Finland

Okay, there were only 7 employees, and this personal data breach, which was investigated by the Finnish DPA, concerned a single employee who was on sick leave.

What is super interesting about this case is that the employer (a family business) put the fact that the employee was on sick leave on the company website. It seems that because the employee had an automated email response saying that he/she was on sick leave, the employer got the idea that this data was now public data.

The decision then digs into the employment act and secrecy concerning employee data, and the conclusion was that sanctions would be placed on this business, i.e. it was a personal data breach which has an impact on ‘rights and freedoms’.

Clearly I’ve cut out a load of details here… but what is important is that even the small family businesses are not immune to GDPR sanctions.

1177 result of (Sweden) audit is final

This is a super interesting case. 1177 is the number used in Sweden to ring your healthcare provider. There was a personal data breach reported in 2020 whereby 2.7 million calls were publicly available. Apparently the voice data was not encrypted.

The audit by the Swedish Supervisory Authority has resulted in fines of 12 million SEK (€1.2 million) to the data controller (Med Help), 650k SEK (€65k) to Voice Integrate, 500k SEK (€50k) to the county of Stockholm, and 250k SEK (€25k) each to the counties of Värmland and Sörmland.

A US update on the TikTok saga

As you know, Trump tried to ban TikTok from the US, and a compromise was reached with TikTok that US user data would only be stored in US data centers. Sounds a bit similar to the Irish ruling in 2020. What I am thinking is that US intelligence has the power/mandate to access data of EU data subjects under FISA 702, so what if China has something similar?

Anyhow, despite my speculations, there is a new development. It seems that biometric data may or will be collected by TikTok; as it stands now, only from US TikTok users, although consent will be required. Note, though, that apparently only some US states require consent for the collection of biometric data!

But what about all the underage users? There is a law in the US (COPPA) which mandates parental consent for the collection of minors’ data. A significant number of TikTok users are minors, and the mind boggles when it comes to the collection of biometric data of minors….. how aware are the parents? More and more I am coming to the view that TikTok should be banned…. even though my daughter is a user, and the fun and benefits are boundless.

Mailchimp is out, even if…..

I am pretty creative when it comes to taking the GDPR legal stuff and working out how to make it work in practice. No business/organisation should hit a wall of what I call ‘GDPR paralysis’ because of something legal which prevents the business from functioning. Our livelihood depends upon a working economy and a healthy GNP. In fact, if we didn’t have this, human rights would start to become problematic, because if we as private people do not have access to jobs, we lose what is the most important word in the GDPR, IMHO, and that is CHOICE.

Whenever I am presented with a stop, i.e. “no can do”, it is an opportunity to think anew. Schrems II is one such example. I did not see it as a stop to international transfers over to the US. It just meant we needed increased diligence: document everything and do those Transfer Impact Assessments (TIAs) so we understand the risks to the rights and freedoms of the natural person, and identify supplementary measures. We need to be realistic.

However, I must admit that the latest decision on Mailchimp in Germany is a show-stopper. From what I’ve dug out, it was only email addresses used in a mailing campaign which were in scope of the international transfer. The risk to the rights and freedoms of the natural person is zero/negligible. Yet, due to indications that Mailchimp may in principle be subject to data access by US intelligence services on the basis of the US legal provision FISA 702 (50 U.S.C. § 1881), as a possible so-called Electronic Communications Service Provider, the transfer could only be lawful if additional measures (if possible and sufficient to remediate the problem) were taken.

My take on this previously was to assess the risk to the rights and freedoms of the individual; however, now this approach has been kicked out, ignored. I wonder where the logic, the balance, is in this decision? Clearly, if Mailchimp were being used to send out marketing communications from a Sex Shop, or from a specialist group around a health condition, I could understand this… but an email address used in a standard non-personal communication?

I am wondering which monkey was behind this decision, or am I missing something?

Booking.com reported the breach too late

So the question is when do you press the red BREACH button?

  1. Is it when you first become aware that a personal data breach could have occurred?
  2. Is it when you are sure that a personal data breach has occurred, which triggers an investigation?
  3. Or is it on conclusion of this investigation?

Booking.com decided on option 3.

Booking.com have been fined €475k because of this. They did report the breach, but what is significant about this fine is the timeline: a minor incident reported by a customer of a hotel was dismissed on 9 Jan, then a second identical report from another customer of the same hotel triggered an investigation on 13 Jan. However, the breach was not reported until 7 Feb, 3 days after the internal security investigation was concluded. The fine is because Booking.com should -according to the Dutch DPA- have reported the breach on 13 Jan.

In fact it is common practice not to report a breach until one is sure there has been a breach, and sometimes not until the circumstances of the breach are clear. This case shows that this is not an advisable route.

French court decision and use of safeguards for international transfers

I was most delighted when this case popped up in my feed today.

“The court noted that for the purposes of hosting its data, Doctolib uses the services of the Luxemburg company AWS Sarl, the data is hosted in data centers located in France and in Germany, and the contract concluded between Doctolib and AWS Sarl does not provide for the transfer of data to the U.S. However, because it is a subsidiary of a company under U.S. law, the court considered AWS Sarl in Luxemburg may be subject to access requests by U.S. authorities in the framework of U.S. monitoring programs based on Article 702 of the Foreign Intelligence Surveillance Act or Executive Order 12333.”

Even so, the court decided there were sufficient legal and technical safeguards to protect the data, and this was in the context of covid-19.

Read more.

Virtualshadows blog is back!

This blog has been resurrected. It was closed down in November last year because of non-compliance concerning the number of cookies the blog was using (WordPress was a cloud service based in the US), because of the Schrems II ruling, and because all the cookie consent banners were too expensive; after all, this is a private blog, it’s just that there were quite a few visitors each month. I guess if this blog had been about my dog, or anything else, maybe I wouldn’t have bothered with all the GDPR stuff, but I am professionally a ‘privacy guy’, so the blog had to go.

So what happened to my blog was something I call ‘GDPR paralysis’: everything comes to a stop, and GDPR is the cause. I remember when my business (Privasee), which I founded in 2015, came into a state of GDPR paralysis in 2017: the privacy purists versus myself as CEO, in that ‘business has to function’. There needed to be a compromise, otherwise Privasee would cease to exist; making money is necessary for the business’s survival, and for it to achieve what it set out to do, i.e. ‘make privacy compliance accessible’.

One could claim that a blog comes under the ‘household exemption’, which was how I was thinking. Maybe misguided, but you know how we human beings can be, believing in what is easiest, and anyhow, what harm can it do to the ‘rights and freedoms of the natural person’? It’s just that all those cookies made it a privacy risk to visitors, and today something popped up in my LinkedIn feed: the Danish Data Protection Authority has passed a ruling that a so-called ‘private website’ was not exempt under Article 2(1). I can’t find the case now.

When reading this blogpost, you should have only 7 cookies downloaded, and they are all session cookies, except one with a life of a single day. The WordPress site is based in the EU, so there are no international transfers.

Enjoy reading the blog again, and welcome back!

Digital online rights for children

Sweden is ahead of the rest of the world when it comes to children’s rights, even in the digital/online world. Read more here.

To say I felt an excitement deep in me is an understatement. It was children’s safety online which brought me into privacy. My master’s thesis for my MSc in Information Security was on protecting children online, which led to the publication of my first book, “Virtual Shadows”, in 2009. This was 8 months before the birth of my daughter.

But what triggered me came long before this: my son, who was 18 by the time I had published my first book. I often had computers at home, normally open as I was twiddling with them, and so was he, from when he was 10 years old.

I saw his fascination with SimCity and other highly educational games which transported him into worlds of logistics and consequences. The theme of conversation amongst the boys was which level they had reached, e.g. how a famine had broken out, bad decisions on arming, etc. Gaming was not multi-player; it was single-player, installed on a PC in those days.

What Sweden has triggered is awesome, beyond what any country has done when it comes to human rights. This is not surprising, considering they were the first country globally to give equal rights to children, in 1971. Now, in 2020, this has reached the digital world.