Digital online rights for children

Sweden is ahead of the rest of the world when it comes to children’s rights, including in the digital/online world.

To say I felt a deep excitement is an understatement. It was children’s safety online which brought me into privacy. My master’s thesis for my MSc Information Security was on protecting children online, which led to the publication of my first book, “Virtual Shadows”, in 2009. This was 8 months before the birth of my daughter.

But what triggered me came long before this: my son, who was 18 by the time I published my first book. I often had computers at home, usually open as I was tinkering with them, and so was he from the age of 10.

I saw his fascination with SimCity and other highly educational games which transported him into worlds of logistics and consequences. The theme of conversation amongst the boys was which level they had reached, e.g. how a famine had broken out, or how bad decisions on armaments had played out. Gaming was not multi-player in those days; it was single-player and installed on a PC.

What Sweden has triggered is awesome, going beyond what any other country has done when it comes to human rights. This is not surprising, considering Sweden was the first country globally to give equal rights to children, in 1971. Now, in 2020, this has reached the digital world.

BCRs: Tetra Pak has just got them approved in Sweden

An extremely interesting development considering the recent Schrems II decision and that Tetra Pak has US operations.

This is a first for the Swedish Data Protection Authority with BCRs. OneTrust has a good summary of the decision, etc., in English. Here is the decision in Swedish.

Now, there is much discussion on the legality of Binding Corporate Rules since Schrems II. After all, surveillance in the U.S. is omnipresent, and we have no control over it here in the E.U. But in reality, what this decision means is that we need to be realistic: business must go on.

My take on the transfer of data is to dive into the potential risks to the rights and freedoms of the natural person. If there are none, e.g. you are only transferring the individual’s name and email address, and maybe logging business activities such as financial records, I find it difficult to force a change to an established business practice, especially now in coronavirus times, when many businesses are in survival mode and some close to bankruptcy. If HR data is being transferred, then clearly this must change.

Even as a privacy professional, I am sceptical of all the fuss and hype about blocking all personal data transfers out of the EU to a country such as the U.S. (which lacks an adequacy decision now that Privacy Shield is gone) because of Schrems II.

I guess if I weren’t running a small startup myself, serving small and medium businesses, I would think differently. But if this is all too complex, the SMB will do nothing; they have too much to lose, and when things go wrong it can happen quickly, so money spent must be prioritised. For the SMB, Schrems II is like double Dutch: all this legal speak is outside the boundaries of their business operations. The Data Protection Authorities get this, and are not normally targeting the small actors selling consulting, car repairs, chickens, or a pair of shoes; they are after the biggies.

What went wrong? Foodora hacked!

Swedish newspapers are reporting that the data of half a million customers was stolen by hackers. Foodora, a Swedish concern, is owned by a German business, Delivery Hero. As one can guess from the combination of both names: 1) it’s about food, and 2) yes, customers order online from their favourite restaurant and get it delivered.

From what I can gather, the data was stolen from their test environment. This means that live data was stored in a test environment which was not appropriately protected, as is required by Art 32 GDPR. Moreover, it seems that the purpose limitation (Art 5.1b) and data minimisation (Art 5.1c) principles were not respected. There is probably more, but this is what I have, based on a couple of newspaper articles.
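Speaking generally (this is not based on any detail of how Foodora’s systems were actually set up), a minimal sketch of what data minimisation for a test environment can look like, with hypothetical field names, might be:

```python
import hashlib
import uuid

def pseudonymise_for_test(customer: dict, salt: str) -> dict:
    """Return a reduced, pseudonymised copy of a live customer record.

    Only the fields a test actually needs are kept; direct identifiers
    are replaced with a stable pseudonym or synthetic values.
    """
    pseudo_id = hashlib.sha256((salt + str(customer["id"])).encode()).hexdigest()[:16]
    return {
        "id": pseudo_id,                           # stable pseudonym, not the real customer id
        "email": f"test+{pseudo_id}@example.com",  # synthetic contact address
        "city": customer.get("city"),              # coarse location is enough for most tests
        # name, street address, phone number and order history are simply dropped
    }

# Example with a made-up record:
live_record = {"id": 1234, "email": "anna@example.se", "city": "Stockholm",
               "name": "Anna A", "street": "Storgatan 1"}
print(pseudonymise_for_test(live_record, salt=str(uuid.uuid4())))
```

The point is that a test environment rarely needs real identities at all; stripping and pseudonymising at the point of copy is a far smaller effort than protecting a full copy of production data to Art 32 standards.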

The affected data subjects include customers whose data dates back to 2016. The only data stolen in clear text was data which is in the main public in Sweden (unless you have a protected identity), so it seems low risk, but read on…

What is not public is the fact that an individual is a customer of Foodora, and this is a great way to craft a phishing attack that appears to come from Foodora to these customers.

On the plus side, it looks as though Foodora have mobilised their communications function, sent a message to all customers warning them about what has happened, and told them not to click on any links in emails appearing to come from Foodora. Their quick action is impressive, very transparent, and a good example of how to act when this kind of incident occurs.

Nevertheless, I see that there will be an investigation of Foodora by the Swedish Data Protection Authority, which is scheduled to finish before December 2021.

Image taken from https://www.missethoreca.nl/ restaurant guide.

Happy Birthday 2 years on with GDPR!

In celebration of GDPR turning 2, I thought I would repost some blogposts from June 2018. However, when I looked, I realised there were quite a few, with a strong theme of how our personal data is public in Sweden and how utgivningsbevis (publishing certificates) are used to keep this status quo. So I ended up writing an additional blogpost, realising that I’m still really unhappy about the Swedish status quo on this.

GDPR has brought progress in ensuring that we, data subjects, have rights over our personal data, but sadly what I posted 2 years ago is still acutely relevant today in 2020.

The fact is that in Sweden our personal data is made public and we have no say! After all, public is public; it is impossible to restrict processing when this is the case, as acknowledged in privacy laws, and not just in the EU. Data brokers scrape this data from public sources, do some intelligent profiling and sell it on to businesses, e.g. where you live will determine how you are profiled and to whom you will be sold.

Someone tried to argue with me once that a street name (without the house number) was not personal data. The fact is that the street where you live says quite a lot about who you are. It gives an indication of your wealth, whether you’re young, with kids, or elderly, and whether you’re likely to have a garden, 1 or 2 cars, etc. Your street name is directly or indirectly linked to you as an individual. The street name alone could be enough for you to receive cold calls by phone, or for someone to knock on your door to sell you double glazing.

In the UK, for example, you are hidden by default. The difference in Sweden is that the clash between laws pertaining to ‘freedom of the press’ and ‘a right to a private life’ still stands today, and in Sweden it is the former which wins.

I read somewhere that there are hundreds, maybe thousands, of complaints from Swedish data subjects about the lack of control and rights (as per GDPR) they have over their personal data. This is positive! People are aware of their rights and are asking why this is happening. I can’t find the article now, so I would appreciate it if anyone can dig it up. The question is whether this will change. Can it change?

The e-Privacy Regulation has something to protect us from unsolicited calls, with protection by default; as in the UK, the resident needs to opt in to be included in a public directory.

Protection against spam: this proposal bans unsolicited electronic communications by emails, SMS and automated calling machines. Depending on national law people will either be protected by default or be able to use a do-not-call list to not receive marketing phone calls. Marketing callers will need to display their phone number or use a special pre-fix that indicates a marketing call.

How it works in Sweden today is that every business needs to keep its own ‘do not call’ list. What is proposed in the e-Privacy Regulation seems to be a national list, which is an improvement, but it still does not solve the root of the problem: I do not want my data public unless I have specifically consented to this or have made my data public myself.

If you have a policy, make sure it is documented; if you have a procedure, document that too… or else…

Well, it seems that another government authority in Sweden has been fined SEK 120,000 (circa €12k) by the Swedish Data Protection Authority. It was the region (county) of Örebro, it was the health authority, and it was sensitive data.

What is important in this case is that although they had procedures, they were not documented; it was word of mouth… oops, and this is not good enough. Where is the evidence?

Clearly, processing of sensitive data means that extra care must be taken, but what is key here, beyond that, is that Article 5.2 of the GDPR requires accountability, which means there must be evidence that Article 5.1 is being adhered to.

Swedish DPA has handed out a SEK 200k fine to SSC, a government authority

All of us DPOs who feel a bit more relaxed in these corona times, thinking that no Data Protection Authority would be so unfeeling as to hand out a fine now… think again.

Hats off to the Swedish DPA (Datainspektionen). A fine of SEK 200k has been issued to the Government Service Centre (SSC), which handles salaries and the like for 47 Swedish government authorities. SSC is a processor for the 47 government authorities, although a controller for its own employees. It was a breach of the salary data of 282k employees, including SSC’s own.

In short, the cause was a technical flaw in an application (Primera), and SSC failed to report not only to the controllers but also to the DPA… but clearly the case is much more complex. Articles 24, 28, 32 and 33-34 are all quoted in the report from Datainspektionen. This is a really important case and gives some great clarifications, not only on personal data breach notification but also on the responsibilities of the controller and the processor.

I tried to map it out below. Basically there were 2 fines:

  1. For the delay in reporting the breach to the controllers (SEK 150k), under Article 33.2; and
  2. For the delay in reporting to the Swedish DPA (SEK 50k), under Article 33.1.

Basically, a personal data breach must be reported to the controller (if you are a processor) without undue delay. As soon as you know you have a breach, clearly you want to know why it happened, but that has to wait. What is important is to ascertain that a breach has happened and that it is personal data which has been breached. What happened in this case is that SSC wanted to understand more about why it happened, and the delays became serious.

When reporting as the controller, you have 72 hours to report to the DPA, and the same applies as above. During this time the priority is to ascertain that a breach has occurred and to have enough data to know whether it poses a risk to the rights and freedoms of the natural person. The cause of the breach is a ‘nice to have’ at this stage; you can always send in an updated report later when you know. These are not my opinions, these are facts.
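To make the timelines concrete, here is a minimal sketch of a helper that works out the Article 33 deadlines from the moment you become aware of a breach; the function name and output format are my own illustration, not anything from the decision:

```python
from datetime import datetime, timedelta

def notification_deadlines(aware_at: datetime) -> dict:
    """Rough Article 33 timeline helper.

    aware_at is the moment the organisation became aware that personal
    data has been breached (not when the root cause was understood).
    """
    return {
        # Art 33.2: the processor must inform the controller "without
        # undue delay" - no fixed number of hours, so flag it as now.
        "processor_to_controller": "without undue delay (i.e. now)",
        # Art 33.1: the controller must notify the supervisory authority
        # within 72 hours of becoming aware, where feasible.
        "controller_to_dpa_by": aware_at + timedelta(hours=72),
    }

# Example: the breach is confirmed at 09:00 on 4 May 2020.
print(notification_deadlines(datetime(2020, 5, 4, 9, 0)))
```

The clock starts when you become aware of the breach, not when you understand its cause; that is exactly where SSC lost time.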

There are always lots of politics which come into play when these kinds of incidents occur. I believe SSC wasn’t just lacking in its breach process; there were also issues with conflicting opinions and dilemmas.

There was also a problem with the agreement SSC had with Evry: it was outdated and not compliant with the GDPR, but they got a warning for this, no fine.

I created some flows to get my head around this. The decision was quite long (and in Swedish), with loads of other detail, e.g. there was an employee who found the bug and exploited the hole, and ended up getting reported to the police… I think this was retracted by Datainspektionen later.

I hope you find this useful. Feel free to ask questions… as I mentioned at the beginning, this is a super interesting case!

Publishing pictures of your kids could become illegal in Sweden

More on kids: Sweden is ahead of the trend, as is normal when it comes to children’s rights.

There is a new law being discussed (making barnkonventionen, the UN Convention on the Rights of the Child, Swedish law) which looks as though it will take effect in 2020, and which basically means that parents are not permitted to post pictures of their children online without their permission.

This came to my notice following a post I made in a private Facebook group, pointing out that posting pictures of children, or of any individual, without their permission goes against human rights and the right to a private life. I made this post because I was horrified (although not surprised) to find that someone had posted a video of a couple of teenagers on mopeds on the island where I live, driving too fast, and was asking who they were. The culprits were uncovered. In the main she was praised for stopping them, and names were mentioned, until the mother popped up in the thread.

This reminded me of something which happens in China, a practice called ‘cyber manhunt’. An individual does something bad, and a hunt is initiated to find him/her via social networks and other connected means; once found, their life is made a misery.

In this closed group there were almost 1,000 members, so the two teenagers were publicly exposed. They did something wrong, but that doesn’t matter; they didn’t deserve public humiliation. I also wonder: if adults are posting these kinds of videos of kids online, then clearly kids will not hesitate to do the same… and the consequences can be fatal, if a child takes their own life because of something posted about them to which they have not agreed.

It is therefore a delightful development, this new law which protects kids in the digital, connected age. How this will work in practice, we will see. From a practical perspective, I’m just wondering how a child under 5 will be able to consent to their pictures being posted online. But I’m sure there is something in the legal text which covers this…

Tracking kids in schools

It seems the school sector has got cold feet about the use of tracking technologies in schools. Since the decision by the Swedish DPA on the use of facial recognition biometrics, other schools are following suit.

A right to feel safe vs. a right to a private life – both human rights

The thing is that sometimes it is VERY useful to use tracking technologies, for example to protect vulnerable persons, i.e. small children and old people (who tend to wander). So the decision by the Norrköping kindergarten not to allow the tracking of toddlers and small children, by use of an armband, was a bad one IMHO.

As a parent, it would give me peace of mind. Human rights law says that we have a ‘right to feel safe’ and ‘a right to a private life’. These rights can often conflict with each other, which results in the wrong decisions being made. Hence, in fear of breaking the GDPR, a school has made a rather poor decision, rejecting a measure which has so many benefits for all. What’s more, RFID sensors are not biometrics, so this has no relation to the other decision. Sensors do not even need to be linked to an identity: all the school needs to know is whether they have lost a child, not which one… that they can work out pretty quickly by seeing which children they still have.
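To illustrate the identity-free point, here is a minimal sketch, with made-up tag IDs and not based on any real product, of a headcount check that only compares anonymous armband readings against the expected count and never maps a tag to a child:

```python
def headcount_check(expected_count: int, tags_seen: set) -> bool:
    """Return True if everyone is accounted for.

    The tags are random identifiers on the armbands; there is no table
    mapping a tag to a child's name, so the system can only answer
    'is anyone missing?', never 'who is missing?'.
    """
    return len(tags_seen) >= expected_count

# Example: 12 children on the outing, the reader currently sees 11 armbands.
tags_in_range = {"a91f", "b02c", "c7d3", "d441", "e9aa", "f013",
                 "0b2e", "1c9f", "2d30", "3e41", "4f52"}
if not headcount_check(expected_count=12, tags_seen=tags_in_range):
    print("Headcount short - do a manual check of the group")
```

Designed this way, the tracking answers the only question the staff actually have, while the personal data involved is close to nil.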

This points to another problem: decisions are made by persons who are not able to perform this careful balancing act and really identify the potential risk of harm to the natural person. In the case of the Norrköping school, I can see no risks which outweigh the benefits of a ‘right to feel safe’.

Thanks to Inge Frisk for bringing this decision in Norrköping to my attention.

SEK 200k fine for use of facial recognition in a Swedish school

Finally some action in Sweden!

The ruling is in Swedish, but to summarise: the school was using facial recognition on its students. Facial recognition produces biometric data, hence sensitive data (special categories of data in the GDPR). They used consent as the legal basis, but this was considered unlawful due to the imbalance in the relationship between the controller (the school) and the data subject (a student of 16+ years). Basically, the student had no choice.

But there is more. The Swedish data protection authority based their decision on the following:

  1. Art 5 – the personal data collected was intrusive, and more was collected than was needed for the purpose.
  2. Art 9 – the school did not have a legal exception to handle sensitive data. It is forbidden to process sensitive data unless such an exception applies.
  3. Art 35-36 – it seems that a DPIA was not done.

What does this mean for other schools, or indeed any public or private entity looking to use intrusive biometrics? Do a data protection impact assessment (DPIA); from there you will be able to get a clear picture of the potential risk of harm to the rights and freedoms of the data subject.
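As a rough illustration of what such an assessment captures, here is a minimal sketch of a DPIA record as a simple data structure; the field names are my own and not a template from any DPA:

```python
from dataclasses import dataclass, field

@dataclass
class DPIARecord:
    """Skeleton of the questions a DPIA forces you to answer up front."""
    processing_description: str          # what is being done, with which data
    purpose: str                         # why it is needed
    legal_basis: str                     # Art 6 basis, plus an Art 9 exception for sensitive data
    necessity_and_proportionality: str   # could a less intrusive measure achieve the purpose?
    risks_to_data_subjects: list = field(default_factory=list)
    mitigations: list = field(default_factory=list)

attendance_dpia = DPIARecord(
    processing_description="Facial recognition to register student attendance",
    purpose="Reduce time spent on manual attendance registration",
    legal_basis="Consent (problematic given the school/student imbalance)",
    necessity_and_proportionality="Manual or card-based registration achieves the same purpose",
    risks_to_data_subjects=["Processing of biometric data of minors", "Function creep"],
    mitigations=["Choose a less intrusive alternative"],
)
```

Had something like this been filled in before the pilot, the problems with the legal basis and the proportionality of biometrics would most likely have surfaced before the DPA got involved.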

For me personally and professionally, I’m just happy that China’s big brother approach has been nipped in the bud here in Sweden 🙂

PII Collection – Purpose Limitation & Proportionality

I’ve been publishing on the subject of personal privacy since 2007, and finally, now, in 2015, I decided to take my CIPP/E. According to the International Association of Privacy Professionals (IAPP), the CIPP credential says you know privacy laws and regulations and how to apply them.

Why did I take this certification? After all, I have a Master’s degree in Information Security from what is supposedly the most famous institution globally in this subject, Royal Holloway, University of London (RHUL). I also have an MBA from Henley Management School (University of Reading). On top of 20 years of rich experience in IT and information security, it looks as though I am in the ‘over-qualified, so what next?’ league. Or am I?

No! I am driven by a desire to ‘fix the Swedish ID promiscuity problem’. (There is more on this in my blog, in lots of posts.) I took the CIPP/E to get a toolkit I could use to stop my, and your, Swedish ID being publicly sold online without my or your consent! Now I finally understand what the problem is, and I believe I can solve it, to finally squash this conflict between ‘freedom of information’ laws and PuL. Watch this space…