Employer fined for employee fingerprints

Another interesting case. Each new case helps us better understand how to implement the GDPR and stay compliant in our organisations.

So this is a fine of €725k imposed by the Dutch DPA on an organisation which started using biometrics, i.e. fingerprint authentication. If you’ve checked the link above to the explanation provided by the DLA Piper blog, there are two factors which really surface.

Firstly, consent cannot be used as a legal basis if there is an imbalance in the relationship, and in the case of employer/employee this is always the case. If fingerprinting is to be used, then the employee needs to have a choice to use another method, e.g. access cards. In this example there was no choice; employees were forced to provide consent. Consent was not freely given.

Secondly, it seems that Dutch law provides a second legal basis for the use of biometrics for authentication and security purposes. However, this applies only if the use can be proved proportionate to the purpose. For example, using biometrics to control access to a high-security facility is proportionate; using them for access to ordinary office space is not.

What I love about this case is that it really emphasises the limits of consent in the employer/employee relationship.

Swedish DPA hands out a SEK200k fine to SSC, a government authority

All of us DPOs feeling a bit more relaxed in these Corona times… thinking that no data protection authority would be so unfeeling as to hand out a fine now… think again.

Hats off to the Swedish DPA (Datainspektionen). A fine of SEK200k has been issued to the Government Service Centre (SSC), which handles salaries and such for 47 Swedish government authorities. SSC is a processor to the 47 government authorities, although a controller for its own employees. It was a breach of the salary data of 282k employees, including SSC’s own.

In short, the cause was a technical flaw in an application (Primula), and SSC failed to report not only to the controllers but also to the DPA… but clearly the case is much more complex. Articles 24, 28, 32 and 33-34 are all quoted in the report from Datainspektionen. This is a really important case and gives some really great clarifications, not only on personal data breach notification but also on the responsibilities of the controller and processor.

I tried to map it out below. Basically there were 2 fines:

  1. For the delay in reporting the breach to the controllers (SEK150k), Article 33(2); and
  2. For the delay in reporting to the Swedish DPA (SEK50k), Article 33(1).

Basically, if you are a processor, a personal data breach must be reported to the controller without undue delay. As soon as you know you have a breach you will clearly want to know why it happened, but that has to wait. What is important first is to ascertain that a breach has happened and that it is personal data which has been breached. What happened in this case is that SSC wanted to understand more about why it happened, and the delays became serious.

On reporting as the controller, you have 72 hours to report to the DPA, and the same applies as above. During this time the priority is to ascertain that a breach has occurred and to gather enough data to know whether it poses a risk to the rights and freedoms of natural persons. The cause of the breach is a ‘nice to have’ at this stage; you can always send in an updated report later when you know. These are not my opinions, these are facts.
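To make the two timelines concrete, here is a minimal sketch in Python. The function name and strings are my own, purely illustrative of the Article 33 duties described above, and certainly not legal advice:

```python
from datetime import datetime, timedelta

# Art 33(1): a controller has 72 hours from awareness to notify the DPA.
DPA_DEADLINE = timedelta(hours=72)

def notification_duty(role: str, aware_at: datetime) -> str:
    """Return the reporting duty triggered the moment a breach is
    confirmed -- root-cause analysis can follow in an updated report."""
    if role == "processor":
        # Art 33(2): notify the controller without undue delay
        return "notify controller without undue delay"
    if role == "controller":
        deadline = aware_at + DPA_DEADLINE
        return f"notify DPA by {deadline.isoformat()} if there is a risk to data subjects"
    raise ValueError(f"unknown role: {role}")
```

Run against SSC’s situation, the processor duty fires the moment the breach is confirmed, regardless of whether the cause is understood; waiting for the ‘why’, as SSC did, is exactly what the timeline does not allow.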

There are always lots of politics which come into play when these kinds of incidents occur. I believe SSC wasn’t just lacking in its breach process; there were also conflicting opinions and dilemmas at play.

There was also a problem the DPA had with the Evry agreement: it was outdated and not compliant with the GDPR, but this only earned a warning, no fine.

I created some flows to get my head around this. The paper was quite long (and in Swedish) and contained loads of other detail, e.g. there was an employee who found the bug and exploited the hole, and ended up getting reported to the police… I think this was retracted by Datainspektionen later.

I hope you find this useful. Feel free to ask questions… as I mentioned at the beginning, this is a super interesting case!

CNIL DPO accreditation

Well, I was pretty impressed that France seemed to be first off the block to get some kind of official recognition for the DPO role. Organisations which train and certify DPOs can apply to be on the CNIL’s list of accredited organisations.

Great, I think. We need to apply… and by ‘we‘ I mean Privasee of course!

Privasee has DPO training which is accredited at 5 ECTS credits on exam completion under the Scottish Credit and Qualifications Framework, which equals a Level 6 certification on the EQF (European Qualifications Framework).

But Privasee will not apply. Why? Well, because it requires (1) inclusion of the French Data Protection Act in the training content, and (2) that candidates for CNIL accreditation first be accredited by an accreditation body pursuant to the ISO/IEC 17024:2012 standard.

There is no recognition at all of the academic accreditation which the Privasee CPP/EU-DPO has earned. The ISO standard mentioned above only verifies that the certification conforms to a specific scheme, whereas the academic accreditation that Privasee has earned for its DPO training has had both content and structure assessed.

Why are academic qualifications not included here? And why exclude all DPO training/certification organisations which are not French?

Flashback to when I was a security guy and the proud owner of an MSc in Information Security from Royal Holloway, University of London (RHUL, 2006), renowned as one of the best in the world for infosec/cybersecurity education, with gurus such as Prof. Fred Piper. I was nonetheless continually frustrated by the demand for the CISSP certification, which required an individual to read a book, memorise it and regurgitate it in multiple-choice test questions, whereas the Master’s degree, which many of us studied part-time or by distance learning over 2-4 years on top of a full-time job, was completely ignored. The headhunters had a search algorithm which searched for CISSP and NOT MSc. This hurts: as those of us who have completed the MSc will acknowledge, it is expensive, and then just because of an automated decision engine we were excluded from potential jobs.

Fast forward to now. I realise that with the GDPR, those recruiting may have a challenge with these kinds of automated decisions. I wonder when job applicants will cotton on to this?

And then back to the CNIL as a DPO certification accreditation body. As you’ve probably realised by now, I’m just a little bit peeved at being excluded again… maybe I’m taking this personally.

On the bright side, even the IAPP with the combined CIPP/E and CIPM (to be the DPO) will not be able to fulfil this requirement. The CIPP/E has nothing on French data protection.

Taking a practical approach: Privasee could theoretically get the ISO thingy, and if you are a French privacy/legal professional with a French business who would like to give this a bash, contact me and become a Privasee OWL Partner. The adaptation of the CPP/EU-DPO training to a CPP/Fr-DPO training would be minimal… IMHO.

Publishing pictures of your kids could become illegal in Sweden

More on kids, and Sweden is ahead of the trend, as is normal for children’s rights.

There is a new law being discussed (barnkonventionen, the UN Convention on the Rights of the Child becoming Swedish law) which looks as though it will come into effect in 2020, and which basically means that parents are not permitted to post pictures of their children online without their permission.

This came to my notice following a post I made in a private Facebook group, pointing out that posting pictures of children, or of any individual, without their permission goes against human rights and the right to a private life. I made this post because I was horrified (although not surprised) to find that someone had posted a video of a couple of teenagers driving their mopeds too fast on the island where I live, asking who they were. The culprits were uncovered. In the main the poster was praised for stopping them, and names were mentioned, until the mother popped up in the thread.

This reminded me of something which happens in China, a practice called ‘cyber manhunt‘. An individual does something bad, and a hunt is initiated to find him/her via social networks and other connected means; once found, their life is made a misery.

In this closed group there were almost 1,000 members, so the two teenagers were publicly exposed. Yes, they did something wrong, but they didn’t deserve public humiliation. I also wonder: if adults are posting these kinds of videos of kids online, then clearly kids will not hesitate to do the same… and the consequences can be fatal if a child takes their own life over something posted about them without their agreement.

The new law protecting kids in the digital, connected age is therefore a delightful development. How it will work in practice, we will see. From a practical perspective, I’m just wondering how a child under five will be able to consent to their pictures being posted online. But I’m sure there is something in the legal text which covers this…

Tracking kids in schools

It seems the school sector has gotten cold feet on the use of tracking technologies in schools. Since the decision by the Swedish DPA on the use of facial recognition biometrics, other schools are following suit.

A right to feel safe vs. a right to a private life – both human rights

The thing is that sometimes it is VERY useful to use tracking technologies, for example to protect vulnerable persons, i.e. small children and old people (who tend to wander). So the decision by the Norrköping kindergarten not to allow the tracking of toddlers/small children via armbands was, IMHO, a bad one.

As a parent it would give me peace of mind. Human rights law states that we have a ‘right to feel safe’ and a ‘right to a private life’. These rights can often conflict with each other, which results in the wrong decisions being made. Hence, in fear of breaking the GDPR, a school has made a rather incorrect decision and rejected something with so many benefits for all. What’s more, RFID/sensors are not biometrics, so this has no relation to the facial recognition decision. Sensors do not even need to be linked to an identity: all the school needs to know is whether they have lost a child, not which one… that they can work out pretty quickly by seeing which children they still have.

This presents another problem: decisions are made by persons who are not able to perform this careful balancing act and really identify the potential risk of harm to the natural person. In the case of the Norrköping school I can see no risks which outweigh the benefits to the ‘right to feel safe’.

Thanks to Inge Frisk for bringing this decision in Norrköping to my attention.

SEK200k fine for use of facial recognition in a Swedish school

Finally some action in Sweden!

The ruling is in Swedish, but to summarise: the school was using facial recognition on its students. Facial recognition data is biometric data, hence sensitive (special categories of data in the GDPR). The school used consent as the legal basis, but this was considered unlawful due to the imbalance in the relationship between the controller (school) and the data subject (student of 16+ yrs). Basically, the student had no choice.

But there is more. The Swedish data protection authority based their decision on the following:

  1. Art 5 – the personal data collected was intrusive, and more was collected than was needed for the purpose.
  2. Art 9 – the school did not have a legal exception allowing it to handle sensitive data. It is forbidden to collect sensitive data unless such an exception applies.
  3. Art 35-36 – it seems that a DPIA was not done.

What does this mean for other schools, or indeed any public or private entity looking to use intrusive biometrics? Do a data protection impact assessment (DPIA); from this you will be able to get a clear picture of the potential risk of harm to the rights and freedoms of the data subjects.

For me personally and professionally, I’m just happy that China’s big brother approach has been nipped in the bud here in Sweden 🙂

Knock knock … join our religion -and btw GDPR doesn’t apply to us!

I just loved this case decision in Finland, in which the EU court determined that Jehovah’s Witnesses must comply with the GDPR. In 2013, Finland’s Data Protection Supervisor prohibited the Jehovah’s Witnesses religious community from collecting or processing personal data in the course of door-to-door preaching by its members unless Finnish data protection legislation was observed.

The Jehovah’s Witnesses community created maps from which areas are allocated among the members who engage in preaching, and kept records about preachers and the number of the community’s publications distributed by them. In essence, they were collecting and processing personal data.

In its judgment, the European Court of Justice considered that the Jehovah’s Witnesses’ door-to-door preaching is not covered by the exceptions laid down by EU Law on the protection of personal data.

  1. Door-to-door preaching is protected by the fundamental right of freedom of conscience and religion enshrined in Article 10(1) of the Charter of Fundamental Rights of the European Union; but this does not,
  2. Confer an exclusively personal or household character on that activity, because it extends beyond the private sphere of a member of a religious community who is a preacher.

For the newbies here, this is about something called ‘material scope’ in the GDPR. You can liken ‘material scope’ (and there is also ‘territorial scope’) to scoping parameters for the GDPR.

Think about it as a project scope… it is almost cool to know that even legal documents have a scope, just like any project you may have driven or been a part of. What this means is that all the legal text in the GDPR is only relevant if personal data falls within the scope defined in Articles 2 and 3.

Material scope (Article 2)

The GDPR applies to the processing of personal data wholly or partly by automated means and to manual processing if the personal data form part of a filing system or are intended to form part of a filing system.
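Since the Article 2 definition above boils down to a two-part test, it can even be written as a one-line rule. This is a simplification of my own, in Python, purely to illustrate the logic:

```python
def in_material_scope(automated: bool, part_of_filing_system: bool) -> bool:
    """Article 2(1) in a nutshell (my own simplification): processing is in
    scope if it is wholly or partly automated, OR if it is manual but the
    data forms (or is intended to form) part of a filing system."""
    return automated or part_of_filing_system

# Manual collection, but ordered (e.g. by address) so retrieval is easy:
# that ordering makes it a filing system, and the processing is in scope.
print(in_material_scope(automated=False, part_of_filing_system=True))  # True
```

In other words, collecting data on paper does not by itself take you outside the GDPR; it is the structure you impose on that data which decides the question.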

Now back to the case.

  1. The Jehovah’s Witnesses claimed the ‘household exemption’, and hence that they were exempt from the GDPR. This was overruled, the court stating that the JW organisation and those knocking on doors collecting personal data were joint controllers.
  2. What material scope also states is that the data needs to be part of a ‘filing system’ of some kind, and the court stated that even though the data was collected manually, the mere ordering of it, e.g. by address during collection, which made retrieval easier, placed it in scope.

So there you have it… lovely example for the classroom IMHO 🙂

Safe Harbor, so what now?

I’ve been asked this question more than once, funnily enough. The fact is that even the Safe Harbor experts don’t have concrete answers 😉

Basically it’s business as usual until some way forward is found. Those companies that are following Safe Harbor practices today and tomorrow are not going to be penalized for this. It’s not their fault that what was considered legal last week is not this week!

There is a revised Safe Harbor that has been worked on for a couple of years now, which includes a restriction on U.S. government (intelligence) access to the personal data of non-Americans, but it has not been finalized yet. From what I understand, it has not been agreed precisely because the U.S. wants this exact point removed, which is exactly the motivation behind the ruling striking down Safe Harbor! I guess the EU and U.S. must fix this now.
I can imagine that Binding Corporate Rules (BCRs) will gain new momentum from here on. However, this is significant work for any company working across legal jurisdictions, and today it is only some of the really large global corporations who have BCRs in place and working.