Another interesting case. Each new case helps us better understand how to implement GDPR and stay compliant in our organisations.
So this is a fine of €725k by the Dutch DPA on an organisation which started using biometrics, i.e. fingerprint authentication. If you’ve checked the link above with the explanation provided by the DLA Piper blog, there are two factors which really stand out.
Firstly, consent cannot be used as a legal basis if there is an imbalance of power in the relationship, and in an employer/employee relationship this is always the case. If fingerprinting is to be used then the employee needs to have the choice of another method, e.g. access cards. In this example there was a lack of choice: employees were forced to give consent, so consent was not freely given.
Secondly, it seems that Dutch law provides an alternative legal basis for the use of biometrics for authentication and security purposes. However, this applies only if the processing can be shown to be proportionate to the purpose. For example, using fingerprints to control access to a high-security facility is proportionate; using them to control access to ordinary office space is not.
Why I love this case is that it really emphasises the limits of consent in the employer/employee relationship.
The thing is that sometimes it is VERY useful to use tracking technologies, for example to protect vulnerable persons, i.e. small children and old people (who tend to wander). So the decision by the Norrköping kindergarten not to allow the tracking of toddlers/small children by armband was, IMHO, a bad one.
As a parent it would give me peace of mind. Human rights law gives us both a ‘right to feel safe’ and a ‘right to a private life’. These rights can often conflict with each other, which results in the wrong decisions being made. Hence, in fear of breaking the GDPR, a school has made a rather poor decision about something which has so many benefits for all. What’s more, RFID/sensors are not biometrics, so this has no relation to the other decision. Sensors do not even need to be linked to an identity. All the school needs to know is whether it has lost a child, not which one… that it can work out pretty quickly by seeing which children it still has.
This points to another problem: decisions are being made by persons who are not able to perform this careful balancing act and really identify the potential risk of harm to the natural person. In the case of the Norrköping school I can see no risks which outweigh the benefits to the ‘right to feel safe’.
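To make the “count, don’t identify” point concrete, here is a minimal sketch of how such an armband scheme could avoid personal data altogether. All tag IDs, names and values below are hypothetical, invented purely for illustration; the point is only that the sensor system can answer “is anyone missing?” without ever mapping a tag to a child.

```python
# Privacy-preserving presence check (illustrative sketch, hypothetical IDs):
# the school only needs to know *whether* an armband is missing, not *whose*.
# Tag IDs are random and are never linked to a child's identity.

issued_tags = {"a91f", "b02c", "c77d", "d14e"}   # armbands handed out today


def missing_count(detected_tags: set) -> int:
    """Return how many issued armbands are currently out of sensor range."""
    return len(issued_tags - detected_tags)


# Gate sensor currently detects three of the four armbands:
print(missing_count({"a91f", "b02c", "c77d"}))   # 1 -> someone is missing
print(missing_count(issued_tags))                # 0 -> everyone accounted for
```

Because only anonymous tag counts are processed, a design like this would sit far lower on any risk-of-harm scale than the facial recognition discussed below.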
Thanks to Inge Frisk for bringing this decision in Norrköping to my attention.
The ruling is in Swedish, but to summarise: the school was using facial recognition on its students. Facial recognition is biometric data, hence sensitive (special categories of data in the GDPR). They used consent as the legal basis, but this was considered unlawful due to the imbalance in the relationship between the controller (the school) and the data subject (a student of 16+ yrs). Basically the student had no choice.
But there is more. The Swedish data protection authority based their decision on the following:
Art 5 – the personal data collected was intrusive, and more was collected than was needed for the purpose
Art 9 – the school had no legal exception permitting it to handle sensitive data. It is forbidden to collect sensitive data unless such an exception applies.
Art 35-36 – it seems that a DPIA was not done.
What does this mean for other schools, or indeed any public or private entity looking to use intrusive biometrics? Do a data protection impact assessment (DPIA); from this you will be able to get a clear picture of the potential risk of harm to the rights and freedoms of the data subject.
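As a rough illustration of how such a pre-screening can be structured (not legal advice), the EDPB’s DPIA guidelines list criteria such as sensitive data, vulnerable data subjects and innovative technology, and suggest that processing meeting two or more of them will usually require a DPIA. The criterion names below are my own paraphrases of those guidelines:

```python
# DPIA pre-screening sketch (illustrative only, not legal advice).
# Criterion names paraphrase the EDPB DPIA guidelines; the two-or-more
# threshold follows their rule of thumb.

DPIA_CRITERIA = {
    "evaluation_or_scoring",
    "systematic_monitoring",
    "sensitive_or_special_category_data",   # e.g. biometrics under Art 9
    "large_scale_processing",
    "vulnerable_data_subjects",             # e.g. children, employees
    "innovative_technology",
}


def dpia_recommended(triggered: set) -> bool:
    """Recommend a DPIA when two or more screening criteria are triggered."""
    unknown = triggered - DPIA_CRITERIA
    if unknown:
        raise ValueError(f"unknown criteria: {unknown}")
    return len(triggered) >= 2


# The school facial-recognition case ticks several boxes at once:
school_case = {"sensitive_or_special_category_data",
               "vulnerable_data_subjects",
               "innovative_technology"}
print(dpia_recommended(school_case))   # True
```

A checklist like this only tells you that a full DPIA is needed; the assessment itself still has to weigh the specific risks of harm against the purpose.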
For me personally and professionally, I’m just happy that China’s big brother approach has been nipped in the bud here in Sweden 🙂
The science fiction of the future is getting closer. You walk by a store, and the large digital screen advertising products presents goods tailored to your age using facial recognition technology. This is here today: both Adidas and Kraft have plans for this type of digital ad. Read more at Forbes (http://www.forbes.com/sites/kashmirhill/2011/09/01/kraft-to-use-facial-recognition-technology-to-give-you-macaroni-recipes/).
Now what happens if this technology is connected to Facebook, so that they don’t need to guess how old you are based on how you look, but can see it for a fact? And that you have children, dogs, cats, and whatever more there is to glean, based on how Facebook organises unstructured data into a structured format so it is easy to link and process.
Facial recognition technologies are now also being used to deny access to casinos in Las Vegas. Imagine if every club and bar could effectively do away with the traditional bouncer and implement this technology instead.
Previously I have talked a lot about storecards, RFID and how this type of invasion of privacy could make you vulnerable to tailored ads… although maybe you like this; it really depends on your viewpoint. Now, however, it may not matter whether you have a storecard, RFID or whatever: your face will reveal all, your FB account will feed the digital ads, and I guess you won’t have any say in this at all!
The largest biometric database is being created in India, with over 1.2 billion identities. Scary; and a serious concern beyond the personal privacy aspect is the security of this database. From a positive standpoint, the driving argument is that it provides the means to overcome the significant levels of corruption in India, particularly for those living outside the cities and at the mercy of corrupt officials. In fact this database, if implemented well, would free these very persons from certain tyranny. Another dilemma…
Interesting that participation is voluntary. Same as Sweden’s approach.
Read more here. It is fascinating reading and worth a visit 🙂 .
An excellent article on the use of CCTV, biometrics, databases, etc., in schools in the UK.
Can you imagine that, on the uncertainty of whether CCTV should be permissible in toilets, Sayner (managing director of Proxis, a security installation company) reasons that “it depends exactly on what it is looking at,” adding that “If you’ve got nothing to hide, why should you object to that?” I just love this “nothing to hide” argument. For myself, I’m not too keen on being the star of some camera footage when I visit the ladies’ room!
It’s not just the FBI that is keen to collect the DNA of innocent persons. In Australia, Mr McDevitt, chief executive of CrimTrac, the agency which maintains the database, said the next step was taking samples from people charged but not convicted, and from people charged with minor crimes as well as serious offences. Read more…
The federal government is quietly working on a controversial plan to collect biometric information from visitors to Canada, immigration department officials revealed yesterday.
“The idea will be that we will take biometrics from people who are coming temporarily to Canada and need a visitor’s visa — temporary workers, students and visitors,” Claudette Deschenes, assistant deputy minister in the immigration department told members of Parliament. “Those who don’t need a visitor’s visa to enter Canada will be taken at the port of entry.”
Following in the footsteps of U.S. border controls… fun. I wonder what they are really going to use it for… and, more importantly, what privacy laws exist in Canada to protect this biometric data? Read more at the Toronto Sun.
The Department of Motor Vehicles recently proposed a $63 million contract with a company that uses facial-recognition software, which can detect whether a person photographed for a new driver’s license already has a license. The software allows the DMV to match a photograph against the entire DMV database of driver’s license pictures. The risk identified by the privacy group is one of ‘mission creep’: that is, this technology being used to identify persons in other situations, such as in a crowd.
This move has been blocked. The DMV had requested to fast-track the new technology, which the agency says it is seeking in order to deter identity theft. The DMV sought permission from Gov. Arnold Schwarzenegger to sign the contract as early as this week, without the scrutiny of public hearings. This is a victory for privacy-rights groups, as the proposal will now have to undergo a public hearing.