Culture Change During this Momentous Time

I watched the congressional testimony on Capitol Hill today regarding the pandemic, and listened to the medical experts from NIH/NIAID, CDC, FDA, and the Administration. Their observations got me thinking about the concept of culture change and how much we are hearing about how the pandemic is changing (or going to change) cultural norms and expectations. We all know that the pandemic is having an impact on the field of privacy and how that has been operationalized in the past few years (GDPR, etc.).

Changing a corporate culture so that it is (more) privacy-centric is one thing. Changing a corporate culture as a result of the pandemic, and what that is going to mean across many countries, is another. Accomplishing both simultaneously is a tall order, but I think it is possible if people remember that change has both technical and interpersonal/human components.

These times require much more human understanding and flexibility than we are used to. But if people remember what the goal is, and work together, it will get done.

Thoughts?

An interesting twist in the ‘cookie walls’ saga.

France’s Council of State has ordered the CNIL (the French data protection watchdog) to cancel parts of its guidelines on cookies, as the ban on cookie walls was not valid. The court explained that the CNIL exceeded its mandate under so-called “flexible law”, which refers to instruments, such as regulatory authorities’ guidelines, that do not create legal rights or obligations.

Although a recent update of the EDPB Guidelines on consent invalidated ‘cookie walls’, our patient may still be very much alive: similar court decisions may well follow in other Member States.

Recently, the BfDI (German watchdog) said that “cookie-walls are permitted if a comparable service is also offered without tracking, for example, as a paid service”. This happened right after the update of the EDPB Guidelines on consent came out.

Original text of the decision is in French:

https://www.conseil-etat.fr/actualites/actualites/le-conseil-d-etat-annule-partiellement-les-lignes-directrices-de-la-cnil-relatives-aux-cookies-et-autres-traceurs-de-connexion

Breaking news: EDPB has published the “one-stop-shop” decision register.

Besides being a great tool for privacy pros to keep up to date with extensive case law, it also raises overall awareness of how data protection law is applied in cooperation between the lead DPA and the other DPAs concerned (Article 60 GDPR).

As I expect more commentary on this in the days and weeks to come, for now just two interesting points:

– most cases published so far are related to data subject rights and lawfulness of the processing;

– so far, lead DPAs have issued more compliance orders and reprimands than fines.

To read more – see below.

https://edpb.europa.eu/news/news/2020/edpb-publishes-new-register-containing-one-stop-shop-decisions_en

PwC vs. employee privacy

PwC has developed a facial recognition tool that logs when employees are absent from their computer screens while working from home. In particular, employees have to give a specific excuse for any absence (including toilet breaks).

Too invasive? No doubt. Disproportionate, with no likely legal grounds? WP29 Opinion 2/2017 on data processing at work suggests a positive answer, especially given that the tool monitors employees in their own homes.

Predictably, this caused a barrage of criticism from privacy advocates, followed by PwC’s unconvincing explanation that the tool helps “support the compliance environment required for traders and front office staff in financial institutions”.

Read below to learn more:

At the same time, there may be more to this than meets the eye: monitoring employees in their homes may also occasionally involve monitoring their family members through webcams. Besides, depending on technical peculiarities and the ability to scan the background of private premises, such monitoring may also reveal special categories of data, e.g. about employees’ sex life or religious beliefs (Article 9 of the GDPR).

Interplay between the GDPR Articles 25 (‘Data protection by design’, DPbD) and 35 (DPIA).

One is not a ‘special case’ of the other, as it may seem prima facie. The KEY consideration here is that a DPIA is conducted before rolling out new projects that involve high-risk processing operations, and is thus tailored specifically to them. In contrast, DPbD comes into play at the very earliest stage of a data controller’s lifecycle and applies to every processing activity (not only high-risk ones), including core ones.

Similarly, a DPIA may indicate whether the particular processing is in line with the controller’s privacy policy in the context of the project at issue, but it will not evaluate the content of that policy itself.

This leads to a clear understanding that a DPIA is not a substitute for DPbD and, hence, is not the answer on its own.

Further to this, it should also be noted that DPbD has recently received increased attention from the EDPB (see Guidelines 4/2019) and from national watchdogs in Romania, Greece and Germany, which have issued fines for non-compliance with Article 25.

More to read on this – in an article by IAPP authors (see below):

https://iapp.org/news/a/privacy-by-design-gdprs-sleeping-giant/

Status of non-EU processors under Article 3(2) GDPR

A thorough analysis of the clear points and grey areas of the EDPB Guidelines 3/2018 on territorial scope.

My attention was drawn, in particular, to a friendly reminder that the status of a non-EU processor under Article 3(2) is dual:

  • it is indirectly affected by the GDPR if it carries out processing on behalf of an EU controller (through the data processing agreement under Article 28 and the Chapter V obligations);
  • it is directly caught by the GDPR if the processing activities it carries out on behalf of a controller meet the ‘targeting criterion’ within the meaning of Articles 3(2)(a) and 3(2)(b).

More to read – see below.

https://www.globalprivacyblog.com/gdpr/edpb-guidelines-what-is-the-territorial-reach-of-the-gdpr/?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+GlobalPrivacyAndSecurityComplianceLawBlog+%28Global+Privacy+and+Security+Compliance+Law+Blog%29#page=1

Ticking time-bomb in the EDPB Guidelines on consent?

An old rule every privacy pro has learnt by heart: a “risk of negative consequences (e.g. substantial extra costs)” for the data subject = no freely given consent.

Substantial. But what if the extra costs are not substantial? What if, say, $10 turns into $11 when you refuse to consent? Is that OK?

At least, the German watchdog seems to say yes. Some privacy pros agree (see below).

One could say that I am being picky; indeed, a $1 or even $10 surcharge is unlikely to lead to bankruptcy. But what happens if this practice becomes commonplace? Right: the data subject will overpay every time he or she is asked to give consent. $1 per requested consent turns into $10 per ten requests. How long could that ‘receipt’ be a year from now? Five years from now?

Here we get to the crux of the matter. While the EDPB considers substantiality in the context of a one-off consent request, it does not address the aftermath once overcharging becomes the rule.

https://www.bclplaw.com/en-GB/insights/does-the-gdpr-prohibit-charging-more-to-consumers-that-do-not-consent-to-certain-types-of-processing.html

CJEU & legitimate interest in scope: what the controller should remember.

The CJEU gave its judgment in a preliminary ruling on whether Articles 6(1)(c) and 7(f) of the Data Protection Directive (95/46/EC) precluded national law from allowing the installation of a CCTV system in the common parts of a residential building on the basis of legitimate interest (Case C-708/18).

The overall answer is “no”. But what else is in there for data protection pros?

Well, the CJEU brought the critical cornerstones of legitimate interest as a legal basis back to data controllers’ attention:

– there must be a present and effective legitimate interest (‘purpose test’);

– the processing at issue must be strictly necessary, i.e. the purpose “cannot reasonably be as effectively achieved by other means less restrictive of the fundamental freedoms” (‘necessity test’); this is closely intertwined with the ‘data minimisation’ principle;

– a balancing test must be conducted (ref. WP29 Opinion 06/2014 on the notion of legitimate interests).

More to read:

https://www.rpc.co.uk/snapshots/data-protection/cjeus-cctv-ruling-guidance-on-legitimate-interests-processing/

Bring the forces back!

We have spoken a lot about WFH, but what about the “return to office”? Here are some tips for a seamless return from a privacy perspective. Firstly, be careful with sensitive data. If you are processing test results, these are health data and hence sensitive data, so you need an Article 9 condition. The relevant condition will be the employment ground in Article 9(2)(b).

Demonstrate accountability through a DPIA. This DPIA should set out the following (see the sketch after this list):

  • the activity being proposed;
  • the data protection risks;
  • whether the proposed activity is necessary and proportionate;
  • the mitigating actions that can be put in place to counter the risks; and
  • a plan or confirmation that mitigation has been effective.
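For teams that like to keep this documentation structured, here is a minimal sketch of what such a DPIA record could look like; the Python representation, field names and example value are illustrative assumptions only, not a format prescribed by the GDPR or any supervisory authority.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class DPIARecord:
        proposed_activity: str                                # the activity being proposed
        risks: List[str] = field(default_factory=list)        # the data protection risks identified
        necessity_and_proportionality: str = ""               # reasoning on necessity and proportionality
        mitigations: List[str] = field(default_factory=list)  # mitigating actions to counter the risks
        mitigation_effective: str = "review pending"          # plan or confirmation that mitigation worked

    # Hypothetical example of use
    record = DPIARecord(proposed_activity="On-site COVID-19 testing of returning staff")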

Collect the minimum amount of data. For example, you probably only require information about the result of a test, rather than additional details about underlying conditions.

Keep the data accurate. Record the date of any test result to pin it to a particular time period. The health status of individuals may change over time, and a test result may no longer be valid.
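To illustrate how minimisation and accuracy can work together, here is a hypothetical sketch of a data-minimised test-result record: only the result and its date are kept, and the date lets stale results be set aside. The field names and the 14-day validity window are assumptions for illustration only, not regulatory guidance.

    from dataclasses import dataclass
    from datetime import date

    @dataclass
    class TestResultRecord:
        employee_id: str   # internal identifier only; no details about underlying conditions
        positive: bool     # just the result of the test, nothing more
        test_date: date    # pins the result to a particular time period

        def is_stale(self, as_of: date, validity_days: int = 14) -> bool:
            # A result older than the chosen validity window should no longer be relied on.
            return (as_of - self.test_date).days > validity_days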

Keep lists of affected employees secure. Work with your HR teams or other site leaders to ensure restricted access, password protection, etc.

Transparency is crucial, so a privacy notice to staff will be required prior to processing. This doesn’t have to be “legalistic”: it could be beneficial to write a short note to colleagues letting them know how you plan to support them and their families in case of infection.

Belgian data protection watchdog sends controversial ‘message’ with regard to non-profit data controllers.

An interesting GDPR enforcement case came from Belgium in late May. Imagine a data controller that sends unsolicited postal communications and ignores the data subject rights to object (Article 21) and to be forgotten (Article 17). On top of that, it misidentified the legal basis and relied on legitimate interest instead of consent (and, of course, conducted no balancing exercise and put no safeguards in place).

What could happen to such a data protection ‘nihilist’? Article 83(5) suggests that its DPO may start looking for another job. However, things may turn upside down if the controller is a… non-profit organisation.

Not to keep you in unnecessary suspense: the data controller in the case above was fined a mere EUR 1,000 (nope, I did not drop any zeros). Of course, this factored in that it was the first case against this organisation and that the controller is a non-profit with no regular turnover.

This may all be true, but such ‘enforcement’ arguably tears at the fabric of the GDPR, as it effectively gives all non-profit organisations carte blanche to violate it ‘tastefully’ the first time around.

More details on this case: