Observations on Office Re-Engineering: Privacy Offices and Research Offices

Earlier today I had the opportunity to watch the highly useful IAPP webinar entitled What Works: Benchmarking and Improving your Privacy Program. I was particularly intrigued by the comments on improving and re-engineering a privacy office. The presenters emphasized the constant evolution of privacy regimes on a global scale and stressed that, today, adaptability and flexibility are key for people and for structures (such as a privacy office).

That got me thinking about a large part of my career to date: the establishment and re-engineering of research offices at American universities. By “research” I mean the administration of grants, contracts, and other legal instruments that support faculty research. International grants and contracts are a large component of this area. For instance, the NIH (National Institutes of Health) in Bethesda, Maryland, funds research undertaken by European scientists. That global dimension will only continue to increase in a post-pandemic world, although it appears that a robust European posture towards research is in question as I write this.

My own involvement with the establishment and re-engineering of research offices began at Northwestern University in Evanston, Illinois. We faced a major challenge at NU: re-engineering operations while maintaining the administration of $165M USD in research funding. Subsequently, I established two research offices at smaller universities and then established a contracts/industrial agreements office at a larger university in Texas. While at the latter institution I oversaw two additional reorganizations that built upon the original office.

Those universities provided me with a lifetime of unique and challenging experiences. So, here are my thoughts and observations on best practices for building and re-engineering offices, along with specific comments on the privacy office context:

  1. Every university research office was designed to be public facing, client (faculty)-oriented, and collegial with other university offices. It was critical that the research office work effectively with other university offices. What is the parallel situation in privacy? A privacy office that works collaboratively with a security office (or any other office, for that matter).
  2. No research office was meant to operate as an “island” or a “silo.” A privacy office should not be its own island or silo within a company or other organization.
  3. One particular aspect of these offices was that they were designed for staff to “get out” into the greater community of the university and beyond. It seems to me that privacy office personnel serve in a similar capacity within their environment.
  4. When re-engineering an office, particular attention must be paid to client satisfaction and “upping your game.” What does your office do well in Version 1.0, and what do you want to do well in Version 2.0? What pressure points need to be eliminated?
  5. Professional development opportunities for staff must be plentiful. I see this as a common thread between the privacy and research worlds. When you think about it, both areas are intellectually vibrant and subject to rapid change. While it is important to stay abreast of such change, getting ahead of it is preferable.
  6. How are you going to measure office success? What are the metrics or KPIs? In the realm of research contracting, for instance, one such measure is the length of time to get a contract negotiated and signed. In privacy, one such metric is the length of time it takes to respond to DSARs.
  7. Lastly, the human/interpersonal dimension of an office is just as important as the technical/legal dimension. Not only must the office be enjoyable for the staff to work in, it must also be viewed as, and in reality be, an enjoyable partner within the environment(s) in which it operates. Research management and privacy management are truly Art + Science.

Research offices and privacy offices have more in common than many people might think. Both operate in a rapidly changing global environment and both are intellectually vibrant. It will be quite interesting to see how these offices function and change over the next few years.

On the crucial importance of TOMs under GDPR Article 32

The DPA of Baden-Württemberg (Germany) fined a health insurance company EUR 1,240,000 for insufficient implementation of TOMs, which resulted in the personal data of approximately 500 individuals being accidentally processed for advertising purposes without valid consent.

The fine is quite high, especially given the mitigating factors in this case:

  • relatively few data subjects were concerned
  • the company cooperated with the DPA
  • TOMs were not absent altogether; their level of implementation was merely insufficient

In addition, no data breaches or other factors posing a (high) risk to data subjects were identified.

The investigation resulted in one of the highest fines issued under Article 32 (if not the highest). This can be explained, in particular, by the adoption of the German model for calculating fines under the GDPR.

In any case, this is another reminder for controllers and processors of the importance of putting in place TOMs appropriate to the risk, as merely ‘somewhat good’ TOMs are unlikely to be enough.

More to read – see below.

https://digital.freshfields.com/post/102garn/1-2m-fine-in-germany-for-failure-to-implement-appropriate-toms

TikTok moves under the control of the Irish DPC

From 29 July 2020 onwards, TikTok Ireland will control the data of all users in the EEA and Switzerland.

Nothing unusual here: just another smart move by a non-EEA company (the parent company, TikTok Inc., is incorporated in the US) attempting to use the one-stop-shop mechanism via its EEA subsidiaries.

Except for one thing. The recent French scenario, where the CNIL issued an administrative fine directly to Google LLC (US) instead of its EU subsidiary (and this was upheld by the Conseil d'État), may become a real problem even if TikTok secures the support of the Irish authorities.

The Conseil d'État's decision probably ended the era of so-called ‘delegated controllership’. If this approach is supported by other DPAs, it will affect all non-EU ‘factual’ controllers wishing to use the one-stop-shop mechanism. Think about it, TikTok.

An interesting twist in the ‘cookie walls’ saga.

France’s Council of State has ordered the CNIL (the French data protection watchdog) to cancel parts of its guidelines on cookies, as the ban on cookie walls was not valid. The court explained that the CNIL had exceeded its mandate under so-called “flexible law”, which refers to instruments, such as regulatory authorities’ guidelines, that do not create legal rights or obligations.

Although a recent update of the EDPB Guidelines on consent invalidated ‘cookie walls’, our patient may still be very much alive: similar court decisions may follow in other Member States.

Recently, the BfDI (German watchdog) said that “cookie-walls are permitted if a comparable service is also offered without tracking, for example, as a paid service”. This happened right after the update of the EDPB Guidelines on consent came out.

The original text of the decision is in French:

https://www.conseil-etat.fr/actualites/actualites/le-conseil-d-etat-annule-partiellement-les-lignes-directrices-de-la-cnil-relatives-aux-cookies-et-autres-traceurs-de-connexion

Breaking news: the EDPB has published the “one-stop-shop” decision register.

Besides being a great tool for privacy pros to keep up to date with the growing body of case law, it also raises overall awareness of how data protection law is applied in cooperation between the lead DPA and the other DPAs concerned (GDPR Article 60).

As I expect more commentary on this in the days and weeks to come, for now just two interesting points:

– most cases published so far are related to data subject rights and lawfulness of the processing;

– so far, lead DPAs have issued more compliance orders and reprimands than fines.

To read more – see below.

https://edpb.europa.eu/news/news/2020/edpb-publishes-new-register-containing-one-stop-shop-decisions_en

Interplay between the GDPR Articles 25 (‘Data protection by design’, DPbD) and 35 (DPIA).

Neither is a ‘special case’ of the other, as it may seem prima facie. The KEY consideration here is that a DPIA is conducted prior to rolling out new projects that involve high-risk data processing operations, and it is therefore tailored specifically to them. In contrast, DPbD comes into play at the very earliest stage of a data controller's lifecycle and applies to every processing activity (not only those posing a high risk), including core ones.

Similarly, a DPIA may say whether the particular processing is in line with the controller's privacy policy in the context of the project at issue, but it will not evaluate the content of that policy itself.

This leads to a clear understanding that a DPIA is not a substitute for DPbD and, hence, cannot be the answer on its own.

Further to this, it should also be noted that DPbD has recently received increased attention from the EDPB (see Guidelines 4/2019) and from national watchdogs in Romania, Greece and Germany, which have issued fines for non-compliance with Article 25.

More to read on this in an article from IAPP authors (see below):

https://iapp.org/news/a/privacy-by-design-gdprs-sleeping-giant/

CJEU & legitimate interest in scope: what the controller should remember.

The CJEU gave judgment on a preliminary ruling on whether Articles 6(1)(c) and 7(f) of the Data Protection Directive (95/46/EC) preclude national law from allowing the installation of a CCTV system in the common parts of a residential building on the basis of legitimate interest (Case C-708/18).

The overall answer is “No, they do not”. But what else is in it for data protection pros?

Well, the CJEU brought back to the attention of data controllers the critical cornerstones of legitimate interest as a legal basis:

– there must be a present and effective legitimate interest (the ‘purpose test’);

– the processing at issue must be strictly necessary, i.e. the purpose “cannot reasonably be as effectively achieved by other means less restrictive of the fundamental freedoms” (the ‘necessity test’). This is closely intertwined with the ‘data minimisation’ principle;

– a balancing test must be conducted (ref. WP29 Opinion 06/2014 on the notion of legitimate interests).

More to read:

https://www.rpc.co.uk/snapshots/data-protection/cjeus-cctv-ruling-guidance-on-legitimate-interests-processing/

The Belgian data protection watchdog sends a controversial ‘message’ with regard to non-profit data controllers.

An interesting GDPR enforcement case came from Belgium in late May. Imagine a data controller that sends unsolicited postal communications and ignores the data subject rights to object (Article 21) and to be forgotten (Article 17). On top of that, it misidentified the legal basis and relied on legitimate interest instead of consent (of course, no balancing exercise was conducted and no safeguards were put in place).

What could happen to such a data protection ‘nihilist’? Article 83(5) suggests that its DPO may start looking for another job. However, things may turn out quite differently if the controller is a… non-profit organisation.

Not to keep you in unnecessary suspense: the data controller in the case above was fined a mere EUR 1,000 (no, I did not drop any zeros). This, of course, factored in that it was the first case against this organisation and that the controller is a non-profit with no regular turnover.

This may all be true, but such ‘enforcement’ arguably tears at the fabric of the GDPR, as it effectively gives all non-profit organisations carte blanche to violate it ‘tastefully’ the first time.

More details on this case:

‘Privacy by design’: does it all begin with corporate privacy culture?

In scope: useful hands-on guidance from IAPP authors for privacy pros on what to focus on when taking the very first steps to internalize the PbD principle.

For those of us buried under tons of privacy-related papers, it may come as a surprise that the author suggests beginning with internal privacy culture and securing C-level buy-in. However, this is in fact very true. At the very least it will make people listen, although it is, of course, not enough on its own. Click below to find out what should come next.

https://iapp.org/news/a/how-to-operationalize-privacy-by-design/

A “purpose” element: what is inside the controller’s mind?

In ‘Opinion 4/2007’ on the concept of personal data, the Article 29 Working Party (‘WP29’) identified four building blocks in the definition of personal data: ‘any information’, ‘relating to’, ‘identified or identifiable’, and ‘natural person’. These remained the same in the GDPR, which keeps ‘Opinion 4/2007’ relevant for understanding the concept of personal data.

However, instead of eliminating subjectivity to the extent possible, WP29 seemed to add some uncertainty to its explanation of what ‘relating to’ means.

WP29 sets out that ‘in order to consider that the data “relate” to an individual, a “content” element OR a “purpose” element OR a “result” element should be present’. In turn, a ‘“purpose” element can be considered to exist when the data are used or are likely to be used, taking into account all the circumstances surrounding the precise case, with the purpose to evaluate, treat in a certain way or influence the status or behaviour of an individual’.

In itself, the idea of deciding whether data are personal or not through the interpretation of the “purpose” element is quite controversial due to the subjective (rather than objective) nature of the notion of purpose.

An example given by WP29 brings this problem front and center:

Passenger vehicles owned by a transportation company suffer repeated damage when they are dirtied with graffiti. In order to evaluate the damage and to facilitate the exercise of legal claims against their authors, the company organises a register containing information about the circumstances of the damage, as well as images of the damaged items and of the “tags” or “signature” of the author. At the moment of entering the information into the register, the authors of the damage are not known nor to whom the “signature” corresponds. It may well happen that it will never be known. However, the purpose of the processing is precisely to identify individuals to whom the information relates as the authors of the damage, so as to be able to exercise legal claims against them. Such processing makes sense if the data controller expects as “reasonably likely” that there will one day be means to identify the individual. The information contained in the pictures should be considered as relating to “identifiable” individuals, the information in the register as “personal data”, and the processing should be subject to the data protection rules, which allow such processing as legitimate under certain circumstances and subject to certain safeguards.

Most likely, it is only common sense that leads to the conclusion that the purpose is precisely to identify the authors of the graffiti. However, the controller could potentially argue that it keeps the register and images for other internal purposes unconnected with future identification. As a result, we may end up in a discussion about the controller's true intentions, which may not be easy to establish for lack of factual grounds.

The issue described above may prima facie seem purely theoretical. Moreover, the language of the GDPR contains various ‘floating’ criteria implying the need for case-by-case evaluation. However, one should not overlook that, by applying the concept of purpose as described above, we decide whether the data are personal or not, and a positive answer inevitably triggers the set of responsibilities vested in the controller under the GDPR and Member State laws. Arguably, more certainty is needed when addressing such a fundamental issue, which may (or may not) trigger the application of data protection legislation in the first place.

Interestingly, the GDPR suffers from the same flaw as the WP29 ‘Opinion 4/2007’. Under Article 9(1), the processing of biometric data for the purpose of uniquely identifying a natural person is prohibited (unless one of the exemptions under Article 9(2) applies). This brings us back to the issue of identifying the controller's intention. Ironically enough, Recital 51 applies a more objective criterion when addressing the same issue:

“The processing of photographs should not systematically be considered to be processing of special categories of personal data as they are covered by the definition of biometric data only when processed through a specific technical means allowing the unique identification or authentication of a natural person.”

In other words, under Recital 51 it is the ability of the technical means to identify individuals that plays the key role (and not just the purposes pursued by the controller). Unfortunately, this wording was changed in Article 9(1), which requires identifying subjective purposes (instead of objective abilities).