Earlier today I had the opportunity to watch the highly useful IAPP webinar entitled "What Works: Benchmarking and Improving Your Privacy Program." I was particularly intrigued by the comments on improving and re-engineering a privacy office. The presenters emphasized the constant evolution of privacy regimes on a global scale, and that adaptability and flexibility are now key for people and structures (such as a privacy office).
That got me thinking about a large part of my career to date: the establishment and re-engineering of research offices at American universities. By "research" I mean the administration of grants, contracts, and other legal instruments that support faculty research. International grants and contracts are a large component in this area. For instance, the NIH (National Institutes of Health) in Bethesda, Maryland, funds research undertaken by European scientists. That global dimension will only continue to increase in a post-pandemic world, although it appears that a robust European posture towards research is in question as I write this.
My own involvement with the establishment and re-engineering of research offices began at Northwestern University in Evanston, Illinois. We had a major challenge at NU, as we were re-engineering operations while maintaining the administration of $165M USD in research funding. After that, I established two research offices at smaller universities and then established a contracts and industrial agreements office at a larger university in Texas. While at the latter institution I oversaw two additional reorganizations that built upon the original office.
Those universities provided me with a lifetime of unique and challenging experiences. So, here are my thoughts and observations on best practices for building and re-engineering offices, along with specific comments on the privacy office context:
Every university research office was designed to be public facing, client (faculty)-oriented, and collegial with other university offices. It was critical that the research office work effectively with other university offices. What is the parallel situation in privacy? A privacy office that works collaboratively with a security office (or any other office, for that matter).
No research office was meant to operate as an “island” or a “silo.” A privacy office should not be its own island or silo within a company or other organization.
One particular aspect of these offices was that they were designed for staff to "get out" into the greater community of the university, and beyond. It seems to me that privacy office personnel serve in a similar capacity within their environment.
When re-engineering an office, particular attention must be paid to client satisfaction and "upping your game." What does your office do well in Version 1.0, and what do you want to do well in Version 2.0? What pressure points need to be eliminated?
Professional development opportunities for staff must be plentiful. I see this as a common thread between the privacy and research worlds. When you think about it, both areas are intellectually vibrant and subject to rapid change. While it is important to stay abreast of such change, getting ahead of it is preferable.
How are you going to measure office success? What are the metrics or KPIs? In the realm of research contracting, for instance, one such measure is the length of time to get a contract negotiated and signed. In privacy, one such metric is the length of time it takes to respond to DSARs.
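To make such a metric concrete, here is a minimal sketch in Python of computing an average turnaround time. The field names, dates, and helper function are illustrative assumptions of my own, but the same calculation works for contract negotiation times or DSAR response times alike:

```python
from datetime import date

def average_turnaround_days(requests):
    """Average days from receipt to completion, counting only closed requests."""
    closed = [r for r in requests if r.get("completed") is not None]
    if not closed:
        return None  # no closed requests yet, so no meaningful average
    total = sum((r["completed"] - r["received"]).days for r in closed)
    return total / len(closed)

# Illustrative data: two closed DSARs and one still open.
dsars = [
    {"received": date(2020, 6, 1), "completed": date(2020, 6, 20)},  # 19 days
    {"received": date(2020, 6, 5), "completed": date(2020, 6, 15)},  # 10 days
    {"received": date(2020, 7, 1), "completed": None},               # still open
]
print(average_turnaround_days(dsars))  # 14.5
```

A real office would likely track a median or percentile as well, since a few unusually slow matters can skew an average.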
Lastly, the human and interpersonal dimension of an office is just as important as the technical and legal dimension. Not only must the office be enjoyable for the staff to work in, it must also be seen as, and actually be, an enjoyable partner within the environments in which it operates. Research management and privacy management are truly Art + Science.
Research offices and privacy offices have more in common than many people might think. Both operate in a rapidly changing global environment and are intellectually vibrant. It will be quite interesting to see how these offices function and change over the next few years.
The IAPP Privacy Advisor published an excellent article on 23 June entitled "The thin line between privacy and antitrust." In particular, the three scenarios presented by the authors are concise introductions to the important ways privacy issues may arise in antitrust matters and investigations, and to how privacy and antitrust will become increasingly linked going forward.
As someone who has worked at the nexus of antitrust and privacy for the past couple of years, and who has been involved in ten such U.S. matters (involving the U.S. Federal Trade Commission and the U.S. Department of Justice), I have the following general observations to share:
It is important to be extremely careful in internal corporate communications when it comes to privacy issues as discussed by those "in the know." That may sound like an obvious piece of common sense, but I have been shocked by how inappropriate and sloppy corporate leaders (from the CEO on down) can be when it comes to privacy discussions in antitrust matters. Email makes it easy to fire off one's thoughts, but discipline of thought and tact are incredibly important.
I have been pleased by the awareness of company personnel when it comes to sensitive personal information (PHI, PII, etc.). Very impressed.
I have seen little discussion of privacy as a basic human right. Much more work needs to be done in the U.S. in terms of cultural change. Privacy pros, as they know, are excellent ambassadors for that point of view.
In some situations, discussions of privacy issues were subtly couched in ways to restrain competition in the industry. As everyone here knows, never say that. As well-versed antitrust lawyers also know, sometimes corporate leaders and counsel cease writing emails on a topic and continue the discussion on the phone.
Some of the matters I have been involved with concerned mergers in which obtaining the acquired company's data is one proposed benefit of the deal. The authors' discussion in the section entitled "Sharing data raises privacy concerns" is spot on and bears multiple readings. Once again, if you view data protection and privacy as a basic human right, there should be no question that a more rigorous conception of those topics is necessary from Day One. Privacy should be baked into the company's DNA, and a newly merged entity is an excellent opportunity to make that a reality.
The section in the IAPP article focusing on nascent competition is especially pertinent for the future, though with the pandemic in full force it remains to be seen what the final damage to the U.S. economy will be, and how that will ultimately change corporate leadership, especially with regard to the privacy/antitrust relationship.
If you are interested in contract negotiation best practices, check out my discussion in the latest installment of NCURA YouTube Tuesday (National Council of University Research Administrators). Not exactly concerned with privacy per se, but I would consider the topic to be within the larger universe of privacy issues.
Privacy is a fundamental human right recognized in the UN Declaration of Human Rights, the International Covenant on Civil and Political Rights, and many other international and regional treaties. Privacy underpins human dignity and other key values such as freedom of association and freedom of speech. It has become one of the most important human rights issues of the modern age. And yet, for many, the GDPR is the beginning of privacy law as we know it, the most remarkable difference being the introduction of some truly sizeable fines. So how does this affect the ethics of privacy?
Privacy is, by its nature, an element of compliance. Compliance with privacy laws, and with the "intention" of privacy laws, is how we demonstrate optimal data protection. When talking of compliance, I always say that "compliance is not just about doing the right thing, but about showing we are doing the right thing." Compliance is only possible with accountability. No one ever challenges the idea that compliance is about doing the right thing. We should remodel our approach to privacy away from mere compliance with the law and towards the behaviour of doing the right thing. The GDPR helps us show we are doing the right thing; it helps us demonstrate our accountability, but it is not the reason privacy exists.
Why is this important for companies? Privacy is now a central element of business ethics. It forms part of the corporate approach to mitigating controversial subjects in order to gain public trust and support. No matter the industry, data is essential to the functioning of business. Without an ethical approach to handling data, it will not be entrusted to those who need it most to make business turn. Such an approach also maintains reputation, helps avoid significant financial and legal issues, and thus ultimately benefits everyone involved.
From 29 July 2020 onwards, TikTok Ireland will control the data of all users in the EEA and Switzerland.
Nothing specific, just another smart move by a non-EEA company (parent company TikTok Inc., incorporated in the U.S.) attempting to use the one-stop-shop mechanism via its EEA subsidiaries.
Except for one thing. In a recent French case, the CNIL issued an administrative fine directly to Google LLC (US) instead of its EU subsidiary, and this was upheld by the Conseil d'État. That precedent may become a real problem if TikTok counts on support from the Irish authorities.
The Conseil d'État's decision has probably ended the era of so-called "delegated controllership." If supported by other DPAs, this will affect all non-EU "factual" controllers wishing to use the one-stop-shop mechanism. Think about it, TikTok.
France's Council of State has ordered the CNIL (the French data protection watchdog) to cancel parts of its guidelines on cookies, as the ban on cookie walls was not valid. The court explained that the CNIL exceeded its specific mandate under the doctrine of "flexible law," which covers instruments, such as regulatory authorities' guidelines, that do not create a legal right or obligation.
Although a recent update of the EDPB Guidelines on consent invalidated "cookie walls," our patient may still be very much alive. There might well be similar court decisions in other Member States.
Recently, the BfDI (the German watchdog) said that "cookie-walls are permitted if a comparable service is also offered without tracking, for example, as a paid service." This happened right after the update of the EDPB Guidelines on consent came out.
The EDPB's new register of one-stop-shop decisions is a great tool for privacy pros to keep up to date with extensive case law; it also increases overall awareness of how data protection laws are applied in cooperation between the lead DPA and the other DPAs concerned (GDPR Article 60).
As I expect more comments on this in the days and weeks to come, for now just two interesting points:
– most cases published so far are related to data subject rights and lawfulness of the processing;
– so far, lead DPAs issued more compliance orders and reprimands than fines.
PwC developed a facial recognition tool that logs when employees are absent from their computer screens while they work from home. In particular, there has to be a specific excuse for any absence (including toilet breaks).
Too invasive? No doubt. Disproportionate with no likely legal grounds? WP29 Opinion 2/2017 on data processing at work suggests a positive answer, especially given that the tool monitors employees in their private location.
Predictably, this caused a barrage of criticism from different privacy enthusiasts, followed by unconvincing explanations provided by PwC that this tool helps “support the compliance environment required for traders and front office staff in financial institutions”.
At the same time, there might be much more than meets the eye: monitoring of employees in their homes may also occasionally involve monitoring of their family members through webcams. Besides, depending on technical capabilities and the ability to scan the background of a private premises, such monitoring may also reveal special categories of data about, e.g., employees' sex life or religious beliefs (Article 9 of the GDPR).
A DPIA is not a "special case" of data protection by design (DPbD), as it may seem prima facie. The KEY consideration here is that a DPIA is conducted prior to rolling out new projects involving data processing operations that pose a high risk, and is thus tailored specifically to them. In contrast, DPbD comes into play at the very earliest stage of a data controller's lifecycle and applies to every processing activity (not only those posing a high risk), including core ones.
This leads to a clear understanding that a DPIA is not a substitute for DPbD and, hence, may not be the answer.
Further to this, it should also be noted that DPbD has recently received increased attention from the EDPB (see Guidelines 4/2019) and from national watchdogs in Romania, Greece, and Germany, which have issued fines for non-compliance with Article 25.
More to read on this in an article from IAPP authors (see below).