CoC for Cloud Service Providers is now underway

It was announced last week that the EU Data Protection Code of Conduct (CoC) for Cloud Service Providers is now underway.

Designed as a safeguard for international data transfers under GDPR Article 46(2) in a post-‘Schrems II’ world, the CoC might become interesting in its own right. At the same time, it leaves us with the same question as the SCCs upheld by the CJEU: how can a formal legal mechanism remediate inadequate privacy practices in a third country?

After the invalidation of the Privacy Shield (PS), the suggestion to migrate to SCCs in order to continue EU–US data transfers looks odd, because a formal change of the underlying legal mechanism changes nothing about the defective privacy practices of US intelligence. If we replace the USA with any other third country with similar practices, and/or take the CoC instead of SCCs, the conclusion remains the same.

To that end, it is highly questionable whether a CoC can become a ‘window’ to America (as currently expected). At the same time, let us see how this works in real life. Indeed, if SCCs can factually be deemed a proper safeguard in place of the PS (despite the conflict with common sense), why not a CoC?

DPAs’ guidance for surviving in the post-‘Schrems II’ world

The IAPP has set up a valuable resource collecting guidance and statements issued by national DPAs in response to the recent CJEU ruling in the so-called ‘Schrems II’ case. The IAPP aims to update the register on an ongoing basis.

The link is below:

https://iapp.org/resources/article/dpa-and-government-guidance-on-schrems-ii-2/

While privacy pros advise putting SCCs in place as a substitute for the invalidated Privacy Shield, it should be noted that SCCs are themselves a safeguard with a limited scope of application: (i) they still do not cover many processing scenarios (e.g., processor-to-controller, processor-to-sub-processor); (ii) they are quite outdated (issued in 2001, 2004 and 2010, in the pre-GDPR world); (iii) their validity has been made subject to several conditions by the ‘Schrems II’ decision.

Ambiguous status of SCC under the ‘Schrems II’ decision

As the whole privacy community already knows, the CJEU has today struck down the EU–US Privacy Shield scheme, while confirming the validity of SCCs.

The arguments against the Privacy Shield have changed little since the ‘Schrems I’ decision that invalidated Safe Harbour: governmental intrusion, lack of proportionality, and the ineffective role of the ombudsperson.

What is really new is that an EU-based data controller relying upon SCCs is now expected to assess how public authorities in third countries obtain access to personal data and how the legal systems in those countries work.

Two questions still remain:

1. How are the controllers in question expected to conduct such an evaluation? Is there any methodology in this regard? It may seem somewhat similar to Article 45(2), which lists the factors the Commission shall evaluate when issuing adequacy decisions. However, a private entity relying on SCCs is not an EU body and often lacks the resources and understanding needed to conduct the research and put the necessary safeguards in place.

2. Enforcement. With DPAs already facing a lack of financial resources and manpower, the CJEU’s decision places an extra burden on them. Thus, a requirement newly invented by the CJEU may easily end up unviable, with no practical effect due to insufficient oversight.

Bonus question: taking into account the ‘accountability’ principle, how should exporting controllers demonstrate their compliance with this new obligation?

Hopefully, the answers are yet to come.

On the crucial importance of TOMs under GDPR Article 32

The DPA of Baden-Württemberg (Germany) fined a health insurance company EUR 1,240,000 for insufficient implementation of TOMs, which resulted in the personal data of approximately 500 individuals being accidentally processed for advertising purposes without due consent.

The fine is quite high, especially given the mitigating factors in this case:

  • a relatively small number of data subjects concerned
  • cooperation with the DPA
  • TOMs were not absent altogether; their level of implementation was merely insufficient

Besides, no data breaches or other factors posing a (high) risk to data subjects were identified.

The investigation resulted in one of the highest fines issued under Article 32 (if not the highest). This can be explained, in particular, by the adoption of the German model for calculating fines under the GDPR.

In any case, this is another reminder for controllers and processors of the importance of putting in place TOMs appropriate to the risk, as ‘somewhat good’ TOMs are unlikely to be enough.

More to read – see below.

https://digital.freshfields.com/post/102garn/1-2m-fine-in-germany-for-failure-to-implement-appropriate-toms

The ethics of privacy

Privacy is a fundamental human right recognized in the UN Declaration of Human Rights, the International Covenant on Civil and Political Rights and many other international and regional treaties. Privacy underpins human dignity and other key values such as freedom of association and freedom of speech. It has become one of the most important human rights issues of the modern age. And yet, for many, the GDPR marks the beginning of privacy law as we know it, the most remarkable difference being the introduction of some really sizeable fines. So how does this affect the ethics of privacy?

Privacy is, by its nature, an element of compliance. Compliance with privacy laws, and with the “intention” of privacy laws, is how we show optimal data protection. When talking of compliance, I always say that “compliance is not just about doing the right thing, but about showing we are doing the right thing”. Compliance is only possible with accountability. No one ever challenges the notion that compliance is about doing the right thing. We should remodel our approach to privacy away from mere compliance with the law and towards the behaviour of doing the right thing. The GDPR helps us to show we are doing the right thing; it helps us to demonstrate our accountability, but it is not the reason privacy exists.

Why is this important for companies? Privacy is now a central element of business ethics. It forms part of the corporate approach to handling controversial subjects in order to gain public trust and support. No matter the industry, data is essential to the functioning of business. Without an ethical approach to handling data, it will not be entrusted to those who need it most to make the business run. An ethical approach also maintains reputation, helps avoid significant financial and legal issues, and thus ultimately benefits everyone involved.

TikTok moves under the control of the Irish DPC

From 29 July 2020 onwards, TikTok Ireland will control the data of all users in the EEA and Switzerland.

Nothing special: just another smart move by a non-EEA company (the parent company, TikTok Inc., is incorporated in the US) attempting to use the one-stop-shop mechanism via its EEA subsidiaries.

Except for one thing. The recent French scenario, in which the CNIL issued an administrative fine directly against Google LLC (US) instead of its EU subsidiary (a decision upheld by the Conseil d’État), may become a real problem if that approach receives support from the Irish authorities.

The decision of the Conseil d’État has probably ended the era of so-called ‘delegated controllership’. If supported by other DPAs, this will affect all non-EU ‘factual’ controllers wishing to use the one-stop-shop mechanism. Think about it, TikTok.

An interesting twist in the ‘cookie walls’ saga

France’s Council of State has ordered the CNIL (the French data protection watchdog) to cancel parts of its guidelines on cookies, as the ban on cookie walls was not valid. The court explained that the CNIL exceeded its mandate under so-called “flexible law”, which refers to instruments, such as regulatory authorities’ guidelines, that do not create legal rights or obligations.

Although the recent update of the EDPB Guidelines on consent invalidated ‘cookie walls’, our patient may still be very much alive. Similar court decisions may potentially follow in other Member States.

Recently, the BfDI (the German federal watchdog) said that “cookie-walls are permitted if a comparable service is also offered without tracking, for example, as a paid service”. This happened right after the update of the EDPB Guidelines on consent came out.

The original text of the decision is in French:

https://www.conseil-etat.fr/actualites/actualites/le-conseil-d-etat-annule-partiellement-les-lignes-directrices-de-la-cnil-relatives-aux-cookies-et-autres-traceurs-de-connexion

PwC vs. employee privacy

PwC has developed a facial recognition tool that logs when employees are absent from their computer screens while working from home. In particular, there has to be a specific excuse for any absence (including toilet breaks).

Too invasive? No doubt. Disproportionate, with no likely legal grounds? WP29 Opinion 2/2017 on data processing at work suggests a positive answer, especially given that the tool monitors employees in their private surroundings.

Predictably, this caused a barrage of criticism from privacy enthusiasts, followed by unconvincing explanations from PwC that the tool helps “support the compliance environment required for traders and front office staff in financial institutions”.

At the same time, there may be much more here than meets the eye: monitoring employees in their homes may also occasionally involve monitoring their family members through webcams. Besides, depending on technical peculiarities and the ability to scan the background of a private premise, such monitoring may also reveal special categories of data about, e.g., employees’ sex life or religious beliefs (Article 9 of the GDPR).

Status of non-EU processors under Article 3(2) GDPR

A thorough analysis of the clear points and grey zones of the EDPB Guidelines 3/2018 on territorial scope.

My attention was drawn, in particular, by a friendly reminder that the status of a non-EU processor under Article 3(2) is dual:

  • it is indirectly influenced by the GDPR if it carries out processing on behalf of an EU controller (through the data processing agreement under Article 28 and the Chapter V obligations);
  • it is directly caught by the GDPR if the processing activities carried out on behalf of a controller meet the ‘targeting criterion’ within the meaning of Articles 3(2)(a) and 3(2)(b).

More to read – see below.

https://www.globalprivacyblog.com/gdpr/edpb-guidelines-what-is-the-territorial-reach-of-the-gdpr/?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+GlobalPrivacyAndSecurityComplianceLawBlog+%28Global+Privacy+and+Security+Compliance+Law+Blog%29#page=1

Ticking time-bomb in the EDPB Guidelines on consent?

An old rule every privacy pro has learnt by heart: a “risk of negative consequences (e.g. substantial extra costs)” for the data subject = no freely given consent.

Substantial. But what if the extra costs are not substantial? What if, say, $10 turns into $11 when you refuse to consent? Is that OK?

At least, the German watchdog seems to say yes. Some privacy pros agree (see below).

One might say I am being picky; indeed, a $1 or even $10 surcharge is unlikely to lead to bankruptcy. But what happens if this practice becomes commonplace? Right: data subjects will overpay every time they are asked for consent. $1 per consent request turns into $10 per ten requests. How long could that ‘receipt’ be a year from now? Five years from now?

And that is the crux of the matter. While the EDPB considers substantiality in the context of a one-off consent request, it does not address the aftermath when overcharging becomes the rule.

https://www.bclplaw.com/en-GB/insights/does-the-gdpr-prohibit-charging-more-to-consumers-that-do-not-consent-to-certain-types-of-processing.html