An interesting case from Austria that touches on Article 85 GDPR and applies the balancing test developed by the ECtHR to determine whether the processing in question was carried out for journalistic purposes.

The case involved personal data mentioned by an ex-convict in his Facebook post. Ironically enough, the Austrian DPA and the court reached different conclusions using the same balancing test. Contrary to the DPA’s decision, the court found a violation of the data subject’s rights.

While the case is interesting in itself, it also raises questions about how consistently the various tests and methodologies developed by #WP29 and #EDPB in their numerous papers are applied to practical situations. In a nutshell: “two enforcers – three opinions”.

#dataprotection #lawandlegislation #privacy #politicsandlaw #compliance #dataprivacy #gdpr

Risk-based approach

I brushed up on one important concept – the risk-based approach (RBA), which WP29 addressed back in 2014 in its “Statement on the role of a risk-based approach in data protection legal frameworks” (WP 218).

So what is the RBA really about? The key phrase from the Statement that suggests the answer: “… a data controller whose processing is relatively low risk may not have to do as much to comply with its legal obligations as a data controller whose processing is high-risk”.

What does it mean?

The RBA does NOT mean that a controller may ignore some of its obligations if it processes low-risk data. That would not result in compliance, and this is a common misconception I have seen in my practice many times. Instead, the RBA means that in such a case a controller may do less to be compliant.

In fact, it is not correct to say that the GDPR as a whole implies the RBA. Only certain articles do so, and only in certain cases (Art. 25, 30(5), 32–35 and some others).

Recent cases involving Facial Recognition Technology (FRT)

Enforcement actions against a Dutch supermarket and the Swedish Police remind us that the bar for using FRT is very high.

Even when FRT is used for security purposes inside a supermarket, giving customers an appropriate privacy notice that FRT is in use is not enough. That said, the Dutch DPA issued no fine, only a warning.

In another case, the Swedish Police unlawfully processed biometric data for facial recognition without conducting a DPIA, which resulted in a fine of approximately EUR 250,000. In addition, the Swedish Police were ordered to delete all data transferred to an external database (Clearview AI) and to notify the data subjects that their personal data had been transferred.
