Watch out! Ransomware actors have turned to blackmail

Ransomware has evolved into blackmail. We are all familiar with the concept of ransomware, whereby critical operational data, including personal data, is encrypted by hackers and hence made inaccessible to the business. To regain access, i.e. to get the data decrypted (the key is owned by the hacker), the business needs to pay a fee. The fees are significant; this article gives an insight, e.g. a recent case resulted in a fee of $350 000 being paid.

So the business gets back their operational files, and this is where the blackmail kicks in. The hackers then request a second ransom of between $100 000 and $2 000 000 to delete the stolen data, or they will make it public!

What is surprising, or maybe not, is that the victims are actually paying. Especially those in private healthcare, who can’t afford the damage to their reputation should it get out that they have been hacked and sensitive data has been stolen… and then they don’t report the breach as required by law in the U.S., Europe, and other countries globally.

If you are worried about this trend, and we all should be, then protect your data properly (GDPR Art 32 requires this is done). Get the experts in; they cost much less than a ransomware demand will if the hackers get to you first. And the fix may not be so difficult, you may be surprised!

Edited: PrivSec have a free ‘fireside chat’ session on ransomware and what to do if it happens to you; you can book here.

Cookie walls are not GDPR compliant

This clarification on the use of consent came out last week, and it provides no surprises for those working daily with GDPR compliance. What is noteworthy, though, is the mention of the use of “cookie walls”.

What is a cookie wall then?

One of the principal factors that one should keep returning to when thinking about compliance with GDPR is a single word: CHOICE. Those who have attended any training I’ve delivered will hear my voice in their head now… CHOICE IS A HUMAN RIGHT. If there is no choice, then it is not compliant with GDPR, and the use of cookie walls is a good example.

A cookie wall is where it is impossible to use a website without accepting ALL cookies. In fact the website will not work unless all cookies are downloaded. There must be a choice. Take a look at the UK ICO’s website for an example, and there you can even find the toolkit to make this possible on your own website!
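To make the “choice” point concrete, here is a minimal sketch of a consent-respecting alternative, assuming a Python/Flask site; all cookie and route names are hypothetical and this is my own illustration, not the ICO toolkit itself:

```python
from flask import Flask, make_response, request

app = Flask(__name__)

@app.route("/")
def index():
    # The page renders regardless of the visitor's choice: no cookie wall.
    resp = make_response("<p>The site works either way.</p>")
    # Non-essential (e.g. analytics) cookies are set only after explicit opt-in;
    # strictly necessary cookies (e.g. session security) may be set regardless.
    if request.cookies.get("consent_analytics") == "yes":
        resp.set_cookie("analytics_id", "abc123", max_age=3600)
    return resp

@app.route("/consent/<choice>")
def consent(choice: str):
    # The visitor's choice itself is stored in a strictly necessary cookie.
    resp = make_response("Preference saved.")
    resp.set_cookie("consent_analytics", "yes" if choice == "accept" else "no")
    return resp
```

The key property is that refusing consent never blocks access to the site; it only suppresses the non-essential cookies.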

If you have a policy, make sure it is documented; if you have a procedure, document that too… else…

Well, it seems that another government authority in Sweden has been fined 120 000 kr (circa €12k) by the Swedish Data Protection Authority: the health authority of the region (county) of Örebro, and sensitive data was involved.

What is important in this case is that although they had procedures, these were not documented; they were word of mouth… oops, and this is not good enough. Where is the evidence?

Clearly the processing of sensitive data means that extra care must be taken, but the key point beyond this is that Article 5.2 of the GDPR requires accountability, which means there must be evidence that Article 5.1 is being adhered to.

So what would entice you to install/enable a covid-19 App?

In the UK, where they’ve developed their own centralised App (see what the ICO says), it is expected that people will download the App in the name of ‘civic duty’. Sounds very British 😉

Apparently the Australians have also developed their own App, and I’d be surprised if ‘civic duty’ would motivate Australian citizens 😉

However, the one developed to be installed as a default on Apple and Google phones, a decentralised version, could prompt the user to enable it so that they can detect whether they have been in the proximity of an individual who could have covid-19, i.e. one who has developed symptoms.

You know, I wouldn’t be surprised if UK citizens did actually install it on the basis mentioned above; a sense of ‘civic duty’ is a key British value 🙂

What will motivate most people outside the UK, though, is the idea that they can, as far as is doable, continue a normal life and minimise the risk of becoming one of the covid-19 statistics.

Mentioned above are 2 models, centralised and decentralised. In the centralised model the phone sends data to a government authority, which compiles stats to understand the spread of the virus. It is claimed that no personal data is collected, i.e. it is anonymised. In the decentralised model the data stays on the phone.

The centralised model is only privacy friendly if the data sent is truly anonymised, which I am sceptical about. At least at this stage, even if the intentions are good, I have yet to see a process which can really anonymise data; to my knowledge there is no industry standard for the anonymisation process, which involves multiple steps of de-identification, masking, obfuscation, etc., to make it impossible to revert back. In fact it will always be possible to revert back unless the keys used for each step are securely disposed of.
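To illustrate why key disposal is the crux, here is a minimal Python sketch of one keyed de-identification step; the HMAC approach and all names are my own assumptions, not taken from any of the Apps discussed:

```python
import hashlib
import hmac
import os

def de_identify(identifier: str, key: bytes) -> str:
    """One keyed step: replace an identifier with an HMAC-SHA256 token."""
    return hmac.new(key, identifier.encode(), hashlib.sha256).hexdigest()

key = os.urandom(32)  # the per-step secret key
token = de_identify("user@example.com", key)

# While the key exists, re-identification is a simple lookup over candidates:
candidates = ["user@example.com", "someone@else.org"]
print([c for c in candidates if de_identify(c, key) == token])
# -> ['user@example.com']: the data is pseudonymised, not anonymised.
# Only securely destroying the key used at every step breaks this link.
```

The same logic applies to each step in the chain (masking, obfuscation, etc.): retain any of the keys and reversal remains possible.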

Would I install/enable the App myself? For covid-19, probably not. I live on an island, there’s no bridge to the mainland, and I don’t consider myself to be in a risk group. However, if I were in a risk group, I would enable the Apple App, the privacy-friendly one. Although, who knows, my ‘civic duty’ could kick in (as a British expat) if the pandemic fatality rate were much higher and a sense of panic set in.

Covid-19, a rehearsal for the real horror? And privacy concerns

The use of tracking technologies to collect data on the spread of covid-19 has triggered a lot of discussion concerning the privacy invasions of such practices. One discussion I got involved in concerned the installation, as a default, of a tracking App on both Apple and Android devices, which needs to be switched on by the user in order for the tracking to work.

My take on this, as a privacy guy, is that nothing privacy invasive should be installed as a default. The general fear is that what is installed as a default can be enabled as a default; take a look at China.

Thinking about it, we are truly lucky that covid-19 is not an Ebola. We are getting the chance to rehearse in preparation for the real thing, which will happen one day, and I’m hoping not in my lifetime. I think of Stephen King’s book The Stand; what we see happening with deaths in old people’s homes, where residents have (almost) been forgotten, mirrors the horrors in this book. The horrors reflect the ‘Black Plague/Death’: people dying in their homes, with no hope. Fatality was 80%.

Together with the right to privacy come the associated dilemmas. It is clear to us all, although we may not consciously think about it, that we want privacy for ourselves, but not for others. This is a basic dilemma which makes privacy per se difficult at the level of basic human values, and it is reflected in the argument over privacy and the use of a tracking App. We don’t want any government authority (the faceless George Orwell figure, as we perceive it) tracking our movements. Yet we don’t want to die, and we don’t want our loved ones to die.

In fact the GDPR takes care of these dilemmas with clauses which enable government authorities to take measures in the name of public safety; provisions for extraordinary circumstances also come into effect. Indeed, even if these Apps are not installed as a default today, they could arguably be pushed out at any time without our consent in the name of public safety. Privacy International has quite a lot on this subject, country by country.

Did you know that many of the international laws on wiretapping date back to a series of seminars hosted by the FBI in the United States in 1993 at its research facility in Quantico, Virginia, called the International Law Enforcement Telecommunications Seminar (ILETS), attended by representatives from Canada, Hong Kong, Australia and the EU? The product of these meetings was the adoption of an international standard, called the International Requirements for Interception, which possessed similar characteristics to CALEA in the United States. In 1995 the Council of the European Union approved a secret resolution adopting the ILETS requirements. Following its adoption, and without revealing the role of the FBI in developing the standard, many countries have passed laws based on it.

Virtual Shadows, 2009

The question is, do we trust our governments to do what is right in the name of public safety, i.e. not to abuse their power during times of crisis and pandemics? To be honest, we don’t have much choice. As much as we kick and scream, what it boils down to is that if a pandemic breaks out which can kill 80% of the population, these types of questions will not be asked. What about afterwards, in the wake of such a disaster? I guess we will just be grateful to be alive, even the politicians and faceless bureaucrats.

This is why the use of privacy by design in application development is of paramount importance, and I think this should be the focus of discussions from here on. ENISA published a useful paper in 2015 which can serve as great inspiration, and it would be great to see this paper evolve going forward. There are some technologies out there paving the way into new territories, and it could take some time before we get absolute privacy in the use of any digital technology.

IMHO Privacy by Design embedded in all technology is where our energy should be placed. Words come cheap; action for long-term change is often missed in the heat of the argument. The point gets lost in the context and in human behaviour, which results in getting nowhere, or somewhere we don’t want to be.

Employer fined for employee fingerprints

Another interesting case. Each new case helps us understand better how to implement GDPR and be compliant in our organisations.

So this is a fine of €725k by the Dutch DPA on an organisation which started using biometrics, i.e. fingerprint authentication. If you check the link above, with an explanation provided by the DLA Piper blog, there are 2 factors which really surface.

Firstly, consent cannot be used as a legal basis if there is an imbalance in the relationship, and in the case of employer/employee this is always so. If fingerprinting is to be used, then the employee needs to have the choice of another method, e.g. access cards. In this example there was a lack of choice; employees were forced to provide consent. Consent was not freely given.

Secondly, it seems that Dutch law gives a second alternative for the use of biometrics for authentication and security purposes. However, this applies only if it can be proved to be proportionate to the purpose. For example, using fingerprints to control access to high-security facilities is proportionate; using them for access to ordinary office space is not.

Why I love this case is that it really emphasises the limits on the use of consent in the employer/employee relationship.

Swedish DPA has handed out a SEK200k fine to SSC, a government authority

All of us DPOs who feel a bit more relaxed in these Corona times… thinking that no Data Protection Authority would be so unfeeling as to hand out a fine now… think again.

My hat off to the Swedish DPA (Datainspektionen). A fine of SEK200k has been handed to the Government Service Centre (SSC), which handles salaries and such for 47 Swedish government authorities. SSC is a processor to the 47 government authorities, although a controller for its own employees. The breach concerned the salary data of 282k employees, including SSC’s own.

In short, the cause was a technical flaw in an application (Primera), and SSC failed to report not only to the controllers but also to the DPA… but clearly the case is much more complex. Articles 24, 28, 32 and 33-34 are all quoted in the report from Datainspektionen. This is a really important case and gives some really great clarifications, not only on personal data breach notification but also on the responsibilities of the controller/processor.

I tried to map it out below. Basically there were 2 fines:

  1. For the delay in reporting the breach to the controllers (SEK150k), Article 33.2; and
  2. For the delay in reporting to the Swedish DPA (SEK50k), Article 33.1.

Basically, a personal data breach must be reported to the controller (if you are a processor) without undue delay. As soon as you know you have a breach, clearly you want to know why it happened, but that has to wait. What is important is to ascertain that a breach has happened and that it is personal data which has been breached. What happened in this case is that SSC wanted first to understand more about why it happened, and the delays became serious.

When reporting as the controller, you have 72 hours to report to the DPA, and the same applies as above: during this time the priority is to ascertain that a breach has occurred and to gather enough data to know whether it poses a risk to the rights and freedoms of natural persons. The cause of the breach is a ‘nice to have’ at this stage; you can always send an updated report later when you know. These are not my opinions, these are facts.
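To make the timing concrete, here is a hypothetical Python sketch of the Article 33.1 window; the function and example dates are my own illustration, not taken from the Datainspektionen report:

```python
from datetime import datetime, timedelta

DPA_WINDOW = timedelta(hours=72)  # Article 33.1: controller -> DPA

def dpa_report_status(became_aware: datetime, reported: datetime) -> str:
    """Check a controller's report against the 72-hour window,
    counted from when the controller became aware of the breach."""
    delay = reported - became_aware
    if delay <= DPA_WINDOW:
        return f"on time ({delay} after awareness)"
    return f"late by {delay - DPA_WINDOW}; reasons for the delay must accompany the report"

# Aware Monday 09:00, reported Friday 10:00 -> 97h elapsed, late by 25h:
print(dpa_report_status(datetime(2020, 5, 4, 9, 0), datetime(2020, 5, 8, 10, 0)))
```

Note that the clock runs from awareness, not from the root-cause analysis, which is exactly why an ‘understand it first’ approach leads to fines.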

There are always lots of politics which come into play when these kinds of incidents occur. I believe SSC wasn’t just lacking a breach process; there were also conflicting opinions and dilemmas at play.

There was also a problem the DPA had with the agreement with Evry: it was outdated and not compliant with GDPR, though this earned a warning, not a fine.

I created some flows to get my head around this. The report was quite long (and in Swedish), with loads of other detail; e.g. there was an employee who found the bug and exploited the hole, and ended up getting reported to the police… I think this was retracted by Datainspektionen later.

I hope you find this useful. Feel free to ask questions… as I mentioned at the beginning, this is a super interesting case!

Data brokers and data subject rights

Well, I’ve been working hands-on with data subject rights for almost two years now, and one area which is still grey is data brokers.

One aspect is when the data broker has scraped public sites for personal data. Personal data has been shared by you and me on LinkedIn, Facebook, etc., and a data broker can extract and use it; after all, it is public data.

The other is, as is the case in Sweden, when personal data becomes public data, but not at the behest of the data subject. Still the data brokers are there scraping sites, e.g. hitta.se, ratsit.se, all legal thanks to something called an utgivningsbevis, issued in the name of freedom of speech. If you want some background on this, I’ve written loads!

One of the challenges is that a lot of businesses purchase personal data from data brokers as part of their sales activities. Then requests for access to personal data (Art 15), or to be forgotten (Art 17), come pouring in from individuals who want to know why sales personnel are contacting them when they did not opt in, saying that it’s not compliant with GDPR.

Well, the fact is that there is nothing illegal in this activity as it stands today. Once you make your personal data public you lose some rights. Of course in Sweden it is more complex, as individuals have not requested that their data be public; it is like this by default.

Now often the data subject will ask to be deleted and never contacted again, but it is not so simple. If the organisation regularly purchases data from data brokers, deleting the data won’t solve the problem: their name needs to be added to an ‘opt-out’ list (a sketch of one follows the list below), which means processing additional data. If not, their name will pop up again, because you see the problem is three-fold:

(1) the data is public, whether knowingly or not;

(2) there is no central mechanism enabling the individual to place themselves on an opt-out list accessible to all data brokers; hence

(3) data brokers do not clean their data, which means that each organisation purchasing personal data needs to maintain its own opt-out list.
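For illustration, here is a minimal Python sketch of what such an in-house opt-out (suppression) list could look like, assuming email is the matching key; the salted-hash design and all names are hypothetical, my own suggestion for keeping the additional processing to a minimum:

```python
import hashlib

class OptOutList:
    """Suppression list storing only a salted hash of each email, so the raw
    address need not be retained after the deletion request is honoured.
    (A salted hash is still pseudonymous data, but it is data-minimised.)"""

    def __init__(self, salt: bytes):
        self._salt = salt
        self._hashes = set()

    def _digest(self, email: str) -> str:
        # Normalise before hashing so case/whitespace variants still match.
        return hashlib.sha256(self._salt + email.strip().lower().encode()).hexdigest()

    def add(self, email: str) -> None:
        self._hashes.add(self._digest(email))

    def is_opted_out(self, email: str) -> bool:
        return self._digest(email) in self._hashes

opt_outs = OptOutList(salt=b"keep-this-salt-secret")
opt_outs.add("Person@Example.com")  # data subject asked never to be contacted

# Screen a freshly purchased broker list before it reaches the sales team:
purchased = ["person@example.com", "prospect@example.org"]
print([e for e in purchased if not opt_outs.is_opted_out(e)])
# -> ['prospect@example.org']
```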

What complicates the matter further is that the GDPR requires the identity of a data subject to be verified before responding to their request, although Article 11 does say that additional data should not need to be collected solely in order to verify identity for the sake of GDPR compliance.

So where does that leave us when it comes to requests from data subjects who did not ask to be contacted by our sales agents? In short, it is best to add them to an opt-out list and delete their data, so long as they have never been a customer, an employee, etc. If they persist in exercising their rights as per Article 15, request proof of identity, which is permitted under Article 11.

Although how do you explain to them why you need to add them to a list? It seems a strange workaround for something which clearly is not working optimally today.

CNIL DPO accreditation

Well, I was pretty impressed that France seemed to be first off the blocks with some kind of official recognition for the DPO role. Organisations which train and certify DPOs can apply to be on CNIL’s list of accredited organisations.

Great, I think. We need to apply… and by ‘we’ I mean Privasee of course!

Privasee has DPO training which, on exam completion, is accredited at 5 ECTS credits under the Scottish Credit and Qualifications Framework, equal to Level 6 certification on the EQF (European Qualifications Framework).

But Privasee will not apply, and why? Because it requires (1) inclusion of the French Data Protection Act in the training content, and (2) that a candidate for CNIL accreditation must first be accredited by an accreditation body pursuant to the ISO/IEC 17024:2012 standard.

There is absolutely no recognition of the academic accreditation which the Privasee CPP/EU-DPO has earned. The ISO standard mentioned above merely ensures that the certification conforms to a specific scheme, whereas the academic accreditation that Privasee has earned for its DPO training has had both content and structure assessed.

Why are academic qualifications not included here? And why exclude all DPO training/certification organisations which are not French?

Flashback to when I was a security guy and the proud owner of an MSc in Information Security from Royal Holloway, University of London (RHUL, 2006), renowned globally as among the best in infosec/cybersecurity education, with gurus such as Prof. Fred Piper. I was nonetheless continually frustrated by the demand for CISSP certification, which required an individual to read a book, memorise it, and regurgitate it in multiple-choice test questions, whereas the Master’s degree, which many of us studied part-time or by distance learning over 2-4 years on top of a full-time job, was completely ignored. The headhunters had a search algorithm which searched for CISSP and NOT MSc. This hurts: as those of us who have completed the MSc will acknowledge, it is expensive, and then, just because of an automated decision engine, we are excluded from potential jobs.

Fast forward to now. I realise that under GDPR those recruiting may have a challenge with these kinds of automated decisions. I wonder when job applicants will cotton on to this?

And then back to the CNIL as a DPO certification accreditation body. As you’ve probably realised by now, I’m just a little bit peeved at… maybe I’m taking this personally… being excluded.

On the bright side, even the IAPP, with the combined CIPP/E and CIPM (to be the DPO), will not be able to fulfil this requirement: the CIPP/E has nothing on French data protection.

Taking a practical approach: Privasee could theoretically get the ISO thingy, and if you are a French privacy/legal guy/girl with a French business who would like to give this a bash, contact me and become a Privasee OWL Partner. The adaptation of the CPP/EU-DPO training to a CPP/Fr-DPO training would be minimal… IMHO