Individual choice vs. the right to life of another individual

A super interesting situation in Italy concerning covid. Basically, an Italian business has rolled out covid vaccination for its 1,200 employees. It is not obligatory; there is a choice. Of the 1,200, 12 employees (1%) have refused.

The problem is that it has become known who has refused. Perhaps the employees stated their stance themselves; the article doesn’t say. What is stated is that the vaccinated employees didn’t feel safe working alongside the non-vaccinated employees. Management has now decided that all unvaccinated employees should take six months’ leave with pay.

This brings to mind quite a few dilemmas, some not linked to GDPR compliance. For example, how does it feel for the 99% of employees who have not been offered six months’ paid leave? My perception is that the leave could be perceived as a ‘reward’. And how does this work with new employees?

There are many discussions on the freedom to choose versus the safety of the individual. As anyone who knows me will tell you, I am an advocate for ‘choice’; it is a human right, and probably the most important word in the GDPR, IMHO. However, when the freedom to choose can cause harm to another individual, my stance changes somewhat. Our individual choices should not harm another individual; doing so impacts their rights as a human being, including the right to live. We should care for each other, as a community.

I am in the age group 50-60, so I could be biased, except that I remember holding this opinion when I was 20 years old, and 30 years old… and so on. “We should not be judged on how we conduct our life, so long as it does not harm another individual”. And this is from a woman who had her first child before she turned 18 years old, and was hence damned to a life of purgatory given societal norms in the UK during the 1980s.

The conflict between public safety and privacy, and our right to choose, is addressed to some extent -in a legal way- in Article 9.2(b) of the GDPR, which has been the subject of some discussion on LinkedIn. The Article itself is not clear, but if one reads the associated Recitals, the picture that emerges favours public safety over individual choice. In Recital 52 it is referred to as a derogation which can be made for health purposes, where a serious threat is present.

I am still uncertain whether I would stand by this stance if a decision were made to vaccinate all children; I have a 12-year-old daughter. These vaccines haven’t been through the rigorous testing that is normal during clinical trials, due to the hasty rollout… but I guess I can append to this post later when this topic starts to gain some traction.

French court decision and use of safeguards for international transfers

I was most delighted when this case popped up in my feed today.

“The court noted for the purposes of hosting its data, Doctolib uses the services of the Luxembourg company AWS Sarl, the data is hosted in data centers located in France and in Germany, and the contract concluded between Doctolib and AWS Sarl does not provide for the transfer of data to the U.S. However, because it is a subsidiary of a company under U.S. law, the court considered AWS Sarl in Luxembourg may be subject to access requests by U.S. authorities in the framework of U.S. monitoring programs based on Article 702 of the Foreign Intelligence Surveillance Act or Executive Order 12333.”

Even so, the court decided there were sufficient legal and technical safeguards in place to protect the data, which in this case related to covid-19.

Read more.

Work from home safely. Get cybersecurity cement.

Since March we have seen an increase in cyber incidents relating to the current pandemic. During this period reports suggest not necessarily an increase in cybercrime, but rather a visible increase in the use of Covid-19 for tricking unsuspecting victims. In other words, no new crimes, but old crimes using new tricks.

Phishing, malicious domains and ransomware using Covid-19 as bait are the most prevalent tactics, but there is also an increase in attacks on vulnerable remote access technologies. Out-of-date software, or indeed software developed without adequate privacy and security considerations, presents a higher risk when combined with home networks and inexperienced users. Work from home has become a reality for most in a very short space of time. Many organisations have had to cobble together solutions to meet demand, for example relying on VPN solutions that had not been patched, or insecure configurations exposed to unprotected internet connections.

Whilst security measures (like patching and pen testing) are obviously essential to protecting organisations, the increase in cyber incidents demonstrates the importance of data protection by design and by default. A data protection impact assessment (DPIA) allows for adequate risk identification and works towards achieving appropriate controls. It is also a robust way of documenting project development to ensure that privacy takes a structured place in design work-streams. Data protection by design and by default can supplement and support infosec colleagues in ensuring that incidents are dealt with in an appropriate manner.

Finally, an essential part of any DPIA is to identify the immediate necessary mitigations, and the subsequent actions to prevent recurrence, i.e. to remediate. I have never done a DPIA that hasn’t made reference to training. Indeed, training is the cement that ties cybersecurity and privacy together and creates a strong wall of defence for an organisation. Many organisations should be looking at retraining their workforce after the pandemic. This is not to “teach” them how to work from home, but how to do it “safely”!

COVID-19, Data Protection law and Privacy… Or the needs of the many vs the needs of the one.

When you have no right to privacy, Data Protection law governs the organisation’s respect for your information. It should not be Data Utility vs Privacy, but Data Protection and Data Utility.

The terms data protection and privacy are often used interchangeably. Recently I have seen a high number of articles about “COVID-19 symptom tracking Apps”, and in the privacy community they all ask the same question: “is the loss of our privacy worth it?” It’s tempting to look at it this way, and as Data Protection legislation is based on fundamental human rights legislation and principles, these are indeed worthy questions.

It is always a societal balancing act, when considering the needs of the many vs the few, or the one.

It is clear that in times of emergency, civil liberties can sometimes be suspended for “the greater good”, and the current “lockdown”, with Police powers to enforce it, is a clear example of this. We all accept that it is for our protection, in the “greater good”, but no one wants to awake from a subsided pandemic into a “new normal” of a surveillance-led Police state similar to an Orwellian 1984 Big Brother. Power, once shifted and obtained, is not easily set aside by Governments. It only takes a look at the US Government’s “Patriot Act”, which was passed for a limited time in response to the 9/11 terrorist attacks and has been consistently renewed by successive governments ever since. I’m reminded of Star Wars, where the Empire began with an unscrupulous Chancellor using emergency powers granted during a time of war to create an oppressive, power-hungry regime. Yes, fantasy, but if our fiction is a mirror to our society, it is clear these are concerns that we all share.

As soon as I see technological surveillance being normalised, such as workplaces monitoring employee attention on remote web conference calls, or CCTV and drones being used to keep individuals in their place, I naturally recoil, and the libertarian in me seeks to object to the direction that our new surveillance society is taking.

But where does the law stand?  “Privacy” is not really mentioned in our current law, and instead we use the term “Data Protection”.  Often I see these terms used incorrectly as synonyms.  Privacy is not the same as Data Protection law. Why?  

The answer lies in this simple statement:

Data Protection law still applies, even when you have zero right to Privacy.

Let’s take it back a bit. Privacy, and your human right to it, is closer to a synonym for “Secrecy”: your right to control whether or not your personal data is disclosed. Clearly we can’t live in a society where we live in secret. We have to transact our personal information in order to contract with businesses, lead productive social lives, contribute to society, pay our taxes, etc. This means our right to privacy changes depending on where we go and what we do. Clearly we have a large amount of privacy surrounding sexual practices in the privacy of our own home. We have much less expectation of privacy on a busy high street, acting in our business capacity at work, holding public office, or using our persona to cultivate celebrity status. Privacy is changeable and varied, and it depends on where we are and the context we act in. It is also not consent-based in the majority of cases, as we cannot say “no” to having our data collected for tax or other legal obligations, or deny law enforcement access where they have a genuine need to investigate crime. In the majority of cases, we may have little or no right to privacy, or real choice, over our data collection.

This is actually why Data Protection law is so important, as it sets rules and principles for when our privacy should be respected or when it should not.  

It does this by requiring organisations to have a legal basis for holding personal data, which defines the balance of power between the individual and the organisation. Clearly, if a law requires the company to hold the data, the individual has little right to privacy; but if the organisation relies on something like consent, the individual holds much more power. Either way, the organisation still needs to comply with its data protection legal obligations. Most importantly, where we do not have a right to privacy, Data Protection law sets up responsibilities for those that hold our data. Data Protection law goes far beyond the scope of Privacy, defining the safeguards for the proper use of the data by those that hold it. Let’s consider some key parts of Data Protection law, summarised as follows:

  • Use Justification (legal basis)
  • Transparency (privacy notices)
  • Collection Limitation (minimum necessary to the purpose)
  • Use Limitation (only used for the purposes notified)
  • Accuracy (ensuring data is accurate and kept up to date)
  • Storage Limitation (held no longer than necessary)
  • Security (appropriate technical and organisational controls)
  • Individual Participation (allowing individuals rights such as access to a copy, rectification etc)
  • Transfer limitation (kept in countries/organisations with appropriate safeguards)
  • Accountability (appropriate documentation kept to demonstrate compliance)

There is much more to Data Protection law obligations than just these (controlling third parties, privacy impact assessments, privacy by design, etc.), but I would argue these have less to do with Privacy itself and more to do with practical information governance and data management: examining the flows of the data and ensuring appropriate controls and safeguards have been designed in, so that people’s data is treated with respect and only the minimum is used where strictly necessary.

So let’s return to our examples of these “COVID19 symptom tracking apps”. Clearly I can think of public interest reasons where we need to sacrifice individual privacy for the greater good, and wider public health. However, trust is key. We must all be cautious to ensure that any of these solutions is carefully planned out in accordance with the principles above, with properly conducted Privacy Impact Assessments giving rise to controls that protect our personal data, and by extension, us. The data should be used only where strictly necessary, with minimum data collected, appropriate safeguards to minimise risks to the individual, deleted when no longer strictly necessary, and used for no other purposes than those originally identified and specified at the time of collection. This is a far greater challenge than a simple “Yes/No” to the invasion of privacy, but reasoned justification and practical data management measures will win the day, providing great societal benefit and protecting the individual simultaneously.

It is not therefore Privacy vs Benefit, but Data Protection and Benefit. A positive-sum, win-win solution that benefits everyone, both individually and as a society as a whole.

Ralph T O’Brien, Principal, www.reinboconsulting.com

The whispering protocol and covid-19

A covid-19 smart wearable using privacy-enhancing technologies popped up in my LinkedIn feed just today… I was sceptical until I started reading the academic paper on the Whisper Tracing protocol used by the product.

This is a product which is not installed on a smart device; it is something you clip to your shirt (or wherever) that warns you if you are too close to another individual. It does not link the wearer of the device to an identity. The data is collected centrally, but deleted after a short time.
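Purely as an illustration of the idea (this is my own sketch, not Nodle’s implementation, and the constants and threshold are assumptions), a proximity warning like this is typically derived from received signal strength, along these lines:

    # Hypothetical sketch: estimate distance from Bluetooth signal strength
    # (RSSI) with a log-distance path-loss model, and warn when another
    # beacon appears too close. All constants are illustrative assumptions.

    TX_POWER_DBM = -59        # assumed RSSI measured at 1 metre
    PATH_LOSS_EXPONENT = 2.0  # ~2 in free space, higher indoors
    ALERT_DISTANCE_M = 2.0    # "too close" threshold

    def estimate_distance_m(rssi_dbm):
        """Rough distance estimate (metres) from one RSSI reading."""
        return 10 ** ((TX_POWER_DBM - rssi_dbm) / (10 * PATH_LOSS_EXPONENT))

    def too_close(rssi_readings):
        """True if any nearby beacon appears closer than the threshold."""
        return any(estimate_distance_m(r) < ALERT_DISTANCE_M for r in rssi_readings)

    # Example: readings (in dBm) from anonymous beacons seen in the last scan.
    if too_close([-48.0, -71.5, -80.2]):
        print("Beep! Keep your distance.")

In a real device the signal fluctuates a lot, so readings would be smoothed over a short window before alerting, but the point is that no identity is needed to make the wearer step back.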

What is good about this product is that, with covid-19, the most vulnerable group apart from those with underlying illnesses is the elderly, yet it is mainly this group that does not own a smartphone, or, if they do, probably does not use it optimally, which means they are excluded.

This is great for the workplace. I know when I’m out that I’m rubbish at this social distancing. I live on an island, at least that’s my excuse, and I work mainly from home; or it’s just that I get so focused on what I need to do that I don’t notice people around me. I really could do with a clip-on which beeps when I get too close to others.

It is a startup, Nodle, which produced this wearable. It will be interesting to read more on this -when I have time- and see where the wearable ends up.

So what would entice you to install/enable a covid-19 App?

In the UK, where they’ve developed their own centralised App (see what the ICO says), it is expected that people will download the App in the name of ‘civic duty’. Sounds very British 😉

Apparently the Australians have also developed their own App, and I’d be surprised if ‘civic duty’ would motivate Australian citizens 😉

However, the one developed to be installed as a default on Apple and Google phones, a decentralised version, could prompt users to enable it so that they can detect whether they have been in the proximity of an individual who could have covid-19, i.e. a covid-19 carrier or someone who has developed symptoms.

You know, I wouldn’t be surprised if UK citizens did actually install it on the basis mentioned above; a sense of ‘civic duty’ is a key British value 🙂

What will motivate most people outside the UK, though, is the idea that they can -as much as is doable- continue a normal life and minimise the risk of becoming one of the covid-19 statistics.

Mentioned above are two models, centralised and decentralised. In the centralised model the phone sends data to a government authority, which compiles statistics to understand the spread of the virus. It is claimed that no personal data is collected, i.e. that it is anonymised. In the decentralised model the data stays on the phone.
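As a rough sketch of the decentralised idea only (my own simplification, with made-up names, not the actual Apple/Google or NHS design): each phone broadcasts short-lived random identifiers, remembers the identifiers it hears, and later compares them locally against a published list of identifiers voluntarily reported by infected users.

    # Hypothetical simplification of a decentralised exposure-notification model.
    # Identifiers are random and rotating; the matching happens on the device.

    import secrets

    class Phone:
        def __init__(self):
            self.my_ids = []        # ephemeral IDs this phone has broadcast
            self.heard_ids = set()  # ephemeral IDs overheard from nearby phones

        def broadcast_new_id(self):
            eph_id = secrets.token_bytes(16)   # short-lived random identifier
            self.my_ids.append(eph_id)
            return eph_id

        def hear(self, eph_id):
            self.heard_ids.add(eph_id)

        def check_exposure(self, published_ids):
            # The only download is the set of IDs voluntarily reported by
            # infected users; the comparison itself never leaves the device.
            return bool(self.heard_ids & set(published_ids))

    alice, bob = Phone(), Phone()
    bob.hear(alice.broadcast_new_id())    # Alice and Bob were in proximity
    published = alice.my_ids              # Alice later reports testing positive
    print(bob.check_exposure(published))  # True, computed locally on Bob's phone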

The centralised model is only privacy-friendly if the data sent is truly anonymised, which I am sceptical about. At least at this stage, even if the intentions are good, I have yet to see a process which can really anonymise data; there is in fact, to my knowledge, no industry standard on the anonymisation process, which is multiple steps of de-identification, masking, obfuscation, etc., to make it impossible to revert back. In fact it will always be possible to revert back unless the keys used for each step are securely disposed of.
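To make the point about keys concrete, here is a minimal, hypothetical sketch of keyed pseudonymisation (my own example, not any app’s actual process): while the secret key exists, the same identifier always produces the same pseudonym, so records can be linked and re-matched by anyone holding the key; securely destroying the key is what removes that ability.

    # Hypothetical sketch of keyed pseudonymisation using HMAC-SHA256.
    # Holding the key allows consistent linkage (and re-matching against known
    # identifiers); destroying the key is the step that makes this infeasible.

    import hmac
    import hashlib
    import secrets

    class Pseudonymiser:
        def __init__(self):
            self._key = secrets.token_bytes(32)   # secret pseudonymisation key

        def pseudonymise(self, identifier):
            if self._key is None:
                raise RuntimeError("key destroyed; no further linkage possible")
            return hmac.new(self._key, identifier.encode(), hashlib.sha256).hexdigest()

        def destroy_key(self):
            # Secure disposal of the key, the step argued above to be essential.
            self._key = None

    p = Pseudonymiser()
    print(p.pseudonymise("+44 7700 900123"))  # same input -> same pseudonym
    print(p.pseudonymise("+44 7700 900123"))
    p.destroy_key()                           # after this, the mapping is gone

Note that data treated this way is pseudonymised rather than anonymised; under the GDPR it remains personal data, which is exactly why the scepticism above matters.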

Would I install/enable the App myself? For covid-19, probably not. I live on an island, and there’s no bridge to the mainland. I don’t consider myself to be in a risk group. However, if I were in a risk group, I would enable the Apple/Google one, the privacy-friendly version. Although, who knows, my ‘civic duty’ could kick in (as a British expat) if the pandemic fatality rate were much higher and a sense of panic set in.

Covid-19, a rehearsal for the real horror? and privacy concerns

The use of tracking technologies to collect data on the spread of covid-19 has triggered a lot of discussion concerning the privacy invasions of such practices. One discussion I got involved in concerned the installation, as a default, of a tracking App on both Apple and Android devices, which needed to be switched on by the user in order for the tracking to work.

My take on this, as a privacy guy, is that nothing privacy-invasive should be installed as a default. The general fear is that what is installed as a default can also be enabled as a default; just take a look at China.

Thinking about it, we are truly lucky in that covid-19 is not an Ebola. We are getting the chance to rehearse in preparation for the real thing, which will happen one day, and I’m hoping not in my lifetime. I think of Stephen King’s book The Stand, and what we see happening with deaths in old people’s homes, where residents have been (almost) forgotten, mirrors the horrors in that book. The horrors reflect the ‘Black Plague/Death’: people dying in their homes, with no hope. Fatality was 80%.

Together with a right to privacy come the associated dilemmas. It is clear to us all, although we may not think consciously about it, that we want privacy for ourselves but not for others: a basic dilemma which makes privacy per se difficult at the level of basic human values. This is reflected in the argument over privacy and the use of a tracking App. We don’t want any government authority -something we picture as a faceless Orwellian figure- tracking our movements. Yet we don’t want to die, and we don’t want our loved ones to die.

In fact the GDPR takes care of these dilemmas through clauses which enable government authorities to take measures in the name of public safety, and provisions for extraordinary circumstances also come into effect. Even if these Apps are not installed as a default today, they could arguably be pushed out at any time, without our consent, in the name of public safety. Privacy International has quite a lot on this subject, country by country.

Did you know that many of the international laws on wiretapping date back to a series of seminars hosted by the FBI in the United States in 1993 at its research facility in Quantico, Virginia, called the International Law Enforcement Telecommunications Seminar (ILETS), together with representatives from Canada, Hong Kong, Australia and the EU? The product of these meetings was the adoption of an international standard called the International Requirements for Interception, which possessed similar characteristics to CALEA in the United States. In 1995 the Council of the European Union approved a secret resolution adopting the ILETS requirements. Following its adoption, and without revealing the role of the FBI in developing the standard, many countries have adopted laws to this effect.

Virtual Shadows, 2009

The question is, do we trust our governments to do what is right in the name of public safety, i.e. not to abuse their power during times of crisis and pandemics? To be honest, we don’t have much choice. As much as we kick and scream, what it boils down to is that if a pandemic breaks out which can kill 80% of the population, these types of questions will not be asked. What about afterwards, in the wake of such a disaster? I guess we will just be grateful to be alive, even the politicians and faceless bureaucrats.

This is why the use of privacy by design in application development is of paramount importance, and I think this should be the focus of discussions from here on. ENISA published a useful paper in 2015 which can serve as great inspiration. It would be good to see this paper evolve moving forward. There are some technologies out there paving the way into new territories, and it could take some time before we get absolute privacy in the use of any digital technology.

IMHO, Privacy by Design embedded in all technology is where our energy should be placed. Words come cheap; action for long-term change is often missed in the heat of the argument. The point gets lost in the context and the human behaviour, which results in getting nowhere; or somewhere we don’t want to be.