Data Collection: Minneapolis – St. Paul, USA, Civil Disturbance

It was reported on U.S. news today that a U.S. military Predator drone was used last night to collect data during the riot in Minneapolis – St. Paul, USA. The drone reportedly circled an area roughly 10 miles in radius. The news report said that a large amount of personal data was collected by the Predator during its sweeps.

Thus, another instance of the intersection of data collection and security. I’m looking forward to reading more about this particular data collection activity.

Work from home safely. Get cybersecurity cement.

Since March we have seen an increase in cyber incidents relating to the current pandemic. During this period, reports suggest not necessarily an increase in cybercrime but rather a visible increase in the use of Covid-19 for tricking unsuspecting victims. In other words, no new crimes, but old crimes using new tricks.

Phishing, malicious domains and ransomware using Covid-19 as bait are the most prevalent tactics, but there is also an increase in attacks on vulnerable remote access technologies. Out-of-date software, or software developed without adequate privacy and security considerations, poses a higher risk when combined with home networks and inexperienced users. Working from home has become a reality for most people in a very short space of time. Many organisations have had to cobble together solutions to meet demand, for example relying on VPN solutions that had not been patched, or on insecure configurations exposed to unprotected internet connections.

Whilst security measures (like patching and pen testing) are obviously essential to protecting organisations, the increase in cyber incidents demonstrates the importance of data protection by design and by default. A data protection impact assessment (DPIA) allows for adequate risk identification and works towards achieving appropriate controls. It is also a robust way of documenting project development to ensure that privacy takes a structured place in design work-streams. Data protection by design and by default can supplement and support infosec colleagues in ensuring that incidents are dealt with in an appropriate manner.
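
To make the documentation aspect concrete, here is a minimal, hypothetical sketch in Python of how a DPIA risk register entry could be captured; the field names, the 1–5 scales and the scoring threshold are illustrative assumptions, not any regulator's prescribed format.

from dataclasses import dataclass, field
from typing import List

@dataclass
class DPIARisk:
    """One identified risk to data subjects, with its planned mitigation."""
    description: str
    likelihood: int   # 1 (rare) to 5 (almost certain) -- illustrative scale
    severity: int     # 1 (minimal impact) to 5 (severe harm to data subjects)
    mitigation: str
    residual_risk_accepted: bool = False

    @property
    def score(self) -> int:
        # Simple likelihood x severity heuristic; thresholds vary by organisation.
        return self.likelihood * self.severity

@dataclass
class DPIA:
    project: str
    risks: List[DPIARisk] = field(default_factory=list)

    def open_high_risks(self, threshold: int = 15) -> List[DPIARisk]:
        """Risks still needing stronger controls (or prior consultation)."""
        return [r for r in self.risks
                if r.score >= threshold and not r.residual_risk_accepted]

# Example entry for the hastily deployed remote-access scenario described above.
assessment = DPIA(project="Home-working VPN rollout")
assessment.risks.append(DPIARisk(
    description="Unpatched VPN concentrator exposes staff and customer data",
    likelihood=4, severity=4,
    mitigation="Apply vendor patches, schedule a pen test, retrain staff",
))
for risk in assessment.open_high_risks():
    print(f"High risk to remediate: {risk.description}")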

Finally, an essential part of any DPIA is to identify immediate necessary mitigations and the subsequent actions to prevent reoccurrence, i.e. to remediate. I have never done a DPIA that hasn't made reference to training. Indeed, training is the cement that ties cybersecurity and privacy together and creates a strong wall of defence for an organisation. Many organisations should be looking at retraining the workforce after the pandemic. This is not to "teach" them how to work from home, but how to do it "safely"!

A GDPR Retrospective

Two years on and GDPR is still going strong. However, there's still so much in front of us. Case law and other regulations, like the new ePrivacy Regulation (ePR) whenever it is approved, will bring even more changes.

In retrospect, the implementation of GDPR was an eye opener to so much more than the mere implementation of a regulation. Like any change programme, the GDPR implementation affected every part of the organisation in a way that could be deeply transformative, even disruptive.

The reason is that, like a die, Data Protection has many interconnected faces. I particularly like the POPIT model, a development of Leavitt's Diamond, which explains the different elements to be considered during a business change process:

  • Organisation 
  • Processes
  • Information and Technology
  • People

Organisation

The challenges smaller organisations face tend to be different from those of large ones. Smaller firms may have more issues with resources or with having a full-time person in charge of Data Protection. Some tech startups may not see GDPR as a priority.

Larger organisations, on the other hand, may find it difficult to present a unified Data Protection front if they have many self-sufficient business units that are not in close geographical proximity.

Other times organisations are structured in silos, and they may not know what happens to data that leaves their area to move into another within the same company. It's also possible that personal data is treated differently in different areas, or even within the same area depending on the project. This can cause duplication as well as inaccurate data. Organisational changes may be hard and take time to plan. A good outcome of the project is to build robust data governance.


The more Data Protection practitioners liaise with the Legal, IT or Cybersecurity departments (depending on where the function sits), the more successful the implementation, and the business as usual afterwards, will be.

Processes

Processes may not be mapped or stakeholders may not know there’s a process. Sometimes, things are done in a certain way but not recorded. 

Mapping and changing processes may mean cutting through silos and requiring the collaboration of adjacent areas that were previously working separately, like marketing and analytics. That could also lead to organisational changes to restructure how the work is carried out.

One process that certainly needed to be created was providing a way to handle data subject rights, mostly access and deletion requests: processes that can also cut across different areas of the business and need assistance from IT.

Information and Technology 

The most troubling word that can be heard in this area is legacy: legacy systems, legacy data. Old systems may no longer be able to delete data or provide the necessary technical safeguards.

This is the area where the automation of data retention schedules and privacy engineering live. Privacy engineering is a growing field. LINDDUN is an example of a privacy threat modelling methodology for dealing with privacy threats in software architectures.
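
As an illustration of what automating a data retention schedule can look like, here is a minimal sketch in Python; the purposes, field names and retention periods are hypothetical assumptions, not taken from any particular system.

from datetime import datetime, timedelta

# Hypothetical retention schedule: purpose of processing -> maximum retention period.
RETENTION_SCHEDULE = {
    "marketing_contact": timedelta(days=2 * 365),
    "job_application": timedelta(days=180),
    "support_ticket": timedelta(days=3 * 365),
}

def expired_records(records, now=None):
    """Yield records held longer than their purpose allows
    (candidates for deletion or anonymisation)."""
    now = now or datetime.utcnow()
    for record in records:
        limit = RETENTION_SCHEDULE.get(record["purpose"])
        if limit is not None and now - record["collected_at"] > limit:
            yield record

# Illustrative data only.
records = [
    {"id": 1, "purpose": "job_application", "collected_at": datetime(2019, 1, 10)},
    {"id": 2, "purpose": "support_ticket", "collected_at": datetime(2020, 3, 1)},
]
for r in expired_records(records):
    print(f"Record {r['id']} has exceeded its retention period; review for deletion.")

A scheduled job running a check like this is one way of turning a paper retention policy into an operational control.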

It’s very important to work alongside Cybersecurity, too (if Data Protection does not reside in this area), as Cybersecurity faced many similar challenges some years ago and has a very acute sense of risk management.

People: mostly, people 

All the areas above must be worked on. Changing one of these elements has an impact upon the other three, as all four aspects are interrelated. In addition, there needs to be alignment with the business, as change has to be planned taking into account the idiosyncrasies of an organisation in order to be successful. The most important part of a GDPR program is to influence people's mindsets, so that there is a different approach to personal data. Sometimes, that also means learning what data is and what it can be: its significance in organisations and in our personal lives. That is the cultural change. It can be achieved through different kinds of awareness programs and training.

Building Data Protection by Design and Default starts within the hearts and minds of the people. 

The change in mindset comes first and foremost from understanding what data is, what particular data types a business collects, and how they are processed, both in a particular area and in the business as a whole. Stakeholders will then understand, not only intuitively, what work their area is actually carrying out. This distinction is useful for understanding and improving processes, but also for understanding the value of personal data. It also teaches about risk and liability.

Once knowledge about data has been built, we can start asking questions: how can the processing be carried out with less data, or with different data? What is the reason to retain the data? Should we be carrying out the processing in the first place?

Data Protection has as many faces as each one of us, as personal data is the immense collection of little bits of us generated in each interaction that live in digital form. Even if Data Protection programs can be complex and span many countries and jurisdictions, they always refer back to the protection of the individual: the right to privacy that needs to be taken into consideration and balanced when necessary, but never forgotten, as we risk losing it otherwise.

Available Presentation: Data Licensing and Privacy Protection Workshop for University TTOs

Here is a link to a January 2020 webinar, co-presented with a colleague at the University of Southern California, covering the twin issues of privacy protection and data licensing. Enjoy!

https://techtransfercentral.com/marketplace/distance-learning/data-licensing-and-privacy-protection-workshop-for-university-ttos/

Finnish business fined for tracking employees

In Finland, one of the first fines was handed out to a water supply management company that used location data from the vehicles used by its employees, which is considered systematic monitoring. A DPIA should have been conducted.

Taken from the DLA Piper blog:
The case followed a complaint made by an individual. Kymen Vesi processed location data of its employees by locating their vehicles. This location data was used to monitor the employees’ working hours.
The Data Protection Ombudsman stressed in its decision that a data controller must carry out a DPIA when the processing likely results in high risk to the rights and freedoms of data subjects. Kymen Vesi should have carried out a DPIA since the processing of location data concerned data subjects in a vulnerable position (employees) and the data was used for systematic monitoring. In reference to the criteria list set in WP29 guidelines on DPIA and determining whether processing is likely to result in high risk, the processing conducted by Kymen Vesi satisfied three of the criteria (processing of location data, data subjects in vulnerable position and systematic monitoring of data subjects) when usually a DPIA is already required when two of the criteria are satisfied.

Read the rest of the blogpost from DLA Piper blog.

Privacy, Civics, the STEM Disciplines, and the Future

By James Casey, Esq., CPP

The recent passage of Resolution 108 at the ABA House of Delegates meeting in Austin, Texas, presented a wonderful opportunity to speak again to the importance of Civics in American life. Supported by the Standing Committee on Election Law, Section of Civil Rights and Social Justice, Standing Committee on Public Education, Section of State and Local Government Law, and the Law Student Division, the Resolution urges all levels of government to facilitate the preregistration of voting by youth between the ages of 16 and 18. This preregistration will lead to increased youth voting in elections at all levels, but it is critical that Civics education be significantly increased in schools to facilitate informed voting. Two paragraphs in Resolution 108 are most important:

FURTHER RESOLVED, That the American Bar Association urges state and local educational institutions to adopt robust civic education programs to promote literacy in the institutions of American government, the methods of active civic participation in elections and governance, and a solid foundational understanding of the role and crucial importance of the rule of law; and

FURTHER RESOLVED, That the American Bar Association urges federal, state, local, territorial, and tribal governments to enact legislation, promulgate regulations, and appropriate sufficient funds to implement voter preregistration and civics education as called for by this resolution.

The Connection Between Privacy, Civics, STEM, and Innovation

You may be asking yourself at this point: What is the connection between Privacy, Civics, and the STEM disciplines (Science, Technology, Engineering, Mathematics)? There are a few important connections that may be named now: 1) STEM disciplines are at the forefront of technological initiatives to enhance privacy protection (regardless of the country); 2) A public (and youth in particular) educated about Civics and government is also a public educated when it comes to privacy and data protection; 3) Academic institutions conduct research into areas such as AI (artificial intelligence), which will carry over into privacy issues and strengthen the classroom experience; 4) Privacy and data protection in the future will increasingly adopt scientific improvements, which are often developed in universities; and 5) Privacy and data protection are interdisciplinary areas, just like Civics and the “hard sciences” (STEM). To the author, these areas are highly complementary. These connections will be amplified in a future blog post.

The importance of Civics education in the nation’s schools goes beyond enhanced voting. The next section addresses the STEM disciplines, innovation, and how Civics education is just as important as STEM education. Similarly, Privacy education is just as important as the education required in Civics and STEM.

The STEM Disciplines and Innovation
 
Alan Leshner’s well-written editorial in the 27 May 2011 issue of Science Magazine, entitled “Innovation Needs Novel Thinking,” highlights the important linkages between the STEM disciplines and innovation in ensuring that the American economy remains at the forefront of global economic growth. This section of his editorial struck me as vitally important:

In addition, innovation often comes from nontraditional thinking, and many new ideas will come from new participants in science and engineering who often are less tied to traditional ways. That argues for increasing the diversity of the scientific human resource pool, adding more women, minority, and disabled scientists, as well as researchers from smaller and less-well-known institutions. The benefits of increasing diversity by fostering innovation and economic success have been argued well elsewhere (see citation in original article). Both research institutions and funders need to attend more to these sources of novel thinking and may have to refine recruitment, reward, and funding systems accordingly (Leshner, p. 1009).

The ideas he outlined in his editorial, furthermore, can find a kinship with points made by Federal Reserve Chairman Ben S. Bernanke in his speech entitled “Promoting Research and Development: The Government’s Role,” given at Georgetown University on 16 May 2011. As Mr. Bernanke says on pages 10-11 of his speech:

… At the same time, critics of K-12 education in the United States have long argued that not enough is being done to encourage and support student interest in science and mathematics. Taken together, these trends suggest that more could be done to increase the number of U.S. students entering scientific and engineering professions.

The commentary by Mr. Bernanke and Mr. Leshner is absolutely on point. The United States needs increasing numbers of graduates who are skilled in the STEM disciplines if it is to remain a dominant economic power. But that objective is only part of the goal of increasing innovation and economic wealth. The innovation environment needs to be expanded beyond STEM.

Expanding the Context of Innovation

While focusing on the STEM disciplines is a meritorious approach to increasing innovation and wealth creation in the United States, it does not cover the entire universe of what is necessary to create an innovation society. Attention to non-STEM areas – such as Civics – is critical to creating an innovation society. Civics is the broad area encompassing such disciplines as history, law, and political science. An educated and engaged citizenry is critical to the creation of an innovation economy in the United States. And advances in privacy are critical to an innovation economy anywhere in the world.

One can find the genesis of law and innovation in the U.S. Constitution. Article I, Section 8, Clause 8, of the Constitution empowers the U.S. Congress:

To promote the Progress of Science and useful Arts, by securing for limited Times to Authors and Inventors the exclusive Right to their respective Writings and Discoveries.

This clause serves as the constitutional bedrock for U.S. intellectual property law. This is the first clue that technology and innovation are not solely a STEM concern.

The May 2011 issue of the ABA Journal discusses these issues in an excellent article entitled, “Flunking Civics: Why America’s Kids Know So Little.”[i] The article says the following with regard to a focus on certain disciplines (p. 34):

Since the late 1990s, when American students tested poorly in reading, science and math against students from 20 other Western nations, federal education policy has focused strongly on those three subjects at the expense of history, social studies, government and civics.

That trend began in 2001 with the Bush Administration’s landmark No Child Left Behind Act, which gives priority to federal funding for efforts to improve student performance in reading and math, skills that are considered fundamental to student success in the workplace. The program continued under the Obama Administration’s support for so-called STEM programs, which rewarded student achievement in the fields of science, technology, engineering and math.

Educators fear that this long-range focus on a few limited subjects that are considered fundamental to student success is squeezing out the amount of time and effort devoted to subjects considered non-fundamental, such as history, social science, government and civics.

This concern over the “squeezing out” of non-STEM subjects is matched by documented evidence that U.S. students and adults have a very poor grasp of law, history, or government, all of which are considered essential for civic engagement. The ABA Journal article (p. 34) notes that a 2005 survey by the ABA found that nearly half of all Americans were unable to correctly identify the three branches of government, and a FindLaw survey that same year found that only 57% of Americans could name any U.S. Supreme Court justice. Retired U.S. Supreme Court Justice Sandra Day O’Connor is quoted in the article as saying (p. 37):

There are all kinds of polls out there showing that barely one out of three Americans can name the three branches of government, let alone describe what they do.

If the polls are correct in large measure, meaning that most Americans are illiterate when it comes to their government and what it does, how can they function and benefit in an innovation economy? There is more to government than releasing funds to beneficiaries.

The American Bar Association has long had a significant interest in civics education. As noted in the ABA Journal article (p. 37), the ABA Commission on Civic Education in the Nation’s Schools is co-sponsoring a series of academic events around the country where community leaders can teach students about the law, the Constitution, and the importance of civic engagement. The Commission has supported these activities with other resources, such as a resource guide and a website where law schools, courts, civic organizations, and other organizations interested in sponsoring such a forum can find suggested curriculum, formats, lesson plans, strategies, and other information (p. 37).

The Connection Between Civics, Voting, and Innovation

It is easy to design a high school or undergraduate course drawing the connection between civics, voting and innovation. This includes such topics as: 1) Why it is important that Civics be taught in grade and high schools and why it is important for the rule of law; 2) The constitutional basis of copyrights and patents in the U.S. (Article I, Section 8, Clause 8); 3) The history of inventions in the United States, particularly those of significance; 4) Basic STEM dimensions that bear upon innovation today; 5) The major laws and regulations impacting innovation today; 6) Current issues in innovation; and 7) The future of innovation.

This approach – tailored for a specific educational level – would help engage all students in the concepts of innovation and raise the level of civic engagement in the area of innovation. Such a course would educate all, not just students engaged in the STEM disciplines or majoring in those areas.

Conclusion

A strong Civics curriculum at the grade, high school, and college levels would benefit America in several ways.

As exemplified by ABA Resolution 108, a robust dedication to teaching Civics at all levels, coupled with voter preregistration between the ages of 16 and 18, would lead to increased and informed youth voting. American democracy is strengthened by these improvements. There is more to American democracy than the internet, Facebook, and Twitter. Students must be well versed in American history, law, politics, and Civic engagement. Privacy and data protection are strengthened by having educated youth and an engaged citizenry.

An American citizenry educated in Civics and STEM (or STEAM as the new acronym – adding Arts) will also go a long way to creating a culture of innovation. If America truly wants an innovation society that creates wealth for all its people, then the education of America’s youth will have to go far beyond the STEM disciplines. Privacy is a critical component in that education. Students will learn that true innovation in the United States stems from democracy and a largely capitalist economic system. Increased Privacy and Civics education, increased voting, and increased STEM education will lead to continued American success in a global economy.

The current pandemic is a time of monumental change, sadness, and uncertainty. Despite those characteristics, it is also a time of great opportunity, with Privacy at the forefront.

__________________________________________________________________________

James Casey, Esq., CPP, is an attorney, certified privacy practitioner (CPP), and consultant based in Washington, DC. He is also an Adjunct Associate Professor in the CUNY M.S. Program in Research Administration and Compliance. He is presently a State Bar of Wisconsin representative to the ABA House of Delegates and holds several positions within the ABA Science and Technology Law Section. He is a past president of the State Bar of Wisconsin Nonresident Lawyers Division and is a Life Fellow of the Wisconsin Law Foundation and a Fellow of the American Bar Foundation. The opinions expressed in this article are solely his.


[i] Mark Hansen, “Flunking Civics: Why America’s Kids Know So Little.” ABA Journal, May 2011, pp. 32-37.

The sizeable gap

Is it more important for a Data Protection Leader to be an expert in data protection law, or to orchestrate behavioural change from top to bottom?

I’m still surprised by the number of job ads for data protection leadership roles that focus heavily on the need to have either a legal background or a deep understanding of laws and regulations, yet almost fail to specify critical leadership behaviours, let alone the competences needed to change behaviour at all levels of an organization.

It’s about People
In simple terms, data protection is about people.

People entrust companies to process data about themselves, and companies must demonstrate they respect their rights.

People (employees) in companies process the data.

Senior managers and leaders in companies are people making critical decisions that make or break the success of data protection compliance initiatives.

The legal bias
Unfortunately, many data protection compliance initiatives float around companies tagged as ‘necessary evils’ or ‘compliance issues’, typically anchored in legal or compliance departments. It is still rare to find a Data Protection Strategy aligned with the key data-fueled elements of the company’s business strategy and anchored in the parts of the business that most benefit from the processing of personal data.

These ‘necessary evils’ often only pay lip service to ‘the people factor’.

Policies and procedures are often imposed on employees without any of their involvement in the drafting process. There may be some generic data protection education, or an off-the-shelf eLearning package.

Just giving employees information doesn’t change their behaviour. 

Senior managers and leaders often see themselves as above the need for the specific education required for them to fully understand the implications of the decisions they’ll take that can make or break the success of a project, programme or BaU process.

The successful Data Protection Leader
To be successful in fulfilling the aims of legislation such as the GDPR, a Data Protection Leader needs to be able to actively guide, lead, influence and inspire a diverse range of stakeholders (people) in their companies. They need to understand how companies work, not least the ever-changing ‘invisible architecture’ of inter-personal power dynamics, relationships, agendas, motivations, etc. that are unique in all companies. 

Focusing on people requires influencing their behaviour. The successful Data Protection Leader understands and applies the same tools used around us all in other contexts to influence our behaviours: often the same behavioural science techniques used by their own company’s product development and marketing departments to influence consumer behaviour.

The successful Data Protection Leader uses these tools to influence senior executives and other leaders to respect data protection in the same way as they respect, say, data analytics – two sides of the same coin. The leaders are coached in key data protection concepts relevant to the decision making expected of them, particularly risk acceptance and investments, especially investment in behavioural change across the organisation.

Employees will then start to ‘get it’ instead of trying to decipher bold, generic corporate statements and principles about ‘GDPR’.

They will then know exactly what’s expected of them at 10.12 on a Tuesday morning when they are scoping a new marketing campaign, or at 14.30 on a Thursday afternoon when they are participating in a kick off workshop for the new consumer app.

Conclusion
Often, small and simple behavioural changes drive significant results. The first change companies must make is to recognize that data protection is not solely a legal issue.

Many competences are required – including strong legal expertise – and companies need to appoint Data Protection Leaders who are well equipped to guide, lead, influence and inspire people at all levels of their organization.

Tim Clements
Copenhagen, Denmark

COVID-19, Data Protection law and Privacy… Or the needs of the many vs the needs of the one.

When you have no right to privacy, Data Protection law governs an organisation’s respect for your information. It should not be Data Utility vs Privacy, but Data Protection and Data Utility.

The terms data protection and privacy are often used interchangeably.  Recently I have seen a high number of articles about “COVID-19 Symptom tracking Apps” and in the privacy community they all say the same – “is the loss of our privacy worth it?”  It’s tempting to look at it this way, and as Data Protection legislation is based on fundamental human rights legislation and principles, these are indeed worthy questions.  

It is always a societal balancing act, when considering the needs of the many vs the few, or the one.

It is clear that in times of emergency, civil liberties can sometimes be suspended for “the greater good”, and the current “lockdown”, with Police powers to enforce it, is a clear example of this.  We all accept that it is for our protection in the “greater good”, but no one wants to awake from a subsided pandemic into a “new normal” of a surveillance-led Police state similar to an Orwellian 1984 Big Brother.  Power, once shifted and obtained, is not easily set aside by Governments.  It only takes a look at the US Government’s “Patriot Act”, passed for a limited time in response to the 9/11 terrorist attacks, which has been consistently renewed by successive governments ever since.  I’m reminded of Star Wars, where the Empire began with an unscrupulous Chancellor using emergency powers granted during a time of war to create an oppressive, power-hungry regime.  Yes, fantasy, but if our fiction is a mirror to our society, it is clear these are concerns that we all share.

As soon as I see technological surveillance being normalised, such as workplaces monitoring employee attention on remote web conference calls, or CCTV and Drone use being utilised to keep individuals in their place, I naturally recoil, and the libertarian in me seeks to object to the direction that our new surveillance society is taking.

But where does the law stand?  “Privacy” is not really mentioned in our current law, and instead we use the term “Data Protection”.  Often I see these terms used incorrectly as synonyms.  Privacy is not the same as Data Protection law. Why?  

The answer lies in this simple statement:

Data Protection law still applies, even when you have zero right to Privacy.

Let’s take it back a bit.  Privacy, and your human right to it, is closer to a synonym for “secrecy”, or your right to control whether your personal data is disclosed or not.  Clearly we can’t live in a society where we live in secret.  We have to transact our personal information in order to contract with businesses, lead productive social lives, contribute to society, pay our taxes, etc.  This means our right to privacy changes depending on where we go and what we do.  Clearly we have a large amount of privacy surrounding sexual practices in the privacy of our own home.  We have much less expectation of privacy on a busy high street, acting in our business capacity at work, if we hold public office, or if we use our persona to cultivate celebrity status.  Privacy is changeable and varied, and it depends on where we are and the context we act in.  It is also not consent based in the majority of cases, as we cannot say “no” to having our data collected for tax or for a legal obligation, or deny law enforcement access where they have a genuine need to investigate crime.  In the majority of cases, we may have little or no right to privacy, or real choice over the collection of our data.

This is actually why Data Protection law is so important, as it sets rules and principles for when our privacy should be respected or when it should not.  

It does this by requiring organisations to have a legal basis for holding personal data, which defines the balance of power between the individual and the organisation. Clearly, if a law requires the company to hold the data, the individual has little right to privacy, but if the organisation is relying on something like consent, the individual holds much more power – either way, the organisation still needs to comply with its data protection legal obligations.  Most importantly, where we do not have a right to privacy, Data Protection law sets up responsibilities for those that hold our data.  Data Protection law goes far beyond the scope of Privacy by defining the safeguards that must be in place for the proper use of the data by those that hold it.  Let’s consider some key parts of Data Protection law, summarised as follows:

  • Use Justification (legal basis)
  • Transparency (privacy notices)
  • Collection Limitation (minimum necessary to the purpose)
  • Use Limitation (only used for the purposes notified)
  • Accuracy (ensuring data is accurate and kept up to date)
  • Storage Limitation (held no longer than necessary)
  • Security (appropriate technical and organisational controls)
  • Individual Participation (allowing individuals rights such as access to a copy, rectification etc)
  • Transfer limitation (kept in countries/organisations with appropriate safeguards)
  • Accountability (appropriate documentation kept to demonstrate compliance)

There is much more to Data Protection law obligations than just these (controlling third parties, privacy impact assessments, privacy by design, etc.), but I would argue these have less to do with Privacy itself and more to do with practical information governance and data management: examining the flows of the data and ensuring appropriate controls and safeguards have been designed in, so that people’s data is treated with respect and only the minimum is used where strictly necessary.
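
To show how the principles listed above translate into that practical information governance, here is a minimal, hypothetical sketch in Python of a record-of-processing entry whose fields mirror those principles; the field names and example values are illustrative assumptions, not terms from the legislation.

from dataclasses import dataclass
from typing import Tuple

@dataclass
class ProcessingRecord:
    """One processing activity, documented against the principles listed above."""
    purpose: str
    legal_basis: str                     # use justification
    privacy_notice: str                  # transparency
    data_fields: Tuple[str, ...]         # collection limitation
    retention_period_days: int           # storage limitation
    security_controls: Tuple[str, ...]   # security
    transfer_destinations: Tuple[str, ...] = ()  # transfer limitation

    def gaps(self):
        """Rough accountability check: which documented principles have no answer yet?"""
        missing = []
        if not self.legal_basis:
            missing.append("use justification (legal basis)")
        if not self.privacy_notice:
            missing.append("transparency (privacy notice)")
        if not self.data_fields:
            missing.append("collection limitation")
        if self.retention_period_days <= 0:
            missing.append("storage limitation")
        if not self.security_controls:
            missing.append("security (controls)")
        return missing

record = ProcessingRecord(
    purpose="Symptom self-reporting app (see the discussion below)",
    legal_basis="public interest in public health",
    privacy_notice="in-app privacy notice",
    data_fields=("reported symptoms", "postcode district"),
    retention_period_days=28,
    security_controls=("encryption at rest", "access logging"),
)
print(record.gaps() or "All documented principles have an answer.")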

So let’s return to our examples of these “COVID-19 symptom tracking apps”.  Clearly I can think of public interest reasons where we need to sacrifice individual privacy for the greater good and wider public health.  However, trust is key. We must all be careful to ensure that any of these solutions is carefully planned out in accordance with the principles above, with properly conducted Privacy Impact Assessments giving rise to controls that protect our personal data and, by extension, us.  The data should be used only where strictly necessary, with the minimum data collected, appropriate safeguards to minimise risks to the individual, deletion when no longer strictly necessary, and no use for purposes other than those originally identified and specified at the time of collection.  This is a far greater challenge than a simple “Yes/No” to the invasion of privacy, but reasoned justification and practical data management measures will win the day, providing great societal benefit and protecting the individual simultaneously.

It is not therefore Privacy vs Benefit, but Data Protection and Benefit. A positive-sum, win-win solution that benefits everyone, both individually and society as a whole.

Ralph T O’Brien, Principal, www.reinboconsulting.com

Belgium DPO conflict of interest resulted in a fine

2 years on and finally a fine pertaining directly to the role of the DPO…. hurray! What a great celebration for GDPR and each of us who have the privilege to be a Data Protection Officer.

Avoiding a conflict of interest for the DPO is super important in any organisation, because the role requires that he/she stands in the shoes of the data subject, which can potentially conflict with how the organisation views risk.

If we take this from a privacy risk angle, what is privacy risk? It is the risk of harm to the rights and freedoms of an individual (or natural person, as per GDPR). You can think of the DPO as similar to a consumer advocate in an organisation, except that the DPO ensures the organisation is fulfilling its obligation as a fit custodian of personal data and that the rights of the data subject are met.

A conflict of interest can occur when looking at risk. Every privacy risk will map to another organisational risk; for example, missing encryption on laptops is a privacy risk, but the underlying security risk is the cause of that privacy risk.

When you as DPO need to decide on risk appetite, you need to do this in the shoes of the data subject first. It’s not practical to ask all data subjects whether they find a risk acceptable; most wouldn’t understand what you’re talking about. As a CISO/CRO, you will be looking at risk from the view of the organisation’s risk appetite. These two views can create conflict in the role of the DPO, hence a conflict of interest.

This is why the recent ruling in Belgium, one of the first of its kind since the GDPR came into force, is so important.

GDPR is very Personal

This is a personal post. Not because it is the anniversary of the GDPR and I am emotional about that, but because, to me, GDPR is very personal, and I hope you don’t mind.

Personal Letter

When was the last time you sent a letter? I do not mean a letter to the tax office or the employment agency, or an invoice to customers. A real one?

When I was younger, writing and then sending a letter was the most joyful thing. I knew I could write whatever I wanted to write, my girlfriend could read it, and I could get letters from her. Everything was private, everything was between the two of us, and it was so emotional, so special, so joyful.

Now, instead of letters we send emails and Facebook messages; we WhatsApp, we Viber, we Hangout, we Telegram.

I was almost confident that my letters were not opened; I had this trust, because I could leave marks that would let the receiver notice whether my envelope had been opened or not.

Yes, it was an analog process, and now we are living in the digital era. Things have changed. Now we WhatsApp, message, tweet and email each other.

I would like to continue with a nice example Peter Krantz gave when he was CIO of the Swedish National Library.

Just to illustrate how communication has changed, he uses the analogy of borrowing a book from a library:

When you, as a user, borrow a physical book, everything stays between the library and you. Now, when you read a digital book, there are many different stakeholders involved, as shown in his picture, and I have no control over the use of this information.

But don’t get me wrong, I think we don’t even need GDPR and we can fully trust companies and states.

Whoever complains about privacy or talks about human rights is just one of a bunch of crazy people living in a delusional world.

Echelon

When I was younger, we were told conspiracy theories that there was a secret ECHELON program that collects and stores all digital communication. It sounded crazy. Why should there be an organization like this? Why should they collect all this data?

But now we know it is a fact. We know because we have the evidence provided by Edward Snowden. We know because there are whistleblowers. Governments are not hiding it anymore; they just do it to protect us from terrorists! Companies are not hiding it either. They admit that they collect data, take our consent, as if we had another option, and use it.

As a result, governments and public bodies create, collect, store and hold lots of critical information about us. It is not only our name, street address and registration number. It is our health data, our videos recorded by security officers, our photos and voice recordings when we enter and leave countries at the airport. It is our photos (and maybe even our voices) when we stop at a stoplight, and our fingerprints when we enter (some) countries or get a new passport or driving licence.

Governments ask us to trust them; private companies ask us to trust them. As I said, I trust them all! When I read “personalized content” I just understand: “Oh boy, great, they are recording and following us for my safety and security.”

TrustCorp

“The information we collect includes unique identifiers, browser type and settings, device type and settings, operating system, mobile network information including carrier name and phone number, and application version number. We also collect information about the interaction of your apps, browsers, and devices with our services, including IP address, crash reports, system activity, and the date, time, and referrer URL of your requests”

“We will share personal information outside of Google if we have a good-faith belief that access, use, preservation, or disclosure of the information is reasonably necessary to:”

Let me explain what this means in practice for TrustCorp:

They collect personal information like your name, email address, telephone number or credit card. They can collect our phone numbers, which is not their core business; they can collect the identifier of my phone, even though I did not buy the phone from them; they can see all calls, dates, durations and types of calls, and nothing is mentioned about whether they also record my calls (maybe, who knows?). Then, they can collect information about the websites I visited, what worked and what crashed, all the location information they can derive from any sensors or wireless access points around, and information about my local storage. Look, storage is interesting, actually: it is like IKEA knowing how much space I still have empty in my wardrobe and receiving regular updates about it, isn’t it?

If you think this is too much information, no, it is not! If they think they need more information from me, they will notify me (if it is something notifiable) and ask for my explicit consent, meaning that I have to accept their terms and conditions, otherwise I cannot check my emails, check maps, etc. I will not have the option of “No, I reject, but I still use the service as I was using it before.” I cannot reject and keep using the old version either! I will have the one and only great option: consent!

Not only do these companies have our data, they can also share all of it with

 “companies, organizations or individuals outside of TrustCorp if we have a good-faith belief”

They can share it with individuals. Who might these individuals be? And what if TrustCorp has “a good-faith belief”? What is “good”, and whose “faith” is it?

TrustMEtoo CORP

Now my second example is from another corporation, TrustMEtoo. This is the one that usually tries to improve its privacy image and just deflect the questions. Let me explain how this company was used by Trump before the election: Parscale uploaded the names, email addresses, and phone numbers of known Trump supporters into the TrustMEtoo advertising platform. Next, Parscale used TrustMEtoo’s “Custom Audiences from Customer Lists” to match these real people with their virtual TrustMEtoo profiles. With TrustMEtoo’s “Audience Targeting Options” feature, ads can be targeted to people based on their TrustMEtoo activity, ethnic affinity, or “location and demographics like age, gender and interests. You can even target your ad to people based on what they do off of TrustMEtoo.”

Parscale then expanded Trump’s pool of targeted TrustMEtoo users using “Lookalike Audiences”, a powerful data tool that automatically found other people on TrustMEtoo with “common qualities” that “look like” known Trump supporters. Finally, Parscale used TrustMEtoo’s “Brand Lift” survey capabilities to measure the success of the ads.

Then the data was shared with trustable organizations and individuals to create their own databases, and then there was Project Alamo, where data on 220 million Americans was stored, with approximately 4,000 to 5,000 individual data points per person.

As I was saying: we should trust corporations and Trump’s government, right?

Muslim Registry

Do you remember when there was a time when people were scared that Trump would register Muslims in the USA? Honestly, why were we scared that Trump was going to register Muslims in the USA?

Honestly, do you really think he would register them one by one? As of today, I guess we all know that he is not as stupid as he was presented to us by our “objective” media. He already has the registry: one of my friends shared in a Facebook post how Muslims were able to receive targeted letters from churches, presented as an innovative way for the church to reach out (the post is from 2016).

I am not against any religious practice, but I am not sure we are all OK with any organization or company being able to get such a detailed list.

Shall we trust companies? OH Yes!

Trust Governments

I think not only companies but all governments are trustable too. Let me give you an example from a state:

“According to a half dozen current and former employees, who spoke on the condition of anonymity, leaked Procera documents and internal communications, Turk Telekom requested not just a feed of subscribers’ usernames and passwords for unencrypted websites, but also their IP addresses, what sites they’d visited and when.” Forbes, October 2016

This excerpt is from an old Forbes news story about when the Turkish state’s technology provider asked a Canadian company to give access to “usernames and passwords for unencrypted websites, but also their IP addresses, what sites they’d visited and when”. We only heard about this because the company had a Swedish branch, Swedish employees and a Swedish CEO, and they protested, and the CEO resigned. What if they had not? What if there are companies that do not care about these issues but just profit from them?

Private Fridays and Privacy of Health Data

It is not only about when we use a service: with every device we add to our lives, corporations are so trustable that they even dare to say “be careful what you say” next to their voice-activated devices. You don’t need to worry about private talks, private moments or Friday nights with your partner anymore! Your dear friend Alexa will take care of it!

If you want to have a private moment and do not want them to hear and see you, go to the storage room! Wait a moment, maybe we already put a camera there!

Now we have Covid-19, and some countries are making it mandatory for people to provide input to specific apps and go through regular health screening with specific tools and cameras. They claim that it will ONLY be used for Covid-19. Let me give you a great example from an ongoing discussion about the PKU blood registry in a very democratic and open country: Sweden.

PKU is a genetic disease, and parents are asked to donate their kids’ blood for PKU clinical and health research. The majority of people, for good Samaritan reasons, donated their kids’ blood.

You know what happened? In 2003, after the assassination of Foreign Minister Anna Lindh, police were able to identify the perpetrator by means of blood samples from the PKU database, despite protests by the health service. When identifying Swedish citizens after the 2004 tsunami disaster, the Biobank Act was temporarily amended by a parliamentary decision that allowed the International Identification Commission to use the samples.

Imagine: they ask you for permission for your newborn baby’s blood to be taken for research to cure diseases, and you decide to donate. Now it can be used by the police and an international commission!

Imagine that you are “that kid” and your blood is registered in a database without your own consent. What if the government decides to open these databases not only to the police but to insurance providers, to find your pre-existing conditions and genetic diseases, which can then be shared by companies as trustable as those I described above!

Privacy is Creativity

Imagine the world we are entering: we are recorded and registered from our birth, from our blood to our every move, by different companies, states and governments that we are supposed to trust, and they can share this information.

There are tons of sociological studies on privacy, cameras and surveillance. They conclude that behaviour becomes conformist, shaped into what is accepted by the power owners and expected by the power elite, and we can basically conclude that creativity, freedom, resistance to power, and ultimately humanity itself, die!

Think for a moment: what if Hitler had had all this surveillance technology we have now?

[Image: Hitler selfie meme. Ref: Mary Jane Sunshine, https://imgur.com/gallery/D7ZYCTO]

What could we do to protect ourselves against all the political and technological power he would have had?

Privacy of Personal Data

I hope my examples above show why and how data privacy is very important for citizens.
But I have to make a clear distinction here. When government representatives talk about privacy and security, they ONLY refer to the privacy and security of government files. The problem with that framing is that these governments do not care about the privacy of citizen data; they do not care about the privacy of, basically, “mydata”.


How much can we trust that Google, Samsung, Microsoft and any other private companies will protect our data? What gives them the right to collect so much information about every one of us, and every device of ours?

I am not here to paint a negative picture, but we must face reality and define the problems properly, without falling into an ideological trap, so that we can come up with suggestions. Because whenever people raise their voices, these organizations create an environment in which privacy advocates are portrayed as a bunch of radical people who do not get the new world!

But just think about it: how many of you know how your data is collected and treated?

You are being told that it is anonymized, right? But at which level? I am sharing my anonymized pictures with you! Which one is stored in the database?

Encryption, right? I know a company that promised so; please search for Ashley Madison. Then their whole database was leaked, the data was easily readable, people lost their reputations, and several people committed suicide. People trusted a company regulated by state rules, but in the end their privacy was compromised!

OK Serdar, DrZero, you have complained a lot; is there any solution? Yes.

There are many solutions, but I want to keep that discussion for future posts. First, we should see that these issues are about everybody, about each of us!

All these “trustable” companies and governments are pushing us into a corner by saying (as Google CEO Eric Schmidt once did):

Trust us, we are good guys! And:

If you have something that you don’t want anyone to know, maybe you shouldn’t be doing it in the first place

Edward Snowden has an answer to them:

“Arguing that you don’t care about the right to privacy because you have nothing to hide is no different than saying you don’t care about free speech because you have nothing to say.”

I think I have the right to flip the sentence here:

Hey, dear companies and dear states: if you do not want to share how you store our data, encrypt our data, and process our data, then you have something to hide!

We put our trust in states, and they are not able to protect our data! We put trust in governments, and they do not do anything to protect individuals! We put trust in private companies, but they get hacked, and they actually abuse their power.

Google, Microsoft, Samsung, Facebook, all governments, human rights activists and citizens need to understand more broadly that by ignoring personal privacy in the data era, we not only allow invasions of our personal space but open the door to future abuses of power.

We have all the tools and technology available to address these problems. I just want to have my right to privacy and to have private communication, as I had with physical letters years ago! Is that too much to ask?

https://lakartidningen.se/opinion/debatt/2017/04/freda-pku-registret/