On the Second Anniversary of the GDPR: Mobile App Descriptions

With today being the second anniversary of the GDPR, below is an article I wrote regarding mobile apps and privacy, particularly with respect to the U.S. COPPA statute (Children’s Online Privacy Protection Act). I reviewed over 10,400 mobile apps in the Google and Apple stores while working at a Washington, DC, law firm, and it was an enriching experience. So here are my suggestions and tips. Enjoy!

__________________________________________________________________________

Mobile App Descriptions: Observations and Tips

James J. Casey, Jr., Esq., CPP

INTRODUCTION

One of the most dynamic aspects of the smartphone revolution has been the introduction of mobile apps that are downloaded for use on smartphones and tablets. The Apple App Store and Google Play Store are the primary players and have been since their establishment nearly 12 years ago. Phone manufacturers that attempted to create their own native app ecosystems, such as BlackBerry, generally failed. Of course, many know that BlackBerry phones now run the Android OS, which means access to the Google Play Store.

With the long-overdue focus on privacy (20 years late, in the estimation of the author), mobile apps are under increasing scrutiny. This is particularly true where apps may be directed towards children and may be collecting their personal data in the process. I have been fortunate to have reviewed over 10,400 apps in the Apple App and Google Play Stores, and I have some observations and tips to share with you. These observations and tips are exactly that; they are not legal advice. Thus, you are encouraged to seek legal advice from your general counsel or outside attorney.

OBSERVATIONS AND TIPS

  1. I have seen sloppily written app descriptions, complete with spelling and grammatical errors. App descriptions should be precisely written and explicitly clear about what the app is designed for, what age rating is appropriate, and what audience(s) the app is directed towards. The app store content ratings should accurately reflect the targeted age groups and content descriptions. This is especially true for children as well as for mature subject matter and topics involving violence, crude humor, blood/gore/death, stronger sexual themes and content, and partial nudity. If an app is part of the Google Play Store “Designed for Families” (DFF) program, then it should be marked as such on the app page.
  2. Make sure there is consistent alignment between the descriptive words and the pictures, media, images, and screenshots in the app description. This is especially true when it comes to the ages of the target audience (“directed persons”) and where the content involves mature or potentially controversial subject matter. If an app is directed towards both children (under age 13) and older individuals (a “mixed” app), then the description should clearly state that. It is critical that app descriptions for children be crystal clear.
  3. There is a fine line between too much and too little information on the app page. Some apps have too much extraneous information and not enough important detail. App descriptions are truly both art and science.
  4. It is also important to recognize that there are global differences in the use of child-specific images. In some regions and cultures, such images may be perfectly appropriate for older audiences (not directed at children), while in other regions and cultures those images are not used except in apps directed towards children. Be aware of this global dimension.
  5. Be clear about the financial dimensions of apps (as applicable). Will they require purchases once the app is downloaded? If so, what is the cost and how often? Is there a subscription option? Consumers do not want to be surprised by these costs. Reading app reviews in both stores is quite informative and illustrates how often consumers are “surprised” by these additional costs.
  6. If ads will pop up while an app is used, alert the consumer to this fact. This is another area where consumers would rather not be surprised.
  7. Ensure that your company privacy policy and other associated terms and conditions/terms of service are current and in compliance with the requisite statutes and regulations (such as the EU GDPR, the U.S. COPPA, and the California CCPA).

SUMMARY

App descriptions in the Apple App and Google Play Stores serve two important purposes: to entice people to download and use the app, and to comply with the relevant statutes and regulations of each country and jurisdiction. Apps require the same concise writing and dedication to detail that many other areas of technology and law require. From what I have seen in reviewing app pages, the biggest issues are sloppy writing (including missing substance) and inconsistent messaging (especially between words and images or media). It is better to identify child-directed apps on the app page than to have a governmental authority begin to question and analyze apps to ensure that the privacy interests of children are being protected.

We are entering a heightened age in which the protection of personal data is far more important than previously expected. It is better to adopt privacy-protecting practices now than to react to legislation and regulation later.

I may be reached at jcasey@caseyprivacycontracting.com if you have any questions or comments.

Happy Birthday, GDPR: 2 Years On!

To celebrate the GDPR’s second anniversary, I thought I would repost some blog posts from June 2018. However, when I looked, I realised there were only a few, and the strong theme was how our personal data is public in Sweden and how utgivningsbevis (publishing certificates) are used to keep this status quo. So I ended up writing an additional blog post, realising that I am still really unhappy about the Swedish status quo on this.

The GDPR has brought progress in ensuring that we, as data subjects, have rights over our personal data, but sadly what I posted two years ago is still acutely relevant today in 2020.

The fact is that in Sweden our personal data is made public and we have no say! After all, public is public: it is impossible to restrict processing once this is the case, as acknowledged in privacy laws, and not just in the EU. Data brokers scrape this data from public sources, do some intelligent profiling, and sell it on to businesses; for example, where you live determines how you are profiled and to whom you will be sold.

Someone once tried to argue with me that a street name (missing the house number) was not personal data. The fact is that the street where you live says quite a lot about who you are. It gives an indication of your wealth, whether you are young, have kids, or are elderly, and whether you are likely to have a garden, one or two cars, and so on. Your street name is directly or indirectly linked to you as an individual. The street name alone could be enough for you to receive cold calls, either by phone or from someone knocking on your door to sell you double glazing.

In the UK, for example, you are hidden by default. The difference in Sweden is that the clash between laws pertaining to ‘freedom of the press’ and those protecting ‘a right to a private life’ still stands today, and it is the former that wins.

I read somewhere that there are hundreds, maybe thousands, of complaints from Swedish data subjects about the lack of control and rights (as per the GDPR) they have over their personal data. This is positive! People are aware of their rights and are asking why this is happening. I can’t find the article now, so I would appreciate it if anyone can dig it up. The question is whether this will change. Can it change?

The proposed e-Privacy Regulation offers some protection from unsolicited calls, with protection by default: as in the UK, the resident needs to opt in to be included in a public directory.

Protection against spam: this proposal bans unsolicited electronic communications by email, SMS and automated calling machines. Depending on national law, people will either be protected by default or be able to use a do-not-call list to avoid receiving marketing phone calls. Marketing callers will need to display their phone number or use a special prefix that indicates a marketing call.

How it works in Sweden today is that every business needs to keep its own ‘do not call’ list. What is proposed in the e-Privacy Regulation appears to be a national list, which is an improvement but still does not solve the root of the problem: I do not want my data made public unless I have specifically consented to this or have made my data public myself.

Happy GDPR Day!

On the second anniversary of the EU’s GDPR, I thought it would be timely to post an excerpt from the second edition of my Cybersecurity Law, Standards and Regulations book, published earlier this year.

The European Union (EU) General Data Protection Regulation (GDPR) was approved by the European Parliament on April 14, 2016 and became effective on May 25, 2018. The GDPR replaces the EU Data Protection Directive and is designed to:

• Standardize disparate data privacy laws throughout Europe.
• Protect EU citizen privacy.
• Harmonize EU data protection and privacy safeguards.
• Encourage compliance through meaningful fines and sanctions.
• Put EU citizens back in charge of their personal data.

GDPR applies to organizations located within the EU as well as organizations located outside of the EU if they offer goods or services to, or monitor the behavior of, EU data subjects. In other words, GDPR applies to all companies processing and holding the personal data of data subjects residing in the European Union, regardless of the company’s location. Figure 3-1 provides a model of how GDPR is designed.

Figure 3-1. EU GDPR Model.

The GDPR differs from the EU Data Protection Directive in the following ways:
Directive vs. Regulation – GDPR carries more clout and removes the discretionary language that comes with a directive. The GDPR applies to all member states of the EU and removes the data protection inconsistencies among member states.
Jurisdiction Expansion – The coverage of GDPR is expanded past European boundaries and extends compliance to any organization that houses or processes EU citizens’ information, regardless of location.
Citizen Consent and Rights – Organizations can no longer use ambiguous terminology or confusing legalese to define or secure consent. Organizations must clearly define the terms of consent and how data will be used in plain language. Citizens also have the right to access their own data (right of access), receive it (data portability), and have it erased (right to be forgotten) on demand.
Privacy Safeguards – Privacy is now a legal requirement: privacy protection must be designed into systems and processes to meet the requirements of GDPR.
Enforcement – The GDPR is similarly enforced through courts, with penal and administrative sanctions in addition to civil remedies. What has changed is the amount of the fines that can be levied for a violation. Fines can go as high as EUR 20 million or four percent of an organization’s annual turnover, whichever is higher.
Breach Notifications – Under GDPR it is no longer necessary to submit breach notifications to each local privacy authority. A Data Protection Officer (DPO), which is a mandatory appointment for many organizations, would make the notification to the single relevant supervisory authority.


2019 was the year when GDPR enforcement ramped up. I believe that for every data breach experienced here in the U.S., a parallel GDPR enforcement action will be launched in cases where EU citizens are impacted. Table 3-9 provides a summary of some of the initial fines levied under GDPR.

Table 3-9. Largest GDPR Fines

The companies fined above are just the beginning: in July 2019, the U.K. data protection authority, the Information Commissioner’s Office, announced its intent to fine British Airways $228 million and Marriott International $124 million for violating the GDPR (Davies, 2019).

TIP: Create a GDPR impact statement based on four percent of your organization’s annual turnover as well as converting EUR 20 million into your local currency to determine total fine exposure.
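To make the TIP concrete, here is a minimal sketch of the calculation in Python. The “whichever is higher” rule reflects the fine provision described above; the turnover figure and exchange rate below are purely hypothetical placeholders.

```python
# Sketch: estimating maximum GDPR fine exposure for an impact statement.
# The EUR 20M / 4% figures come from the text above; the turnover and
# exchange rate below are hypothetical -- substitute your own numbers.

def gdpr_max_fine_eur(annual_turnover_eur: float) -> float:
    """Upper bound of a GDPR fine: the greater of EUR 20M or 4% of turnover."""
    return max(20_000_000.0, 0.04 * annual_turnover_eur)

if __name__ == "__main__":
    turnover_eur = 750_000_000.0   # hypothetical annual turnover
    eur_to_usd = 1.10              # hypothetical conversion rate
    exposure = gdpr_max_fine_eur(turnover_eur)
    print(f"Maximum exposure: EUR {exposure:,.0f} "
          f"(~USD {exposure * eur_to_usd:,.0f})")
```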

GDPR compliance still requires work worldwide. A report by Thomson Reuters, released approximately one year to the day after GDPR took effect, states that:


• More companies are failing to meet global data privacy regulations.
• Many companies have found GDPR compliance more difficult than expected.
• Half of companies are at risk of falling further behind.
• An increasing number of companies have now been subject to enforcement actions.
• Companies are becoming less open and pro-active with consumers.
• Board and C-suite concern and engagement on data privacy issues is falling.
• GDPR is now consuming a greater proportion of data privacy budgets (Thomson Reuters, 2019).

Keep regular tabs on this site for the most current information on GDPR.

Accountability. Implications for a Controller using CCTV.

But what is a controller, I hear you ask?! Once again we return to the “purpose and means (essential elements) of processing”. Not to get boring about it, but this is where the magic happens! We have some interesting and challenging situations to consider, and we always need to come back to who the real controller of the camera is. Not just who put the camera up, but why? For what purpose? Who benefits? And who controls how?

We also need to consider the types of data being processed. For cameras, it’s images and sound, probably not a lot more. This data is central to our security, and it is realistic to expect that it will be held for a period of time.

Cameras in communal areas of apartment blocks, cameras on the street, cameras in areas that are semi-public: they all pose challenges that are not easily explained by the GDPR. Public cameras are also on the increase. Police forces are protecting us as a community with strategically placed cameras. It seems that no matter how far we roam, we are never too far from a CCTV camera. The central question for all of us is “who is the controller?”.

So does the right of the controller to use this camera to “prevent” or “solve” crime override your right to data integrity? The European Data Protection Board suggests a particular methodology for private persons to follow. The controller should have tried other methods and determined that the camera is the necessary solution. From there, they need to ensure that they are applying the minimisation principle. Video surveillance to “prevent accidents” is not proportional, and individuals should not be monitored in places where they do not expect to be monitored, e.g. in changing rooms or saunas.

The household or domestic exemption rule in the GDPR is viewed strictly, and is getting stricter following recent guidelines. These days, if we buy a camera for our home, we must be prepared to take responsibility for it. This means (among other things) being really clear about the purpose of the camera, positioning it correctly, and putting up a sign letting people know there is camera surveillance.

The Whisper Tracing protocol and covid-19

A covid-19 smart wearable using privacy-enhancing technologies popped up in my LinkedIn feed just today… I was sceptical until I started reading the academic paper on the Whisper Tracing protocol used by the product.

This is a product that is not installed on a smart device; it is something you clip to your shirt (or wherever) that warns you if you are too close to another individual. It does not link the wearer of the wearable to an identity. The data is collected centrally, but deleted after a short time.
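Out of curiosity, here is a minimal Python sketch of the general idea as I read it: rotating anonymous identifiers, a proximity alert, and short retention of centrally collected sightings. To be clear, this is my own illustration, not Nodle’s Whisper Tracing implementation; the rotation period, alert distance, and retention window are assumptions on my part.

```python
# A sketch of the *general idea* described above: rotating random
# identifiers, a proximity alert, and short central retention. This is an
# illustration only, NOT the actual Whisper Tracing protocol or its code;
# the rotation period, distance threshold and retention window are assumed.
import secrets
import time

ROTATION_SECONDS = 15 * 60      # assumed: new random ID every 15 minutes
ALERT_DISTANCE_M = 2.0          # assumed: beep when closer than ~2 metres
RETENTION_SECONDS = 14 * 86400  # assumed: central log purged after 14 days

class Wearable:
    def __init__(self):
        self._id = secrets.token_hex(8)   # random, not linked to a person
        self._id_born = time.time()

    def current_id(self) -> str:
        # Rotate the broadcast ID so sightings cannot be linked over time.
        if time.time() - self._id_born > ROTATION_SECONDS:
            self._id = secrets.token_hex(8)
            self._id_born = time.time()
        return self._id

    def check_proximity(self, distance_m: float) -> bool:
        """Return True (beep) when another badge is too close."""
        return distance_m < ALERT_DISTANCE_M

def purge_old_sightings(log: list[tuple[float, str]]) -> list[tuple[float, str]]:
    """Keep only (timestamp, anonymous_id) entries younger than the window."""
    cutoff = time.time() - RETENTION_SECONDS
    return [entry for entry in log if entry[0] >= cutoff]
```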

What is good about this product is that with covid-19, the most vulnerable group apart from those with underlying illnesses is the elderly, yet it is mainly this group that does not own a smartphone, or, if they do, probably does not use it optimally, which means they are excluded from app-based solutions.

This is great for the workplace. I know that when I’m out I’m rubbish at this social distancing. I live on an island, at least that’s my excuse, and I work mainly from home; or it’s just that I get so focused on what I need to do that I don’t notice people around me. I really could do with a clip-on that beeps when I get too close to others.

It is a startup, Nodle, which produced this wearable. It will be interesting to read more on this (when I have time) and see where this wearable ends up.

Occupational doctor is the controller when you test your employees for coronavirus

At least this is the latest position in Italy, which is rather interesting and provides some guidance on controlling this pandemic in the workplace while reducing the risk to the rights and freedoms of employees. The relevant paragraph from the article, which is worth reading, is quoted below.

The Italian data protection authority held that serological tests run on employees are privacy compliant in Italy provided that the occupational doctor is the data controller and is the sole individual aware of the results of the test, communicating to the employer only the suitability/unsuitability of the employee to perform the working activity.

Watch out! Ransomware actors have turned to blackmail

Ransomware has evolved into blackmail. We are all familiar with the concept of ransomware, whereby critical operational data, including personal data, is encrypted by hackers and hence made inaccessible to the business. In order to regain access, i.e. to get the data decrypted (the key is held by the hacker), the business needs to pay a fee. The fees are significant; this article gives an insight, e.g. a recent case resulted in a fee of $350,000.

So the business gets back its operational files, and this is where the blackmail kicks in: the hackers then demand a second fee, of between $100,000 and $2,000,000, for the stolen data to be deleted, or they will make it public!

What is surprising, or maybe not, is that the victims are actually paying, especially those in private healthcare, who can’t afford the damage to their reputation should it get out that they have been hacked and sensitive data has been stolen… and they don’t report the breach, as is required by law in the U.S., Europe, and other countries globally.

If you are worried about this trend, and we all should be, then protect your data as it should be protected (GDPR Art. 32 requires this). Get the experts in; they cost much less than a ransomware demand will if the hackers get to you first. And it could be that it is not so difficult to fix: you may be surprised!

Edited: PrivSec has a free ‘fireside chat’ session on ransomware and what to do if it happens to you; you can book here.

Cookie walls are not GDPR compliant

This clarification on the use of consent came out last week and provides no surprises for those working daily with GDPR compliance. What is noteworthy, though, is the mention of “cookie walls”.

What is a cookie wall then?

One of the principal factors that one should keep returning to when thinking about compliance with GDPR is a single word: CHOICE. Those who have attended any training I’ve delivered will hear my voice in their head now… CHOICE IS A HUMAN RIGHT. If there is no choice, then it is not compliant with GDPR, and the use of cookie walls is a good example.

A cookie wall is where it is impossible to use a website without accepting ALL cookies; in fact, the website will not work unless all cookies are downloaded. There must be a choice. Take a look at the UK ICO’s website for an example, and there you can even find the toolkit to make this possible on your own website!
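As a concrete illustration of choice, here is a minimal sketch of consent-gated cookies, assuming a Flask-style view; the route, cookie names, and the “analytics” consent category are hypothetical. The site works without optional cookies, and they are only set after an explicit opt-in, which is the opposite of a cookie wall.

```python
# Minimal sketch of consent-gated cookies, assuming a Flask-style view;
# the cookie names and the "analytics" category are illustrative only.
# Strictly necessary cookies are always allowed; everything else waits
# for a real, recorded choice.
from flask import Flask, make_response, request

app = Flask(__name__)

@app.route("/")
def index():
    resp = make_response("<p>Site works without optional cookies.</p>")
    # Strictly necessary cookies need no consent (e.g. the session itself).
    resp.set_cookie("session_id", "abc123", httponly=True)

    # Optional cookies only after an explicit opt-in recorded earlier --
    # never as a condition for using the site (no cookie wall).
    consent = request.cookies.get("consent", "")  # e.g. "analytics,marketing"
    if "analytics" in consent:
        resp.set_cookie("analytics_id", "xyz789")
    return resp
```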

If you have a policy, make sure it is documented; if you have a procedure, document that too… or else…

Well, it seems that another government authority in Sweden has been fined SEK 120,000 (circa €12k) by the Swedish Data Protection Authority. This time it was the region (county) of Örebro, it was the health authority, and it was sensitive data.

What is important in this case is that although they had procedures, they were not documented; it was word of mouth… oops, and this is not good enough. Where is the evidence?

Clearly, processing sensitive data means that extra care must be taken, but the key point beyond this is that Article 5(2) of the GDPR requires accountability, which means there must be evidence that Article 5(1) is being adhered to.