Smart-home devices: 3rd-party privacy risks

Rewind to 1996, when I landed a job at CERN in Geneva and started a phase of my life which changed me forever. One of the exceptional engineers I met (Ivan) had configured his home into a primitive version of the ‘smart home’, although it wasn’t called that then.

Everything was connected to a dashboard. He knew every time someone entered or left the house, and every time someone visited the bathroom and for how long. He had video connected which he could access from his PC. I think he had also programmed other functional aspects of the house, such as lighting, although I am not sure. What I do remember is how my work colleagues and I, although impressed by his home, were sceptical of the privacy implications. My female colleagues and I could not imagine living in a house in which our partner knew every time a bathroom visit happened and for how long.

How short-sighted we were. I am now, really for the first time, taking a deep dive into smart technologies in the home. I haven’t even started yet and am already challenged to identify the controller-processor or controller-controller accountability. What data is shared with Google or Amazon, and who is accountable? I’ve looked around to see what other smart-home product vendors are writing in their privacy notices, and I have found nothing yet. The page from Google describes how their own products work pertaining to privacy, but there is still nothing on what happens with the 3rd-party cameras now populating the market. This blog post is me brainstorming with myself.

Looking at the components of a smart device, using a thermostat as our example: (1) the thermostat, (2) Google Assistant/Alexa, and (3) the App/code for the smart device in the Google Assistant/Alexa dashboard. So what is shared, and where does it go?

(1) The thermostat will have its own memory chip, enough to store and send data onward. In the old days data would be stored in a temporary cache, but nowadays devices are never switched off, and the temporary cache is normally backed by permanent memory on a hardware chip. The risk is if you sell the device and there is no hardware/factory reset button to wipe the chip. This is not a high privacy risk, as a thermostat does not hold highly sensitive data, unless the temperature is set unusually high, which can be the case when someone is sick or a new baby has arrived in the household. It could be quite an issue if the smart device is a camera, as in the incident with the Google Nest Indoor Cam.

(2) What is shared with Google Assistant/Alexa? The most publicity has surrounded the voice data collected and how it has been used. The most talked-about privacy-invasive issues I’ve come across so far are (i) background noise being collected continuously, and (ii) voice commands, collected for the purpose of triggering some action (e.g. switching lights on), additionally being used by Google/Amazon to improve their voice-recognition services without informing the users, i.e. a lack of transparency in data collection and use practices.

(3) The App itself may collect other data in order to deliver the service, e.g. GPS/location data which is sent to the provider of the App. The question is whether this flows via Google/Alexa? I guess so, as the device manufacturer is not creating their own App; they have created a piece of code to plug into the Google/Alexa hub.

What I see is that when a smart device is connected to Google/Alexa, the user is sharing their voice data with Google/Alexa. This is not biometric data; it is voice data, and it is not shared with the provider of the device. It is Google/Alexa which translates the voice data into a digital format which the device can understand and act upon.

This means that (1) Google/Alexa needs to authenticate to the device App/code, which should grant authorisation only to send the instruction and nothing more, and (2) Google/Alexa shares instructions (derived from the voice data) with the smart device. (3) If (1) is done correctly, no data is sent from a 3rd-party smart device to Google/Alexa.

I’m not sure if I’m missing anything here, but IMHO the risk to the provider of the smart device is to ensure the code created to pop into the Google/Alexa hub/dashboard does nothing more than authenticate (2-way) and receive a one-way instruction (from the hub to the device) on what the device must do.

Although I guess I’m forgetting here about the contextual data which is sent back to the Google/Alexa hub in order to make decisions.
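To make the idea concrete, here is a minimal sketch of what the device-side code could look like if it really did nothing more than authenticate the hub and act on a one-way instruction. Everything here is an assumption for illustration: the shared secret, the instruction names, and the `handle_hub_message` function are all hypothetical, not any real Google/Alexa API.

```python
import hmac
import hashlib

# Hypothetical shared secret, provisioned when the device is linked to the hub.
SHARED_SECRET = b"device-provisioning-secret"

# The device-side code accepts only a fixed set of instructions and nothing else.
ALLOWED_INSTRUCTIONS = {"set_temperature", "switch_on", "switch_off"}

def handle_hub_message(instruction: str, argument: str, signature: str) -> bool:
    """Verify the hub's signature, then act on a whitelisted instruction.

    Returns True if the instruction was accepted. Crucially, no device data
    flows back to the hub: the only response is accept/reject.
    """
    expected = hmac.new(
        SHARED_SECRET,
        f"{instruction}:{argument}".encode(),
        hashlib.sha256,
    ).hexdigest()
    # The hub must authenticate itself first (constant-time comparison).
    if not hmac.compare_digest(expected, signature):
        return False
    # Reject anything outside the narrow authorised instruction set.
    if instruction not in ALLOWED_INSTRUCTIONS:
        return False
    # Act locally on the instruction (e.g. adjust the thermostat set-point).
    return True
```

The design choice this illustrates is the one in the post: the hub pushes instructions one way, the device code authorises nothing beyond that fixed instruction set, and no payload travels back. Any contextual data the hub needs (e.g. current temperature) would be a separate, deliberate data flow that the vendor must account for.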

So you want to be forgotten?

The RTBF (Right to be Forgotten) is a hot topic following the Spanish ruling against Google. The upshot is that Google in Europe must first evaluate requests and, if considered reasonable, remove search results that threaten the requester’s right to personal privacy. It is claimed to be a blow to freedom of speech. Google has already received 70,000 requests and receives on average 1,000 requests each day! In the UK, claims are being made that it is in conflict with s.32 of the Data Protection Act 1998.

There is a good write-up on discussions following the ruling at: Debate Write-Up: Rewriting History.

Christopher Graham, the Information Commissioner, gives a good explanation of what it really means, but unfortunately it is lost in the panicked cries of other participants in the debate.

It is very straightforward: there is claimed to be a conflict between freedom of speech and personal privacy, i.e. in this case the RTBF. However, there is not; it is as Graham states:

1) There are two types of parties here: a) the data controller, and b) the journalist;
2) The ruling applies the RTBF to the data controller, not to journalists, so in the UK, for example, this does not impact s.32 of the Data Protection Act;
3) The fact that search results are not returned by the data controller’s search engine does not mean that the data does not exist. It is just not searchable;
4) The information pertaining to an individual is still on the newspapers’ websites, and should be searchable directly on those websites.

So this cannot be likened to ‘burning of books’ or ‘re-writing history’ as in George Orwell’s 1984. It basically means that if, for example, an individual defrauded the Inland Revenue 10 years ago:

    – If you search for this person by name, it will not return this name in the result.
    – However if you search for ‘Inland Revenue fraud’ it could return this person’s name in one of the related articles.

What I see is that the main challenge is a technical one. At the moment the onus is on the data controllers to receive requests and to decide whether the requester has a valid case for removal from their search engines. However, I believe that this should be done by default by the newspapers’ websites. This could be difficult because, on a technical level, it is only possible (as far as I am aware today) to exclude whole webpages from Google, not names or specific words.
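For context on that last point: the standard exclusion mechanisms that search engines honour do operate at page level, not at the level of a name or word within a page. A publisher wanting a particular article de-indexed would typically add something like the following to that article’s page (the URL path shown is, of course, a made-up example):

```html
<!-- Page-level exclusion: tells compliant crawlers not to index this page. -->
<meta name="robots" content="noindex">
```

or list the page in the site’s `robots.txt` (`Disallow: /archive/old-fraud-story.html`). Either way the whole page disappears from the index; there is no standard mechanism to keep the page indexed while suppressing only one individual’s name, which is exactly the limitation described above.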

The rights of Swedish residents should override the rights of the data controller

I took this from the Panopticon Blog concerning the outcome of the Google ruling. Now, if the rights of the Swedish citizen were to be escalated to the EU courts, would the outcome be the same?

“The first question for the CJEU was whether Google was a data controller for the purposes of Directive 95/46. Going against the opinion of the Advocate General (see earlier post), the Court held that the collation, retrieval, storage, organisation and disclosure of data undertaken by a search engine when a search is performed amounted to “processing” within the meaning of the Directive; and that as Google determined the purpose and means of that processing, it was indeed the controller. This is so regardless of the fact that such data is already published on the internet and is not altered by Google in any way.

The Court went on to find that the activity of search engines makes it easy for any internet user to obtain a structured overview of the information available about an individual thereby enabling them to establish a detailed profile of that person involving a vast number of aspects of his private life. This entails a significant interference with rights to privacy and to data protection, which could not be justified by the economic interests of the search engine operator. In a further remark that will send shockwaves through many commercial operators providing search services, it was said that as a “general rule” the data subject’s rights in this regard will override “not only the economic interest of the operator of the search engine but also the interest of the general public in finding that information upon a search relating to the data subject’s name” (at paras 81 and 97).”

The Right to be Forgotten is respected by the EU Courts

I love this: the EU Court has confirmed that we have the right to be forgotten. Google and other internet search engines face a new world where they must remove links to websites containing certain types of personal data when individuals ask them to do so. The European Union says you have “a right to be forgotten” digitally online. This is great news for every citizen of the EU, including our children!

Read more in English and Swedish.

Hacked! BYOI and Risks

Bring Your Own Identity (BYOI) is on the bandwagon with BYOD (Bring Your Own Device). A Wired journalist relays a sobering story of how his digital identity got stolen through trusting third-party identity providers.

Basically, hackers were able to wipe his Mac and iPhone and hack into his Twitter account to send tweets that were damaging to Mat Honan’s reputation. He was using iCloud; the hackers got into his iCloud account because Apple used the last 4 digits of his credit card as a form of authentication. They got those last 4 digits, which incidentally Amazon holds in clear text, through a bit of social engineering. Now I bet you want to read the whole story? Read more here.