Privacy guidelines for IoT – what you need to know [Infographic]

9th November 2016

The government has now confirmed that the UK will be implementing the EU General Data Protection Regulation (GDPR). To this end, Nominet has created an infographic based on the GDPR and the Information Commissioner's Office's (ICO's) Code of Practice on privacy notices, transparency and control [7] to serve as a guideline when developing IoT solutions. If you want to learn more, we will be at the Smart City Expo World Congress in Barcelona from 15-17 November, and will be happy to discuss it in more detail.

A frequently used argument in debates about personal data protection is that consumers simply do not care about their privacy. But just because consumers don't always understand the risks and consequences doesn't mean they should not be protected. The EU General Data Protection Regulation (GDPR) [1] makes this very clear by setting out fines of 2%-4% of total worldwide annual turnover or 10-20 million EUR, whichever is higher, depending on the type of infringement. We deliberately made the infographic broad in scope: we do not expect all techniques presented to be useful for every IoT system. What might hold true for the retail sector won't necessarily be true for home automation or public transport systems. But these guidelines are all good ways to achieve privacy-awareness in IoT. Finally, even though we have put IoT at the centre of our discussion, the recommendations apply more broadly.

Personal data

Information collected in a system becomes personal if identity can be correlated with activity [2]. Such identification can be direct or indirect. The identifier can be a name, an identification number, location data or an online identifier (such as an IP address). It may also be specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person [1]. This is why data protection law does not apply to anonymous data (i.e., data in which the data subjects are no longer identifiable). However, if the risk of identification is reasonably likely, the information should be regarded as personal data [3]; experience shows that the risks may be quite high [4].

The first rule of personal data protection is that personal data must be processed fairly and lawfully. The processing is lawful if it has taken place because

(a) the individual has unambiguously given consent,

or processing is necessary

(b) for the performance of a contract,

(c) for compliance with a legal obligation,

(d) to protect vital interests of the individual,

(e) for the performance of a task carried out in the public interest, or

(f) for the purposes of legitimate interests if such interests are not overridden by the fundamental rights and freedoms [1].

Steps to personal data protection

Here are some steps IoT businesses can take to ensure personal data protection and consequently the privacy of their end-users.

GDPR Infographic

View the infographic in a larger size.

1. Minimise

The collection of personal data should be limited to what is necessary. By ensuring that no unnecessary data is collected, the possible privacy impact of a system is limited.

To decide the personal data risks of an IoT system, a risk analysis (i.e., Privacy Impact Assessment) needs to be carried out. Minimisation builds on this assessment, and should be embedded in the system development life cycle.

The Privacy and Data Protection Impact Assessment Framework for RFID (Radio Frequency Identification) is currently the best (and only) such resource for IoT systems [5].
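Minimisation can be as simple as filtering every reading against an explicit allow-list before it is stored. The sketch below illustrates this in Python; the field names and schema are hypothetical, not part of any standard.

```python
# Data-minimisation sketch: only fields on an explicit allow-list
# survive; everything else is dropped before storage.
ALLOWED_FIELDS = {"device_id", "timestamp", "temperature"}  # hypothetical schema

def minimise(reading: dict) -> dict:
    """Strip a sensor reading down to the fields the system actually needs."""
    return {k: v for k, v in reading.items() if k in ALLOWED_FIELDS}

raw = {
    "device_id": "sensor-42",
    "timestamp": "2016-11-09T10:00:00Z",
    "temperature": 21.5,
    "owner_name": "Alice",   # unnecessary personal data
    "gps": (51.75, -1.26),   # unnecessary location data
}
stored = minimise(raw)
```

The allow-list itself should come out of the Privacy Impact Assessment: each field on it needs a documented purpose.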

2. Pseudonymise

Pseudonymisation is a privacy-enhancing technique where directly identifying data is held separately and securely from processed data.

The incentive for pseudonymisation is that it may reduce complexity in other parts of the system. If the individual can no longer be identified (e.g., the directly identifying data is deleted, or re-identification is not possible), then it is neither possible nor necessary to implement solutions for data access, rectification, erasure or data portability (Article 11 in [1]).
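One minimal way to realise this is to move the direct identifiers into a separate, secured store keyed by a random pseudonym, so the processing pipeline only ever sees the pseudonym. The sketch below assumes an in-memory store and hypothetical field names; a real system would secure the identity store and the pseudonym mapping.

```python
import secrets

# Pseudonymisation sketch: directly identifying data is held
# separately from the processed record, linked only by a random token.
identity_store = {}  # pseudonym -> identifying data; keep separate and secured

def pseudonymise(record: dict) -> dict:
    pseudonym = secrets.token_hex(16)
    identity_store[pseudonym] = {"name": record.pop("name"),
                                 "address": record.pop("address")}
    record["subject"] = pseudonym
    return record

event = pseudonymise({"name": "Alice", "address": "1 High St",
                      "energy_kwh": 3.2})
# `event` now carries only the pseudonym and the measurement.
```

Deleting an entry from the identity store then effectively anonymises all records carrying that pseudonym, provided re-identification from the remaining attributes is not possible.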

3. Be transparent

People have a right to know what personal data concerning them is collected and processed. Transparency requires that this information is easily accessible, simple to understand, and written in clear and plain language. The users of an IoT system need to understand:

  • What information is being collected
  • Who is collecting it
  • Why it is being collected
  • How it is collected
  • How it will be used
  • How it will be shared
  • What the effects of collection and sharing will be
  • What the risks of collection and sharing are

In contrast to current practice, this information needs to be presented without overwhelming the user with legalese. The ICO code of practice suggests privacy notices that follow a layered approach, with short explanations expanding to longer versions on request. The use of icons or symbols is also encouraged.

Many IoT devices lack proper interfaces, which makes presenting notice a challenge. Options here include notice at the point of sale, during set-up, and QR codes on the device.

4. Authorise access

Personal data should be processed in a manner that ensures its security and confidentiality, including preventing unauthorised access or use.
There are several standardisation initiatives for authorisation, including Kantara UMA (User-Managed Access) and IETF ACE (Authentication and Authorization for Constrained Environments). The common approach both standards follow is to build on the web standard for authorisation, OAuth 2.0. At Nominet, we are following both standards in our IoT solutions.
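At its core, the OAuth 2.0 model means a resource server only honours requests whose access token carries the required scope. The sketch below shows that check in its simplest form; the token values and scope names are illustrative, and a real deployment would verify signed tokens issued by an authorisation server rather than consult a local table.

```python
# OAuth2-style scoped authorisation sketch: a request is honoured only
# if its bearer token was issued with the scope the resource requires.
issued_tokens = {
    "tok-abc": {"subject": "alice", "scopes": {"read:temperature"}},
}

def authorise(token: str, required_scope: str) -> bool:
    grant = issued_tokens.get(token)
    return grant is not None and required_scope in grant["scopes"]

ok = authorise("tok-abc", "read:temperature")        # granted scope
denied = authorise("tok-abc", "read:location")       # scope not granted
unknown = authorise("tok-xyz", "read:temperature")   # unknown token
```

UMA and ACE layer onto this the machinery for resource owners to set the policies that decide which scopes get issued in the first place.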

5. Get consent

Consent is one of the ways to lawfully process data. Consent must be freely given, specific and fully informed, and should be supported with good transparency practices. It should also be as easy to withdraw consent as to give it. Additional measures, e.g., parental consent, need to be considered when personal data relates to children.

Even though IoT devices may present an inconvenience (e.g., due to limited input and output interfaces), developers should nonetheless think of methods for allowing users to opt in to personal data collection. This opt-in should be as fine-grained as possible, i.e., people should be able to agree to the use of their information for one purpose but disagree for another. For instance, in the case of smart meters, half-hourly data collection will require the user to opt in, consent for daily collection will be automatic (with an option to revoke it), and checking monthly consumption will require no consent at all [6].

Providing these opt-in options on devices that interface with the IoT device, for example via smartphone or web-based privacy management dashboards, is a logical choice and considered good practice [7]. Other options, as with transparency notices, include choices at the point of sale, during set-up and QR codes on the device.
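A per-purpose consent registry along the lines of the smart-meter example above could look like the sketch below. The purpose names, defaults and class layout are all illustrative: half-hourly collection defaults to off until the user opts in, daily collection defaults to on but is revocable, and monthly totals need no consent at all.

```python
# Per-purpose consent sketch, loosely following the smart-meter example.
DEFAULTS = {"half_hourly": False, "daily": True, "monthly": True}
REVOCABLE = {"half_hourly", "daily"}  # monthly totals need no consent

class ConsentRegistry:
    def __init__(self):
        self.choices = {}  # (user, purpose) -> bool

    def set(self, user: str, purpose: str, granted: bool) -> None:
        if purpose not in REVOCABLE:
            raise ValueError(f"consent for {purpose!r} is not configurable")
        self.choices[(user, purpose)] = granted

    def allowed(self, user: str, purpose: str) -> bool:
        return self.choices.get((user, purpose), DEFAULTS[purpose])

reg = ConsentRegistry()
reg.set("alice", "half_hourly", True)  # explicit opt-in
reg.set("bob", "daily", False)         # withdrawing is as easy as granting
```

Because withdrawal is the same call as granting, the "as easy to withdraw as to give" requirement falls out of the design rather than being bolted on.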

6. Monitor

In IoT, personal data becomes a moving target: the number of attributes is increasing, and almost any information can become personally identifiable. It has been shown that battery readouts, ambient light and proximity sensor values can be used for tracking and profiling [8] [9].

To identify and ensure proper use of personal data, collected data should be periodically reviewed, and time limits should be established for its erasure. Every reasonable step should be taken to ensure that inaccurate personal data is corrected or deleted.
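Enforcing such a time limit can be a simple periodic job that purges anything past the configured retention period. In the sketch below the 30-day limit is an illustrative value, not a regulatory requirement, and the record layout is hypothetical.

```python
from datetime import datetime, timedelta, timezone

# Retention-limit sketch: each periodic review discards records
# older than the configured retention period.
RETENTION = timedelta(days=30)  # illustrative value

def purge_expired(records, now=None):
    """Keep only records still within the retention period."""
    now = now or datetime.now(timezone.utc)
    return [r for r in records if now - r["collected_at"] <= RETENTION]

now = datetime(2016, 11, 9, tzinfo=timezone.utc)
records = [
    {"id": 1, "collected_at": now - timedelta(days=5)},
    {"id": 2, "collected_at": now - timedelta(days=90)},  # overdue for erasure
]
kept = purge_expired(records, now=now)
```

The same review pass is a natural place to flag newly collected attributes for a fresh privacy assessment.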

7. Give customers control

It is necessary to give users of IoT systems appropriate control and choice. The GDPR introduces several rights for individuals, for example the "right to be forgotten", the "right to data portability" and the "right to restriction of processing". The "right to be forgotten" requires that users are able to enforce the erasure of personal data that is no longer necessary. The "right to data portability" requires that the individual can transfer their data to another controller. For smart homes, the "right to restriction of processing" may mean allowing end-users to temporarily or permanently deactivate sensors whose privacy-relevant data might be processed, e.g., to accommodate guests in a smart home. It is essential to provide end-users with appropriate interfaces to configure the system and exercise their rights under the regulation.
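Two of these rights map directly onto operations on whatever store holds the personal data: erasure on request, and export in a common machine-readable format. The sketch below assumes a simple in-memory store keyed by user; the layout and field names are illustrative.

```python
import json

# Sketch of erasure ("right to be forgotten") and machine-readable
# export ("right to data portability") on a simple record store.
store = {
    "alice": [{"sensor": "thermostat", "value": 21.5}],
    "bob":   [{"sensor": "doorbell", "value": 1}],
}

def export_portable(user: str) -> str:
    """Return the user's data in a common, machine-readable format."""
    return json.dumps(store.get(user, []))

def erase(user: str) -> None:
    """Remove all personal data held for the user."""
    store.pop(user, None)

dump = export_portable("alice")  # hand to the user or the next controller
erase("alice")
```

In a pseudonymised system, erasure would additionally mean removing the entry from the identity store so remaining records can no longer be linked back to the individual.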


While the incentives to invade privacy may be linked to social problems, the actual ability to do so is a technical problem [10]. Thus, having the right technologies in place is a necessity, and even more so for IoT, where systems are highly interconnected and data flows in multiple directions. We hope the infographic serves as a useful starting point.


[1] European Parliament and Council, “General Data Protection Regulation,” Official Journal of the European Union, Brussels, 2016.
[2] M. Dennedy, J. Fox and T. Finneran, The Privacy Engineer’s Manifesto: Getting from Policy to Code to QA to Value, Apress, 2014, p. 400.
[3] ICO, “Anonymisation: managing data protection risk code of practice,” 2012. [Online]. Available:
[4] P. Ohm, “Broken Promises of Privacy: Responding to the Surprising Failure of Anonymization,” UCLA Law Review, vol. 57, p. 1701, 2010.
[5] European Commission, “Privacy and Data Protection Impact Assessment Framework for RFID Applications,”
[6] E. UK. [Online]. Available:
[7] ICO UK, “Privacy notices, transparency and control,”
[8] L. Olejnik, “Battery status readout as a privacy risk,” August 2016. [Online]. Available:
[9] L. Olejnik, “Sensors Privacy,” August 2016. [Online]. Available:
[10] G. Danezis, J. Domingo-Ferrer, M. Hansen, J.-H. Hoepman, D. Le Métayer, R. Tirtea and S. Schiffner, “Privacy and Data Protection by Design – from Policy to Engineering,” CoRR, 2015.
[11] Internet Society, “The Internet of Things: An Overview, Understanding the Issues and Challenges of a More Connected World,” 2015.