
State of City Data: Data and Trust
Creating legal tools to improve trust in IoT devices

IoT trust issues?

IoT devices have been defined as “internet connected ‘smart’ devices that people use in their homes such as smart appliances, personal assistants, children’s toys, web cameras and baby monitors”[1], and the availability and popularity of devices like Google Home and Amazon’s Alexa are increasing rapidly. However, IoT devices, “those which can connect with the internet (and/or other IoT devices)”[2], are increasingly appearing in public spaces too.

Part of the appeal of IoT devices is that they ‘get to know us’. They collect and process personal data about us that can benefit both us and the company that created the device by increasing device functionality, from setting reminders to making product recommendations. The devices can learn everything from our musical tastes to our shopping habits, which often positions us in a specific demographic for future targeted marketing campaigns and advertising.

Whilst this proliferation of devices might make it easy to assume an implicit trust that these devices (and the companies behind them) protect our information, studies suggest otherwise.

On average, 56% of respondents are either “concerned” or “extremely concerned” about how companies use their personal data[3]. According to the Royal Academy of Engineering, there is a concern that “IoT devices can violate the norms of private space and can cause a feeling of ‘being watched’”[4].


IoT Devices in Public Spaces

Where IoT devices are increasingly deployed in public spaces, the need for public trust is even greater, as passersby often have no choice but to interact with them. The Sidewalk Labs project in Toronto to develop a “smart city” from the ground up prompted much debate because of concerns about how the company would use, store and potentially profit from the data collected. In response, Sidewalk Labs proposed that no one should own urban data and that all data collected in the public realm should instead be held by an independent data trust[5]. Whilst this may be just one approach to dispelling concerns about transparency and democratic process[6], what is becoming clear is that appropriate attention must be given to addressing the lack of trust in data-collecting devices.


How can legal tools increase trust?

  1. Data Protection Law

In 2018 the General Data Protection Regulation (“GDPR”) and the Data Protection Act 2018 introduced controls to protect individuals’ “personal data”, that is, information which can identify an individual. The GDPR set out how personal data may be used, gave us more control over how we want our personal data to be used, and made transparency in personal data processing a legal requirement.

But whilst GDPR is a great legal tool for increasing trust in data processing, its reach is limited.

  • GDPR only covers personal data – it does not address how non-personal data is gathered, monetised and managed (for example, sensitive commercial data such as footfall in a supermarket). Non-personal data can include things like energy consumption, air pollution and traffic flow in an area.
  • GDPR can result in ‘consent fatigue’[7] – the flip side of increased transparency is that many IoT companies now put the onus on us to consent to the use of our data if we want to use their devices. This ‘consent fatigue’ can mean that, if we don’t want to review yet another extensive set of privacy terms, we simply click ‘I agree’ and we’re back to square one.
  2. The Rise of Ethical Data Sharing Agreements

Demand has recently increased for ‘ethical’ data sharing agreements: contracts covering how (personal and non-personal) data is stored, shared and used between two or more organisations. Methods of providing a more ‘ethical’ approach include enforcing good oversight provisions, adopting a ‘checks and balances’ approach to data management and going above and beyond what is legally required.

Ethical data sharing agreements can also provide commercially fair contractual provisions for all involved. The European Commission is currently exploring a standardised framework of Fair, Reasonable and Non-Discriminatory (FRAND) terms[8]. This could particularly benefit SMEs without the finances to negotiate data sharing agreements from scratch. FRAND terms could make fair oversight terms more commonplace and widen access to the market. In turn, this could be a catalyst for the development of trusted-by-design IoT devices.

  3. Standards and Certification Development

Several standards and certification models currently in development aim to produce principles to guide manufacturers and developers on what constitutes best practice.

Certification plays a similar role but can be applied to devices retrospectively, rather than during the product’s development stage. A certification scheme can confirm whether a device conforms to a given standard.

  4. Data Trusts

Data trusts, the newest addition to the legal toolkit for increasing IoT users’ trust, provide a legal framework allowing data to be pooled together and looked after by a third party. They provide a set of rules under which access to the data is limited to those who also comply with agreed conditions of access and safekeeping. This gives people and organisations confidence when enabling access to data[9].

Data trusts could enable cities to encourage public confidence and consensual participation with IoT devices in a connected place. Data trusts ensure governance structures are in place and address both the terms of data sharing and the monitoring of access. This can be done in an acceptable (and ethical) manner by providing third-party oversight of, and creating a degree of separation between, the developer, the data collector and the profiting organisation. Could this framework have protected Sidewalk Labs?

The proliferation of IoT devices in our homes and on our streets can sometimes make us feel powerless over what, and who, can gather data about us. A lack of transparency can eat away at our trust. The current legal mechanisms for increasing trust in IoT devices, and in the corporations behind them with whom our data is shared, are either not compulsory or still in their infancy. Whilst GDPR provides a measure of comfort where personal data is concerned, it is not without issues, such as ‘consent fatigue’.

And as technological development continues to accelerate, the need for the regulatory framework to catch up becomes ever more pressing.

———————-

[1] p.83, HM Government, Online Harms White Paper, April 2019.

[2] https://www.forbes.com/sites/jacobmorgan/2014/05/13/simple-explanation-internet-things-that-anyone-can-understand/#fe4cc671d091 (accessed 17/05/2019)

[3] KPMG. Crossing the line: Staying on the right side of consumer privacy. KPMG, November 2016. https://assets.kpmg.com/content/dam/kpmg/xx/pdf/2016/11/crossing

[4] J Blackstock et al (2018), “Internet of Things: realising the potential of a trusted smart world”, p.39, www.raeng.org.uk/publications/reports/internet-of-things-realising-the-potential-of-a-tr

[5] https://medium.com/sidewalk-talk/an-update-on-data-governance-for-sidewalk-toronto-d810245f10f7 

[6] https://www.citylab.com/equity/2019/02/block-sidewalk-labs-quayside-toronto-smart-city-resistance/583477/ (accessed 20/05/2019)

[7] https://www.wired.com/story/how-a-new-era-of-privacy-took-over-your-email-inbox/ (accessed 22/05/2019)

[8] https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:52017DC0009&from=EN (accessed 22/05/2019)

[9] https://theodi.org/article/what-is-a-data-trust/ (accessed 22/05/2019). Data Trust reports can be accessed at: https://theodi.org/project/data-trusts/#1554903732788-679e5312-2203 (accessed 22/05/2019)
