IoT trust issues?
Defined as “internet connected ‘smart’ devices that people use in their homes such as smart appliances, personal assistants, children’s toys, web cameras and baby monitors”, IoT devices like Google Home and Amazon’s Alexa are growing rapidly in availability and popularity. However, IoT devices, “those which can connect with the internet (and/or other IoT devices)”, are also appearing more and more in public spaces.
Part of the appeal of IoT devices is that they ‘get to know us’. They collect and process personal data that can benefit both us and the company behind the device, improving functionality in everything from setting reminders to making product recommendations. The devices can learn everything from our musical tastes to our shopping habits, which often places us in a specific demographic for future targeted marketing campaigns and advertising.
Whilst this proliferation of devices might make it easy to assume an implicit trust that these devices (and the companies behind them) protect our information, studies suggest otherwise.
On average, 56% of respondents are either “concerned” or “extremely concerned” about how companies use their personal data. According to the Royal Academy of Engineering, there is a concern that “IoT devices can violate the norms of private space and can cause a feeling of ‘being watched’”.
IoT Devices in Public Spaces
Where IoT devices are increasingly deployed in public spaces, the need for public trust is even greater, as passersby often have no choice but to interact with them. The Sidewalk Labs project in Toronto to develop a “smart city” from the ground up prompted much debate because of concerns about how the company would use, store and potentially profit from the data collected. In response, Sidewalk Labs has proposed that no one should own urban data and that all data collected in the public realm should instead be held by an independent data trust. Whilst this may be just one approach to dispelling concerns about transparency and democratic process, what is becoming clear is that appropriate attention must be given to addressing the lack of trust in data-collecting objects.
How can legal tools increase trust?
- Data Protection Law
In 2018 the General Data Protection Regulation (“GDPR”) and the Data Protection Act 2018 introduced controls to protect individuals’ “personal data”: information which can identify an individual. The GDPR set out how personal data may be used, gave us more control over how our personal data is used, and made transparency in personal data processing a legal requirement.
But whilst GDPR is a great legal tool for increasing trust in data processing, its reach is limited.
- GDPR only covers personal data – it won’t help issues with how non-personal data is gathered, monetised and managed (for example, sensitive commercial data such as footfall in a supermarket). Non-personal data can include things like energy consumption, air pollution and traffic flow in an area.
- GDPR can result in ‘consent fatigue’ – the flipside of increased transparency is that many IoT companies now put the onus on us to consent to the use of our data if we want to use their devices. This ‘consent fatigue’ can mean that, if we don’t want to review yet another extensive set of privacy terms, we just click ‘I agree’ and we’re back to square one.
- The Rise of Ethical Data Sharing Agreements
Demand has recently increased for data sharing agreements (contracts covering how personal and non-personal data is stored, shared and used between two or more organisations) that are ‘ethical’. Methods of providing a more “ethical” approach include enforcing good oversight provisions, adopting a ‘checks and balances’ approach to data management, and going above and beyond what is legally required.
Ethical data sharing agreements can also provide commercially fair contractual provisions for all involved. The EU Commission is currently exploring a standardised framework of Fair, Reasonable and Non-Discriminatory (FRAND) terms. This could particularly benefit SMEs without the finances to negotiate data sharing agreements from scratch. FRAND terms could make fair oversight terms more commonplace and allow greater access to the market. In turn, this could be a catalyst for the development of trusted-by-design IoT devices.
- Standards and Certification Development
Several standards and certification models currently in development aim to set out principles that guide manufacturers and developers on what constitutes best practice.
Certification plays a similar role but can be applied to a device retrospectively, rather than during its development: a certificate confirms whether a device conforms to a given standard.
- Data Trusts
Data trusts, the newest addition to the legal tool kit for increasing IoT users’ trust, provide a legal framework allowing data to be pooled together and looked after by a third party. They provide a set of rules whereby data access can be limited to only those that also conform to rules of access and safekeeping. This gives people and organisations confidence when enabling access to data.
Data trusts could enable cities to encourage public confidence and consensual participation with IoT devices in a connected place. Data trusts ensure governance structures are in place and address both the terms of data sharing and the monitoring of access. This can be done in an acceptable (and ethical) manner by providing third-party oversight and creating a degree of separation between the developer, the data collector and the profiting organisation. Could this framework have protected the Sidewalk Labs project?
The proliferation of IoT devices in our homes and on our streets can sometimes make us feel powerless over what, and who, can gather data about us. A lack of transparency can eat away at our trust. The current legal mechanisms for increasing trust in IoT devices that share our data with the corporations behind them are either not compulsory or still in their infancy. Whilst GDPR provides a measure of comfort where personal data is concerned, it is not without issues, such as ‘consent fatigue’.
And as technological development continues to accelerate, the need for the regulatory framework to catch up becomes ever more pressing.
p.83, HM Government, Online Harms White Paper, April 2019.
KPMG, “Crossing the line: Staying on the right side of consumer privacy”, November 2016, https://assets.kpmg.com/content/dam/kpmg/xx/pdf/2016/11/crossing
p.39, J Blackstock et al, “Internet of Things: realising the potential of a trusted smart world”, 2018, www.raeng.org.uk/publications/reports/internet-of-things-realising-the-potential-of-a-tr
https://www.wired.com/story/how-a-new-era-of-privacy-took-over-your-email-inbox/ (accessed 22/05/2019)
https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:52017DC0009&from=EN (accessed 22/05/2019)
https://theodi.org/article/what-is-a-data-trust/ (accessed 22/05/2019). Data trust reports can be accessed at: https://theodi.org/project/data-trusts/#1554903732788-679e5312-2203 (accessed 22/05/2019)