Guido Noto La Diega, University of Stirling
From smart toasters to fitness collars for dogs, we live in a world where the objects around us are gradually being connected to the internet and fitted with sensors so that we can interact with them online.
Many people worry about the privacy risks of using these devices because they may allow hackers to listen to our conversations at home. But the contracts for using them are so long we don’t understand which other rights we might be signing away.
During research for my book, I found that using Alexa’s voice commands triggers 246 contracts that we have to accept in order to use the service. These contracts transfer our rights and data to countless, often unidentified, parties. For example, they frequently refer to “affiliates”.
Despite months of research, I wasn’t able to clarify who these affiliates are, or even whether they are subsidiaries or advertisers. Of the 246 contracts, I focused on those most likely to be relevant to users of the Echo smart speaker. I found that, on average, they are as long as Harry Potter and the Prisoner of Azkaban (317 pages). Not exactly a light read.
Data analysis company Statista found that it would take an hour and a half to read Apple’s terms and conditions for creating an Apple ID. And that’s assuming you don’t need to pause to check the text’s meaning.
Using the Literatin plugin, a Google Chrome extension that assesses the readability of text, I found these contracts are as readable as Machiavelli’s 16th-century political treatise, The Prince.
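How are such estimates produced? As a rough illustration only (this is my own sketch, not Statista’s methodology or the Literatin plugin’s actual scoring, and the assumed reading speed of 240 words per minute is a placeholder), the snippet below estimates reading time from a word count and computes a Flesch reading ease score, one widely used readability formula under which dense legal prose tends to score very low.

```python
# Illustrative sketch: rough reading-time estimate plus a Flesch reading
# ease score. Not the actual method used by Statista or Literatin; the
# reading speed and syllable heuristic are simplifying assumptions.
import re

def count_syllables(word: str) -> int:
    # Crude heuristic: count groups of consecutive vowels.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def readability_report(text: str, words_per_minute: int = 240) -> dict:
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    # Flesch reading ease: higher means easier (plain English is roughly 60-70).
    flesch = (206.835
              - 1.015 * (len(words) / max(1, len(sentences)))
              - 84.6 * (syllables / max(1, len(words))))
    return {
        "words": len(words),
        "estimated_minutes": round(len(words) / words_per_minute, 1),
        "flesch_reading_ease": round(flesch, 1),
    }

if __name__ == "__main__":
    sample = ("The parties hereby acknowledge that purchased digital content "
              "may become unavailable and that the provider shall not be "
              "liable to the user for any resulting loss.")
    print(readability_report(sample))
```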
Does this matter?
Until recently, we might have been forgiven for thinking that the terms and conditions (T&Cs) we accept when browsing the internet were just a box-ticking exercise and nothing to worry about.
But between January and July 2023, Europe’s lead data protection regulator and its top court – the European Data Protection Board and the EU Court of Justice – shed light on the practice by Meta (formerly known as Facebook) of relying on these contracts to target us with ads. And, in an unprecedented move, they banned this practice.
T&Cs are not just about our privacy – and our privacy is not just about our data. By surrounding ourselves with devices with sensors (also known as the “Internet of Things)”, we’ve effectively invited digital landlords into our homes.
One example I refer to in my book can be found in an Amazon contract that legally binds anyone watching videos on their Echo devices: “Purchased digital content … may become unavailable … and Amazon will not be liable to you”.
In other words, if you think you own your digital content just because you paid for it, think again: can we really call it property if it can be taken away arbitrarily?
Companies do act on these types of hidden clauses. In 2009, Amazon (rather fittingly) took back the ebooks of George Orwell’s Animal Farm and 1984 from its Kindle users due to alleged copyright issues.
Another example is how tractor manufacturer John Deere relied on its end-user licence agreement (Eula) to stop farmers from repairing their smart tractors. John Deere’s Eula forbade customers from even looking at the software used to run them.
In 2012, betting giant Spreadex took a customer, Colin Cochrane, to court to force him to pay almost £50,000 of gambling losses racked up by his girlfriend’s son, who had been “playing” with his computer without permission while Cochrane was away from the house.
Spreadex pointed the UK account owner to a clause in its customer agreement that treated any use of his account password as confirmation that he was the one behind the screen using the device.
Fortunately for Cochrane, the judge held that the clause was not enforceable because it would have been “quite irrational” for Spreadex to assume the customer had read the agreement and understood its implications.
Regulation won’t work
Examples of law reform include the online safety bill in the UK and the Data Act in the EU. Both are still in progress, so we don’t yet know when they will be adopted.
Law reform is a painfully slow process. Big tech and other large stakeholders have a huge influence because they have the money and resources to fight laws they don’t like.
Sometimes bills end up so diluted they are of little use. This was the case with the General Data Protection Regulation (GDPR), which came into effect at the end of a nine-year process. It was born out of date: several studies have underlined the GDPR’s inadequacy in dealing with new technologies such as ChatGPT.
What does work
The solution is to collectively organise. Let’s circle back to John Deere and the way the company tried to deprive tractor owners of their right to fix their machines. There is much to learn from those farmers who joined together with hackers to resist “smart power abuses”.
After opposing farmers’ right-to-repair campaign for years, John Deere gave in at the beginning of 2023 and authorised farmers and ranchers to fix their own tractors. But only after attendees at a hackers’ convention figured out how to “jailbreak” the code that was locking farmers and engineers out.
All around the world, groups of computer scientists, digital rights activists and citizens are creating cooperatives and citizen-led movements. They are motivated by partly different yet overlapping goals, such as making the IoT more open and diverse.
Big tech workers are acting collectively to prevent unethical uses of their employers’ technology. In 2020, for example, Google employees fought to stop the company from providing its AI to law enforcement agencies, given the failures of facial recognition, which has often perpetuated racism and other forms of discrimination.
We can win the fight against smart power through alliances between these collectives.
Guido Noto La Diega, Chair in Intellectual Property and Technology Law, University of Stirling
This article is republished from The Conversation under a Creative Commons license. Read the original article.