Published: September 29, 2022 | Parsons School of Design
David Carroll is an associate professor of media design at Parsons School of Design at The New School in New York.
Thomson Reuters Foundation — On Sept. 13 an employee of Boston’s Northeastern University was injured when a package delivered to a virtual reality research lab exploded in his hands. Like a scene from a futuristic thriller, a note accompanying the bomb accused the lab of secretly working for Meta founder Mark Zuckerberg in a plot to take over society through virtual reality.
Investigators are now examining whether the employee may have fabricated the story and staged the incident himself. Whatever the outcome, the bizarre events at a university in downtown Boston have focused attention on the growing public animosity toward modern technology and Big Tech companies, often called ‘techlash’.
Not long ago, optimism pervaded our understanding of technology. But attitudes began to shift in the mid-2010s. By the time Americans voted in Donald Trump and Britain voted itself out of the European Union in 2016, our honeymoon with social media was well and truly over.
Two camps of thought emerged: either digital platforms like Facebook were themselves responsible for these cataclysmic events, or the platforms were merely convenient scapegoats for those who saw the outcomes as undesirable. Either way, the narrative on tech turned sour. Big Tech execs, once apotheoses of progress, became misanthropic robber barons. Politicians who once lauded the success stories of Silicon Valley now reached across newly polarized partisan aisles to try to rein in Big Tech’s unbridled power and abuse of privacy. The era of techlash had arrived.
The epitome of techlash was the Facebook-Cambridge Analytica data scandal that burst into public view in 2018, heralded by a new class of whistleblowers and muckraking journalism. It implicated governments and democratic institutions across multiple countries. Today, it serves as a kind of shorthand in the press for the data privacy scandals that feed techlash sentiment.
The last of the Cambridge Analytica-Facebook class-action lawsuits are quietly concluding with no admission of wrongdoing. (My own circuitous attempts via the English legal system to find out what data Cambridge Analytica held on me ended in similar fashion.) Government probes are ending inconclusively, obstructed by uncooperative witnesses and the limits of transnational authority. It is a scandal that is widely known but poorly understood and little agreed upon: the unintended consequences of driving democracies with data, and the difficulty of holding perpetrators of political data crimes accountable.
Back in Silicon Valley’s home state of California, the feeling of techlash marshaled immense voter support for a robust data privacy law. Sacramento did the previously unthinkable: reining in its biggest industry. Indeed, insiders of the negotiations between California state lawmakers and tech lobbyists wrangling out the California Consumer Privacy Act described to me how the phrase ‘Cambridge Analytica’ served as a kind of kryptonite incantation against industry protestations over new restrictions on business practices.
Infamous congressional hearings saw conservative lawmakers from the Deep South quiz Facebook CEO Mark Zuckerberg on the advantages of European data protection law, and on the 87 million users whose Facebook data was sold by a developer to Cambridge Analytica.
Techlash now also pervades crypto culture. The yearning to destroy the status quo with a silver-bullet technology, the crypto coin, is palpable among the faithful. For some, crypto is the antidote to the unbridled corporate power of the Big Tech titans. But with crypto’s tendency toward fraud schemes and casual criminality, tech pessimism has started to feel like an inescapable gravitational pull.
Techlash effects can be found in the recent stock drops and mass layoffs at tech firms like Meta, Snap, and Shopify. Their fortunes were upset when Apple deployed a new version of its mobile operating system which, for the first time, posed people a breathtakingly honest question: “This app wants to track you. Will you say yes?”
Unsurprisingly, only a single-digit percentage of Apple product owners agreed to ambiguous surveillance. By cutting off the porous and covert data supply chain of trackers embedded in apps, constantly snitching on our locations and behaviors, Apple blunted the ultra-precision of Facebook’s targeting. Meta’s core advertisers saw the efficacy of their ad spend plummet.
Consider also how Donald Trump, who helped instigate the techlash by hiring Cambridge Analytica for his campaign, ironically went on to join the techlash himself. After being de-platformed for conduct that violated corporate policy, he launched his own white-label branded social media service.
Despite the pervasive backlash, we continue to depend deeply on modern technology. Our behaviors barely change. Maybe we install a new privacy tool or dig deep into the settings to find the passive-aggressively-worded privacy control switches.
So what led to the sudden backlash against the mass adoption of digital technologies? Few really worried about the potential harms of mass data abuse until some worst-case scenarios played out. It started with controversial elections, which shattered confidence in a data-driven democratic society. Then the COVID-19 pandemic induced a data-privacy panic epitomized by the Bluetooth-based exposure alert system deployed by Apple and Google. Countries in Europe and Asia devised their own pandemic surveillance tools, all launched with much fanfare but likely negligible benefit to public health outcomes. An app is not going to save us from the coronavirus.
Almost no one was prepared for the leak of a draft U.S. Supreme Court opinion on abortion rights, priming the pump for further erosion of privacy in the United States and raising the question: does the highest court in the land itself even enjoy privacy? The Dobbs decision arrived, terminating a woman’s right to an abortion at the federal level and asserting that the Constitution does not enumerate a right to privacy. It ushered in a set of cruel vigilante laws that incentivize private citizens to catch pregnant folks in the act of seeking reproductive health care.
The worst-case fears of privacy advocates crystallized as attention finally turned to the obscure third-party vendors of the intricate data supply chain behind the screen. The Federal Trade Commission sprang into action, taking on one of these data brokers, Kochava, which purveyed data sets such as device identifiers of people visiting abortion clinics, ready to be matched and joined with personally identifiable information. Should the data broker prevail in that legal case, our federal regulator of tech privacy would be further defanged, just as data brokering is being harnessed as a tool of oppression and the threats to individual autonomy and personal liberties are clearer than ever before.
Let’s be clear: a return to the uncritical period that preceded techlash is neither desirable nor wise. The digital platforms and advertising models conceived in the early 2000s have been found incompatible with the fundamental data rights enshrined in the EU Charter. Most digital services need to be rebuilt from the ground up with data rights in mind.
The right to privacy was barely an afterthought for the developers of these systems. We know this from the Cambridge Analytica class action in California, which corroborates other published leaks depicting Facebook employees realizing the Herculean task of achieving basic legal compliance. Until companies tame their own beasts and design for privacy, explainability, accountability, and human rights, we will not reach escape velocity from the techlash. It is increasingly in these firms’ best interest to re-allocate some of their immense wealth and influence toward this reconciliation and resolution with humanity.
Artificial intelligence and large-model machine learning are advancing with shocking velocity, ushering in the dawn of media modified by algorithms. We do not have the most basic ethical frameworks or legal foundations in place to even begin to grapple with their disruptive, destructive digital energy.
Any views expressed in this opinion piece are those of the author and not of Context or the Thomson Reuters Foundation.