An End to Surveillance Capitalism | Harvard Magazine

Most people can be known intimately, almost instantaneously, in ways that were unthinkable just a few years ago — through the wholesale collection of data from the wireless devices they keep close. “These are devices that we see as extensions of our powers and capabilities,” Mathias Risse, director of the Carr Center for Human Rights Policy, said at a Sept. 20 forum at the Harvard Kennedy School (HKS). “If you’re like me,” he continued, “these devices are the first thing we touch in the morning and the last thing we touch at night. And all the wireless data that comes from these devices is processed for commercial purposes.”

In the absence of regulations against the collection of such private data, its exploitation has made a few individuals and companies fabulously wealthy, in a process that Harvard Business School professor emerita Shoshana Zuboff has aptly described as “surveillance capitalism.” Risse, who will co-direct a new Carr Center program with Zuboff, “Surveillance Capitalism or Democracy?,” said that “the future of our living conditions is being shaped by corporations that are not in business to advance visions of the common good, but are in business—as people are in business—to make a profit.” This, he continued, is one reason these issues have become human rights issues of global dimension.

Risse’s comments, made just a day after the publication of a US Federal Trade Commission study documenting “massive surveillance” of social media users, set the stage for a discussion among four “enormously influential women” who have fought to reform the collection and exploitation of information for profit: Zuboff; European Commission Executive Vice-President for a Europe Fit for the Digital Age Margrethe Vestager (the European Union’s outgoing antitrust chief, who recently won a European Court of Justice antitrust ruling against Google and a tax judgment against Apple); Maria Ressa LL.D. ’24, the Nobel Peace Prize-winning journalist who delivered Harvard’s 2024 Commencement address; and Baroness Beeban Kidron, a peer in the British House of Lords who has fought to protect children’s rights in the digital environment.

“Just keep the market open”

Vestager said that the key message to get across is that “it’s not too late” to stop the exploitation of personal data. She credited Zuboff with providing a vocabulary and understanding of how personal information is being used commercially, and said that once you see that, figuring out how to respond is something that “can be navigated.” However, she stressed that it “can’t happen without systemic responses.” She outlined the comprehensive nature of regulation and enforcement in the EU, which as a first step has passed privacy laws “to make the very simple things clear,” such as “You own your data. You should be the one deciding” what happens to it. More recently, the EU passed the Digital Services Act, which says that democracies can decide what should be considered illegal online, just as they do in the physical world, and that online services should be safe for mental health.

Margrethe Vestager | SCREENSHOT

Another new piece of legislation passed, Vestager said, is the Digital Markets Act, which prevents large digital platforms (“gatekeepers”) that provide services to business users and customers from using their market power to gain an unfair advantage. “Very simple idea,” Vestager said, “just keep the market open, keep it contestable, so that we have choice, because that’s the first step to being able to take action – that you have choice.” For business owners who rely on these platforms to serve customers, their ability to succeed “should depend on [their] idea, [their] work ethic, the people that [they] have on board, the capital that [they] can raise, not on some giant company that has market power.”

Finally, she pointed to the AI Act (not yet fully in force), “to ensure that when artificial intelligence is used in situations that are critical to us as individuals, we are not discriminated against; that AI still serves us as human beings.” All of these laws flow from a simple idea that is nonetheless difficult to materialize: that “technology should serve people.”

A “hierarchy of harm”

Kidron has used laws, regulations, and international treaties to protect children’s rights online and shield them from surveillance capitalism, but she said she worries about patchwork legislation. One law to tackle targeted advertising, another to tackle child sexual abuse — as “the system weaves its way into every part of public and private life” — risks creating what she called a “hierarchy of harm.” That’s “not how we should be doing it,” she continued. “We need bold legislative action … we need legislators who imagine the world we want to live in and think about how technology is going to help us live in that world, not try to tack on a few harms on top of the pie.” And then the rules should be “routinely and ruthlessly applied to the digital world.”

Ressa, the journalist, added that in addition to ending “surveillance for profit,” laws must also prevent code bias: the tendency of algorithms to discriminate against certain groups. And she reminded everyone that journalism, that antidote to tyranny, is itself in danger. Democracy, she said, needs a new system to “stop the corruption of our public information ecosystem. We need new systems of governance, because the old power structures have been upended; and we need new systems of civic engagement.”

“Total Information Awareness”

The United States missed its early opportunity to pass federal privacy legislation partly by design and partly by accident of history, Zuboff explained. She argued that in the early days, President Bill Clinton and Vice President Al Gore ’69, LL.D. ’94 wanted to remove barriers to Internet commerce, emphasizing that the private sector should lead the development of cyberspace. But there was also resistance in Washington, she said, along with an effort to outline comprehensive federal privacy legislation.

Shoshana Zuboff | SCREENSHOT

“A few months later,” she continued, “something really big happened: ‘9/11.’ The conversation immediately changed from privacy to ‘total information awareness.’ And suddenly these upstart companies with their Web bugs and their cookies and their monitoring techniques and their tracking,” Zuboff said, “became little heroes, and the new thing was: Get out of their way, let them go. Because if you’re an intelligence agency, it’s unconstitutional to monitor and surveil a civilian population. But can you stick a big straw across the country and suck up everything that’s going on in Silicon Valley? Absolutely, and you get around the whole thing that way, and that’s how [the current system] is institutionalized.”

Zuboff also drew parallels between the early days of exploitation of personal information and the use of copyrighted data — including books, newspaper content, music, artists’ voices — to train today’s AI systems. She called it stealing. “And stealing,” she added, “is a crime.”

“Who do we want to be?”

During the conversation among these influential thinkers, a question from the audience inadvertently underscored the stark differences in national responses to the rise of an information civilization: a world in which individuals barely exist unless they have a digital presence. The questioner asked Vestager whether legislation in Europe was suppressing AI: “We don’t see big AI companies in Europe anymore,” the HKS public policy student noted.

For Vestager, this raised a fundamental question: “Who do we want to be?” she asked. “I think Europeans would be very poor Chinese…” and “not good Americans either.” The European model, she explained, is a society built on “infrastructure that is available to everyone: free education, health care systems that are truly inclusive.” “There’s something cultural at stake,” she added, “and I think it’s really important to stay true to the model that you think works for you.” Ressa agreed, saying that the term “innovation” has been used to attack the EU and entrench surveillance capitalism in places like her native Philippines and the United States. “But innovation doesn’t mean you’re better,” Ressa said. “It just means you’re willing to do things that you know are wrong.”

“Innovation? That’s a dog whistle,” Zuboff added, “that says ‘don’t pass laws.’” What innovation really means, she said, is preserving the status quo, allowing AI’s architects to continue to pursue their commercial goals. “We’re never going to have AI for the common good … as long as this oligopoly owns and exploits the entire market structure of artificial intelligence.”

Real innovation begins when the status quo is broken down, she continued. “I promise you, there are millions of people out there, smart, energetic, talented, creative, just like each and every one of you,” she said, addressing the audience of Harvard students, “and they are lining up for their chance to do amazing things with these new technologies that will provide solutions for their communities and solve the world’s greatest diseases and provide solutions for planet Earth and all the things that we really need. And it doesn’t have to be in the structures that are created by the oligopoly. We invent the new structures, and in doing so we invent the rights and the laws and the institutions that keep it all safe in a world of democratic governance.”
