A conversation with Dipayan Ghosh Ph.D. ’13 about his book, Terms of Disservice
In his new book, Terms of Disservice: How Silicon Valley Is Destructive by Design, Dipayan Ghosh Ph.D. ’13 offers technical analysis, recommendations for economic reform, and practical ideas for using technology to create an open and accessible world that protects all consumers and civilians.
Ghosh is the co-director of the Digital Platforms & Democracy Project at the Harvard Kennedy School and a lecturer at Harvard Law School. He previously worked at Facebook, leading strategic efforts to address privacy and security issues, and he was a technology and economic policy advisor at the White House during the Obama administration. He received a Ph.D. in electrical engineering and computer science from Cornell and an MBA from the Massachusetts Institute of Technology, and conducted postdoctoral research at the University of California, Berkeley.
At Cornell, Ghosh was advised by Stephen Wicker, professor of electrical and computer engineering and a member of the graduate fields of computer science, information science, and applied mathematics. They share an interest in issues surrounding digital privacy, artificial intelligence, disinformation, and internet economics.
Ghosh and Wicker chatted by email about the new book; their conversation, with minor edits for space and clarity, follows.
Wicker: Hi Dipayan, it was a pleasure to see your new book, Terms of Disservice. What was your main motivation in writing it?
Ghosh: Professor Wicker, it is so great to have this discussion with you! As you know very well, I have a strong interest in data privacy, in large part because of the work we did together when I was your student. Building on work with you in which we studied problems of information and privacy economics, I got a chance to dive head-first into the field of technology policy. This was right after the Snowden disclosures about the NSA in 2013.
There was mounting pressure on the Obama administration to push for privacy reform, and late that year I had the opportunity to join the White House to help the team address privacy reform and technology policy more broadly. I think the two years I spent there were formative for me. I gained perspective on how companies and regulators alike think about striking the balance between regulation and innovation, and that is essentially what Terms of Disservice is about.
After the White House, I went to Facebook to work on privacy and security issues. I left the company a couple of years ago, wrote a few papers that gained some attention, and eventually realized that there was a bigger story to tell concerning the nature of Silicon Valley, its interface with society, and the regulatory regime necessary to diminish societal harms like the disinformation problem. That was the origin of Terms of Disservice.
Wicker: As you know, technology companies collect a huge amount of our personal data. In some cases, individuals have amassed billions in personal wealth without actually making a product. How did we get to this point?
Ghosh: This question gets to the heart of why this industry—which, in the book, I call the “consumer internet” sector—is so prevalent in the modern media landscape.
I think we first should acknowledge that it's in the United States that this corporate culture of uninhibited data collection seems to have emerged first, and indeed the individuals with greatest wealth generated from our personal data are the eight-hundred-pound gorillas sitting in Silicon Valley. Why are the billionaires all American? I think it's a situation best explained by two primary factors: Access and Existing Regulatory Policy.
In mentioning Access, I'm referencing the fact that California is flush with the venture capital that is so necessary to get internet business models off the ground. Consider that many consumer internet firms don't make a profit at first; instead, their investors tell the startups to grow their platforms as quickly as possible by attracting as many users as they can and collecting lots of data on those users.
These are the two resources—attention and personal information—that enable the high-margin targeted advertising regime that has become the dominant mode of monetization in the consumer internet.
Regarding Regulatory Policy, what I am suggesting is that the United States lacks a fundamental, baseline privacy law—a law at the federal level that protects a citizen's personal data from the corporation, no matter who the corporation is or what kind of data it is. There is a long list of reasons for why this might be the case, but the result, in my view, is that it enables this corporate culture of unchecked data collection for profit maximization as a default.
When you don't have a law telling companies that they cannot engage in some activity, they'll do it if it is in their profit interest. In fact, you could even suggest that this is part of the reason that Europe doesn't have its own version of a Facebook or Google or Amazon (that is to say, a company that can challenge the economic power of any of these three), because Europe recognizes a fundamental right to individual privacy, and has long had laws that enforce it in various ways.
Beyond these factors, I think it goes without saying that sometime in the past 25 years, we passed a new technological threshold that enabled the rise of a new kind of business model, one that now underlies all of the consumer internet. The advancement of the technological base of society is an ongoing process, and as Marx noted, it is the nature of that base that defines social circumstances like culture and the corporation.
With ongoing advances in computing and data storage capacity, it suddenly became cost-effective to harvest and hoard personal information, analyze it to derive behavioral insights on individuals, keep them engaged on the platform through algorithmic curation, and target ads at them.
Wicker: I greatly enjoyed your book, and was particularly interested in the proposed regulatory regime. If you could implement a single part of the overall strategy today, what would it be? And what would be the primary obstacles in bringing this about?
Ghosh: This is a bit of a complicated answer, but let me give it a shot. I would say that advancing the privacy reform agenda is most important. For one, uninhibited privacy invasion—or, put another way, uninhibited collection and use of data toward the profit-minded ends of the platform—is the central mechanism that has given rise to the corrosive business model at the heart of today's dominant digital platforms.
A fundamental baseline privacy law in the United States, something akin to the European General Data Protection Regulation, would restrict the ways in which companies collect and use our personal data. What I would prefer is the institution of an opt-in regime (which, as you and I have discussed in the past, can bring participation rates from 90% down to 10%) alongside many other measures, such as use specification and minimization. Essentially, what I want is strict adherence to the Fair Information Practices, which are consistently breached in the course of day-to-day business by most internet companies, including Google and Facebook. In fact, I would go so far as to say that for the vast majority of internet companies, adherence to the Fair Information Practices is not even on the radar.
But part of me would also say that an even more important realization that regulators could make is around the natural monopoly status of the dominant digital platforms. I'd contend that they are monopolies, in that they possess controlling shares of various consumer internet subsectors (e.g., social media, internet search, internet-based text messaging, e-commerce, online video, picture-sharing, email, and so on). And when I say controlling, I mean typically majority shares, and in several of these cases, over 85% of the market.
But I'd go one step further, which I think many people still hesitate to do, and suggest that they are natural monopolies in that they exhibit, and benefit commercially from, extremely strong network effects: with every new user and marginal engagement, their valuation increases, and at an increasing rate. This allows them to raise substantial barriers to entry, pursue anticompetitive practices to shield their market position, and, most egregiously in my view, engage in the exploitative rake of monopoly rents via a novel form of digital currency that they extract from us, their users.
The obstacles to a baseline privacy law and a federal assertion of natural monopoly status are substantial: lack of public awareness, industry lobbying, the broad exertion of tech's political and economic weight, the complexity of the regulatory problem, the inadequacy of regulatory tools, and political disagreement. American policymakers will eventually have external forces pushing them to act—namely, foreign governments will determine the regulatory norms for American firms in the absence of Congressional action.
Wicker: It is fascinating to think of our current problems with privacy as “superstructure” that arose naturally from a foundation of basic economic relations. We seem to have moved from large industries that make, for example, steel and automobiles, to corporations whose goal is to peel away hundreds of dollars from each of several hundred million people. I think they call this poverty capitalism. Do you think that the Internet and app-based consumer culture has accelerated this trend?
Ghosh: Your description of the economic situation on our hands is, in my view, precisely correct.
Society has moved online. Just this week, I believe, digital advertising surpassed traditional advertising in terms of revenue, in part because of the social distancing policies forced upon us by the COVID-19 pandemic. I see no great problem with our having moved our attention to digital platforms over traditional media in and of itself; society always has a technological circumstance on which it sits, and which in part determines its expression of culture.
The fact that we now have the technology that enables the consumer internet business model is no surprise. Given the advances in computing described by Moore's Law, we were always going to get to this point in a radically capitalistic society. Our constant participation in this consumer culture, as you put it, is part and parcel of the problem. But that said, I see very little we can do about diminishing that culture. Consumer education of the masses is a difficult task, especially when dealing with the complex business strategies thrust upon us by the likes of Facebook and Google.
The internet firms have benefited from that move online. When I say they are natural monopolies that extract monopoly rents, what I am attempting to suggest is that they are extracting from us a new form of digital currency, all the time, everywhere. That currency is a complex combination of our personal data and attention.
For us, the decision to join Facebook is price-inelastic: we see the opportunity to join the platform and link up with our friends, and we will pay any amount of data-and-attention currency to get that access. And it is being extracted at a monopoly rate—in this case, an unbounded rent—in part because it is a frictionless transaction for us; we don't feel our wallets getting any lighter. It's a hack of our culture and our individualities. The good thing is that in the long run, hacks are discovered and busted.
Wicker: You mentioned the GDPR, a European regulatory regime that is far stricter than anything that we have considered in the United States. In the past, Google, Facebook and others have been able to maintain their business models while working in Europe through the use of safe harbors—a form of “get out of jail free” card for those who market our private information. Do you expect this sort of evasion to continue? Or might Europe start to force U.S. companies to play a bit more fairly?
Ghosh: Money makes the world go ‘round, even the political world; and I think that fact has influenced Europe's actions to an extent in recent decades. That sounds negative, but let me explain.
As a general matter, the European approach to economic design—including, most notably, through the EU and western European national governments—places individual interests and economic equity ahead of the freedom of markets. This distinguishes Europe from the United States, which practices a radical form of capitalism, at least relative to, say, the German approach. Consider, for instance, the European approaches to healthcare and unemployment: whereas in a place like Germany there is strong economic protection even if you don't have a job, the United States for the most part lets companies decide what is best for the consumer. In my personal view, that's not always the best approach.
We can clearly see why when it comes to big tech. In the digital realm, Europe, again, has a protective approach; the EU recognizes a fundamental right to individual privacy, and the GDPR is the ultimate expression of that right in the digital realm. But you are correct—despite such protective privacy laws, Europe has carved out these safe harbors, including the Safe Harbor Framework and most recently the Privacy Shield, so that American firms can still operate in Europe even while they might technically be in breach of the baseline privacy laws there.
There was economic pressure for Europe to do this; when the Safe Harbor Framework came about shortly after the turn of the century, the internet was seen as a globalizing force, and American industry was at its forefront (as it continues to be). In one sense, without the carve-outs, Europe would slip behind the rest of the world by not participating.
I think, though, that the terms of the economic debate are ever so gradually shifting. European governments are increasingly being pressured by their people to uphold their privacy rights, and they are well-positioned to protect both their markets and their consumers' rights as needed. We're seeing this, for instance, in the rhetoric of top officials like Margrethe Vestager, who are examining how to bring the full weight and rigor of European regulatory authorities to confront the likes of Facebook, Google, and Amazon on both the privacy and competition policy fronts.
Wicker: You raise an important point: the internet economy is built on attracting our attention and purchasing power. Most people would say that we are freely choosing to spend our time visiting with “friends” on Facebook or shopping on Amazon. To what extent is that sense of freedom a mirage? I have always thought that we are raising our children to be consumers first, and all else second. The conversion of the internet (originally intended as an information resource) into yet another mode for distributing media programming is, I think, a reflection of this.
Ghosh: There is no doubt in my mind that the internet is no longer the space of individual freedom that it was in the first years after its inception as a consumer technology.
At one time, the internet was a space where we could exchange ideas without fear of corporatization or surveillance. But as certain technologies evolved and were deployed over the internet, that has completely changed. The industry—internet entrepreneurs strolling arm in arm with American venture capitalists—saw the internet as an open green field ripe with economic potential, a vast commercial opportunity to fundamentally reshape the media ecosystem as new capacities in data storage and computing emerged.
We had a few first movers—Facebook and Google, to name two—who settled upon user interfaces and experiences that could attract aggregated attention, and who eventually morphed their revenue models so that they could monetize that attention and the resources generated as exhaust, such as consumer and proprietary data. And those first movers monopolized the industry.
Now, as consumers, we sit in the digital realm amongst a few behemoths. Consider it: relative to the amount of time that we spend on the internet overall, we spend inordinate amounts of time on one of the dominant platforms. And even when we don't, we are on websites carrying third-party cookies from the dominant firms.
We're living within a corporatized structure that conducts uninhibited surveillance to generate profit. We are not free consumers, but because we are experiencing this exploitation in the digital realm, we can't feel it.
Wicker: I’d like to conclude our conversation with a bit of reminiscing, if you would indulge me. What lessons stand out to you from our time working together at Cornell?
Ghosh: A few things really stand out to me, and of course there is so much that is hard to recount in this format. But the foremost lesson is the importance of having a solid foundation of knowledge. I remember this well from your approach to our research.
It was so important of course to take classes in mathematics and economics to enable my work over the years. Game theory, analysis, information theory—these were all important for the things we wanted to accomplish, and I gradually understood the importance of patiently establishing such a foundation before trying to have a broader impact.
This is an approach I've brought into all my work since, in technology and in policy. Even now, when I am asked to undertake new efforts, the first thing that I try to do is read—a lot—about the issue from every angle that I can find, before ultimately developing my own policy perspectives and economic ideas.
Wicker: How were your ideas and concerns about technology shaped or formed by your academic experience?
Ghosh: This really comes down to the interdisciplinary approach Cornell (and you) encouraged throughout my doctoral studies. Early on in my first year, we established together that I would work on studying issues around privacy. It was a hot topic at the time, but not quite in the way it is today. It was still more of a technical issue, and the problems of corporate surveillance we experience all the time today were only taking shape.
That was my foundation—to first understand how we could technically design systems and networks from what we called a "privacy-aware" perspective, and next to determine what kinds of economic and regulatory conditions would enable the adoption of such non-invasive systems at equilibrium.
Throughout my time in the world of public policy, I have tried to approach issues using this same lens of analysis—which is why I often tend to feel that regulatory rather than market-based approaches are needed to generate the kinds of incentive systems needed to change the economic terms around consumer privacy.
Wicker: Lastly, when did you last visit the Ithaca campus? Is there anything in particular that you miss?
Ghosh: It has been too long! I think the last time might have been in 2015 when I came to give a talk at a small conference on privacy issues. That was such a pleasure and even since then, the University has changed so much. I hope to be back soon, and I hope that life can return to normal on campus as soon as it is safe!