Podcast / Transcript and Show Notes

Season 2, Episode 3: How Business Models Have Shaped Big Tech

Listen now

Hear the conversation between Kurt Andersen and Roger McNamee. Subscribe on Apple, Spotify, Stitcher, or wherever you listen to podcasts.

Transcript for Season 2, Episode 3: How Business Models Have Shaped Big Tech

Kurt Andersen: Welcome to The World as You’ll Know It. I'm your host, Kurt Andersen. 

We’re discussing the future on this podcast; this season, the shape of things to come specifically as a result of technology.

Roger McNamee was a Silicon Valley true believer and investor way back in the 1980s, when the digital revolution was just beginning. He was sure that all the smart, idealistic young people around him there were transforming the world in good ways, making culture freer, democracy better, everything more accessible. He became a big-time tech insider, made tons of money for his investors and himself, thought he was doing well by doing good.  

When Facebook was still a small startup in the early 2000s, he became a mentor to young Mark Zuckerberg, and introduced him to the woman who is still his number two at Facebook, Sheryl Sandberg, and thus made lots more money when the company went public in 2012.

But then Roger McNamee started having doubts. Five years ago those doubts reached critical mass and he decided the downsides now way outweighed the upsides -- that social media and the giant tech companies pose existential threats to society, democracy, culture, individuals, people’s very tethers to reality. He wrote the New York Times bestseller, Zucked: Waking Up to the Facebook Catastrophe, and became a crusader against what he and his former friends and colleagues have wrought. 

Roger McNamee, welcome. 

Roger McNamee: It's a great pleasure to be here. 

Kurt Andersen: So as you tell your story in, in Zucked, until 2016 when we all found out Russia was using Facebook to manipulate our election, until that freaked you out, you hadn't really imagined the, the real downsides to Facebook and the rest of social media? 

Roger McNamee: Kurt, I grew up believing that technology was a force for good, and I was drawn to Silicon Valley by the clarion call of Steve Jobs and others about this notion of using technology to empower people. And from 1956 until the early 2000s, that was the cultural center of gravity of Silicon Valley. And I bought into it hook, line and sinker. And I was a very happy investor, analyst, but also booster of that value system. And it was not until roughly 2008 or 2009 that I first saw things going on in the valley, culturally, that were uncomfortable for me. The first company I saw that disturbed me was Zynga, which made Farmville, and that was the first time I saw really exploitative use of Facebook as a platform, you know, for profit. And then Uber was the one where I realized that the culture had really changed in a way that was incompatible with my value system. And in 2012, I told my partners that I thought I had passed my sell-by date, that the Valley was increasingly adopting predatory business models, using data to exploit the weaknesses of the consumers that used the products. And I didn't really understand it as it was going on. So I started to do some research, but I told my partners I was done and the fund ended in 2015 and we did not do a follow on. And keep in mind, this is a fund I started in uh 2004 with Bono from U2 as my partner. So it was a pretty exciting thing. And, you know, I met Mark when he was 22 years old. I gave him some advice that was helpful to him. We became close and for three years I advised him. I helped to bring Sheryl Sandberg into Facebook. 

Kurt Andersen: Whom you met through Bono, right? 

Roger McNamee: Yes, Bono introduced us back in roughly 2000 when Sheryl was still at the, at the Treasury Department. And when she came out to Silicon Valley, I was in those days part of the venture capital firm Kleiner Perkins Caufield & Byers. And so I introduced her to my partner, John Doerr, who got her into Google. And, you know, so we were super, super close. And it never occurred to me in those days that Facebook would be anything other than a force for good, because the initial ethos was built around two things I really believed in. Authenticated identity, which they gathered by forcing the initial users to sign in with their email address from a school, so you knew exactly who you were dealing with. And the second thing was Mark gave you the first privacy controls that any Internet platform had given to users. And I was convinced, it turns out wrongly, that those were things he really believed in. And it turned out that, in fact, those things existed because they made the scalability of the network much easier. And that at the first opportunity, his view of privacy went out the window and authenticated identity followed shortly thereafter. 

Kurt Andersen: And I want to get back to him and the various choices he and others like him have made in these last 15 years. But I do want to say, because we are at a moment in time, and thanks in part to you and your transformation into an activist five years ago. It certainly looks like 2021 could be a turning point. I see that this executive order President Biden put out could basically be summarizing your take: a small number of dominant Internet platforms use their power to exclude market entrants, extract monopoly profits and gather intimate personal information they exploit. His antitrust troika really now consists of, at the Department of Justice and the FTC and the White House, maybe the three most prominent, aggressive progressive antitrust leaders. There are a bunch of bipartisan antitrust bills in Congress directed toward these giant tech companies. Are you more hopeful than you were a year ago? 

Roger McNamee: So I am hopeful but not super confident. And the reason is because we live at a moment in time when our democracy is under assault from within and the tool that those who are trying to defeat democracy are using is Internet platforms. And we are in this incredibly awkward situation where in order to save democracy, in order to end the Covid pandemic, in order to restore people's right to make their own choices, in order to restore capitalism to the kind of competitive, vibrant thing that it is in its best days, we have to force radical changes to the business models of the very companies on which democracy deliberates its choices. That's very much equivalent to changing a tire on a car while you're moving. 

Kurt Andersen:  So, so we've taken the first step in a, in a long journey, I guess, is what you're... 

Roger McNamee: Well, I actually applaud President Biden. I think the choices that he has made to run the antitrust division of the Justice Department, to run the Federal Trade Commission, to be on the White House Economic Council, to run the Consumer Financial Protection Bureau and even to run the Securities and Exchange Commission, these are all, I mean, from my perspective, they're the perfect people. They give us the greatest opportunity to effect change. However, we should recognize that for the last 40 years, this country has deregulated business. It has defunded the agencies that do enforcement of regulations, so corporations have had free rein for 40 years, and again, Google and Facebook came along halfway through, enjoyed all the benefits, and because of the nature of technology, because of, frankly, genius business plans, and very capable people, they have been able to create economic value for themselves at the expense of public health, democracy, autonomy and competition. Their basic notion is to package their users into clusters that can be sold to advertisers. The more extreme the behaviors that they can cluster around, the more valuable that cluster is. And so for Facebook, for YouTube, for Instagram, Twitter and for other platforms that are out there which use recommendation engines, driving people towards more extreme groups, more extreme ideas, more extreme behaviors is essential to the business, which is why when people say, well, you just need to add more moderators to clean it up, I mean, that's ridiculous. That, that is literally like trying to, you know, clear weeds by trimming the very tops of the weeds. You know, if you don't take the roots out, you are stuck. 

Kurt Andersen: Looking back, uh seeing this over the last 20 years, since the first now failed social media entities came along, and then Facebook, if in an alternative history, in which magic was permitted, if a genie came and said, ‘Roger, you’ve got three wishes for specifically different choices about business models or practices or whatever of the last 15, 20 years,’ what are those three wishes to make it not so existentially dangerous as it's become? 

Roger McNamee: So the most recent was in 2013, the Obama administration had an antitrust case under Section One of the Sherman Act against Google that they chose not to pursue. Had they pursued it, essentially nothing that happened in 2016 or after that would have been possible. The Obama administration had the opportunity to nip surveillance capitalism, which is a concept coined by the Harvard professor Shoshana Zuboff, that describes this business model -- invented by Google, adopted by Facebook. But the Obama administration had a case against Google that could have stopped it before any harm was done. And for reasons that are known to the people involved, but, you know, I can only speculate at, they chose not to pursue it. And that, more than anything, is why we are here. But if you want to go back, there was literally a piece of just randomness that occurred in, ballpark 2003, that really started the whole thing. In March of 2000, the Internet bubble burst and it burst in a way that was catastrophic for the venture capital industry because in that era, each company needed to spend 100 to 150 million dollars to create the infrastructure for a website. So all those dotcoms, each one was independently creating the stack of technology to support itself -- unbelievably expensive. 

Kurt Andersen: I was doing exactly that in March of 2000, by the way. 

Roger McNamee: OK, and so when that bubble burst, the venture capital industry left the playing field for a number of years. And it was just unbelievably bad timing because that was at a moment when a fundamental shift took place in the underlying technology itself. From 1956 until about 2003, the engineers in Silicon Valley were always resource constrained. No matter what you wanted to do, there was never enough processing power, memory, storage or bandwidth. You always had to listen to customers and do the thing that the customer valued most, what they were willing to pay for today, because you couldn't solve the whole problem. But in the early 2000s, certainly by 2004 in the world where things were wired, those constraints started to melt away -- first in memory, then in processing power, then storage, and then finally in bandwidth. And in the wireless world, they were all gone by 2010. So between 2004 and 2010, a fundamental change took place because suddenly the entrepreneurs were no longer constrained. They no longer had to listen to customers. They could create their own thing because they had as many resources as they wanted. And so instead of spending $125 to $150 million to create this stack of technology to support your website, all you needed was a credit card. So the cost of doing a startup fell from $150 million to maybe $10 million. And in that environment, you know, venture investors, had they been alert to it, would have realized, wow, we can do 10 to 15 investments for the same price we were doing one before. We can diversify. The risk is so much lower. For the first time, they could start to look at really young entrepreneurs. I mean, if it costs 150 million bucks, you don't turn that over to a 20 year old. You turn that over to a 40 year old who's been down the path before. And so this change was humongous. And the first people to recognize it were the founders of PayPal. 
So this is Peter Thiel, Elon Musk, Reid Hoffman and that group. And they saw it before anybody else. And they also recognized that that would allow the World Wide Web to pivot from being a web of pages, to a web of people. This is where social media comes from. And so they went out and they literally started, first LinkedIn, and then they funded Facebook. And the reason this is so significant is because that group of people had a very different value system than the generation that produced Steve Jobs or Nolan Bushnell or the folks who had dominated the personal computer era. I mean, they were, they were like a really extreme version of Bill Gates. And that has had an enormous cost to us ever since. If a different group of people with a different value system had had that insight first? I think we'd be in a completely different place. 

Kurt Andersen: Interesting. Interesting. So your, that, your second genie wish is: have some other, better group of people make this discovery. 

Roger McNamee: I would like to take away the term ‘better.’ I'm just saying it's different. I spent a lot of time talking with people who live in that world and they are not bad people. I understand the value system, but it is fundamentally different if you think about it this way. When you go to engineering school, one of the things they drive into you is the importance of efficiency. They also teach you about speed and scale and things like that. But in the old days, pre 2004, scale wasn't really practical. You know, that was something you got to over 20 years, not something you got to overnight. But once all the constraints went away, if you combined the classic engineering focus on efficiency, with a relentless focus on speed and scale, you could more or less overnight create global products with huge consumer audiences that were based on the premise that data would replace oil as the most important raw commodity in the economy. And that was the insight that the team at Google had in 1999 or 2000. The thing that Shoshana Zuboff writes about so brilliantly in The Age of Surveillance Capitalism, this notion that you can create a model that converts human experience into data, you can make predictions about human behavior that you sell to advertisers. You can create recommendation engines that steer human behavior, manipulate human behavior in areas that are desirable to you. Now, if you had a different value system than the one that the PayPal mafia had, you might have applied all of that to making people healthier, more successful at work, happier. But what they chose to do instead was exploit the fact that data was really about power. People still think the issue is: am I giving up my data for targeted advertising? That hasn't been the problem for at least five years. 

Kurt Andersen: As you explain in the book, the little bits of information about our hobbies and our friends that we’ve already provided to Facebook and the rest are baked in, and now the algorithms see what we do and can predict what we’ll do next and push us in those predictable directions, right? 

Roger McNamee: Well, think about that in the context of the January 6th insurrection. Those police officers, it didn't matter whether they were on Facebook or not. The people who were the insurrectionists were, to one degree or another, believers in the QAnon conspiracy theory. They had, for the most part, been radicalized on Facebook, brought into that world. Then QAnon morphed into Stop the Steal and Stop the Steal then organized the insurrection itself on Facebook, and the poor police officers who were attacked that day, the members of Congress who were attacked that day, it didn't matter what had happened to their data because the real world consequences of other people being manipulated had suddenly become their reality. And people lost their lives, and many people were, were hurt badly and the country suffered a setback unlike anything it had seen since the Civil War. 

Kurt Andersen: It's analogous to people who choose not to get vaccinated because of the misinformation they've received on Facebook and other places and then end up hurting others as a result.

Roger McNamee: Precisely.

Kurt Andersen: But the third, so it seems to me, I mean, maybe you're going to get there. But I came across in doing research for this show, this extraordinary paper that I'm sure you're familiar with, that the two founders of Google wrote in 1998 saying, among other things, how bad an advertising supported search engine would be. Isn't that really the key? I mean, that if it were like Wikipedia -- nonprofit -- if it were subscriber based, if any number of other models obtained other than advertising, we would not be in such a rotten place, right? 

Roger McNamee: Well, that's certainly true. I do think, though, that it's too facile to just blame advertising. There are lots of forms of advertising that do not lead to insurrections, that do not lead to pandemics spreading outside the bounds of what's reasonable. To me, the issue is that targeted advertising can, in fact, be a valuable service. What's going on here is something fundamentally different. This is the use of data to inflame emotions, to change behavior, not for the benefit of the user, but for the benefit of the platform, for the benefit of the advertisers. And what's really wrong with it is that almost every large corporation in America has adopted some form of surveillance capitalism. Everybody wants a piece of this pie. Everybody gathers data. They all trade it with each other. And we as citizens have in the United States no rights, none that are material at all. The way to think about this is that, you know, there are four issues that we have to deal with. Public health -- disinformation is the big problem there. You've got democracy, where hate speech, conspiracy theories and disinformation collectively are a huge part of that problem and actually direct manipulation of, of the politics itself by these platforms. You have the right to self-determination, your right to your own autonomy, make your own choices. That's the privacy issue. And then you have the issues of competition. And I think this is the first time in the history of the United States that any industry has done so much harm in so many areas. You know, we have faced industries before. Think about the chemicals industry. The entire business model of the chemicals industry in the 50s depended on their ability to dump toxic materials wherever they felt like. And, you know, there were public health disasters from that, but also environmental disasters. But I'm not aware of them having the same impact on democracy or the same impact on people's autonomy. 
So go back before that: 1938 we passed a law outlawing child labor in most instances. Why? Because industries like the garment trade had built business models dependent on child labor, you know, and there are industries before that. The Pure Food and Drug Act in 1906 was passed because the food supply system was unsafe. Things that were called pharmaceuticals were dangerous. And we created a set of rules to protect people. We did that with railroads before that. We did it with the building trades before that. We have faced this problem before. And this is the first time that the country has struggled in this way to prioritize the interests of the people over the interests of a tiny number of capitalists who benefit from the status quo.

Kurt Andersen: And that’s partly because it's harder to understand than rotten meat or dangerous drugs or cars that blow up, which has slowed down public recognition of the issues, I think. 

Roger McNamee: But I would say that the greater problem is that these companies have made that reality. And, and so, like the tobacco companies and the energy companies before them, they have used the openness of our society to undermine the public's ability to comprehend a life threatening problem. Here's another way of thinking about it that might be helpful. The Internet was created by the Defense Department in the 60s as a way of reducing our vulnerability to a nuclear attack because the notion was, all of our computer systems, the things we used to command and control the military, were centralized. And there was a great fear that a well targeted nuclear attack would take out our ability to communicate. So the notion was, could we build a distributed communications system? That's what the Internet was. And then Marc Andreessen and his buddies at the University of Illinois come up with what became the Netscape browser, initially called Mosaic. And that popularizes the World Wide Web in this incredibly, almost utopian, highly distributed thing. You know, if you look back at the early days of the personal computer industry, before Microsoft centralized it, there were a zillion vendors and there were multiple operating systems and a lot of things going on. And then Microsoft figured out, aha, if we can standardize everything, we can centralize it. And so we think of the Internet as this highly distributed system, but it's not even remotely distributed. It runs through about five companies, of which Google is by far the most important, you know, in terms of economic value. And Facebook is the most important from a political perspective. You know, yes, they use advertising models, but centralization is the thing that's the problem. That is why antitrust matters now, not because antitrust will take us to a better world, but because antitrust is a tool. It's like a, it's like a blunt instrument that you can use to smack ‘em over the head. 

Kurt Andersen: And the tool that we've been using for a hundred and thirty years, that we, America, invented. 

Roger McNamee: And well, keep in mind, the country was founded on the principle that monopolies were associated with monarchy and that they were bad. 

Kurt Andersen: But just practically speaking, I mean, these tools that are, as you say, blunt and some of them need to be adapted for data in the 21st century. But we've been to this rodeo for one hundred and thirty years, you know? 

Roger McNamee: The Industrial Revolution allowed concentrated economic power to build and we had no tool kit for handling it. So between 1870 and 1914, we went through the process of figuring out what we were going to do about it. And in 1914, we passed the Clayton Act and the Federal Trade Commission Act. 

Kurt Andersen: Right and history repeats and here we are again. 

Roger McNamee: Exactly. And so the process repeats: we distributed, and then the world centralized again. And this time it went from centralizing on raw materials to centralizing on data. Google and Facebook each have roughly twice as many active users as there are people in China. So these companies have an impact on civilization that is unprecedented. I mean, I think the oil companies have had it also, but it's taken much longer for them to produce the kind of impact that these guys have managed to do in a couple of decades. 

Kurt Andersen: Well, and unlike the oil companies, which didn't have a direct hourly, daily impact on how people are thinking and, and the culture in the larger sense, that's what these companies are all about.

Roger McNamee: So let's come back to the issue of value system, because I think this is the place where, where the Internet platforms really differ from market dominant industries of the past. So engineers are trained that efficiency is the most important value on earth. And if you're making a motor, a small application, that is demonstrably true. The problem is when you get to the scale of Facebook and Google, efficiency comes into conflict with other values that matter in society. And so if you think about a country like the United States, our founding principles, our founding values came out of the Enlightenment, you know, things like the right to self-determination, democracy, those are Enlightenment values. Think about it. They are inefficient by design because deliberation is required and deliberation is, in an economic sense, inefficient because it takes time. So in a conflict between efficiency at global scale and democracy or self-determination at global scale, efficiency is going to win every time because it has the advantage of not requiring deliberation. But it's worse than that because the very companies who are espousing efficiency, control the communications platforms on which the country is trying to do its deliberation. And they have the ability to influence those things in ways that we cannot fight because they use algorithmic amplification of things that trigger people emotionally.

Kurt Andersen: And by, and if we could just slow down for a minute, when we say algorithmic amplification, we mean sending people who are interested in whatever, let’s say, possibly, the Wuhan virus... 

Roger McNamee: I use it in a different sense than that. This is the concept that says that a news feed can be organized in lots of different ways. Right? I can just get updates from friends of mine. I can just have them in reverse chronological order so the most recent stuff is first. Or I can sit there and say I'm only interested in baseball or I'm only interested in politics. 

Kurt Andersen: Just like in the old days, you could subscribe to The National Enquirer or The New York Times. 

Roger McNamee: Right. In this context, what this means is, the company sits there and says, my business interest is based on keeping this person not only on my site, but I need them to share stuff. I need them to like stuff. I need them to interact. I need them to be emotionally engaged with the content. What content does that best? And it turns out that for most people, there is a predictable relationship between hate speech, disinformation and conspiracy theories and their engagement with the platform. And so you use algorithms to promote things that generate engagement and the algorithms, which literally don't care what the content is, sit there and go, if I keep throwing this hate speech at people, they're going to react, not because they like it, but because it triggers their fight or flight. And so in that context, algorithmic amplification is a tool that serves the interests of the platform, not the interests of the consumer. And the key point, they say, oh, no, we're only giving you what you like, but that is nonsense. What they're doing is giving you stuff you react to, much of which you’re reacting to out of fear. 

Kurt Andersen: Right, by turning our natural instincts and interests extreme, feeding us those fears or whatever constantly, and over-feeding us. Plus, they are these giant monopolies, really -- Facebook and YouTube -- which in the past, in terms of information, with a century of antitrust laws had never been the case, and then everything changed in 1980.

Roger McNamee: So Google's coming along in the middle of this thing and they benefit from the fact that things are already completely deregulated. And then they further benefited from essentially at that point, almost 50 years of Silicon Valley producing nothing but products that empower the people who use them. And so they were able to use sleight of hand to create the illusion that all technology is good in all instances. And this is really the con that the Obama administration fell for. And they fell for it really hard. I mean, it helped them get elected. You know, it seemed like it was nothing but good. And I will be honest with you, I was one of the people who thought that that was how it would turn out, because I thought that these people would recognize as they got larger that they exist in a community, that they have an obligation to others, that they would mature. That Mark could, as he got older and as he had children, begin to recognize that he lived in a community and that he had obligations to everybody. The part that really shocked me and pained me, and that became clear to me beginning in January of 2016, was that, in fact, not only had that not happened, but that the success of the platform, the evolution of the technology, the failure of government to recognize the threat had coalesced into creating a sense of invincibility. And the difference was that the... My side, which is the side that likes democracy, that likes personal autonomy, didn't understand what was going on, so it was asymmetric. I think it would be completely legitimate for the country to just stop everything and have a debate. Maybe we would like the world Google and Facebook want for us better than the one we have today. But we should have an honest conversation. I mean, the way I think of this is that Google and Facebook aspire to a corporate led version of China. 
It's one where you use technology to create a form of authoritarianism that's designed to create economic growth. And, yes, it will bring up some people along the way. And I think they, you know, very much aspire to China's level of control, which is why, you know, whenever they're framing national security things, they do it in terms of competing with China. And I look at this and I go, “What utter nonsense.” I mean, if we want to compete with China, the best way to do it is not by doing what China does well, but by doing the things we do well, which is entrepreneurship, diversity, small businesses, creativity.

Kurt Andersen: And the things we did well back in the 50s and 60s and 70s, which is a lot more fairness and economic growth, the American Way.

Roger McNamee: And the thing that makes me most sad: you and I grew up with parents who went through the Depression, went through the Second World War. They believed in collective action to make change. And so the environment following the Second World War was one where tax rates were incredibly high because we were investing in public goods, in things that we all shared. So health care, education, transportation, the space program, things that were universal goods. And since 1981, we've been operating with this different value system that says, no no, each one of us is the Marlboro Man. We are each on our own path. We owe nothing to anyone else. 

Kurt Andersen: And speaking of the psychology and the generational psychology of Mark Zuckerberg and people under 45 or 50, you’re saying they’ve only lived in an America with this new governing paradigm of everybody for themselves and there is no such thing as the common good. There is no such thing as excess profits or excessive corporate power, right? So you’re kind of saying it’s not totally their fault. That these young rational college-educated types were raised to sort of ignore bad impacts on individuals and societies of what they’re doing at Facebook and YouTube. Especially impacts on less rational people, people who aren’t like them.

Roger McNamee: I think the people inside Facebook, people inside Google are very well intentioned. They're really capable, very bright people. I believe that they have a value system and a set of goals that undermine the things that I think are important about the United States of America and are important about most Western democracies. And they have been -- I think the charitable term would be reluctant -- to acknowledge any responsibility because, again, as engineers and looking at efficiency, when they look at a failure, let's say the ethnic cleansing in Myanmar... So when they look at those things, they view those as experiments that did not work out, and that they are a cost of doing business. In fact, there is a legendary letter to the Facebook employees by Andrew Bosworth after they turned on Facebook Live, which is their real time video product. And almost immediately a man was killed and it was filmed, right. And he described this in a letter to the employees as, ‘hey, we're really big. This is a cost of our mission. This is something that's going to happen.’ In fact, he even made the point that someday it'll lead to terrorism and we just have to accept that as the cost of achieving this incredibly important mission. They really do believe in the mission. 

Kurt Andersen: You're suggesting that they actually have a vision of this Singapore writ large -- I don't know, this undemocratic, efficient world that isn't just about making money. Many of them, maybe Mark Zuckerberg, actually believe that that would be a better world. 

Roger McNamee: Oh, definitely. And the way I would frame this -- and I think this is the way to think about it -- is that at the beginning, none of them appreciated how big they would become. So at the beginning, it was hard to imagine harm coming from this. I mean, let's take Google Maps as an example, right? The notion of Google Maps is it's on your phone, right? You can get anywhere and it gives you all this great advice. So you wake up in the morning and...

Kurt Andersen: Best thing about the Internet as far as I'm concerned. 

Roger McNamee: OK, well, I'm about to change your mind. 

Kurt Andersen: I know you don't use it anymore. 

Roger McNamee: And so with Google Maps, you have a 30-minute commute every morning. Every once in a while it says to you, 'no, no, you can't take your regular route. You've got to take this other one that's going to take you twice as long.' And, you know, you trust it because your assumption is that it's working for you. The reality is that Google views it as its job to apply an engineering concept called load balancing. And load balancing is about making the whole system of traffic as efficient as possible. And on any given day, some people need to be sent on inferior routes to optimize for everyone else. And it may be that everybody's OK with that, but they aren't given a choice. They aren't told that that's what's going on. Google's making that choice for you; that is efficiency over autonomy. 

Kurt Andersen: So they are taking over what a government would be perfectly within their rights to say, “we're going to do this and this will make everybody's life better, OK?” But they're just doing it, unilaterally.

Roger McNamee: They're doing it unilaterally, and they're doing it in secret. And they do that in every one of their businesses. And the problem is that it has evolutionary advantages, because it doesn't require deliberation and because under our current rules in the United States, there is literally nothing to constrain them except for two or three tiny little legal things that we might be able to apply to buy us time. But I mean, we're really hurting on this stuff.

Kurt Andersen: That is a good segue to the legal and governmental changes we need to fix this. And the thing is, we've been through it before, right? Like you say, in the late eighteen hundreds, big corporations are invented and new technologies come along, so we -- the people -- invent the apparatus of regulation to rein them in. So now we need to look at Facebook and Google and the rest like that, regulating their size and power and their ability to hurt us. And in your book you talk about Ma Bell, AT&T -- the total telephone monopoly for almost a century, until starting in the 1950s, when we decided yeah, they need heavy regulation, and then a generation or two later broke them up. So it happened kind of slowly, actually, in the second half of the 20th century, and it seemed like an illuminating model for what you think we might do now, right?

Roger McNamee: Sure. So the AT&T monopoly followed the telegraph monopoly. We made a decision in the 19th century: we were not going to allow Western Union, which was the telegraph company, to go into telephony. So the Bell System was created as a separate thing, and it was relatively rapidly viewed as a public monopoly, something that would be regulated and operated, as you know, in the public interest -- given an opportunity to invest, to make a lot of money, but at the same time not given total freedom.

Kurt Andersen: Like electric power companies for the most part. 

Roger McNamee: Correct, same idea. In 1956, the Justice Department's Antitrust Division entered into a consent decree because it was clear by then that a new thing had come along called computers. And the question was, would AT&T be allowed to use its competitive dominance of telephony to then control the computer industry? And we decided no, we were going to say to AT&T: you may not leave your current regulated markets. And that meant that the computer industry, which at that time was just getting started, would have to develop independently, in an entrepreneurial way. And that was unique. The rest of the world let their telephone monopolies dominate computers. So they didn't get separate industries. They didn't move forward rapidly, because monopolies were doing it. And we got a huge competitive advantage. But there was a second piece of that consent decree that was the one that really changed everything. AT&T had a thing called Bell Labs, a scientific research and technology development apparatus that had come up with something called the transistor, and they totally controlled it. As part of this deal, the transistor and a bunch of other things were made freely licensable. That is to say, you could license them from AT&T, not pay them anything, and do whatever you want. And the notion was that that would stimulate a new industry, and boy, did it ever. So that consent decree gave us not just a new computer industry; it created Silicon Valley, because it created the semiconductor industry and everything that followed. And, you know, you would never have had Intel, you would never have had a lot of things, without that. Then the second big case was a Federal Communications Commission case called Carterfone, which was about telecom equipment, because in those days AT&T owned all the phones. They owned all the switches, they owned...

Kurt Andersen: Western Electric phones were just a subsidiary of Ma Bell. 

Roger McNamee: So they owned everything. And so that case was the case that said, you know what, we're going to let third parties make telephone equipment and make networking equipment. So that case had a profound impact because it not only created the telephone equipment market, it created the data market, which created data networking as a separate industry. Then the AT&T breakup came in the early 80s. And with that, you accelerated cellular telephony by at least a decade and you laid the groundwork for the Internet and the World Wide Web. And, you know, it would be a gross understatement to describe antitrust in tech as merely pro-growth, because I think you can make the case that every major cycle since 1956 -- computers, then semiconductors, then telecom, personal computers, all the stuff that came after that -- each one of those was triggered by an antitrust case. And so investors should, in fact, be clamoring for this. 

Kurt Andersen: Right. The last big federal antitrust case, of course, against Microsoft at the end of the 90s, allowed for Google and Facebook to exist. 

Roger McNamee: In every technology antitrust case in history, the target has relatively rapidly gone to all time highs and continued to prosper while you also get this brand new industry. So it is insane that people are fighting this stuff. I mean, it would be absolutely the best thing that could possibly happen to Wall Street. 

Kurt Andersen: So you think that traditional antitrust, breaking up some of the biggest tech companies, makes sense...even though it's a blunt instrument, as you say...and yet you'd go a lot further than that, by changing some basic premises of Internet businesses that you think produce really toxic effects. We need to figure out ways to discourage or even outlaw the worst practices, like digital microtargeting of what information people are served.

Roger McNamee: So if we start with the assumption that we want to be a democracy, that we want people to have autonomy, be able to make their own choices, have opportunity, that we want to have diversity -- if those are the values we're optimizing for, then I think we have to have government intervention in three distinct areas. We need to have safety. What am I talking about? We've talked about what the issues are with Facebook, but let's take it forward a step. Let's look at artificial intelligence. Artificial intelligence is actually a buzzword. It's really a marketing term applied to machine learning and other computer techniques that are designed to take the experience of the past and automate the actions of the future. And so people use artificial intelligence now to review resumes, for predictive policing, for mortgage lending. Well, here's the problem. The experience of the past in all three of those areas is blatantly unfair, right? Racism is a huge issue in mortgage lending. Racism is a huge issue in policing. And, you know, ageism, sexism and racism are issues in the resume review world. And so if you educate and train these artificial intelligences off of those historic systems and make no adjustments, you're going to get unfairness baked into a black box you can't scrutinize, with no right of appeal. And that is precisely what has happened. And that is unsafe. And, you know, in the pharmaceutical industry, we require people to go through an elaborate approval process -- they have to demonstrate safety and efficacy before they're allowed to ship a product -- and nothing like that exists in tech. But it's actually worse than that. There's no Hippocratic Oath in technology. You know, if you are in the building trades, you're obligated to follow building codes. And if you do not do so properly, you are personally liable, right? No such liability exists for software; in fact, there's no accreditation. 
And I sit there and go, the first thing we need to do on safety is recognize that every technology product, before it comes to market, should be required to demonstrate in some way that its makers have anticipated harms and mitigated them before shipping. You know, I would say for certain there are positive use cases for artificial intelligence, but the economics of the harmful ones are so attractive that they've got all the money at the front end. 

Kurt Andersen: So you want, I take it, more than just what we do with drugs, which we didn't used to advertise on television. Like, side effects may include insurrection at the Capitol, or... You want more than that; you want to somehow regulate the YouTubes and Googles and Facebooks of the world to put less of that out there. 

Roger McNamee: Well, to be clear, what I really want to do is I want them to face really harsh consequences when they fail, because I don't want to pretend like a regulator is going to be able to do a perfect job of anticipating what the failure modes are going to be. 

Kurt Andersen: Right, just give them liability.

Roger McNamee: I think, yeah. Making sure that there's very clear-cut liability, where the penalties are so serious that people would not consider cutting a corner, whatever that is. And I think you have to have some forms of accreditation, and people are going to have to sign up for that. And, you know, you have to have whistleblower protection, so that when companies do the wrong thing, the people who blow the whistle are both rewarded for doing that and not harmed later on. But the key point is, people will figure out the details. Safety is the first thing we've got to get right. 

Roger McNamee: The second thing we've got to get is laws in the United States that prevent people's personal data from being used to harm either them or somebody else. And that is really about recognizing that personal autonomy is a really important value, and that a world where corporations know more about you, and know more about how you're going to react to things, than you do -- and can use that in ways of which you are not aware, ways that are completely invisible to you -- is fundamentally wrong. 

Kurt Andersen: But to the point of creating real corporate responsibility for the bad effects and harms of their products, you bring up this interesting existing legal concept that I never really thought about in this context, which is the fiduciary -- your lawyer is your fiduciary, your stockbroker is your fiduciary, and likewise, you say, companies that have so much essential, intimate personal data should have to act, be forced to act, in your best interests, and should get in big, big legal trouble if they fail to do that. 

Roger McNamee: But hang on, antitrust is the blunt instrument. It isn't going to fix the problems. We have to have safety. Again, we have to look at the examples of food, pharmaceuticals...

Kurt Andersen: You want an FDA for this, basically a data FDA. 

Roger McNamee: Well, actually, again, I think creating government agencies to do that is very difficult. But I'm open to the idea. I want to have that debate. You know, I want to hear from the people who really understand how to make that stuff work. But if you sit there and say that the essential cogs are safety and privacy, that we need those two things -- we are currently years away from getting the kind of laws we need. Remember, once we decided that we needed to regulate the chemicals and petrochemicals industries to reduce pollution, it took 20-plus years and something like a dozen laws through Congress before you got there. We do not have 20 years to save democracy. We do not have 20 years to stop the Covid pandemic. We need something right now. And here's the key. 

Roger McNamee: So the third leg of the stool is competition, which is antitrust law. There are certain absolutes in our society -- laws that, if you violate them, mean you've committed a crime. And it turns out there are three areas of opportunity here that the Biden administration can pursue very quickly, and that will buy us a lot of time. Obviously it is really important that we have new antitrust laws updated for the 21st century. That's what those bills in the House of Representatives are trying to be. They're the first step of that process. But we can't afford to wait even for that. We have to have stuff right now. We have to use the tools that are in our toolbox. 

Roger McNamee: So the attorney general of Texas filed a case last year against Google and, by extension, Facebook for price fixing in the digital advertising world. I think there are, I don't know, a dozen or more other states that joined the case. A lot of information has been revealed about it. I've talked to a ton of antitrust people, all of whom think it is the most clear-cut antitrust case in decades. Essentially what happened was that Google had a monopoly of a certain kind of ad network; Facebook pretended it was going to create a competitor in order to get Google to essentially divvy up the market with Facebook; and they agreed to an arrangement that was price fixing. And then apparently there are e-mails that suggest they had a mutual defense agreement, which both confirms the first count and creates a second count. When you attempt to corner a market with price fixing, that is a crime even if you do not succeed.

Roger McNamee: And that's the worst antitrust crime you can commit. It is a felony, and the standard punishment is three-plus years in federal prison for the executives of the companies -- two counts, each three-plus years. The CEO of Bumble Bee Tuna was sentenced to that exact term in October of last year by the Justice Department. In a separate case, a set of financial executives were sent to prison for it. Those cases were maybe one percent as large as Google and Facebook. So my great hope is that Jonathan Kanter, the recently announced appointee to head the Antitrust Division of the Justice Department -- my hope is that he will federalize that case and pursue it as a felony, including indictments of the executives. I believe that with the threat of loss of freedom, the executives of these companies can be made to negotiate in good faith for the first time since their creation, and that we can use that to come to an agreement about safety, to come to an agreement about privacy, that takes today's status quo and blows it up. Look, there's no way to do that without greatly reducing the profitability of these companies. But guess what? They've had their day. I think it's time to prioritize the needs of billions of people at the expense of a handful of people.

Kurt Andersen: So when you say it reduces the profitability -- whatever the solutions to the problems that they pose to society and the world and individuals -- do we know that's true, or do we assume that's true? They have enormous profit margins, over 30 percent, Google and Facebook. Do you have any idea, if we control them as you believe we should, how much that would reduce their profits? 

Roger McNamee: I've absolutely no idea. I'm just sure it would be a lot. 

Kurt Andersen: Yeah! 

Roger McNamee: When capitalism works well, it needs a referee. Historically, the government plays that role, and it sits there and says, 'look, you know, free enterprise is fine, but you're not allowed to hurt people. You're not allowed to corner markets. You're not allowed to behave in ways that are anti-social.'

Kurt Andersen: Or anticapitalist. 

Roger McNamee: Right. All of those things have been lost. But if we were to reinstate that, right, you would sit there and say... People think of antitrust as breaking companies up. And actually the first piece of it that really matters here is this notion that companies have a choice: you can operate a marketplace or you can participate in a marketplace, but you cannot do both. This is the fundamental flaw of Amazon's marketplace. This is the fundamental flaw of Google's digital advertising marketplaces. This is the fundamental flaw of Facebook's marketplace. This is the fundamental flaw of Apple's App Store: these folks not only operate within their own markets, they prejudice the systems to favor their own products. And that kind of stuff has got to end.

Roger McNamee: But the other thing that's really, really bad here, from an economic power concentration standpoint, is that Google controls digital advertising -- not just as the largest player in it, but all the systems that do the accounting and verification, all of the stuff that helps advertisers place their ads, Google controls all that, too. So when I think about breakups, I think about breaking Facebook up into Facebook, Instagram, WhatsApp, but also, you know, the virtual reality business, the Messenger business, and a gazillion others. Those are the vertical breakups. And then there are the horizontal breakups, where you also break off the user experience of Facebook from the underlying mechanics of Facebook, from the monetization of Facebook. And you do that for each piece they're in. So Facebook goes from being one company to being, I don't know, 50, and Google goes from one company to maybe 500. 

Kurt Andersen: So, more short term, what are some things you would have us do tomorrow to get us where you think we need to be?

Roger McNamee: You know, I mentioned before this notion of federalizing the Texas antitrust case. There are two other things they can do really quickly. One of them is to take digital advertising and put it under the supervision of the Commodity Futures Trading Commission. Why? Because the actual underlying technology is technology that came from Wall Street, from high-frequency trading. And what they are doing is they aren't actually selling ads; they're selling you, effectively, a commodity future on an ad spot. 

Kurt Andersen: That’s interesting, that's interesting. 

Roger McNamee: So if you regulated them that way, that would enable you to force transparency. It would enable you to block all kinds of conflicts that currently exist inside the system. And that's something they could do literally overnight. The third thing they can do, through the SEC, the Securities and Exchange Commission: within this industry, it is widely understood that user counts are overstated. And Facebook, at a minimum, has admitted to overstating ad views. There are probably others who have also overstated ad views who maybe haven't admitted it in public. Well, the thing is, advertisers haven't had a problem with that. And so we kind of go, well, that's a done deal. Well, except, wait a minute, there's another constituency that really matters here, and that's investors. Investors are harmed. If the user counts are overstated, if the ad views are overstated, that means the revenues are probably overstated. And if the revenues are overstated, then you've got revenue recognition fraud, and that is not kosher. The SEC could begin an investigation just to see, right? And all of those things can happen right now. 

Kurt Andersen: But what about this issue that if you regulate or regulate wrongly, I suppose, innovation can be somehow harmed? 

Roger McNamee: I think it will be. And I'm all for it, because I think innovation is a bogus term. Innovation is about tiny improvements. The only people who benefit from innovation are rent seekers, monopolists. What we really want technology to do is to change the world for the betterment of everybody. We really want to create the integrated circuit from the transistor, right? We want to create the microprocessor from the integrated circuit. We want to have things that are revolutionary. And the problem is monopolists don't do revolution. They do innovation, little tiny improvements. 

Kurt Andersen: That doesn't threaten their business. 

Roger McNamee: That actually, in fact, enhances their business. And I would argue that we've allowed the language of the monopolist to crowd out the language of democracy. And, you know, we need more risk taking in the entrepreneurial world. We need more risk taking in technology. We need the rewards to go disproportionately to those people who make the world a better place and be taken away from those who are demonstrably harming us. 

Kurt Andersen: But in the book you say you still own Facebook stock, which led me to think: is there an opportunity here with these companies, as we've seen with the fossil fuel companies, for shareholders to buy shares and mobilize as activist shareholders, to change the way companies do business? 

Roger McNamee: So people have tried to do that and it doesn't work because they have multiple classes of stock and the founders control the voting stock. And so Facebook had, I think, each of the last two years, shareholder resolutions to do things differently and better that passed overwhelmingly, but it didn't matter because Mark just did it his way. 

Kurt Andersen: As I was reading the book, I was thinking of Robert Oppenheimer, who famously said, after the detonation of the atomic bomb, "Now I am become death, the destroyer of worlds." I mean, I'm not saying you've created nuclear Armageddon, but the stakes, as you portray them, are practically in that neighborhood. 

Roger McNamee: So to be clear, I am no Robert Oppenheimer. I was a history major in college. My thesis was on the development of the atomic bomb. So I've read deeply on Oppenheimer; I know that world very, very well. And it is certainly true that my value system has this notion of sunk cost: there's nothing I can do to change what I did in the past. All I can do is affect what I'm doing now. And I am doing this, and I'm often criticized for it, and I accept the criticism. I refuse to engage in debates with anybody from Facebook. Why? Because we live in a world that is dominated by Facebook's point of view. And they have gotten really good at gaslighting. They're the best at gaslighting I've ever seen. And it's just not productive to engage in those debates. What I would prefer to do is lay my thing out and let people make a judgment, and they do. And it's pretty obvious from where we are today that I haven't been that successful. I mean, a lot more people are aware of this than were aware of it five years ago, and I've played a small role in that. But have we made any tangible progress? No, we've lost an enormous amount of ground. Things that were really easy to predict three or four years ago -- things that could only have been stopped by Mark and Sheryl and Sundar Pichai and the executives at other companies, whether Amazon or Microsoft or Apple, wherever, by changing their behavior -- have come to pass because, you know, Congress has been very slow to understand. The government, frankly, has no muscle tone for dealing with problems like this. And these big companies have been incredibly successful at effectively making their point of view atmospheric. So we're all breathing it in and out all day long, and, you know, it's going to be... a miracle if we get out of this thing with our democracy intact. 

Kurt Andersen: Right, so, looking forward, let's say, 10 years, what are the odds that all of the things that you so despair about concerning technology and digital technology and social media in particular will be better than they are now? 

Roger McNamee: I've got to tell you, I'm really hopeful. I'm not super optimistic, but I keep at it every day. And the funny thing is, we have the power to change this, and we could change it overnight if everybody raised their voice simultaneously. What's missing is the engagement of the very people being most harmed by this. And you know, for a long time, people were like, "Well, this is about personal responsibility. If you don't like it, log off of Facebook." And I'm going, "OK, well, ask the poor policeman who was at the Capitol on January 6th how that's working for him," right? I mean, we have moved so far past the individual part of this thing that it's time for everybody to raise their voice. It's not like the status quo is working for people, right? I mean, are we really going to give this up for, you know, the equivalent of Aldous Huxley's soma, this digital drug that basically keeps us entertained while our economic well-being and safety are undermined? Because remember, when we're applying antitrust laws, we can't just apply them in tech. We've got to apply them to health care, to universities, and all that kind of stuff, because the whole economy is too concentrated and it's working against the best interests of our people. 

Kurt Andersen: Roger McNamee this has been a great pleasure and I want to thank you for coming on. 

Roger McNamee: It's been my pleasure, and I hope that something good comes of this. I want to be optimistic about the future, so let's bring that out of this session. 

Kurt Andersen: Thank you, Roger. 

Kurt Andersen: The World as You’ll Know It is brought to you by Aventine, a non-profit research institute creating and sharing work that explores how today’s decisions could affect the future. The views expressed do not necessarily represent those of Aventine, its employees or affiliates.

Danielle Mattoon is the Editorial Director of Aventine. The World As You'll Know It is produced in partnership with Pineapple Street Studios. 

On the next episode of The World As You’ll Know It, I’ll talk with Genevieve Bell, who is an anthropologist and director of The School of Cybernetics at the Australian National University as well as a Vice President and a Senior Fellow at Intel.  

Among other things, Professor Bell and I talk about how to make choices about technology so we end up closer to good than bad:

Genevieve Bell: I just keep wondering to myself, what on earth will the equivalent of the Railway Time Act be for artificial intelligence, right? What will be the moment where we suddenly realize that it's shaped something else, not the things that we set out to do with it, not the things that it immediately worked on. But, you know, what other things will it make possible or reimagine or reinvent?
