Roger McNamee wants Canada to ban targeted advertising in the weeks before the 2019 federal election, and he describes Google sister company Sidewalk Labs as “an extreme form of behavioral manipulation that we should fight.”
The former mentor to Facebook CEO Mark Zuckerberg sat down with David Skok, The Logic’s editor-in-chief, in front of about 400 attendees at an event in Toronto. McNamee was an early investor in both Facebook and Google, but he now describes both companies as monopolies misusing people’s data.
He also called for Canadian tech talent to stop working for U.S. giants, and laid out a vision for how Toronto can be more successful than Silicon Valley as a tech hub.
David Skok: Today, Facebook banned several extremists on their website, one of them being Faith Goldy, who is a Toronto native. That was done just before we came onstage. The other thing that happened today that’s worth pointing out is the United Kingdom releasing a sweeping new plan to penalize Google and Facebook for harmful content—I’m not sure if you had a chance to see it before—which brings me to a big question around the geopolitical landscape that we’re in and how this all plays into it. You have different jurisdictions taking different routes. If you were to pick a winner, or a horse that you would want to back here, would it be the United Kingdom? Would it be Europe? Would it be China? Would it be the United States?
Roger McNamee: The first thing I would observe is that our understanding of the problem has evolved really dramatically. When the Europeans passed the General Data Protection Regulation [GDPR]—which I guess was three years ago—it really represented the best understanding of the problem at that time, which is to say we want to give people control and ownership of the data they put into these platforms.
The effect of GDPR—the reason why Facebook is so insistent that we want to move the whole world to GDPR—is that, for all intents and purposes, it would create a big cost for startups, while protecting Facebook’s underlying business model. So it creates the appearance of a solution without actually addressing the problem.
And I do not blame the Europeans for that. The GDPR was the right idea; they now need to expand it. The Germans have essentially—they turned down [Google] Street View. They won’t let Google merge the databases of the various products that they have. They’re definitely on the right idea. The U.K. is going after the election side with some really thoughtful things. The French have been the first to enforce a GDPR case against Google. The European Union has done three successful antitrust cases against Google. I love all of that. I love what the Australians are doing. New Zealand is contemplating a bunch of stuff. The United States, in California, we’ve got a privacy law modelled on GDPR which is, again, just a first step.
Right now, what I want to do is have as many different things going on as possible in as many different jurisdictions. Do not count on anyone to solve this problem. What we really need to do is make the cost of complying with 100 different things so high that they are forced to concede their business model in order to make the pain go away. And that’s going to be really hard.
DS: So, let me push that and say, “Well, OK, that all sounds lovely, but we’re in a geopolitical battle right now.” If you’re the United States, China doesn’t put the same kind of conditions on Weibo or WeChat—or Alibaba certainly doesn’t. So why should Americans—and North Americans, in a larger sense—put those kind of restrictions on Facebook or Google?
RM: I have one really simple question. Does anybody think that our job is to be great at behavioral manipulation? That’s what China’s doing with AI. And that is what Google, Facebook, Microsoft and Amazon are doing with AI. And they’ve cornered 80 per cent—as far as I can tell—of the AI talent in North America to do behavioral modification.
Now there are literally a million things you can do with AI that would be wildly more useful to society than behavioral manipulation. I would say that behavioral manipulation is to AI what cloning human babies is to biology. It’s on the wrong side of the line. And we should let China do that. Now I think we should restrict our markets so that they can’t bring it in here. And I think we should go to our trading partners and have them do the same thing. I think no one should be allowing WeChat and the other companies that are part of China’s social credit system to do what they’re doing.
But, again, I have a very simple philosophy. I believe in capitalism, I believe in open markets, I believe in democracy. And I think that behavioral manipulation is inconsistent with all three of those things.
DS: One of the things that is so fascinating about you is that you are a venture capitalist—or you were a venture capitalist—and the mindset in Silicon Valley over the last 30 years—as you describe so eloquently in your book—has been one of the lean startup model.
RM: The lean startup model starts in 2004, and, really, Facebook and LinkedIn are in the beginning of that model.
What happened after 2003 is that suddenly there was plenty of processing power: all the memory you needed, all the storage and, initially, all the wired bandwidth—and by 2010—all the wireless bandwidth you needed to do whatever you want. So, suddenly, you could go global and—at precisely the same time, because you had reached this sufficiency thing—it became economic to create companies like Amazon Web Services that took care of all the infrastructure. And then you could rent it on a credit card by the hour. The effect of that was to take the costs of a startup from US$100 million to US$10 million overnight, and to take away the need for experience. So suddenly, instead of 40-year-old entrepreneurs, you had 20-year-old entrepreneurs, and the combination of those two things coming at exactly the time that LinkedIn and Facebook started up changed everything.
And the culture changed with it. The culture before that time was totally customer-centric because you had to get to the exact thing the customer needed because you were constrained. When you’re going global, the customer literally didn’t matter. In fact, you could have different customers, you could go to this pure consumer model. Now before 2003, enterprises were your primary customer, but after that, it was all consumers, and these guys—for whatever reason—chose to treat the consumers as not only not the customer, but not even the product. They were the fuel for building a data avatar, where they would gather every piece of data about your digital persona, and then use that to manipulate your behaviour. And if they didn’t do it directly, they allowed their advertisers to do it: Google didn’t do it directly, and Facebook let their advertisers do it.
DS: How much of that is the venture capital model of scaling, at that point, not worrying about—
RM: —Well, it became the venture capital model. It wasn’t the model before that, because the model before that was stuck with these engineering constraints.
The key thing was, PayPal went public in 2002, and the founders of PayPal—Peter Thiel, Elon Musk, Reid Hoffman and another group of people, now known as the PayPal mafia—had two giant and completely brilliant insights. One was that the internet was shifting from a web of pages to a web of people, so social thinking might be the big deal. That was an incredibly powerful insight, and they had it early enough to found almost all the really important companies in the category. The second thing that they saw was the shift where there would be enough resources—and they understood what it meant—so they went global.
And so, they took their personal philosophy—which was most extreme in Thiel but shared by all of them to one degree or another—which was that the goal of business was to be a monopoly, was to be global, and it was to make them billionaires. It wasn’t about improving the lives of consumers. That literally never occurred to them. And it’s not because they’re bad people, but the culture pivoted really hard. And once the pivot took place, it normalized super quickly, such that everybody followed that mantra.
DS: As a CEO, you’re driven by your shareholders, by your board, by your incentives structure. Do you think that, in 2019, the venture capital model still works?
RM: So, here’s the thing. I don’t think the problem began in Silicon Valley. I think, starting in 1981, the United States shifted its basic focus away from the old model of five stakeholders, where you had shareholders, you had employees, you had the communities where employees lived, you had customers and you had suppliers.
And we made this pivot away from collective action—the same thing that had beaten the Great Depression, that won the Second World War—to this notion of the Marlboro Man: everybody for themselves, this hyper-libertarian model. And the notion was shareholders are the only stakeholders. And once you do that, your time horizon goes from long to super short.
And the result of that is that throughout the economy you’ve seen egregious behavior. When you look at Boeing and the 737 Max—people died because of what appears to be negligence, right? Or you look at Wells Fargo—taking money out of the accounts of millions of account holders. I mean, what’s up with that! Throughout the economy—you look at ExxonMobil, when Rex Tillerson was CEO, conducting a policy in Russia in contravention of U.S. foreign policy. I mean, all of this is about “You gotta optimize for today,” and I think that flaw is really deep, and Silicon Valley embraced it.
Now, the venture capital industry is still running down that path at full speed, at least in Silicon Valley. My basic point is, fine—thanks to Amazon Web Services and all this stuff, we don’t need Silicon Valley anymore, we can do this right here in Toronto. I mean, the fix for this is going to be a business opportunity way bigger than what we have now. And the difference is if we do it right, it’s spread over thousands and thousands of companies in hundreds of cities. It doesn’t need to be centralized anymore.
That’s what’s nuts about this whole thing. I mean, you’ve got the University of Toronto. Get your AI guys to stop working for Google and to start working for you guys, OK? That’s what we’re supposed to be doing here. All of this stuff, I mean, it’s right here! We don’t need to play follow-the-leader anymore. This stuff’s just not that complicated.
Balance is what we need here. We need to get away from this coercive capitalism and back to one where everybody gets to be part of the game. And everybody can participate in that. I mean, there’s every reason to believe that Toronto can be more successful at this than Silicon Valley. Because you’re not running at 100 miles an hour in the wrong direction.
DS: So, one of things that is happening in Toronto is the Sidewalk Labs partnership on the waterfront.
RM: Yes, I’ve heard about that.
DS: I don’t even know if I have a question, really.
RM: I have a very, very, very, very strong belief that Sidewalk Labs is an extreme form of behavioral manipulation that we should fight and close off. Sidewalk Labs, they want to be a government, but without any of the responsibilities of government. We should not allow it. The answer is no. Then we can have a decent conversation. The answer starts out as “No, no, you don’t get to run this stuff. You don’t get to control all this data.” That’s just a really, really, really bad idea. Private-public partnerships? Be very careful, because if the private side controls the data, bad things are going to happen to the citizens in the world.
DS: One last question for me. Have you heard from Mr. Zuckerberg or Ms. [Sheryl] Sandberg since you wrote the book?
RM: I haven’t heard from either Mark or Sheryl since October of 2016. I’ve not spoken to any officer of Facebook since February 2017. I would welcome the opportunity to do so. To this point, they’ve shown no interest, but again, I want to repeat something I just said a moment ago, which is that Mark is now engaging in the political discussion. I don’t like what he’s saying, but I really respect the fact that he’s engaging. Google’s pretending like none of this applies to them, which is utter nonsense.
Selected audience questions
Audience: Hi there. My name is Ana Serrano. I’m the chief digital officer of the Canadian Film Centre, I’m the co-chair of the Open Democracy Project, I’m the managing director of an accelerator called Ideaboost—in which David is a participant—and I’m also an activist citizen member of #BlockSidewalk. And, for the first time, all of these threads that I’ve fought for in very separate parts of my life are coming together in many of the things you’ve been talking about today.
My question is: how, as someone who runs an accelerator—and there are many, many in this room, or at least in Canada—can we start to help that next generation of technology leaders to become civically engaged to understand growth, on the one hand, on that kind of balanced capitalism that you talked about, but to also have an ethical framework and foundation from which they should be thinking about growing their companies?
RM: So, one of the points I want to make—and I’m going to reiterate this, because I did it in little bits along the way, but I want to say it in one chunk here. The issue here is cultural across the entire world of capitalism today. In the United States, at least, the government historically created the rules and then enforced them for everybody, but we’ve gone almost 40 years where we’ve been dismantling the enforcement systems and undoing the rules. So now we have chaos.
And smart business people move into vacuums and take what they can get! Google saw there was all this unclaimed data; they went and grabbed it. And, under the rules of the time, that was completely legitimate, because nobody really thought through what the implications were—including the people at Google, I think.
So my basic point is I know a lot of these people, I like them, I don’t think they’re bad people. I think they’re operating in this flawed culture. And the starting point is asking the question “What kind of culture do we want to live in? And are we willing to apply that broadly?” Because I don’t think the tech guys are even the worst offenders on a lot of these things. I just think that the long-run impact of all that data and all that power is scarier. And, when I look at it, I’m much more afraid of Google and Amazon than I am of Facebook.
When I look at this, I just think, one of the problems is we pretend that if you’ve had a STEM education, you’re prepared for life. That is demonstrably not true. People need to read history. They need to know that we’ve seen this problem before! They need to read literature and philosophy and religion so that they understand that there are other things that matter besides making money. The thing you see in the Valley right now is this intense desire of people to live to be 200 because they’ve wasted the first 40 years of their life becoming a billionaire! It’s incredibly unsatisfying to be immensely rich and have no soul.
DS: And making sure they all subscribe to The Logic, too.
RM: Oh, that went without saying. That applies to everybody at all ages.
Audience: My name is Dr. Sara Diamond. I’m the president of OCAD University, it’s an art and design institution in Toronto. We’ve shifted a lot of our context in the last several years to include data knowledge, as well as the creative skills.
This is an audience of elites. How do you think through the challenge of creating fluency and capability at a much deeper level within society so that people are excited about their opportunities to manage and control their own data?
RM: I would like to take the word “data” and put the word “life” in its place. The data is part of that, but a big thing here is—if you go and you read [Aldous] Huxley—these services are like soma. They have a hypnotic power. What Google and Facebook and YouTube and Instagram [and] Twitter provide is status as a service in a culture, at least in the United States, that celebrates moments of fame and notoriety. That strikes me as inherently unsound. People are less interested in building things of real value [than in] building things of temporary social dynamism.
Audience: What do you think the chances are of these companies being broken up?
RM: I think you have to change the business model before you break them up. It doesn’t help to break them up if you don’t change the business model. Then you just have 50 versions of the same product.
This interview has been edited for length and clarity.