TRANSCRIPT: How to get social media companies to protect users (instead of hurting them)
Frances Haugen:
Because these technologies are so opaque, all the important decisions happen behind our screens. We've never had a chance for the public to build a public muscle of accountability, and these companies continue to run ahead of us, and we don't even get a chance to ask our own questions or develop our own theories.
Ian Bremmer:
Hello, and welcome to the GZERO World Podcast. This is where you'll find extended versions of my interviews on public television. I'm Ian Bremmer, and today we examine the perils and promise, but mostly perils, of social media.
In the past two decades, companies like Meta, Google, Twitter and Reddit have fundamentally changed how we all consume information. Their platforms have helped people stand up to repressive regimes in Iran, in Hong Kong and Egypt, and that's a good thing. But they've also played a major role in organizing events like the January 6th insurrection, genocides in Myanmar, and civil unrest in Ethiopia. Bad thing. People have used tools like Facebook Live to broadcast murders, suicides and torture. Bad thing. So how do we stop all of the bad that has come from making the world more connected while still allowing room for all the good? This week, I speak with data scientist and Facebook whistleblower, Frances Haugen. Let's do this.
Announcer:
The GZERO World Podcast is brought to you by our founding sponsor, First Republic. First Republic, a private bank and wealth management company understands the value of service, safety and stability in today's uncertain world. Visit FirstRepublic.com to learn more.
In a world upended by disruptive international events, how can we rebuild? On season two of Global Reboot, a foreign policy podcast in partnership with the Doha Forum, FP editor-in-chief, Ravi Agrawal engages with world leaders and policy experts to look at old problems in new ways, and identify solutions to our world's greatest challenges. Listen to season two of Global Reboot, wherever you get your podcasts.
Ian Bremmer:
Frances Haugen, thanks so much for joining us.
Frances Haugen:
Thank you for inviting me, happy to be here.
Ian Bremmer:
It's a very content-rich environment that I can ask you about. We want to talk about social media, but I want to start with Europe. Because of course the Europeans don't have big tech companies, but they do have a lot of people who focus on how to regulate tech companies more effectively for society. Do you think they're actually accomplishing that right now?
Frances Haugen:
One of the things that I think most people don't realize about the large tech companies is that they're significantly less transparent than any of the major technologies or tech companies that ran our economy 100 years ago. One of the most important things about the Digital Services Act, which is the law that just passed in the European Union, is that it's the first time we have legally mandated transparency from the tech platforms. Because these technologies are so opaque, all the important decisions happen behind our screens. We've never had a chance for the public to build a public muscle of accountability. And these companies continue to run ahead of us, and we don't even get a chance to ask our own questions or develop our own theories.
And so I think the most important thing the DSA has done is actually make that a mandated right, the right to demand data access. They're also asking for public risk assessments, having the companies actually disclose the risks they know about. Because right now, the playing field is that unlevel. I think those could have really transformative effects, just because we're starting so far behind.
Ian Bremmer:
Now, Europe is a big market. It's the largest common market in the world. Yes, the United States is number one as a single country, China's number two, and China's expected to overtake the US around 2028 or 2030. But on the social media side, Europe is so big that even if you have regulations that are expensive for companies to put in place, it's also expensive for companies not to have unified standards. And so I'm wondering, do you believe that the Europeans passing this new law means that the American companies will eventually move toward those standards even in the United States?
Frances Haugen:
So I think the interesting thing is, I like to think about how companies change, how companies become aligned with the public good, as ecosystems of accountability. There's no industry in the world where the reason we're safe is a single actor, where the government alone is the thing that keeps us safe. It's because there are litigators who know what it means to cut corners, who know when people are optimizing for profit over safety, and who hold them accountable. Or it's about investors who understand what long-term success looks like and can help govern these companies in a more sustainable way.
In the case of our relationship with big tech, we've never gotten to form those larger organs. And when you look at the DSA, the DSA doesn't have a lot of things where they say, "you must do X, you must do Y, you must change your company in specific ways." What it says is, "we want a different relationship. We want you to disclose risks, we want you to actually give access to data." And doing that anywhere in the world actually changes it in the United States. Because our litigators, our investors will begin to build up the public muscle of accountability, even if we have to use the information that's coming out of Europe.
Ian Bremmer:
So in other words, what happens in Europe doesn't stay in Europe?
Frances Haugen:
Doesn't stay in Europe.
Ian Bremmer:
It's not Vegas.
Frances Haugen:
As much as Facebook might wish it was.
Ian Bremmer:
Yeah. So I think about GDPR of course, which involves the disclosure of data and cookies, where the Europeans also put a big piece of legislation in place. California ended up implementing similar laws. Now when I go to Europe and open a site, it asks, "what do you think about the cookies?" And it's definitely more transparent. But what I'm hearing is that a lot of people don't want to deal with it. A lot of people would rather just give their data away, even though they don't necessarily know what that means. How do you respond to something like that?
Frances Haugen:
Sure. So I worked at Pinterest while the implementation of GDPR was taking place. And while the public largely perceives GDPR as "I now have to give permission for cookies, that's annoying," one of the things that was really interesting in watching it play out operationally inside of Pinterest was that Pinterest had lots and lots of data sitting around: insights it had gleaned off of people, raw data, lots of different things that often it wasn't even aware it had. It was just accumulating these things. Because that's how this actually happens: someone asks a question at some point, accumulates some stuff, some pipelines start running in the background. And Pinterest had to go through and account for everything they had, and they deleted a lot of stuff.
One of the things that you get as part of GDPR is the right to request any data that a company has on you. And one of the interesting things that introduces for how companies operate is that companies suddenly have to ask, "do we want to have to disclose that we have this value?" And it led, at Pinterest, to deleting a lot of things. At Facebook, it led to some things that could have been gleaned or recorded never being collected. Because when they went up for privacy review before they were launched, or early in their development process, policy people and lawyers said, "Hey, if someone asked for that field, do we really want to disclose that we have that about someone?" And so those are things people don't realize GDPR had an effect on, but I've seen it have an impact at two of the largest tech companies in the world.
Ian Bremmer:
We're talking about Facebook, but of course the company's now called Meta. I know the companies move fast, and governments move slow. So to what extent are the ultimate threats to a Facebook, to a Google, more about the competitive environment that changes, as opposed to governments that in some ways may always end up being a couple steps behind where the companies are going?
Frances Haugen:
So one of the things I like about how the Digital Services Act was written... One thing people ask me all the time is, "Frances, tell us how to fix Facebook. Give us the short version, what are the five things that have got to change?" In the United States, we like to write laws that are specific prohibitions. They're like, "you must do X, you must not do Y." And the problem with laws like that is that companies run around the fence. They're very clever, they hire very good lawyers who let them do what they want to do.
One of the things about the Digital Services Act that I find really interesting is that it asks for an ongoing conversation. Right now, these companies don't have to disclose things that they learn. If the public has questions, they don't have to answer the questions. When we have an ongoing risk management structure where the companies have to disclose these risks that they know about, if the government says, "Hey, these people are saying this risk exists, can you please either give us proof it doesn't exist? Or let's have a conversation about how you're going to mitigate that?" That's an ongoing, flexible approach to trying to direct them back towards the common good.
I think the secondary thing is, you're saying governments move slowly and tech moves fast. There has always been a big gap between where technologies are and our ability as the public to hold these companies accountable. That gap is going to keep getting bigger and bigger, because even if there's always just a little bit of a delay, if tech is accelerating, the gap grows. That's why we need things like effective whistleblower laws. And just for context, Europe passed its first whistleblower laws back in December, partially as a result of my disclosures. And we're going to need better and better whistleblowers, because we need to narrow that gap. We need to be able to have the public asking questions early in the design process of these systems.
Ian Bremmer:
Well, having gone through this, it's hard for me to imagine what it's like to be a whistleblower and suddenly become a public figure, with such an incredibly bright spotlight shining on you. What's the thing that has surprised you most about your experience since you've gone public?
Frances Haugen:
So I totally understand the sentiment of this question. I feel very cared for whenever people ask it. And I have such an uninteresting answer, which is that I think the public was so hungry for accountability from social media. There's a lot of frustration in the public around how the relationship with Facebook has unfolded, things like Facebook lying to the public. I think because people were so hungry to live in the truth, to stop being gaslit, I have had an incredibly positive response from the public. I have open DMs on both Twitter and Instagram, and I don't get harassed.
And as someone who has worked at four social media platforms, I know women in the public sphere never get away scot-free. And yet I have had an almost effortless whistleblower process. So I think that's the number one thing I'm most surprised by. I was deeply scared before I came out, and I was deeply scared even in the first couple of days after I came out. Our threat researcher found a lot of scary stuff in the first 24 to 48 hours off the dark nets, off of places like 8chan. But once people heard my Senate testimony, nothing ever happened to me. And so I'm super grateful for how positively I've been received, and for how the internet has collectively held me. I'm very grateful for that.
Ian Bremmer:
What do you think is more likely to change? The culture inside Facebook and related tech companies, or government regulations in the United States?
Frances Haugen:
Ooh, how interesting. Well, I'm a slightly rarer technologist in that I was a history minor, a Cold War studies minor. And the story of the Cold War period was about a number of different social movements that seemed absolutely impossible: the British leaving India, the overthrow of the Soviet Union, civil rights in the United States, the end of Apartheid. Huge things. Huge things that seemed impossible, but all came to be.
And I know it feels right now like the big tech companies are monoliths. But the reality is, and I say this very earnestly, I don't want to tear down any of these companies. I've spent my entire career at these companies. The thing I want is for them to be long-term successful. And what I've seen time and time again at places like Google, at Pinterest, at Facebook, is that if these companies don't have incentives that require long-term thinking, it's very hard for them to operate in long-term ways. And so I have a lot of faith that as we build the ecosystem of accountability, as we build the public muscle of accountability, we are going to help all these places be more long-term successful, because culture change will come along with that.
Ian Bremmer:
When I think about finance, which went through this back in 2008: a bunch of regulations were put in place that were also meant to build muscle so that it couldn't happen again. With financial institutions, the US did that fairly effectively. And I see someone like Larry Fink, and it's not because he's a fundamentally better human being than other CEOs. It's rather that he wants to be more competitive and beat the other companies. And he knows that if he doesn't address climate change first, someone else is going to.
Are we starting to see CEOs and senior leaders in the tech space recognize, irrespective of the regulatory environment: if I get there first, if I'm the respectable, civil-society-supporting, inclusive, less polarizing one, more for the consumers and less for the clicks, I'm actually going to win? Are we seeing that yet, or not at all?
Frances Haugen:
So I think we're seeing it in some ways. I don't want to give Google a full pass, because let's be honest, Google is not a perfect company; there have been a number of scandals in the last couple of years. But one of the things I think it's important for people to contextualize is why Google turned down the Department of Defense machine learning contracts. And the reason they did it was really simple. Google believes that the thing that will make them long-term successful, the thing that will make them the most competitive, is being the most attractive place in the world for the most skilled engineers, particularly machine learning engineers. And a lot of technologists understand the risks of these products.
I think Facebook is at a much bigger disadvantage now than it was 10 years ago, because it can't hire mid and senior level people. Mid and senior level people in Silicon Valley have infinite options, and Facebook is a place that finds it very hard to attract senior talent. Because they haven't shown a respect for people's ability to show up and be whole people when they come to work. When you are seen as, or are just obviously, cutting corners for profit at the expense of the public good, it's very hard to attract the best technologists.
So I think we're starting to see some of that. We haven't seen it play out on the consumer side I think as much. But in the talent war, which is the beating heart of Silicon Valley competition, I think we've already started to see some of those things.
Ian Bremmer:
So let me ask about Elon Musk and this whole Twitter controversy. You've seen how he came out, and irrespective of whether he really wanted to buy it or not, he's making arguments about how Twitter's broken, about the bot problems and the rest. How much is he identifying the real issues with Twitter, in your view?
Frances Haugen:
So I feel very strongly about automated accounts across all of the social networks. I have talked to people who run systems for detecting fake accounts across a number of the largest platforms on the internet. They are name brands; you would recognize them. They are major in the sense that they're probably in the top 10 or top 15 social networks in the United States, and a substantial fraction of all their accounts are automated. We're talking upwards of 50%. And so I don't know what Twitter's actual number is. I doubt Twitter's number is 50%, but I guarantee you it's more than 5%.
And it's important for the public to understand that there is a huge, huge gap in financial reporting today that is actually a giant liability for the public good. This is something I'm planning on writing more about in the future. We have accounting standards for dollars because we know that companies lie about the money they have and the liabilities they have. And that creates systemic risk that is very dangerous for investors, and dangerous for the public, because it leads to cutting of corners. In the case of tech, there is another kind of accounting that is as vital for share prices as dollars, which is people.
Ian Bremmer:
Who these people are, how many people you have on the account.
Frances Haugen:
Exactly, exactly.
Ian Bremmer:
Yeah, definitely.
Frances Haugen:
You can have a 1% drop in the number of users on your site and see a 10% drop in the valuation of your company. That's huge. And so right now, every time you take a bot off your site, you actually decrease the valuation of your company. And so there's this really dangerous conflict of interest, where the number one thing threatening the information environment is automated accounts. Because they allow you to set the narrative, they let you set the drumbeat. They let you amplify whatever information, true or false. Because remember, true facts can be very divisive too. Automated accounts are extremely dangerous, and right now there's a giant financial disincentive to taking them down.
Ian Bremmer:
Do the companies...
Frances Haugen:
I think the fact that Elon is raising that as an issue, as a top level issue is I think an important thing.
Ian Bremmer:
Mark Zuckerberg's response after your testimony was that Facebook only wants to have products that help young people, help children. Obviously you disagree with that. Question: do you think that he's lying when he says that, or do you think that this is a level of just willful alignment with his business model no matter what?
Frances Haugen:
I think there's a third option. I never think poorly of Mark. I have no evidence that he is actually a malicious human being. I do have evidence that he has surrounded himself with people who tell him very convenient stories. Mark is a very, very isolated person. I've had multiple journalists tell me he spends all day in the Metaverse. Like that's why he thinks we're all going to spend all day in the Metaverse.
Ian Bremmer:
Don't we all at the end of the day?
Frances Haugen:
I want to live in a virtual ski chateau. Don't we all? But I think the issue there is that Mark has to rule through the people he's put around him. And there's a real problem at Facebook, which is there's no place for upward advancement, there's no movement. It's quite static at the top. And when we look at things like kids, there's some real easy low-hanging fruit. The hours of two to three in the morning, or the 10th hour of Instagram a day, are not the same as one to two in the afternoon or the first 30 minutes.
If you were a scraper stealing content from Instagram, they would slow your connection down, so you would steal less and less material. Imagine if instead of popping up a warning saying, "Hey, you've been on here for 20 minutes. Do you want to go to bed?" they asked you at noon, "When do you want to go to bed tonight?" And then they just slowed Instagram down very, very slowly over the course of the evening, so you got sleepy and went to bed. They could do these things today; they have the code for hackers. Why can't they use it for kids?
And so I think it's this thing of the incentives are really hard for them. Right now, they don't have to report harm, but they do have to report how long you're on there, how many users there are, how many dollars there are. And so things like the Digital Services Act create an incentive. They make space internally to do the right thing, because now you actually do have to report how many kids are looking at Instagram for 10 hours a day.
Ian Bremmer:
So basically everything you're saying is that the purpose of regulation is to structurally shift the business model enough that the companies themselves are truly incentivized to be more aware of, and more supportive of, citizens.
Frances Haugen:
Yeah, that's the goal. And that's how we make capitalism successful. Capitalism unfettered burns itself out. Capitalism that we try to pull a little bit more towards the public good can be long-term successful.
Ian Bremmer:
Frances Haugen, thanks for joining us.
Frances Haugen:
Thank you for inviting me.
Ian Bremmer:
That's it for today's edition of the GZERO World Podcast. Like what you've heard? Come check us out at GZEROMedia.com, and sign up for our newsletter Signal.
Announcer:
Subscribe to the GZERO World Podcast on Apple Podcasts, Spotify, Stitcher, or your preferred podcast platform, to receive new episodes as soon as they're published.