
What Chernobyl Can Teach Tech: a chat with Microsoft’s Brad Smith


Artificial intelligence! Big data analysis! Facial recognition! Hackers! Social media!

New technologies that thrive on data have brought great promise and benefits to our lives. But they also pose new threats to our privacy, our jobs, our national security, and even to our democracies.


Few people are as keenly attuned to these challenges, or as involved in trying to sort them out, as Brad Smith. He is the president of Microsoft and the author, with Microsoft senior director Carol Ann Browne, of a timely new book called Tools and Weapons: The Promise and the Peril of the Digital Age.

I sat down with him recently to talk about the challenges that new technologies pose for our societies and what, if anything, can be done to address them.

The following is an edited version of our conversation. You can listen to the podcast here. Note: Microsoft is a sponsor of GZERO content.

Alex Kliment: What makes something a tool versus a weapon, and how does that apply to the technologies of today?

Brad Smith: Many things that we think about every day can be used as either a tool or a weapon. Take a broom. We can use it to sweep the floor, or somebody can use it to hit somebody over the head. This has been true of so many technologies over time. Benjamin Franklin created the postal service in the United States, and criminals turned that into a vehicle for mail fraud. The telegraph became a tool for wire fraud, as did the telephone.

Digital technology is more ubiquitous and more powerful than these other tools. That also makes it potentially a much more formidable weapon. With the so-called WannaCry attack in 2017, in a single day, North Korea disabled 300,000 computers in more than 150 countries. No weapon in the history of military warfare had ever impacted that many countries with a single attack in a single day. Obviously, it didn't kill people the way other weapons might have, but no one would have thought in those terms a decade ago.

AK: Who decides whether a new technology turns out to be a tool or a weapon?

BS: Any individual can turn a tool into a weapon, just as you could turn a hammer from a tool into a weapon. But at the end of the day there are probably two entities that will have more impact than anybody else. One is the companies that create this technology, to the extent that we do a good job of anticipating the adverse impacts. And then, ultimately, it is the government. The government is uniquely in a position to decide whether something needs to be treated as an opportunity or a challenge, whether it should be left to the market alone, or whether it should be regulated as well.

AK: In the US we've started to see a bipartisan consensus around greater regulation of the tech sector. What should be regulated and how? What is the government's role?

BS: The time has come for government to play a more active role. If you look at the history, digital technology has gone more decades without being regulated than almost any similar technology before, whether you're talking about the telephone or electricity or the automobile or the airplane. We've had a very laissez-faire approach to this. Now it is ubiquitous, it is powerful, it is changing our lives. And to think that we can solve the challenges that technology is creating without government playing a role is, I think, simply misplaced.

AK: Your book argues that to solve the challenges that technology poses today, we need more collaboration between countries, and between technology companies and governments. But cooperation among countries isn't exactly in fashion these days.

BS: Right. We see so much social and political polarization at the national level. Globally speaking, countries are increasingly going their own ways: zero-sum nationalism, all of these things.

But I would also say that we're actually sailing forward and making some progress. The number one tool that we have in the 21st century that we didn't apply very often in the 20th century is diplomacy that is not only multilateral but multi-stakeholder, meaning we're bringing governments and tech companies and other businesses and nonprofits together. That's what happened last November with what came to be called the Paris Call, to address cybersecurity challenges including the protection of elections, and it's what happened this year with what's called the Christchurch Call, to address digital safety. We think we do need to find inspiration from the past. It's why we draw a lot of inspiration from what happened in the wake of World War Two with the Geneva Convention of 1949, and why we advocate for a Digital Geneva Convention today.

AK: To what extent do you worry about a technological "cold war" between China and the United States?

BS: To some degree, what happens in the world of technology is a reflection of what's happening more broadly. But we believe it's important to appreciate how technology works. These days, almost any piece of technology hardware has a fairly high probability of having been manufactured, in part or in whole, in China. But what people don't necessarily appreciate is that a product that looks like it was invented in the United States was in fact probably invented with critical research advances from many places around the world, and these days that almost certainly includes a number of researchers and engineers in China, as well as in India, the United Kingdom, Ireland, and many other places.

The basic point is, I worry that there is a temptation in some quarters to try to create a new digital iron curtain down the middle of the Pacific. And yet, as we explain in the book, even if that were possible, it would probably do more to hold back a country like the United States, because you would cut yourself off from critical research that would then be used to drive technology forward elsewhere… I think that, among other things, there is a risk that the U.S. government, in trying to protect American technology, will in fact weaken it.

AK: When we talk about Artificial Intelligence (A.I.) there's a lot of concern about jobs. What in your view do the pessimists and the optimists each get wrong about A.I.?

BS: I think what the pessimists get wrong is that AI will create jobs as well as change, displace, or destroy some jobs. And of course the optimists are too optimistic in thinking that it will all work out just fine… We spend a little time in the book talking about how the economic transition from the horse to the automobile in the long run created more jobs than it destroyed. But the long run took four decades, and halfway through those four decades we were in the middle of the Great Depression, and one of the reasons we were in the middle of the Great Depression was the indirect negative economic effects of the demise of the horse and of an entire economy that had been built, to some degree, around a horse-centric era.

AK: You may have seen the new HBO series Chernobyl, which everyone is talking about. I can think of few examples of a technology that is as much both a tool and, quite literally, a weapon as atomic energy. Should we think about digital technology as something different?

BS: I would say that if you want to think about a technology that was as ubiquitous and universally impactful as digital technology, you have to think about electricity, because it ended up fueling everything. And that is what digital technology, and data in particular, have become. It's a mistake to think that the future is going to unfold exactly like the past, but the key point is that there are insights that can be drawn, and in many ways I'd even point to nuclear power and Chernobyl specifically. I think there was an extraordinary insight that actually came out of that recent show. Think about the fundamental problem that was created for nuclear power in Russia because of an absence of transparency.

And then you think about this new technology [of today] and you are compelled to focus on the critical importance of transparency and rules that will compel the use of this technology, and the development of this technology, to be transparent. Without an ability to understand how technology is working, almost nothing else is possible and people's lives can be at stake. Witness Chernobyl.

AK: New technologies like social media were originally thought of as a great democratizing force, but over the past several years we've seen them contribute to polarization in democracies while giving authoritarian regimes more levers of control over their populations.

BS: Technology, especially over the last five years, has emerged as a weapon that has benefited authoritarian regimes in an asymmetric way. You know, the Stasi regime in East Germany went to incredible lengths to spy on the population. If [its] people [could have] read Facebook pages, their job [would have been] a lot easier. But I think even more than that, technology is being used to spread disinformation in a way that is not qualitatively different from what certain authoritarian regimes have sought to do for many decades, but, oh my gosh, it is a quantum leap these days in ability and speed. I mean, you know, for a group in St. Petersburg [Russia] to organize a protest and counter-protest in the United States, where literally neighbors are yelling at neighbors, with neither one of them for a moment even suspecting that this whole issue had been fomented thousands of miles away. That is a different world.

AK: What's the one thing that worries you most that you don't think is being addressed today?

BS: That the democratic republics of the world will find it too hard to make decisions for themselves and find it impossible to work together. Cyber weapons are going to continue to become more powerful over the next decade; A.I. is going to guarantee that. Which is why, in many ways, we wrote this book as a call to action, to say, look, let's get started. The worst thing we could do is say "we don't yet have all the answers, let's spend 10 years learning more and thinking about it."

AK: Does the fact that the U.S. doesn't play a leadership role [on these questions] make things significantly harder?

BS: It makes everything harder. That's one of the challenges in our view, and that's not to take away from the need for important bilateral negotiations that are hard-headed and maybe even, on certain days, tough-edged. But technology is a global phenomenon. It cannot be addressed satisfactorily through a series of bilateral discussions. We need global initiatives. And ironically, and in my view unfortunately, we have certain American companies trying to do more to promote these multilateral and multi-stakeholder approaches without the backing of our own government. But the fact that we can make headway on these issues these days gives me great hope, ultimately, for the future, when the pendulum swings back, as I believe it inevitably will, to the U.S. government getting off the sidelines and getting more in the game.