Trolling with power: Elon Musk’s online antics are getting real
Businessman, entrepreneur, and increasingly, a disruptive force in geopolitics.
Elon Musk, the owner of X, SpaceX, and Tesla, has never shied away from controversial political posts, but over the past few weeks, his online trolling has had very real-world consequences.
Last week, he amplified posts on X that fueled racist riots in the United Kingdom and prophesied that civil war in the country was inevitable. Today, he is reportedly set to interview former President Donald Trump on X, a sit-down that will generate hundreds of headlines in a presidential cycle in which the interviewer, Musk, has unabashedly chosen a side.
In the immediate aftermath of the assassination attempt in Pennsylvania last month, Musk took to his app to endorse Trump’s candidacy – shattering the norm of self-declared neutrality by the leaders of social media platforms. (Mark Zuckerberg, for example, is not nearly as vocal about his political views). And in July, Musk announced the creation of a political action committee, America Pac, that would “mostly but not entirely” support the Republican Party.
The South African-born investor has also signaled his disapproval of Trump’s opponent, Kamala Harris, and even disseminated a deepfake video purportedly showing Harris calling herself “the ultimate diversity hire.” He also suspended the account “White Dudes for Harris” on X after it held a massive fundraising call that raised more than $4 million for her campaign.
Musk’s political interventions on X have been particularly controversial in the UK, where his inflammatory posts have been linked to recent civil unrest. British officials have criticized Musk for spreading misinformation, including false claims – which fueled protests and riots last week – that the murderer of three British girls was a Muslim migrant. During the riots, “super sharers,” or accounts like Elon Musk’s with large followings, acted as “nodes” for disseminating this lie through their interactions with far-right content.
Musk is also responsible for relaxing the content moderation guidelines on the site and reinstating many far-right accounts that acted as super-sharers of misinformation. For example, he unbanned Tommy Robinson, a fringe and four-times-jailed extreme-right British activist, who went viral during the riots. He also promoted Ashlea Simon – co-founder of a white supremacist group — who claimed UK Prime Minister Keir Starmer planned to send British rioters to detention camps in the Falkland Islands.
Can he be regulated? As a result of the riots, many political leaders, including Starmer, EU commissioners, and US senators, have called for an inquiry into social media’s role in spreading incendiary disinformation.
According to Scott Bade, a geo-technology expert at Eurasia Group, Musk is increasingly becoming a “geopolitical agent of chaos.” But Musk isn’t too powerful to regulate, says Bade. “The thing is, you’re not going to regulate Elon himself. You’re going to regulate the pieces of his empire.”
The Online Safety Act is already set to take effect in the UK at the end of the year and will require platforms to remove illegal content or face fines of 10% of global annual turnover or £18 million, whichever is higher. In the wake of the riots, lawmakers are considering tightening the rules so companies can be sanctioned if they allow “legal but harmful” content such as misinformation to flourish.
“There is a clear consensus emerging in the aftermath of the riots that Musk and X are a problem, given the amount of misinformation, racial abuse, and incitement to violence that was spread on the platform,” says Eurasia Group Europe expert Mujtaba Rahman. “There will be a political and a policy response, but what shape that will take remains unclear for now.”
A bad case of “academentia” that needs to be cured
This week Claudine Gay, Sally Kornbluth, and M. Elizabeth Magill, the presidents of Harvard, MIT, and the University of Pennsylvania, were brought before the House Committee on Education and the Workforce to speak about the dangerous rise of antisemitism on campus, especially since the Oct. 7 attacks.
The Israel-Hamas war has triggered an alarming rise in antisemitic incidents on and off campus, as well as a rise in Islamophobic incidents. Things got so bad that on Nov. 14, President Joe Biden released an action plan to combat antisemitic and Islamophobic incidents on US campuses.
So the university presidents were steeped in this issue and knew tensions had been running high. They came to Washington prepared – well, prepared for something, at least.
Sadly, expectations for these kinds of hearings are low. Politics in Washington today is more like eye surgery done with a pickax, so no one predicted a nuanced, academic discussion with three illustrious leaders. Still, what happened under the big marble-top circus of politics was a genuine surprise.
Amid the usual grandstanding, ax-grinding, partisan preening, camera mugging, sound-bite fishing — and there was a lot of that on culture war issues like “wokeism” – something noteworthy happened.
At five hours and 23 minutes into the hearing — you can watch it here – New York Republican Rep. Elise Stefanik, who graduated from Harvard in 2006, asked a basic question of the three presidents.
Here is part of the transcript, with Stefanik questioning the president of Penn, Dr. Magill.
Stefanik: … Does calling for the genocide of Jews constitute bullying or harassment?
Magill: If it is directed or severe and pervasive, it is harassment.
Stefanik: So, the answer is yes?
Magill: It is a context-dependent decision, Congresswoman.
Stefanik explodes in incredulity: This is the easiest question to answer yes, Ms. Magill.
Magill (smiles, oddly): If the speech becomes conduct. It can be harassment, yes.
Stefanik: Conduct meaning … committing the act of genocide? The speech is not harassment?
Stefanik gave Magill one more shot at the answer and got nowhere before asking Dr. Gay, president of Harvard, the same question.
Stefanik: Does calling for the genocide of Jews violate Harvard's rules of bullying and harassment, yes or no?
Gay: It can be, depending on the context.
You get the idea.
Apparently, on campuses, calling for genocide is bullying only in certain contexts (when is it not?) and only when it turns into action.
Remember, Stefanik was not asking here if the presidents would shut down such speeches on campus. Or take action. She asked a basic, theoretical question of whether calling for the genocide of Jews constituted bullying and harassment. Not a single president answered yes.
This was academentia at its worst. The term, of course, is not medical; it describes hyper-intelligent academics who appear to have lost touch with reality – so caught up in nuance and qualifiers that they can’t answer a simple question.
Imagine for a moment, someone asking, “Is calling for the genocide of all Muslims an act of bullying or harassment? Or the killing of all women? Or the killing of all African Americans, or LGBTQ people?"
US academics may point to the First Amendment, which protects hate speech – but that was not the question. The question was simply whether calling for the genocide of a specific group hit the threshold of bullying on campus.
How hard is that? Harder than we thought.
Free speech is handled very differently in the US and Canada. In Canada, there are reasonable limits on speech, and Section 319 of the Criminal Code is clear that hate speech, including antisemitic speech, is an indictable offense punishable by imprisonment.
Context matters as well. Hate crimes against the Jewish, African-American, Muslim, and LGBTQ communities are all up, according to recent stats. The latest FBI hate crimes data shows a 25% rise in antisemitic hate crimes between 2021 and 2022 – more than half of all reported religion-based hate crimes – against a group that makes up less than 2.4% of the US population. Crimes against LGBTQ, Black, and Muslim Americans are also overrepresented, but FBI Director Christopher Wray said this week that antisemitism is reaching “historic levels.”
The same is true in Canada, where most hate crimes still target the Jewish population, but the Muslim and Black populations are also targeted.
While the Israel-Hamas war is deeply polarizing, and confusing, there are not two sides to hate. University presidents should not have to duck behind talking points and prepared statements to answer a basic question about human decency. And university students should not have to learn in hate-filled environments. We need to trust our places of education now more than ever, not less.
Higher education should not mean lower common sense.
Bibi vows “Never Again is now”
As Israel ramped up its military campaign against Hamas this weekend in response to the deadly Oct. 7 Hamas attacks, Prime Minister Benjamin Netanyahu announced that Israeli soldiers were in the second stage of the war, with ground troops entering Gaza. The goals, he said, are “to destroy Hamas’ governing and military capabilities and to bring the hostages home.”
Referencing the cautionary slogan that emerged after the Holocaust, when six million Jews perished at the hand of the Nazis and their collaborators, he added, “We always said, Never Again. Never Again is now.” By Monday, Israeli tanks were approaching the outskirts of Gaza City.
International criticism of Israel’s Gaza campaign, meanwhile, is growing. At a massive demonstration in Istanbul in honor of Turkey’s centenary on Saturday, for example, President Recep Tayyip Erdogan said, “We will declare Israel a war criminal,” and called Hamas militants “freedom fighters.” In response, Israel withdrew its diplomats.
At the United Nations the day before, an amendment put forward by Canada to condemn Hamas for its atrocities and call for the release of all hostages failed to obtain the two-thirds vote necessary for passage. The amendment would have been added to a motion calling for a truce and suspension of hostilities in Gaza, the end of evacuation orders, and the granting of full and unimpeded access to UN relief workers in the area. After the vote, Israel’s representative, Gilad Menashe Erdan, said the world has witnessed that the UN “no longer holds even one ounce of legitimacy or relevance.”
Since the Oct. 7 attacks, the world has seen a shocking surge in hate crimes against Jews and Muslims, making members of both communities feel vulnerable. In the US, the FBI, the Department of Homeland Security, and the National Counterterrorism Center noted that they have “seen an increase in reports of threats against faith communities, particularly Jewish and Muslim communities.” In one instance, violence led to tragedy when a 6-year-old Palestinian-American boy was killed in Chicago a week after the Hamas attacks.
Pro- and anti-Israel protests have been held on US college campuses – from California to NYC – and in state capitals around the globe. In Dagestan, Russia, a mob stormed an airport this weekend in search of Jewish passengers after a flight arrived from Tel Aviv.
Antisemitic incidents in the US, already on the rise, have surged nearly 400% since Oct. 7 compared to the same period last year, according to the ADL.
“Antisemitism is the oldest, longest, most enduring, most toxic, and most lethal of hatreds,” said chair of the Raoul Wallenberg Center for Human Rights and longtime Canadian government official Irwin Cotler. Combatting it, he noted, requires “... a whole-of-government approach and a whole-of-society approach” – something he said is lacking.
In some places, politicians and celebrities are calling out antisemitism and denouncing hate crimes – and we’re seeing the debate hit the presidential campaign trail. Former President Donald Trump, while speaking to the Republican Jewish Coalition leadership summit on Saturday, for example, accused President Joe Biden of turning “a blind eye to the greatest outbreak of antisemitism in American history.”
We’ll be watching to see how Israel’s isolation on the world stage impacts its Gaza campaign, and how political squabbles over Israel play out in the US presidential race.
Hearing the Christchurch Call
After a terrorist attack on a mosque in Christchurch, New Zealand, was live-streamed on the internet in 2019, the Christchurch Call was launched to counter the increasing weaponization of the internet and to ensure that emerging tech is harnessed for good.
Since its inception, the Christchurch Call has evolved to include more than 120 government and private sector stakeholders. The organization, pioneered by the French and New Zealand governments, will hold its next major summit at the Paris Peace Forum in November.
Dame Jacinda Ardern, former Prime Minister of New Zealand who led the response to the Christchurch attack; Ian Bremmer, president and founder of Eurasia Group and GZERO Media; and Brad Smith, vice chair and president of Microsoft sat down with CNN’s Rahel Solomon for a Global Stage livestream on the sidelines of the UN General Assembly in New York. The event was hosted by GZERO Media in partnership with Microsoft.
Reflecting on the catastrophic attack that prompted the formation of the Call and its mission, Dame Ardern recalled how, on that day, “I reached for my phone to be able to share that message on a social media platform, I saw the live stream.” She notes how that became a galvanizing moment: In the “aftermath of that period, we were absolutely determined … we had the attention of social media platforms in particular to do something that would try and prevent any other nation from having that experience again.”
That led to the formation of the organization in a mere eight-week period, Ardern said. But identifying hate speech and extremism online that can fuel violence is no small feat, Ardern acknowledges, adding that while the goal can indeed appear “lofty,” the group’s focus is on “setting expectations” around what should and shouldn’t be tolerated online.
But what did tech companies learn from the Christchurch experience about their own roles in moderating content, overseeing algorithms, and mitigating potential radicalization and violence?
One major development that came out of the Christchurch Call, Smith notes, is what’s known as a content incident protocol. “Basically, you have the tech companies and governments and others literally on call like doctors being summoned to the emergency room at tech companies and in governments so that the moment there is such a shooting, everybody immediately is alerted.”
Emerging technologies – most notably artificial intelligence – mean that the Christchurch Call must remain nimble in the face of new threats. Still, Ardern says that’s not necessarily a bad thing because AI presents both challenges and opportunities for the organization. “On the one hand we may see an additional contribution from AI to our ability to better manage content moderation that may be an upside,” she says. But “a downside,” she notes, “is that we may see it continue to contribute to or expand on some of the disinformation which contributes to radicalization.”
Bremmer shared this view of AI, calling it both “a tool of extraordinary productivity and growth, indeed globalization 2.0,” while also acknowledging the threat of disinformation proliferation: “Fundamental to a democratic society, an open society, a civil society, fundamental to human rights and the United Nations Charter is the idea that people are able to exchange information that they know is true, that they know is real,” he says.
Four years after the Christchurch attack, there is indeed a sense of urgency surrounding the need for governments to better understand emerging technologies and their powers over politics and society. “Governments understand that this is systemic, it is transformative, and they're not ready,” Bremmer says, adding that “they don't have the expertise, they don't have the resources, and we don't yet have the architecture … we're late!”
Exclusive GZERO/Maru Poll: With hate speech rising, Americans want a crackdown on social media
The recent, unhinged anti-Jewish rants by musician and designer Kanye West are only the most prominent example of a wider phenomenon: antisemitism is rising in the United States.
Last year, attacks nationwide targeting Jewish people, property, or institutions rose by 35% to more than 2,100, according to the Anti-Defamation League. That’s the highest level since the ADL began tracking antisemitism more than 40 years ago.
This tracks a broader trend: Across 15 major US cities, hate crimes – that is, acts of violence that target a specific community – rose more than 20% last year, according to the Center for the Study of Hate and Extremism. That was true for attacks on Black, LGBT, Latino, Asian, and White communities alike.
Those are the facts, but how do Americans perceive things? Do they feel that hate is rising? And if so, what should be done about it? As part of GZERO’s new polling partnership with Maru Public Opinion, we asked them.
In a new nationwide GZERO/Maru survey of 1,500 Americans, conducted Dec. 9-11, 69% of respondents said antisemitism was on the rise.
But it’s not coming from “us”, they say. Curiously, with all that hate floating around, most of those polled seem to think it’s coming from somewhere else. Only 42% of Americans say hate speech is present within their own communities.
So where does the hate come from? Majorities surpassing 70% say hate speech in general is rising on social media platforms like Twitter and Facebook, on mainstream media talk shows, among celebrities, and even within America’s political parties.
Police the platforms? As a result, the survey showed considerable support (57%) for the idea that the government should use its regulatory power to force social media platforms to “put a stop” to hate speech appearing on their platforms.
“Free speech may be a much-lauded value,” says John Wright, executive vice president at Maru, “but it’s clear a majority of Americans draw a line that’s been crossed.”
Depending on your perspective, more government regulation is either a welcome intervention to tamp down the flames of hate, or it’s a chilling overreach by the state, raising thorny questions about the boundaries of free speech.
What do you think? Are antisemitism and other hate crimes rising where you live? What do you think is fueling the trend, and what should be done about it?
Let us know here. Include your name and location with your answer and we may publish it in an upcoming Signal.
Journalism on trial in the Philippines: interview with Maria Ressa
Ian Bremmer talks to embattled Filipina journalist Maria Ressa, CEO of the online news agency Rappler. Ressa and her team have been involved in a years-long legal battle that challenges press freedoms and free speech in the Philippines, as President Rodrigo Duterte continues to assert authoritarian control in his nation. In the conversation Ressa details the ongoing court battles that have her facing up to 100 years in prison if convicted. She also discusses Duterte's militaristic approach to COVID-19 response, and then issues strong warnings about social media's role in promulgating hate speech globally.
Facebook allows "lies laced with anger and hate" to spread faster than facts, says journalist Maria Ressa
In a new interview with Ian Bremmer for GZERO World, embattled Filipina journalist and CEO Maria Ressa issues strong warnings about social media companies, and Facebook in particular, for their inability or unwillingness to control hate speech online. Ressa, who runs the online news site Rappler, has been involved in a prolonged legal battle in the Philippines that threatens press freedom and free speech in that nation.
The fight has been fueled, she says, by a weaponization of social media. “Facebook and other social media platforms allow lies laced with anger and hate to spread faster and further than facts, which are really boring,” she says.
The conversation, part of the latest episode of GZERO World, also focuses on her ongoing case and how, she says, President Rodrigo Duterte has used the COVID-19 pandemic to further his authoritarian agenda in the Philippines. The episode begins airing nationally on US public television Friday, July 17. Check local listings.
Can Facebook's algorithm remove hate speech? Meltdown-proof nuclear reactors
Nicholas Thompson, editor-in-chief of WIRED, discusses technology industry news today:
Do some of Facebook's best features, like the newsfeed algorithm or groups, make removing hate speech from the platform impossible?
No, they do not. But what they do do is make it a lot easier for hate speech to spread. A fundamental problem with Facebook is that the incentives in the newsfeed algorithm and the structure of groups make it harder for Facebook to remove hate speech.
In general, have tech companies become more or less wary of the Trump administration in recent months?
Vastly less wary. I think that's partly because they think Trump might lose, so they're less worried about retaliation. Also their employees are very mad.
Do you really believe that a meltdown-proof nuclear reactor is possible?
No, but I am excited about the future of small nuclear reactors that have anti-meltdown technology built into each little grain of uranium.
When are you joining Parler?
Parler is the free-speech social media alternative. And I am already on it. I joined it a few days ago.