How is this war playing out differently on social media than previous wars and conflicts?
Emily Bell: Certainly civil wars and internal conflicts have played out before on social media, but not with the same focus from Western mainstream media. If you think about how the war in Syria was documented, it was largely through social media. We saw an enormous number of innovative techniques and coverage emerging from Syria.
But if you think back to the Arab Spring, the social media tools were very much those of insurgencies. And now that’s not the case. They are just as likely to be used by people with power and authority as they are by people who are resisting power or who are insurgent in some way.
Platforms like Meta, Twitter, Google, TikTok, Spotify, and others have been proactive in removing or demoting Russian state-affiliated media. What do you think of this?
It’s really unhelpful to have uncontrolled material of all types when people’s lives are at risk. It’s genuinely dangerous when…propagandistic narratives and disinformation are allowed to proliferate. I don’t subscribe to the view that says we are better off if everything circulates freely. I think we know what happens when everything circulates freely, and there’s no evidence that it produces a better outcome. In fact, quite the opposite.
Do you see any blind spots in these platforms’ enforcement actions in the war so far?
I think that they were very slow to take action on [Russian state-affiliated broadcaster] RT, so RT disappeared slowly [after Meta and Twitter restricted access in early March]. I think that there’s a debate to be had about the amount of extremely graphic content which is circulating on Twitter at the moment.
The one big blind spot has been simply not acting quickly enough to stop abuse of power on their platforms. And that’s not about the immediate reaction in Ukraine. It’s really about everything that they’ve learned from the past 10 years and not really wanting to enact policies or change the design of their platforms in a way which prevents that.
YouTube said it removed thousands of Ukraine-related videos violating its policies on hate speech, misinformation, and graphic content, and Facebook and Twitter have removed accounts over disinformation. How do these content removals affect the process of newsgathering, fact-checking, and disinformation research?
There’s always been a problem with takedowns, and now we’re just seeing a really exaggerated version of this. And this comes back to my point about the platforms really needing to take seriously what kind of role and function they’re performing in society over and above them being commercial organizations with a duty to their shareholders.
The material that’s removed, where is it? What does it contain? Where is it archived? Who can access it? Why was it taken down? All of that is actually materially incredibly important, and yet it’s entirely voluntary on the part of the platforms, whether or not they make that type of archive.
We are seeing platforms that have been incredibly reluctant and deliberately obtuse about providing insight into how stories are viewed and what’s circulating. The lack of really good tools to measure the impact of social posts is shameful.
Do you think the platforms have taken actions to limit Russian accounts in part due to criticism that they didn’t do enough to stop Russia from using social media to try to sway the 2016 U.S. presidential election?
I can’t guess at what goes on in the minds of people who run these places. What I do think is that this is absolutely their worst nightmare in some respects. So, all the things that they’ve been trying to put right post-2016, all of the issues that they have in terms of enabling powerful speech on their platforms, and all the failures that they’ve had in terms of containing disinformation have come down now to them having to make a series of really blunt-instrument moves because they are not designed in the right way for this environment.
I think the unpredictability of how [Russian President Vladimir] Putin has acted has highlighted for lots of companies the issues that you have when you do business in political environments with dictatorships. But I think that what happens next is going to be most interesting, which is whether or not at any point they try and return to normal or business as usual. I don’t think so. I think we will see much more restrictive practices [from the] platforms now…that they will become ever more interventionist.
I worry as well that we will see a whole series of laws passed, some of which are passed in good faith to try and make things safer and more equitable for people in democracies, and some of which are passed by bad faith governments who want more power for themselves and less power for the press.
Have tech companies “crossed the Rubicon,” as the saying goes, going from being relatively hands-off in wartime to taking sweeping actions?
Arguably, they crossed the Rubicon of modern warfare at the moment they allowed any government anywhere in the world to set up pages, channels, and accounts on their platforms without the implementation of adequate rules.
I was teaching students the other day, and I asked how many of them knew who [head Nazi propagandist] Joseph Goebbels was, and only about four of them had heard [of him]. I said, you really need to learn, because what we are seeing now is no different in terms of the use of technology in service of a particularly appalling type of dictatorship. We are seeing that now. The channels are different, the behavior is different, but the tactics are pretty much the same: the inversion of truth, the discrediting of the press, the telling of ever bigger lies and the repetition of those lies until populations don’t really understand and can’t really think for themselves. Everybody who is a journalist in Ukraine, everybody who’s a journalist in Russia, completely understands this. And now I think American companies have to wake up to the fact that they are really invested in this.
Any other takeaway from this moment?
I think this is yet another phase of how to construct a better environment for journalism and journalists, and it’s demonstrated how much we need the platforms to help us do that. And I think that they have to wake up and think about how they want to support the free press. Supporting the free press does not mean having inadequate content moderation policies. It does mean really thinking about your systems, design, and priorities, and about which kinds of information you circulate to whom.