10 March 2022

Big Tech War Activism

The war in Ukraine is live. It’s not only live on CNN or Al Jazeera; it’s also live on social media platforms, for better and for worse. It is live for everyone to follow, comment on, and engage with. This war is not only being fought in the Ukrainian streets, landing grounds, fields, and hills of Kharkiv, Kyiv or Mariupol. This time it is also being waged in the stock markets, in the European harbors refusing Russian vessels, in the sports and cultural worlds, and online, with unprecedented tech activism. The well-known international hacker collective Anonymous has declared war on Russia and invited hackers throughout the world to attack its government websites, recently claiming to have breached the Russian space agency and hacked national television channels. Elon Musk activated Starlink satellite terminals for Ukraine, even though some security experts have warned of the vulnerability of satellite transmissions used for tactical operations. And this was merely the beginning.

In this war context, even Big Tech platforms are not neutral. Rather, along with their users, they are giving rise to a new wave of tech war activism, siding with Ukraine. Google has disabled traffic data from being displayed on its Maps app in Ukraine, allegedly to protect the safety of local communities. Social media platforms such as Facebook, Twitter, and TikTok are trying to address their role as parallel battlefields for information and disinformation, mobilization channels for the recruitment of volunteers, crowdfunding actions, and support networks for refugees. Many scholars have recently written or shared their opinions in Twitter threads on platforms’ reactions to the war. Anna Rogers, a natural language processing scholar, took a deep dive into the urgent policies social media platforms enacted in light of the recent humanitarian catastrophes. Rogers reported how Meta halted its monetization services for Russian state media, how Twitter is carefully monitoring high-profile accounts, and how Google (and YouTube) set up a monitoring team. Another important battlefield for social media platforms is disinformation, which can be qualified as a cybersecurity threat. At the same time, social media users — including individuals with large numbers of followers (influencers) — are using social media platforms to disseminate information about their own and others’ real-time experiences of war, launch fundraising actions, mobilize peers, and influence public opinion.

While many of these initiatives may be well intended, this new form of tech activism raises questions about the role of social media in times of war. Media communication, in its digital or analog, professional or amateur forms, has always played a role in armed conflicts. However, never before has communication been so heavily intermediated by private global powers (e.g. Facebook, Twitter) that have extensive influence but limited accountability. This contribution inquires into the role of Big Tech activism in times of war, both from the side of the social media platforms that enable it and from the side of the users who promote individual action. The intervention of both corporate and individual private actors can be problematic, due to the risk of misalignment of interests, opportunism, political influence, online vigilantism, and the potential jeopardizing of military operations. The platformization of war through tech activism means that public values as well as human rights may be endangered in both the short and the long run. In this contribution, we contend that existing legal frameworks that address content moderation, as well as the proposed Digital Services Act, are only a starting point for a broader and more complex regulatory conversation on tech activism and accountability.

(Social) Media, War and Disinformation

The role of media and propaganda has been a longstanding aspect of war, and social media platforms are increasingly a part of this engine. The Myanmar case is a recent example of how social media contributed to inflaming conflict and mass atrocities. However, it was not the first one. The world has almost forgotten other recent examples with profound consequences. In the past, traditional media outlets were usually subject to restrictive measures, such as radio jamming and the seizure and destruction of broadcasting towers (for example, in the Bosnian conflict). In 1994, long before the emergence of social media, the Radio des Mille Collines broadcasts in Rwanda changed the course of history, inciting Hutu citizens to pick up machetes, sticks, or other weapons to murder their Tutsi neighbors. We all know what happened to them.

Media has long been used to inspire and mobilize citizens during crises. But now this mobilization can be achieved on a global scale, and it can be led by governments, tech platforms, and influencers. Big Tech activism may not echo as loudly as the words of the Radio des Mille Collines, but it has a much broader audience. The current contours of media mobilization are also different from what they were in the Rwandan conflict. First, social media platforms do not produce content directly. They facilitate it because, in today’s online world, intermediation is more effective and profitable than content production, thanks to advertising-based business models. Second, tech giants’ spaces are used for a more varied array of purposes: to mobilize funds and help, to influence political opinion on different sides of the conflict, and for the personal glory of users.

#WW3: Social Media Monetization Trends

While the world is looking at the responses of social media platforms, an important dimension of online content is under-discussed: how the war is being monetized by regular people through content that caters to our fears, hopes, and curiosity. While setting up task forces to monitor information operations, platforms should not forget their own role in amplifying this content. With the war on Ukraine, we are witnessing the potential for content creators to use, but also to abuse, their platforms in an attempt to find relevance within the conflict narratives shaping current international politics. It will take years to fully understand what is currently happening on social media. Yet a few clear trends can already be discerned.

Creators feel that they need to acknowledge or capitalize on current events. In a video from 2 March, fashion TikToker Gabriella Zacche told a story from Paris Fashion Week, where she sat next to a few influencers discussing the war on Ukraine: ‘This war thing is so annoying’, said one influencer, to which her friend replied: ‘Yeah, if this turns into a BLM thing and I have to post about it, it’ll ruin my feed.’ This is an example of the business opportunism around social justice movements (e.g. making a personal brand appear more socially meaningful), which could translate into unfair commercial practices (similar to greenwashing). This trend is another expression of the increasing weight that new generations of consumers, such as Gen Z, give to commercial social justice as a driver of business behavior.

Global attention means more monetization options for scammers. Under fake Ukrainian journalist identities, young people who are not from Ukraine (e.g. from Kentucky) have been rapidly setting up and monetizing purported news pages on Instagram, where they share content about the war on Ukraine that is not subject to any editorial controls or to concerns about the spread of misinformation. Creators can capitalize on the war by becoming the new amateur journalists, reporting on anything they can possibly link to Ukrainian news (including, for instance, crypto prices and investment advice), and can receive sponsorships from brands that wish to ride the virality wave, or ad money on platforms like YouTube.

Creators are facilitating crowdfunding. Many content creators who wished to contribute to the humanitarian support network have turned to their followers for support. Faiãr, a Romanian streamer, hosted a live stream on Twitch and donated the proceeds to the Social Service Federation, an NGO providing support for Ukrainian refugees in Romania. However, not all creators are equally transparent and responsible. The monetization of philanthropy on social media remains a huge legal and infrastructural issue, as has been seen in earlier global crowdfunding rallies (e.g. Covid relief, Black Lives Matter donations), where social media platforms failed to provide transparent ways for creators to raise and use donations to support social justice or relief movements.

Not All is Fair in War

Tech activism has its ugly sides. As recent images of Russian prisoners of war show, platforms are also allowing those who have been captured or have surrendered to be exposed to public curiosity. Social media platforms claim to be willing to fight against the dissemination of online hate and disinformation. However, how will they do so when this affects their medium-term profits? How can national regulators, and more broadly the international community, address this challenge? How should social media be governed in contexts of war?

In other recent conflicts in Africa and Asia, governments have taken strict measures against online hate, violence, and disinformation, imposing extensive censorship in the form of internet shutdowns. These measures have also been exploited for political purposes that were often justified only on security grounds. Alternative approaches include cooperation with social media platforms, which is a more long-term and proportionate solution than general censorship. This cooperation can materialize in different interventions on online content, including guidelines to address hate and disinformation in times of war and specific reporting duties. In the context of the war on Ukraine, this is particularly necessary, as one of the primary challenges consists in moderating online hate and disinformation in the Russian and Ukrainian languages. This requires human moderators to fill a gap that has not yet been closed by digital technology. Other important measures include the deprioritization or demonetization of content, which means that some content is not censored but becomes barely findable.

Western democracies have also increased the pressure on social media platforms to demonetize content or block some services in Ukraine. However, this is easier said than done. The business model of social media makes it difficult to address online hate and disinformation. Unlike traditional media outlets, platforms do not create content but organize their spaces for profit through user profiling, which attracts advertising revenue. This creates an important paradox: a peaceful environment that prioritizes content quality over engagement is unlikely to generate the same income, because engagement is the money-making metric and is sustained by controversial content, including disinformation and hate.

Although social media platforms seem to take the Ukrainian situation seriously, their moderation of online hate and disinformation remains governed by their own ethical, business, and legal frameworks (read: terms of service). International legal frameworks do not require social media platforms to fight online hate and disinformation. Platforms’ cooperation in the Ukrainian case is thus partly voluntary and partly the effect of Western public opinion. Indeed, social media platforms have no direct obligation to safeguard human rights because they are not state actors. There is nonetheless increasing pressure on private actors to comply with international human rights law when moderating online content. Examples include the UN Guiding Principles on Business and Human Rights and the Rabat Plan of Action.

For the future, the Digital Services Act may offer a starting point for changing the current situation. The proposed instrument explicitly requires very large online platforms, such as Facebook or Twitter, to assess any significant systemic risks stemming from the functioning and use made of their services in the Union. This includes the dissemination of illegal content through their services, any negative effects on the exercise of fundamental rights, and the intentional manipulation of their service. This new step also raises the question of whether social media platforms should moderate content differently in times of war, and whether the new framework provided by the Digital Services Act will play a critical role in providing a first answer to this question.

Conclusion

Technology is not neutral in armed conflicts, and the same applies to tech activism. Social media is both the message and the messenger, and it can greatly contribute to determining who wins or loses the war. Tech activism, despite its best intentions, is also prone to abuse and misinterpretation, especially when content moderation is not adequately used to oversee users and protect the different public values at stake. Existing legal frameworks do not offer adequate regulatory answers, and the adoption of strict measures against social media platforms, such as bans, can produce more harm than good. Governments and the international community should thus cooperate with tech platforms and urge them, for the time being, to handle tech activism responsibly. This can entail the deprioritization of content regarding prisoners of war, captured military equipment, images of Ukrainian and Russian soldiers, misguided fundraising actions, and potentially offensive or sensitive content. A truly responsible governance framework for tech activism in the context of war should require social media platforms to be more neutral than they currently are. As a result, they could prioritize, for example, high-quality journalistic content supported by evidence rather than the ill-founded opinions of influencers and viral videos of prisoners of war. Times of crisis showcase in an unprecedented way how the opacity of platform governance is a regulatory issue that requires swift solutions.


SUGGESTED CITATION  Ranchordas, Sofia; De Gregorio, Giovanni; Goanta, Catalina: Big Tech War Activism, VerfBlog, 2022/3/10, https://verfassungsblog.de/big-tech-war-activism/, DOI: 10.17176/20220310-121147-0.

2 Comments

  1. Raphael, Thu 10 Mar 2022 at 17:03

    Interesting take, though I am unsure if I can fully agree with your framing as to how tech activism requires more neutrality. Activism and neutrality don’t often go hand in hand. It is a value system, rather than the absence thereof, that is needed in this space.

  2. David Dzidzikashvili, Fri 18 Mar 2022 at 13:27

    What is happening in Ukraine today has been happening for the past 20+ years, ever since Putin came to power by bombing his own people – civilian apartments – and committing atrocities against the Chechen people. The response from the US, EU and NATO had been complete silence, welcoming Putin to summits and holding red-carpet meetings for him. This further emboldened Putin, who attacked Georgia in 2008 and conquered Abkhazia and Samachablo. What did the Western powers do? Absolutely nothing! A reset by the Obama Administration and warm handshakes from Merkel, a total disregard for international law and for Putin’s war crimes against the Georgian people. What happened afterwards? Putin invaded Crimea and Eastern Ukraine. What did the Western powers do? A bare minimum of symbolic sanctions that continued to feed Putin’s war machine. Then Syria, the use of chemical weapons, more atrocities… What did the Western powers do? Absolutely nothing!
    So we are here as a result of Putin’s false perception that he could bite off more than he could chew, and of 20+ years of ignorance from the EU, US and NATO. Today there is a strong response and sanctions that will take the Russian economy back to its 1990s indicators, however it is too late and too little. Ukraine needs Patriot missiles, S-400s, S-300s, missiles to shoot down airplanes and incoming rockets at much higher altitudes than Stingers can reach; Ukraine needs much more firepower and the ability to control and close its own skies. Let’s help Zelensky establish the No Fly Zone! The Biden administration looked weak, but slowly they are starting to wake up and see the true face of evil – Vladimir Putin, who is trying to restore a new Russian empire…





