This article is also available in Ukrainian here.

“It sure seems that way to me.” That is how President Joe Biden supported his off-the-cuff comment that Russian atrocities in Ukraine amounted to genocide. A legal finding to that end, however, requires a substantially higher standard of proof (and to be fair, Biden also said that he would “let the lawyers decide”). The “effective accountability” that world leaders like U.N. Secretary-General António Guterres have called for begins with a thorough, rigorous, and independent investigation into allegations of genocide, war crimes, and crimes against humanity.

International justice mechanisms will insist upon unassailable evidence of the alleged atrocities. However, a vast amount of potential evidence, in the form of photos and videos uploaded by Ukrainians to social media platforms, is at risk of permanent deletion or may never become available to courts.

The risk of loss stems from the permanent removal of content deemed to violate platforms’ terms of service. Imagine, for example, that there existed a video placing a specific individual at the scene of a summary execution of Ukrainian civilians. This content and its accompanying metadata – which have proven crucial in debunking disinformation and documenting Russian atrocities in the war – would likely be invaluable evidence in a war crimes trial, but because it would be graphic in nature, platforms such as YouTube, TikTok, and Facebook would remove it for violating their terms of service. Moreover, the vast majority of removals are proactive – TikTok, for example, reported that 90.1% of the videos it removed in the fourth quarter of 2021 were taken down before receiving any views. While these measures uphold community norms and promote user safety, they can destroy valuable potential evidence and hamper investigators’ efforts to document war crimes and other atrocities.

Even if social media evidence for an international trial is preserved, it is not clear that the data would be available to most international courts. The laws governing data sharing by U.S. companies were developed long before the advent of the modern internet and social media, not to mention online discussion forums, instant messaging, and user-specific metadata – and, for that matter, before much of the development of the international criminal justice regime as we know it.

These laws do not present a clear legal avenue for international justice mechanisms to request deleted social media data for use as evidence in human rights trials. To obtain digital evidence held within the United States, parties to legal proceedings abroad can use a federal statute known as Section 1782 (28 U.S.C. § 1782), which empowers U.S. district courts to compel the divulgence of evidence “for use in a proceeding in a foreign or international tribunal.” However, the Stored Communications Act (SCA), codified at 18 U.S.C. Chapter 121, declares in § 2702(a) that “a person or entity providing an electronic communication service to the public shall not knowingly divulge to any person or entity the contents of a communication while in electronic storage by that service.” Enacted in 1986, the Act was intended to protect the privacy of electronic communications, a purpose that social media companies have argued prevents them from sharing valuable potential evidence with courts. While the SCA plays a critical role in extending Fourth Amendment privacy protections to digital communications, it does not account for the central role that social media content plays in mass atrocity crimes today.

It is unclear whether the pathway to evidence-sharing that Section 1782 offers can overcome the barriers to sharing created by the SCA. This is not just a theoretical concern – disagreements over how the two laws should be reconciled have hindered investigations into the persecution and killing of the Rohingya people in Myanmar by the Burmese military, which the U.S. formally declared a genocide last month.

In June 2020, as part of a case brought before the International Court of Justice (ICJ), The Gambia asked a U.S. district court to compel Facebook (now Meta) to disclose deleted content from Burmese state officials, military leaders, and Facebook groups related to atrocities committed against the Rohingya. The Gambia requested this evidence under Section 1782 to help prove that Myanmar’s leaders had “genocidal intent,” given their use of hate speech in Facebook posts. While a U.S. magistrate judge ordered Facebook to turn over the posts and associated metadata, a U.S. district court ruling vacated the portion of the order directing Facebook to produce private pages and communications (i.e., direct messages between users). It did so primarily on the grounds that the deleted content in question was being held by Facebook “for purposes of backup protection,” which, per the SCA, constituted communications held in “electronic storage” and was thus non-disclosable. In essence, the court determined that the measures Facebook took to remove the posts from public display on its platform had the perplexing consequence of rendering that content unavailable to the ICJ proceedings. When the ICJ began deliberating on The Gambia’s genocide case against Myanmar on February 28, it thus appears to have done so without potentially crucial evidence.

The Myanmar case shows that existing legislation is not fit for international justice in the digital age. In service of ongoing and future efforts to prosecute those responsible for war crimes in Ukraine, the United States must clarify how international justice mechanisms can access digital evidence.

A first step should be to amend the Stored Communications Act to accommodate (and define) social media and other forms of digital data, particularly in the context of international crimes. There are several ways to achieve this. For example, Rebecca Hamilton has proposed adding one more category to the list of SCA non-disclosure exceptions (found in § 2702(b)) “to permit disclosure in situations where SCA-protected content will help establish the truth in a legal process related to the commission of war crimes, crimes against humanity, or genocide.”

As currently written, § 2702(b) holds that providers “may divulge the contents of a communication” if one of the listed exceptions applies. That permissive “may” lifts liability without compelling cooperation, as Michael Becker flagged following the magistrate judge’s initial ruling in The Gambia v. Facebook. A stronger version of the clause would require that the contents of a communication be shared, provided that certain conditions are met. The conditions might include meeting at least one of the items on the expanded list of non-disclosure exceptions, clearing some form of administrative review (or a FISA-like warrant process), and guaranteeing the privacy of individuals implicated in the communication beyond the confines of the court proceedings.

Given that many atrocity crime investigations and prosecutions take place outside U.S. jurisdiction, it may also be necessary to extend § 2702(b)(9) – the existing SCA exception that permits electronic communication service providers to share data with foreign governments – to international justice mechanisms as well. For The Gambia, the only remaining options for obtaining the missing Facebook evidence in the Myanmar case are to work through a Mutual Legal Assistance Treaty (MLAT) or to establish a bilateral agreement through the CLOUD Act. Unfortunately, these options are slow and of limited applicability, respectively. To create a clearer and faster path for the disclosure of social media communications to international and foreign tribunals hearing questions of war crimes, crimes against humanity, and genocide, it may also be necessary to amend Section 1782. To satisfy congressional concerns, this process might involve empowering the State Department or a dedicated interagency entity with the authority to ensure that a given data-sharing request does not conflict with the national interest.

Drawing on existing frameworks such as the Berkeley Protocol on Digital Open Source Investigations, the Biden Administration should also coordinate with allies abroad, civil society, and social media companies to standardize an approach for preserving and verifying digital evidence while respecting privacy and security concerns. For example, platforms should be required to preserve publicly posted content that could constitute evidence of serious international crimes rather than permanently deleting violent content algorithmically. Limited access could then be granted to investigators, contingent on adherence to minimum standards of content authentication and privacy protection. Such guardrails are essential given that some international tribunals and courts may themselves inadequately protect privacy rights. (For example, the ICC’s own Rules of Procedure and Evidence are silent on data privacy.) Whatever the means of these reforms, the goal would be to support the prosecution of international crimes – such as those potentially committed by Russia in Ukraine, the military junta in Myanmar, or ISIS in Iraq, among others – while still preserving core principles of communications privacy outside of that narrowly defined context.
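To make the preserve-and-verify idea concrete, the short Python sketch below illustrates, purely hypothetically, the kind of tamper-evident preservation record such a standard might call for: a removed file is hashed and paired with basic metadata so that investigators granted access later can confirm it has not been altered. The function names and fields are assumptions for illustration only, not drawn from the Berkeley Protocol or any platform’s actual systems.

```python
# Illustrative sketch only: a hypothetical "preservation record" a platform
# might create when removing content, instead of deleting it outright.
import hashlib
import json
from datetime import datetime, timezone


def preserve_removed_content(media_bytes: bytes, uploader_id: str,
                             upload_time: str, removal_reason: str) -> dict:
    """Build a preservation record for content taken down from public view."""
    digest = hashlib.sha256(media_bytes).hexdigest()  # fingerprint for later authentication
    return {
        "sha256": digest,
        "uploader_id": uploader_id,   # retained under access controls, not published
        "upload_time": upload_time,
        "removal_reason": removal_reason,
        "preserved_at": datetime.now(timezone.utc).isoformat(),
    }


def verify_preserved_content(media_bytes: bytes, record: dict) -> bool:
    """Check that a preserved file still matches the hash recorded at removal."""
    return hashlib.sha256(media_bytes).hexdigest() == record["sha256"]


if __name__ == "__main__":
    video = b"...binary contents of a removed video..."
    record = preserve_removed_content(video, "user-123",
                                      "2022-03-04T10:15:00Z", "graphic violence")
    print(json.dumps(record, indent=2))
    print("verified:", verify_preserved_content(video, record))
```

The design choice worth noting is that hashing at the moment of removal, rather than at the moment of a later evidence request, is what allows investigators and courts to authenticate content long after it has disappeared from public view.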

Turning back to Ukraine, the only certainty is that the path to justice will be long and arduous, and that it begins with the collection and preservation of the digital evidence emerging today. To preserve the hope that Ukrainians may someday receive the justice they deserve, legislative fixes that support the present and future work of war crimes investigators and prosecutors require immediate attention.

IMAGE: VYSNE NEMECKE, SLOVAKIA – FEBRUARY 24: A man from the Ukrainian town Svaliava shows to the photographer the newest pictures shared on social media from recently invaded Ukraine on February 24, 2022 in Vysne Nemecke, Slovakia. The man arrived in Slovakia early in the morning to look for accommodation for his wife and three kids following him from Ukraine. (Photo by Zuzana Gogova/Getty Images)