
Amid War in Ukraine, Open Source Intelligence Investigators Need Better Ethics



Since the outbreak of warfare in Ukraine, intelligence reporting from publicly available information, known as open-source intelligence (OSINT), has made a groundbreaking contribution to piercing the fog of war. Nevertheless, the rapidly growing OSINT community has ignored the ethics—the “should we’s” rather than the “what’s”—of publicly releasing wartime intelligence. Failure to grapple with these questions will cripple our understanding of Russia’s war on Ukraine and may instead mislead the public. It also threatens unintended harm to civilians and investigators alike.

Thanks to the growth of telecommunications in the 21st century, OSINT has quickly become an efficient tool for tracking terrorist communication, facilitating criminal investigations and better understanding military conflicts. In time, with the ever-growing abundance of data on the Internet, this intelligence has helped journalism groups such as Bellingcat, and others without access to classified information, to uncover plots, counter disinformation and even mitigate security risks otherwise hidden from the public. The Economist noted that the “decentralized and egalitarian nature of OSINT erodes the power of traditional arbiters of truth and falsehood.”

Open-source intelligence during the Russian war on Ukraine has surely proven this point. The open source community successfully tracked the Russian military buildup that preceded the invasion, identified war criminals, and even documented equipment losses in the conflict. It has changed information-gathering and belligerents’ operations.

Despite these achievements, much of the rapidly growing OSINT community devotes little attention to the ethics of their activities. As an investigator myself, I am calling on open-source researchers to ensure that our actions do not cause unintended harms.

H. I. Sutton, one of the respected veterans of open-source intelligence, recently remarked ironically that “OSINT is a term coined in March 2022 in order to gain followers on social media…. The most important prerequisite for becoming an OSINT analyst is being fast to download videos from [Telegram].” He is on the mark. Aside from long-term professionals who have been active in this field for years, the number of amateur open-source researchers active on social media has skyrocketed as a result of the war in Ukraine. Many went viral and changed how the public perceived this military conflict. While some focused on the quality of the information they released, many preferred to capitalize on their popularity. The power of “likes,” the dominant driver of contemporary social media, leaves little room for careful and time-consuming data analysis. From the viewpoint of ethics in science, this is the primary cause of several problematic, increasingly controversial attitudes the OSINT community has displayed during the war in Ukraine.

First and foremost, as open-source intelligence researchers, we are responsible for what, how and when we share. We must ensure that political actors cannot use us to make things worse, which aligns with a core principle of science—primum non nocere, first do no harm. OSINT is not about racing to get content published on Twitter as soon as possible without spending time on verification and impact assessment. Rushing to tweet and making decisive judgments based on scraps of data from the front lines, without the necessary vetting, means that combatants may use investigators as instruments of information warfare. Instead of debunking false claims, the OSINT community increasingly risks misleading the public. It has happened many times before and during the Russian war on Ukraine.

We can see this in the aftermath of the recent massive U.S. intelligence leak on the social media platform Discord. Many OSINT-focused profiles on Twitter raced to share uncensored pictures of the classified documents. Within the first hours of the leak, some claimed, without providing any evidence, that the images were either fake or legitimate. Groundless interpretations of the event, mainly by those seeking to win the war for attention on social media, helped obscure the truth. Bellingcat and the Washington Post later carried out proper investigations and ethically censored all sensitive information to prevent malicious use of their findings.

The same lighthearted attitude toward data analysis and impact assessment may threaten individual privacy and safety. Part of the OSINT community frequently profiles real people through their online identifiers and publicly shares that information. This can harm people in many ways. Failed OSINT investigations in the past caused innocent people to be blamed for the Boston Marathon bombing in 2013. The same has happened during Ukraine’s war: an innocent man was doxxed, his identity and address shared online, because of errors committed by open-source researchers investigating war crimes. Such harm shows how careful we must be when handling data about individuals.

OSINT’s ethics problems partly result from the legal limbo in which many investigations are carried out. The global community faces cross-jurisdictional dilemmas, especially around personal data protection and the handling of sensitive or leaked content. We also lack the regulations needed to explore some layers of the Internet, such as the dark web, which is particularly popular among Russian Internet users. As a result, some investigations may create legal liability if we are not careful.

Many OSINT researchers also seem to disregard risks to their own well-being, risks that are usually weighed in ethics assessments. Working with online sources on Ukraine’s war exposes us to gore, such as the recent decapitation video posted on a Russian Telegram channel, which may cause secondary trauma. Such reactions have been reported in similar fields, such as online terrorism research. Basic trauma-prevention standards must therefore be adopted; otherwise OSINT researchers may expose themselves to mental harms that are difficult to overcome in the long run. This especially threatens amateur researchers who lack institutional support.

Their safety may also be at risk. Participating in high-profile investigations may prompt political actors to threaten researchers with arrest or extralegal violence, as recently happened to Bellingcat executive director Christo Grozev, and many more cases of OSINT researchers receiving death threats are known. Searching for and processing raw data also exposes them to potential cyber incidents, including spear-phishing attacks and malware infection, which in turn may lead to leaks of personal information online. Addressing these challenges requires adopting operations security standards, which amateur investigators sometimes fail to follow.

Ignoring these problems will inevitably devalue open-source intelligence. Public perception of OSINT is at stake. Perhaps more important, our community’s strong commitment to ethics is the only way to ensure that open-source investigators are not used as instruments of information warfare or as a source of new risks to innocent people. If nothing else, researchers’ own safety may also be at stake.

This is an opinion and analysis article, and the views expressed by the author or authors are not necessarily those of Scientific American.



