Sitting on his front porch in Birmingham, Alabama, a self-described “student nerd” explains via Zoom how he runs one of the most popular Twitter accounts covering the war in Ukraine. Around 275,000 people follow his account, The Intel Crab.
Justin Peden, 20, shows how data can be used to uncover misinformation in today’s high-tech ecosystem. He uses geolocation, satellite imagery, TikTok, Instagram and other investigative tools to track Europe’s deadliest conflict since World War II.
Searching the Internet for streaming webcams, smartphone videos and photos to pinpoint the location of Russian troops, aerial bombardments and destruction of once-peaceful areas is part of his daily routine. If a Russian commander denies bombing an area, Mr. Peden and other military observers are quick to release evidence that exposes the lies.
“I never thought that what I do could be so relevant. I just wanted people to know what’s going on [in Ukraine]. In fact, I am an ordinary student,” said the young man, a third-year student at the University of Alabama at Birmingham.
Open source intelligence (OSINT) has become a powerful force for online sleuths like Mr. Peden. They use data to dispel the fog of war from computers thousands of miles away, and their impact has not gone unnoticed.
“Intelligence gathering, fact-checking and debunking happen in real time. The online community also documents Russian troop movements and deployments, creating more than just a snapshot of recent history,” journalist Miles O’Brien told PBS viewers in April.
On the air that afternoon, Mr. O’Brien called Mr. Peden “a highly respected practitioner in the rapidly evolving field of open source intelligence, or OSINT,” and noted that his posts about Ukraine are monitored “outside and inside the intelligence community.” The Washington Post included him in an article about the “Twitter spy uprising.”
There is a saying: “The first casualty of war is the truth.” Today, however, that dynamic is shifting. With the push of a button, anyone can spread false information, no matter how dangerous, malicious, or frightening. The invasion of Ukraine is a classic example of how digital lies fueled a humanitarian crisis that resulted in loss of life and massive destruction.
It is important to note that disinformation differs from misinformation in that it is not only false, but also part of a “deliberate attempt to mislead or deceive.” In short, it is content created to cause harm.
Deutsche Welle (DW) in Germany showed how a verification system can expose bad actors intent on causing harm. As war approached, the DW fact-checking team began collecting dossiers of false claims and propaganda from both sides of the conflict and publishing corrections. The team also made a startling discovery: false information was being spread in DW’s name.
“Manufactured pro-Russian messages masquerading as BBC, CNN and DW reports are fueling the disinformation war between Russia and Ukraine,” DW reported in July. The article provides an example from Twitter in Japan. Here is an excerpt:
“Looks like a DW report,” a Twitter user commented in Japanese on an alleged video from the German broadcaster about a Ukrainian refugee who had supposedly raped women in Germany, leveling serious allegations against a man named “Peter Savchenko.” Another Twitter user wrote: “Please give me the URL of the original video.” That user appeared to doubt the video’s origin, and for good reason: it is not a DW production. It’s a fake.
In another case, when a Twitter user posted a video purporting to show heavy fighting between Russian and Ukrainian forces, DW fact-checkers traced it back to a 2013 video game.
DW has reached out to academics and practitioners for suggestions on how to make fact-checking more effective. These tips are useful for journalists around the world. Among them:
- Emphasize correct information rather than amplifying false statements
- Give unambiguous verdicts (and avoid confusing labels like “mostly false”)
- Avoid creating false equivalence between opposing points of view
- Place fact checks in a larger context rather than focusing on isolated statements
- Analyze and explain disinformation strategies, linking fact-checking to media and information literacy
A better understanding of how propaganda techniques work can help disarm propagandists. The RAND Corporation report titled “The Russian ‘Firehose of Falsehood’ Propaganda Model” is a good start.
The title refers to a strategy “in which the propagandist overwhelms the public by creating an inexorable tide of false information and lies.” The report states that even blatant lies, spread rapidly and continuously through multiple channels such as news media and social media, can shape public opinion.
This analysis, released in 2016 in the midst of the US presidential election, details how the Russian disinformation system works.
“The report is very much in line with what’s happening today. Buckets full of nefarious propaganda are being thrown at us,” said sociologist Christopher Paul, a senior social scientist at the RAND Corporation and co-author of the report. His research interests include counterterrorism, counterinsurgency, and cyber warfare.
According to the Rand report, Russian disinformation is defined by:
- High volume and multiple channels
- Fast, continuous and repeating frequency
- Lack of commitment to objective reality
- Lack of commitment to consistency
The study also suggests best practices for countering fake news, such as:
- Provide warnings at the time of initial exposure to disinformation
- Repeat the retraction or refutation
- Offer corrections that provide an alternative narrative to fill the gap in understanding left when the misinformation is removed
“It all comes down to journalistic rigor. Journalists really need to do everything possible to be as professional as they can,” Paul concludes. “Double-checking, verifying sources, verifying authorship, using data to ensure accuracy and reliability. The burden of truth, the burden of proof, is much higher.”
Photo by Alina Grubnyak on Unsplash.
This article is adapted from a post on DataJournalism.com. It has been modified and published on IJNet with their permission.