When the Office of the Director of National Intelligence released an unclassified report this month outlining a widespread influence campaign launched by Russia in 2020 to boost former President Donald Trump’s candidacy and knock down then-candidate Joe Biden, it confirmed for the first time what social media companies and researchers had known for more than a year: Moscow outsourced its influence operations to companies in Ghana, Mexico and Nigeria.
The report said the Kremlin-linked influence entity and troll farm known as Lakhta Internet Research had “used unwitting third-country nationals” in those three countries to propagate false narratives about divisive issues in American politics through social media accounts and fake news websites.
“It’s useful when the U.S. government puts on the record what researchers, academics and civil society groups have been seeing for a long time,” said Jessica Brandt, the head of policy and research at the Alliance for Securing Democracy of the German Marshall Fund. “It puts it in the public record and gives it a certain authority.”
One likely reason the Kremlin turned to entities in other countries, the report said, is that Lakhta, previously known as the Internet Research Agency, has been under intense scrutiny by U.S. law enforcement agencies, the Pentagon’s Cyber Command and social media companies ever since former special counsel Robert S. Mueller III laid out in great detail the agency’s role in interfering with the 2016 U.S. election.
Governments and their spy agencies that launch influence operations against adversaries, whether the United States, rival governments or domestic political opponents, are increasingly outsourcing those operations to commercial entities because doing so provides cover and plausible deniability, according to researchers. Some researchers are calling for tighter U.S. laws to stop such interference.
“Outsourcing of influence operations is something we see a lot,” said Lee Foster, senior manager for information operations analysis at the security research firm FireEye. The operations include “troll farms trying to influence discussions in other countries, fabricated news websites or other campaigns.”
Governments and spy agencies also are tapping into the social media and consumer marketing expertise of private contractors “who might have a better sense of engaging with an audience and generating virality,” or getting a spike in notices and mentions for a particular message, Foster said.
CNN first reported in March 2020 on the role of a troll farm in Accra, Ghana, that was found to be targeting American voters. The same day, Facebook and Twitter said they had taken down dozens of accounts linked to a nongovernmental organization in Ghana that appeared to have ties to Kremlin-backed individuals at Lakhta.
In 2018, FireEye published a report about a suspected Iranian influence operation that involved a fake news site called Liberty Front Press and multiple social media accounts targeting audiences in the United States, the United Kingdom, Latin America and the Middle East. The operation appeared focused on promoting pro-Iranian, anti-Saudi and anti-Israel messages, as well as building support for the multinational nuclear deal between Iran and six other countries, the report said.
Influence operations aren’t just aimed at changing minds, Foster said. Such campaigns often are a prelude to cyberattacks or are staged after a cyberattack, he said. “We see a direct crossover between influence operations and cyber.”
Other countries also targeted
Not all influence operations and troll farms target the United States.
In 2019, Facebook said it took down hundreds of accounts operated by people with ties to Saudi Arabia’s government. The accounts were targeting regional rivals such as Iran, Turkey and Qatar, the company said. Facebook also removed accounts operated by marketing firms with ties to the governments of Egypt and the United Arab Emirates.
Although U.S. officials and researchers focus attention on Russia, China and Iran, which target American audiences, “if you look beyond them, you’ll see Saudi Arabia and the UAE that seem to have been very aggressively using these tactics for some time,” said Emerson Brooking, a resident fellow at the Digital Forensic Research Lab of the Atlantic Council.
In Israel, former spies operate for-profit “persona management consultancies,” sometimes called reputation management companies, including one called Archimedes Group, whose role was highlighted in an Atlantic Council report in 2019.
The group was hired to try to influence voters in national elections across the globe. Facebook removed dozens of accounts operated by Archimedes that had been targeting audiences in Africa, Latin America and Southeast Asia. Facebook took down the accounts because they pushed what the social media company calls “coordinated inauthentic behavior”: essentially, activity by operators who pretend to be real people while spreading disinformation or manipulating the truth.
In 2020, the Atlantic Council published another report focused on a Tunisian company called UReputation that had played a similar role in that country’s presidential campaign a year earlier. Facebook removed the company’s accounts after the report.
Although Facebook may continue to take down accounts for violating its platform rules, those takedowns will not necessarily stanch the spread of such tactics because influence operations are not illegal in most countries, Brooking said.
In the absence of punishment, a shuttered entity may simply reappear and continue its work under a new name.
Trump executive order
Trump took a step to address foreign interference when he signed an executive order in September 2018 that allows the U.S. Treasury Department to impose sanctions on foreign individuals found to have interfered in a U.S. election.
The order’s “remit was pretty broad, but was rarely exercised by Trump,” Brooking said.
That order, which the report cites, is in fact the legal basis for the U.S. intelligence agencies’ assessment of Russian influence operations.
Brandt and her colleague Josh Rudolph at the German Marshall Fund called on Congress and the Biden administration in January to close loopholes in existing laws and pass new ones aimed at foreign interference in U.S. elections. Their proposals would make it a crime for Americans to launder disinformation from foreign sources and would require greater disclosure of such activity under U.S. campaign finance laws.
Brandt said Australia’s approach to tackling foreign interference as a national security matter is a model worth examining.
Australia’s 2018 legislation, which pairs the Foreign Influence Transparency Scheme Act with a companion espionage and foreign interference law, defines foreign interference as going beyond routine diplomatic influence to include covert, coercive and deceptive activities, and makes it punishable by up to 20 years in prison.
“Australia, which has been ground zero for China’s interference activities, has taken a national security approach and created certain penalties for knowingly and willingly participating in foreign interference targeting political campaigns,” she said. The kind of influence campaigns described in the U.S. intelligence agencies’ report “is exactly the kind of thing that we want to find ways to strengthen our legal defenses against.”
———
(c)2021 CQ Roll Call
Distributed by Tribune Content Agency, LLC