‘War of weaponized influence’: U.S. spending millions on tools to find foreign tweets and memes

The U.S. government is building expensive tools to fight enemies that use tweets and memes instead of bullets and bombs.

Instead of the deserts and mountains of Afghanistan, the battleground is technology platforms such as Twitter, where the Defense Department’s research and development arm says America is fighting an “asymmetric, continual, war of weaponized influence narratives.”

The Defense Advanced Research Projects Agency said it plans to spend $59.5 million over the next four years on researchers who will build algorithms and gather content, including tweets, memes, political ads and blog posts, for the government’s Influence Campaign Awareness and Sensemaking (INCAS) program.

INCAS program manager Brian Kettler said the goal is to give the government tools to provide an “early warning” of foreign influence. He said agencies could use the tools to help the U.S. spread its narratives around the world and to stop online content from going viral.

Researchers for cybersecurity company Mandiant said last month that they discovered a pro-China digital influence campaign designed to stoke Asian American anger over racial injustice inside the U.S. and shift the narrative about blame for COVID-19.

The Mandiant team said the pro-China operation used at least 30 social media platforms and dozens of websites in languages including Chinese, English, German, Japanese, Korean, Russian and Spanish.

Mandiant Vice President John Hultquist said DARPA is smart to pursue the program. Still, he said, it is almost too late for solutions because China has taken action to push protesters into the streets.

“I think this is a massive problem that the U.S. is going to have to deal with, so I think trying to develop a capability like this is absolutely the right thing to do,” Mr. Hultquist said. “It’s only growing.”

Details about who will use the tools and how they will be used are not yet known. DARPA said the INCAS program started in August. Mr. Kettler said DARPA is in charge of building tools and techniques, not deciding how the U.S. government uses them.

Researchers working for DARPA are divided into five teams, according to DARPA documents. One team will collect foreign digital content, such as tweets and memes.

Two teams will develop algorithms to sort that content for indicators of foreign influence and for an intended audience’s “psychographic attributes,” which describe a community’s sacred values and worldviews, such as politics and religion.

Another team will use the algorithms and data to build a model to understand foreign influence campaigns and help give analysts confidence that the algorithms are picking up a particular adversary, such as China or Russia, and not a random person online. This team will provide “human machine interfaces” that analysts will use to interact with the INCAS program’s tools.

The other team will test the tools by working with government agencies on real-world scenarios.
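DARPA has not published the INCAS algorithms or data formats, but the workflow the documents describe (collect content, score it for influence indicators and psychographic attributes, then hand the results to analysts) can be illustrated with a minimal sketch in Python. The lexicons, field names and scoring rule below are invented for illustration and are not drawn from the program; real indicators would come from trained models, not word lists.

```python
from dataclasses import dataclass, field

# Hypothetical lexicons standing in for the classifiers the INCAS teams would build.
INFLUENCE_CUES = {"corrupt", "hoax", "traitor", "cover-up"}
PSYCHOGRAPHIC_LEXICONS = {
    "politics": {"election", "senator", "ballot"},
    "religion": {"faith", "sacred", "church"},
}

@dataclass
class Post:
    text: str
    influence_score: float = 0.0
    attributes: list = field(default_factory=list)

def score_post(post: Post) -> Post:
    """Tag a post with a crude influence score and psychographic attribute labels."""
    words = {w.strip(".,!?").lower() for w in post.text.split()}
    # Share of cue words present stands in for an indicator model's output.
    post.influence_score = len(words & INFLUENCE_CUES) / len(INFLUENCE_CUES)
    post.attributes = [name for name, lexicon in PSYCHOGRAPHIC_LEXICONS.items()
                       if words & lexicon]
    return post

if __name__ == "__main__":
    sample = Post("The election is a hoax run by corrupt senators.")
    print(score_post(sample))  # flags political content with a nonzero influence score
```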

DARPA is not forthcoming about the type of foreign influence assault it intends to prevent. Its documents cite China’s Belt and Road Initiative, a worldwide infrastructure funding effort run by the Chinese communist government, as part of a scenario to be used in evaluating how well the program works.

Documents also point to adversaries trying to undermine foreign people’s attitudes about American military bases inside their countries. The government’s broadest fear is that an enemy will use foreign influence to accomplish something the U.S. has not considered.

“Quite a few people, quite a bit of people now, are worried about election security because we’ve seen some examples where that’s been concerning, but I want to cast the net much broader and say, ‘Can we see adversary information campaigns around a variety of topics?’” Mr. Kettler said. “What haven’t we thought of?”

DARPA documents and Mr. Kettler stressed that the tools are focused on detection. University of Illinois researchers involved in the program say their work will be the “first step towards development of effective countermeasures” against influence operations.

Such countermeasures may involve ways to keep messages from being memorable or to stop them from going viral, Mr. Kettler said. He added that his program is not devising a system for how countermeasures could be developed.

The University of Illinois said last month that it received $5.8 million from DARPA for the INCAS program. DARPA said the Illinois researchers would work with counterparts at the University of Southern California to develop ways to sort an audience by psychographic attributes.

“People create narratives that are divisive,” Tarek Abdelzaher, a University of Illinois professor working with DARPA, said in a statement last month. “They are intended to polarize, radicalize, whatever … but we don’t understand the impact of that weapon on the population the way we understand the impact of a bomb or lightning strike.”

Researchers’ focus on psychographic attributes is aimed at identifying which groups of people are susceptible to foreign messages in a way that sparks an emotional response, such as outrage against the U.S. government, or that prompts real-world action, such as protests.

Just as marketers are concerned with personality traits that determine whether someone is likely to buy a product, DARPA is concerned with groups’ emotional responses to content based on their values and worldviews stemming from religious or political beliefs, according to DARPA presentation slides and a broad agency announcement from October 2020.

DARPA’s presentation slides showed marketers using beauty ads to sell makeup differently to introverts and extroverts.

By contrast, DARPA’s new approach to dividing people was depicted in a graph plotting an audience’s risk of emotional response to issues such as “guns/gun control,” “climate change,” “gays in military” and “immigration.”
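The graph itself is not reproduced in the public documents, but the idea it conveys, estimating how strongly a given audience segment is likely to react emotionally to a given issue, can be sketched as a simple aggregation. The segments, issues and observations below are hypothetical; a real system would derive such signals from the program’s content-analysis algorithms rather than hand-entered records.

```python
from collections import defaultdict

# Invented observations: (audience_segment, issue, reacted_emotionally) records.
observations = [
    ("segment_a", "guns/gun control", True),
    ("segment_a", "guns/gun control", False),
    ("segment_a", "climate change", True),
    ("segment_b", "immigration", True),
    ("segment_b", "immigration", True),
    ("segment_b", "climate change", False),
]

def emotional_response_risk(rows):
    """Return, per (segment, issue), the share of observed posts that drew an emotional response."""
    counts = defaultdict(lambda: [0, 0])  # (segment, issue) -> [emotional, total]
    for segment, issue, reacted in rows:
        counts[(segment, issue)][1] += 1
        if reacted:
            counts[(segment, issue)][0] += 1
    return {key: emotional / total for key, (emotional, total) in counts.items()}

for (segment, issue), risk in sorted(emotional_response_risk(observations).items()):
    print(f"{segment:10s} {issue:18s} risk={risk:.2f}")
```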

Mr. Kettler said DARPA wanted to avoid duplicating the private sector’s tools built for marketing. He said the agency intends to make its psychographic algorithms available for widespread use.

“The various algorithms for analyzing, for looking at aggregate social media data and saying, ‘There seems to be these psychographic attributes involved,’ that’s the kind of stuff that we hope to be making available, and that could be incorporated into a variety of different kinds of tools by different commercial customers, academic researchers, so forth, could take advantage of those tools,” Mr. Kettler said.  

DARPA is relying on an array of companies, institutions and laboratories to develop the government’s capabilities. Alongside researchers from the University of Illinois and the University of Southern California, participating teams include representatives from Smart Information Flow Technologies, Protagonist Technology, Uncharted Software, the University of Maryland Applied Research Laboratory for Intelligence and Security, and Lockheed Martin Advanced Technology Laboratories. Mr. Kettler joined DARPA in 2019 from Lockheed Martin.

The government agencies that may use these tools have not been selected and are expected to emerge as the program unfolds. Mr. Kettler said the tools are not solely for the Department of Defense. He noted that the “entire government” has concerns about foreign influence, and he pointed to the State Department as particularly focused on the issue.
