Opinion: Your update on the war against information saboteurs in American politics

A view of the four-story building known as the "troll factory" in St. Petersburg, Russia, in February. The U.S. government alleges the Internet Research Agency began interfering in U.S. politics as early as 2014, continuing through the 2016 presidential election, and that the agency was funded by St. Petersburg businessman Yevgeny Prigozhin. AP/Naira Davlashyan


Up in South Carolina, Darren Linvill is hoping for a quiet weekend. But hope can be a forlorn thing these days.

“I’m neck deep,” he said over the phone. “We have learned so much since the last time we spoke.” It was late afternoon, and CBS News was due to call at 5:30 p.m.

Last Sunday, Linvill was still helping Washington Post reporters piece together an article on two dozen or so fake — and now suspended — Twitter accounts purporting to be Black supporters of President Donald Trump.

Linvill had found a trace of the Cyrillic alphabet, used in Russian and other Slavic languages, in online records of the "digital blackface" accounts. One had previously been used to advertise an escort service in Turkey.

Another, called @CopJrCliff, claimed to be the account of a Black police officer in Pennsylvania, a crucial swing state in the presidential election. Its profile picture was the purloined image of an African American officer in Portland, Ore., who had risen to prominence during the unrest there.

The @CopJrCliff account supposedly dates to 2017 — but first became active on Oct. 6, an important date in the history of the Russian disinformation effort to disrupt American politics.

On that day in 2016, we were at the height of the presidential contest between Trump and Hillary Clinton. The tawdry “Access Hollywood” tape was about to surface. Wikileaks would soon dump a first batch of emails hacked from the personal account of John Podesta, the chairman of the Clinton presidential campaign.

On that Oct. 6, a Russian troll factory reached a crescendo, pumping out 18,000 Twitter messages in a single day — about a dozen every minute. All aimed at us and the hearts we wore on our sleeves that year, for these Twitter crows to peck at.

This Russian surge went undocumented for nearly two years.

And then two Clemson University researchers had a thought while downing a couple of beers and playing a now highly inappropriate board game called "Pandemic."

A few days earlier, in June 2018, the U.S. House Intelligence Committee had released a list of 3,841 Twitter handles associated with that round-the-clock troll factory: the Internet Research Agency, located in St. Petersburg and affiliated with the Russian government.

Linvill, a professor of communications, and his friend Patrick Warren, an economist, realized that Clemson University had something no one else did — an almost complete record of all Twitter messages ever sent.

They found close to 3 million Twitter messages that had been processed in that Russian boiler room — and the Oct. 6, 2016, surge. "We were able to use that data set to really understand the strategy and tactics of the IRA — and by extension, others that have mimicked them," Linvill said.

The last two years have been more than a rush. The Senate Intelligence Committee has cited their work. Linvill and Warren now have security clearances, and a spokesman for U.S. Army Cyber Command in Augusta has confirmed they are making use of the pair’s research.

Resources for that troll factory in St. Petersburg have apparently increased. It’s moved to a bigger building in a better part of town, across from a Volvo dealership. But according to the Washington Post, U.S. Army Cyber Command was able to turn off the IRA’s access to the Internet in the days leading up to the 2018 midterm elections.

And then Linvill and Warren helped CNN track down disinformation operations the Russians had outsourced to Ghana and Nigeria.

Linvill and Warren now do their disinformation research under the auspices of the Clemson University Media Forensics Hub, which was recently funded with a four-year grant. They have grad students in their employ.

A team of undergrads has discovered a hundred or so accounts, perhaps connected to a state-affiliated Shia leader in Iran, that are directing Twitter messages in English in our direction. "They're talking a lot about Black Lives Matter in a manner that makes American culture look really awful," Linvill said.

Their operation is also partnering with the Commission on Presidential Debates, monitoring online conversations around the events — which were placed on pause this week.

This is where the job gets difficult — discerning between legitimate and illegitimate conversations.

“The Libertarians organized, saying, ‘We want Jo Jorgensen on the debate stage.’ Totally legitimate,” Linvill said. “But we also found some accounts that were organized that may have been from some foreign nation. Maybe Cuba, but definitely dishonest — stealing people’s Facebook profiles and that kind of thing. We worked with Twitter to have those accounts suspended.”

Two years ago, Linvill and Warren described an operation that sought to amplify the voices of America’s extremes — more often retweeting our own dispiriting messages than creating their own. “Real users are not only the targets of disinformation, they are the tools of it as well,” the pair wrote for Harvard University’s Misinformation Review.

And that is still happening. In 2019, Russian accounts were giving full voice to Bernie Sanders in the Democratic presidential primary “and attacking whatever moderate candidate happened to be in the lead at the moment — which was consistently Joe Biden,” Linvill said.

But foreign trolls have also learned to put on a friendlier face, building a following with each account, before getting down to the business of division. In 2018, @PoliteMelanie sent out a message to her 20,000 or so Twitter followers: “Criticizing Trump in a book is just unfair. It’s like criticizing the Amish on television.”

The Chicago Tribune named it the newspaper’s “Tweet of the Week.” PoliteMelanie, of course, was a Russian troll.

This isn’t going to stop. Russia, along with many other nations, has decided that it’s in its interest to help us be mean to one another. It’s effective, and as cheap as a laptop with wireless access. So far, we’ve been happy to help them.

I asked Linvill what might be in store for us in the next two weeks. He didn’t know. “We’re certainly seeing disinformation now,” Linvill said. “It’s probably something you can’t fully understand in real time.”

And after Nov. 3? “I’ll probably be spending my Christmas vacation delving into Facebook data,” he said.
