As you may have heard by now, the US Department of Justice (DOJ) has issued an indictment alleging a Russian influence operation in which Russian employees of RT (which is banned in the U.S. and several other countries) secretly paid right-wing influencers $10M via Tenet Media, a company owned and controlled by Lauren Chen and her husband Liam Donovan. Kostiantyn Kalashnikov and Elena Afanasyeva are alleged to have used fake personas and shell companies to manage the operation, with the intent, as stated by the DOJ, “to amplify domestic divisions in the United States.”
There’s been no evidence presented that the influencers—like Benny Johnson, Tim Pool, and Dave Rubin—were aware of where these funds were coming from.
The indictment, for now, charges the two Russian nationals with "conspiracy to violate the Foreign Agents Registration Act (FARA) and conspiracy to commit money laundering."
The Justice Department has also cracked down on 32 internet domains that were designed to mimic news outlets like the Washington Post and Fox but were filled with fake Russian stories meant to be disseminated via social media.
Fake websites and fake news are tools that have been used extensively over the years, so this is not particularly surprising.
Neither is this influence operation, which is unlikely to be the only one.
What has surprised me about some of the reactions—aside from the immediate stance that anything the DOJ pursues is false and meant to hurt Republicans—is the indifference. Some view this, somehow, as an infringement on free speech, while others simply don’t care that Russia (or other players) is attempting to manipulate narratives.
If they would like to listen to the Russian perspective, that’s one thing. It’s not illegal for any of these influencers to state their views and there are plenty of sources that present the Russian POV. But in this case, it’s the deception that’s problematic.
Wouldn’t you want to know who is the actual source of the information and opinions you’re getting? Shouldn’t you desire transparency?
There are legitimate criticisms to be made of the mainstream media, but so many of the same people who voice grievances over coverage they find biased or misleading somehow don’t mind literal propaganda and disinformation being pushed at them by an entity other than the one they thought it came from?
Propaganda is defined as information, often misleading or false, used to promote a political cause or point of view.
While some propaganda is completely fabricated, the most effective, I find, tends to use grains of truth. It’s true that people are upset about immigration, the economy, culture wars, and so on. But when you disproportionately magnify certain narratives while excluding others, you’re building up a false perception of reality.
If all I was ever exposed to on my feed were videos of violence across the US, I’d think it a thoroughly dangerous country and would be terrified to ever visit. The reality is that there are areas with violence as well as many safer ones. To properly assess whether a place is safe, I’m better off looking at crime statistics and whether they’ve gone up or down. But if, over time, I see, say, videos of violent encounters repeated over and over, I get the sense that the problem is far greater than it actually is. It’s like a funhouse mirror that scales up and amplifies whatever is being pushed.
This is a particularly well-documented issue with television news and seniors. As the tagline goes, “if it bleeds, it leads”—television broadcasts aren’t known for doing too many “feel good” stories. Seniors, who are generally less mobile, tend to consume more TV news and can become fearful, even if where they reside is perfectly safe. It has even been shown that they leave home less as a result of the fear-mongering. They are responding to a skewed perception of the world.
The same applies to narratives we find online that are further boosted by foreign entities.
If I’m regularly getting content on my feed showing, say, trans people harassing people in bathrooms, I might get the impression that it happens a lot. And yet, while it does present an issue at times, it is rather rare rather than systemic. But repeated exposure to these stories will tend to influence how you view certain policies and trans people generally.
I’ve watched both in horror and awe at how Russian disinformation spread around the war in Ukraine. Being fluent in Russian, I could track certain stories to their point of origin and could also observe the differences between the narratives spun for Russia’s own population and those aimed at the West. (Hint: they are radically different.)
In both the Ukraine and Gaza conflicts, I also saw how easily narratives were twisted simply by sticking a misleading headline on top of a real photo or video. And it was nearly impossible to spread a correction, despite irrefutable evidence that these claims were false.
That is troubling, and I’m hoping that perhaps in the future AI can be used as a tool for spotting false claims.
We don’t know the extent of foreign influence, but Russia is far from the only player. For example, Iranian government-funded Press TV has paid "thousands of dollars to Wyatt Reed, a writer who is now a Washington-based editor for the online publication Grayzone." A number of their writers have also worked for Russian state media.
Meanwhile, Newsmax is funded in part by $50M of investment from Qatar, which is also an investor in X and has attempted to buy the Daily Mail in the UK.
China uses “Spamouflage” to impersonate North American users to influence political debates, create chaos, and promote their interests.
Canada’s Rapid Response Mechanism even detected an influence operation that was targeting Canadian Members of Parliament—and even the Prime Minister and leader of the Official Opposition—with disinformation. The bot networks spread claims like the Hawaiian wildfires being caused by a secret US military weapon, and that a “critic of the Chinese Communist Party (CCP) in Canada had accused the various MPs of criminal and ethical violations.”
During the pandemic, China worked overtime to confuse the West with a disinformation campaign designed to downplay the virus’s origins, spread false claims that the U.S. military introduced the virus to China, and highlight how well China had managed the crisis.
While the Mueller Report, published in March 2019, did not establish that the Trump campaign conspired or coordinated with the Russian government in its election interference activities, it did confirm Russian interference in the 2016 election to favor him. This was conducted by Russia’s Internet Research Agency, which created thousands of fake accounts across social media platforms, targeting US citizens with divisive content related to race, immigration, gun control, and political issues—as well as fake content.
Do we know what exact impact it had on the outcome of the election? It’s hard to measure. But to deny that it had any impact at all is to deny that ANY information affects us.
Russia has also sought to reduce trust in Canada’s energy sector by promoting narratives about environmental damage caused by Canadian oil sands, in order to undermine it as a competitor. The campaigns involved fake social media posts, misleading articles, and promoting divisive discussions around energy policies.
In 2018, Iran spearheaded an operation known as "Endless Mayfly" which impersonated legitimate news outlets to disseminate fake stories aimed at discrediting U.S. and Israeli policies. Similarly, in 2020, Iranian actors posed as members of the far-right group "Proud Boys" to send threatening emails to Democratic voters in the U.S. to influence the presidential election.
As you can see from the examples above, the spread of disinformation is not just an abstract concern. And if it were as ineffective as many like to claim, countries wouldn’t be investing so heavily in their influence campaigns.
Whether it’s Russia paying influencers to amplify divisive narratives, China using fake personas to sway opinions, or Iran impersonating another group, these efforts exploit our open exchange of ideas, undermining our ability to make decisions based on an accurate perception of reality.
Diverse viewpoints are important for a functioning democracy or constitutional republic, but it is critical for us to ensure that there is transparency as to where information and viewpoints are coming from.
If we see our social media feeds flooded with people claiming that sushi is bad for you because they grew a green tail after consuming it, we can’t reliably deduce that this is a common phenomenon—not when such claims come from anonymous avatars or potentially fake profiles, or even just a few real people heavily amplified by algorithms and bots to favor this narrative.
This is a great vulnerability that we will continue to grapple with for years to come. How do we understand the reality of what is happening in the world and what REAL people are actually talking about when it might all be a crafted illusion?
These foreign influence operations will only grow more sophisticated, which is why we need to understand who is behind the information we consume and critically evaluate the narratives presented to us. And as always: Verify before sharing.
☕️ By popular request, you can also support my work by making a one-off donation via Buy Me a Coffee.
Order my book, No Apologies: How to Find and Free Your Voice in the Age of Outrage―Lessons for the Silenced Majority —speaking up today is more important than ever.
NOTE TO READERS:
Thank you for keeping me company. Although I try to make many posts public and available for free access, to ensure sustainability and future growth—if you can—please consider becoming a paid subscriber. The more paid subscribers I have, the more time I’ll have to work on new essays.
In addition to supporting my work, it will also give you access to an archive of member-only posts. And if you’re already a paid subscriber, THANK YOU!
1> Sushi is bad for you :) I know because I have a green tail🤣
2> Impersonating a news outlet or a person is illegal just about everywhere. Those entities engaging in this activity should be shut down/out where possible and prosecuted if possible.
3> Beyond that, there is really not much a free society can do to limit disinformation without becoming an Orwellian dictatorship.
4> It is up to the consumer to engage their brain and do even rudimentary investigation of everything they consume before coming to their own conclusion on the matter. Will everyone do that? Of course not, but to try and protect those who do not first investigate from disinformation is IMHO a wide road leading to 1984.
Lmao I don't give one shit about this. Better to be influenced by Putin than Soros or Fink.