US Agency Accuses Russia of Training 1,000 Latin American Influencers

On April 17, 2026, Argentina’s El Ciudadano reported that Russian intelligence services have trained over 1,000 Latin American journalists and social media influencers in disinformation tactics aimed at destabilizing democratic institutions across the region. The allegations, corroborated by cybersecurity firms tracking coordinated inauthentic behavior networks, point to a systematic effort by Moscow to exploit political polarization ahead of key elections in Brazil, Mexico, and Colombia later this year. This operation represents a significant escalation in Russia’s global information warfare strategy, leveraging local voices to amplify narratives that undermine trust in electoral processes, fuel anti-Western sentiment, and create openings for geopolitical realignment—particularly benefiting China and Iran in resource-rich regions critical to global supply chains.

Here is why that matters: when foreign actors weaponize information ecosystems to erode democratic legitimacy, the ripple effects extend far beyond ballot boxes. Disinformation campaigns targeting Latin America’s energy and mining sectors—already volatile due to commodity price swings and resource nationalism—can trigger capital flight, delay foreign direct investment, and disrupt lithium, copper, and rare earth supply chains essential for the global green transition. With over 60% of the world’s lithium reserves located in the so-called “Lithium Triangle” of Argentina, Bolivia, and Chile, any perception of instability invites speculative pressure on markets and complicates long-term planning for automakers and tech firms reliant on ethical sourcing. Successful influence operations weaken regional cohesion within blocs like Mercosur and the Pacific Alliance, making coordinated responses to climate migration, drug trafficking, and cyber threats far more difficult.

The timing of this revelation is no accident. Earlier this week, NATO’s StratCom Centre released a report detailing how Russian military intelligence (GRU) units have adapted Cold War-era active measures for the digital age, using local intermediaries to bypass platform detection systems. “What we’re seeing is not just propaganda—it’s a form of cognitive occupation,” warned Dr. Fiona Hill, former senior director for European and Russian affairs at the U.S. National Security Council, in a briefing to the Atlantic Council on April 15. “By training influencers who speak the language, understand the culture, and live in the communities they target, Moscow achieves deniability while maximizing impact. It’s asymmetric warfare designed to make societies question their own realities.”

This approach marks a deliberate shift from overt military posturing to subtle, persistent erosion of societal trust—a tactic honed in interventions from Ukraine to the Sahel. In 2022, a similar network was uncovered in Nigeria, where Russian-linked accounts amplified ethnic tensions to disrupt elections. By 2024, analogous operations had surfaced in Indonesia and the Philippines, targeting mining permits and maritime boundary disputes. Now, Latin America becomes the latest theater. What distinguishes this campaign is its scale and sophistication: according to Graphika’s April 2026 analysis, over 80% of the trained influencers operate across multiple platforms—TikTok, YouTube, and encrypted apps like Telegram—using coordinated hashtags and AI-generated deepfakes to simulate grassroots outrage. One cluster, traced to a server farm in St. Petersburg, pushed false claims about U.S. funding of coups in Venezuela and Bolivia, generating over 12 million impressions in just 72 hours.

But there is a catch: attributing these operations directly to the Kremlin remains legally complex, even as technical evidence mounts. Unlike conventional military actions, information campaigns operate in a gray zone where attribution requires painstaking digital forensics, and responses are constrained by free speech protections. Still, the strategic intent is clear. As José Miguel Vivanco, former director of the Americas division at Human Rights Watch, noted in a recent interview with El País: “Russia isn’t trying to install a puppet regime in Lima or São Paulo. It’s trying to make democracy seem so chaotic, so corrupt, so unworkable that authoritarian alternatives begin to seem appealing—not just to locals, but to observers worldwide who are already losing faith in liberal order.”

The global macroeconomic implications are profound. Foreign investors already wary of regulatory unpredictability in emerging markets now face an added layer of risk: the potential for sudden, manufactured social unrest triggered by viral disinformation. A 2025 study by the Brookings Institution found that countries experiencing high-intensity information attacks saw an average 15% increase in sovereign bond spreads within six months, as markets priced in heightened instability. For multinational corporations, this means reevaluating not just political risk insurance but also crisis communications strategies: when a fake video of a minister accepting bribes goes viral, the damage to shareholder value can occur before any fact-check is published.

To contextualize the scale of this influence operation, consider the following comparison of state-backed disinformation efforts across key regions:

| Region | Estimated Trained Assets | Primary Platforms Used | Observed Objectives |
| --- | --- | --- | --- |
| Latin America (2025-2026) | 1,000+ journalists/influencers | TikTok, YouTube, Telegram | Electoral disruption, anti-Western narratives, resource nationalism |
| Sub-Saharan Africa (2023-2024) | 600+ | Facebook, WhatsApp, radio | Election interference, anti-French sentiment, military coup justification |
| Southeast Asia (2022-2023) | 450+ | Twitter/X, Telegram, local forums | Maritime dispute amplification, mining permit opposition, China-blaming narratives |
| Eastern Europe (2022-present) | 2,200+ | Telegram, VKontakte, Odnoklassniki | War justification, refugee crisis exploitation, NATO skepticism |

This data underscores a pattern: Russia’s information campaigns are not isolated incidents but part of a layered, adaptive strategy designed to exploit regional vulnerabilities while avoiding direct confrontation. The goal is not to win battles on the ground but to shape the perception of who is winning—and in doing so, weaken the cohesion of democratic alliances that have underpinned the post-1945 order.

What comes next depends on how democracies respond. Platform accountability must evolve beyond reactive takedowns to include proactive detection of coordinated inauthentic networks using behavioral AI. Governments need to invest in media literacy programs that teach citizens not just to spot lies, but to understand why they are being told—and who benefits. And crucially, democratic leaders must avoid overreacting with censorship, which plays directly into the Kremlin’s narrative of Western hypocrisy. As the Atlantic Council’s Digital Forensic Research Lab concluded in its 2026 Global Disinformation Index: “The best defense against information warfare is not silencing voices, but strengthening the resilience of the information ecosystem itself.”

This moment demands more than outrage—it demands clarity. As we navigate an era where truth is contested not in trenches but in timelines, the question is not whether foreign powers will try to manipulate our perceptions. It’s whether we, as societies, have the institutional immunity, the cultural discernment, and the collective will to see through the fog—and still choose to govern ourselves.

Omar El Sayed - World Editor
