The Paradox of News Value
It is a well-established finding in media research that the impact of news diminishes dramatically the second time something similar happens. This phenomenon became notably visible during the Vietnam War, when the Tet Offensive in January 1968 received massive media attention, while subsequent operations of similar scale that same year went largely unnoticed. This desensitization effect is at the heart of what media researchers call compassion fatigue: a gradual weakening of empathy that occurs when people are repeatedly exposed to trauma and crisis.
For the public, this means the perception of reality may be systematically distorted. As the media and audiences collectively tire of particular types of news, those topics fall off the agenda, regardless of their actual importance. Researchers point out that news selection is based not only on time, space, and cultural proximity, but also on sensationalism and the unexpected. When something becomes routine, it loses its newsworthiness—even if its consequences remain dire.
Trump as Master of Attention
Donald Trump has for nearly a decade shown how this media desensitization can be systematically exploited. His strategy is simple yet effective: by constantly generating new controversies and statements, he forces the media to chase the next sensation rather than investigate substantive issues.
A case in point: On October 17, 2025, Trump met Ukrainian President Volodymyr Zelenskyy at the White House to discuss supplying Ukraine with Tomahawk missiles. Yet other events dominated the news cycle that same day: Trump’s commutation of the sentence of former Congressman George Santos, who had been convicted of fraud, and confirmation of a US strike on a suspected drug vessel near Venezuela.
Trump understands that both traditional and social media platforms reward engagement—and nothing drives engagement like provocation and conflict.
European News Fatigue and Refugee Crises
Europe has seen clear examples of how compassion fatigue shapes both policy and society. The initial goodwill towards Syrian refugees in 2015–2016 has gradually given way to what researchers call “symptoms of exhaustion”: a combination of emotional fatigue and political polarization. When Ukrainian refugees began arriving in 2022, the EU activated its Temporary Protection Directive for the first time, but within a year, extensive survey research indicated that public opinion was shifting toward compassion fatigue.
Recent data from the Migration Policy Institute demonstrates how compassion fatigue continues to shape refugee policy. In the UK, support for refugees has fallen: 49% now favor completely closed borders, up from 44% the previous year. Notably, 37% of those who support taking in refugees also advocate border closures, a contradiction driven by skepticism: 62% of Britons believe most refugees are in fact economic migrants.
This desensitization is not merely a psychological reaction; it has concrete political consequences. In Turkey’s 2023 election, promises to “send Syrians home” became a central campaign theme, while European nations are seeing growing support for anti-immigration parties. The dynamic can also be weaponized: once the public becomes aware that news can be false, it may begin to doubt even truthful information.
The Information Weapons of Hybrid Warfare
In this environment, hybrid warfare becomes especially dangerous. Russia and other authoritarian states have begun systematically exploiting the West’s open information landscape and democratic vulnerabilities through what scholars describe as sophisticated information operations. The Swedish Armed Forces describe hybrid operations as actions that harm Sweden through disinformation, sabotage, and infiltration, aiming to undermine trust and create division. On October 15, 2025, a senior NATO official confirmed that “hybrid warfare has begun”: NATO countries are facing increased hybrid attacks that combine cyberattacks, disinformation, and drone interference.
The October 2025 elections in Moldova offer a clear example of Russia’s systematic hybrid operations. The Kremlin invested extraordinary resources to influence the election, blending financial manipulation, disinformation, and potential violence. Especially alarming is the use of AI-driven disinformation systematically targeting teenagers with false narratives, which may be the first known instance of Russian operators tailoring messages specifically for youth. Germany now formally classifies state- or proxy-controlled disinformation as a hybrid threat.
After the 2024 EU elections, extensive disinformation campaigns were identified that exploited this very desensitization effect. By flooding the information landscape with contradictory messages—what researchers call the “firehose of falsehood”—they create a sense that truth is impossible to establish. When everything can be false, nothing seems true, and the public retreats to cynicism and apathy.
In Sweden, Russia has reportedly systematically worked since the 1990s to divide, create chaos, and reduce trust in society and between people. Methods include everything from spreading conspiracy theories about the Swedish social services supposedly seizing Muslim children to cyberattacks on critical infrastructure.
The Amplification Effect of Algorithms
Social media platforms’ algorithms make the problem worse. Research shows these algorithms prioritize content that drives engagement, and negative news routinely generates more interaction than positive. This creates a vicious cycle where sensational and polarizing messages have a disproportionate impact, while nuanced reporting drowns in the noise.
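The feedback loop described above can be sketched as a toy model. This is a hypothetical illustration with invented field names and weights, not any platform’s actual ranking code: posts are ordered purely by predicted engagement, with negativity acting as a multiplier on the score.

```python
# Toy sketch of engagement-weighted feed ranking (hypothetical model,
# not a real platform's algorithm). It assumes, per the research cited
# in the text, that negative content draws more predicted interactions.

def engagement_score(post):
    """Rank purely by predicted interactions; accuracy is not a factor."""
    base = post["predicted_clicks"] + 2 * post["predicted_shares"]
    # Negativity modeled as a simple multiplier in [1.0, 2.0].
    outrage_boost = 1.0 + post["negativity"]  # negativity in [0, 1]
    return base * outrage_boost

posts = [
    {"title": "Nuanced policy analysis", "predicted_clicks": 100,
     "predicted_shares": 10, "negativity": 0.1},
    {"title": "Outrage headline", "predicted_clicks": 80,
     "predicted_shares": 40, "negativity": 0.9},
]

feed = sorted(posts, key=engagement_score, reverse=True)
print([p["title"] for p in feed])
```

In this toy ordering, the outrage post outranks the analysis despite fewer predicted clicks, illustrating how nuanced reporting loses the algorithmic contest even when more people would read it.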
A systematic review published in Frontiers in Communication in October 2025 analyzed 78 studies on algorithmic influence in news journalism. Results show that algorithmic systems reshape gatekeeping by prioritizing engagement metrics and reframing news value toward “shareability.” Platform business models intensify metrics dependence, limiting investigative depth.
A multidisciplinary study on computational propaganda from September 2025 highlights that “unlike traditional propaganda, computational propaganda operates at scale, with automation, synthetic media, and personalization to undermine trust, manipulate voters, and worsen polarization”. The study warns that generative AI and algorithmic amplification now coincide with critical global elections and rising democratic fragility.
The phenomenon of “doomscrolling”—the compulsive consumption of negative news—leads to what psychologists describe as digital compassion fatigue. When the brain is constantly bombarded with traumatic information, it eventually shuts down as a defense mechanism. The result is emotional desensitization and reduced agency precisely when society most needs engaged citizens.
Paths Forward in the Information War
Combating this crisis in an age of hybrid warfare requires several parallel strategies. First, the media must develop new formats that break through desensitization without fueling sensationalism.
Second, systematic education in source criticism and awareness of how information influence works is essential. The EU’s warnings about disinformation caution that faith in democratic processes is at stake if reliable information cannot be maintained.
In July 2025, the UK House of Lords published a scathing report warning that a “failure to prioritise media literacy in the UK presents a risk to social cohesion and democracy”. The report describes how the UK “is losing ground” compared to other countries and criticizes the government’s incoherent and fragmented policy in the field.
Third, empathy and attention are finite resources that must be consciously managed. This means both media and citizens must actively choose what truly deserves focus, rather than letting algorithms and sensationalism steer the agenda.
Conclusion: A Battle Over Reality Itself
The principles of news selection and human desensitization are vulnerabilities now systematically exploited in hybrid warfare. When Trump turns politics into entertainment, when Russia floods the information landscape with disinformation, and when social media algorithms amplify the most polarizing content, the very ability to perceive and react to reality is at risk.
Addressing this challenge requires not just technical solutions or fact-checking, but a fundamental reevaluation of how we produce, distribute, and consume information. At a time when attention has become the battlefield’s most important front, democracies must develop counterstrategies that both acknowledge human limits and safeguard the meaning of truth. Otherwise, desensitization risks becoming so complete that we lose the capacity to respond even to society’s most urgent threats. Such strategies should also include using verified, AI-generated content to counter disinformation in as close to real time as possible.
Lars-Erik Lundin
