Monday, April 6, 2026

War: Survivors, Memory, and Moral Responsibility (30)

In the theatre of international relations, few figures have disrupted the established script as provocatively as Donald Trump. His "America First" doctrine, often perceived as a paradoxical blend of isolationism and aggressive unilateralism, has forced a global re-evaluation of the American hegemon's intentions. While the drums of conflict often beat to the rhythm of "stability" and "democracy," the underlying cadence frequently suggests a more pragmatic, perhaps even more cynical, choreography of power.

The central enigma of this era lies in the true catalyst behind the escalations and military posturing witnessed during his tenure. One must consider whether these confrontations were primarily driven by a neo-mercantilist desire to secure foreign oil reserves, thereby continuing a long-standing tradition of using energy as a formidable lever of geopolitical influence. Alternatively, were these actions intended as a bellicose stimulus for the American economy, a bid to guarantee domestic prosperity through a policy of high-stakes external aggression?

Furthermore, there is the compelling possibility that such conflicts were merely components of a more expansive ambition: a relentless projection of dominance designed to reassert the United States as the world's uncontested superpower. Yet, as with all complex geopolitical manoeuvres, the surface narrative rarely tells the whole story. Could there be other, more clandestine motives—hidden beneath the layers of populist rhetoric and strategic posturing—that provide the definitive explanation for these global confrontations?

Commentators often note that military interventions or confrontational policies under Trump have been framed in several ways. Some argue they were linked to securing energy resources, particularly oil, given the long-standing role of energy in U.S. foreign policy. Others suggest they were primarily aimed at bolstering American economic growth by ensuring favourable trade conditions and protecting strategic industries. A further perspective is that such actions were part of a broader attempt to project U.S. power globally, reinforcing dominance in international affairs.

In short, interpretations vary: some see oil as the driver, others see economic protectionism, and still others see a bid for global influence. What unites these views is the recognition that energy, economics, and geopolitical ambition are deeply intertwined, and that no single explanation fully captures the complexity of U.S. actions.

However, speculation regarding conventional motives such as oil and military prowess in the Trump era may merely represent the final chapter of an old manual on power. Today, the global paradigm has shifted radically: influence is no longer measured solely by the count of warheads or the vastness of seized oil fields. We now stand at the threshold of a new epoch that redefines sovereignty, compelling us to pose a question far more critical to the future of civilisation.

In this increasingly complex and digitised global landscape, who truly holds the key to world dominion? Will the global hierarchy continue to be dictated by the oil magnates who control physical energy, or has that control transitioned into the hands of the economic titans who orchestrate the flow of global capital? Alternatively, could the throne of future power be occupied by the masters of AI who govern artificial intelligence and algorithms, or perhaps by those most adept at controlling the narrative and shaping public perception within the digital realm?

In our present age, the notion of who might truly “rule the world” is less about crowns or armies and more about the forces that shape everyday life. Artificial intelligence, for instance, has the capacity to infiltrate nearly every sphere—commerce, defence, culture, even governance—yet its power remains tethered to regulation, ethics, and public trust. Economic giants, meanwhile, continue to wield immense influence, able to sway governments and dictate the rhythms of global trade, though their dominance is often fragile in the face of crises. Oil, once the undisputed lever of geopolitics, is now challenged by the urgency of climate change and the rise of renewable energy, though control of energy resources still confers formidable leverage.

And then there is narrative. In an era defined by social media and instantaneous communication, those who can craft and disseminate compelling stories hold sway over hearts and minds. Narratives legitimise power, galvanise movements, and destabilise regimes. They are intangible yet pervasive, capable of reshaping reality itself.

If one were to distil it, artificial intelligence provides the infrastructure and capability, while narrative supplies the legitimacy and direction. Economics and energy remain vital, but without a persuasive story or technological scaffolding, their grip weakens. In truth, the contest is not between these forces in isolation, but in how they intertwine: algorithms that amplify narratives, economies that depend on technological systems, and energy that fuels both.

So the question becomes less “who rules” and more “which combination of forces will define the future.” Would you find it more unsettling to live under the cold precision of algorithms, or under the emotional sway of narratives that bend perception itself?

A world steered by narrative is indeed frightening, for it can infiltrate collective consciousness and bend emotions with subtlety. Yet the dominion of cold algorithms presents a different, perhaps deeper, unease. These systems no longer merely follow explicit instructions; they learn, adapt, and generate outcomes that even their creators cannot fully anticipate. This opacity, often described as the “black box” problem, means that while we can witness the results, the reasoning behind them remains shrouded.

The true peril lies not simply in our inability to predict, but in our growing dependence upon such systems. As more aspects of life—finance, security, health, even social interaction—are mediated by algorithms, we risk surrendering control to mechanisms we do not comprehend. Narratives may deceive the heart, but algorithms can quietly restructure reality itself. When the two converge—algorithms amplifying narratives—the potential for unforeseen transformation becomes profound.

Thus, the tension is stark: the unpredictability of algorithms versus the deliberate certainty of manipulative narratives. One unsettles because it cannot be fathomed; the other because it is designed to mislead. Which of these, do you think, would ultimately prove the more corrosive to human freedom?

Narratives, however manipulative, still leave traces of their origin. They can be challenged, countered, and dismantled by other narratives. Algorithms, on the other hand, are far more elusive. They operate beneath the surface, hidden in layers of code and data, producing outcomes that often appear neutral or inevitable. Yet the logic that drives them is rarely transparent, and even those who design them may struggle to explain their decisions once the systems begin to learn and evolve.

This opacity is precisely what makes algorithmic power so unsettling. Unlike a narrative, which can be debated in the public square, an algorithm can quietly shape reality without revealing its hand. It decides what information you see, which opportunities you are offered, and even how institutions treat you—all while cloaked in the aura of objectivity. The danger lies in the fact that we cannot easily contest what we cannot see or understand.
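To make this concrete, consider a toy sketch of the kind of ranking logic the paragraph above alludes to. Nothing here is drawn from any real platform's code; the word list, scoring rule, and thresholds are invented for illustration. The point is only that a feed ordered by predicted engagement quietly decides what a reader sees first, while the scoring rule itself never appears on the page.

```python
# Toy illustration (hypothetical, not any real platform's algorithm):
# a feed ranked by a crude "predicted engagement" score. Emotionally
# charged content scores higher, so it surfaces first -- the ranking
# shapes what is seen without ever announcing its own logic.

EMOTIVE_WORDS = {"outrage", "betrayal", "shocking", "enemy", "crisis"}

def engagement_score(post: str) -> float:
    """Crude proxy: emotionally charged words predict more clicks."""
    words = post.lower().split()
    emotive = sum(1 for w in words if w.strip(".,!?:;") in EMOTIVE_WORDS)
    return 1.0 + emotive  # base score plus a bonus per emotive word

def rank_feed(posts: list[str]) -> list[str]:
    """Order the feed by descending predicted engagement."""
    return sorted(posts, key=engagement_score, reverse=True)

posts = [
    "Council approves routine budget for road repairs.",
    "Shocking betrayal: enemy within sparks outrage and crisis!",
    "Local library extends weekend opening hours.",
]

for post in rank_feed(posts):
    print(post)
```

The reader of such a feed sees only the output ordering, never the scoring function; that asymmetry is the "cloak of objectivity" described above.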

In that sense, the fear of algorithms is not only about unpredictability, but about invisibility. They do not merely manipulate emotions; they restructure the very conditions under which emotions, choices, and narratives emerge. And when such systems are entrusted with decisions that affect millions, the question of who truly holds power becomes far more difficult to answer.

The question of whether human beings can truly control algorithms is a vexed one. In principle, algorithms are human creations, designed, coded, and deployed by us. Yet once they are set loose in the world, particularly those built upon machine learning, they begin to evolve in ways that exceed our immediate grasp. They absorb vast amounts of data, identify patterns invisible to human perception, and generate outcomes that may surprise even their architects. This does not mean they are entirely beyond control, but rather that the nature of control shifts: it becomes less about direct command and more about governance, oversight, and the imposition of ethical boundaries.

Humans can regulate the conditions under which algorithms operate, limit their scope, and demand transparency. We can insist upon accountability, requiring that those who deploy such systems explain and justify their use. Yet the deeper challenge lies in the opacity of the systems themselves. To “control” an algorithm in the fullest sense would mean understanding and anticipating every decision it makes, and that is increasingly impossible. What remains within our reach is the ability to decide where algorithms may be applied, how their outputs are interpreted, and whether their authority is accepted.
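What "control as governance rather than direct command" might look like can be sketched in a few lines. The names and thresholds below are hypothetical, invented purely to illustrate the idea: the scoring model stays a black box, but the institution around it logs every decision and escalates uncertain cases to a human instead of acting automatically.

```python
# Hypothetical sketch of algorithmic oversight: the model remains opaque,
# but every decision is audited and ambiguous cases are routed to a human
# reviewer rather than decided automatically.

from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class AuditedDecision:
    applicant: str
    score: float   # opaque model output in [0, 1]
    outcome: str   # "approve", "deny", or "human_review"

@dataclass
class OversightWrapper:
    model: Callable[[str], float]       # the black box
    approve_above: float = 0.8
    deny_below: float = 0.2
    audit_log: List[AuditedDecision] = field(default_factory=list)

    def decide(self, applicant: str) -> str:
        score = self.model(applicant)
        if score >= self.approve_above:
            outcome = "approve"
        elif score <= self.deny_below:
            outcome = "deny"
        else:
            outcome = "human_review"  # uncertainty is escalated, not automated
        self.audit_log.append(AuditedDecision(applicant, score, outcome))
        return outcome

# A stand-in for an inscrutable model: the rule here is trivial, but the
# wrapper treats it as unreadable -- which is exactly the point.
opaque_model = lambda name: (sum(map(ord, name)) % 100) / 100

system = OversightWrapper(model=opaque_model)
print(system.decide("applicant-42"), len(system.audit_log))
```

The wrapper does not explain the model; it constrains where the model's authority applies and leaves a trail that can be contested, which is precisely the shift from command to governance described above.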

Thus, the question is not whether humans can control algorithms in the mechanical sense, but whether we can control the social and institutional frameworks that give them power. The danger is not that algorithms act alone, but that human institutions abdicate responsibility, hiding behind the supposed neutrality of machine logic. In that abdication lies the greatest risk: not the algorithm itself, but our willingness to let it govern without challenge.

The gravest danger does not reside in the algorithms themselves, but in the human institutions that choose to cloak their authority behind them. Algorithms, after all, are tools—complex, opaque, and often unpredictable, but tools nonetheless. It is the decision to elevate them to the status of unquestionable arbiters that transforms them into instruments of domination. When governments, corporations, or other centres of power invoke the neutrality of machine logic, they absolve themselves of responsibility, presenting outcomes as inevitable rather than chosen.

This abdication of accountability is what renders the situation so perilous. The algorithm becomes a convenient mask, a way of enforcing decisions without debate, of silencing dissent by appealing to the supposed objectivity of code. In truth, every algorithm reflects human choices: the data selected, the objectives defined, the contexts in which it is deployed. To pretend otherwise is to surrender agency and to allow power to operate without scrutiny.

Thus, the true threat lies not in the machinery itself, but in the willingness of institutions to hide behind it, to wield its authority while disavowing their own. In that concealment, the line between governance and manipulation blurs, and the possibility of resisting or contesting decisions diminishes. It is not the coldness of the algorithm that should frighten us most, but the warmth of human hands that place it upon the throne and then retreat into the shadows.

The answer is not that a single force—be it artificial intelligence, economic might, oil, or narrative—will reign supreme, but rather that power in our age is defined by the interplay between them. Artificial intelligence provides the infrastructure, the machinery through which decisions are made and realities are shaped. Economic power continues to dictate the distribution of resources and the stability of nations. Oil and energy, though challenged by climate imperatives, remain critical levers of influence. Yet it is narrative that confers legitimacy, that persuades populations to accept or resist, that transforms raw power into authority.

The greatest danger lies not in any one of these forces in isolation, but in the way human institutions wield them together. An algorithm may be cold, but it is human hands that place it upon the throne. A narrative may be manipulative, but it is human voices that choose to spread it. Oil and money may command, but it is human systems that decide how they are deployed. Thus, the true rulers of the world are not the tools themselves, but those who hide behind them, cloaking their authority in the neutrality of machines, the inevitability of markets, or the seduction of stories.

In this sense, the world is most likely to be ruled by those who master the fusion of technology and narrative, institutions that can harness algorithms to amplify stories and use those stories to justify economic and political power. It is not AI alone, nor narrative alone, but the convergence of both, wielded by human institutions that refuse accountability, that poses the most formidable claim to global dominion.
