[Radar figure "1. Anticipation Potential", showing four domains: 1. Quantum Revolution & Advanced AI (Advanced Artificial Intelligence; Quantum Technologies; Brain-inspired Computing; Biological Computing); 2. Human Augmentation (Cognitive Enhancement; Human Applications of Genetic Engineering; Radical Health Extension; Consciousness Augmentation); 3. Eco-Regeneration & Geo-Engineering (Decarbonisation; World Simulation; Future Food Systems; Space Resources; Ocean Stewardship); 4. Science & Diplomacy (Complex Systems for Social Enhancement; Science-based Diplomacy; Innovations in Education; Sustainable Economics; Collaborative Science Diplomacy)]

Invited Contribution:

Digitalisation of Conflict

Digital technologies play a twofold role in conflict. On the one hand, they extend power politics into the poorly regulated domain of global data exchange, where they are used to exploit and exacerbate existing political tensions, posing significant risks and potential harms to individuals, communities and businesses across the globe. On the other hand, they can contribute to a better understanding and monitoring of conflict and, if accompanied by the necessary political momentum, agency and effort, to preventing and even resolving various sorts of conflict, thus mitigating those harms. Looking to the future, it is unclear whether the institutions traditionally tasked with developing norms for international peace, security and stability, and the relevant structures, tools and processes, will suffice to understand and manage the challenges emerging around digital technologies, including the escalatory potential of certain uses, or whether the world needs alternative governance structures.

Many parts of the technological infrastructure, algorithmic systems and data flows that we rely on every day can be exploited for criminal or political purposes. The greater society's reliance on the uninterrupted, trustworthy operation of digital technologies for essential services and other functions deemed crucial to our collective well-being, the higher the disruptive potential. While the world has yet to see a large-scale destructive global attack on critical infrastructure, cyber operations by non-state, semi-state and state actors cause minor to major disruptions on an almost daily basis. Beyond hacking into technical systems to extract data or cause other types of damage, algorithmic technologies are being deployed to influence opinion, to spread distrust and to undermine faith in traditional socio-political structures. Understanding the significance of this, the roles of international actors and how manipulation techniques will change in the future is a major challenge. At stake are trust and trustworthiness, which, already under significant pressure from non-technological drivers, have become badly eroded in many communities, increasing extreme polarisation. Undoubtedly, different uses of digital technologies are contributing to this growing trust deficit, which continues to weaken some of the fundamental pillars of social cohesion and civic order.

On the other hand, digital technologies, when used in support of existing human analytical capabilities, have proven effective in enabling an enhanced understanding of certain forms of conflict, and have significant potential for early warning and for monitoring peace agreements and similar arrangements in the event of armed conflict. Computational social scientists have begun to simulate conflicts at local and regional levels using agent-based models in order to develop more sophisticated early-warning systems. These models capture the simultaneous interactions and movements of multiple actors and simulate the complex behaviour that emerges. This has been made possible by a number of technological developments: a rapid increase in computing power; improved computational simulation methods and techniques such as machine and deep learning; the development of intelligent agents; and the availability of ever-larger datasets on which to train these systems across a multitude of conflict vectors. Together, these have created a significant opportunity to model broader social, political and economic systems and to study conflict at unprecedented scale and resolution. However, issues of data access and quality, and the complexity of modern conflicts involving a broad range of international and external actors, make real and impactful progress in digital conflict modelling difficult.
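As a toy illustration of the agent-based approach described above (all names, parameters and dynamics here are invented for this example and are not drawn from any real early-warning system), a minimal sketch might simulate pairwise interactions between agents of two groups and track an emergent hostility indicator:

```python
import random

class Agent:
    """Hypothetical agent with a group affiliation and a grievance level."""
    def __init__(self, group):
        self.group = group                  # group affiliation (0 or 1)
        self.grievance = random.random()    # initial grievance in [0, 1]

def step(agents, escalation=0.05, decay=0.02):
    """One simulation step: random pairwise interactions.

    Cross-group contact between already-aggrieved agents raises grievance
    (escalation); all other interactions let grievance slowly decay.
    Returns the share of hostile interactions this step.
    """
    hostile = 0
    random.shuffle(agents)
    for a, b in zip(agents[::2], agents[1::2]):
        if a.group != b.group and (a.grievance + b.grievance) > 0.5:
            a.grievance = min(1.0, a.grievance + escalation)
            b.grievance = min(1.0, b.grievance + escalation)
            hostile += 1
        else:
            a.grievance = max(0.0, a.grievance - decay)
            b.grievance = max(0.0, b.grievance - decay)
    return hostile / (len(agents) // 2)

random.seed(1)
agents = [Agent(group=i % 2) for i in range(200)]
hostility = [step(agents) for _ in range(50)]
# A sustained rise in the hostility share would serve as a crude
# early-warning signal in this toy setting.
print(f"hostility share, step 1 vs step 50: {hostility[0]:.2f} -> {hostility[-1]:.2f}")
```

Even this deliberately simple model exhibits the kind of emergent, path-dependent dynamics the text refers to; real systems add geography, resources, networks and empirically calibrated behaviour.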

To date, modelling falls short of offering the explanations and culturally sensitive understandings of conflict, and of the actors involved, that are critical to shaping strategies for engagement. And, as with earlier opportunities derived from information technologies for conflict prevention and resolution, such opportunities need to be accompanied by political will to engage, technological literacy, agency and investment, of all of which there appears to be a significant shortage at present, adding to the growing strains on the international system.

Whether digital technologies are used with positive or malicious intent or something in between is not inherent to the technologies themselves but depends on human decisions and behaviours. The intentions, norms, and value structures of technological developers find their way into the artefacts during the design stage, while existing power structures influence the desirability of specific aspects, forms or functions of technology. Given the disruptive potential of digital technologies, state and non-state actors are discussing voluntary and binding norms to balance the opportunities and risks of global society’s ongoing digital transformation and shape behaviours relevant to the development and uses of the technologies. Due to enduring uncertainties about the scope and pace of ongoing socio-technological transformations, the growing centrality of digital technologies and data to great power rivalry, an increasing willingness to use disruptive tools in the context of accelerating great power rivalry, and significant fragmentation of authority and accountability on different levels, managing digital insecurities continues to be a most challenging governance issue in contemporary international affairs.

With the expected increase in data availability, advances in machine learning and a better understanding of the relationship between digital information flows and behaviour, it seems likely that monitoring and modelling performance will improve. At the same time, the world may see an extension of manipulation, militarisation, weaponisation and targeting capabilities. The convergence of cyber- and bio-technologies will potentially result in further threats to individuals and society, including new encroachments on personal privacy and the fast-growing commercialisation of biodata flows involving the body and the brain. That said, future developments are uncertain: the interaction between politics and digital technologies often creates disruptive, unexpected effects.

Monitoring and modelling conflicts

The widespread deployment of sensor technologies and progress in data gathering will allow live conflicts and drivers of conflict to be better monitored and managed. To be effective and trusted by all parties, this will require more transparent data-gathering and storage practices and a shift in how private actors and security agencies collaborate.

The ability to simulate conflicts through digital techniques (machine learning, simulation) will allow actors to create plausible scenarios, to take part in joint problem-solving exercises, and to use modelling as a tool for both conflict prevention and resolution efforts. By monitoring online activity, machine-learning algorithms will be able to anticipate potential crises. Trust in the models will depend on their vetted ability to simulate complex systems in an imperfect environment, and on the perceived legitimacy of the actors and organisations involved.
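A hypothetical sketch of such anticipation (the data series, window and threshold below are invented purely for illustration) is to flag anomalous spikes in a monitored signal, such as daily counts of hostile online messages, against a rolling baseline:

```python
from statistics import mean, stdev

def early_warnings(series, window=7, threshold=3.0):
    """Return indices where a value deviates more than `threshold`
    standard deviations above the trailing `window`-day baseline."""
    alerts = []
    for i in range(window, len(series)):
        baseline = series[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and (series[i] - mu) / sigma > threshold:
            alerts.append(i)
    return alerts

# Synthetic daily counts: a stable baseline with a sudden spike at day 20.
counts = [50, 52, 49, 51, 50, 53, 48, 50, 51, 49,
          52, 50, 51, 49, 50, 52, 51, 50, 49, 51,
          180, 185, 50, 51]
print(early_warnings(counts))  # flags day 20, the onset of the spike
```

Real early-warning pipelines replace the hand-built z-score with learned models over many signals, but the core design question is the same: how sensitive the threshold is, and who vets the alerts.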


Psycho-social strategies enabled by digital and algorithmic technologies will play a greater role in manipulating and controlling narratives and populations. Current trends in monitoring and surveillance by private, largely unregulated entities will trigger moves to regulate these forms of interference and, over time, may induce drastic shifts in the business models of social media companies and other relevant corporations, reinforcing the legitimacy of the state as it pursues such countermeasures.

Convergence between cyber- and bio-technologies

Health-sector entities and research institutions that focus on data gathering (pharma, nutrition, personal sensors, etc.) risk becoming more deeply and directly involved in conflict and will continue to be targets of malicious cyber activity. In addition to new encroachments on the privacy of biodata flows and brain interfaces, such a scenario will give rise to important personal-safety and national-security concerns, as unauthorised access to certain types of health data (human genomes, for example) may allow malicious actors to micro-target specific groups with biological weapons. Accountability and regulatory action will become a bigger issue for private actors, including biotechnology, digital and social media companies, as these and other challenges to existing models of self-governance emerge.

Why do they matter?

The impact of digital technologies and digital information flows on societies across the globe is evident, even more so in fragile societies with historically poor or nonexistent digital infrastructure and data regulation. As the power of digital technologies increases through more precise data gathering and more applied uses of algorithms, so does their potential for conflict prevention and resolution. At the same time, however, both state and non-state actors can deploy ever more elaborate and wide-reaching destabilisation strategies and capabilities. This raises a series of essential questions for the wider security establishment: how do we protect societies against the destabilising potential of these technologies, alongside targeted attempts to use them for geopolitical purposes and to destabilise the international order? How can the multilateral system adapt to manage extant and emerging challenges associated with digital technologies, and better leverage the opportunities they offer for conflict prevention and resolution?

Complex, dynamic frameworks already govern some fields of digital technology. Some of these are quite advanced, although they may be contested; others remain under-developed or under development, and most lack any implementation framework. Meanwhile, new risks and vulnerabilities relevant to the technologies, and to how they may be exploited in conflict, continue to emerge. Some of these frameworks involve only states, while others involve a range of other critical actors. Some are anchored in hard or soft law, while others involve a mix of binding, non-binding and self-regulatory elements. Looking to the future, responding to the attendant risks and challenges of digital technologies, particularly how they may be used in conflict, requires more than an understanding of these frameworks and the relevant organisations, tools, structures and processes. It will require a deeper understanding of how they interact with each other; an appreciation of the overlapping social, cultural, economic, environmental and (geo)political contexts in which they are crafted; a firmer grasp of the overarching questions of power and conflict that shape our relationship with digital technologies; and, moreover, a much deeper and better-informed public debate at every step of this journey.