As this year's Radar goes to print, the current landscape is dominated by competing narratives, strategic ambiguity, hybrid models of collaboration, diverging interests and converging technologies. Non-state actors are assuming increasingly significant, and often unexpected, roles and functions. These developments will continue to raise ethical and normative questions, as well as questions around decision-making transparency and supply-chain accountability. Decisions by non-state actors or entities --- notably private technology firms --- on how they engage with a conflict or other contested issues will be increasingly framed in geopolitical terms, influencing their public standing as well as their revenue streams.
With the onset of so-called immersive environments, human enhancement practices and cognitive warfare, the battlefield is expanding into the human body and mind. Information, hedging and influence operations and other such subversive activities are likely to become more intense and divisive over the coming years due to advances in the life and neurosciences and greater insights into behaviour-modulating technologies and their perceived strategic value in conflict.
Looking 25 years ahead, it appears that we are heading toward a great systemic decoupling with mutually incompatible normative and technological trajectories and with significant implications for global trade, multilateralism and international cooperation, as well as for preventive diplomacy and conflict management. The 2023 Radar will feature a deep-dive into these and related issues. Here, we offer a visualisation of the Anticipation Potential of this field, and reprint the 2021 Radar's invited contribution on the Digitalisation of Conflict from Myriam Dunn Cavelty, Anja Kaspersen and Camino Kavanagh.
Invited Contribution: Digitalisation of Conflict
Myriam Dunn Cavelty, Center for Security Studies at ETH Zurich
Anja Kaspersen, Carnegie Council for Ethics in International Affairs
Camino Kavanagh, Kings College London, Carnegie Endowment for International Peace
Digital technologies are playing a twofold role in conflict. On the one hand, they are used to extend power politics into the poorly regulated domain of global data exchange, where they exploit and exacerbate existing political tensions, posing significant risks and potential harms to individuals, communities and businesses across the globe. On the other hand, they can contribute to better understanding and monitoring of conflict and, if accompanied by the necessary political momentum, agency and effort, to preventing and even resolving various sorts of conflicts, thus mitigating the aforementioned harms. It remains to be seen whether existing institutions traditionally tasked with building and maintaining international norms for security and stability, along with the relevant structures, tools and processes, will suffice to both understand and manage the challenges emerging around digital technologies, including the escalatory potential of certain uses, or whether the world needs alternative governance structures.
Many parts of the technological infrastructure, algorithmic systems and data flows that we rely on every day can be exploited for criminal or political purposes. The bigger the societal reliance on the uninterrupted, trustworthy operation of digital technologies for essential services and other functions deemed crucial to our collective well-being, the higher the disruptive potential. While the world has yet to see a large-scale destructive global attack on a critical infrastructure, non-state, semi-state and state actors' cyber operations cause minor to major disruptions on an almost daily basis. Apart from hacking into technical systems to extract data or cause other types of damage, algorithmic technologies are being deployed to influence opinion, to spread distrust and to undermine faith in traditional social-political structures. Understanding the significance of this, the roles of international actors and how manipulation techniques will change in future is a major challenge. At stake is the nature of trust and trustworthiness, which, already under significant pressure due to non-technological drivers, has become badly eroded in many communities, increasing extreme polarisation. Undoubtedly, different uses of digital technologies are contributing to this growing trust deficit which continues to weaken some of the fundamental pillars of social cohesion and civic order.
On the other hand, digital technologies, when used in support of existing human analytical capabilities, have proven effective in enabling an enhanced understanding of certain forms of conflict and have significant potential for early warning, the monitoring of peace agreements and other such arrangements in the event of armed conflict. Computational social scientists have begun to simulate conflicts on local and regional levels using agent-based models in order to develop early warning systems. These models capture the simultaneous interactions and movement of multiple actors and simulate the complex behaviour that emerges. This is made possible by developments in several key technologies: a rapid increase in computing power, improved computational simulations and methodologies such as AI and machine learning, the development of intelligent agents and the availability of ever-larger datasets to train these systems on a multitude of conflict vectors. Together, these have created a significant opportunity to model broader social, political and economic systems and to study conflict at unprecedented scale and resolution. However, issues around data access and quality, together with the complexity of modern conflicts that engage a broad range of internal and external actors, make real and impactful progress in digital conflict modelling difficult. To date, modelling falls short of offering explanations, culturally sensitive understanding or strategies for engagement. As with earlier developments regarding the opportunities that can be derived from information technologies for conflict prevention and resolution purposes, such opportunities need to be accompanied by political will to engage, technology literacy, agency and investment, all of which appear to be in significant shortage at present, adding to the growing strains on the international system.
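The agent-based approach mentioned above can be illustrated with a minimal sketch. This is a hypothetical toy model, not any specific early-warning system: agents on a grid hold a "grievance" level, drift toward their neighbours' average and receive small random shocks, and the share of agents crossing a mobilisation threshold serves as a crude warning signal. All parameters and rules here are illustrative assumptions.

```python
import random

# Toy agent-based model of local tension dynamics (illustrative only;
# real conflict models use far richer agents, geography and empirical data).
GRID = 20          # the world is a GRID x GRID lattice of agents
STEPS = 50         # number of simulation steps
SPREAD = 0.3       # how strongly grievance diffuses between neighbours
THRESHOLD = 0.8    # grievance level at which an agent "mobilises"

random.seed(42)
grievance = [[random.random() * 0.5 for _ in range(GRID)] for _ in range(GRID)]

def neighbours(i, j):
    """Yield the in-bounds von Neumann neighbours of cell (i, j)."""
    for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
        ni, nj = i + di, j + dj
        if 0 <= ni < GRID and 0 <= nj < GRID:
            yield ni, nj

def step(state):
    """One synchronous update of all agents."""
    new = [row[:] for row in state]
    for i in range(GRID):
        for j in range(GRID):
            nb = list(neighbours(i, j))
            avg = sum(state[ni][nj] for ni, nj in nb) / len(nb)
            # Each agent drifts toward its neighbours' average grievance,
            # plus a small random shock (rumours, news events, repression).
            new[i][j] = min(1.0, (1 - SPREAD) * state[i][j]
                            + SPREAD * avg + random.uniform(-0.02, 0.05))
    return new

for _ in range(STEPS):
    grievance = step(grievance)

# A crude "early warning" signal: the share of mobilised agents.
mobilised = sum(cell > THRESHOLD for row in grievance for cell in row)
share = mobilised / (GRID * GRID)
print(f"mobilised share after {STEPS} steps: {share:.2f}")
```

Even a toy like this exhibits the emergent, neighbourhood-driven escalation the text describes, which is why trust in real models hinges on the quality of the data and rules behind them.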
Whether digital technologies are used with positive or malicious intent or something in between is not inherent to the technologies themselves but depends on human decisions. The intentions, norms, and value structures of technological developers find their way into the artefacts during the design stage, while existing power structures influence the desirability of specific aspects, forms or functions of technology. Given the disruptive potential of digital technologies, state and non-state actors are discussing voluntary and binding norms to balance the opportunities and risks of global society's ongoing digital transformation and shape behaviours relevant to the development and uses of the technologies. Due to enduring uncertainties about the scope and pace of ongoing socio-technological transformations, the growing centrality of digital technologies and data to great power rivalry, an increasing willingness to use disruptive tools in the context of accelerating great power rivalry, and significant fragmentation of authority and accountability on different levels, managing digital insecurities continues to be a most challenging governance issue in contemporary international affairs.
FUTURE TRENDS WITH A 25-YEAR PERSPECTIVE
With the expected increase in data availability, advances in machine learning and a better understanding of the relation between digital information flows and behaviour, it seems likely that monitoring and modelling performance will improve. At the same time, the world might see an extension of manipulation, militarisation, weaponisation and targeting capabilities. The convergence between cyber- and bio-technologies will potentially result in further threats to the individual and society, including new encroachments on personal privacy and the fast-growing commercialisation of bio-data flows involving the body and the brain. That said, future developments are uncertain: the interaction between politics and digital technologies often creates disruptive, unexpected effects.
Monitoring and modelling conflicts
The widespread deployment of sensor technologies and progress in data gathering will allow for better monitoring and managing of live conflicts and drivers of conflict. To be effective and trusted by all parties, this will require more transparent data gathering and storage practices and a shift in how private actors and security agencies collaborate.
The ability to simulate conflicts through digital (machine learning, simulation) techniques will allow actors to create plausible scenarios and to take part in joint problem-solving exercises, or to use modelling as a tool for both conflict prevention and resolution efforts. By monitoring online activity, machine learning algorithms will be able to anticipate potential crises. Trust in the models will depend on their vetted ability to simulate complex systems in an imperfect environment, as well as on the perceived legitimacy of the actors and organisations involved.
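A crude version of the crisis-anticipation idea can be sketched as a simple anomaly detector over an online-activity time series. The signal here (daily counts of conflict-related posts) and the function name are hypothetical; production systems would use far richer features and models than a rolling z-score.

```python
import statistics

def anomaly_days(activity, window=7, z_cut=3.0):
    """Flag indices whose value deviates sharply from a rolling baseline.

    activity: list of daily counts (e.g. conflict-related posts -- a
    hypothetical signal). Returns indices whose z-score against the
    preceding `window` days exceeds z_cut.
    """
    flagged = []
    for t in range(window, len(activity)):
        base = activity[t - window:t]
        mean = statistics.mean(base)
        sd = statistics.pstdev(base) or 1.0  # guard against zero variance
        if (activity[t] - mean) / sd > z_cut:
            flagged.append(t)
    return flagged

# Quiet baseline with one sudden spike on day 10.
series = [5, 6, 5, 7, 6, 5, 6, 5, 7, 6, 40, 6, 5]
print(anomaly_days(series))  # prints [10]: the spike is flagged
```

The same caveat from the text applies: such a detector is only as trustworthy as the data feeding it and the legitimacy of whoever operates it.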
Psycho-social strategies enabled by digital and algorithmic technologies will play a greater role in manipulating and controlling narratives and populations. Current trends in monitoring and surveillance by private, largely unregulated entities will trigger moves to regulate these forms of interference, and over time may induce drastic shifts in the business models of social media companies and other relevant corporations, cementing the legitimacy of the state as it pursues such counteractions.
Convergence between cyber- and bio-technologies
Health sector entities and research institutions that focus on data gathering --- pharma, nutrition, personal sensors etc. --- risk becoming more deeply and directly involved in conflict and will continue being the target of malicious cyber activity. In addition to new personal privacy encroachments on bio-data flows and brain interfaces, such a scenario will give rise to important personal safety and national security concerns, as unauthorised access to certain types of health data (human genomes, for example) may allow malicious actors to micro-target specific groups with biological weapons. Accountability and regulatory action will become a bigger issue for private actors, such as biotechnology, digital and social media companies, as these and other challenges to existing models of self-governance emerge.
WHY DO THEY MATTER?
The impact of digital technologies and digital flows of information on societies across the globe is evident, even more so in fragile societies with historically poor or nonexistent digital infrastructure and data regulations. As the power of digital technologies increases through more precise data gathering and more applied uses of algorithms, the potential of digital technologies for conflict prevention and resolution increases. At the same time, however, more elaborate and wide-reaching destabilisation strategies and capabilities can be deployed. This raises a series of essential questions for the wider security establishment: how do we protect societies against the destabilising potential of technologies, alongside targeted attempts to use technologies to destabilise the geostrategic order? How can the multilateral system adapt to manage extant and emerging challenges associated with digital technologies and better leverage the opportunities they offer for conflict prevention and resolution?
Complex, dynamic frameworks already govern some fields of digital technologies. Some of these are contested, others remain under-developed or under development, and most lack any implementation framework. All the while, new risks and vulnerabilities relevant to the technologies, and to how they may be exploited in conflict, continue to emerge. Some involve just states, while others involve a range of other critical actors. Some may be anchored in hard or soft law, while others may involve a mix of binding, non-binding and self-regulatory elements. Looking to the future, responding to the attendant risks and challenges of digital technologies, particularly how they may be used in conflict, requires more than an understanding of these frameworks and the relevant organisations, tools, structures and processes. It will require a deeper understanding of how they interact with each other; an appreciation of the overlapping social, cultural, economic, environmental and (geo)political contexts against which they are crafted; a firmer grasp of the overarching questions of power and conflict that tend to shape our relationship with digital technologies; and moreover, a much deeper and better-informed public debate at every step of this journey.