by Andrea Monti – Initially published in Italian by Strategikon, an Italian tech blog

ARCAS (the acronym stands for AI-Powered, Computerized Solution for Assault Rifles) reduces the burden on a soldier in combat of analysing the scenario and deciding whether to engage opponents. It is a system built around an assault rifle that allows soldiers to identify threats in advance, optimise firing trajectories, lock onto targets and receive visual information on the weapon’s operation and its interaction with the environment. In addition, the system allows the operator to receive information from the control room and from other team members equipped with the same instrument.
The integration of artificial intelligence to perform all these functions may be a novelty but, apart from that, ARCAS is nothing revolutionary: it is the miniaturisation of systems already in use in the Air Force and the Navy.
However, it is worth thinking about the relationship between reality and fiction, already touched on when commenting on the arrival of the “online-ready” Kalashnikov on the civilian market. Like the Russian 12-gauge, ARCAS is strongly influenced by the world of video games. The manufacturer explicitly states that soldiers operate the system using a button-joystick positioned on the front grip of the rifle and a graphic interface inspired by gaming. The product’s advertising video makes this easy to believe.
As far as we are concerned, this is precisely the problem: the further thinning of the border between the reality of violence, made of flesh and blood, and the dehumanisation of the target made possible by the technological intermediation described by David Grossman in his book On Killing and by Gavin Hood in the movie Eye in the Sky.
The relationship between the military and the video game industry is close, well documented, and long-standing. As Huntemann and Payne point out in Joystick Soldiers: The Politics of Play in Military Video Games, published in 2010,
the popularity of military-themed video games has increased since 2001, with a significant portion of these games focusing on terrorist/counter-terrorist conflict. For the US military, the events of 11 September 2001 and subsequent events accelerated a change within the armed forces on how to better train and equip their soldiers for the realities of modern warfare. Part of this change included renewing long-standing relationships and forging further partnerships with the entertainment industry, particularly film and video game producers.
Their integration and interaction accelerated thanks to the growth in the graphics capabilities of dedicated computers and consoles (and in programming languages and techniques). In the space of just over twenty years, this enabled the transition from FPSs such as Wolfenstein 3D and Doom to their hyper-realistic modern-day successors.
The dehumanisation of the adversary, who turns from a human being into an animated icon on the screen of a smart TV, begins even before a visor is installed in a combat helmet. While waiting for a war that hopefully will not happen, it is reasonable to ask how the perception of killing changes individual behaviours and attitudes when the game creates a reality in which we are constantly immersed.
This last consideration introduces a second theme: the dehumanisation of the operator, and not only of his opponent.
Systems such as ARCAS (assuming they work reliably, for long enough and without interference) eliminate or significantly reduce the options on which the soldier has to make a decision (which target should I hit? how many shots do I have in the magazine? which path is safer?). On the one hand, it is reasonable for this to happen, because it reduces the margin of error and increases the soldier’s protection. On the other hand, if directions arrive from an interface managed by the AI, the operator risks being reduced to a biological automaton that must merely execute orders given by a technological platform.
There is no need to evoke RoboCop-style science-fiction scenarios, since in situations of this kind the last word lies with those in the field. Leaving aside the military sphere, however, the problem posed by the use of ARCAS concerns all similar cases (from autonomous driving to surgical operations) in which there is a risk of confusing a machine’s high degree of operational autonomy with the existence of a responsibility that is autonomous and separate from its user.
In even more general terms, we should ask ourselves what happens when both of the hooks that keep us bound to reality (individual self-determination and the perception of the humanity of the other) break, and we float in a state in which we no longer perceive the consequences of our actions.
A few elements of an answer are already there.