Encryption and the EU: all the chickens coming home to roost

The First Report on Encryption, recently published by the EU Innovation Hub for Internal Security, sets out the guidelines and desiderata on encryption of the EU bodies dealing with security and crime fighting, and highlights the unresolved contradictions in the free availability of cryptographic technologies.

by Andrea Monti – Adjunct Professor of Digital Law, University of Chieti-Pescara – Initially published in Italian by Formiche.net

The debate on cryptography, while rarely in the limelight, continues to rage in the practitioner community. In reality, there is not much new in it, because there has been no move away from the positions that took shape in the early 1990s: the call for weakened encryption, for privileged police access to telephone exchanges and networks, and for designs that allow interfacing with eavesdropping systems.

From this point of view, the First Report on Encryption is a useful tool for understanding how far along the curve the process of weakening certain uses of cryptography has come, uses that sometimes constitute an insurmountable obstacle to the activities of police and intelligence services.

The Report gives an account of the proposed EU regulation on client-side scanning (i.e. the automatic, preventive and indiscriminate scanning of any device before a locally encrypted message is allowed to be sent), the proposal that has aroused the most controversy in civil society and, to some extent, in industry, but which is neither the only one nor, in some respects, the most complex to manage. In fact, the difficulties in imposing such a mode of operation are more regulatory and political than technological in nature.

Other points of the report, on the other hand, are more interesting, as they identify the relationship with the private sector (Big Tech, but also – more generally – the independent technical community) as a crucial element of the future strategy to make access to encrypted information more effective.

The unstoppable institutional dependence on the private sector
Firstly, the authors of the Report highlight the inevitable and growing dependence of investigators on the contribution of telecommunications and electronic communication service operators. These operators already play an essential role in investigative activities through the mandatory retention of their users' traffic data, but this is no longer enough.

In addition to the ‘classic’ parameters – IP address, date and time of connection, the user accessing the network, user-agent and so on – investigators increasingly rely on DNS (Domain Name System) queries, that is, the lookups that reveal which service a user is trying to connect to.
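To make the point concrete, here is a minimal sketch of a traditional DNS lookup, written with the third-party dnspython library (the hostname queried is purely illustrative). The request crosses the network unencrypted on UDP port 53, so the access provider's resolver, and any on-path observer, can record exactly which name was asked for.

```python
# Minimal sketch of a classic, unencrypted DNS lookup (third-party dnspython
# package; the queried hostname is purely illustrative). The request travels
# in cleartext on UDP/53, so the resolver operator and any on-path observer
# can log which name the user asked for.
import dns.resolver  # pip install dnspython

resolver = dns.resolver.Resolver()
answer = resolver.resolve("example.org", "A")
for record in answer:
    print(record.address)
```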

The investigative importance of this information is obvious, but the possibility of getting hold of it is limited by the fact that DNS queries are increasingly encrypted according to standards promoted by Big Tech, such as DNS over QUIC (Google) and Oblivious DoH (Cloudflare). This means, the Report notes, that ‘law enforcement will become more dependent on DNS service providers’ cooperation’.
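For comparison, a sketch under the same assumptions of an equivalent lookup sent over DNS-over-HTTPS, here to Cloudflare's public JSON endpoint (any DoH resolver would serve the same purpose). The query travels inside a TLS session, so observers on the network path only see a connection to the resolver; which name was requested is known to the DoH provider alone, which is precisely why that provider's cooperation becomes indispensable.

```python
# Minimal sketch of the same lookup over DNS-over-HTTPS (DoH), using
# Cloudflare's public JSON interface. The query is carried inside TLS,
# so on-path observers only see a connection to the resolver; the name
# being looked up is visible solely to the DoH provider.
import requests  # pip install requests

response = requests.get(
    "https://cloudflare-dns.com/dns-query",
    params={"name": "example.org", "type": "A"},
    headers={"accept": "application/dns-json"},
    timeout=5,
)
for answer in response.json().get("Answer", []):
    print(answer["data"])
```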

Similarly, the Report continues, ‘the use of encryption in 4G (VoLTE) and 5G (Standalone 5G) telecommunications technologies complicates the ability of law enforcement and judicial authorities to conduct investigations. These standards introduce end-to-end encryption (E2EE) for voice calls over the network, which complicates the interception of criminal communications in roaming connections. For this reason, it is important that communication service providers disable privacy protection technologies in internal routing.’

These considerations lead the drafters of the study to call for the needs of interception and decryption to be taken into account as early as the design stage of technology standards, going so far as to allow (as telephone exchanges have long done) investigators to connect directly to the operator’s network.

Quantum computing, HPC and police investigations
Another relevant aspect that emerges from the Report is the role of quantum computing in cryptanalysis activities, i.e. ‘breaking’ the encryption of a message without knowing the decryption key.

The Report considers, in fact, that the monstrous computing power at the disposal of these machines makes it possible to decrypt in a reasonable time what today would take decades, if not centuries or even millennia.
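A back-of-the-envelope calculation conveys the orders of magnitude the Report alludes to. The trial rate below is an arbitrary assumption, and the Grover-style quadratic speedup against symmetric keys is used only as a stylised stand-in for quantum cryptanalysis in general, not as a claim about any specific system.

```python
# Order-of-magnitude illustration only: exhaustive key search costs roughly
# 2**n trials classically and roughly 2**(n/2) with Grover-style quantum
# search. The trial rate is an assumption chosen purely for readability.
SECONDS_PER_YEAR = 31_557_600
TRIALS_PER_SECOND = 10**9  # assumed, purely illustrative

def years_to_search(key_bits: int, quantum: bool = False) -> float:
    trials = 2 ** (key_bits // 2 if quantum else key_bits)
    return trials / TRIALS_PER_SECOND / SECONDS_PER_YEAR

for bits in (64, 128):
    print(f"{bits}-bit key: "
          f"classical ~{years_to_search(bits):.3g} years, "
          f"quantum ~{years_to_search(bits, quantum=True):.3g} years")
```

Under these assumptions, searching a 128-bit key shrinks from roughly 10^22 years to a few centuries: enough to explain the strategic interest, without turning quantum computing into an immediate operational tool.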

Equipping oneself with this technology is believed to be of strategic importance for enhancing the investigative capabilities of institutional structures. But this, even if the Report does not say so, means having to untie at least two political knots.

The first concerns the choice of whether, how, and to what extent to apply the well-known principle of ‘store now, decrypt later’, according to which, even if one does not yet possess the capacity to decrypt the material acquired, it is always advisable to collect it anyway, in order to take advantage in the future of the new possibilities offered by progress, such as quantum computing.

Leaving aside the industrial cost of a generalised and indiscriminate retention of even encrypted traffic on public networks, even the most ardent supporters of the ‘glass man’ theory – the idea that citizens should have nothing to hide from the state – could hardly accept such an option.

The second knot to be untied is to whom technologies such as quantum computing should be made available. Indeed, once their evolution allows them to be offered ‘as a service’, as is already the case with artificial intelligence, these technologies will clearly be available to criminals as well, and not only for legitimate uses. Regulators may therefore be faced with the choice not only of preventing the general commercialisation of quantum computing, but also of controlling the knowledge needed to develop it.

Clearly, this choice would have undeniable advantages from the point of view of the technological superiority of the state over criminals (or of some states over others, as with the containment of nuclear proliferation). Such a solution, however, would make it impossible to provide services to a large user base, and would therefore make it difficult to sustain investment in research and development in the sector.

The obligation to hand over decryption keys as a complementary tool
Quantum computing architectures that are actually on the market and concretely usable are at least a decade away, so there would be time to weigh the pros and cons of the options on the table with due consideration. In the meantime, however, the problem of access to encrypted information remains in all its seriousness, and therefore – to come to the last relevant aspect of the Report – the question arises of finding a solution that can work in the immediate future.

As counter-intuitive as it may seem, in parallel with technical cryptanalysis an approach that we could call ‘legal cryptanalysis’ is gaining ground: the adoption of rules that make it compulsory even for the suspect or accused person to hand over the decryption keys.

For instance, in the Netherlands, the Report states, it is already possible to resort to a moderate use of force to induce the suspect to unlock access to a protected device, without the judge’s authorisation.

However, such a solution clashes with the privilege against self-incrimination, one of the cornerstones of Western judicial systems and a pillar of the rule of law. This is even truer for proposals to provide for a penalty heavy enough to induce cooperation from those who deny access to encrypted information, since the refusal itself would in any case expose them to a long prison term.

Conclusions
Reading the Report, it is quite clear that the relationship between encryption and judicial investigations is being handled with a low profile and in a way that (apparently) lacks an overall strategy.

This approach is probably due to the need to avoid a public debate on principles that, as we have seen with the issues raised by biometric facial recognition and the use of artificial intelligence in the fields of defence and national security, could be very divisive.

Finally, the issue of the relationship between the private sector and institutional bodies remains open, with no realistic prospect of a solution: no longer, and not only, a matter of cooperation in the field, but above all of the development of cutting-edge technologies to be put at the service of institutions or of the market.
