Apple and Child Pornography: The Nobility of the Goal Does Not Justify the Means

This article was written together with Guido Scorza and initially published in Italian by Huffingtonpost.it.

Apple recently announced its intention to scan the content of its millions of users’ devices for child pornography images, for now only in the United States of America. The announcement has triggered a global debate of enormous magnitude. The initiative is understandable but hardly acceptable.

Two issues, among the many related to the governance of the digital ecosystem, emerge strongly: the role of private actors in protecting public order and security, and the limits on how far fundamental rights – privacy, the inviolability of communications, due process – may be compressed in prosecuting even the most heinous crimes, such as child pornography.

We will try to condense, in a handful of lines, an argument that deserves far more extensive treatment. The starting point is that private companies do business and – however respectful they may be of ethics and rights – legitimately pursue profit maximisation, answering first to the market and their shareholders and only then to Constitutions and fundamental rights. The State, by contrast, is concerned with the welfare of its citizens, the protection of public order and security, and the protection of rights, dictating rules that identify carefully weighed balances between different interests that are never rivals and, above all, never tyrants. It also monitors compliance with the laws – to which it submits itself – and ensures their enforcement.

Cooperation with the private sector has become, rightly or wrongly, an essential component of the functioning of the State. But this is not enough to justify accepting the idea that the private sector lays down the rules, ascertains violations, and adopts initiatives and measures that restrict the rights of its users, who are, first and foremost, citizens of a democracy. This holds even when a private entity has such a significant market presence that its users can hardly escape its rules except by ceasing to use all or part of its services.

Large platforms managed by a handful of private companies inhabit the ‘territory’ of the new digital ecosystem, and they have gained a privileged position in maintaining order and security and in protecting rights. In the past they stood in for public authorities, sometimes at the explicit request of policymakers, as with the ‘memoranda of understanding’ to combat fake news, and sometimes on the basis of regulatory obligations, as with the indiscriminate retention of telematic traffic data. Now they are taking direct initiatives “in the name of fundamental rights”: Apple’s announcement arrives while the echo has not yet died down of the controversy over the forced closure of the social media accounts of then US President Donald Trump, or of the many more or less automated acts of censorship carried out by content- and message-sharing platforms.

Can we accept this progressive cession of sovereignty to private entities – non-EU ones, at that? Even in the digital ecosystem, the State must continue to act as the State; it cannot and must not abdicate its role in favour of private entities. Collaboration with private entities is welcome, but only on the basis of a clear hierarchy of priorities: first the national interest and then, always in compliance with the law, private interests.

From this point of view – one of method more than of merit – Apple’s initiative is unconvincing because it arrogates to itself a power that is not within its competence: dictating and enforcing rules that restrict and, indeed, overturn fundamental rights, to an extent and by methods established autonomously in the boardroom of a private company rather than in a parliament.

The second issue that this affair has brought to the global community’s attention is, if possible, even more complex and delicate than the first. There is no question that child pornography must be prevented and repressed. Not even the right to privacy – which is neither an absolute right nor a tyrant – can represent, in principle, an insurmountable limit in the fight against such a criminal phenomenon. No general criterion can be established: the Constitution entrusts first the Legislator, and then the Courts, with the task of balancing the compression of this right in concrete terms and case by case. The debate on the use – subsequently regulated by law – of State trojans is proof of this. At the same time, there is no doubt that the idea that, in the name of preventing a criminal phenomenon, however odious, the rights – not only the right to privacy – of tens of millions of innocent people can be overturned is neither legally nor democratically sustainable.

Does Apple’s solution strike a proper balance, quite apart from concerns about its effectiveness? To answer, let us start from an undisputed point: in this matter, the Data Protection Regulation is an impassable wall. Any processing is bound to an end (the protection of individuals’ fundamental rights and freedoms) and a means (compliance with the law). Individual ethical choices based on ‘doing the right thing’ cannot, therefore, serve to circumvent an insurmountable obstacle: the primacy of the State in crime prevention and repression. In addition – and this is another front protected by the GDPR, through the principle of purpose limitation – it cannot be allowed that a road opened in the name of the sacrosanct battle against online child pornography can, tomorrow, be travelled to pursue other purposes that are less relevant, less noble or, in some cases, not noble at all.

In Europe, as of today – and without prejudice to re-evaluating the matter after the possible final approval of the proposed European Regulation on so-called chat control – the solution would conflict with data protection rules and with the prohibition on private investigations established by the Consolidated Text of the Laws on Public Security and by the rules on defensive investigations. After searching users’ content, Apple suspends or closes their accounts, marking them, in essence, as paedophiles. Lacking a court ruling, Apple does so without any certainty about the responsibility of the user whose device contained the child pornography content. This step in the process is legally and democratically unacceptable, and it matters little that, according to Apple, the risk of a false positive in image recognition is almost non-existent. No one may be listed in a private database as being involved in such heinous criminal activity, and no law would allow it.
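To see why the false-positive question is probabilistic rather than absolute, consider a minimal sketch of the kind of perceptual-hash matching on which such scanning rests. This is not Apple’s actual method – the company describes a far more elaborate design based on NeuralHash, threshold secret sharing and private set intersection – and the hash values and threshold below are invented purely for illustration.

```python
# Toy sketch of perceptual-hash matching, for illustration only.
# KNOWN_HASHES and HAMMING_THRESHOLD are invented values, not Apple's.

KNOWN_HASHES = {0b1011_0110_1100_0011}  # stand-in for a database of known-image hashes
HAMMING_THRESHOLD = 2                   # max differing bits still treated as a match

def hamming_distance(a: int, b: int) -> int:
    """Count the bit positions where two hashes differ."""
    return bin(a ^ b).count("1")

def is_flagged(image_hash: int) -> bool:
    """Flag an image whose hash is 'close enough' to any known hash.

    The tolerance makes matching robust to re-encoding or resizing, but it
    is also what makes false positives possible in principle: an unrelated
    image can land within the threshold of a database entry.
    """
    return any(hamming_distance(image_hash, known) <= HAMMING_THRESHOLD
               for known in KNOWN_HASHES)

print(is_flagged(0b1011_0110_1100_0010))  # True: one bit away from a known hash
print(is_flagged(0b0100_1001_0011_1100))  # False: far from every known hash
```

However low the error rate of such a scheme may be, the point made above stands: a near-match produced by a private algorithm is not a judicial finding of responsibility.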

As for the second level of analysis, the risk that the objectivity of the analysis gives way to emotion is higher, because the reasoning is necessarily hypothetical. There is no doubt – and it has already happened in other fields – that once this technological approach to fighting a criminal phenomenon has been launched, Apple – and, of course, not only Apple – could tomorrow be asked, perhaps this time by a Government, to use it against other phenomena. Should that happen, it could prove very difficult to resist the request, whether on legal grounds or out of business convenience. Something similar has already occurred: the FBI asked Apple to weaken the security of the iPhone’s operating system and received a stern refusal ‘in the name of privacy’.

Can we be sure that what may be allowed today to investigate and prosecute online child pornography will not be repeated in the fight against audiovisual piracy, for instance, or, even worse, in the search for content that is lawful but that some ruler deems subversive, not of the democratic order but of his own selfish interests? The dangers of these choices, and of the proposed EU chat control, are too many to consider such technical and regulatory solutions admissible, even as an exception.

However, we cannot avoid the inevitable question that arises at the end of this reasoning: so what?

The answer can only take note of two facts and draw a conclusion.

The first fact: regulatory instruments already exist that allow law enforcement to carry out preventive and repressive activities.

The second fact: the limits on the effectiveness of investigations stem, rather, from a lack of means and, above all, from international cooperation that is still immature – if not non-existent – which prevents, except in sporadic cases, coordinated action between countries.

The conclusion is that, whatever the technological instrument used in a criminal investigation – and therefore, hypothetically, even automated scanning of private content – only public institutions should have the power to infringe an individual’s rights, in compliance with guarantees and according to clear, predetermined procedures.
