ChatGPT Block. Why the Italian Data Protection Authority is wrong

The “ChatGPT block” was ordered on 30 March 2023 by the Italian data protection authority on the grounds that the data used to train the model had been collected without informing the people to whom they relate, and that the service lacked any age-verification mechanism. This, according to the order, exposes minors who use the service “to answers that are totally inappropriate to their level of development and self-awareness”.

The order, it must be said, is highly questionable from a technical, legal and cultural point of view. It reveals, on the one hand, the weakness of national data protection authorities in dealing with the matter and, on the other, the substantial inapplicability of ‘privacy protection’ legislation. Finally, it triggers a very dangerous reciprocity mechanism: other countries with similar regulations – including Russia and China – could use them as a ‘legal’ tool to target companies on this side of the new Iron Curtain.

by Andrea Monti

Firstly, the data subject’s consent is only one of the legal bases on which data can be processed. Contrary to what the Authority claims (without offering any explanation on the point), even if OpenAI had wanted to follow European rules, it could at least have invoked in good faith a ‘legitimate interest’ (a legal basis that national data protection authorities so dislike). The data used by OpenAI were in fact made freely available by individuals on public profiles, blogs and platforms. In the absence of clauses restricting the use and re-use of data, such as those found in copyright law (the so-called ‘forbidden reuse’ or one of the Creative Commons licences), it is not reasonable to assume a ‘presumed prohibition’. Of course, by analogy with copyright, one could always argue that ‘what is not expressly allowed is forbidden’, but this would be paradoxical, because it would result in a restriction on the circulation of data that the European regulation itself does not allow. Moreover, the way generative AI works has been known for some time, so if the processing of data for training purposes really were a problem, it would have been necessary to intervene immediately, before the genie got out of the bottle.

Moreover, and this is a critical point, before taking any action it would be necessary to establish where the personal data allegedly collected by OpenAI are actually located. If they are stored on servers outside the EU, their processing is subject first and foremost to the rules of the country in which they are held. It is true that the European Union is not alone in extending the reach of its legislation beyond the borders of its member states, but whether this can actually be done is highly questionable, given the political consequences such a decision would have for relations with the US, especially if it is applied in a partial and inconsistent manner.

The never-ending saga of the successive ‘Safe Harbour’ and ‘Privacy Shield’ frameworks (the US–EU data protection agreements), systematically annulled by the European Court of Justice, proves that sending data to the other side of the Atlantic is not an option. Yet the national data protection authorities of the EU member states have done little beyond a few statements and extemporaneous measures. Doing nothing may be politically necessary, but it is legally unacceptable. If personal data cannot lawfully be transferred to the US (or to other countries offering less legal protection than ours), then such transfers should not be allowed, regardless of expediency; otherwise one would have to conclude that compliance with the law is subservient to political necessity and that, therefore, the law is no longer above everything and everyone.

The empirical proof of this assertion lies in the inertia of the Italian (and not only the Italian) Data Protection Authority towards search engines, social networks, user-generated content platforms, non-EU DNS operators and non-EU software-as-a-service providers. OpenAI is certainly not the only entity that ‘processes’ Italian personal data outside national borders, nor is it the one that processes the most. It is therefore more than fair to ask on what criteria measures such as the one against OpenAI are based. In other words, if non-EU systems processing personal data are really dangerous, then they should all be blocked, not just a selected few.

Another objection raised by the Garante is that ChatGPT gives unreliable results and produces disinformation. So what is the problem? The service is avowedly experimental and should not be used for applications that have consequences for people. ChatGPT produces no more ‘disinformation’ than any search engine that does not vet its results for reliability, or than Wikipedia, which tries to contain the problem but has no overall control over its content. Those who use these systems, ChatGPT included, therefore do so at their own risk, peril and responsibility.

And while we are on the subject of responsibility, let us consider the issue of ‘protection of minors’.

It is not up to the data protection authority, which by the way is no stranger to such decisions, to take the place of those who, by law, exercise parental authority over minors: mothers, fathers and guardians. Allowing or forbidding the use of this service, as of any other, is therefore a matter for the minors’ ‘legal representatives’, who cannot plead ‘ignorance’ or ‘incompetence’, nor claim that it is ‘inevitable’ that ‘digital natives’ escape their control.

However, if the Garante does have the power to substitute itself for the exercise of parental authority, then that power should be exercised over all electronic communications and information society services used by minors. Apple, and the manufacturers that install Android on their devices, should then be sanctioned for not providing mechanisms to verify the identity of the user (for example, by requiring that a smartphone be activated by an adult on first use and then “associated” with a specific minor, thereby enabling “parental control” functions). Telecom operators who do not take adequate measures (and which ones would those be?) to check whether a device is being used by a minor should be sanctioned too. So should operators of payment systems whose ‘apps’ allow money to be exchanged and goods or services to be bought, even by minors. Will such measures ever be adopted?

Of course, none of this changes the fact that ChatGPT is also available to minors, and the existence of even worse situations does not ‘absolve’ OpenAI of the sin, any more than breaking the speed limit on the motorway is justified by the fact that other drivers are going even faster. In the case of ChatGPT, however, it is questionable whether children’s rights have been violated at all. Unlike other AI platforms, access to the service is not open: a contract must be entered into, and under Italian law only adults can do so with full legal effect. If a minor enters into the contract independently (perhaps by lying about his or her age), it is once again up to the parents to request its annulment under Articles 1425 and 1426 of the Civil Code. The approach of the Italian Civil Code is very reasonable because, on the one hand, it does not block transactions (which, it should be remembered, the law treats as a superior interest to be protected) and, on the other, it offers protection to the weaker party.

To sum up, the Garante’s order creates more problems than it solves. It invites serious political and economic criticism, because it calls into question the legitimacy of the entire US ecosystem built on platforms and the data economy without Italy having a valid alternative to offer citizens and businesses. It reinforces the principle of individual irresponsibility and civic disengagement, suggesting that – in order to use the umpteenth digital gadget launched on the market – one may give up asserting one’s rights, because someone else will take care of them. And it justifies adults abandoning their role as educators and guides of the vulnerable people who depend on them and are entrusted to them by nature, even before the law.
