The Italian data protection authority recently ruled on an Italian company’s unlawful use of Google Analytics. The ruling exposes the ‘holes’ in the GDPR, the attempt to plug them at all costs, and the inability (or unwillingness) to fully pursue the political choice of protecting Europe’s digital sovereignty. By Andrea Monti, initially published in Italian on Strategikon, an Italian Tech blog.
The ruling is formally effective only against the company that was investigated. In practice, however, its reasoning applies in more general terms, so it amounts to a sort of FAQ on whether one can keep using Google Analytics on one’s own site.
Let’s put it plainly: the Italian Data Protection Authority said in clear terms that Google Analytics cannot be used, because the way the service is designed allows the American authorities to access even the personal data of European users. There are no ‘security measures’ that Google’s clients can adopt, or think of contractually imposing on the US multinational, that would change this.
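To make the point concrete, here is a minimal sketch (not taken from the ruling) of the kind of client-side setting a Google Analytics customer actually controls, assuming the standard gtag.js snippet for a Universal Analytics property; ‘UA-XXXXXXX-X’ is a placeholder. Whatever is toggled here, the visit data still travels to Google’s infrastructure, which is the crux of the Authority’s objection.

```typescript
// Sketch of the configuration a site owner can set in the gtag.js snippet.
// 'UA-XXXXXXX-X' is a placeholder property ID.
declare function gtag(...args: unknown[]): void; // global provided by the gtag.js loader script

gtag('js', new Date());
gtag('config', 'UA-XXXXXXX-X', {
  anonymize_ip: true, // asks Google to truncate the visitor's IP address
});
// Even with this option, the request itself still reaches Google's servers,
// so the site owner cannot prevent the transfer the Authority objects to.
```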
That the overwhelming power of Google, and of Big Tech, over our data is a severe problem is evident. That the massive accumulation of data about European citizens (pardon me, users) is a market-distorting factor is equally clear. Nor is there any question that American national security policy also relies on the private technology industry. However, the nobility of the end, stemming the flow of personal data to the US, hardly justifies the use of an improper means, i.e. stretching the GDPR beyond what it actually says.
The Authority pursued an interpretation of the Regulation that makes it say something written nowhere in it: that personal data would also include data that are entirely anonymous to the operator of a site but that Google can deanonymise entirely on its own. Thus, ‘personal data’ would be ‘unique online identifiers that allow the identification both of the browser or device of the user visiting the website and of the website operator itself (through the Google Account ID); the address, name and navigation data of the website; the IP address of the device used by the user; and information related to the browser, the operating system, the screen resolution, the selected language, and the date and time of the visit to the website’.
However, a simple experiment is enough to expose the weakness of the Authority’s reading of the law.
First, access any WordPress-based blog without registering. Then check what data one can obtain using Matomo, the ‘data protection friendly’ competitor of Google Analytics. At the end of this simple process, one discovers two things:
- there is no way of knowing who is behind the terminal used for the connection. So, we are not talking about personal data, and the GDPR does not apply;
- Matomo collects essentially the same data as Google Analytics, with the difference that the former does not cross-reference the anonymous data with other information, while the latter almost certainly does (a sketch of the kind of data involved follows this list).
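For readers who want to see what ‘the same data’ looks like in practice, here is a rough sketch, in TypeScript, of the metadata a Matomo- or Analytics-style tracker can read in the browser. The field names are illustrative, not taken from either product’s API, and the IP address does not appear because it is observed server-side rather than collected by the script.

```typescript
// Metadata a typical analytics script can read from standard browser APIs.
interface VisitMetadata {
  url: string;              // address of the page visited
  referrer: string;         // where the visitor came from
  userAgent: string;        // browser and operating system
  language: string;         // selected language
  screenResolution: string; // screen resolution
  visitedAt: string;        // date and time of the visit
}

function collectVisitMetadata(): VisitMetadata {
  return {
    url: window.location.href,
    referrer: document.referrer,
    userAgent: navigator.userAgent,
    language: navigator.language,
    screenResolution: `${window.screen.width}x${window.screen.height}`,
    visitedAt: new Date().toISOString(),
  };
}

// Nothing in this payload tells the site operator who is behind the terminal;
// whether it stays anonymous depends on what the recipient cross-references it with.
console.log(JSON.stringify(collectVisitMetadata()));
```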
This simple experiment exposes the limits of the GDPR’s design. To apply the Regulation to those who use Google Analytics without identifying users, it would be necessary to demonstrate that every single IP address sent to Google is actually deanonymised, because the GDPR applies to individual persons and not to generic categories of subjects. Moreover, one would have to overcome the fact that the GDPR applies to those who process personal data, not to those who collect anonymous data and forward them, as such, to a third party that could then deanonymise them.
It is that third party, instead, that should be subject to checks and controls, because it is the one that aggregates the information and thus carries out processing regulated by the European rules. In that case, the Authority should also have investigated Google to verify, for instance, whether, like Matomo, it intended to run the analytics platform without cross-referencing the anonymous data it received with the data it already possessed, or whether it allowed users of the service to autonomously protect the data concerned so as to avoid cross-referencing with other information. If this were the case, the anonymous data sent by the various websites around Europe would remain anonymous, and the problem would be solved. Otherwise, Google would have to comply with the obligations and requirements imposed by national data protection authorities by interacting directly with each individual whose data it processes.
The issue of who is directly subject to the GDPR, however, does not only concern Google: it also involves the Big Tech companies taking part in the digitisation of Italian public services, such as Cloud PA.
It would therefore be legitimate to expect, indeed to demand, that the Authority continue along this courageous path. It should start a comprehensive inquiry into all the data collection and analysis tools used by Big Tech, and it should provide clear directions to public bodies and businesses instead of leaving them prey to fear, uncertainty and doubt. If, on the other hand, the Data Protection Authority were to remain inert, the temptation would be too strong to think that this is a race to the bottom, that things are fine as they are and that, as they say in Rome, a chi tocca, nun se ’ngrugna: when your turn comes, do not complain.