Facebook’s “smart glasses” stir up the same controversy that hit Google’s. But privacy is not the real problem with such devices. By Andrea Monti – Initially published in Italian by PC Professionale n. 368
The announcement of a Ray-Ban embedding ‘smart’ functions able to collect images, sounds and – eventually – data of any other kind, and to convey them to the wearer’s Facebook account, has aroused the usual short-sighted controversy over ‘privacy risks’ already heard at the time of Google Glass. These devices – like every IoT device – have drawbacks that go far beyond the largely irrelevant issue of ‘privacy’. They put a layer of middleware between the person and reality, a lens through which we are forced to interact with the world around us.
First of all, let us clear the field of the legal issues.
The Court of Cassation is unwavering in stating that there is no reasonable expectation of privacy in public places. In a public space, one cannot prevent others from seeing what one is doing and remembering what they have seen. So, whoever wants to reclaim their privacy must make sure that others cannot see or hear them. That is why the criminal code punishes unlawful interference with private life through audiovisual equipment, and why judges have deemed it unlawful to circumvent walls and fences, or to sneak into non-public areas of premises, to photograph VIPs or public figures. It also explains why, for example, the French courts considered it lawful to publish a report documenting daily life in the Paris metro, or a shot of a lady with her dog which, according to the judges, infringed neither her privacy nor her right to her personal image. Different is the case of those who collect images and videos indiscriminately, without any cultural or documentary purpose – the purposes at the basis of the street photography made famous by Cartier-Bresson, Vivian Maier and other less famous but no less talented photographers who capture splinters of reality.
A genuinely critical issue with Facebook’s Ray-Bans is, rather, the right to the protection of individual dignity, which is different from the right to ‘privacy’.
The debate on the aesthetics of misery that flared up in the world of photography during the 1970s comes in handy here. If it is abstractly legitimate to photograph a beggar or a person in misery, is it ethically correct to aestheticise suffering and pain? The answer lies in the purpose of the shot: it is one thing to document a situation as a social denunciation, quite another to pursue a purely ‘aesthetic’ end. Thus, the same image may or may not be ‘acceptable’ depending on the reason that prompted the photographer to take it. Such an argument is harder to sustain when the decision about the ‘if’ and the ‘what’ is delegated to a camera system that exercises no critical judgment over what the sensor should capture.
From a different point of view, the use of smart glasses is problematic not because they allow us to film our surroundings more or less conspicuously. What matters is that the data generated by the user end up in Facebook’s information ocean and increase its capacity for analysis, profiling and, therefore, the cultural and ideological steering of people. It is an essential distinction that shows, once again, how misleading it is to be obsessed with ‘privacy’ to the point of noticing nothing else.
I do not know whether profiling methods based on the intersection of Jung’s theories, Skinner’s theories and data science actually work. As with astrology, what matters is that people believe in them and modify their behaviour accordingly. When the British hired an astrologer during the Second World War to predict Hitler’s decisions, they did so not because they believed in astrology but because the Führer did.
The massive accumulation of the most disparate data on every aspect of an individual’s life is at the root of the disconnection between that individual and the surrounding world. We are facing the dystopian evolution of what Neal Stephenson (author of Cryptonomicon and steampunk sagas) had already foreseen in 1999 with In the Beginning… Was the Command Line.
In that book, Stephenson discussed the interfaces and monitors embedded in consumer electronics (video cameras, to be precise). Reality, he argued, was increasingly shaped by the creation of ‘fake real’ phenomena. Thus, Stephenson writes, when confronted with Disneyland’s Indian Jungle, the visitor faces a fiction that is a perfect replica of the original – even better than the original; not only that, but instead of looking at it directly, he does so through the mediation of his camera’s monitor. Let us press the fast-forward button and move through space as well as time. From fin de siècle Florida we arrive in today’s Seoul, where a perfect replica of the Trevi Fountain stands majestically in an underground station.
Disneyland, the South Korean Trevi Fountain and many similar initiatives fit perfectly into the decontextualisation of reality foreseen by Stephenson, but they are still, on balance, under control. That is not the case with wearable devices, which accelerate and extend this process at unprecedented speed and scale – and, above all, without control.
In conclusion, rather than worrying about Facebook’s Ray-Bans and asking the law to intervene, we should understand that the existence of a tool does not oblige us to use it. This holds for smart glasses, for the IoT and, in general, for all those tools that, under the pretext of making our lives easier, increasingly force us to live in a gilded cage. It may be gilded, but it is still, first and foremost, a cage.