Apple, Facial Recognition and the Right of Defense (plus, a sting at the GDPR)

The news is gaining momentum: Ousmane Bah, a student who was charged with multiple thefts from Apple Stores in several US cities, has filed a suit against the Cupertino-based company seeking one billion USD in compensation for having been wrongly identified by Apple as the author of these crimes. The decisive evidence that led to his involvement in the investigations — this is the basis of Mr. Bah’s claim — is that he was wrongly identified by a facial recognition system operated either by Apple or by a security company hired for the job.

The documented and unquestionable facts are:

  1. Mr. Bah has already been tried – and cleared – in Boston and New York (a third trial is pending in New Jersey),
  2. The Apple Store thief (or thieves) was identified by the operator of the Boston Apple Store’s video-surveillance system, who said he was able to connect the thief to Mr. Bah because the latter had previously been arrested for stealing items from an Apple Store in Connecticut,
  3. In the Boston trial, Mr. Bah’s counsel made a motion to obtain from the Defendant all the surveillance video that allegedly implicated him in the offenses in question. He was told, however, by an agent of the Defendant, identified only as “Greg C.”, that the video did not exist. Further, he was informed that the Defendant habitually does not keep records of “any responsive information” after an arrest is made (source: plaintiff’s complaint, page 6, para. 16),
  4. The statement released by “Greg C.” has been proven false, as the Boston District Attorney was later given the allegedly “non-existent” footage, which exonerated Mr. Bah (source: plaintiff’s complaint, page 8, para. 28),
  5. In the New York case, the person who appears on the Manhattan Apple Store’s footage is not Mr. Bah (source: statement of a New York Police Department detective), yet he was arrested nevertheless,
  6. Apple does use surveillance systems inside its stores,
  7. Apple, either directly or through a security company, has a way to extract biometric information from video footage and to connect it with other data, such as criminal records.

It is obviously too early to assess the merits of Mr. Bah’s claim; nevertheless, a few lessons can be learnt from this case:

  1. If Mr. Bah’s claims are proven true, Apple ran a fully automated processing operation that led to a mismatch between biometric data and personal identity,
  2. Apple lacks – entirely, or has only partially working – procedures to ensure that such a match is correct before reporting an individual to law enforcement,
  3. Contrary to what it may seem, the biggest fault – if any – was committed by the various police officers involved in the investigations. The NYPD detective acknowledged that the thief did not look like Mr. Bah, and the Boston prosecutor dropped the charges once he saw the “resurrected” footage. Apparently, it did not take much effort to compare the appearance of the thief with that of Mr. Bah. The police, therefore, should have been more careful before trusting Apple’s complaints.
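
To see why lesson 2 matters, here is a minimal sketch – purely illustrative, with made-up names, vectors, and threshold, and in no way a description of Apple’s actual system – of the kind of fully automated biometric matching the complaint implies: a face “embedding” extracted from store footage is compared against a database of embeddings tied to identities, and anyone who scores above a similarity threshold gets flagged, with no human verification in between.

```python
import math

def cosine_similarity(a, b):
    """Standard cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

def automated_match(footage_embedding, database, threshold=0.90):
    """Return every identity whose stored embedding exceeds the threshold.

    With no human review step, every above-threshold score becomes a
    report to law enforcement -- even though a similar-looking stranger
    can clear the same threshold.
    """
    return [
        (person, score)
        for person, embedding in database.items()
        if (score := cosine_similarity(footage_embedding, embedding)) >= threshold
    ]

# Toy data (hypothetical): the thief's embedding is close to, but not
# identical with, an innocent record on file -- and still crosses the bar.
database = {
    "record_on_file": [0.61, 0.79, 0.10],    # e.g. an old arrest record
    "unrelated_person": [0.10, 0.15, 0.98],
}
thief_from_footage = [0.60, 0.80, 0.12]

matches = automated_match(thief_from_footage, database)
print(matches)  # the record on file is flagged despite being a different person
```

The point of the sketch is that a threshold decision is probabilistic, not an identification: without a human comparing the flagged record against the footage – the very check the NYPD detective and the Boston prosecutor eventually performed – a false positive flows straight into a police report.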

And, finally, the million-dollar question: what if all this had taken place in the EU?

I can’t answer for the whole Union, but in Italy the first thing a court would have done, before issuing an arrest order, is to check whether the person captured in the footage at least plausibly resembled the suspect; if not, Mr. Bah would not have been arrested on such feeble evidence.

A special mention, though, is deserved by the footage-retention issue that arose in the Boston trial: at first, Apple’s officer declared that the footage of the thief was no longer available, as Apple deletes it once an arrest is made (but the footage then reappeared and exonerated the defendant). If Mr. Greg C.’s statement were true, it would have amounted to a clear infringement of the GDPR in the EU.

Firstly, the processing didn’t involve only security-camera footage, but other personal information such as criminal records. Therefore, in the Boston trial – and certainly in an EU court – it would have been necessary to ask for the disclosure of this further information, together with the way the matching between the criminal records and the footage is done.

Secondly: the existence of this integrated security system should have been disclosed to the people concerned, as it is a pre-emptive measure that, as happens with background credit checks, cannot be operated unbeknownst to the data subjects.

Thirdly: as soon as footage becomes the foundation of a claim, it cannot be deleted. The GDPR is crystal clear on this: fairness of processing in relation to criminal proceedings implies a duty on the plaintiff not to destroy the evidence of its own claims.
