Autonomous driving and liability: a brief taxonomy

Summary: If you really want to regulate the field of autonomous driving, it would be better to establish – at last – the criminal responsibility of those who produce software, and to put an end to those shameful clauses in user licenses stating that the software is provided "as is" and is not suitable for use in critical areas.

From a discussion with Prof. Alessandro Cortesi on LinkedIn, an interesting debate emerged on the boundaries of legal responsibility for autonomous driving and on the relevance of ethical choices in the instructions to be given to a vehicle's on-board computer for handling an accident.

Personally, in such a case, I find the use of ethics both useless and dangerous.

Ethics is an individual matter which, through the political mediation of representatives of groups sharing the same ethics, is translated into legally binding rules. If the State takes ethics into its own hands, it opens the door to crucifixions, burnings at the stake and gas chambers.

On the “decision” in case of an accident: it is not the computer that “decides” but the person who programmed it who (if only in terms of possible malice or conscious negligence) takes responsibility for the degrees of autonomy (not decision) left to the software.

It is a fundamental and indispensable point not to transfer the legal consequences of behaviour from people to things.

Automatic driving cannot be allowed in such a way as to violate by default the laws that regulate driving (conduct which, since it complies with the law, is presumed to be harmless).

The point, if anything, is the management of the extraordinary event (the classic pedestrian who suddenly crosses the road): in this case – once again – the issue is the malfunctioning of the hardware or the poor conception, programming or interaction of the software, neither more nor less than what would happen if any other component broke.

Moreover, when a car goes out of control, there is no computer that can defy the laws of physics.

We are all beta-testers… Still.

I firmly disagree with David Pogue’s Scientific American column dating back to November 2014, where the journalist wrote:

Part of our disgruntlement at being served flawed software probably stems from our conception of software itself – as something that is, in fact, finishable. Software used to come in boxes, bearing version numbers. We understood each as a milestone – a program frozen in stone.

But nowadays software is a living, constantly evolving entity. Consider phone apps: nobody seems to mind that new versions pour out constantly, sometimes many times a year. Or Web sites: they’re software, too, and they’re perpetually changing.

Maybe that’s why Adobe no longer produces boxed, numbered versions of Photoshop; instead the only way to get Photoshop is to subscribe to its steady evolution all year long.

Maybe it’s time to stop thinking about traditional programs any differently. Maybe we should get rid of frozen, numbered editions, much as Adobe has done.

That wouldn’t eliminate the frustration of bugginess, but at least we would comprehend software’s true nature: a product that is never finished.

The fact that software is an ever-evolving product (and not – as we say in the EU – a “copyrighted work”) doesn’t imply that it is fair to put a piece of crap on the market, telling people that “we’ll clean the toilet with the next version”. Because in the meantime, the stink… stinks.


Apple and the (unrequired) Safety by Design

An individual is ultimately responsible for the use of a technology. This is, in a few words, the conclusion of a decision issued by the Sixth Appellate District of the California Court of Appeal.

The merits of the controversy concerned a legal action brought by the victims of a car accident against Apple, accused – said the plaintiffs – of breaching a duty of care by designing FaceTime in such a way that it did not stop working while users were driving, thus distracting the driver and causing the accident.

In rejecting the claim, the Court found that not preventing the use of FaceTime while driving is neither a matter of duty of care nor a proximate cause of the injuries suffered in a car crash.

The Need for Currency Privacy. A Hard Truth About Bitcoin and Its Siblings

History, (financial) scams and criminal trials teach us a lesson: public institutions, companies and private citizens need cash to enter into “private” transactions.

Be it the unofficial payment of a political ransom, a black fund to hide management wrongdoing or an attempt to escape from the tax authorities, the assumption remains the same: currency privacy is an asset.