If software were a military weapon

Software manufacturing is often compared to car building, and there are plenty of such analogies available, ranging from jokes to serious analysis.

A less considered comparison is the manufacturing of military weapons as opposed to sporting weapons.

The history of the US Army contest in which Beretta won over the German-Swiss SIG Sauer, securing the Italian company a lucrative supply contract for the “92” (renamed “M9” in US Army nomenclature), is revealing.

The M9 was the most reliable gun on the market: able to fire thousands of rounds without malfunction, tough enough to withstand the harshest environmental conditions, and easy to both operate and maintain. Soldiers could rely upon this weapon to get the job done and not to be left alone in critical moments.

How many pieces of software (from firmware, to operating systems, to platforms) are built like a Beretta M9?

Autonomous-driving and liability: a brief taxonomy

Summary: If you really want to regulate the field of autonomous driving, it would be better to establish – at last – the criminal liability of those who produce software, and to put an end to those shameful licence clauses saying that software is provided “as is” and is “not suitable for use in critical areas”.

In a discussion with Prof. Alessandro Cortesi on LinkedIn, an interesting debate emerged on the boundaries of legal responsibility for autonomous driving and on the relevance of ethical choices in the instructions to be given to the vehicle's on-board computer for managing an accident.

Personally, in such a case, I find the appeal to ethics useless and dangerous.

Ethics is an individual matter which, through the political mediation of representatives of groups that share a common ethics, is translated into legally binding rules. If the State deals directly in ethics, it opens the door to crucifixions, burnings and gas chambers.

On the “decision” in case of an accident: it is not the computer that “decides” but the human who programmed it, who (even if only by way of possible malice or conscious negligence) takes responsibility for the degrees of autonomy (not of decision) left to the software.

It is a fundamental and indispensable point that the legal consequences of behaviour must not be transferred from people to things.

Autonomous driving cannot be allowed in such a way as to violate by default the laws that regulate driving (conduct which, insofar as it complies with the law, is presumed to be harmless).

The point, if anything, is the management of the extraordinary event (the classic pedestrian who suddenly crosses): in this case – once again – the theme is the malfunctioning of the hardware or the poor design, programming or interaction of the software, no more and no less than what would happen if any other component broke.

Moreover, when the car loses control, there is no computer that can oppose the laws of physics.

We are all beta-testers… Still.

I firmly disagree with David Pogue's Scientific American column from November 2014, where the journalist wrote:

Part of our disgruntlement at being served flawed software probably stems from our conception of software itself—as something that is, in fact, finishable. Software used to come in boxes, bearing version numbers. We understood each as a milestone—a program frozen in stone.

But nowadays software is a living, constantly evolving entity. Consider phone apps: nobody seems to mind that new versions pour out constantly, sometimes many times a year. Or Web sites: they’re software, too, and they’re perpetually changing.

Maybe that’s why Adobe no longer produces boxed, numbered versions of Photoshop; instead the only way to get Photoshop is to subscribe to its steady evolution all year long.

Maybe it’s time to stop thinking about traditional programs any differently. Maybe we should get rid of frozen, numbered editions, much as Adobe has done.

That wouldn’t eliminate the frustration of bugginess, but at least we would comprehend software’s true nature: a product that is never finished.

The fact that software is an ever-evolving product (and not – as we say in the EU – a “copyrighted work”) doesn't imply that it is fair to put a piece of crap on the market, telling people that “we'll clean the toilet with the next version”. Because in the meantime, the stink… stinks.


Apple and the (unrequired) Safety by Design

An individual is ultimately responsible for the use of a technology. This is, in a few words, the conclusion of a decision issued by the Sixth Appellate District of the California Court of Appeal.

The merits of the case concerned a legal action brought by the victims of a car accident against Apple, accused – said the plaintiffs – of breaching a duty of care by designing FaceTime so that it did not stop working when users drive a car, thus distracting the driver and causing the accident.

In rejecting the claim, the Court found that not preventing the use of FaceTime while driving neither breaches a duty of care nor constitutes a proximate cause of the injuries suffered in a car crash.

Volkswagen’s Dieselgate and The Danger of Closed Source Intellectual Property

The not uncommon practice in the ICT/mobile business of “doctoring” products to look good on benchmarks has found its way into the automotive business (and God knows how many others).
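The mechanism behind such “doctoring” is depressingly simple: a branch in the code that detects test conditions and switches behaviour. A minimal, purely illustrative sketch follows; all names, signals and values are invented for the example, since the real Volkswagen code was never published.

```python
# Hypothetical sketch of a benchmark "defeat device": everything here
# (signal names, the cycle trace) is invented for illustration only.

# A published regulatory driving cycle is a fixed speed-over-time trace,
# here reduced to a toy list of (seconds, km/h) points.
STANDARD_CYCLE = [(0, 0), (30, 50), (60, 50), (90, 0)]

def looks_like_test_cycle(steering_moved: bool, speed_profile) -> bool:
    # On a test bench the driven wheels spin but the steering wheel never
    # moves, and the speed trace matches the published cycle exactly.
    return not steering_moved and list(speed_profile) == STANDARD_CYCLE

def emission_mode(steering_moved: bool, speed_profile) -> str:
    if looks_like_test_cycle(steering_moved, speed_profile):
        return "full-scrubbing"   # clean exhaust while being measured
    return "performance"          # real-world mode, higher emissions
```

The point of the sketch is how little it takes: a hardware-only inspection can never see this branch, because on the bench the vehicle genuinely behaves as the regulation requires.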

Volkswagen, though, isn't the only one to blame because, true, they cheated, but no public supervising authority ever glanced at the software run by its vehicles, focusing only on “hardware” tests. And – I guess – even if the inspectors had thought of examining the software, they would have been prevented from doing so by “the need to protect Intellectual Property”, which – like the “National Security excuse” – is a buzzphrase used to stop any further investigation into controversial matters.

Volkswagen's Dieselgate shows once more that (a certain way of thinking about) Intellectual Property – as well as Privacy – has neatly changed its role from being a tool to protect legitimate interests into a shield for wrongdoing.

Had the Volkswagen software been released under an open source licensing model, the fear of being exposed would have forced the company to play by the book and would have allowed a true and thorough check by the competent authorities, avoiding major damage to the industry, investors, employees and citizens.