The risk of using US subscription-based services

Adobe's blocking of Venezuelan accounts to enforce a US Presidential Executive Order calls the subscription-based business model into question.

Once a path is paved, it will not be walked just once. In other words: since the USA has started an extensive commercial confrontation with the EU and its member States, it is within the realm of possibility that IT companies and software manufacturers could be ordered to stop doing business with a Country.

The Adobe-Venezuela quarrel is different from the Google-Huawei story because, while the latter involves (at least in theory) two companies, the former is an act against a Country.

Building an entirely EU-based IT industry should be a top priority, but the European Commission and the member States do not seem to care.

Words do Matter

Words do matter, and that’s not just a philosophical issue.

Smart contracts are not contracts, cryptocurrencies are not legal tender, and Artificial Intelligence is very artificial and definitely not intelligent.

A poor understanding of the meaning of words leads to confused thinking and negatively affects the lawmaking process.

The CIA to provide evidence of Huawei's involvement with Chinese Military and Intelligence

Even if true, where is the beef? That Huawei was funded by the Chinese military and intelligence is not the issue. Have we already forgotten the “Fritz Chip”, the use of (Western-sponsored) State malware in intelligence and criminal investigations, and so on?

From a national security and public policy perspective, it is logical that a sovereign state explores every possibility to gain superiority over its foes, and over its “friends” too. Thus, if confirmed, the proof offered by the CIA of Huawei's involvement with the national security apparatus should come as no surprise.

If software were a military weapon

Software manufacturing is often compared to car building, and there are plenty of such analogies available, ranging from jokes to serious analysis.

A less-considered comparison is the manufacturing of military weapons as opposed to sporting weapons.

The history of the US Army contest in which Beretta prevailed over the German-Swiss Sig Sauer, securing the Italian company a rich supply contract for the “92” (renamed “M9” in the US Army naming system), is revealing.

The M9 was “the” most reliable gun on the market: able to fire thousands of rounds without malfunction, tough enough to withstand the harshest environmental conditions, and easy to both operate and maintain. Soldiers could rely upon this weapon to get the job done and not leave them stranded in critical moments.

How many pieces of software (from firmware to operating systems to platforms) are built like a Beretta M9?

Autonomous-driving and liability: a brief taxonomy

Summary: If you really want to regulate the field of autonomous driving, it would be better to establish, at last, the criminal responsibility of those who produce software, and to put an end to those shameful clauses in user licenses stating that “software is provided as is, and not suitable for use in critical areas”.

An interesting debate emerged from a discussion with Prof. Alessandro Cortesi on LinkedIn about the boundaries of legal responsibility for autonomous driving and the relevance of ethical choices in the instructions given to a vehicle's on-board computer for managing an accident.

Personally, in such a case, I find the appeal to ethics both useless and dangerous.

Ethics is an individual matter which, through the political mediation of representatives of groups that share the same ethics, is translated into legally binding rules. If the State takes ethics into its own hands, it opens the door to crucifixions, burnings at the stake and gas chambers.

On the “decision” in case of an accident: it is not the computer that “decides” but the person who programmed it, who (even if only in the form of possible malice or conscious negligence) takes responsibility for the degrees of autonomy (not decision) left to the software.

It is a fundamental and indispensable point not to transfer the legal consequences of behaviour from people to things.

Automated driving cannot be allowed in a way that violates by default the laws that regulate driving (conduct which, insofar as it complies with the law, is presumed to be harmless).

The point, if anything, is the management of the extraordinary event (the classic pedestrian who suddenly crosses the road): in this case, once again, the issue is the malfunctioning of the hardware or the poor design, programming or interaction of the software, no more and no less than what would happen in case of the failure of any other component.

Moreover, when the vehicle goes out of control, no computer can defy the laws of physics.