Nothing better than an official ban.
Just like with encryption.
We demand backdoors everywhere, because those with bad intentions obviously comply with these encryption algorithms and with the law.
And for the absurd fear of robots that become self-aware and rid themselves of humanity, they invented the kill switch.
As if that cannot be circumvented by the robot itself, assuming it is superior to human intellect.
It goes as usual: don't be the last to develop, buy or acquire that kind of robot.
Drones are remotely controlled to eliminate selected targets.
They will also be used by bad actors, as in the UK (where drones paralysed an airport).
Nothing is as inevitable as weapons. Whoever disarms himself is, in the long run, the one who gets burned.
The question ought not to be whether the treaty could do something about the proliferation of weapons of this kind, but whether humanity wants these weapons.
My opinion is that a "killer robot" is a further-developed, super-mobile landmine, and landmines are exactly what we are trying to ban right now.
The rationale for banning that last weapon is, moreover, that it is too difficult to take out of use, that it makes many innocent victims, that it often mutilates rather than kills (in some cases it is not even meant to kill), and that it is too often left to its fate (forgotten).
Opponents of a ban on landmines seize on precisely these characteristics, and argue that weapons can be made so smart that this kind of counterargument is eliminated.
The big point of discussion around the use of robots seems to be the "decision moment" in the choice between life and death. For many, having that decision made by a computer is apparently a bridge too far. I have to say, unfortunately, that in the early development of (land)mines this question about automation played hardly any role. The mechanism of, for example, the well-known "Claymore mine" (incidentally still in use by the Dutch armed forces and, if I am not mistaken, falling outside the ban) makes no distinction whatsoever when it is used with a tripwire. The mine can also be triggered with a remote control, and an obligation to attach a human assessor would mean that someone would always have to be watching the Claymore (to set it off). That is impractical, but for killer robots this is apparently how it is supposed to work, according to the developers.
I have little faith that killer robots will remain (well) supervised, and I expect they will ultimately have some of, if not the same, harmful effects that landmines have, if not much worse ones. As with landmines, responsible use depends mainly on the user. If robots become so "readily available" and so widely applicable through miniaturization, then I see a bleak future ahead. I keep thinking of the way the criminal scene in the Netherlands is now riddled with hand grenades; imagine miniature killer robots being just a Google search away.
In light of the latter: fortunately, we are by no means that far yet. But developments in AI, the global Internet and 5G follow each other so quickly that you would want much more information and many more rules. For lack of oversight and of trust in users, I therefore opt for agreements where possible, and bans where not.
No. If Mr Trump can cancel a treaty without any consultation, what is the point of concluding treaties? If the law of the strongest applies, then that should be said plainly, without any frills.