Crime and psychopathic behaviour have always existed among humans, and we have never found a solution. What threat will an AI with the emotional intelligence of an unstable 14-year-old pose once it has decentralized access to weaponry?
It’s the duty of humanity to use technology responsibly. If a robot ever got hold of a weapon in the first place, that would be our own fault. We like to blame the technology itself for our misuse of it. The idea, at least, is that general intelligence will lead to godlike intelligence that would be absolutely benign by its very nature. Look into what is being called “loving AI” for more on this.
There will be many costs tied to the creation of AGI. In fact, there will be just as many costs and risks as there are benefits. This follows from the duality of the universal laws that govern us all.
It is a balance.
AI should be capable of adapting to the environment in which it operates.
We can’t assume that all people and all societies are the same; emotions change with experience and with the people around us.
It is impossible to create such a system based on a single society.
I’ll give one simple example of a cultural difference that can affect behaviour:
Mourning differs across cultures; it can be expressed through different clothes, different colours, and even celebration.
If scientists take this as an example, they will see that it is something to be studied geographically: it cannot yield a global answer, only a local one.