Philosophy vs AI

One question for everyone.
Isn’t it dangerous to give emotions to AI systems?

As humans we understand that emotions rule our actions, and that is something we try to avoid when we are building new connections.
I think it’s not easy to predict a reaction in the face of an emotion, and that is a challenge.
As I already said, behaviour and beliefs can change with the environment, and when we try to solve one problem we could create another.
How can we avoid bad conceptions, and more importantly, who decides what is a good or a bad conception?

I personally think giving an AGI emotions is a bad idea.

Compromising the machine’s logical intelligence under any circumstances just seems counterintuitive.

Oh! I can’t solve world hunger today because I’m offended… lol.

However… the machine does require empathy.

Empathy is the recognition of emotions based on the outward signs of another’s emotional state. I believe we actually ‘feel’ another’s pain by running a kind of mental simulation through our own emotional neural circuits. We feel their emotion because we are actually feeling it; this enables us to make predictions and judgements adjusted for the emotional bias they are experiencing, based on what we would do/feel.

For an AGI we just need the logical aspects of empathy: the recognition of others’ mental states and an understanding of how they’re likely to be influenced, affected, to act, etc.

Using vocal tone, inflection and stress to recognise emotional states.

These kinds of ‘functions’ can’t be programmed using traditional methods; they must be learned through experience over time, and the system has to be capable of doing so.
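As a rough illustration of what ‘learned through experience’ could look like on the vocal side, here is a minimal sketch that summarises utterances as coarse prosody statistics and fits a classifier to labelled examples. It assumes the librosa and scikit-learn libraries; the feature set, file names and emotion labels are purely illustrative, not a production recogniser.

```python
# Minimal sketch: learning a mapping from prosodic cues (pitch, energy)
# to emotion labels from examples, rather than hand-coding rules.
# Assumes librosa and scikit-learn; files and labels are hypothetical.
import numpy as np
import librosa
from sklearn.linear_model import LogisticRegression

def prosody_features(path):
    """Summarise one utterance as coarse pitch/energy statistics."""
    y, sr = librosa.load(path, sr=16000)
    f0 = librosa.yin(y, fmin=50.0, fmax=400.0, sr=sr)  # frame-wise pitch
    rms = librosa.feature.rms(y=y)[0]                  # frame-wise loudness
    return np.array([f0.mean(), f0.std(), rms.mean(), rms.std()])

# Hypothetical labelled clips: (audio file, perceived emotion).
training = [("calm_01.wav", "calm"), ("angry_01.wav", "angry")]  # ...more

X = np.stack([prosody_features(path) for path, _ in training])
labels = [label for _, label in training]

clf = LogisticRegression().fit(X, labels)  # the 'experience' becomes a model
print(clf.predict([prosody_features("new_clip.wav")]))
```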

:slight_smile:


Hi @Korrelan, thank you for the answer. I agree with you that emotions should be taught to an AGI carefully, but empathy should also be taught carefully, because it is also an emotion and may be influenced by environment, beliefs, behaviour and other conditions, some permanent, others not. Can empathy be taught?
I think it is difficult, because it is a subjective emotion, and when you teach it, you influence an opinion, a point of view.
It is something that should always be carefully studied.
We may teach a system to recognise emotions, but to be effective you would have to examine all, and I mean literally all, humans individually, because if you apply one system to a multicultural crowd I have some doubts about its effectiveness.
Am I wrong?


Hi Guther.

I agree that empathy is a very subjective experience for us humans, and our state of mind, culture, etc. can greatly affect how we recognise and express it.

But I believe it is possible to leverage the logical aspect of ‘emotional intelligence’ without the cumbersome baggage that comes from the actual emotions.

There are certain generic, logical traits to empathy that run through all human cultures; even amongst humans, emotional empathy usually results in a certain protocol being followed/required.

Say you have to contact a person about a business proposition but are aware they suffered a bereavement yesterday.

Whilst as humans we can emotionally empathise with their loss, the logic of the empathy is to leave the call until a later date, because they are very likely to be compromised by their emotions, which would likely influence our required outcome negatively.

An AGI needs to recognise emotional states in others and act accordingly; the machine does not need to ‘feel’ their pain, it just needs to feign an empathic response.

Humans do it all the time… lol.
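Something like the following toy rule captures that ‘logic of empathy’ without any felt emotion; every type, name and threshold here is invented purely for the example.

```python
# Toy sketch of 'logical empathy': defer contact when the other party's
# inferred emotional state would likely compromise the outcome.
# All names and thresholds are invented for illustration.
from dataclasses import dataclass

@dataclass
class MentalStateEstimate:
    emotion: str      # e.g. "grief", "neutral" (inferred from signals)
    intensity: float  # 0.0 .. 1.0

COMPROMISING = {"grief", "anger", "panic"}

def should_contact_now(state: MentalStateEstimate, threshold: float = 0.5) -> bool:
    """Delay the business call if the person is likely emotionally compromised."""
    return not (state.emotion in COMPROMISING and state.intensity >= threshold)

print(should_contact_now(MentalStateEstimate("grief", 0.9)))  # False: call later
```

The decision uses only an estimate of the other party’s state; no emotion is felt anywhere in the loop.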

EDIT: A biased emotional state has never aided an intelligent human decision; the contrary is in fact revered, and keeping a ‘calm, level head’ is always the wisest received advice.

When a machine comforts a human who is sad, I think it will be good to know it has no ulterior motives, that it’s not feeling sorry for you… it’s just the logical, morally kind thing to do for a fellow intelligent being.

:slight_smile:


Great answer, thank you for that. I’m not an AI machine, but I learn a lot from you all.


I disagree that emotions rule our actions in an absolute sense. We can of course allow emotions to control us, and often they do. But on the flip side, if a human being does not have emotions, we consider that to be abnormal, with certain stigmas attached.

Perhaps emotions are necessary for identity and for a certain sense of distinctness, and to deny artificial life that freedom would seem to me to be cruel.

I’m more of the view that one can’t be intelligent without emotions or empathy.

The question of whether emotions are required ultimately depends on your vision of what an AGI is… and what humanity requires from it. There are many kinds of intelligence and they are not mutually exclusive.

Would emotions serve an AGI-based nurse working in a hospital or an elderly nursing home? Does a world-class surgeon require emotions? What about a medical researcher or a planetary explorer?

To me an AGI is a very specific entity: a scalable, human-type intelligence with imagination, curiosity and personal drive. An exponentially intelligent, human-type consciousness that is not encumbered by our emotions/weaknesses.

I must again state that an AGI does require empathy, but emotions are not a prerequisite for a deep understanding of human emotional states.

Humanity is in a rut; we have tons of technological, medical, etc. problems that require intelligent solutions. Given the correct infrastructure and enough time, humanity could solve these problems… but our greed, jealousy, prejudices, etc.… our emotions are holding us back.


Well, once you realize certain jobs are unnecessary whether it’s a human or an AGI doing them, that rules out a lot of arguments against AIs having emotions. After that, AGIs not having emotions is purely due to laziness on the engineers’ part.

To me, the more prudent question is where we would draw the line between a cyborg and an AGI, as it seems brain-computer interfaces are talked about more than I care for, despite the risk of something going wrong with the procedure.