I agree that empathy is a very subjective experience for us humans, and our state of mind, culture, etc. can greatly affect how we recognise and express it.
But I believe it is possible to leverage the logical aspect of ‘emotional intelligence’ without the cumbersome baggage that comes with the actual emotions.
There are certain generic, logical traits of empathy that run through all human cultures; even amongst humans, emotional empathy usually results in a certain protocol being followed or required.
Say you have to contact a person about a business proposition but are aware they suffered a bereavement yesterday.
Whilst as humans we can emotionally empathise with their loss, the logic of the empathy is to leave the call until a later date, because they are very likely to be compromised by their emotions, which would negatively influence our required outcome.
An AGI needs to recognise emotional states in others and act accordingly; the machine does not need to ‘feel’ their pain, it just needs to feign an empathic response.
Humans do it all the time… lol.
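To make the point concrete, here is a minimal sketch in Python of that ‘logical empathy’ protocol. Everything in it is an illustrative assumption on my part — the EmotionalState labels, the Contact structure, the deferral rule, and the seven-day grace period — not a claim about how a real AGI would implement this.

```python
# A minimal sketch of "logical empathy": defer an interaction when the
# other party is likely emotionally compromised, and feign an appropriate
# empathic acknowledgement. All names and rules here are hypothetical.

from dataclasses import dataclass
from datetime import date, timedelta
from enum import Enum, auto


class EmotionalState(Enum):
    NEUTRAL = auto()
    GRIEVING = auto()
    DISTRESSED = auto()


@dataclass
class Contact:
    name: str
    state: EmotionalState  # assumed to come from some recogniser upstream


def plan_business_call(contact: Contact, today: date) -> str:
    """Apply the logical-empathy rule: if the contact is likely
    compromised by emotion, defer the call; otherwise proceed."""
    if contact.state in (EmotionalState.GRIEVING, EmotionalState.DISTRESSED):
        # No felt emotion required: the protocol alone dictates deferral,
        # plus a feigned (but socially correct) empathic message.
        later = today + timedelta(days=7)  # arbitrary grace period
        return (f"Defer call to {contact.name} until {later.isoformat()}; "
                f"send condolences first.")
    return f"Proceed with call to {contact.name} today."


if __name__ == "__main__":
    print(plan_business_call(Contact("A. Smith", EmotionalState.GRIEVING),
                             date(2024, 5, 1)))
```

The point of the sketch is that the deferral decision is pure logic over a recognised state label; nothing in it requires the machine to feel anything.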
EDIT: A biased emotional state has never aided an intelligent human decision; on the contrary, keeping a ‘calm, level head’ is revered as the wisest received advice.
When a machine comforts a human who is sad, I think it will be good to know it has no ulterior motives: it’s not feeling sorry for you… it’s just doing the logical, morally kind thing for a fellow intelligent being.