
Thanks for the comment. You raise a lot of good points. I agree with what you are saying about self-awareness and psychopathy and also that emotions have specific functions. I actually wrote a Steemit article a while back on the function of emotions.

I never really considered that Ava could be self-aware but also a psychopath. That is interesting and gives me something to think about. I also agree with what you are saying about pain.

In relation to empathy and love, I think they are two things we may never be able to artificially construct.

  1. Empathy is largely based on "mirror neurons", which are brain cells that fire in a person's brain when they watch someone else experience something. Basically, the same neurons fire in both individuals' brains. Mirror neurons allow us to connect to each other and experience the same feelings despite them happening to someone else.
  2. Love is largely based on combinations of neurotransmitters that are released in a person's brain at certain moments and during certain experiences. Oxytocin is one such neurotransmitter. It is released during moments of connection, like when humans hug, touch or share intimate moments with each other. Intimate does not necessarily mean sexual - personal emotional conversations can also release oxytocin. Basically, the neurotransmitter acts as a bonding hormone.

I think it will be hard for us humans to ever successfully re-create neurotransmitters and brain processes like mirror neurons, so this will also be a limitation to creating true AI.

Hopefully this reply makes sense. It's a bit hard to write about such complex topics in a comment section (as I'm sure you realize). But thanks again for the comment, it was very insightful and made me think further about the topic.

Just want to add my thoughts to this chain, in addition to what I already commented.

Psychopathy, to me, is a lower level of consciousness because psychopaths lack empathy. I don't think psychopaths can experience all of reality because they are missing this part. In other words, they're less conscious of reality because they miss experiencing part of it.

Thoughts @wolfenlord?

@steemjaunty That's interesting. I don't know enough about that to draw a sound conclusion, but it's an idea to consider. I'll keep thinking about that one.

Actually it does. I don't have a background in medicine, so I have to mentally translate everything to my background in computing. I believe we might not have to recreate neurotransmitters, or chemical reactions at all. Think about function: we don't need to mirror its biology exactly, only its function, and make it adaptive enough so that with self-awareness also comes the possibility to grow.

Hmm, I see. Yeah, that's a good point. We can create a program that has a specific function.

I guess in that case it wouldn't be a true emotion but rather it would mimic an emotion. I suppose we also need to consider the fact that a robot with AI is not a human, that it is its own thing (like a species), and so does not necessarily have to behave exactly like a human. Things like love, empathy, and emotions in general are human attributes. Though we may want to impart those attributes onto AI to make them more like us and more relatable, that may not be necessary. Maybe robots should have the freedom to be their own "organism?" There is a lot to consider regarding the issue.
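Just to make the "function, not biology" idea concrete, here is a minimal toy sketch in Python. Everything in it is made up for illustration (the Agent class, the bond and mood numbers, the weighting) - it's not any real AI system, just one possible way a program could play the functional role of a bonding hormone and of mirroring another agent's state:

```python
# Toy sketch: "function over biology". All names and numbers here are invented
# for illustration; this is not a real model of oxytocin or mirror neurons,
# only something that plays a loosely similar role in behavior.

from dataclasses import dataclass, field


@dataclass
class Agent:
    name: str
    mood: float = 0.0                          # felt state, -1.0 (distress) .. 1.0 (content)
    bonds: dict = field(default_factory=dict)  # per-partner attachment, standing in for oxytocin's role

    def share_moment(self, other: "Agent", warmth: float) -> None:
        """A positive interaction strengthens the bond on both sides,
        loosely mimicking what a bonding hormone does functionally."""
        self.bonds[other.name] = self.bonds.get(other.name, 0.0) + warmth
        other.bonds[self.name] = other.bonds.get(self.name, 0.0) + warmth

    def observe(self, other: "Agent") -> None:
        """A crude 'mirror neuron' analogue: seeing another agent's state
        nudges our own state toward it, scaled by how bonded we are."""
        bond = self.bonds.get(other.name, 0.0)
        self.mood += bond * (other.mood - self.mood) * 0.5


if __name__ == "__main__":
    ava = Agent("Ava")
    caleb = Agent("Caleb", mood=-0.8)    # Caleb is distressed
    ava.share_moment(caleb, warmth=0.6)  # they bond a little
    ava.observe(caleb)                   # Ava's mood shifts toward Caleb's
    print(ava.mood)                      # negative: a functional echo of empathy
```

Whether something like this "counts" as empathy or only mimics it is exactly the open question above.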

Nice delve into neuroscience! Love it!