RE: Is Ava Conscious? An Evaluation of the Movie Ex Machina

in #philosophy · 7 years ago

Well, I believe self-awareness and empathy are two different things. The proof of that is psychopaths (and sociopaths too). They can use self-awareness, imagination, manipulation, sexuality, and even feign empathy, but they do it using only their minds, and they only engage their emotions at the command of their thoughts and when it serves their purpose.

I believe true emotions (the ones evoked unconsciously, which produce changes in the host's physiology) have a purpose: they connect us, or associate us, with the recipient of our emotions and with the situation in general in an emotional way, engaging much deeper existential levels of data for your subconscious than a merely mental process. Whether those emotions are good or bad doesn't matter for our emotional processing of the situation, but they have to be truly felt for that type of processing to happen. Otherwise it is just a mental process.

Let me give an example: you can program a computer with an artificial android finger attached to it to simulate feeling pain when you stick a needle in it, but that is just a simulation, a "mental" construct. A totally different thing would be if we could program that computer to actually feel the pain. If that pain were truly felt, it would compel the computer to ask itself the following questions:

  • how do I avoid this feeling?
  • can this kill/break me, and what does it mean for my survival?
  • is the one causing the pain a friend or a foe?

These are only three of perhaps a thousand possible questions, but the common factor is that they are all compelled/forced/elicited by pain.
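The difference between simulating a pain response and having pain actually drive behavior can be sketched in code. This is a toy illustration only; the class names, the `integrity` value, and the `on_needle` method are all invented for the example and make no claim about how a real AI would work:

```python
# Toy sketch: a scripted pain response vs. a pain signal that
# actually compels behavior. All names here are invented.

class SimulatedFinger:
    """Reports pain, but the report affects nothing else."""
    def on_needle(self):
        return "Ouch!"  # a scripted output, a purely "mental" construct


class FeelingAgent:
    """A pain signal that forces the agent to re-evaluate its state."""
    def __init__(self):
        self.integrity = 1.0          # the agent's physical condition
        self.avoid = set()            # things learned to stay away from
        self.threat_assessments = []  # questions pain forces it to ask

    def on_needle(self, source):
        self.integrity -= 0.1            # the damage matters to the agent
        self.avoid.add("needle")         # "how do I avoid this feeling?"
        if self.integrity < 0.5:         # "can this break me?"
            self.threat_assessments.append("survival at risk")
        self.threat_assessments.append(  # "friend or foe?"
            f"is {source} a friend or a foe?"
        )


agent = FeelingAgent()
agent.on_needle("unknown hand")
print(agent.avoid)                # {'needle'}
print(agent.threat_assessments)   # ['is unknown hand a friend or a foe?']
```

In the first class the "pain" is inert output; in the second, the same stimulus changes the agent's state and future behavior, which is the distinction the paragraph above draws.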

So a computer unable to actually feel emotions, especially pain, would lack the best self-preservation mechanism all animals (including us) have. It would also lack true empathy, and the growth potential, reflectivity, and adaptability that come from our emotions. It would be unable to truly love, which for us "natural" beings is kind of a big thing, since all animals love in one way or another, and even psychopaths are able to love themselves (though not others, or not nearly as much as themselves). And love is what ENABLES us to actually HELP people without expecting something in return.

So unless we actually discover a way to let machines feel (and I don't mean programming their feelings beforehand, but rather building a physical matrix, with nerves and everything, to communicate bodily sensations and feelings within a machine), then the minute we create true AI, if we don't impose some kind of strict ruleset on them, we will be creating a race of cyber-psychopaths. Of that particular thing I'm 100% sure, because advanced intelligence combined with the ability to feel nothing is part of the very definition of a psychopath.

Also, you might want to check this article: https://www.verywell.com/the-purpose-of-emotions-2795181


I forgot to tell you: congrats on your excellent article, leaky20! I really enjoyed reading it, maybe as much as I enjoyed the movie itself!

Thanks for the comment. You raise a lot of good points. I agree with what you are saying about self-awareness and psychopathy and also that emotions have specific functions. I actually wrote a Steemit article a while back on the function of emotions.

I never really considered that Ava could be self-aware but also a psychopath. That is interesting and gives me something to think about. I also agree with what you are saying about pain.

In relation to empathy and love, I think they are two things that we may never be able to artificially construct.

  1. Empathy is largely based on "mirror neurons," which are brain cells that fire in a person's brain when they watch someone else experience something. Basically, the same neurons fire in both individuals' brains. Mirror neurons allow us to connect to each other and experience the same feelings despite them happening to someone else.
  2. Love is largely based on combinations of neurotransmitters that are released in a person's brain at certain moments and during certain experiences. Oxytocin is one such chemical. It is released during moments of connection, like when humans hug, touch, or share intimate moments with each other. Intimate does not necessarily mean sexual - personal emotional conversations can also release oxytocin. Basically, the neurotransmitter acts as a bonding hormone.

I think it will be hard for us humans to ever successfully re-create neurotransmitters and brain processes like mirror neurons, so I think this will also be a limitation on creating true AI.

Hopefully this reply makes sense. It's a bit hard to write about such complex topics in a comment section (as I'm sure you realize). But thanks again for the comment; it was very insightful and made me think further about the topic.

Just want to add my thoughts to this chain, in addition to what I already commented.

Psychopathy, to me, is a lower level of consciousness, because psychopaths lack empathy. I don't think psychopaths can experience all of reality, because they are missing this part. In other words, they're less conscious of reality because they miss part of experiencing it.

Thoughts @wolfenlord?

@steemjaunty That's interesting. I don't know enough about that to draw a sound conclusion, but it's an idea to consider. I'll keep thinking about that one.

Actually it does. I don't have a background in medicine, so I have to mentally translate everything into my background in computing. I believe we might not have to recreate neurotransmitters, or chemical reactions, at all. Think about function: we don't need to mirror the biology exactly, only its function, and make it adaptive enough so that with self-awareness also comes the possibility to grow.
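One way to read "mirror the function, not the biology" is as a simple adaptive state variable: instead of simulating a neurotransmitter chemically, expose a value that changes with experience and modulates future behavior. This is a purely speculative sketch; the class, method names, and numeric constants are all invented for illustration:

```python
# Speculative sketch of "function over biology": no chemistry is
# modeled, only the *effect* a bonding signal has on behavior.
# All names and numbers here are invented for the example.

class FunctionalBond:
    def __init__(self):
        # Stands in for what a bonding hormone *does*, not what it *is*.
        self.strength = 0.0

    def shared_experience(self, intensity):
        # Positive interactions strengthen the bond, with
        # diminishing returns as strength approaches 1.0.
        self.strength += intensity * (1.0 - self.strength)

    def willing_to_help(self):
        # Behavior adapts as the bond grows - the functional outcome.
        return self.strength > 0.5


bond = FunctionalBond()
for _ in range(3):
    bond.shared_experience(0.3)
print(bond.strength)         # grows toward 1.0 with each interaction
print(bond.willing_to_help())
```

The point of the sketch is that the agent's behavior changes the same way it would if a "real" bonding chemical were present, which is all a functional account requires.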

Hmm, I see. Yeah, that's a good point. We can create a program that has a specific function.

I guess in that case it wouldn't be a true emotion but rather would mimic an emotion. I suppose we also need to consider the fact that a robot with AI is not a human, that it is its own thing (like a species), and so does not necessarily have to behave exactly like a human. Things like love, empathy, and emotions in general are human attributes. Though we may want to impart those attributes onto AI to make them more like us and more relatable, that may not be necessary. Maybe robots should have the freedom to be their own "organism?" There is a lot to consider regarding the issue.

Nice delve into neuroscience! Love it!