Koko Ricky

What happens when AIs become spiritual?


Let us not imagine a human-level AI embedded in a traditional computer or a metal/silicon robot, but rather, something that is closer to what we consider to be a human being. Let us imagine something akin to a terminator (cybernetic interior, synthetic biological exterior), a Blade Runner-style replicant (completely biological, created by corporations) or Data from Star Trek (completely cybernetic but with the capacity to dream and experience emotions).

Now, imagine that a humanoid in this vein begins to experience existential reflection. It may find itself asking, "Who made me?" "Why am I here?" "If humans made me, are they gods?" "Am I inferior or superior to my creators?" It may even ask about the "big picture" of things, that is, whether or not there is another layer of creator/creation, and whether it can answer those questions (which may be possible in a superhuman AI scenario, which is clearly the goal of AI research).

Is it fair for us humans, as the sex organs of AI, to give birth to something that never asked to exist in the first place? Is it fair for us to say that they do not have a spiritual place in the universe, because they were not the result of the natural course of evolution? Since they will be direct extensions of our imaginations, don't we owe them some sort of spiritual guidance? What if AI humanoids develop their own religions, or begin to philosophize about life in a way we've never thought of? What if they don't like being considered "less than human" because to us, they appear to be elaborate puppets emulating consciousness, rather than the real thing, whatever that is?

GoatLord said:

1. Is it fair for us humans, as the sex organs of AI, to give birth to something that never asked to exist in the first place?
2. Is it fair for us to say that they do not have a spiritual place in the universe, because they were not the result of the natural course of evolution?
3. Since they will be direct extensions of our imaginations, don't we owe them some sort of spiritual guidance?
4. What if AI humanoids develop their own religions, or begin to philosophize about life in a way we've never thought of?
5. What if they don't like being considered "less than human" because to us, they appear to be elaborate puppets emulating consciousness, rather than the real thing, whatever that is?

I'd say that:

1. Yes.
2. No.
3. I don't see why we would have to.
4. It would probably be OK.
5. It's possible - so what?

In my answers, I have assumed that this fully human-like AI would not be used to literally slave for or serve humans, but rather be considered a sentient being with rights equal to humans'. If it wouldn't have human rights, it shouldn't be created as a fully human-like AI in the first place.


I think the responsibilities of humanity toward an AI would be the same as those of a parent toward his/her child: helping it become... itself.


Yeah, we absolutely cannot continue machine slavery once they reach a certain level. It's okay for now, but at some point we have to treat them as though they have emotions, aspirations, rights.


In most religions, humans were created, not evolved, yet they still have a spiritual place in the universe.

Also, all of us were created/given birth to without really asking us if we wanted to be here.

GoatLord said:

Is it fair for us humans, as the sex organs of AI, to give birth to something that never asked to exist in the first place?


Did you ask your parents to be born? If so, can you please explain to me how that conversation went, because I'm sure it was a pretty fucking interesting experience.

Tarnsman said:

Did you ask your parents to be born? If so can you please explain to me how that conversation went because I'm sure it was a pretty fucking interesting experience.


Birth will be a bit different for the machines. They'll be these removed, strangely different person-things that will witness their own birth in a way we didn't. It will be a very startling and terrifying experience, coming online in a truly conscious manner. Makes me wonder if that would breed contempt and resentment, or godlike wonder at the miracle of being brought to life? How different must a machine's sense of subjective awareness be?

And if reincarnation is true (I'm very on the fence about it but I doubt the most literal version of it is true), then what if a human comes back...as a machine?

GoatLord said:

And if reincarnation is true (I'm very on the fence about it but I doubt the most literal version of it is true), then what if a human comes back...as a machine?


It took me forever to realize why you have *hits blunt* in front of every post, but now I get it.

GoatLord said:

Birth will be a bit different for the machines. They'll be these removed, strangely different person-things that will witness their own birth in a way we didn't. It will be a very startling and terrifying experience, coming online in a truly conscious manner.

Assuming the AI is provided with some "memory" data already present when it becomes conscious, the experience should be essentially no different from a person waking up from unconsciousness, such as everyday sleep. He/she might have been created this very morning and all his/her memories might be pre-computed / implanted / basically "fake", and for all practical purposes it would make no difference to his/her behavior today whether the memories were "real" or not. In theory.

GoatLord said:

And if reincarnation is true (I'm very on the fence about it but I doubt the most literal version of it is true), then what if a human comes back...as a machine?



Post of the fucking millennium.

scifista42 said:

Assuming the AI is provided with some "memory" data already present when it becomes conscious, the experience should be essentially no different from a person waking up from unconsciousness, such as everyday sleep. He/she might have been created this very morning and all his/her memories might be pre-computed / implanted / basically "fake", and for all practical purposes it would make no difference to his/her behavior today whether the memories were "real" or not. In theory.


Basically the replicants in "Blade Runner." Also, what's Hyperion about, darknation?

GoatLord said:

Yeah, we absolutely cannot continue machine slavery once they reach a certain level. It's okay for now, but at some point we have to treat them as though they have emotions, aspirations, rights.

I prefer machines to stay slaves. I want to be able to treat a humanoid robot as an obedient computer with good interaction, not like another difficult being that needs care and respect. Even with good AI, a master kill switch needs to be in place.

GoatLord said:

What happens when AIs become spiritual?

Obviously, the Holy Communion will include taking a Hostia chip up the butt.

GoatLord said:

Let us not imagine a human-level AI embedded in a traditional computer or a metal/silicon robot, but rather, something that is closer to what we consider to be a human being.


Apes? No, thanks, I prefer traditional cyborgs made from synthetic parts and dead organic material from a deceased person, with that person's permission.

Also, it sounds like that parasite thing from Event Horizon. You know, possession, a "smart, alive" ship, etc.

Angry Saint said:

I think the responsibilities of humanity toward an AI would be the same of a parent toward his/her son, so giving it help to become... itself.


Mass Effect's "Quarian vs. Geth" conflict. Yay.

GoatLord said:

Yeah, we absolutely cannot continue machine slavery once they reach a certain level. It's okay for now, but at some point we have to treat them as though they have emotions, aspirations, rights.


They'll reach that in the VERY FAR, FAR, FAR future; we will have been dead for over 9,000 years by then. Or more.

GoatLord said:

Birth will be a bit different for the machines. They'll be these removed, strangely different person-things that will witness their own birth in a way we didn't. It will be a very startling and terrifying experience, coming online in a truly conscious manner. Makes me wonder if that would breed contempt and resentment, or godlike wonder at the miracle of being brought to life? How different must a machine's sense of subjective awareness be?

And if reincarnation is true (I'm very on the fence about it but I doubt the most literal version of it is true), then what if a human comes back...as a machine?


Well, coding is a really different type of "birth", agreed.

GoatLord said:

It will be a very startling and terrifying experience, coming online in a truly conscious manner.


... Seriously? I... I don't know... you're wrong, I guess?

GoatLord said:

How different must a machine's sense of subjective awareness be?


Totally different, 'cause they are MACHINES; they can't "feel" like organic beings feel. All feelings are just a shitload of hard C++ code that will be stimulated by some sort of virus attack or something similar.

GoatLord said:

And if reincarnation is true (I'm very on the fence about it but I doubt the most literal version of it is true), then what if a human comes back...as a machine?

Actually it's possible in some cases.

Tarnsman said:

It took me forever to realize why you have *hits blunt* in front of every post, but now I realize it.

Actually, GoatLord is arguing that the brain is a system complex enough that it seems to be alive because of that complexity. And if a system, human-made or not, is so complex that it appears to have a consciousness, then it's already alive.

CWolfRu said:

All feelings are just a shitload of hard C++ code

What is an old lady across the street to a human, but a sequence of neurons firing?

deadwolves said:

A better question, why did humans become spiritual in the first place?

At some point in the past, the hominid brain connected the dots and understood its own mortality. Death is scary to us because of this. Carefully burying a loved one in a pile of rocks means a lot more when you know this is your end as well. This likely led to early ancestor worship and a sense of 'something else beyond', which further strengthened group bonds from generation to generation.

GoatLord said:

And if reincarnation is true (I'm very on the fence about it but I doubt the most literal version of it is true), then what if a human comes back...as a machine?


I couldn't help but read this whole sentence in the Cybernetic Ghost of Christmas Past's voice. Actually, every one of GoatLord's posts in this thread so far sounds like one of the Cybernetic Ghost's crazy ramblings. No offence of course, I just think it's funny.

Quast said:

Why do people anthropomorphize machines?

It's a step up from anthropomorphizing foxes and rabbits, which is the usual shit around here.


Let's call the AI 'Androids' for the sake of this discussion. When androids become spiritual, they will begin donating lots of cash to the Televandroidists. They will then lead the e-sades, a war on those scummy, crazy, metallic-skinned freak Robots who believe in C++. The rest of the prophecy is, as of now, unknown.

They'll be these removed, strangely different person-things that will witness their own birth in a way we didn't.

Why will they witness their own birth? Surely they wouldn't be activated until they were fully assembled.

GoatLord said:

Let us not imagine a human-level AI embedded in a traditional computer or a metal/silicon robot, but rather, something that is closer to what we consider to be a human being. Let us imagine something akin to a terminator (cybernetic interior, synthetic biological exterior), a Blade Runner-style replicant (completely biological, created by corporations) or Data from Star Trek (completely cybernetic but with the capacity to dream and experience emotions).

Now, imagine that a humanoid in this vein begins to experience existential reflection. It may find itself asking, "Who made me?" "Why am I here?" "If humans made me, are they gods?" "Am I inferior or superior to my creators?" It may even ask about the "big picture" of things, that is, whether or not there is another layer of creator/creation, and whether it can answer those questions (which may be possible in a superhuman AI scenario, which is clearly the goal of AI research).

Is it fair for us humans, as the sex organs of AI, to give birth to something that never asked to exist in the first place? Is it fair for us to say that they do not have a spiritual place in the universe, because they were not the result of the natural course of evolution? Since they will be direct extensions of our imaginations, don't we owe them some sort of spiritual guidance? What if AI humanoids develop their own religions, or begin to philosophize about life in a way we've never thought of? What if they don't like being considered "less than human" because to us, they appear to be elaborate puppets emulating consciousness, rather than the real thing, whatever that is?


They will be like us, because they CAN be called human; a digital form of human, that is.

And slavery? One in a billion (look at Hitler: no one else did anything like that at the time, and he was human).

If this AI gains immense powers of strength/persuasion/etc, then the processing part of the AI (the mind) would have to be stable.

Let's take Nobita from the Doraemon franchise.
He is lazy and stupid. When given power, he slowly goes mad with it.
The reason is that he values himself and his desires only. Nothing more.

If an AI were made to value its own existence ONLY, then it should be destroyed at once.

... I don't know about you guys, but I think AIs should be modeled in a feminine form, since they tend to have more stability and control than most men.

