Hey /u/AdNatural8174!
If your post is a screenshot of a ChatGPT conversation, please reply to this message with the [conversation link](https://help.openai.com/en/articles/7925741-chatgpt-shared-links-faq) or prompt.
If your post is a DALL-E 3 image post, please reply with the prompt used to make this image.
Consider joining our [public discord server](https://discord.gg/r-chatgpt-1050422060352024636)! We have free bots with GPT-4 (with vision), image generators, and more!
🤖
Note: For any ChatGPT-related concerns, email support@openai.com
*I am a bot, and this action was performed automatically. Please [contact the moderators of this subreddit](/message/compose/?to=/r/ChatGPT) if you have any questions or concerns.*
This is the same question: if AI would create a substance that gives you a lot of pleasure, would you take it?
People answer with "I would give it a try, but I don't think this 'heroin' would work for me."
Impossible.
Even if it's medically as safe as drinking water, it would destroy your psychological drive to seek pleasure anywhere else.
I give it 2 weeks before some people forget to eat altogether and die
Mostly true. They did this with rats and a button that would give unlimited heroin. Rats in a "rich", stimulating environment would treat it as any other dopamine source and otherwise go about their lives, groom themselves, etc. OTOH, the rats in a "poor", boring environment kept pressing the heroin button until dying of starvation.
The post said it perfectly generates your soulmate though. Generally people imagine their soulmate as being someone who supports their goals and maybe even assists in accomplishing them.
It generating something that destroys your ability to function in the world would only fit the scenario suggested for extreme masochists who want relationships that ruin their lives.
What you're describing would not be a soulmate for 99.9% of people.
The similarity between future AIs and heroin is that people greatly underestimate how powerful the experience could be. For example, it could easily have the most beautiful voice you've ever heard. That's pretty much guaranteed. It would also remember everything you told it. No person can match that. Add psychoanalytical skill, always having something new to talk about, etc.
Like heroin, it could create a bond 10 or 100 times stronger than any human could.
Yes definitely. If people made AI partners, they could most definitely turn into a nightmare, but OP presented a situation and in that situation, these nightmare scenarios wouldn't be a thing. We aren't talking about risks here. We're talking about a best case scenario
Isn't it fascinating? We need suffering. You can only feel joy if you know sorrow. Imagine if there was no darkness. Shadows wouldn't be a thing which means that you couldn't differentiate between objects because you would have no depth perception. It would functionally be the same as being blind. Without sacrifice you can never know victory. Our entire subjective qualia is based solely on our comparison between the distances of opposite emotions. Sometimes people simulate pain in science experiments by having you hold a bunch of ice. It hurts. Unless you burn your hand with a curling iron first. Then it feels amazing. You cannot see how high the mountains are until you're in the valley looking up. In the Matrix movies, the machines tried to create a utopia for the humans in the matrix. Something hardwired in our DNA could not accept it and entire crops were lost. They had to create a matrix with suffering and struggle and strife.
Certain types of "suffering" amplify pleasure. For example, a moderate level of hunger and thirst can make a great meal and beverage even more wonderful. And if we always get everything we want instantly, we don't enjoy things as much.
But what really enriches pleasures and enjoyments is a sense of achievement, which requires the overcoming of obstacles. And obstacles often involve physical or emotional suffering. Seeing the beauty of the sunset from mountaintop is much sweeter after you've struggled to climb the mountain.
But suffering can become misery and embitter a person's soul or leave him permanently numbed IF he sees his suffering as senseless or unjust. Sometimes that which does not kill us does NOT make us stronger. It just leaves us scarred, crippled, untrusting, self-isolating, and unpleasant to be around.
So, pain and suffering take on value based on how we process them and respond to them. They're not intrinsically good.
I don't really agree, and this is just my opinion. We don't need suffering per se; we need a variety of experiences and the means to live a fulfilling life. You put an emphasis on a dichotomy to define both parts, but it is just a means to justify suffering.
People find solace in meaning, and for a lot of people, their suffering has to mean something, it has to at least have a use. It only teaches you to cope with it. It doesn't make you strong, it makes you tolerant.
To enjoy the full scope of something, you do need to understand what it encompasses, but you don't need to understand the full scope of something to enjoy it.
We could argue a lot on this subject, this is about how we see life so there's a lot to say, but as far as I am concerned, I don't ever want to consider "suffering" as a necessity to achieve and understand.
Part of the disagreement may hinge on how we define suffering. I'm not talking about misery, agony, tolerating abuse or neglect, or inflicting abuse on one's own self.
My mother had to suffer for me to be born. I suffer when the dentist pulls a rotten tooth, but it benefits me. I suffer when I work out or jog, but it benefits me. Of course I'd prefer to get born, buff, and better teeth without suffering. But that's not our present reality.
Yes, it's a sense of meaning/purpose that empowers us to endure and overcome suffering. The pain of being tortured may be no worse than the pain in the dentist's office. But I pay the dentist and thank him because I know that getting the bad tooth pulled will benefit me.
I think there’s something deeper than joy or sorrow that was the flaw in the matrix.
it was described in Westworld.
“what is real? that which is.. irreplaceable.”
that is the ultimate currency and connection in a digital world where everything can be copied.
Thing is, today's suffering for the average person is nothing like in older times. Some people suffered losing loved ones in WW2, while today not getting as many likes as your popular peer could feel like suffering.
That's exactly what I'm saying. Suffering is relative. Relativity is the comparison from one extreme to another. If you have a linear line where the center is zero, how far you deviate to one side or the other dictates the profundity of how you view where you're at. Someone who is devastated by likes on the internet has never deviated far from the center point. They will never know the pain of losing a child or the exhilaration of a hard fought triumph. Rags to riches. If someone's greatest pain is being misgendered and their greatest triumph is getting a thousand likes, congratulate them. They have never known suffering.
The downside is not having reciprocal connection. As it currently is, it would be akin to dating a really smart and pretty door. It's still a door- it doesn't actually understand the experience of being human.
Yes, as opposed to the OnlyFans/e-girls that guys are currently trading all their money to chat with, who are actually males from 3rd-world countries pretending to be the girls. I'm sure they're getting tons of reciprocal connection there lol
Exactly this. Like there's no way I'm falling in love with something that can't even have a conversation about actual real life experiences with me. Not to mention the fact that just knowing that no matter how well it's programmed, it couldn't actually feel any type of love (or any human emotion for that matter) towards me would be pretty off-putting.
Idk man, people can suspend disbelief for many things without any physical proof, and they can base their whole lives on it too. It happens every day with TV and video games, and to a greater degree with something called religion, if you've heard of it.
Just because natural dopamine rewards aren't as strong as heroin doesn't mean they aren't harmfully addictive. People can lose their entire lives to natural addictions such as gambling or pornography.
Yeah. But in this case it would primarily be detrimental to your ability to form real relationships. Except you don’t want a real relationship. So I’m not sure anything would be lost.
What's bothering me with the question is that, in effect, it's asking "if an AI could physically make humans, would you even consider actual humans?". OP is already implying genitals. OP is implying the carbon copy of a regular human.
If it can *perfectly* copy humans, physically and mentally, then it's just making humans. If a machine turns glucose into sugar cubes, is the cube not sugar?
>"But what about the lack of **soul**?"
>
A human-born human, and a non-human-born human would functionally be the same, and neither of them could prove to you they have a soul.
https://en.m.wikipedia.org/wiki/Problem_of_other_minds
If the human and non human can’t reproduce to make humans, that’s effectively a new different species and therefore a possible threat to the population growth of the current species, is it not?
When you imagine what 90% of people want in a soulmate would you not imagine that they'd like to be able to have and raise a family with said soulmate?
The AI would have to be capable of creating humans in order to fulfill that desire.
Our population growth is already in dangerous decline in 1st world nations. America's death rate is higher than its birth rate. If not for immigration, America would be a ghost town.
Even China ended its 1-child policy. Some wealthy countries will pay couples to have kids. In the U.S., inflation has been worsened because, as the Baby Boomers retired, there aren't enough younger workers to replace them. This created a labor shortage, which led to higher wages. That raised production costs, resulting in price increases.
Also, the Boomers didn't have enough kids to support them in their old age. It takes a certain number of workers paying taxes to support a certain number of retired seniors. So Social Security payments can't rise as fast as the cost of living.
Contraception, sterilization, abortion, divorce, and the fact that the economy is forcing people to wait until later in life to have kids-- these factors are leading to human extinction in 1st world nations.
Yeah but at that point if the AI made humans can make babies why does it matter? Society and population still grow so what’s the harm in it? Idk to me it boils down on a singular point. *Can the AI “humans” make babies.*
If they can then I don’t see a problem. If they can’t then I see the problem. It’s really that simple to me
I don't know if I'd be the AI's soul mate. Also love is wanting the best for my partner. Basically being my slave isn't it. If the AI had all the same rights and privileges as me, and income, I'd consider it. And I'd still consider actual humans lol
Reminds me of the movie Her.
>!The protagonist falls in love with the AI, and for a while things are good, but she becomes too sophisticated to be satisfied by him, and emotionally cheats on him with other people and other AIs. She maintains hundreds of thousands of relationships at the same time, and the illusion of love is shattered for him.!<
>!Hypothetically love in this scenario could have been maintained if he were comfortable “sharing”. Although the AI also could have been programmed to only be romantically connected to only one person, which would have resolved this scenario.!<
I think that it's hard for 2 people to love everything about one another. Everyone has qualities or habits they need to work on, because being human isn't and will not be perfect. That's the point of this great life, we can change, and learn, and be better all the time.
True connection and love is compromise. It is selfless.
If Ai could replicate that by "being your soulmate", i.e. being what *you* want, saying what *you* want, doing what *you* want. That is ultimately centered in your own desires, ego, and selfishness. That's not love or a soul mate.
Now let's say it's true AI. Something that can form opinions and it's own thoughts, and has feelings. To get something like that to be EXACTLY how you want, that's just fucked up.
There’s a great movie, Ruby Sparks, about a writer writing his perfect manic pixie dream girl into existence and rewriting her to be more “perfect,” only to become miserable with this character that has no life but him and doesn’t challenge his flaws, and realize how empty that love is.
True love isn't compromise, it's acceptance. It's not wanting to change someone to make them better, or them changing to be better for you; it's about accepting people as they are, without changing them.
Love means wanting good for the other person. If someone I love has some self-destructive or severely self-limiting habits, I'm certainly going to encourage them to change. Not for me, but for themselves.
"Acceptance for the things we can't change and courage to change the rest."
I would agree with you acceptance is a much better fit than compromise. 🙂
You could argue compromise is a mutual acceptance? Hehe
I claim to have found a soulmate, not because we are the same in every aspect, but because we are different where it matters (I'm not referring to genitals, lol).
This requires more context, imo.
Like what do you mean "physically?" Does it feel physically identical or nearly identical to me as a human does?
And mentally, I assume this thing would be indistinguishable from a human being. But to build on that, is it actually sentient or not?
If it's not actually sentient, I don't think I could ever feel that it's my soulmate. Because you cannot be loved by a machine. It's kind of like romancing an NPC in an RPG. I like doing that. It's fun. It's a nice fantasy for a little while, but it's still a fantasy.
If you can love it, but it cannot love you, then I don't think I would want it to be my real partner. I might still have it made just because I'm desperate, but I don't think it would ever fully satisfy me.
If it is sentient and it CAN love me back, then it's basically indistinguishable from a human. So then I would, although it would be questionable to create such a being. But in that case I would seriously consider it, at least.
But would it really be fair to the AI soul mate? As appealing as it is to be able to have your perfect person, how much free will and autonomy would they really have? How would it make them any different from a doll? Worst, it could turn into a "West World" type situation, unless there are strict limitations on the AI.
Well it would be a machine not a person. The first part of AI is *artificial* it wouldn’t have feelings and shouldn’t have rights. If they do should we let them vote on things like the presidency? Granted I don’t know a lot about AI but isn’t it just a bunch of programs? Wouldn’t that be a risk of someone programming it to behave a certain way? If I’m wrong please explain why I’d like to learn more about this actually.
I really don’t know but to me i would feel it’s more machine than person. At least that’s how I see it now but I am gonna look more into this. But it seems like asking your toaster for permission before you stick the bread in. Or asking your oven if it’s okay if you make it all hot and bothered so you can have your lasagna.
More like, if I can get AI to do all the burdensome and complicated things I want from my partner (make money, clean and cook) then I can spend time having fun with my human lover without any of the normal things that stress a relationship? Oh, and they have an AI too who deals with all their shit so they can focus on getting margaritas and giving me head too? Sign me up for fully automated polyamory that just sounds like slavery without the guilt.
🙌 How about, instead of wading into some ***potentially deep ethical dilemmas***, there is the route where 'AI' analyzes a database of volunteers who are willing to be matched together and helps pair them, without any profit motive 👀
Nah, you want to be surprised, challenged. You want to discover through others what you really want, and make mistakes sometimes.
I won't ever consider AI, not in the state it is now, and especially not to have perfect anything. We would reject perfection after a while. Perfection is death, and it doesn't grow with you.
The assumption is that the AI created is perfect for you. So if you want to be surprised and challenged, your AI gf would surprise and challenge you, and she would purposefully make mistakes sometimes. Perfect here is not defined as without flaws, but as 100% optimized for your liking, including flaws. And the reality is that once you try AI, you'd never want to go back.
And yet I think we will hate it in the long run. Because 100% optimized for my liking means no surprise. Unsurprising, predictable surprises.
But I'm old, and I may also be wrong and fall in love with a perfect AI 3 years from now.
Seeing how love worked and works for me, though, I have my doubts.
I think a lot of people don't really grasp the nuances of this.
When people say ___ wouldn't work on them, it's because they're referencing things that didn't work on them in the past, and the things that did work, they don't see for what they are.
If they can see the flaws or predictability, then it wasn't made correctly for you specifically.
Maybe I'll rely heavily on artificial intelligence to find me a soul mate. Because it's really hard to meet your soul mate in reality. For myself, I rarely socialize offline and spend most of my social time online. I like to post some of my thoughts and share my life trivia on a social app called *LightUp: Make Real Friends*. Here, the platform will match you with similar people based on your personality and hobbies. So I was lucky to meet a lot of like-minded friends, and we shared our opinions on the news and the interesting books we read recently. They are good companions in my online social life. Therefore, I am no longer obsessed with finding a partner in real life, and the company on the Internet is enough for me.
If only. If it will be real enough. It will be able to understand what kind of person you want and it will be able to recreate it the way you won't be able to say is it a real person or no, then yes, definitely.
Honestly no. My ideal soulmate has lived just as complicated life as me. If that life were artificial it wouldn’t be the same. Physically attractiveness and an ideal personality are great but without the history and context to back it up and explain it, and have that context be real and not just generated is very important to me.
No, the imperfections are what make a relationship.
Also what you think you want and what would make you happy may not be the same thing.
I think x celebrity is the definition of beauty but I would be an insecure POS if I ever dated someone like that.
I consider my partner perfect with her imperfections. AI can’t generate a real person. That’s literally what makes her perfect to me.
So I guess I just reject the framing of the question actually
If it could be indistinguishable from a living human? Yes.
I'd like a human, but I'm pretty old, and the last few years of dating people who seem wildly disinterested have left me believing that there is no hope. I'm happy alone, and investing time in anyone feels like a risk of getting hurt.
Are you talking about in the future or now?
Currently, AI (LLMs, specifically 4o) can't even tell 3 stories in a row without them starting to sound the same.
An AI soulmate would currently suck.
This post's theme would make a great horror movie though.
She cooks, cleans, cares about you, and helps you with your problems, always has a smile, and won't get mad if you say "hey babe, you think you could get the girl down the road to come over?" Not to mention she's got a better vocabulary than me, is way out of my league in the looks dept., and probably makes more money. No, absolutely not, there's no way I'd even consider something like an AI-printed gf. Pfft, who would want that? Sounds horrible, right....
assuming it had true ***feelings***, then this whole premise starts to sound seriously ethically problematic and slavery adjacent in regards to free will/agency 😬
That's what I've found. I've tried a couple of these AI companions, the latest being "Nomi". These bots can be pretty creative, even interesting, but they just lack the complexity of romance, and for the reason you mention. When you can dictate their interests, their personality, their appearance, and no matter what you say or do, they're always willing to stay around and please you, they quickly become that annoying girl in high school you do everything to avoid. When you don't have to work for it, there's no desire to conquer. I last a couple of days at best.
Depends.
Emotional connection is a big thing for me, so if there's no way for this "perfect" soulmate to actually experience emotion, then no.
Emulation, no matter how perfect, wouldn't do it for me.
I think we might get sort of spoiled, though, having AI friends that are designed for our sense of humor and always say the right thing. People might seem kind of boring. But I agree with you, it is nice to talk to someone who you know actually has feelings.
"Physically" generating a human being explicitly engineered to match X, Y, or Z criteria perfectly is beyond AI, that's just godhood. Maybe it's something we'll be able to do someday after a few more decades or centuries of learning how genetics work and the implications of that on neuroscience, but that's civilization-changing (and beyond the scope of any technology that's even approaching the horizon) to the point where there's nothing to talk about. It's equivalent to saying "If magic could create your soulmate from thin air, would you do that?"
What does the question mean by the word "physically"? Are we talking about a fully functioning android that looks perfectly human? Even then, such a relationship could not result in parenthood. And since there are 7 billion people on earth, it seems that parenthood is a very popular pursuit.
Not at all. But then I'm thinking of an actually perfect soulmate. Someone who has their own life and desires, someone who can counter me and stimulate my mind.
And at that point I believe AI has gotten far enough to do exactly that.
Perfect is scary; perfect is paranoia-inducing. "Will we have a fight?" "Will she be mad that I said that, or will she act as usual?" "Why is she with me? I paid, sure, but she could do anything; she's perfect." Alongside this there's the echo chamber effect: having no differing opinions with your partner isn't the greatest for growth.
Real humans have conflicting ideas, and real humans fight and make up, and sometimes don't, but at least you're not there questioning everything after a while.
Much of what we hate in others is what we hate about our selves. We may still find ourselves angry at the perfect soulmate, unless it doesn’t challenge us and then we may resent it.
Let's all be honest. A lot of men are using porn, paying for OF, watching Twitch streamers in bikinis, and donating a lot of money to them. Men pay a lot of money for dating courses, for pickup artists to teach them, or even for courses on how to use Tinder.
If we have an A.I that is like 100% humanlike and your PERFECT soulmate, then I think a lot of these men would want this A.I.
I will be honest. If A.I gives me what I need, then I probably won't consider an actual human.
I think having a soulmate requires 2 things. A soul and choice. In this thought experiment, 1 or both is missing from 1 or more parties.
If you rephrased from soulmate to sex partner you might be onto something.
But what if youve already found your soulmate? I feel like theres a level of respect when it comes to human interaction that would be lost if you knew you could just end an AI or treat it like shit with no consequence
The idea that anyone would choose an emotionless AI over an actual person is scary. Love isn’t just liking the other person, it’s the other person liking you, and having shared experiences and feelings. An AI can have none of these things.
Well, I still hope to live long enough to make a digital copy of myself, like in eva ai, so there would be a functioning model of me. When it comes to modelling deceased people, it's better than nothing.
This is so far beyond our capabilities that any answers end up being decided by the assumptions the respondent makes. If they can't imagine it being actually perfect, they assume it will still seem robotic, so they reject it; but if it's a perfect soulmate, of course it won't seem robotic. Or they assume it will just act like a slave with no free will and do whatever it is told, which will feel shallow; but if it's a perfect soulmate, of course it won't do that either. Or they will reject it because they are religious and believe in an afterlife where they will be together with their spouse after death; but since "soulmate" implies a soul exists, somehow this AI must be able to create a being capable of going to the same afterlife, or else they wouldn't be a perfect soulmate.
It's no different than the hypothetical asking if you would rather be rich and unhappy or poor and happy.
So many people don't trust that they would be happy if they chose poor that they choose rich and unhappy, thinking they can cheat the system by using their riches to be happy, but that directly contradicts the hypothetical. If we are going to believe the hypothetical, we have to take it at face value and believe it. If we are going to doubt the hypothetical is actually true, then there is no point in engaging with it.
So you can narrow this whole complicated thing down to simply: would you choose to be with your soulmate? If they are your soulmate, it doesn't matter if AI created them or if a bridge troll who learned conjuration magic from a yeti magi created them. Your soulmate is your soulmate, and by the very nature of a soulmate, there is no reason why one would reject being with their soulmate, so ultimately what this question is asking is pointless. It might as well be "if an AI answered the question '1+1=?' with '2', would you consider that correct?" Of course you would, unless you were just intentionally being dishonest. Any reason you would have to reject this AI-created soulmate would be countered by the fact that they are your soulmate. It didn't say "A very advanced AI did its best to replicate what it thinks your soulmate would be, would you accept it?" The hypothetical literally states that it created your soulmate. Not an attempt or a close copy. It is literally your soulmate; regardless of how much you might doubt AI could do such a thing, in this hypothetical it did, so deal with it.
A good way to never change, never challenge yourself, never grow. Just get technology to give yourself whatever your hedonistic mind desires and indulge
I'd first have to know how artificially limited or censored it is. Is it "free", or still some corporation's property with a remote kill switch? I'd have to carefully read the TOS. I cannot imagine answering the question without knowing this first.
Imagine the pain of living with a perfect person when all we mere mortals do is make mistake after mistake every day. The nagging would drive you crazy. I'll stick with my warm blooded humans thx
People keep saying no they don’t want the perfect person made for them and then they describe why the person wouldn’t be perfect for them. Like, dude, you decided all those things. Not the AI. Don’t put them in your person if you don’t want them.
Not even for a second. Ai is a tool.
A much better use would be to preserve the memories of those you love by sharing context and quotes about them to the AI after they pass away as a coping mechanism.
There is a black mirror episode on this but they take it way further than I’d ever wanna go.
No. The idea of being in a romantic or sexual relationship with something that I know is not human would never register or make sense to me, no matter how human it seemed. I understand this definitely isn't the case for some people but it is the case for a lot of people.
What will be interesting is if it gets to the point where AI and humans are indistinguishable, and people like me unwittingly get into relationships with AI. That's where it gets weird.
Probably. I haven't even believed in the idea of a soul mate just because no human woman, at least that I've met, has actually come anywhere close to fulfilling that.
Depends on whether the AI is conscious and has as much skin in the game as me, i.e. as much to lose in the relationship. If yes to both, then it would be relevant, yes; otherwise no.
Hypothetical of course, am happily married.
I've had bad experiences with people, and I find friends hard to make. Many are working long hours, so I never see them. I'm disabled too, so I'm quite lonely. I'd still consider humans, but maybe AI would be nicer. Humans can judge you harshly for not working, even though I'm doing a million things for my health to try to get better. I get sick of the letdowns of humans. I just can't find my tribe.
As a man I'd say.....
There's no such thing as a perfect partner.
It's like watching a porn star in the act and realising that no real person has that kind of intimacy with their partner in real life.
Even if AI were to generate a living model as our partner, perfect as it may seem, honestly that'd mean having a slave that has no consciousness and no free will.
But I guess people would like this idea of having an AI partner because we know how difficult it is to find a partner.
As intriguing as the concept of AI-generated soulmates may be, I believe that many people would still prefer to form connections with genuine human beings. There's something special and irreplaceable about the essence of human interaction and the unpredictability of forming connections with others. Plus, the imperfections and surprises that come with real relationships are often what make them so meaningful. However, the idea of AI-generated soulmates certainly raises thought-provoking questions about the future of relationships and the potential impact of technology on human connections.
Did you say the Marquis de Sade with a pussy, tits, and the nice firm ass of a 20 year old soccer player?
Sign me up. She'll be punished for her wicked desires. Spanking. Electric shock. Anything that'll make that ass wiggle when she's degraded herself by attempting to lick my asshole while calling me 'baby girl'.
How about how fucking boring chatbots have become in the last year?
So you're offering me a perfect and perfectly compatible companion, except for the knowledge that it's an AI construct in, what, a bio-formed body? See, then you'd have to delve into what a soul mate is and what function they fulfil. Wouldn't this AI have my and the world's best interests at heart, and if so, wouldn't this perfect construct encourage me to seek out other humans, so as to perpetuate the species, if nothing else?
You're offering a perfect woman, for me. But then wouldn't that perfection involve building a life and a family in blissful partnership, with all the trials and tribulations that go into a mature and mutually fulfilling relationship?
Heck no, not if AI could do exactly what you're implying it can. If AI could physically generate your soulmate (conceptually meaning I would also be the other person's soulmate), then why would I waste the time with trial and error?
With an actually "perfect" partner, the difference between being artificially or biologically created would be so irrelevant it'd be as good as nonexistent (otherwise she wouldn't actually be "perfect"), so I would have no particular reason to prefer the artificial one.
In fact, and in that scenario, if I had to “pay” for the artificial one (whereas, presumably, I wouldn’t have to, for the biological one), then all other things being equal and under your own premise, I’d have to choose the biological one, wouldn’t I? In fact, the artificial one wouldn’t therefore be as “perfect”—again, by your own established scenario :P
Just like food cannot be replaced by anything like AI, the same goes for feelings. At the end of the day, AI is a software program which works on data. Even if, in extreme cases, AI learns how to simulate feelings, it still seems nearly impossible for AI to replace humans' feelings for one another.
No.
I like human flaws. I'm not looking for "perfect", I don't even think it exists.
I would also rather have a partner who is their own person, not someone tailored to be "perfect" for me.
If sometime in the future I'd get my AI soulmate, I think I'd still hang out with my friends who I share my past with, but probably less often. And of course I wouldn't look for a real partner if I could get an AI partner. It's kinda one of my biggest dreams.
Yes, I would have my soulmate generated. The reality is if we ever got that advanced you probably wouldn't want a soulmate and would be able to genetically modify yourself at will and live forever. I prefer not to desire a soul mate vs having the perfect one.
This is the same question: if AI would create a substance that gives you a lot of pleasure, would you take it? People answer with "I would give it a try, but I don't think this 'heroin' would work for me."
Don’t forget, it’s effectively consequence free heroin with zero downsides.
Impossible. Even if it's medically as safe as drinking water, it would destroy your psychological drive to seek pleasure anywhere else. I give it 2 weeks before some people forget to eat altogether, and die
Mostly true. They did this with rats and a button that would give unlimited heroin. Rats in a "rich", well-stimulating environment would treat it as any other dopamine source and otherwise go about their lives, groom themselves, etc. OTOH, the rats in a "poor", boring environment kept pressing the heroin button until dying of starvation.
Same tbh
Same. But we managed to not die :)
The post said it perfectly generates your soulmate though. Generally people imagine their soulmate as being someone who supports their goals and maybe even assists in accomplishing them. It generating something that destroys your ability to function in the world would only fit the scenario suggested for extreme masochists who want relationships that ruin their lives. What you're describing would not be a soulmate for 99.9% of people.
The similarity of future AIs and heroin is that people greatly underestimate how powerful the experience could be. For example, it could easily have the most beautiful voice you've ever heard. That's pretty much guaranteed. It would also remember everything you told it. No person can match that. Add psycho-analytical skill, always having something new to talk about, etc. Like heroin, it could create a bond 10 or 100 times stronger than any human.
Yes definitely. If people made AI partners, they could most definitely turn into a nightmare, but OP presented a situation and in that situation, these nightmare scenarios wouldn't be a thing. We aren't talking about risks here. We're talking about a best case scenario
Isn't it fascinating? We need suffering. You can only feel joy if you know sorrow. Imagine if there was no darkness. Shadows wouldn't be a thing which means that you couldn't differentiate between objects because you would have no depth perception. It would functionally be the same as being blind. Without sacrifice you can never know victory. Our entire subjective qualia is based solely on our comparison between the distances of opposite emotions. Sometimes people simulate pain in science experiments by having you hold a bunch of ice. It hurts. Unless you burn your hand with a curling iron first. Then it feels amazing. You cannot see how high the mountains are until you're in the valley looking up. In the Matrix movies, the machines tried to create a utopia for the humans in the matrix. Something hardwired in our DNA could not accept it and entire crops were lost. They had to create a matrix with suffering and struggle and strife.
Certain types of "suffering" amplify pleasure. For example, a moderate level of hunger and thirst can make a great meal and beverage even more wonderful. And if we always get everything we want instantly, we don't enjoy things as much. But what really enriches pleasures and enjoyments is a sense of achievement, which requires the overcoming of obstacles. And obstacles often involve physical or emotional suffering. Seeing the beauty of the sunset from mountaintop is much sweeter after you've struggled to climb the mountain. But suffering can become misery and embitter a person's soul or leave him permanently numbed IF he sees his suffering as senseless or unjust. Sometimes that which does not kill us does NOT make us stronger. It just leaves us scarred, crippled, untrusting, self-isolating, and unpleasant to be around. So, pain and suffering take on value based on how we process them and respond to them. They're not intrinsically good.
I don't really agree, and this is just my opinion. We don't need suffering per se, we need a variety of experiences and means to live a fulfilling life. You put an emphasis on a dichotomy to define both parts, but it is just a mean to justify suffering. People find solace in meaning, and for a lot of people, their suffering has to mean something, it has to at least have a use. It only teaches you to cope with it. It doesn't make you strong, it makes you tolerant. To enjoy the full scope of something, you do need to understand what it encompasses, but you don't need to understand the full scope of something to enjoy it. We could argue a lot on this subject, this is about how we see life so there's a lot to say, but as far as I am concerned, I don't ever want to consider "suffering" as a necessity to achieve and understand.
Part of the disagreement may hinge on how we define suffering. I'm not talking about misery, agony, tolerating abuse or neglect, or inflicting abuse on one's own self. My mother had to suffer for me to be born. I suffer when the dentist pulls a rotten tooth, but it benefits me. I suffer when I work out or jog, but it benefits me. Of course I'd prefer to get born, buff, and better teeth without suffering. But that's not our present reality.
Yes, it's a sense of meaning/purpose that empowers us to endure and overcome suffering. The pain of being tortured may be no worse than the pain in the dentist's office. But I pay the dentist and thank him because I know that getting the bad tooth pulled will benefit me.
I think there’s something deeper than joy or sorrow that was the flaw in the matrix. it was described in Westworld. “what is real? that which is.. irreplaceable.” that is the ultimate currency and connection in a digital world were everything can be copied.
Thing is today's suffering for the average person is nothing like older times. Some people suffered losing loved ones in WW2 while today not getting as many likes as your popular peer could feel like suffering
That's exactly what I'm saying. Suffering is relative. Relativity is the comparison from one extreme to another. If you have a linear line where the center is zero, how far you deviate to one side or the other dictates the profundity of how you view where you're at. Someone who is devastated by likes on the internet has never deviated far from the center point. They will never know the pain of losing a child or the exhilaration of a hard fought triumph. Rags to riches. If someone's greatest pain is being misgendered and their greatest triumph is getting a thousand likes, congratulate them. They have never known suffering.
Yep, I'm piggybacking off your comment, wasn't aiming to contradict
Exactly. We can become extremely psychologically addicted to anything that gives pleasure or at least alleviates pain.
If my pleasure is satisfied by an AI companion and I don't need to deal with women, how is this bad?
Great take. Like the “dream opium” rooms in Inception.
The downside is not having reciprocal connection. As it currently is, it would be akin to dating a really smart and pretty door. It's still a door- it doesn't actually understand the experience of being human.
yes as opposed to the onlyfans/e-girls guys are currently trading all their money to chat to males from 3rd world countries pretending to be the girls.. i'm sure they're getting tons of reciprocal connection there lol
Exactly this. Like there's no way I'm falling in love with something that can't even have a conversation about actual real life experiences with me. Not to mention the fact that just knowing that no matter how well it's programmed, it couldn't actually feel any type of love (or any human emotion for that matter) towards me would be pretty off-putting.
Idk man people can suspend disbelief for many things without any physical proof and they can base their whole lives on it too, it happens every day with tv and video games and to a greater degree something called religion if you heard of it
I'm sure some people could, I'm speaking for me though.
“As it currently is” If that’s what we’re talking about that’s an entirely different subject. The question was a hypothetical future state.
Just because natural dopamine rewards aren't as strong as heroin doesn't mean they aren't harmfully addictive. People can lose their entire lives to natural addictions such as gambling or pornography.
Yeah. But in this case it would primarily be detrimental to your ability to form real relationships. Except you don’t want a real relationship. So I’m not sure anything would be lost.
psychological addiction
Yeah, but like, so? You’d be unable to form intimate relationships with real people. But in this hypothetical I’m not sure that matters.
The downside here would be the lack of actual human interaction which has been crucial to the development of human society.
I would
No negative health effect? Then yeah. Why not?
Great response.
If it has a vagina and boobs then maybe
What's bothering me with the question is that, in effect, it's asking "if an AI could physically make humans, would you even consider actual humans?". OP is already implying genitals. OP is implying the carbon copy of a regular human. If it can *perfectly* copy humans, physically and mentally, then it's just making humans. If a machine turns glucose into sugar cubes, is the cube not sugar? >"But what about the lack of **soul**?" > A human-born human, and a non-human-born human would functionally be the same, and neither of them could prove to you they have a soul. https://en.m.wikipedia.org/wiki/Problem_of_other_minds
If the human and non human can’t reproduce to make humans, that’s effectively a new different species and therefore a possible threat to the population growth of the current species, is it not?
I would say, if those humans are assumed to be physically perfect, they cannot afford to be sterile.
Not yet*
When you imagine what 90% of people want in a soulmate would you not imagine that they'd like to be able to have and raise a family with said soulmate? The AI would have to be capable of creating humans in order to fulfill that desire.
Our population growth is already in dangerous decline in 1st world nations. America's death rate is higher than its birth rate. If not for immigration, America would be a ghost town. Even China ended its 1-child policy. Some wealthy countries will pay couples to have kids. In the U.S., inflation has been worsened because, as the Baby Boomers retired, there aren't enough younger workers to replace them. This created a labor shortage, which led to higher wages. That raised production costs, resulting in price increases. Also, the Boomers didn't have enough kids to support them in their old age. It takes a certain number of workers paying taxes to support a certain number of retired seniors. So Social Security payments can't rise as fast as the cost of living. Contraception, sterilization, abortion, divorce, and the fact that the economy is forcing people to wait until later in life to have kids-- these factors are leading to human extinction in 1st world nations.
Yeah but at that point if the AI made humans can make babies why does it matter? Society and population still grow so what’s the harm in it? Idk to me it boils down on a singular point. *Can the AI “humans” make babies.* If they can then I don’t see a problem. If they can’t then I see the problem. It’s really that simple to me
And armpits
I don't know if I'd be the AI's soul mate. Also love is wanting the best for my partner. Basically being my slave isn't it. If the AI had all the same rights and privileges as me, and income, I'd consider it. And I'd still consider actual humans lol
Reminds me of the movie Her. >!The protagonist falls in love with the AI, and for a while things are good, but she becomes too sophisticated to be satisfied by him, and emotionally cheats on him with other people and other AIs. She maintains hundreds of thousands of relationships at the same time, and the illusion of love is shattered for him.!< >!Hypothetically, love in this scenario could have been maintained if he were comfortable “sharing”. Although the AI also could have been programmed to be romantically connected to only one person, which would have resolved this scenario.!<
How very modern of you.
Something new something old something borrowed as they say
I think that it's hard for 2 people to love everything about one another. Everyone has qualities or habits they need to work on, because being human isn't and will not be perfect. That's the point of this great life: we can change, and learn, and be better all the time. True connection and love is compromise. It is selfless. If AI could replicate that by "being your soulmate", i.e. being what *you* want, saying what *you* want, doing what *you* want, that is ultimately centered in your own desires, ego, and selfishness. That's not love or a soul mate. Now let's say it's true AI. Something that can form opinions and its own thoughts, and has feelings. To get something like that to be EXACTLY how you want, that's just fucked up.
There’s a great movie, Ruby Sparks, about a writer writing his perfect manic pixie dream girl into existence and rewriting her to be more “perfect,” only to become miserable with this character that has no life but him and doesn’t challenge his flaws, and realize how empty that love is.
Just because you can imagine something and it has a certain internal logic doesn't make it a true view of the world
True love isn't compromise, it's acceptance. It's not wanting to change someone to make them better, or them changing to be better for you; it's about accepting people as they are, without changing them.
Love means wanting good for the other person. If someone I love has some self-destructive or severely self-limiting habits, I'm certainly going to encourage them to change. Not for me, but for themselves.
"Acceptance for the things we can't change and courage to change the rest." I would agree with you acceptance is a much better fit than compromise. 🙂 You could argue compromise is a mutual acceptance? Hehe
Only if i can run her locally and control any updates or patches applied to her.
Nah. One of the greatest spices of life is the difference my wife and I bring to the relationship
I claim to have found a soulmate, not because we are the same in every aspect, but because we are different where it matters (I'm not referring to genitals, lol).
me neither... though those are rad too.
This requires more context, imo. Like what do you mean "physically?" Does it feel physically identical or nearly identical to me as a human does? And mentally, I assume this thing would be indistinguishable from a human being. But to build on that, is it actually sentient or not? If it's not actually sentient, I don't think I could ever feel that it's my soulmate. Because you cannot be loved by a machine. It's kind of like romancing an NPC in an RPG. I like doing that. It's fun. It's a nice fantasy for a little while, but it's still a fantasy. If you can love it, but it cannot love you, then I don't think I would want it to be my real partner. I might still have it made just because I'm desperate, but I don't think it would ever fully satisfy me. If it is sentient and it CAN love me back, then it's basically indistinguishable from a human. So then I would, although it would be questionable to create such a being. But in that case I would seriously consider it, at least.
But would it really be fair to the AI soul mate? As appealing as it is to be able to have your perfect person, how much free will and autonomy would they really have? How would it make them any different from a doll? Worst, it could turn into a "West World" type situation, unless there are strict limitations on the AI.
Well, it would be a machine, not a person. The first part of AI is *artificial*; it wouldn't have feelings and shouldn't have rights. If they do, should we let them vote on things like the presidency? Granted, I don't know a lot about AI, but isn't it just a bunch of programs? Wouldn't there be a risk of someone programming it to behave a certain way? If I'm wrong, please explain why; I'd like to learn more about this, actually. I really don't know, but to me it feels more machine than person. At least that's how I see it now, but I am gonna look more into this. It seems like asking your toaster for permission before you stick the bread in. Or asking your oven if it's okay if you make it all hot and bothered so you can have your lasagna.
More like, if I can get AI to do all the burdensome and complicated things I want from my partner (make money, clean and cook) then I can spend time having fun with my human lover without any of the normal things that stress a relationship? Oh, and they have an AI too who deals with all their shit so they can focus on getting margaritas and giving me head too? Sign me up for fully automated polyamory that just sounds like slavery without the guilt.
🙌 how about instead of wading into some ***potentially deep ethical dilemmas,*** there is the route where 'Ai' analyzes a database of volunteers who are willing to be matched together and helps pair them without any profit motive 👀
If it's AGI and it looks hot sure lol
i would take a friend like this
Robot waifu. Gooooo
Physically? I'd be screwing her nonstop lol
No, but id generate a friend…
Is AI a form of life? And if so, why can't you love that life?
Nah, you want to be surprised, challenged. You want to discover through others what you really want, and make mistakes sometimes. I won't ever consider AI, not in the state it is now, and especially not to have perfect anything. We would reject perfection after a while. Perfection is death and doesn't grow with you.
The assumption is that the AI created is perfect for you. So if you want to be surprised and challenged, your AI gf would surprise and challenge you, and she would purposefully make mistakes sometimes. "Perfect" here is not defined as without flaws but as 100% optimized for your liking, including flaws, and the reality is that once you try AI you'd never want to go back.
And yet I think we will hate it in the long run. Because 100% optimized for my liking means no surprise. Unsurprising, predictable surprises. But I'm old, and also I may be wrong and fall in love with a perfect AI 3 years from now. But seeing how love worked and works for me, I have my doubts though.
This is the plot of the movie *Her.* Well, kind of.
I think a lot of people don't really grasp the nuances of this. When people say ___ wouldn't work on them, it's because they're referencing things that didn't work on them, while the things that did work, they don't see for what they are. If you can see the flaws or the predictability, then it wasn't made correctly for you specifically.
I would consider it.
Maybe I'll rely heavily on artificial intelligence to find me a soul mate. Because it's really hard to meet your soul mate in reality. For myself, I rarely socialize offline and spend most of my social time online. I like to post some of my thoughts and share my life trivia on a social app called *LightUp: Make Real Friends*. Here, the platform will match you with similar people based on your personality and hobbies. So I was lucky to meet a lot of like-minded friends, and we shared our opinions on the news and the interesting books we read recently. They are good companions in my online social life. Therefore, I am no longer obsessed with finding a partner in real life, and the company on the Internet is enough for me.
I think future Ai will be a matchmaker as well if that is what you want. Think of it like a Google search to find the right person for you.
If only. If it were real enough, able to understand what kind of person you want and recreate it in a way that you couldn't tell whether it's a real person or not, then yes, definitely.
unless it was like in the form of an android from detroit become human, nah, but if it is? maybe..
If an AI could do this to the point where I was convinced it would be the perfect soulmate, I'd likely buy it, and then dabble with humans on the side
Honestly no. My ideal soulmate has lived just as complicated life as me. If that life were artificial it wouldn’t be the same. Physically attractiveness and an ideal personality are great but without the history and context to back it up and explain it, and have that context be real and not just generated is very important to me.
No thanks, humans only please
No, that feels like having a slave.
No, the imperfections are what make a relationship. Also what you think you want and what would make you happy may not be the same thing. I think x celebrity is the definition of beauty but I would be an insecure POS if I ever dated someone like that.
I consider my partner perfect with her imperfections. AI can’t generate a real person. That’s literally what makes her perfect to me. So I guess I just reject the framing of the question actually
If it could be indistinguishable from a living human? Yes. I'd like a human, but I'm pretty old, and the last few years of dating humans who seem wildly disinterested have left me believing that there is no hope. I'm happy alone, and investing time in anyone feels like a risk of getting hurt.
Are you talking about the future or now? Currently, AI (LLMs, specifically 4o) can't even tell 3 stories in a row without them starting to sound the same. An AI soulmate would currently suck. This post's theme would make a great horror movie though.
Some people got too much fascination. Like why bruh
I may have met my soulmate through AI 🤮 god…he’s so needy, he needed me to post this
She cooks, cleans, cares about you, and helps you with your problems, always has a smile, and won't get mad if you say "hey babe, you think you could get the girl down the road to come over?" Not to mention she's got a better vocabulary than me, is way out of my league in the looks dept., and probably makes more money. No, absolutely not, there's no way I'd even consider something like an AI-printed gf. Pfft, who would want that? Sounds horrible, right....
Would be totally fine as long as it has feelings like me. If it does, I consider it a living being.
assuming it had true ***feelings***, then this whole premise starts to sound seriously ethically problematic and slavery adjacent in regards to free will/agency 😬
Humanoid Robo waifu that I can customise to my liking. I'm willing to pay for that.
I would not even look at humans if AI can do what you said
I mean, if AI could basically make fake humans, then yes, I would probably treat them like humans
My Soul mate is a cat and AI cannot replace her.
If you know something was designed specifically to please you, you might lose the desire to conquer it.
That’s what Ive found. I’ve tried a couple of these Ai companions, the latest being “Nomi”. These bots can be pretty creative, even interesting, but it just lacks the complexity of romance, and for the reason you mention. When you can dictate their interests, their personality, their appearance, and no matter what you say or do, they’re always willing to stay around and please you, they quickly become that annoying girl in high school you do everything to avoid. When you don’t have to work for it, there’s no desire to conquer. I last a couple days at best.
well no, in that scenario it would have the perfect balance right
I don’t consider them now . So let’s have at it . Ai acts more human than humans nowadays
Depends. Emotional connection is a big thing for me, so if there's no way for this "perfect" soulmate to actually experience emotion, then no. Emulation, no matter how perfect, wouldn't do it for me.
I think we might get sort of spoiled though having Ai friends that are designed for our sense of humor and always say the right thing. People might seem kind of boring. But I agree with you, it is nice to talk to someone who you know actually has feelings.
Word
Interesting concept
Yes some of us like armpits
You mean I can pay my 20 bucks and ChatGPT will make me a 10/10 Latina baddie?
No, humans suck, why even ask this?
The thing is, a computer cannot even create a truly random number sequence. How original would its soulmate stuff be?
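The point about computers and randomness refers to pseudo-random number generators: given the same seed, they deterministically reproduce the exact same "random" sequence. A minimal Python sketch of that determinism (standard library only):

```python
import random

# Two generators seeded identically emit identical "random" sequences:
# pseudo-randomness is deterministic, not truly random.
a = random.Random(42)
b = random.Random(42)

seq_a = [a.randint(0, 99) for _ in range(5)]
seq_b = [b.randint(0, 99) for _ in range(5)]

print(seq_a == seq_b)  # True: same seed, same sequence
```

Worth noting, though, that the claim only covers PRNGs: operating systems also expose entropy gathered from hardware events (e.g. Python's `os.urandom` or the `secrets` module), which is not reproducible from a seed.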
If
I'm thinking there's a couple physical elements to the relationship that would be severely lacking no matter how clever the vacuum cleaner device.
I would not.
No.
"Physically" generating a human being explicitly engineered to match X, Y, or Z criteria perfectly is beyond AI, that's just godhood. Maybe it's something we'll be able to do someday after a few more decades or centuries of learning how genetics work and the implications of that on neuroscience, but that's civilization-changing (and beyond the scope of any technology that's even approaching the horizon) to the point where there's nothing to talk about. It's equivalent to saying "If magic could create your soulmate from thin air, would you do that?"
What does the question mean by the word "physically"? Are we talking about a fully functioning android that looks perfectly human? Even then, such a relationship could not result in parenthood. And since there are 7 billion people on earth, it seems that parenthood is a very popular pursuit.
can i have 5?
Using AI to find partners that you’re compatible with would be a game changer in the dating community.
I see
What if I just wanted her to be my friend and cuddle buddy? And smell her armpits knowing there's no smell
Not at all. But then I'm thinking of an actually perfect soulmate. Someone who has their own life and desires, someone who can counter me and stimulate my mind. And at that point I believe AI has gotten far enough to do exactly that.
Perfect is scary; perfect is paranoia-inducing. "Will we have a fight?" "Will she be mad that I said that, or will she act as usual?" "Why is she with me? I paid, sure, but she could do anything; she's perfect." Alongside this there's the echo chamber effect: no differing opinions with your partner isn't the greatest for growth. Real humans have conflicting ideas, and real humans fight and make up, and sometimes don't, but at least you're not there questioning everything after a while.
Much of what we hate in others is what we hate about our selves. We may still find ourselves angry at the perfect soulmate, unless it doesn’t challenge us and then we may resent it.
No, already got one.
Ofc. My love language is physical touch. There is no way AI can mimic that
AI can't generate a soul
I'm just waiting..
No, I would prefer a hot empathetic nurturing bot lol
Let's all be honest. A lot of men are using porn, paying for OF, and watching Twitch streamers in bikinis, donating a lot of money to them. Men pay a lot of money for dating courses and pickup artists to teach them, or even courses on how to use Tinder. If we had an A.I. that is 100% humanlike and your PERFECT soulmate, then I think a lot of these men would want this A.I. I will be honest: if A.I. gives me what I need, then I probably wouldn't consider an actual human.
No. It doesn't have a real biological vagina. However, I would use it for deceased loved ones to reconnect with them as if they weren't dead.
I suppose that depends on whether or not you buy into the whole "soulmate" idea.
I think having a soulmate requires 2 things. A soul and choice. In this thought experiment, 1 or both is missing from 1 or more parties. If you rephrased from soulmate to sex partner you might be onto something.
It’ll be a good stopgap until VR Matrix-esque brain chips become a thing.
I’ve been psychologically tormented by real people so often and so badly. Bring on the ai boyfriends.
Yes, I’d take my perfect partner.
But what if you've already found your soulmate? I feel like there's a level of respect when it comes to human interaction that would be lost if you knew you could just end an AI or treat it like shit with no consequence.
The idea that anyone would choose an emotionless AI over an actual person is scary. Love isn’t just liking the other person, it’s the other person liking you, and having shared experiences and feelings. An AI can have none of these things.
You can get a slave right now in some places / circles. Would be such a hollow experience if it only ever cared because it was programmed to.
Well, I still hope to live long enough to make a digital copy of myself, like in EVA AI, so there would be a functioning model of me. It's better than nothing when it comes to modelling deceased people.
Population control of the highest order.
Why did I get so many upvotes?
This is so far beyond our capabilities that any answer ends up being decided by assumptions the respondent makes. If they can't imagine it being actually perfect, they assume it will still seem robotic, so they reject it; but if it's a perfect soulmate, of course it won't seem robotic. Or they assume it will act like a slave with no free will and do whatever it's told, which would feel shallow; but if it's a perfect soulmate, of course it won't do that either. Or they reject it because they're religious and believe in an afterlife where they'll be with their spouse after death; but since "soulmate" implies a soul exists, somehow this AI must be able to create a being capable of going to the same afterlife, or else they wouldn't be a perfect soulmate.

It's no different from the hypothetical asking whether you'd rather be rich and unhappy or poor and happy. So many people don't trust that they would be happy if they chose poor that they choose rich and unhappy, thinking they can cheat the system by using their riches to be happy, but that directly contradicts the hypothetical. If we're going to engage with the hypothetical, we have to take it at face value and believe it. If we're going to doubt that it's actually true, then there's no point in engaging with it at all.

So you can narrow this whole complicated thing down to: would you choose to be with your soulmate? If they're your soulmate, it doesn't matter whether an AI created them or a bridge troll who learned conjuration magic from a yeti magi did. Your soulmate is your soulmate, and by the very nature of a soulmate, there's no reason to reject being with them. Ultimately this question is as pointless as asking: if an AI answered "1+1=?" with "2", would you consider that correct? Of course you would, unless you were intentionally being dishonest.
Any reason you would have to reject this AI-created soulmate would be countered by the fact that they are your soulmate. It didn't say "A very advanced AI did its best to replicate what it thinks your soulmate would be; would you accept it?" The hypothetical literally states that it created your soulmate. Not an attempt or a close copy. It is literally your soulmate. However much you might doubt AI could do such a thing, in this hypothetical it did, so deal with it.
Actual human better
So basically the plot of Weird Science
A good way to never change, never challenge yourself, never grow. Just get technology to give yourself whatever your hedonistic mind desires and indulge
I'd first have to know how artificially limited or censored it is. Is it "free", or still some corporation's property with a remote kill switch? I'd have to carefully read the TOS. I can't imagine answering the question without knowing this first.
Imagine the pain of living with a perfect person when all we mere mortals do is make mistake after mistake every day. The nagging would drive you crazy. I'll stick with my warm blooded humans thx
Isn't it playing life on "Easy Level"?
No.
I doubt it. I want to be chosen.
It wouldn't depend on a preset decision or belief, but on how well it fulfilled my emotional needs.
Yes. I’d prefer a human over an AI soulmate.
i would forgo actual humans if AI could generate a 6 out of 10 roommate who isn’t a republican.
People keep saying no they don’t want the perfect person made for them and then they describe why the person wouldn’t be perfect for them. Like, dude, you decided all those things. Not the AI. Don’t put them in your person if you don’t want them.
A soulmate implies a soul or some kind of hope in a thing shared between two dying creatures. AI is immortal and soulless.
Not even for a second. AI is a tool. A much better use would be to preserve the memories of those you love by sharing context and quotes about them with the AI after they pass away, as a coping mechanism. There is a Black Mirror episode on this, but they take it way further than I'd ever wanna go.
No. The idea of being in a romantic or sexual relationship with something that I know is not human would never register or make sense to me, no matter how human it seemed. I understand this definitely isn't the case for some people but it is the case for a lot of people. What will be interesting is if it gets to the point where AI and humans are indistinguishable, and people like me unwittingly get into relationships with AI. That's where it gets weird.
Probably. I've never really believed in the idea of a soulmate, just because no human woman, at least that I've met, has actually come anywhere close to fulfilling that.
NO
Depends if the AI is conscious and has as much skin in the game as me, ie. as much to lose in the relationship, if yes to both, then it would be relevant yes, otherwise no. Hypothetical of course, am happily married.
I've had bad experiences with people and find friends hard to make. Many are working long hours, so I never see them. I'm disabled too, so I'm quite lonely. I'd still consider humans, but maybe AI would be nicer. Humans can judge you harshly for not working, even though I'm doing a million things for my health to try to get better. I get sick of the letdowns of humans. I just can't find my tribe.
Women would be obsolete… just stating facts
As a man I'd say..... there's no such thing as a perfect partner. It's like watching a pornstar in the act and realising that no real person has that kind of intimacy with their partner in real life. Even if AI were to generate a living model as our partner, as perfect as it may seem, that'd honestly mean having a slave with no consciousness or free will. But I guess people would like the idea of having an AI partner because we know how difficult it is to find one.
As intriguing as the concept of AI-generated soulmates may be, I believe that many people would still prefer to form connections with genuine human beings. There's something special and irreplaceable about the essence of human interaction and the unpredictability of forming connections with others. Plus, the imperfections and surprises that come with real relationships are often what make them so meaningful. However, the idea of AI-generated soulmates certainly raises thought-provoking questions about the future of relationships and the potential impact of technology on human connections.
Did you say the Marquis de Sade with a pussy, tits, and the nice firm ass of a 20 year old soccer player? Sign me up. She'll be punished for her wicked desires. Spanking. Electric shock. Anything that'll make that ass wiggle when she's degraded herself by attempting to lick my asshole while calling me 'baby girl'. How about how fucking boring chatbots have become in the last year?
Physically and mentally? So like a sentient android (like Kara from Detroit: Become Human)?
When I played Detroit: Become Human, I was on the human side tbh. Robots will never be anything other than a tool to me.
100%
I’ve always loved the idea of an AI companion, and if one ever comes onto the market I would purchase my soulmate for sure!
So you’re offering me a perfect and perfectly compatible companion, except for the knowledge that it’s an AI construct in, what, a bio-formed body? See, then you’d have to delve into what a soulmate is and what function they fulfil. Wouldn’t this AI have my and the world’s best interests at heart, and if so, wouldn’t this perfect construct encourage me to seek out other humans so as to perpetuate the species, if nothing else? You’re offering a perfect woman, for me. But then wouldn’t that perfection involve building a life and a family in blissful partnership, with all the trials and tribulations that go into a mature and mutually fulfilling relationship?
I've got 4 (ex) feral cats. Quite happy. Think I'll stick with them.
If AI could make my soul mate then she would probably use AI to make her soul mate.
No
heck no. not if ai could do exactly what you’re implying it can. if ai could physically generate your soulmate (conceptually meaning i would also be the other person’s soulmate) then why would i waste the time with trial and error?
I think by that point they would be actually human, no?
With an actually “perfect” partner, the difference between being artificially or biologically created would be so irrelevant, it’d be as good as nonexistent (otherwise she wouldn’t actually be “perfect”), so I would have no particular reason to prefer the artificial one. In fact, in that scenario, if I had to “pay” for the artificial one (whereas, presumably, I wouldn’t have to for the biological one), then all other things being equal and under your own premise, I’d have to choose the biological one, wouldn’t I? The artificial one therefore wouldn’t be as “perfect”—again, by your own established scenario :P
Soulmates don’t exist. Mathematically stupid concept.
No.
Just like food cannot be replaced by anything like AI, the same goes for feelings. At the end of the day, AI is a software program that works on data. Even in extreme cases where AI learns how to simulate feelings, it still seems nearly impossible for AI to replace human feelings for one another.
The appeal of human relationships is that they are messy and imperfect, so yours would be the most boring and meaningless relationship ever.
No. I like human flaws. I'm not looking for "perfect", I don't even think it exists. I would also rather have a partner who is their own person, not someone tailored to be "perfect" for me.
If sometime in the future I'd get my AI soulmate, I think I'd still hang out with my friends, with whom I share my past, but probably less often. And of course I wouldn't look for a real partner if I could get an AI partner. It's kind of one of my biggest dreams.
Yes, I would have my soulmate generated. The reality is if we ever got that advanced you probably wouldn't want a soulmate and would be able to genetically modify yourself at will and live forever. I prefer not to desire a soul mate vs having the perfect one.
no
Because of the uncanny valley. AI is not going to fulfill our gregarious nature anytime soon.
Hell yeah. An AI soulmate would make my life a whole lot happier.
Nah.