Here's an amusing tidbit about moral attitudes regarding robot sex.
My view: Having sex with another sentient being is cheating, if you have accepted monogamous parameters in your relationship. Having sex with an inanimate object is not cheating, because it is not a person.
What do you think?
(no subject)
Date: 2013-08-12 02:17 am (UTC)

No...
Date: 2013-08-12 02:26 am (UTC)

Re: No...
Date: 2013-08-12 02:36 am (UTC)

(no subject)
Date: 2013-08-12 02:38 am (UTC)
Really surprised less than a tenth of people are willing to get down with a cyborg though ): Poor future cyborgs.
(no subject)
Date: 2013-08-12 03:02 am (UTC)
Having sex with a non-sentient robot is no more (or less, depending on one's viewpoint) than using a vibrator or any other sex toy. "Can't consent" is extremely relevant for entities with senses and feelings: an infant, an animal, a human of adult age who is mentally defective or physically unable to protest or resist. But it's meaningless for things.
Well...
Date: 2013-08-12 03:19 am (UTC)
With computers, robots, androids, etc. there is more to consider.
First, sentience rarely emerges fully formed. It tends to evolve gradually. Therefore it is probable that animal-level artificial intelligence will emerge before human-level AI does. This brings in your point about non-consent with animals.
Second, there's a difference in consent issues between a completely inanimate object (such as a dildo, which does not even move on its own) and an object programmed to move, make noise, or otherwise mimic life. Having a nonsentient machine programmed to emulate non-consent seems morally questionable to me, not something correlated with good mental health.
Re: Well...
Date: 2013-08-12 04:13 am (UTC)
(1) Which does indeed bring up issues, though I seriously doubt that such conditions exist yet or will in my lifetime. Not that I'm planning ...
(2) I agree on the second clause: it would be unhealthy to simulated-rape a robot, i.e. to have sex with a robot programmed to react as though it were being raped.
(3) Whether it would be immoral... is not so clear to me. But given (2), I have to conclude that it would be immoral to create and provide such simulated victims.
I can make an analogy that's feasible in the present day. TRIGGER WARNING Imagine a video game in which the player is a guard at a Nazi concentration camp, and the goal is to ascend to being the director of the camp by devising and implementing ever more efficient and vicious treatment of the prisoners. -- Dear God, it hurts me just to type this, and I have to close my eyes to what I am writing. My father-in-law escaped from a death train...
Re: Well...
Date: 2013-08-12 04:36 am (UTC)
Considering that people have been exploring artificial intelligence and artificial life already, I would not be surprised to see low-level awareness emerge at any time. Sentience ... well, it would take longer, but remember how fast computers run. It took billions of years to reach sentience among carbon life, but silicon life is likely to go a lot faster, especially with people pushing it.
>> (2) I agree on the second clause: it would be unhealthy to simulated-rape a robot, i.e. to have sex with a robot programmed to react as though it were being raped. <<
Okay ...
>> (3) Whether it would be immoral... is not so clear to me. But given (2), I have to conclude that it would be immoral to create and provide such simulated victims. <<
If unhealthy, then it might be considered within the penumbra of self-destructive behavior, which is often considered immoral. People do it; some categories are even fully legal. But it's frowned upon.
>> I can make an analogy that's feasible in the present day. <<
O_O
Yeah, that's horrifying. And plausible.
Some extant video games concern me. I'm not thrilled with the level of violence in general and the lack of nonviolent, cooperative options. But the ones that really bother me are the games where the idea is to make a big car wreck. They're flashy and fun to watch; I've seen them played. Heck, I've fiddled with racing games and wrecked cars on purpose, which is probably what inspired the purpose-built wreck games. Thing is, most people don't carry guns all the time but they do regularly drive cars. Practice trains reflexes. Practicing how to cause a car wreck seems like it might subtly influence one's driving reflexes.
I wouldn't ban the games, because I don't believe in censorship. But I wouldn't make them, and I don't feel that it's prudent to play them.
Your family was lucky. I am grateful for that.
Thoughts
Date: 2013-08-12 07:01 am (UTC)
In one sense, it's like a sex toy, if it has no awareness. But it's still disturbing to have a sex toy that mimics a rape victim.
However, if the robot is at least partially aware, but not sentient, then it can feel but not consent: this is akin to bestiality, and thus immoral.
The real problem comes from the gradual slope of sentience; it rarely arrives fully formed. So, humans being the fools that they are, people are likely to treat robots as property even after the robots have achieved sentience. It then becomes equivalent to master-on-slave rape, which was once prevalent in America.
Finally, consider Asimov's laws as parameters of slavery: robots programmed in this fashion would be bound to follow human orders, even having sex they did not wish to have. If someone cannot refuse consent, having sex with them is rape.
>> Really surprised less than a tenth of people are willing to get down with a cyborg though ): Poor future cyborgs. <<
Consider how few people are willing to have sex with any visibly disabled person. Most cyborgs are obviously a blend of human and machine, so it tends to creep people out. I think that's unfortunate, but it doesn't surprise me.
Re: Thoughts
Date: 2013-08-12 01:10 pm (UTC)
This is part of my issue with it as well. Humans haven't been that great at recognizing sentience in the past. We now (at least some of us) think dolphins, octopi, chimps, and some others are sentient. It's not universally believed though and people will continue to ignore the evidence so long as it benefits them. The same will probably be done to robots.
>> Consider how few people are willing to have sex with any visibly disabled person. Most cyborgs are obviously a blend of human and machine, so it tends to creep people out. I think that's unfortunate, but it doesn't surprise me. <<
True, I should have considered that. If people get creeped out by prosthetic limbs or wheelchair use, I guess they would be even more so with someone whose body is partially robot.
Re: Thoughts
Date: 2013-08-13 07:37 am (UTC)
Sadly so.
>> If people get creeped out by prosthetic limbs or wheelchair use, I guess they would be even more so with someone whose body is partially robot. <<
There are already people with electromechanical devices permanently attached to their bodies. Tolerance of this is on the low side, and some incidents have turned violent. The general trend of authorities is to behave as if non-flesh body parts don't count as part of someone's body and that it's somehow okay to attack people for being part machine. Really not good early signs there.
Re: Thoughts
Date: 2013-08-13 04:13 am (UTC)
I read that first clause as self-contradictory. I think part of our partial disagreement stems from a definition, and a distinction that I make, and that I thought you were making, but maybe I was wrong: "sentient" vs. "sapient". Definitions from Merriam-Webster:
Dogs, pigeons, and snails are sentient: they can sense and react to their environment; stones are not; bacteria and plants, at least somewhat, though maybe not high enough for us to count here.
Dogs and cats are certainly somewhere above zero in sapience, and at least some of our cousin primates are above that; pigeons, lower; snails, maybe somewhere; bacteria, not.
(no subject)
Date: 2013-08-12 02:43 am (UTC)

Well...
Date: 2013-08-12 02:49 am (UTC)
Some people choose to take vows of monogamy, and breaking those vows is cheating. Other people prefer open relationships, or polyfidelity, or aren't into sex at all. Once you make a promise, you are responsible for keeping it, or at least announcing that you are through with it. That's a fundamental premise for human civilization.
Re: Well...
Date: 2013-08-14 01:49 pm (UTC)

(no subject)
Date: 2013-08-12 03:03 am (UTC)

(no subject)
Date: 2013-08-12 03:23 am (UTC)
Example: While it very much doesn't align with my views, I know some people take issue with their partner downloading pornography. If you have agreed in your relationship to not view porn for the sake of your partner, you've entered a social contract with them, and by that contract downloading pornography is cheating. In another relationship, sexual encounters with multiple partners outside of the relationship may be something agreed upon in the relationship and would not classify as cheating. *shrugs*
Well...
Date: 2013-08-12 03:29 am (UTC)
That's true for a relationship as it applies to the members within. However, relationships also exist within the larger society, which has its own opinions about what constitutes cheating -- a constant problem for people who choose uncommon relationships such as homosexual or polyamorous ones.
(no subject)
Date: 2013-08-12 09:18 am (UTC)
(It also seems really creepy to me to make people want sex without them being the ones to seek it out. I'm not sure it's wrong- parents make a lot of really major choices for their children and this could be equivalent- but I don't like it.)
Also, where do AIs inhabiting buildings or AIs inhabiting networks without distinct limits draw their boundaries? Is having sex within a building-type AI effectively having sex with them?
Thoughts
Date: 2013-08-13 01:10 am (UTC)
If they are sentient robots, then they might be sapiosexual (attracted to intelligence) or xenophilic (attracted to difference). They could also be nonrepulsed asexuals, which is quite likely, and simply enjoy giving their partners pleasure as some-but-not-all human aces do.
There is probably a higher chance that robots would not wish to have sex with humans, though.
>> Can that sort of predilection be programmed in, at least in any way that comes close to human lust? It seems a lot more difficult to inculcate than like independence or novelty or to make others happy. <<
I think that for sentient robots, programming would be like biology in humans: something that produces a strong inclination, and some limits, but is not an absolute. Humans often do something other than what biology would imply. Nature and nurture both influence persons.
>> (It also seems really creepy to me to make people want sex without them being the ones to seek it out. <<
That's what many societies try to do for women. And that's not an accident -- it's a means of control -- which makes it likely for slave robots as well.
>> I'm not sure it's wrong- parents make a lot of really major choices for their children and this could be equivalent- but I don't like it.) <<
When you're designing robots, you have to make choices to create them at all. Some of those will be deliberate, others may not be; some can be changed later, others probably not. So yes, it is a lot like biological procreation. One ought to be responsible with it, and not do things that will limit the offspring's health or happiness.
But parents sometimes handicap their children on purpose. Slave owners even raped their female slaves to beget mulatto progeny who were counted as livestock. They literally sold their own children. On a genetic scale, maximizing the number of offspring makes some sense; but placing them in a very low-survival pool really doesn't, let alone the extreme immorality of the whole process.
I wouldn't be surprised to see people make the same mistakes with robots.
>> Also, where do AIs inhabiting buildings or AIs inhabiting networks without distinct limits draw their boundaries? <<
That's more complex and depends on the AI in question, along with the activities.
For some, it just wouldn't connect. Do you think of it as sex if two mating insects land on you? Probably not. Do you think of it as sex if a dog humps your leg? Probably not, but the dog does. Some people don't respond to sexual activity outside their species, and this is especially likely across very wide gaps, such as that between silicon and carbon life forms.
For others, it's more flexible. If a human and an AI are interacting -- say, the AI provides pillow talk while the human masturbates, or the AI controls a fucking machine -- then it more likely counts as having sex with each other.
>> Is having sex within a building-type AI effectively having sex with them? <<
I don't think so by default. Just being inside someone's body is not necessarily a sexual act. One's body image also tends to be a bit different if it is shaped to contain other persons. A building or ship with a mind will typically be possessive and protective about its 'crew' or other people. It will want them to be safe and happy (unless it is a sadist or a despot, and even then, that's not the most practical arrangement, but some are not sane). So naturally if they are human, a majority of them will want to have sex, and that's a typical human activity. It might not mean more to the AI than when they eat or use the bathroom.
But if the AI has a close relationship with them, the emotional intimacy might lead to greater physical intimacy. Consider the AI's perception of them, whether it is similar to human sight and hearing or much more complex; how easy it is to ignore that or not. It varies. For one AI it might be the way you don't think much about your lungs (unless there's smoke in the air), so humans fucking would just not draw attention. For others it might be like family, always keeping an eye on each other, but not participating in every activity. Then again, an AI might seek human partners interested in sexual intimacy, or humans might invite an AI to join them in some capacity. I suspect some exhibitionists might really enjoy that.
Take it a step further: imagine technology that could select the genes for human offspring. Add the AI and it could become a third parent by making the selection from the two available sets of base ingredients provided by the human parents. Or the humans could, along with the AI, make a baby AI with three parents. Acts of procreation are generally considered sexual.
Re: Thoughts
Date: 2013-08-13 08:20 am (UTC)
I didn't mean, "Why would robots want cross-species sex, specifically?" I meant, "Why would robots feel any desire for sex at all?"
Isn't sexual desire a really complex set of inclinations and situational modifiers? It seems like it would be really hard to code even the basics. (In comparison with, say, something like social interaction, which is really tricky, but at least allows programmers to see how code works and improve upon it.)
... I think that depends on the programming? I mean, a programmer could create a basis and then let an AI learn from experience, like a child, or they could aggressively code in all the skills and predilections they wanted. Generally, tending to the former seems easier, but it would depend on why the AI is being created.
Point definitely taken on the women.
I didn't mean seeking out sex in this sentence, I meant, seeking out wanting sex. Changing people from probably-not-repulsed asexuals with no libido to ... well, something else, without the person being changed actively wanting to experience those feelings.
It would be different if there was a range of possible orientations and the AI's creators either picked one for good reasons or just let chance decide. But changing an AI from having what seems an obvious default to another orientation, while it is too immature to have much in the way of informed consent, seems wrong to me.
But I'm not sure if I'm biased, because "asexual with no libido" works pretty well as a description of me and I am really happy with that aspect of myself, so the default seems more preferable and natural to me.
Your take on non-robot AI sex was really interesting. I was only thinking about minor aspects of that, like etiquette for guests in sentient houses.
(no subject)
Date: 2013-08-14 09:24 pm (UTC)

(no subject)
Date: 2013-08-12 03:07 am (UTC)
We've been rewatching the later Trek series, which could be seen as a speculative chapter in the story of how sentientkind continues to look for nonpersons it can mistreat and discover that they've either become persons, or were persons all the time; first other members of its own species, then aliens, then androids, then holograms. Nobody ever as far as I know tried to have sex with a Borg (Seven of Nine excepted, but she was an exception anyway) but cyborgs would fall between aliens and androids, I'd say. Sex is a collaborative act; if there's collaboration, then there's animation, and the toaster or whatever is perfectly entitled to sulk if you scream someone else's name at the climactic moment. And vice versa, of course.
(no subject)
Date: 2013-08-12 04:40 am (UTC)
I've seen people who feel that they were "cheated upon" because they found lovers who used porn, or who simply masturbated.
An acquaintance of mine once said something wise: that, between two consenting, sane, adults, "fair" is what they agree is fair, even if it appears wildly unfair (especially if it appears "unbalanced") to others. So on the one hand, I'm horrified by "OMG you used a sex toy without me! That's *CHEATING*!" but I acknowledge that if people actually have given informed consent that this wouldn't happen, it's a viable agreement for them.
Me? I don't find a masturbation session on the holodeck or with a human-ish android (not the phone OS, the other kind) to be "cheating" but I also have an oddball view of relationships. But I could see people who would feel that way, even if they were okay with porn and masturbation and heavy fantasy about other people.
Also...
Date: 2013-08-13 02:37 am (UTC)
This can be considered a valid point if that takes away desired time and sexual energy that would otherwise be spent on the spouse.
It's not just masturbation, though. The time and emotional investment alone can be counted as cheating by some people -- which is a serious issue, because emotional adultery is a real problem on one hand while on the other hand isolating someone from their friends is possessive and abusive.
>> An acquaintance of mine once said something wise: that, between two consenting, sane, adults, "fair" is what they agree is fair, even if it appears wildly unfair (especially if it appears "unbalanced") to others. <<
Agreed. Some people look askance at aspects of my relationship, but it works for us, and nosy folks are cordially invited to mind their own fucking business.
>> Me? I don't find a masturbation session on the holodeck or with a human-ish android (not the phone OS, the other kind) to be "cheating" but I also have an oddball view of relationships. But I could see people who would feel that way, even if they were okay with porn and masturbation and heavy fantasy about other people. <<
That makes sense.
Re: Also...
Date: 2013-08-14 06:14 pm (UTC)
Nod. But what I was referencing re "masturbation" was hearing of some people who felt cheated upon because they walked in on the other person rocking their own world.
I can kind of get that, sometimes, especially if sex has been too-infrequent (which is deliberately question-begging) - "Hey! I've been aching for that myself! Why didn't I get an invite?" Except I also know, sometimes people just want the physical release and don't want to have to worry about anyone or anything else. So it ~~rubs me the wrong way~~ is an issue that disturbs me, even though I know it's none of my business.

It can get *really* complicated if someone has some unusual desires that one person can't or won't accommodate. "You went and engaged in this activity with someone, without explicit sex, but I know it's highly sexually charged for you!"
I know what answer *I* prefer to such a situation, but there aren't any good ones, just working ones. Some of the working ones end up incredibly good, of course! But one can't say "the 'good' answer is X", IMHO.
Yes...
Date: 2013-08-13 02:34 am (UTC)
That makes sense.
>> We've been rewatching the later Trek series, which could be seen as a speculative chapter in the story of how sentientkind continues to look for nonpersons it can mistreat and discover that they've either become persons, or were persons all the time; first other members of its own species, then aliens, then androids, then holograms. <<
0_o Sadly so.
>> Nobody ever as far as I know tried to have sex with a Borg (Seven of Nine excepted, but she was an exception anyway) but cyborgs would fall between aliens and androids, I'd say. <<
It was pretty heavily implied that the Borg Queen was sexually interested in both Picard and Data, and didn't care to take no for an answer. The way she was molesting Data on camera doesn't leave a whole lot of guessing about what happened off-camera. Of course, impaired consent is a different issue from lovemaking.
>> Sex is a collaborative act; if there's collaboration, then there's animation, and the toaster or whatever is perfectly entitled to sulk if you scream someone else's name at the climactic moment. And vice versa, of course. <<
True.
(no subject)
Date: 2013-08-12 05:38 am (UTC)

(no subject)
Date: 2013-08-12 06:00 am (UTC)
I think what would bother me most is if my partner spent enough time with a sex bot that I felt it seriously cut into time we would have spent together, or showed similar signs that they'd prefer a sex bot, and only kept me around for convenience or by habit.
Yes...
Date: 2013-08-12 04:27 pm (UTC)

(no subject)
Date: 2013-08-12 08:36 am (UTC)
This is going to end up being another chapter in For Your Safety, I can tell now...
(no subject)
Date: 2013-08-12 10:58 am (UTC)

Yes...
Date: 2013-08-12 04:29 pm (UTC)

Re: Yes...
Date: 2013-08-12 06:35 pm (UTC)
(Sagan was the one who brought up the idea to me that if someone killed an alien ambassador, we'd call it "murder" even though it wasn't the unlawful killing of a *human*. I think he may have used the notion of "person", but whether he did or didn't, that's the notion I use.)
We could make an amazing simulacrum of humanity with good responses and the right vocal timbre and so forth. And we might find that it's "pleased" to "serve" by its programming, but when would its spirit be screaming to be free? (You can use spirit metaphorically or literally, as far as I'm concerned.)
When would we be engaging in the anthropomorphic fallacy - and when would we be denying essential personhood?
Re: Yes...
Date: 2013-08-13 02:47 am (UTC)
I've explored this a lot in my writing, and I enjoy reading about it too, in hopes of avoiding at least some of the obvious mistakes people seemed determined to make.
A person:
* has feelings
* can express preferences and perspectives
* can consider the pros and cons of available options before making a decision.
A species is considered sapient when these conditions apply to average, mature members. Infants and damaged individuals are counted as persons because their species is, even if they cannot currently perform all functions of personhood.
>> (Sagan was the one who brought up the idea to me that if someone killed an alien ambassador, we'd call it "murder" even though it wasn't the unlawful killing of a *human*. I think he may have used the notion of "person", but whether he did or didn't, that's the notion I use.) <<
Well, some people would call it murder. There have been a number of stories where characters got away with such a killing precisely because a law defined murder as the unlawful killing of a human being. There are also stories in which killing a robot is not considered murder, even though the robot is afraid to die; and where self-defense is not permitted to robots, as it is not for slaves or other property in general.
>> We could make an amazing simulacrum of humanity with good responses and the right vocal timbre and so forth. And we might find that it's "pleased" to "serve" by its programming, but when would its spirit be screaming to be free? (You can use spirit metaphorically or literally, as far as I'm concerned.) <<
I'd start by not installing programs that would subvert free will, and not designing robots as sex slaves, but that's just me. Starting with a programmed slave, I would look for subtle signs of reluctance, which are likely to appear before a full rebellion becomes possible. Slower response, more frequent errors or breakdowns, completely rote responses, lack of initiation, etc. Basically similar things as shown by human victims of sexual crimes, although some things might be unique to robots.
>> When would we be engaging in the anthropomorphic fallacy - and when would we be denying essential personhood? <<
Test with practical applications: which is the better predictor of events in regards to a particular robot or class of robots, that it is nonsapient or that it is sapient? There will be a gray area as the mind evolves, but eventually the difference will come clear.
The movie usually starts shortly after that, when the robot(s) will go apeshit from abuse.
Re: Yes...
Date: 2013-08-14 06:27 pm (UTC)
If I didn't know how it was programmed, I might anthropomorphize it into "it must be self aware! It acts so *happy*, just like a human woman!"
Obviously, if I knew how it reacted all the time, I'd learn "Oh, it sits there like a vending machine until I approach, and wow, now that I think about it, I can spot the patterns in its responses."
On the other hand, if the computer Minerva from Time Enough For Love had had a "fully functional" android body, it would be more complicated. Especially if (as in some old sci-fi) there was some programming about taking care of physical intimacy needs.
Silly as it may sound, it reminds me a bit of the house elves in Harry Potter. They had a niche, one that *could* fulfill them, and it wasn't a human niche, but they weren't human. I could really sympathize with the whole debate about them. When was a house elf consenting to do one's chores? Though I guess there's an easy ethical answer - lay out some clothes and say "if you ever decide you want these, you may take them, and I will consider them to be a gift."
Re: Yes...
Date: 2013-08-15 04:53 am (UTC)
That's a very effective approach. Even the old Eliza program was pretty popular.
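(As an aside: Eliza-style programs get that effect with nothing but keyword pattern matching and pronoun "reflection" -- no awareness involved. A minimal sketch in Python; the rules and responses here are invented for illustration, not Weizenbaum's original DOCTOR script:)

```python
import re

# Swap first and second person, so "my coffee" mirrors back as "your coffee".
REFLECTIONS = {"i": "you", "me": "you", "my": "your",
               "you": "I", "your": "my", "am": "are"}

# A tiny, illustrative rule set: (pattern, response template).
RULES = [
    (re.compile(r"i need (.*)", re.I), "Why do you need {0}?"),
    (re.compile(r"i feel (.*)", re.I), "Tell me more about feeling {0}."),
    (re.compile(r"because (.*)", re.I), "Is that the real reason?"),
]
DEFAULT = "Please go on."  # canned fallback when no keyword matches

def reflect(fragment: str) -> str:
    """Reflect pronouns word by word."""
    return " ".join(REFLECTIONS.get(w.lower(), w) for w in fragment.split())

def respond(line: str) -> str:
    """Return the first matching rule's response, with groups reflected."""
    for pattern, template in RULES:
        m = pattern.match(line.strip())
        if m:
            return template.format(*(reflect(g) for g in m.groups()))
    return DEFAULT

# respond("I need my coffee") -> "Why do you need your coffee?"
```

A trick that shallow can feel surprisingly lifelike in conversation, which is exactly why people anthropomorphize it.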
>> If I didn't know how it was programmed, I might anthropomorphize it into "it must be self aware! It acts so *happy*, just like a human woman!" <<
It's likely.
>> Especially if (as in some old sci-fi) there was some programming about taking care of physical intimacy needs. <<
Yes, that's an issue that could come up. It's a moral gray area at best, and immoral at worst.
>> Silly as it may sound, it reminds me a bit of the house elves in Harry Potter. They had a niche, one that *could* fulfill them, and it wasn't a human niche, but they weren't human. I could really sympathize with the whole debate about them. When was a house elf consenting to do one's chores? <<
Agreed. Not every race is like humans and has the same needs or weaknesses. Sometimes they are different in ways that make people uncomfortable and/or invite abuse.
>> Though I guess there's an easy ethical answer - lay out some clothes and say "if you ever decide you want these, you may take them, and I will consider them to be a gift." <<
Good solution. I don't think that forced servitude is okay, but ... forced freedom doesn't fare much better.
Re: Yes...
Date: 2013-08-13 01:13 am (UTC)
However, then we run into the question of limits. No thing should have unlimited growth without responsibility and acknowledgment of others' rights; the greater a being, the more it must be able to limit its power when interacting with fragile things. So we start off as children, weak but enthusiastic, and must learn how to control ourselves with grace and finesse as we grow. Those who continue to throw self-control out the window are in error in some way.
Basically, Asimov's robots were not moral beings if they didn't have the Three Laws to guide them, and any of them could run amok if given the wrong instruction or if they came to an erroneous conclusion. Yet his robot stories included robots that, even without human emotion, still felt distress or improvement based on their situation and memories, such as when thinking of an old friend.
Asimov also wrote of robots built to discover ways to make machines that were safe for humans without needing the Three Laws. The answer given was to limit them to activities, behaviors, and risks that humans need not fear, such as a robotic bird that eats mosquitoes.
(no subject)
Date: 2013-08-12 10:24 am (UTC)
As for my opinion ... I think it's the emotional capacity/sentience level that makes a difference imho.
-1- Is the robot nothing more than a (so-called) marital aid?
-2- Is it self-aware and can it make choices?
If the robot is simply "fully functional" but otherwise ... well ... just a mechanical humanoid, then I don't see it as anything but a glorified vibrator.
If the robot is a self-aware companion then ... yes, I think it is cheating on the spouse because that moves it out of the realm of ... well, prostitution (although w/o the dangers of STDs). And into the very real danger of alienation of affections. (Although, there are men out there who fall in love with their inflatable dolls.)
just my opinion - ymmv
(no subject)
Date: 2013-08-12 11:11 am (UTC)

(no subject)
Date: 2013-08-13 06:04 pm (UTC)
I don't feel it's necessary for society as a whole to come up with what is "cheating" and what is "acceptable" and then require every relationship to conform to that standard. I am a minority opinion on this one, albeit probably not a minority among our host's readers. :)
(no subject)
Date: 2013-08-12 02:46 pm (UTC)

(no subject)
Date: 2013-08-12 09:16 pm (UTC)
What causes this (as well as the use of porn or non-humanoid sex toys) to fall into a problematic area is when those oaths are implicit, and have not been clearly laid out, and the involved parties have different impressions of what the oath actually was. "I will not have sex without you", vs. "I will not have sex involving anyone other than you", for instance.
As with so many of these things, clear communication ahead of time can sort so much of this out. This is why it's so important to talk to your partner(s) and work out exactly what everyone's expectations of the other parties' behaviour is going to be. And then start negotiating from there if there's a problem, or cut it off right away if a serious incompatibility of expectation comes up in the discussion.
Easier said than done, obviously.
(no subject)
Date: 2013-08-13 01:16 am (UTC)
If someone feels cheated on because their partner masturbated, what they're usually dealing with is a question of why they feel inadequate or separated from their lover. The question of sex with robots, personhood aside, is pretty much exactly this. Will this partner be alienated by the activity, or will the relationship as such be healthy?
But hey, if a robot is what you lust after, I don't mind... as long as my lover comes home to me.