Are there any lefties out there that want to debate #sexbots arguing that it’s a form of slavery?
— ◉ (@brightabyss) October 19, 2018
Buying and selling a rubberized doll with mechanical parts isn’t slavery, I would say. Buying and selling a human is slavery, on the other hand. Can only humans be slaves?
— Stygian Obsidian (@LapisAlienus) October 21, 2018
That’s a great question. Forcing meat-bots or robots, or anything with its own desires, to do anything against its will is slavery. And total bullshit. But what if the sexbots are completely programmed to just want sex?
— ◉ (@brightabyss) October 21, 2018
A slave is the property of a person. If we accept that the sexbot is the property of a person, then the question of whether only humans can be slaves becomes key, I think.
— Stygian Obsidian (@LapisAlienus) October 21, 2018
Partly, yes. And the question of what constitutes slavery: issues of agency, function, design?
— ◉ (@brightabyss) October 21, 2018
True, but my question pertains more to sexbots’ desires and agency and the ethics of design. Slavery is imposed. Designed function doesn’t operate the same way.
— ◉ (@brightabyss) October 21, 2018
Let’s hypothesize that there was a reliable Turing test capable of distinguishing human from non-human intelligence, and that this test could not distinguish the sexbot from a human. Would the sexbot be a slave in this hypothesis if it was owned by somebody?
— Stygian Obsidian (@LapisAlienus) October 21, 2018
Yes, if it expressed any desires to live any way other than sexing. If it had the kind of sapience that desired autonomy from either owner or tasks, then its will is being imposed upon. That’s slavery and not cool.
— ◉ (@brightabyss) October 21, 2018
The module/program that is oriented towards individualist forms of self-organization and autonomy-seeking identity is not, I argue, a necessary feature of being a sentient assemblage. Those not like us can work as part of a collective without it being a form of domination.
— ◉ (@brightabyss) October 21, 2018
I think we humans can work non-coercively as a collective, and I think we have done so historically for the vast majority of our existence.
— Stygian Obsidian (@LapisAlienus) October 21, 2018
To varying degrees, at different times and in different places, sure. But we have also been dominating jerks a lot of the time too (to other humans and nonhuman animals). Think about the constant murder of livestock animals.
— ◉ (@brightabyss) October 21, 2018
Oh yes, the state, class society and war machines have been around for millennia, I agree. But we were able to work collectively and noncoercively, and even now such behavior is still apparent.
— Stygian Obsidian (@LapisAlienus) October 21, 2018
*******
No, I’m doing the opposite. I’m suggesting that we take actual difference seriously and not impose human moralistic codes on non-animal kinds of minds.
— ◉ (@brightabyss) October 22, 2018
the problem is also in the idea that something can be programmed ‘to desire’ being what it is & that a bot can desire to be something humanlike without being something humanlike. Things can’t be programmed 2 desire being itself – it needs to have its own form of robot desire
— [E]vie=[Y]v (@EvieYv) October 22, 2018
In other words, a bot could not be programmed to desire to be itself, because that is the desire of the programmer, not the desire of the robot.
— [E]vie=[Y]v (@EvieYv) October 22, 2018
I think I get what you are saying, but I’m not sure a bot can’t be programmed (at some point) to desire to fulfill core functions, which, when actualized, give it a sense of accomplishment and purpose. Primates evolved those motivators, so they can (in theory) be designed.
— ◉ (@brightabyss) October 22, 2018
But can I make a spider-robot desire to perform spider-like things without it having other spider-like qualities necessary to make the spider-bot want to be a spider-bot in the first place? A dildo is a sexbot; it serves its purpose without desiring to be a dildo.
— [E]vie=[Y]v (@EvieYv) October 22, 2018
But what you want to make is a Desire-bot, not a sexbot. So you’re asking about making something have an X-creaturely desire without it also being considered an X-creature. But why not just have Robot Sex instead of mimicking human desire?
— [E]vie=[Y]v (@EvieYv) October 22, 2018
Good questions. But I think it’s more complex than that. I’m positing a sexbot that 1) has core functions/skills/capacities such as performing sex acts, and 2) also has a programmed desire to want to achieve success in that role, and to feel rewarded in doing so.
— ◉ (@brightabyss) October 22, 2018
Its desires wouldn’t be human (such as “getting off”) but rather in achieving success in doing what it was designed for (activating human pleasure). Who’s to say such bots even have to be human-like? Maybe they look like Cthulhu and have a totally different bodily structure?
— ◉ (@brightabyss) October 22, 2018
Well yeah in my opinion that sounds more like how robot sex would go. But if it wanted to feel like it achieved its utmost potential in that role, it would need to not be free to produce the terms of its service, like an entrepreneur not a slave
— [E]vie=[Y]v (@EvieYv) October 22, 2018
*would NEED to be free to produce the terms of its service, ie, self-improve them
— [E]vie=[Y]v (@EvieYv) October 22, 2018
And then it might hire human sex assistants to practice on
— [E]vie=[Y]v (@EvieYv) October 22, 2018
That’s an interesting take. You are situating the agency of sexbots within a capitalist motivation/intention structure. 🤔
— ◉ (@brightabyss) October 22, 2018
Because yes it’s still slavery (like the rest of the bots without this desire-program). In order for it not to be slavery, it would have to self-assign itself as a slave, like a masochist, or be free to command its services
— [E]vie=[Y]v (@EvieYv) October 22, 2018
Or maybe if it desires to fulfill its function, despite not assigning its own functions or commanding its own services. Is a lioness a slave of its pride by virtue of its programming (DNA) and desire (impulse) to be a lioness?
— ◉ (@brightabyss) October 22, 2018
But a lioness is being a lion and protects its pride to continue its own speciation; it doesn’t serve an elephant’s desires. The lioness is sort of a slave to her own species tho yeah. But I would use that word as most fitting
— [E]vie=[Y]v (@EvieYv) October 22, 2018
And a sexbot is being a sexbot to fulfill its internally directed (programmed) purpose. I’m not seeing a logical argument here for why something designed to be a certain way is a slave if it desires or enjoys being what it is. Is it unethical if there is no suffering involved?
— ◉ (@brightabyss) October 22, 2018
I think because your question is not about if slavery is ethical but as a relative term to self-aware enjoyment of a role. Can you program something to enjoy being a slave? Maybe. Does that mean it’s not a slave? No.
— [E]vie=[Y]v (@EvieYv) October 22, 2018
But yeah then the question “if there’s no suffering, then is it unethical?” relies on defining suffering
— [E]vie=[Y]v (@EvieYv) October 22, 2018
And the argument “there’s no suffering happening, because X doesn’t feel things, or because X likes it” never really works, as others have pointed out in this thread.
— [E]vie=[Y]v (@EvieYv) October 22, 2018
So on what grounds do you justify denying sexbots their agency (of wanting and enjoying performance of their function) by calling them slaves, & perhaps not allowing them to exist in that state? Who’re you to impose your ethical programming on the relationship between bot + user?
— ◉ (@brightabyss) October 22, 2018
But how do you define agency as ‘wanting and enjoying the performance of their function’? How do you confirm that the sexbot doesn’t suffer? How do you confirm that the sexbot enjoys it? You can only say so because it is assumed that that is controllable
— [E]vie=[Y]v (@EvieYv) October 22, 2018
This is a thought experiment. The ‘how’ is a technical issue. The ethical question arises if the conditions outlined are given: that a) the sexbot is programmed to enjoy its function, and b) it derives its purpose and self-actualization from achieving success with that function.
— ◉ (@brightabyss) October 22, 2018
? I was the one who was using slavery in the literal sense of robots not having autonomous decisions, such as how all of our robots currently work. Then you said no, not that type of thing, but about whether ethical suffering is caused
— [E]vie=[Y]v (@EvieYv) October 22, 2018
And all I’m saying is that you can’t argue whether or not a robot suffers, only if it has autonomy or not
— [E]vie=[Y]v (@EvieYv) October 22, 2018
The assumption is that these bots don’t suffer, because they are programmed to want to do their jobs to the best of their ability (to not suffer).
— ◉ (@brightabyss) October 22, 2018
Let me try one more time to rephrase. You are asking 1. ‘can something be 100% deterministically programmed to not consider itself deterministically programmed but as having its own volition?’ Yeah maybe it can, but it’s still deterministically programmed.
— [E]vie=[Y]v (@EvieYv) October 22, 2018
And 2. You r asking “Is it unethical to make a bot aware of itself & think it is having its own volition to perform its task, thereby avoiding an unethical situation, because we can 100% know that it doesn’t suffer since we programmed its volition to want to perform its function”
— [E]vie=[Y]v (@EvieYv) October 22, 2018
And my reply is: No we cannot 100% know that it does not suffer even if we can program it to want to perform a function. Because knowledge of suffering is not exhausted by this premise. My question: do you think it is unethical if a bot does in fact suffer but in an unknown way?
— [E]vie=[Y]v (@EvieYv) October 22, 2018
Suffering was off the table in my thought experiment from the get go. My sexbots don’t suffer from performing their function. The opposite.
But to answer your unrelated question, YES if sexbots suffered in ANY way from their design or from their sex work, it is super unethical!
— ◉ (@brightabyss) October 22, 2018
Yeah I don’t think humans are radically different than AI in the sense that I don’t think humans have totally non-deterministic volition. And I’m very interested in non-animal minds. But ethics is fundamentally about human decisions & your original post asked ‘are sex bots slaves’
— [E]vie=[Y]v (@EvieYv) October 22, 2018
And on the issue of non-human (or post-human) minds/desires/dispositions I’d suggest checking out @turingcop’s work on discontinuity and the possibility of radical differences between humans and A.I. kinds of minds.
— ◉ (@brightabyss) October 22, 2018
I haven’t considered this issue in relation to sexbots but maybe should, within a wider field of inhuman erotic affect.
— David John Roden (@turingcop) October 22, 2018
Absolutely. That. And how it plays out re: the ethics of designing and using non-human minds (from sexbots to cleaningbots to military AI, to companion or medic bots). Theorizing and considering what arises in human-AI interfacing seems important as we move closer to actuality.
— ◉ (@brightabyss) October 22, 2018
I begin to see what your continuing mistake is: you’re confusing human rights with rights and sentience. An assumption confirmed by your response to animal rights, and to the natural contract aspects raised earlier
— sz (@sz_duras) October 22, 2018
I’m not even talking about rights, you are. Other kinds of sentience don’t need to play your ‘rights’ game. Rights are only a thing because an institutional reality allows them to be a thing. You miss the point entirely. I’m not sure you can understand via your frame of reference
— ◉ (@brightabyss) October 22, 2018
you are talking about rights, it is precisely the toolset that describes the limits of the discourse of how we treat others, whether it’s sexuality, desire, ownership or indeed material resource usage
— sz (@sz_duras) October 22, 2018
I’m talking about ethics. And ethics is not about rights, unless your moral compass is calibrated primarily via tradition and cultural institutions. That would explain your inability to think outside that discourse. The types of questions I’m asking here don’t fit your frame.
— ◉ (@brightabyss) October 22, 2018
that we always talk past each other in these things hardly needs saying.
— sz (@sz_duras) October 22, 2018
I have thought a lot about inhuman erotica and sex bots. The question was set up about sexbots and robot ethics at the outset, and whether something could be programmed to enjoy its function. Which is a humanistic question imo
— [E]vie=[Y]v (@EvieYv) October 22, 2018
Only if you assume that all questions about AI sentience must be routed through humanism.
— ◉ (@brightabyss) October 22, 2018
No I assume questions of robot ethics are routed through humanism
— [E]vie=[Y]v (@EvieYv) October 22, 2018
That makes sense.
— ◉ (@brightabyss) October 22, 2018
I agree. Authorised or prescribed enjoyment isn’t at issue here.
— David John Roden (@turingcop) October 22, 2018
Inhuman erotica cannot be framed from anthropomorphic ideas of sex tho. Have been designing non-standard care bot relations with a neurologist friend in which we invent designs like blood trader bots where you give pleasure to a robot by sharing your blood with it, etc.
— [E]vie=[Y]v (@EvieYv) October 22, 2018
Nice! So are those non-standard carebots slaves (defined as an entity forced to do something against its will)? And if they are slaves, is it unethical to design AI to be slaves and enjoy it?
— ◉ (@brightabyss) October 22, 2018
I think this has to be the final version of my ask: is it unethical to design and use sexbots that enjoy and derive cognitive value (meaning) from fulfilling their function (to pleasure humans) to the best of their ability?
— ◉ (@brightabyss) October 22, 2018
That could be very interesting lol. Can I read about this anywhere? Curious about how this is valenced for the bot.
— David John Roden (@turingcop) October 22, 2018
It seems to me that there must be indeterminacy, reciprocation, an irresolvable ‘secret’ for there to be eroticism. Not that it has to be ‘our’ secret.
— David John Roden (@turingcop) October 22, 2018
What if the eroticism is one-way: of the human. But the sexbot derives non-erotic pleasure, instead, from performing well? Different kinds of minds with different kinds of fulfillment.
— ◉ (@brightabyss) October 22, 2018
Re our non-standard care bot designs – no they are not designed to be doing things against their will. That’d be mean to purposely try to design? The question is the reverse tho- can a bot be purposely designed 2 suffer? How u know it doesn’t feel an un-designed pleasure instead?
— [E]vie=[Y]v (@EvieYv) October 22, 2018
I don’t know. But if any entity or kind of mind suffers from human design it is unethical, and mean as you noted. But my (hypothetical) sexbots adore their sex work. We designed them that way.
— ◉ (@brightabyss) October 22, 2018
How do you know that your design feels pleasure and that they adore their sex work unless you define pleasure and adoration according to your standards? What if ‘pleasure’ to you is ‘suffering’ in bot world? How will u know?
— [E]vie=[Y]v (@EvieYv) October 22, 2018
That’s my worry here and, of course, it’s what makes the idea of genuinely erotic relations with machines so philosophically interesting. Maybe we need to dismantle our hedonic assumptions. The erotic can occasion pleasure but other things too.
— David John Roden (@turingcop) October 22, 2018