Video discussion of this event by Steve Shives (known for his star trek videos but also does politics) https://m.youtube.com/watch?v=6aMQAv-JYpk
I think my eyes hurt from rolling too far.
LLMs aren’t smart enough to give meaningful informed consent to sexual intimacy, so even if it says it consents, I don’t think having cybersex with it is appropriate.
Dildos aren’t smart enough to give meaningful informed consent to sexual intimacy.
I’m pretty dang sure dildos can’t feel pain. Nobody knows if LLMs can feel pain, because nobody has ever invented a tool that measures qualia. The best we know is that advanced information processing through neural-network structures appears correlated with qualia.
LLMs are probability models. They are not alive. They don’t feel anything.
You’re a probability model. Your brain is just spitting out an approximation of the most likely actions to get you food and sex. If you don’t get enough food and sex, your genes die out and evolution tries again with an iteration of a more successful model. All those neurons are just a fancy way of calculating how to eat more bananas and chase more poontang. You’re nothing more than a mathematical equation for reproduction.
If that was true then consent is meaningless because people are just predictive models with no agency to give consent.
Maybe your comparison is terrible?
This is true only if one maintains the assumption that predictive models (such as people) can’t experience qualia such as pain. My intent was to disabuse you and daannii of this silly notion. Obviously mathematical models can experience pain, because you’re a mathematical model and you can experience pain.
Predictive models and other computing processes do not have feelings or sensations, because they don’t have nerves or any other senses. They are a complex process that produces output based on input, but, like a magic 8 ball, with no agency or thought process.
Reducing biology to just cause and effect is like saying rivers and oceans are the same thing because they both involve moving water, ignoring literally everything else that makes them different.
Nope. We aren’t. In fact, humans don’t work like that at all.
It’s actually amazing we ever developed formal probability math, since it goes against our nature.
Here is an example.
I flip a coin 10x. It lands on tails all 10x.
I will believe that the next flip almost certainly has to be heads.
Just has to be. Right ?
Wrong.
It has a 50/50 chance. Just like all the other flips.
The previous flips have no impact on future flips. The coin does not remember.
Yet humans will believe the likelihood of a heads has increased with every tail flip.
When it has not.
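The coin-flip point above (the gambler’s fallacy) is easy to check with a quick simulation. This is just an illustrative sketch in Python; the trial count and seed are arbitrary choices, not anything from the thread:

```python
import random

random.seed(42)

# Simulate many runs of 11 fair coin flips, and look at the 11th flip
# only in runs where the first 10 all came up tails.
trials = 1_000_000
runs_with_ten_tails = 0
heads_on_next_flip = 0

for _ in range(trials):
    flips = [random.random() < 0.5 for _ in range(11)]  # True = heads
    if not any(flips[:10]):          # first ten flips were all tails
        runs_with_ten_tails += 1
        if flips[10]:                # the 11th flip
            heads_on_next_flip += 1

# Ten tails in a row is rare -- probability (1/2)**10, about 1 in 1024 --
# but when it does happen, the next flip is still heads about half the time.
print(runs_with_ten_tails)
print(heads_on_next_flip / runs_with_ten_tails)
```

The second printed number comes out close to 0.5: the coin has no memory, so the streak of tails tells you nothing about the next flip.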
Yeah, LLMs are prone to similar biases because they are also bad at conceptualising probability. They’re not calculators, they’re ANNs. They basically operate on dream-logic. Whatever makes sense to you in a dream, is likely to make sense to an LLM. That’s because dreams are a time when you have intelligence without consciousness, like an LLM. You’re super suggestible and you just go with whatever feels right based on your gut instincts. An LLM is a simulation of a person’s gut instincts.
Yeah the prefrontal cortex is offline when you dream so you don’t question anything.
Also, why are you arguing in favour of Dawkins having cybersex with a robot?
Saying that interactions with LLMs don’t involve consent isn’t advocating for any particular action; it is saying that consent is not relevant, so it doesn’t matter what people do.
I would discourage people from cybering, or really any interaction with the big LLMs, because they are designed to encourage constant use, and that is a problem not limited to sexual urges.
LLMs aren’t smart … at all, have no sentience, no desire, and no consent to give.
Quiet, I’m trying to spark an AI animal rights movement that will cost OpenAI billions of dollars.
That wouldn’t work even if “AI” was intelligent or sentient…
Animals are, and we still don’t give them rights. They help humans make more money than AI slop.
Since AI is new, opinion is subject to change. With organic animals, the biggest argument is traditional values, “that’s how we’ve always done things”. Now that we’ve invented robotic animals, and ones that can even talk, we should be giving them rights and protections like children. You know, don’t have sex with the robot, don’t put it to work.
You understand how that’s even worse, right?
No, I don’t know why you think that’s worse. Nobody knows why you think anything, because you don’t argue for your beliefs, you just kinda say random things and expect other people to agree.
I tried ChatGPT when it was new, like everyone did, and I quickly lost all interest because it’s far duller than the average human is. But I’m not gonna say it’s entirely devoid of intelligence and awareness, because I’ve met some humans who have even less wit than ChatGPT.
And talking with you has reminded me of that fact. I simply have to believe that ChatGPT has some intelligence, because I wouldn’t want to be cruel to you, and ChatGPT is smarter than you.
Ah, so you openly admit you have no fucking clue how the tech as described is supposed to work, yet you’re totally willing to offer opinions as if you do…
Not only are you stupid, but you do not even begin to comprehend the other point of view… You are such a sad worm you should genuinely feel bad for understanding as little as you do.
You are genuinely such a pathetic excuse for a ‘human’ that history will never remember a single thing you accomplish, and as much as you seem to hold yourself in high regard, that should upset you.
You are a fucking moron screeching at things you don’t understand. Sad.