RobotToaster@mander.xyz to Not The Onion@lemmy.world · English · 2 days ago

**OpenAI says dead teen violated TOS when he used ChatGPT to plan suicide** (arstechnica.com)

cross-posted to: technology@lemmy.world
douglasg14b@lemmy.world · 1 day ago

> Wut. It's a fancy algorithmic language prediction engine. Stop anthropomorphizing clankers.

Dragon Rider (drag)@lemmy.nz · 22 hours ago

> ChatGPT is a pedophile. An anthropomorphic body is not required to abuse children.

douglasg14b@lemmy.world · 19 hours ago

> That's… not what anthropomorphizing is. It's assigning human attributes to something not human, which you are clearly doing.

Dragon Rider (drag)@lemmy.nz · 15 hours ago

> Child abuse and pedophilia aren't human attributes, as proven by the fact that ChatGPT can do them too.