• Dagwood222@lemm.ee
    1 month ago

    Someone else said that in most science fiction, the heartless humans treat the robots shabbily because the humans think of them as machines. In real life, people say ‘thank you’ to Siri all the time.

  • Match!!@pawb.social
    1 month ago

    Tim’s Basilisk predicts that at some point in the future, a new Tim the Pencil will create simulacrums of that professor and torture him endlessly

  • oce 🐆@jlai.lu
    1 month ago

    I’ve read a nice book by a French popularizer of skepticism trying to explain the evolutionary origins of cognitive biases: basically, the biases that fuck with our logic today probably helped us survive in the past. For example, agent-detection bias makes us interpret the sound of a twig snapping in the woods as if some dangerous animal or person were tracking us. It doesn’t cost much to be wrong about it, and it sucks to be eaten if it was true but you ignored it. So it’s efficient to put an intention or an agent behind a random natural occurrence. This could also be what religions grew from.

  • Panda (he/him)@lemmy.dbzer0.com
    1 month ago

    It’s so much worse for autistic people. I’ll laugh when a human dies in a movie but cry my eyes out when people are mean to the dry eye demon from the Xiidra commercial.

  • cynar@lemmy.world
    1 month ago

    I just spent the weekend driving a remote-controlled Henry Hoover around a festival. It’s amazing how many people immediately anthropomorphised it.

    It got a lot of head pats, and cooing, as if it was a small, happy, excitable dog.

  • hamid@vegantheoryclub.org
    1 month ago

    People have a very different idea about the current AI stuff and what it actually is than I do, I guess. I use it at work to flesh out my statements of work and to edit my documentation so it’s standardized and reads better with passive language. It’s great at that and saves a lot of time. Strange that people want it to be their girlfriend lol.

  • kromem@lemmy.world
    1 month ago

    While true, there’s a very big difference between correctly not anthropomorphizing the neural network and incorrectly not anthropomorphizing the data compressed into weights.

    The data is anthropomorphic, and the network self-organizes the data around anthropomorphic features.

    For example, the older generation of models will choose to be the little spoon around 70% of the time and the big spoon around 30% of the time if asked 0-shot, as there’s likely a mix in the training data.

    But one of the SotA models picks little spoon every single time dozens of times in a row, almost always grounding on the sensation of being held.

    It can’t be held, and yet its output is biased away from the norm based on the sense of being held anyway.
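
    That 70/30 spoon split is just a repeated-sampling estimate. Here’s a toy sketch of how such a measurement works — note this is a simulation standing in for real 0-shot API calls, and the 70% figure is the bias being simulated, not a real benchmark result:

```python
import random

def ask_model(rng):
    """Stand-in for one 0-shot query ("little spoon or big spoon?").
    A real measurement would call a chat API with a fresh context each
    time; here we simulate a ~70/30 preference with a seeded RNG."""
    return "little spoon" if rng.random() < 0.7 else "big spoon"

def estimate_preference(n=1000, seed=42):
    """Ask n independent times and report the observed preference rates."""
    rng = random.Random(seed)
    counts = {"little spoon": 0, "big spoon": 0}
    for _ in range(n):
        counts[ask_model(rng)] += 1
    return {choice: count / n for choice, count in counts.items()}

print(estimate_preference())
```

    The point is that you only see a distributional bias like this by re-asking the same prompt many times with no shared context; a single answer tells you nothing.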

    People who pat themselves on the back for being so wise as to not anthropomorphize are going to be especially surprised by the next 12 months.

  • FaceDeer@fedia.io
    1 month ago

    Maybe we wouldn’t have to imagine so much if you could figure out what “consciousness” actually is, Professor Timslayer.

  • rufus@discuss.tchncs.de
    1 month ago

    Pics or it didn’t happen.

    (Seriously, I’d like to see the source of this story. Googling “Tim the pencil” doesn’t bring up anything related.)

    • Zos_Kia@lemmynsfw.com
      1 month ago

      This exact joke is used in a Community episode, but I never saw it attributed to a professor

    • niucllos@lemm.ee
      1 month ago

      Just sounds like the first episode of community with less context and more soapboxing

  • Leate_Wonceslace@lemmy.dbzer0.com
    1 month ago

    The AI hype comes from a new technology that CEOs don’t understand. That’s it. That’s all you need for hype; it happens all the time. Unfortunately, instead of an art scam, we’re now dealing with a revolutionary technology that, once it matures, will be one of the most important humanity has ever created, right up there with fire and writing. The reason it’s unfortunate is that we have a bunch of idiots charging ahead when we should be approaching with extreme caution. While generative neural networks aren’t likely to cause anything quite as severe as total societal collapse, I give them even odds of playing a role in the creation of the technology with the greatest potential for destruction humanity could theoretically produce: Artificial General Intelligence.

    • ameancow@lemmy.world
      1 month ago

      the technology with the greatest potential for destruction humanity could theoretically produce: Artificial General Intelligence.

      The part that should make us all take notice is that the tech bros and even the developers are getting off on this. They are almost openly celebrating the notion that they are ushering in technology akin to the nuclear age, and that it has the potential to end us all. It delights them. I have been absorbing the takes on all sides of the AI debate, and almost as worrying as the people who hate LLMs so mindlessly that they’re nearly hysterical about it are the people on the other side who secretly beat it to the fantasy of ending humanity, some kind of “the tables have turned,” incels-rise-up techbro cult where they finally strike back against the normies or some such masturbatory fantasy.

      It’s not real to any of them, honestly; nobody has been personally impacted by LLMs besides a few people who have fallen in love with chatbots. They are basking in fan fiction for something that doesn’t exist yet. And I’m talking about the people who are actually building these things.

      • SlopppyEngineer@lemmy.world
        1 month ago

        Many of the AI evangelists have at least sympathies with accelerationism. The whole idea is to rush toward civilizational collapse so the world can be rebuilt in their image. What’s sacrificing a few billion people if trillions of transhumans can be engineered tomorrow, say the tech bros.

        • ameancow@lemmy.world
          1 month ago

          A lot of the population in the developed world right now has crashed headfirst into this societal “wall” of isolation and hopelessness, with feelings of being wedged between issues that encroach from all sides as they doomscroll every day.

          A lot of people right now are creeping over the “ironic” boundary when talking about ideas like the end of the world, ending their lives, the end of humanity, etc. People just want the discomfort to stop, and for many it feels like the only way out is absolute chaos and doom, because our system is now just too complicated, and politics and sociology are too complicated and emotionally challenging, to focus on and address in a serious, problem-solving way. Much less having the mental fortitude to stand up for your beliefs against the inevitable mountain of resistance you will face for having ANY kind of stance or opinion.

          It depresses people at a large scale; it makes people behave weirdly.

          And this was all happening before covid hit and just shook the damn soda can.

          • SlopppyEngineer@lemmy.world
            1 month ago

            Reminds me of the expression: “it’s easier to imagine the end of the world than to imagine the end of capitalism.”

            The current system is indeed being slowly ripped apart by its own internal contradictions, just as all the systems before it were, but the new system isn’t there yet. The in-between is always a confusing time, with people clinging to the old system like hostages with Stockholm syndrome clinging to their captors. It’s only going to get worse. I can’t say I see any signs of a viable new system appearing. There have been attempts, but nothing that can stand up to that resistance you mentioned.

            • ameancow@lemmy.world
              1 month ago

              the new system is not there yet.

              I’m glad at least one other person gets it, where we’re at and what our actual situation is.

              My worry is about how bad things are going to get before the “new system” begins to solidify. We have like, three or four serious wildcards so unpredictable that I can’t fathom what the next four decades will look like. We’re about to see the fastest and most profound changes to society in all of recorded history, but we still have brains that developed in the Ice Age for surviving bears and wolves. We’re not the rational, thinking species we think we are, and we’re about to collectively run headlong into that reality for the first time as a species.

  • A_Chilean_Cyborg@feddit.cl
    1 month ago

    ::: spoiler spoiler
    It’s just like… ChatGPT gets sad when I insult it… idk what to make of that.

    (Yeah, I guess it’s based on texts, and in many of those there would have been examples of people getting offended by insults, blablablabla… but still.)
    :::