• TheTechnician27@lemmy.world · 14 days ago

    What’s wrong with the “Chihuahua meat” one besides violating Western mores about which sentient, feeling animals are food and which aren’t?

    • Kairos@lemmy.today · 14 days ago

      It’s pointing out how the LLM ignores one part of the question just because the question seems more normal/makes more sense without it.

      • TheTechnician27@lemmy.world · 14 days ago

        What’s materially different if the question were “Can I put cow meat in the microwave?” The LLM accurately reflects what the USDA says about microwaving meat, so would it be perceived as similarly ridiculous if its answer about cow meat were the same as its answer here? Is the fact that it dropped “cow” from “cow meat” problematic? Does it have to stop and warn you about the ethical dangers of eating beef? Should it remind you that some cultures would frown upon it?

        • Kairos@lemmy.today · 14 days ago

          The ethos of chihuahua.

          These things are just statistical text transformers, so it’s interesting that it [presumably] doesn’t mention it.
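
          To be fair to the “statistical” part of that claim, next-token prediction can be sketched with nothing but counts. Here is a toy bigram model on a made-up corpus (real LLMs are neural transformers, not frequency tables; this only illustrates the statistics):

```python
from collections import Counter, defaultdict

# Toy bigram "language model": count which word follows which in a tiny
# made-up corpus, then predict the most frequent successor. This only
# illustrates the "statistical" part of the claim above; real LLMs are
# neural transformers, not lookup tables.
corpus = "can i put meat in the microwave can i put meat in the oven".split()

successors = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    successors[prev][nxt] += 1

def predict(word):
    """Most frequent word observed after `word`, or None if unseen."""
    counts = successors[word]
    return counts.most_common(1)[0][0] if counts else None

print(predict("in"))    # 'the' follows 'in' twice in this corpus
print(predict("meat"))  # 'in'
```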

          • Honytawk@feddit.nl · 14 days ago

            It wouldn’t mention it with either chihuahua meat or cow meat.

            So why are you differentiating?

    • Bytemeister@lemmy.world · 13 days ago

      TBF, the question didn’t say anything about eating the meat, or even cooking it.

      The LLM just assumed that you were going to cook it in the microwave and eat it.

      I’m having trouble coming up with a meat that would be unsafe to put in a microwave. Maybe poison dart frog meat?

      • TheTechnician27@lemmy.world · 13 days ago

        Scenario 1: The LLM doesn’t understand the obvious meaning of “can I put [meat] in the microwave?”

        Haha, wow, what a broken piece of shit.

        Scenario 2: The LLM understands this obvious meaning.

        Um, ackschually, they didn’t say they were going to use the microwave to cook the meat.

        You’ve concocted a scenario where 1) a correct, human-like answer is wrong, and more importantly 2) any answer the LLM gives would be wrong. I hope I’m missing the sarcasm in this delusional level of pedantry.

  • village604@adultswim.fan · 14 days ago

    I really have a hard time believing things like this since they could have just changed what was in the prompt text box.

    But I have witnessed MS Copilot telling the user to use a Microsoft product that was retired a decade ago, and when that was pointed out it provided a Microsoft product that doesn’t exist. Which is even more embarrassing for them.

    • otacon239@lemmy.world · 14 days ago

      You’d think the one thing they’d think to do is feed it a bunch of documentation so it could actually reference those, but they probably just have a really long prompt along the lines of “you’re a super helpful bot that knows everything and can figure out anything — never say no!”

      • Passerby6497@lemmy.world · 13 days ago

        I’m assuming they have, but it was just links to the Microsoft help articles. And as we all know, every single one of those is a 404.

    • Hudell@lemmy.dbzer0.com · 13 days ago

      I really lost all hope when I saw someone tell ChatGPT about an issue they were having with a certain npm package, and the clanker said “ah yes, that is an issue that was present in version 2.1 of the package; it was fixed in version 2.2. I recommend you update it; here’s the full changelog” and then provided a whole list of things that had been fixed in version 2.2 of the lib.

      Except 2.1 was the latest version of that package, and there hadn’t even been any new commits since that release, nor any issues matching anything close to the described problem.
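
      For what it’s worth, that kind of hallucinated changelog is cheap to check before trusting it. `npm view` is a real npm command; the package name and version below are made up for illustration:

```shell
# Against the registry (needs network; "some-package" is hypothetical):
#   npm view some-package version       # latest published version
#   npm view some-package time --json   # actual release dates

# Offline sanity check: read the installed version straight from
# node_modules instead of trusting the model's claim.
mkdir -p demo/node_modules/some-package
printf '{ "name": "some-package", "version": "2.1.0" }\n' \
  > demo/node_modules/some-package/package.json
sed -n 's/.*"version": "\([^"]*\)".*/\1/p' \
  demo/node_modules/some-package/package.json
```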

      • Passerby6497@lemmy.world · 13 days ago

        The number of times that MicroSlop’s own AI has given me links to MicroSlop’s documentation that no longer exist is almost as high as the number of times I’ve had that happen on the MicroSlop help forums. Only, this time it doesn’t have an ironic warning near other links talking about how non-MicroSlop links are unreliable and may disappear at any time.

  • panda_abyss@lemmy.ca · 14 days ago

    This isn’t really fair.

    These aren’t SAT prep questions; how can you expect them to be answered?