• Zement@feddit.nl

    Math is correct without humans. Pi is the same in the whole universe. There are scientific truths. And then there are the flat-earth, 2x2=1, QAnon, anti-vax, chemtrail loonies, who in varying degrees and colours are mostly united under the conservative “anti-science” folks.

    And you want an AI that doesn’t offend these folks / is taught based on their output. What use would that be?

    • rottingleaf@lemmy.world

      Ahem, well, there are obvious things: that 2x2 modulo 3 is 1, that some vaccines might be bad (that’s why pharma industry regulations exist), that pi is also an unknown p multiplied by an unknown i, or some number encoded as a ‘pi’ string.
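
      For the modular-arithmetic point, a quick illustrative check (just a throwaway snippet to show the claim, not anything from the thread):

      ```python
      # 2 x 2 = 4, and 4 leaves remainder 1 when divided by 3,
      # so 2 x 2 is congruent to 1 modulo 3.
      assert (2 * 2) % 3 == 1
      ```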

      These all matter for language models, do they not?

      > And you want an AI that doesn’t offend these folks / is taught based on their output. What use would that be?

      It is already trained on their output, among other things.

      But I personally don’t think this leads anywhere.

      Somebody, someplace, decided it’s a brilliant idea to extrapolate text, because humans communicate their thoughts via text, so it’s something machines can work with.
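
      A minimal sketch of what “extrapolating text” means here, as a toy bigram model (purely illustrative and hypothetical, nothing like how a real LLM is built):

      ```python
      from collections import defaultdict
      import random

      # Toy bigram "language model": count which word follows which,
      # then extrapolate a continuation by sampling a plausible next word.
      corpus = "humans communicate their thoughts via text and machines extrapolate text".split()

      followers = defaultdict(list)
      for prev, nxt in zip(corpus, corpus[1:]):
          followers[prev].append(nxt)

      def extrapolate(word, steps=5):
          out = [word]
          for _ in range(steps):
              options = followers.get(out[-1])
              if not options:
                  break
              out.append(random.choice(options))
          return " ".join(out)

      print(extrapolate("humans"))  # e.g. "humans communicate their thoughts via text"
      ```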

      Humans don’t just communicate.