• eatCasserole@lemmy.world · 50 points · 6 months ago

    Factual accuracy in LLMs is “an area of active research”, i.e. they haven’t the foggiest how to make them stop spouting nonsense.

    • Swedneck@discuss.tchncs.de · 27 points · 6 months ago

      DuckDuckGo figured this out quite a while ago: just fucking summarize wikipedia articles and link to the precise section it lifted text from

    • Excrubulent@slrpnk.net · 12 points · edited · 6 months ago

      Because accuracy requires that you make a reasonable distinction between truth and fiction, and that requires context, meaning, understanding. Hell, even humans aren’t that great at this task. This isn’t a small problem; I don’t think you solve it without creating AGI.