  • Because these programs cannot themselves be concerned with truth, and because they are designed to produce text that looks truth-apt without any actual concern for truth, it seems appropriate to call their outputs bullshit.

    This is actually a really nice insight into the quality of current LLM output. It also says something about how these models work and what goals their creators set.

    They are not trained to produce factual information, but to talk about topics while sounding like a competent expert.

    For LLM researchers, this means they need to figure out how to train LLMs for factuality rather than for merely sounding competent. But that is probably a lot easier said than done.