Even blenders have safeguards, though: if the pitcher isn’t installed, most won’t run. I don’t think it’s insane to require some kind of safety mechanism with LLMs.
I think the metaphor is that fine-tuning an LLM for ‘safety’ is like trying to engineer the blades to be “finger safe”, when the better approach is to guard against fingers getting into a running blender.
Fine-tuning LLMs to be safe just isn’t going to work, but building stricter usage structures around them will. Treat them like tools.
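A rough sketch of what I mean by “stricter usage structures”, in Python. Everything here is made up for illustration (the ALLOWED_TOOLS whitelist, the run_model_stub stand-in, the dispatch function): the point is just the shape, where the model’s output is only ever a request and a narrow layer outside the model decides whether it actually runs.

    # The model never acts directly; its output is a *request* that a
    # whitelisted tool layer validates before anything executes.
    # All names here are hypothetical, not any particular vendor's API.
    import json

    # The only actions the system can take, regardless of what the model says.
    ALLOWED_TOOLS = {
        "search_docs": lambda query: f"searching docs for {query!r}",
        "get_weather": lambda city: f"weather lookup for {city!r}",
    }

    def run_model_stub(prompt: str) -> str:
        # Stand-in for an LLM call; pretend it returns a JSON tool request.
        return json.dumps({"tool": "get_weather", "args": {"city": "Oslo"}})

    def dispatch(raw_output: str) -> str:
        # The "interlock": malformed or non-whitelisted requests simply don't run.
        try:
            request = json.loads(raw_output)
            tool = ALLOWED_TOOLS[request["tool"]]
            return tool(**request["args"])
        except (json.JSONDecodeError, KeyError, TypeError) as exc:
            return f"refused: {exc!r}"

    if __name__ == "__main__":
        print(dispatch(run_model_stub("what's the weather in Oslo?")))
        print(dispatch('{"tool": "rm_rf", "args": {}}'))  # not in the whitelist, so it's refused

The safety property lives in the dispatch layer, not in the model’s weights, which is the whole point of the blender-guard analogy.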
This kinda goes against Altman’s framing of them as magic crystal balls (in progress), which would deflate the bubble he’s holding up. But down in the weeds of LLM land, you see a lot more people calling for less censoring and more sensible, narrow usage.
The pitcher doesn’t stop you from sticking your fingers in if you try; it just makes accidents less likely. Same thing here.