Martineski@lemmy.fmhy.ml (Mod) to Singularity | Artificial Intelligence (AI), Technology & Futurology@lemmy.fmhy.ml · English · edited, 1 year ago
MosaicML open sources new 8k context length MPT-30B language model under Apache 2.0 license (22.06.2023 blog post)
www.mosaicml.com
Cross-posted to: models@lemmy.intai.tech
Behohippy@lemmy.world · English · 1 year ago
Still had some reasoning issues, but looking forward to the fine-tunes!
Martineski@lemmy.fmhy.ml (OP, Mod) · English · edited, 1 year ago
Does prompting in a way that improves its reasoning work? I'm talking about CoT, ToT, or other prompting strategies.
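For context, a chain-of-thought (CoT) prompt simply asks the model to spell out intermediate steps before giving its answer. Below is a minimal sketch of zero-shot CoT prompting, assuming MPT-30B loads through the Hugging Face transformers pipeline; the model id, question, and generation settings here are illustrative, not taken from the post.

```python
# Minimal zero-shot chain-of-thought sketch (illustrative, not from the post).
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="mosaicml/mpt-30b",   # assumed Hugging Face model id
    trust_remote_code=True,     # MPT models typically ship custom model code
    device_map="auto",
)

# Appending "Let's think step by step." nudges the model to produce
# intermediate reasoning before the final answer.
prompt = (
    "Q: A train leaves at 9:15 and the trip takes 2 hours 50 minutes. "
    "When does it arrive?\n"
    "A: Let's think step by step."
)

print(generator(prompt, max_new_tokens=128, do_sample=False)[0]["generated_text"])
```

Tree-of-thought (ToT) prompting goes further by sampling several candidate reasoning branches and keeping the most promising ones, which usually needs an outer loop around calls like the one above rather than a single prompt.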