SuspciousCarrot78@lemmy.world to Privacy@lemmy.ml · edited 2 hours ago
I'm tired of LLM bullshitting. So I fixed it. (codeberg.org)
503 upvotes · 32 downvotes · 87 comments · cross-posted to: privacy@lemmy.ml
SuspciousCarrot78@lemmy.world (OP) · edited 60 minutes ago
deleted by creator
itkovian@lemmy.world · 3 months ago
As I understand it, it corrects the output of LLMs. If so, how does it actually work?
SuspciousCarrot78@lemmy.world (OP) · edited 1 hour ago
deleted by creator
itkovian@lemmy.world · 3 months ago
That is much clearer. Thank you for making this. It actually makes LLMs useful with far fewer downsides.
SuspciousCarrot78@lemmy.world (OP) · edited 58 minutes ago
deleted by creator
Will do.