@misk@sopuli.xyz to Technology@lemmy.world · English · 1 year ago
We have to stop ignoring AI’s hallucination problem (www.theverge.com)
144 comments · 484 upvotes · 26 downvotes
@UnsavoryMollusk@lemmy.world · English · 2 points · edited 1 year ago
They are right, though. LLMs at their core are just about determining what is statistically most probable to spit out.
@Cyberflunk@lemmy.world · English · 0 points · 1 year ago
Your one sentence makes more sense than the slop above.
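The point made in the thread, that an LLM ultimately picks the statistically most probable next token, can be illustrated with a minimal sketch. The vocabulary, logits, and prompt here are invented for illustration; real models work over tens of thousands of tokens, but the final step is the same shape:

```python
import math

# Toy next-token distribution: a model's final layer assigns a score
# (logit) to every token in its vocabulary; decoding turns those
# scores into the emitted token. Values below are made up.
vocab = ["Paris", "London", "banana", "the"]
logits = [4.0, 2.5, 0.1, 1.0]  # scores for "The capital of France is ..."

# Softmax converts raw logits into a probability distribution.
exps = [math.exp(x) for x in logits]
total = sum(exps)
probs = [e / total for e in exps]

# Greedy decoding: emit the statistically most probable token.
best = max(range(len(vocab)), key=lambda i: probs[i])
print(vocab[best])            # -> Paris
print(round(probs[best], 3))  # -> 0.773
```

Nothing in this loop checks whether the chosen token is true, only whether it is likely given the training data, which is one way of framing why hallucinations happen.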