Why do LLMs make stuff up? New research peers under the hood. (arstechnica.com)
from alyaza@beehaw.org to technology@beehaw.org on 30 Mar 2025 17:30
https://beehaw.org/post/19174694

#technology


technocrit@lemmy.dbzer0.com on 31 Mar 2025 16:41

JFC. Computers don’t “make stuff up”. They generate sentences based on statistics and programming. If the output is “wrong”, that’s just the program doing what it was built to do. There’s no inherent reason why these programs would be magic truth machines. Maybe these people need to figure out their own damn code, but that’s not some breakthrough. It’s not some mystery that needs “research”.

TL;DR: GIGO.
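
A minimal toy sketch of the “statistics, not truth” point, in Python with made-up tokens and probabilities (nothing here reflects any real model): a sampler that picks the next token by learned frequency will sometimes emit a plausible-but-false continuation, because nothing in the procedure checks facts.

```python
import random

# Toy next-token table: probabilities stand in for frequencies "learned"
# from training text (numbers are invented for illustration only).
# There is no notion of truth here, only of what tends to follow what.
NEXT_TOKEN_PROBS = {
    "the capital of": {"France": 0.6, "Atlantis": 0.1, "the": 0.3},
    "France is": {"Paris": 0.7, "Lyon": 0.2, "lovely": 0.1},
}

def sample_next(context: str) -> str:
    """Pick a next token weighted by its probability given the context."""
    probs = NEXT_TOKEN_PROBS.get(context, {"<eos>": 1.0})
    tokens, weights = zip(*probs.items())
    return random.choices(tokens, weights=weights, k=1)[0]

if __name__ == "__main__":
    # Emits "Atlantis" roughly 10% of the time: statistically plausible
    # output, not a fact-checked answer.
    print(sample_next("the capital of"))
```

Same idea at a different scale: the sampling step never consults anything but the probability distribution, so “wrong” output is just the expected behavior of the program.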