• 0 Posts
  • 33 Comments
Joined 1 year ago
Cake day: July 9th, 2023

  • People misjudge the implications of those priorities.
    Low priority: your shit is broken, we'll deal with it during office hours.
    Medium priority: a whole department’s shit is broken. We’ll stay longer to fix it if needed.
    High priority: EVERYONE’S SHIT IS BROKEN HOLY FUCK, SEVENTEEN TECHS ARE IGNORING THE SPEED LIMIT ON THEIR WAY TO FIX THIS RIGHT NOW




  • The “op” you are referring to is… well… myself. Since you didn’t comprehend that from the posts above, my reading comprehension might not be the issue here.

    But in all seriousness: I think this is an issue with concepts. No one is saying that LLMs can’t “learn”; that would be stupid. But the discussion is not “is everything programmed into the LLM, or does it recombine stuff”. You seem to reason that when someone says the LLM can’t “understand”, that person means “the LLM can’t learn”, but “learning” and “understanding” are not the same at all. The question is not whether LLMs can learn; it’s whether they can grasp concepts from the content of the words they absorb as training data. If an LLM grasped concepts (like the rules of algebra), it could apply them every time it was confronted with a similar problem. The fact that it can’t shows that the only thing it does is chain words together by stochastic calculation (see the toy sketch below). Really sophisticated stochastic calculation with lots of possible outcomes, but still.
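
    To picture what “chaining words together by stochastic calculation” means, here is a minimal toy sketch using a bigram model. Everything in it (the corpus, the function names) is made up for illustration; a real LLM is a huge neural network, not a lookup table, but the generation step is conceptually similar: sample the next token from a probability distribution.

    ```python
    import random
    from collections import Counter, defaultdict

    # Toy "language model": it only chains words together by sampling
    # from observed next-word frequencies. It has no notion of the
    # concepts behind the words. Corpus is a made-up example.
    corpus = "the cat sat on the mat the cat ate the fish".split()

    # Count which word follows which in the training data.
    follows = defaultdict(Counter)
    for prev, nxt in zip(corpus, corpus[1:]):
        follows[prev][nxt] += 1

    def generate(start, length=8):
        words = [start]
        for _ in range(length - 1):
            options = follows[words[-1]]
            if not options:
                break
            # Sample the next word proportionally to how often it
            # followed the current word — stochastic calculation.
            words.append(random.choices(list(options),
                                        weights=options.values())[0])
        return " ".join(words)

    print(generate("the"))  # e.g. "the cat sat on the mat the cat"
    ```

    The output often looks plausible, yet the model never grasped what a cat or a mat is; it only reproduced word-to-word statistics. That is the distinction between learning statistical patterns and understanding concepts.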