• BluesF@lemmy.world · 6 days ago

    It doesn’t understand anything. It predicts the next word based on the previous words - this is why I called it syntax. If you imagine a huge and vastly complicated set of rules about how likely one word is to follow the previous, say, 1,000 words… that’s an LLM.
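
    To make that concrete, here’s a minimal sketch in Python of that kind of next-word prediction, done by simple counting. The tiny corpus and two-word context are made up for illustration; a real LLM does the same job with a neural network over a context thousands of tokens long, not a lookup table.

    ```python
    from collections import Counter, defaultdict

    # Toy version of "rules about how likely one word is to follow the previous ones":
    # count which word follows each short context, then turn counts into probabilities.
    # A real LLM learns these conditional probabilities with a neural network, but the
    # training objective - predict the next word - is the same.

    corpus = "the cat sat on the mat . the dog sat on the rug .".split()
    context_size = 2  # real models condition on thousands of previous tokens

    counts = defaultdict(Counter)
    for i in range(context_size, len(corpus)):
        context = tuple(corpus[i - context_size:i])
        counts[context][corpus[i]] += 1

    def next_word_probabilities(context):
        """Return the predicted next-word distribution for a context we've seen."""
        followers = counts[tuple(context)]
        total = sum(followers.values())
        return {word: n / total for word, n in followers.items()}

    print(next_word_probabilities(["sat", "on"]))  # {'the': 1.0} in this tiny corpus
    ```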

    • Archpawn@lemmy.world · 6 days ago

      It can predict that the word “scales” is unlikely to appear near “books”. Do you understand what I mean now? Sorry, neural networks can’t understand things. Can you make predictions based on the sensory input you’ve received?
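
      For what it’s worth, here’s a rough sketch of how you could actually ask a model how likely a given next word is. It assumes the Hugging Face transformers library and the small gpt2 checkpoint, and the prompt and candidate words are made up for illustration - none of this comes from the thread.

      ```python
      import torch
      from transformers import AutoModelForCausalLM, AutoTokenizer

      tok = AutoTokenizer.from_pretrained("gpt2")
      model = AutoModelForCausalLM.from_pretrained("gpt2")

      prompt = "She spent the whole evening in the library reading her"
      ids = tok(prompt, return_tensors="pt").input_ids
      with torch.no_grad():
          logits = model(ids).logits[0, -1]  # scores for whatever token comes next
      probs = torch.softmax(logits, dim=-1)

      for word in [" books", " scales"]:
          token_id = tok(word).input_ids[0]  # first sub-word token of the candidate
          print(f"P({word!r} | prompt) = {probs[token_id].item():.6f}")
      ```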

      • BluesF@lemmy.world · 5 days ago

        Well, given that an LLM produced the nonsense riddle above, obviously it cannot predict that. It can predict the structure of a riddle perfectly well - it can even get the rhyming right! But the extra layer of meaning involved in a riddle is beyond what LLMs can manage at the moment. At least, all the ones I’ve seen seem to fall flat at this level of abstraction.