Had a puzzle thrown at me by my DM this weekend.
He admitted it was written by AI. I did not guess correctly.
What kind of scaly books does your DM have?
I wouldn’t have either. At least partially because I have no idea what scales have to do with books.
Right? Throw in an “I have a spine, but no arms or legs. Sometimes, I have dog ears.”
@zaph @Stamets It should be “a bank.” People make bank inquiries. Traditionally they have scales to measure precious materials. They of course guard treasure and precious things.
That’s what I was thinking at first, but since when do banks have riddles? Though maybe in-universe, riddles are considered top-of-the-line security.
Well, ever set a password, PIN, or security question for a bank? You might be able to construe those as riddles…
@Archpawn A passcode is a riddle. A lock is a riddle. The unknown contents of a vault are a riddle.
I’m pretty sure a riddle is a bunch of glowing question mark-shaped trophies scattered randomly across a city.
Books have… Scales? What?
AI sucks at creativity
The ghost of a scholarly dragon. No flame because it’s dead, but it’s still being inscrutable with people.
Why does it have scales but no wings? Also, lots of living dragons don’t have flame and breathe lightning, acid, poison, or cold instead.
That’s a good point. Maybe it’s a living green drake.
But they have wings. Maybe it’s a kobold that likes riddles.
Scales?
@zaph @Stamets AI just doesn’t do those relations that well. It knows what riddles are supposed to look like, but I doubt it can do the mental leap between questions and answers.
AI doesn’t have a mind to do mental leaps; it only knows syntax. Just a form of syntax so, so advanced that it sometimes accidentally gets things factually correct. Sometimes.
It’s more advanced than just syntax. It should be able to understand the double meanings behind riddles. Or at the very least, that books don’t have scales, even if it doesn’t understand that the scales that a piano has aren’t the same as the ones a fish has.
It doesn’t understand anything. It predicts a word based on previous words - this is why I called it syntax. If you imagine a huge and vastly complicated set of rules for how likely one word is to follow the previous, say, 1,000 words… That’s an LLM.
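To make that concrete, here’s a minimal toy sketch of the same idea: a bigram model that just counts which word tends to follow which. (The training text and word choices here are made up for illustration; a real LLM uses a neural network over thousands of tokens, not a lookup table, but the task - “given the words so far, pick a likely next word” - is the same.)

```python
# Toy illustration only: a bigram "language model" built from raw counts.
from collections import Counter, defaultdict

training_text = "the dragon guards the hoard and the dragon guards the gate"

# Count how often each word follows each other word in the training text.
follows = defaultdict(Counter)
words = training_text.split()
for prev, nxt in zip(words, words[1:]):
    follows[prev][nxt] += 1

def predict_next(word):
    """Return the most frequently observed next word, if any."""
    candidates = follows.get(word)
    return candidates.most_common(1)[0][0] if candidates else None

print(predict_next("dragon"))  # -> "guards"
```

It will happily tell you that “guards” follows “dragon” without any notion of what a dragon is - that’s the sense in which it’s all syntax.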
It can predict that the word “scales” is unlikely to appear near “books”. Do you understand what I mean now? Sorry, neural networks can’t understand things. Can you make predictions based on the sensory input you’ve received?
Well, given that an LLM produced the nonsense riddle above, it obviously can’t predict that. It can predict the structure of a riddle perfectly well; it can even get the rhyming right! But the extra layer of meaning involved in a riddle is beyond what LLMs are able to do at the moment. At least, all the ones I’ve seen - they all seem to fall flat at this level of abstraction.