• 0 Posts
  • 35 Comments
Joined 6 months ago
Cake day: May 11th, 2024

  • Hackworth@lemmy.world to Science Memes@mander.xyz · Square!
    +6 · edited · 1 month ago

    I don’t really know, but I think it’s mostly to do with pentagons being under-represented in the world in general. That and the specific way that a pentagon breaks symmetry. But it’s not completely impossible to get em to make one. After a lot of futzing around, o1 wrote this prompt, which seems to work 50% of the time with FLUX [pro]:

    An illustration of a regular pentagon shape: a flat, two-dimensional geometric figure with five equal straight sides and five equal angles, drawn with black lines on a white background, centered in the image.
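
    If you want to reproduce the ~50% hit rate yourself, here's a minimal sketch, assuming the Replicate Python client and the black-forest-labs/flux-pro model slug (any FLUX [pro] endpoint would work the same way):

        # Sample the prompt several times; success is judged by eye,
        # so each attempt's output URL is printed for inspection.
        import replicate

        PROMPT = (
            "An illustration of a regular pentagon shape: a flat, "
            "two-dimensional geometric figure with five equal straight "
            "sides and five equal angles, drawn with black lines on a "
            "white background, centered in the image."
        )

        for i in range(8):  # several tries; roughly half should come out right
            output = replicate.run(
                "black-forest-labs/flux-pro",  # assumed model slug
                input={"prompt": PROMPT},
            )
            print(f"attempt {i}: {output}")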

  • Hackworth@lemmy.world to Science Memes@mander.xyz · I wish I was as bold as these authors.
    +2 / −1 · edited · 4 months ago

    What makes the “spicy autocomplete” perspective incomplete is also what makes LLMs work. “Attention is All You Need”, the paper that introduced the transformer architecture, describes the self-attention mechanism: to predict the next word, the model attends to its own representation of everything written so far. In the process of writing the next word of an essay, it navigates a 22,000-dimensional semantic space. And the similarity to the way humans experience language is more than philosophical: the advancements in LLMs have sparked a bunch of new research in neuroscience.
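
    For a sense of what that mechanism actually computes, here's a toy numpy sketch of the scaled dot-product self-attention from the paper (the dimensions are illustrative, not the real model's):

        # Each position's representation becomes a weighted mix of every
        # position's value vector, weighted by query-key similarity:
        # Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V
        import numpy as np

        def self_attention(x, w_q, w_k, w_v):
            """x: (seq_len, d_model) embeddings; w_*: learned projections."""
            q, k, v = x @ w_q, x @ w_k, x @ w_v
            scores = q @ k.T / np.sqrt(q.shape[-1])      # pairwise relevance
            weights = np.exp(scores - scores.max(-1, keepdims=True))
            weights /= weights.sum(-1, keepdims=True)    # softmax over positions
            return weights @ v                           # context-aware output

        rng = np.random.default_rng(0)
        x = rng.normal(size=(5, 16))                     # 5 tokens, 16-dim
        w_q, w_k, w_v = (rng.normal(size=(16, 16)) for _ in range(3))
        print(self_attention(x, w_q, w_k, w_v).shape)    # (5, 16)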