• Lvxferre@mander.xyz · 7 months ago

    Habsburg-AI? Do you have any idea how much you made me laugh in real life with this expression??? It’s just… perfect! Model degeneration is a lot like what happened with the Habsburg family’s genetic pool.

    When it comes to hallucinations in general, I have another analogy: someone trying to drive nails with a screwdriver, failing, and calling it a hallucination. In other words, I don’t think the models are misbehaving; they’re simply behaving as expected, and any “improvement” in this regard is basically a band-aid added by humans to a procedure that doesn’t yield a lot of useful outputs to begin with.

    And that reinforces the point from your last paragraph - those people genuinely believe that, if you feed enough data into an LLM, it’ll “magically” become smart. It won’t, just like 70 kg of bees won’t “magically” think as well as a human being would. The underlying process is “dumb”.

    • Retiring@lemmy.ml · 7 months ago

      I am glad you liked it. Can’t take the credit for this one, though; I first heard it from Ed Zitron on his podcast “Better Offline”. Highly recommend it.