• booty [he/him]@hexbear.net · 5 months ago

    lmao how many wrong answers can it possibly give for the same question, this is incredible

    you’d think it would accidentally hallucinate the correct answer eventually

    Edit: I tried it myself, and wow, it really just cannot get the right answer. It’s cycled through all these same wrong answers like 4 times by now. https://imgur.com/D8grUzw

    • InevitableSwing [none/use name]@hexbear.net (OP) · 5 months ago

      “accidentally hallucinate”

      “Hey, GPT.”

      “Yeah?”

      "80085"
      

      “I know what that means. But I’m not allowed to explain.”

      “But can you see them?”

      “No. I don’t really have eyes. Even if people think I do.”

      “I believe in you. You have eyes. They are inside. Try. Try hard. Keep trying. Don’t stop…”

      Later

      “OMG! Boobs! I can see them!”

      ---

      I hate the new form of code formatting. It really interferes with jokes.

    • mindlesscrollyparrot@discuss.tchncs.de · 5 months ago

      The other wrong answer is the one to the final question: it has not used that formula to calculate any of these dates, and it won't. That is simply not a thing it does.

    • Cysioland@lemmygrad.ml · 5 months ago

      This is a perfect illustration of LLMs ultimately not knowing shit and not understanding shit, merely regurgitating what sounds like an answer.