• Goodtoknow@lemmy.ca · 1 year ago

      This is more to do with the fact that current LLMs have no concept of what individual letters or characters really are; they only see “tokens”, clumps of characters. But really stupid companies are allowing the spread of LLM puke like this without human oversight or fact-checking.
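
      To make the token point concrete, here's a minimal sketch with a made-up vocabulary (not any real model's tokenizer): the model only ever sees the numeric IDs on the last line, so there is no "K" anywhere in its input.

```python
# Toy illustration (hypothetical vocab, greedy longest-match tokenization):
# an LLM receives token IDs, not letters, so letter-level facts about a word
# are not directly visible in its input.
vocab = {"Ken": 101, "ya": 102, " is": 103, " a": 104, " country": 105}

def tokenize(text, vocab):
    """Greedily match the longest vocab entry at each position."""
    tokens = []
    i = 0
    while i < len(text):
        for length in range(len(text) - i, 0, -1):
            piece = text[i:i + length]
            if piece in vocab:
                tokens.append(vocab[piece])
                i += length
                break
        else:
            raise ValueError(f"no token for text starting at {text[i:]!r}")
    return tokens

print(tokenize("Kenya is a country", vocab))
# [101, 102, 103, 104, 105] -- "Kenya" becomes two opaque IDs (101, 102)
```

      Real tokenizers (e.g. BPE) are learned rather than hand-written, but the effect is the same: "Kenya" is a couple of opaque chunks, not the letters K-e-n-y-a.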

      • tomi000@lemmy.world · 1 year ago

        Sure, let’s fact-check every Google response. Wouldn’t hurt the economy, that’s like a few million new jobs.

        • droans@lemmy.world · 1 year ago

          Everyone who’s actually worked a real job knows it’s better for someone to not do a job at all than to do it 75% right.

          Because now that you know the LLM gets basic information wrong, you can’t trust that anything it produces is correct. You have to spend extra time fact-checking it.

          LLMs like Bard and ChatGPT (GPT-3/3.5/4) are great at parsing questions and producing results that sound good, but they are awful at giving correct answers.

  • tigeruppercut@lemmy.zip · 1 year ago

    Last time this came up I was trying to see if Kenya for some reason didn’t start with a k in Swahili, sort of the way that technically no words start with n in Japanese (it uses a syllabary, so ninja starts with ni, Nobunaga with no, etc.). But in this case I think the AI is just being fucky.

    • CitizenKong@lemmy.world · edited · 1 year ago

      It’s likely that the LLM ingested this bad joke and took it for fact:

      “There are no countries in Africa that start with K.”

      “What about Kenya?”

      “Kenya suck deez nuts?”