• huginn@feddit.it · 1 year ago

Here’s a recent story about hallucinations: https://www.cnn.com/2023/08/29/tech/ai-chatbot-hallucinations/index.html

The TL;DR is that nobody has solved it, and it might not be solvable.

Which, when you think about the structure of LLMs, makes sense. They’re statistical models with no grounding in any sort of truth. If the input hits the right channels, they’ll output something undefined.
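To make that concrete, here’s a toy sketch of what “statistical model with no grounding in truth” means. The prompt, tokens, and probabilities below are all made up for illustration; the point is that generation is just weighted sampling, and nothing in the loop ever checks whether the sampled answer is correct:

```python
import random

# Toy next-token table: weights learned from text statistics.
# Note there is no field anywhere marking which option is true.
next_token_probs = {
    "The capital of Australia is": {
        "Canberra": 0.55,   # happens to be correct
        "Sydney": 0.40,     # plausible-sounding, wrong
        "Melbourne": 0.05,  # also wrong
    }
}

def sample_continuation(prompt: str) -> str:
    """Pick a continuation purely by probability weight --
    truth never enters into the computation."""
    dist = next_token_probs[prompt]
    tokens, weights = zip(*dist.items())
    return random.choices(tokens, weights=weights)[0]

if __name__ == "__main__":
    # Run it a few times: sometimes you get a fact,
    # sometimes a confident-sounding hallucination.
    for _ in range(5):
        print(sample_continuation("The capital of Australia is"))
```

A real model computes those weights with a neural network instead of a lookup table, but the sampling step works the same way, which is why a wrong-but-plausible answer can come out looking exactly as confident as a right one.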

The Microsoft guy tries to spin this as “creativity!”, but creativity requires intent. This is more like a random number generator dealing out your tarot reading and you really buying into it.