• @steph@lemmy.clueware.org

    Given this trend, GPT-5 or GPT-6 will be trained on a majority of content generated by its previous versions, modeling them instead of the expected full range of human language. Researchers have already tested the outcome of putting a model in a loop with images, and it was not pretty.
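    A minimal sketch of that model-in-loop effect, using a toy Gaussian fit in place of an actual image or language model (the setup and numbers are illustrative assumptions, not the researchers' experiment):

    ```python
    import numpy as np

    # Each "generation" is fit only on samples produced by the previous
    # generation's fit -- a toy stand-in for training GPT n on GPT n-1 output.
    rng = np.random.default_rng(0)

    n_samples = 100  # small samples per generation make the drift easy to see
    data = rng.normal(loc=0.0, scale=1.0, size=n_samples)  # gen 0: "real" data

    for gen in range(1, 201):
        mu, sigma = data.mean(), data.std()           # "train" on current data
        data = rng.normal(mu, sigma, size=n_samples)  # next gen sees only synthetic output
        if gen % 50 == 0:
            print(f"generation {gen:3d}: mean={mu:+.3f}, std={sigma:.3f}")

    # The fitted spread tends to shrink generation over generation, so the tails
    # of the original distribution disappear -- the toy analogue of a model
    # losing the full range of language it was meant to capture.
    ```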

    • @theluddite@lemmy.mlOP

      Yeah, absolutely. The Luddite had a guest write in and suggest that if anxiety is the self turned inwards, the internet is going to be full of increasingly anxious LLMs in a few years. I really liked that way of putting it.