• TommySoda@lemmy.world · 18 points · 6 months ago

    Funny how the average person figured this out almost immediately while Google needed half a year to figure it out with their researchers. Almost like they were ignoring it as long as they could for the sake of profit. Fuck around and find out, I guess.

  • requiem@lemmy.world · 3 points · 6 months ago

    Google Researchers Now Also Say We All Should Use Their Shit AI Search That Tells Us To Eat Glue

  • AwkwardLookMonkeyPuppet@lemmy.world · 4 up, 1 down · 6 months ago

    They’re admitting that they are the source of a massive problem. But are they going to do anything about it, or keep pushing their shitty, half-baked AI? It’s crazy to me how much worse their AI is than ChatGPT, considering all of the financial and engineering resources available to Google.

  • Please_Do_Not@lemm.ee · 3 points · 6 months ago

    It’s alright, guys. I just looked up a solution, and Google suggests that eating glue and a few small pebbles will solve the issue.

  • mozz@mbin.grits.dev · 1 point · 6 months ago

    I think it’s almost certain that disinformation based on fake accounts simply posting memes or targeted viewpoints, hoping to send the message through sheer repetition, is still a lot more common than doctored factual information. (Not that faked-up disinformation isn’t a problem; I’m just saying I think it’s still relatively rare as a vehicle for disinformation.)

    Why would you even open yourself up to “see, the underlying citation for this thing they’re saying is not true” when you could skip backing up what you’re saying with facts entirely, and just state your assertions as if they were facts instead?