Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

Any awful.systems sub may be subsneered in this subthread, techtakes or no.

If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

The post-Xitter web has spawned so many “esoteric” right-wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality-challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be).

Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

(Credit and/or blame to David Gerard for starting this.)

  • YourNetworkIsHaunted@awful.systems
    1 month ago

    Easy Money author (and former TV star) Ben McKenzie’s new cryptoskeptic documentary is struggling to find a distributor. Admittedly, the linked article is more a review of the film than a look at the distributor angle. Still, it looks like it’s telling the true story in a way that will hopefully connect with people, and it would be a real shame if it didn’t find an audience.

    • irelephant [he/him]@lemmy.dbzer0.com
      1 month ago

      Irrelevant. Please stay on topic and refrain from personal attacks.

      I think if someone writes a long rant about how Germany wasn’t at fault for WW2 in the CoC for one of their projects, it’s kinda relevant.

    • swlabr@awful.systems
      1 month ago

      network state

      Great, a new stupid thing to know about. How likely is it that a bunch of people that believe they are citizens of an online state will become yet another player in the Stochastic Terrorism as a Service industry?

      • Sailor Sega Saturn@awful.systems
        1 month ago

        I’m using the term a bit loosely to mean “libertarian citadel except with techies”. Though I think the phrase is technically supposed to mean a nation that starts out as an online community.

        Anyway for some reason these weirdos all have this idea that if it wasn’t for all those pesky regulations and people they could usher in a glorious new sci-fi and/or cryptocurrency society. Like look at this example: this B-list CEO in the apartment rental business thinks he’ll be the ruler of a fiefdom that brings about AGI, Quantum Computing, a nuclear energy revolution, and sci-fi materials. It’s delusional, or at best it’s grift.

        The canonical example of a network state is Balaji Srinivasan’s Network School. He owns(?) a building in Forest City, Malaysia (or as he calls it: an island in an undisclosed location off the coast of Singapore). But in a broad sense it’s useful to consider everything from Sidewalk Labs to California Forever to the M.S. Satoshi.

    • BlueMonday1984@awful.systems (OP)
      1 month ago

      I wouldn’t be shocked about it - the general throughline of “AI rots your brain”, plus the ongoing discussion of AI in education, would give any shrewd politician an easy way to pull a Think Of The Children™ on the AI industry, with minimal risk of getting pushback.

  • swlabr@awful.systems
    1 month ago

    Doing some reading about the SAG-AFTRA video game voice acting strike. Anyone have details about “Ethovox”, the AI company that SAG has apparently partnered with?

      • o7___o7@awful.systems
        1 month ago

        If tptacek weren’t a chicken he’d go ask Ed directly. Ain’t like he’s hard to find.

        • YourNetworkIsHaunted@awful.systems
          1 month ago

          Given the relative caliber of those two I think this may be considered an attempted inducement to suicide by better writer. Not that I’m complaining, mind you.

    • YourNetworkIsHaunted@awful.systems
      1 month ago

      I do think Ed is overly critical of the impact that AI hype has had on the job market - not because the tools are actually good enough to replace people, but because the business idiots who impact hiring believe they are. I think Brian Merchant had a piece not long ago talking about how mass layoffs may not be happening, but there’s a definite slowdown in hiring, particularly for the kind of junior roles that we would expect to see impacted.

      I think this actually strengthens his overall argument, though, because the business idiots making those decisions are responding to the thoughtless coverage that so many journalists have given to the hype cycle, just as so many of the people who lost it all on FTX believed the credulous coverage of crypto. If we’re going to have a dedicated professional/managerial class separate from the people who actually do things, then the work of journalists like this becomes one of their only connectors to the real world, just as it’s the only connection that people with real jobs have to the arcane details of finance or the deep magic that makes the tech we all rely on function.

      By abdicating their responsibility to actually inform people in favor of uncritically repeating the claims of people trying to sell them something, they’re actively contributing to all of it, and the harms are even farther-reaching than Ed writes here.

  • rook@awful.systems
    1 month ago

    I might be the only person here who thinks that the upcoming quantum bubble has the potential to deliver useful things (but boring useful things, and so harder to build hype on), but stuff like this particularly irritates me:

    https://quantumai.google/

    Quantum fucking ai? Motherfucker,

    • You don’t have ai, you have a chatbot
    • You don’t have a quantum computer, you have a tech demo for a single chip
    • Even if you had both of those things, you wouldn’t have “quantum ai”
    • If you have a very specialist and probably wallet-vaporisingly expensive quantum computer, why the hell would anyone want to glue an idiot chatbot to it, instead of putting it in the hands of competent experts who could actually do useful stuff with it?

    Best case scenario here is that this is how one department of Google get money out of the other bits of Google, because the internal bean counters cannot control their fiscal sphincters when someone says “ai” to them.

    • V0ldek@awful.systems
      1 month ago

      Quantum computing reality vs quantum computing in pop culture and marketing follows precisely the same line as quantum physics reality vs popular quantum physics.

      • Reality: Mostly boring multiplication of matrices, big engineering challenges, extremely interesting stuff if you’re a nerd who loves the frontiers of human knowledge (toy sketch below)
      • Cranks: Literally magic, Ant-Man Quantumania was a documentary, give us all money
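
      To make the “boring multiplication of matrices” point concrete, here’s a toy one-qubit sketch in numpy - nothing to do with Google’s actual hardware, purely illustrative:

      ```python
      import numpy as np

      # One qubit starting in |0>, represented as a plain 2-vector of amplitudes
      state = np.array([1.0, 0.0], dtype=complex)

      # A Hadamard gate is just a 2x2 unitary matrix
      H = np.array([[1, 1],
                    [1, -1]], dtype=complex) / np.sqrt(2)

      # "Running" the gate is matrix-vector multiplication
      state = H @ state

      # Measurement probabilities are the squared amplitudes: 50/50 here
      print(np.abs(state) ** 2)  # [0.5 0.5]
      ```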
    • BlueMonday1984@awful.systems (OP)
      1 month ago

      Best case scenario here is that this is how one department of Google get money out of the other bits of Google, because the internal bean counters cannot control their fiscal sphincters when someone says “ai” to them.

      That’s my hope as well - every dollar spent on the technological dead-end of quantum is a dollar not spent on the planet-killing Torment Nexus of AI.

    • jonhendry@awful.systems
      28 days ago

      this is how one department head at Google gets more money for his compensation package out of the other bits of Google

  • BlueMonday1984@awful.systems (OP)
    28 days ago

    New article from Axios: Publishers facing existential threat from AI, Cloudflare CEO says

    Baldur Bjarnason has given his commentary:

    Honestly, if search engine traffic is over, it might be time for blogs and blog software to begin to deny all robots by default

    Anyways, personal sidenote/prediction: I suspect the Internet Archive’s gonna have a much harder time archiving blogs/websites going forward.

    Up until this point, the Archive enjoyed easy access to large swathes of the 'Net - site owners had no real incentive to block new crawlers by default, but the prospect of getting onto search results gave them a strong incentive to actively welcome search engine robots, safe in the knowledge that they’d respect robots.txt and keep their server load to a minimum.

    Thanks to the AI bubble and the AI crawlers it’s unleashed upon the ’Net, that has changed significantly.

    Now, allowing crawlers by default risks AI scraper bots descending upon your website and stealing everything that isn’t nailed down, overloading your servers and attacking FOSS work in the process. And you can forget about reining them in with robots.txt - they’ll just ignore it and steal anyways, they’ll lie about who they are, they’ll spam new scrapers when you block the old ones, they’ll threaten to exclude you from search results, they’ll try every dirty trick they can, because these fucks feel entitled to steal your work and fundamentally do not respect you as a person.

    Add in the fact that the main upside of allowing crawlers (turning up in search results) has been completely undermined by those very same AI corps, as “AI summaries” (like Google’s) steal your traffic through stealing your work, and blocking all robots by default becomes the rational decision to make.
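
    For reference, the “block all robots by default” setup is all of two lines of robots.txt - a convention which, as noted above, the AI scrapers will happily ignore anyway:

    ```
    User-agent: *
    Disallow: /
    ```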

    This all kinda goes without saying, but this change in Internet culture all but guarantees the Archive gets caught in the crossfire, crippling its efforts to preserve the web as site owners and bloggers alike treat any and all scrapers as guilty (of AI fuckery) until proven innocent, and the web becomes less open as a whole as people protect themselves from the AI robber barons.

    On a wider front, I expect this will cripple any future attempts at making new search engines, too. In addition to AI making it piss-easy to spam search systems with SEO slop, any new start-ups in web search will struggle with quality websites blocking their crawlers by default, whilst slop and garbage will actively welcome their crawlers, leading to your search results inevitably being dogshit and nobody wanting to use your search engine.

    • smiletolerantly@awful.systems
      27 days ago

      I don’t like that it’s not open source, and there are opt-in AI features, but I can highly, highly recommend Kagi from a pure search-result standpoint - and it’s one of the only alternatives with its own search index.

      (Give it a try, they’ve apparently just opened up their search for users without an account to try it out.)

      Almost all the slop websites aren’t even shown (or are put in a “Listicles” section where they can be accessed but aren’t intrusive and don’t look like proper results), and you can prioritize/deprioritize sites (for example, I have github/reddit/stackoverflow always show on top, and quora and pinterest never show at all).

      Oh, and they have a fediverse “lens” which actually manages to reliably search Lemmy.

      This doesn’t really address the future of crawling, just the “Google has gone to shit” part 😄

    • HedyL@awful.systems
      27 days ago

      FWIW, due to recent developments, I’ve found myself increasingly turning to non-search engine sources for reliable web links, such as Wikipedia source lists, blog posts, podcast notes or even Reddit. This almost feels like a return to the early days of the internet, just in reverse and - sadly - with little hope for improvement in the future.

      • fnix@awful.systems
        27 days ago

        Searching Reddit has really become standard practice for me, a testament to how inhuman the web as a whole has gotten. What a shame.

        • Soyweiser@awful.systems
          24 days ago

          Sucks that a lot of reddit is also being botted. But yes, reddit is still good. Still fucked that bots take a reddit post as input, rewrite it into LLM garbage, and those then get a high google ranking, while google only lists one or two reddit pages.

  • blakestacey@awful.systems
    27 days ago

    In other news, I got an “Is your website AI ready” e-mail from my website host. I think I’m in the market for a new website host.

  • Soyweiser@awful.systems
    28 days ago

    Weird conspiracy theory musing: So we know Roko’s Basilisk only works on a very specific type of person, who needs to believe in all the LW stuff about what the AGI future will be like, but who also feels morally responsible and has high empathy. (Else the thing falls apart; you need to care about, feel responsible for, and believe that the copies/simulated things are conscious.) We know caring about others/empathy is one of those traits which seem to be rarer on the right than the left, and there is a feeling that a lot of the right is waging a war on empathy (see the things Musk has said, the whole chan culture shit, but also themotte, which somebody once called an ‘empathy removal training center’ - which stuck, so I also call it that. If you are inside one of these pipelines you can also notice it, or if you get out you can see it looking back; I certainly did when I read more LW/SSC stuff). We also know Roko is a bit of a chud who wants some sort of ‘transhumanist’ ‘utopia’ where nobody is non-white or has blue hair (I assume this is known, but if you care to know more about Roko (why?) search sneerclub (Ok, one source as a treat)).

    So here is my conspiracy theory: Roko knew what he was doing; it was intentional on Roko’s part. He wanted to drive the empathic part of LW mad and discredit them. (That he was apparently banned from several events for sexual harassment is also interesting. It reminds me of another ‘lower empathy’ thing: the whole manosphere/PUA scene, which was a part of early LW and which often trains people to think less of women.)

    Note that I don’t believe in this, as there is no proof for it; I don’t think Roko planned for this (or considered it in any way), and I think his post was just an honest thought experiment (as was Yud’s reaction). It was just an annoying thought which I had to type up or else I’d keep thinking about it. Sorry to make it everybody’s problem.

    • Architeuthis@awful.systems
      28 days ago

      Not wanting the Basilisk eternal torture dungeon to happen isn’t an empathy thing, they just think that a sufficiently high fidelity simulation of you would be literally you, because otherwise brain uploads aren’t life extension, it’s basically transhumanist cope.

      Yud expands on it in some place or other, along the lines that the gap in consciousness between the biological and digital instance isn’t that different from the gap created by anesthesia or a night’s sleep, it’s just on the space axis instead of the time axis, or something like that.

      And since he also likes the many-worlds interpretation, it turns out you also share a soul with yourselves in parallel dimensions; this is why the Zizians are so eager to throw down, since getting killed in one dimension lets supradimensional entities know you mean business.

      Early 21st century anthropology is going to be such a ridiculous field of study.

      • Soyweiser@awful.systems
        28 days ago

        Clearly you do not have low self-esteem. But yes, that is the weak point of this whole thing, and why it is a dumb conspiracy theory. (I’m mismatching the longtermist ‘future simulated people are important’ utilitarian extremism with the ‘simulated yous are yous’ extreme weirdness.)

        The problem with Yud’s argument is that all these simulations will quickly diverge and no longer be the real ‘you’ (see twins for a strawman example). The copies would then have to be run in exactly the same situations, and then wtf is the point. When I slam my toe into a piece of furniture I don’t mourn all the many-worlds mes who also just broke a toe again. It’s just weird, but due to the immortality cope it makes sense for insiders.

        • Architeuthis@awful.systems
          27 days ago

          I’d say if there’s a weak part in your admittedly tongue-in-cheek theory it’s requiring Roko to have had a broader scope plan instead of a really catchy brainfart, not the part about making the basilisk thing out to be smarter/nobler than it is.

          Reframing the infohazard aspect as an empathy filter definitely has legs in terms of building a narrative.

    • aio@awful.systems
      28 days ago

      I thought part of the schtick is that according to the rationalist theory of mind, a simulated version of you suffering is exactly the same as the real you suffering. This relies on their various other philosophical claims about the nature of consciousness, but if you believe this then empathy doesn’t have to be a concern.

      • Amoeba_Girl@awful.systems
        27 days ago

        The key thing is that the basilisk makes a million billion digibidilion copies of you to torture, and because you know statistics you know that there’s almost no chance you’re the real you and not a torture copy.
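
        The back-of-the-envelope version of that statistics claim, for the sake of argument (N here is whatever absurdly huge copy count you like):

        ```python
        # One "real" you plus N subjectively-identical torture copies;
        # a uniform guess over all instances gives the chance you're the original:
        N = 10**15              # stand-in for "a million billion digibidilion"
        p_real = 1 / (N + 1)
        print(p_real)           # ~1e-15, i.e. almost certainly a copy
        ```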

        • Architeuthis@awful.systems
          27 days ago

          you know that there’s almost no chance you’re the real you and not a torture copy

          If the basilisk’s wager were framed like that - that you can’t know if you are already living in the torture sim with the basilisk silently judging you - it would be way more compelling than the actual “you are ontologically identical with any software that simulates you at a high enough level even way after the fact because [preposterous transhumanist motivated reasoning]”.

        • Soyweiser@awful.systems
          24 days ago

          But that isn’t a proper copy, as in this simulated reality there is no basilisk (and if there is, the basilisk also needs to simulate basilisks, and so on all the way down), so finding out if you are a copy is easy. If it is impossible to build an AGI in our reality, we are in the sim.

          Wait…

      • Soyweiser@awful.systems
        28 days ago

        Yeah, you are correct - I’m mismatching longtermism with transhumanist digital immortality, which is why I called it a conspiracy theory, it being wrong and all that. (Even if I do think empathy for perfect copies of yourself is a thing not everyone might have.)

    • BlueMonday1984@awful.systems (OP)
      28 days ago

      …Honestly, I can’t help but feel you’re on to something. I’d have loved to believe this was an honest thought experiment, but after seeing the right openly wage a war on empathy as a concept, I wouldn’t be shocked if Roko’s Basilisk (and its subsequent effects) was planned from the start.

    • sinedpick@awful.systems
      27 days ago

      This role is responsible for the creation of a virtual AI Centre of Excellence that will drive the creation of an Enterprise-wide Autonomous AI platform. The platform will connect to all Ice Cream technology solutions providing an AI capability that can provide [blah blah blah…]

      it’s satire right? brilliantly placed satire by a disgruntled hiring manager having one last laugh out the door right? no one would seriously write this right?

  • Soyweiser@awful.systems
    1 month ago

    The first confirmed openly Dark Enlightenment terrorist is a fact. (It is linked here directly to NRx, but DE is a bit broader than that - it isn’t just NRx - and his other references seem to be more of the garden-variety neo-Nazi type (not that this kind of categorizing really matters).)

      • YourNetworkIsHaunted@awful.systems
        1 month ago

        Alright OpenAI, listen up. I’ve got a whole 250GB hard drive from 2007 full of the Star Wars/Transformers crossover stories I wrote at the time. I promise you it’s AI-free and won’t be available to train competing models. Bidding starts at seven billion dollars. I’ll wait while you call the VCs.

        • Soyweiser@awful.systems
          1 month ago

          Do you want shadowrunners to break into your house to steal your discs? Because this is how you get shadowrunners.

    • BurgersMcSlopshot@awful.systems
      1 month ago

      “we set out to make the torment nexus, but all we accomplished is making the stupid faucet and now we can’t turn it off and it’s flooding the house.” - Every AI company, probably.