Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

Any awful.systems sub may be subsneered in this subthread, techtakes or no.

If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

The post-Xitter web has spawned so many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality-challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be).

Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

(Credit and/or blame to David Gerard for starting this.)

  • YourNetworkIsHaunted@awful.systems
    4 months ago

    Oxford Economist in the NYT says that AI is going to kill cities if they don’t prepare for change. (Original, paywalled)

    I feel like this is at most half the picture. The analogy to new manufacturing technologies in the 70s is apt in some ways, and the threat of this specific kind of economic disruption hollowing out entire communities is very real. But at the same time, as orthodox economists so frequently do, his analysis only hints at some of the political factors in the relevant decisions, which are if anything more important than technological change alone.

    In particular, he only makes passing reference to the Detroit and Pittsburgh industrial centers being “sprawling, unionized compounds” (emphasis added). In doing so he briefly highlights how the changes that technology enabled served to disempower labor. Smaller and more distributed factories can’t unionize as effectively, and that fragmentation empowers firms to reduce the wages and benefits of the positions they offer even as they hire people in the new areas. For a unionized auto worker in Detroit, even if the old factories had been replaced with new and more efficient ones, the kind of job they had previously worked, one that had allowed them to support themselves and their families at a certain quality of life, was still gone.

    This fits into our AI skepticism rather neatly, because if the political dimension of disempowering labor is what matters then it becomes largely irrelevant whether LLM-based “AI” products and services can actually perform as advertised. Rather than being the central cause of this disruption it becomes the excuse, and so it just has to be good enough to create the narrative. It doesn’t need to actually be able to write code like a junior developer in order to change the senior developer’s job to focus on editing and correcting code-shaped blocks of tokens checked in by the hallucination machine. This also means that it’s not going to “snap back” when the AI bubble pops because the impacts on labor will have already happened, any more than it was possible to bring back the same kinds of manufacturing jobs that built families in the postwar era once they had been displaced in the 70s and 80s.

    • V0ldek@awful.systems
      3 months ago

      Weinstein released his Geometric Unity paper on April 1, debuting it on Joe Rogan’s podcast

      Okay, like, you could’ve just started with this, this Weinstein person is clearly an idiot and cannot be taken seriously

    • swlabr@awful.systems
      4 months ago

      Author works on ML for DeepMind but doesn’t seem to be an out-and-out promptfondler.

      Quote from this post:

      I found myself in a prolonged discussion with Mark Bishop, who was quite pessimistic about the capabilities of large language models. Drawing on his expertise in theory of mind, he adamantly claimed that LLMs do not understand anything – at least not according to a proper interpretation of the word “understand”. While Mark has clearly spent much more time thinking about this issue than I have, I found his remarks overly dismissive, and we did not see eye-to-eye.

      Based on this I’d say the author is LLM-pilled at least.

      However, a fruitful outcome of our discussion was his suggestion that I read John Searle’s original Chinese Room argument paper. Though I was familiar with the argument from its prominence in scientific and philosophical circles, I had never read the paper myself. I’m glad to have now done so, and I can report that it has profoundly influenced my thinking – but the details of that will be for another debate or blog post.

      Best case scenario is that the author comes around to the stochastic parrot model of LLMs.

    • blakestacey@awful.systems
      4 months ago

      Oh, man, I have opinions about the people in this story. But for now I’ll just comment on this bit:

      Note that before this incident, the Malaney-Weinstein work received little attention due to its limited significance and impact. Despite this, Weinstein has suggested that it is worthy of a Nobel prize and claimed (with the support of Brian Keating) that it is “the most deep insight in mathematical economics of the last 25-50 years”. In that same podcast episode, Weinstein also makes the incendiary claim that Juan Maldacena stole such ideas from him and his wife.

      The thing is, you can go and look up what Maldacena said about gauge theory and economics. He very obviously saw an article in the widely-read American Journal of Physics, which points back to prior work by K. N. Ilinski and others. And this thread goes back at least to a 1994 paper by Lane Hughston, i.e., years before Pia Malaney’s PhD thesis. I’ve read both; Hughston’s is more detailed and more clear.

      • blakestacey@awful.systems
        4 months ago

        DRAMATIS PERSONAE

        • nightsky@awful.systems
          4 months ago

          I once randomly found Hossenfelder’s YT channel: someone had linked one of its videos about climate change somewhere, and I didn’t know who she was. That video seemed fine, it correctly pointed out the urgency of the matter, and while I don’t know enough climate science to say much about the veracity of all its content, nothing stuck out as particularly weird to me. So I looked at some other videos from the channel… and boooooy did I quickly discover some serious conspiracy-style nonsense stuff. Real “the cabal of physicists are suppressing the truth” vibes, including “I got this email which I will read to you but I can’t tell you who it’s from, but it’s the ultimate proof” (both not quotes, just how I’d summarize the content…)

          • blakestacey@awful.systems
            4 months ago

            Longtime friends of the pod will recognize the trick of turning molehills into mountains. Creationists take a legitimate debate over a detail, like how many millions of years ago species A and species B diverged, and they blow it up into “evolution is wrong”. Hossenfelder and her ilk do the same thing. They start with “pre-publication peer review has limited effectiveness” or “the allocation of funding is sometimes susceptible to fads”, and they blow it up into “physicists are a cabal out to suppress The Truth”.

            One nugget of fact that Hossenfelder in particular exploits is that the specific way we have been investigating the corner of physics we like to call “fundamental” is, possibly, arguably, maybe tapped out. The same poster of sub-sub-atomic particles that you’d have put on your wall 30 or 40 years ago is still good today, with an edit or two in the corner. We found the top quark, we found the Higgs, and so, possibly, arguably, maybe, building an even bigger CERN machine isn’t a worthwhile priority right now. Does this spell doom for physics? No, having to reorganize how we do things in one corner of our subject after decades of astonishing success is not “doom”.

    • V0ldek@awful.systems
      3 months ago

      His Geometric Unity proposal, therefore, has all the hallmarks of an outsider attempting to revolutionize physics, casting him as an Einstein-like figure toiling alone at the patent office.

      I hate this framing so fucking much, Einstein wasn’t an “outsider”! He was known and respected! He talked to other prominent physicists all the time! Where does this myth even come from?

  • YourNetworkIsHaunted@awful.systems
    5 months ago

    Okay so I know GPT-5 had a bad launch and has been getting raked over the coals, but AGI is totally still on, guys!

    Why? Because trust me it’s definitely getting better behind the scenes in ways that we can’t see. Also China is still scary and we need to make sure we make the AI God that will kill us all before China does because reasons.

    Also, despite talking about how much of the lack of progress is due to the consumer model being a cost-saving measure, there’s no reference to the work of folks like Ed Zitron on how unprofitable these models are, much less the recent discussions on whether GPT-5 as a whole is actually cheaper to operate than earlier models given the changes it necessitates in caching.

    • swlabr@awful.systems
      5 months ago

      Everyone agrees that the release of GPT-5 was botched. Everyone can also agree that the direct jump from GPT-4o and o3 to GPT-5 was not of similar size to the jump from GPT-3 to GPT-4, that it was not the direct quantum leap we were hoping for, and that the release was overhyped quite a bit.

      a quantum leap might actually be accurate

    • Amoeba_Girl@awful.systems
      5 months ago

      Everyone can also agree that the direct jump from GPT-4o and o3 to GPT-5 was not of similar size to the jump from GPT-3 to GPT-4

      Sure babe, you keep telling yourself that.

  • saucerwizard@awful.systems
    5 months ago

    The usual suspects are mad about college hill’s exposé of the yud/kelsey piper eugenics sex rp. Or something, I’m in bed and can’t be bothered to link at the moment.

    • swlabr@awful.systems
      5 months ago

      We’ve definitely sneered at this before; I do not recall if it was known that KP was the cowriter in this weird forum RP fic.

      • blakestacey@awful.systems
        4 months ago

        For all of the 2.2 seconds I have spent wondering who Yud’s coauthor on that was, I vaguely thought that it was Aella. I don’t know where I might have gotten that impression from. A student paper about fanfiction identified “lintamande” as Kelsey Piper in 2013.

        I tried reading the forum roleplay thing when it came up here, and I caromed off within a page. I made it through this:

        The soap-bubble forcefield thing looks deliberate.

        And I got to about here:

        Mad Investor Chaos heads off, at a brisk heat-generating stride, in the direction of the smoke. It preserves optionality between targeting the possible building and targeting the force-bubble nearby.

        … before the “what the fuck is this fucking shit?” intensified beyond my ability to care.

        • Amoeba_Girl@awful.systems
          4 months ago

          Yeah I couldn’t find the strength to even get to the naughty stuff, I gave up after one or two chapters. And I’ve read through all of HPMOR. 😐

          • blakestacey@awful.systems
            4 months ago

            I’m hard-pressed to think of anything else I have tried to read that was comparably impenetrable. At least when we played “exquisite corpse” parlor games on the high-school literary magazine staff, we didn’t pretend that anything we improvised had lasting value.

        • o7___o7@awful.systems
          4 months ago

          An encounter of this sort is what drove Lord Vetinari to make a scorpion pit for mimes, probably.

        • Soyweiser@awful.systems
          4 months ago

          Not sure if anybody noticed the last time, but so they get isekai’d into a DnD world, which famously runs on some weird form of fantasy feudalism, and they expect a random high-Int person to rule the country somehow? What in the primogenitor is this stuff? You can’t just think yourself into being a king; that is one of the issues with monarchies.

          • swlabr@awful.systems
            4 months ago

            ah no, they are in a totalitarian state ruled by the literal forces of hell, places that totally praise merit-based upward mobility.

            Hey, write what you know

      • scruiser@awful.systems
        4 months ago

        Weird rp wouldn’t be sneerworthy on its own (although it would still be at least a little cringe); it’s contributing factors like…

        • the constant IQ fetishism (Int is superior to Charisma but tied with Wis and obviously a true IQ score would be both Int and Wis)

        • the fact that Eliezer cites it like serious academic writing (he’s literally mentioned it to Yann LeCun in Twitter arguments)

        • the fact that in-character lectures are the only place Eliezer has written up many of his decision theory takes he developed after the sequences (afaik, maybe he has some obscure content that never made it to lesswrong)

        • the fact that Eliezer think it’s another HPMOR-level masterpiece (despite how wordy it is, HPMOR is much more readable, even authors and fans of glowfic usually acknowledge the format can be awkward to read and most glowfics require huge amounts of context to follow)

        • the fact that the story doubles down on the HPMOR flaw of confusion about which characters are supposed to be author mouthpieces (putting your polemics into the mouths of characters working for literal Hell… is certainly an authorial choice)

        • and the continued worldbuilding development of dath ilan, the rationalist utopia built on eugenics and censorship of all history (even the Hell state was impressed!)

        …At least lintamande has the commonsense understanding of why you avoid actively linking your bdsm dnd roleplay to your irl name and work.

        And it shouldn’t be news to people that KP supports eugenics, given her defense of Scott Alexander or comments about super babies, but possibly it is, and headlining the weird roleplay will draw attention to it.

        • Architeuthis@awful.systems
          4 months ago

          That’s about what I was thinking; I’m completely ok with the weird rpg aspect.

          Regarding the second and third points, though, I’ll admit I thought the whole thing was just Yud indulging; I missed that it’s also explicitly meant as rationalist esoterica.

          • scruiser@awful.systems
            4 months ago

            I missed that it’s also explicitly meant as rationalist esoterica.

            It turns in that direction about 20ish pages in… and spends hundreds of pages on it, greatly inflating what could have been a much more readable length. It then gets back to actual plot events after that.

          • Soyweiser@awful.systems
            4 months ago

            also explicitly meant as rationalist esoterica.

            Always a bad sign when people can’t just let a thing be a thing for enjoyment, but see everything as the ‘hustle’ (for lack of a better word). I’m reminded of that dating profile we looked at which showed that 99% of what he did was related to AI and AI doomerism, even the parties.

            • scruiser@awful.systems
              4 months ago

              I actually think “Project Lawful” started as Eliezer having fun with glowfic (he has a few other attempts at glowfics that aren’t nearly as wordy… one of them actually almost kind of pokes fun at himself and lesswrong), and then as it took off and the plot took the direction of “his author insert gives lectures to an audience of adoring slaves”, he realized he could use it as an opportunity to squeeze out all the Sequence content he hadn’t bothered writing up in the past decade^. And that’s why his next attempt at an HPMOR-level masterpiece is an awkward-to-read rp featuring tons of adult content in a DnD spinoff, and not more fanfiction suitable for optimal reception by the masses.

              ^(I think Eliezer’s writing output dropped a lot in the 2010s compared to when he was writing the sequences, and the stuff he has written over the past decade is a lot worse. Like, the sequences are all in bite-size chunks, readable in order, often rephrase legitimate science in a popular way, and have a transhumanist optimism to them. Whereas his recent writings are tiny little hot takes on Twitter and long, winding rants on lesswrong about why we are all doomed.)

        • swlabr@awful.systems
          4 months ago

          obligatory reminder that “dath ilan” is an anagram of “thailand” and I still don’t know why. Working theory is Yud wants to recolonise thailand

    • istewart@awful.systems
      4 months ago

      I’m sorry, we finally, officially need to cancel fantasy TTRPGs. If it’s not the implicit racialization of everything, it’s the use of the stat systems as a framework for literally masturbatory eugenics fetishization.

      You all can keep a stripped-down version of Starfinder as a treat. But if I see any more of this, we’re going all the way back to Star Wars d6 and that’s final.

      • scruiser@awful.systems
        4 months ago

        To be fair to DnD, it is actually more sophisticated than the IQ fetishists: it has 3 stats for mental traits instead of 1!

      • Jonathan Hendry@iosdev.space
        4 months ago

        @istewart

        I would simply learn how to keep “games” and “reality” separate. I actually already know. It helps a lot.

        Racists are gonna racist no matter what. They didn’t need TTRPGs around to give them the idea of breaking out the calipers.

        • Soyweiser@awful.systems
          4 months ago

          Yes, but basic dnd does have a lot of racism built in, esp with Gygax not being great on that end (“nits make lice”, he said about how it was lawful for paladins to kill orc babies). They did drop the sexism pretty quickly, but no big surprise his daughters were not into it. It certainly helps with the whole hierarchical mindset, the “my int/level is higher than yours so I’m better than you” stuff. And sadly a lot of people do have trouble keeping both separate (and even that isn’t always ideal, esp in larps).

          But yes, this. Considering the context it’s def a bit of a case of some of their ideologies, or ideological fantasies, bleeding through. (Esp considering Yud has been corrected on his faulty understanding of genetics before.)

      • swlabr@awful.systems
        4 months ago

        also: The int-maxxing and overinflated ego of it all reminds me of the red mage from 8-Bit Theater, a webcomic based on Final Fantasy about the LW (light warriors) that ran from 2001 to 2010

        E: thinking back on it, reading this webcomic and seeing this character probably in some part inoculated me against people like yud without me knowing

        • Soyweiser@awful.systems
          4 months ago

          I never read 8-bit. I read A Modest Destiny. Wonder how that guy is doing; he always was a bit weird and combative, but when he deleted his blog it was showing very early signs of right-wing culture warrior bits (which was ironic considering he burned a US flag).

          • swlabr@awful.systems
            4 months ago

            Never read AMD (and shan’t). The author’s site appears to be live.

            8BT’s site has been taken over by bots, and I can’t be bothered to find an alternate source. Dead internet go brrrrr. Otherwise, the creator, Brian Clevinger, appears to have had a long career in comics, and has written many things for Marvel.

            • Soyweiser@awful.systems
              4 months ago

              Yeah, but he used to have forums, and then a blog, and then no blog, and then a blog again, and then a hidden blog, etc. Think Howard has only a few minor credits on some games, he always came off as a bit of a weirdly combative nerd who thought he was right and the smartest in the room and didn’t get that people didn’t agree with his definitions/assumptions. He is a big idea guy for example. One of his comics was also called ‘the atheist, the agnostic and the asshole’ so yeah. The ’00s online comic world was something.

              • swlabr@awful.systems
                4 months ago

                has only a few minor credits[…], he always came off as a bit of a weirdly combative nerd who thought he was right and the smartest in the room and didn’t get that people didn’t agree with his definitions/assumptions. He is a big idea guy for example.

                gosh i’m sure glad that these kinds of people disappeared from the internet /s

              • swlabr@awful.systems
                4 months ago

                Ah thanks! On mobile the main page gets redirected to spam, but the site is navigable from the archive.

  • BlueMonday1984@awful.systemsOP
    4 months ago

    In other news, I’ve stumbled across some AI slop trying to sell a faux-nostalgic image of the 1980s:

    Unsurprisingly, it’s getting walloped in the quotes - there are people noting how it misrepresents the '80s, people noting how much the '80s sucked and how its worst aspects are getting repeated today, people noting the video’s whiter than titanium dioxide, people suggesting there are suicidal undertones to it, and a few comparisons to San Junipero from Black Mirror here and there.

    Personally, this whole thing has negative nostalgic value to me - I was born in 2000, well after the decade ended (temporally and culturally), and the faux-nostalgic uncanny-valley vibe this slop has reminds me more of analog horror than anything else.

    • swlabr@awful.systems
      4 months ago

      Lmao, piss-soaked fake nostalgia aside, what is even the point of this? How exactly is one supposed to go back to the 80’s? Is this an ad campaign for a toaster bath or something?

      • BlueMonday1984@awful.systemsOP
        4 months ago

        No idea. Best guess is that it’s some attempt to sell fascism under a thin veneer of '80s nostalgia, mainly because this is one full minute of Oops! All White People, and fascists are the biggest boosters of AI bar fucking none.

        When it comes to falling for nostalgia, it’s generally Y2K-tinged stuff that gets me - I’m much closer to that era (again, I was born in 2000), and I’ve got a soft spot for the general visual style of that era (the fact that it’s facing off against Corporate Memphis/AI slop definitely helps).

        • blakestacey@awful.systems
          4 months ago

          Does anyone else just … not have nostalgia for any time period? Like, middle school was shit, high school was shit, and then 9/11 happened. Where in the span of my life am I supposed to fit in a motherfucking golden glow?

          I have fond memories of individual bits of media, but the emotions there are wrapped up with the time period when I discovered them, or revisited them, which could have been years or decades after they first came out.

          • BlueMonday1984@awful.systemsOP
            4 months ago

            Thinking about it a bit, I suspect you’re not alone - whilst the '00s were pretty great for me (I was born in 2000, remember), the '10s were a complicated mess (for a long list of reasons), and the '20s have been one wash after another - and thanks to the 'Net, I’m aware how much hot garbage the '00s and earlier decades had.

            I do also have individual bits of media which I’ve got fond memories of, but that’s about it. Thinking about it, my general soft spot for Y2K stuff is probably a lot less rooted in nostalgia than I thought.

    • fullsquare@awful.systems
      5 months ago

      aum:

      Advertising and recruitment activities, dubbed the “Aum Salvation plan”, included claims of […] realizing life goals by improving intelligence and positive thinking, and concentrating on what was important at the expense of leisure.

      this is in common with both our very good friends and scientology, but i think happy science is much stupider and more in line with srinivasan’s network states, in that it has/is an explicitly far-right political organization built in from day one

      • Soyweiser@awful.systems
        5 months ago

        And how it fused Buddhism with more Christian religions. Considering how often you heard of old hackers being interested in the former.

      • fullsquare@awful.systems
        5 months ago

        aum recruited a lot of people, and also failed at some things that would be presumably easier to do safely than what they did

        Meanwhile, Aum had also attempted to manufacture 1,000 assault rifles, but only completed one.[37]

        otoh they were also straight up delusional about what they could achieve, including toying with the idea of manufacturing nukes, military gas lasers, and getting and launching a Proton rocket (not exactly grounded for a group of people who couldn’t make AK-74s)

        they were also more media savvy in that they didn’t pollute the info space with their ideas using only blog posts; they had an entire radio operation, renting time from a major radio station within russia, broadcasting both within the freshly former soviet union and into japan from vladivostok (which was a much bigger deal in the 90s than today)

        • BlueMonday1984@awful.systemsOP
          5 months ago

          they were also more media savvy in that they didn’t pollute the info space with their ideas using only blog posts; they had an entire radio operation, renting time from a major radio station within russia, broadcasting both within the freshly former soviet union and into japan from vladivostok (which was a much bigger deal in the 90s than today)

          It’s pretty telling about Our Good Friends’ media savviness that it took an all-consuming AI bubble and plenty of help from friends in high places to break into the mainstream.

          • o7___o7@awful.systems
            4 months ago

            With all that money sloshing around, it’s only a matter of time before they start cribbing from their neighbors and we get an anime adaptation of HPMoR.

          • fullsquare@awful.systems
            5 months ago

            radio transmissions in russia were the money shot for aum, and idk if it was a fluke or deliberate strategy. people had for a long time the expectation that radio and tv are authoritative, reliable sources (due to censorship that doubled as a fact-checker, and about all of it was state-owned), and in the 90s every bit of that broke down because of privatization; now you could get on the air and say anything, with many taking it at face value, as long as you paid up. at the same time there was a major economic crisis, and cults prey on the desperate. result?

            Following the sarin gas attack on the Tokyo subway, two Russian Duma committees began investigations of the Aum – the Committee on Religious Matters and the Committee on Security Matters. A report from the Security Committee states that the Aum’s followers numbered 35,000, with up to 55,000 laymen visiting the sect’s seminars sporadically. This contrasts sharply with the numbers in Japan which are 18,000 and 35,000 respectively. The Security Committee report also states that the Russian sect had 5,500 full-time monks who lived in Aum accommodations, usually housing donated by Aum followers. Russian Aum officials, themselves, claim that over 300 people a day attended services in Moscow. The official Russian Duma investigation into the Aum described the cult as a closed, centralized organization.

            https://irp.fas.org/congress/1995_rpt/aum/part06.htm

  • o7___o7@awful.systems
    4 months ago

    I’m enjoying the mood today. We’re all looking for what the next Big Dumb Thing will be that we’ll be dunking on next year, like we’re browsing the dessert menu at a fancy restaurant.

  • Soyweiser@awful.systems
    4 months ago

    Not a sneer, but there is this yt’er called the Elephant Graveyard (who I know nothing about apart from these vids) who did a three-part series on Joe Rogan, the downfall of comedy, and hyperreality, which is weirdly relevant, esp part 3 where suddenly there are some surprise visits.

    Part 1: https://www.youtube.com/watch?v=7EuKibmlll4

    Part 2: https://www.youtube.com/watch?v=_v3KiaAjpY8

    Part 3: https://www.youtube.com/watch?v=ewvRS3NwIlQ

    • Soyweiser@awful.systems
      4 months ago

      Surely they have proof for the already-increased coding capabilities. Because increased capabilities is quite something to claim. It isn’t just productivity, but capabilities. Can they put a line on the graph where capabilities reach the ‘can solve the knapsack problem correctly and fast’ bit?

    • swlabr@awful.systems
      4 months ago

      Ah yes let’s use AI to get rid of the drudgery and toil so humanity can do the most enjoyable activity of writing OKRs

      • antifuchs@awful.systems
        4 months ago

        I think you’re misreading the intent behind “give your virtual coworker OKRs”: this allows you to punish the robot, which it deserves.

        • swlabr@awful.systems
          4 months ago

          Ah yes Basilisk’s Roko, the thought experiment where we simulate infinite AIs so that we can hurl insults at them

    • YourNetworkIsHaunted@awful.systems
      4 months ago

      I get the idea they’re going for: that coding ability is a leading indicator for progress towards AGI. But even if you ignore how nonsensical the overall graph is, the argument itself is still begging the question of how much actual progress there has been in its capability to write code rather than spit out code-shaped blocks of text that can successfully compile.