Hey folks!

I made a short post last night explaining why image uploads had been disabled. It was the middle of the night for me, so I did not have time to go into much detail, but I’m writing a more detailed post now to explain where we are and where we plan to go.

What’s the problem?

As shared by the lemmy.world team, over the past few days, some people have been spamming one of their communities with CSAM images. Lemmy has been attacked in various ways before, but this is clearly on a whole new level of depravity, as it’s first and foremost an attack on actual victims of child abuse, in addition to being an attack on the users and admins on Lemmy.

What’s the solution?

I am putting together a plan, both for the short term and for the longer term, to combat and prevent such content from ever reaching lemm.ee servers.

For the immediate future, I am taking the following steps:

1) Image uploads are completely disabled for all users

This is a drastic measure, and I am aware that it’s the opposite of what many of our users have been hoping for, but at the moment, we simply don’t have the necessary tools to safely handle uploaded images.

2) All images which have federated in from other instances will be deleted from our servers, without any exception

At this point, we have millions of such images, and I am planning to just indiscriminately purge all of them. Posts from other instances will not be broken after the deletion; their images will simply be loaded directly from those instances.

3) I will apply a small patch to the Lemmy backend running on lemm.ee to prevent images from other instances from being downloaded to our servers

Lemmy has always loaded some images directly from other servers, while saving other images locally to serve directly. I am eliminating the second option for the time being, forcing all images uploaded on external instances to always be loaded from those servers. This will somewhat increase the number of servers which users will fetch images from when opening lemm.ee, which certainly has downsides, but I believe this is preferable to opening up our servers to potentially illegal content.
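
To illustrate the idea, here is a rough sketch in Python (this is not the actual patch, which would live in Lemmy’s Rust backend; the names and structure below are made up purely for illustration):

```python
# Illustrative sketch only, not Lemmy's actual code: with the patch applied,
# only images uploaded directly to this instance are stored locally;
# everything that federates in stays hotlinked from its home instance.
from dataclasses import dataclass
from typing import Optional

LOCAL_INSTANCE = "lemm.ee"  # hypothetical config value

@dataclass
class FederatedImage:
    origin_instance: str  # instance the image was originally uploaded to
    original_url: str     # URL of the image on that instance

def should_cache_locally(image: FederatedImage) -> bool:
    """Decide whether to copy an image into local storage."""
    return image.origin_instance == LOCAL_INSTANCE

def resolve_image_url(image: FederatedImage, local_copy_url: Optional[str]) -> str:
    """Return the URL clients should load: a local copy only for local uploads."""
    if should_cache_locally(image) and local_copy_url is not None:
        return local_copy_url
    return image.original_url  # remote images are always fetched from their source
```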

For the longer term, I have some further ideas:

4) Invite-based registrations

I believe that one of the best ways to effectively combat spam and malicious users is to implement an invite system on Lemmy. I have wanted to work on such a system ever since I first set up this instance, but real life and other things have been getting in the way, so I haven’t had a chance. However, with the current situation, I believe this feature is more important than ever, and I’m very hopeful I will be able to make time to work on it very soon.

My idea would be to grant our users a few invites, which would replenish every month if used. An invite will be required to sign up on lemm.ee after that point. The system will keep track of the invite hierarchy, and in extreme cases (such as spambot sign-ups), inviters may be held responsible for rule-breaking users they have invited.
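
For illustration, the bookkeeping could be roughly as simple as the following sketch (Python/SQLite; the table layout, the allowance of 3 invites, and the reset logic are placeholder assumptions, not a finished design):

```python
# Minimal sketch of invite bookkeeping; schema and numbers are placeholders.
import sqlite3

MONTHLY_INVITES = 3  # hypothetical allowance

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE users (
    id INTEGER PRIMARY KEY,
    username TEXT UNIQUE NOT NULL,
    invited_by INTEGER REFERENCES users(id),  -- records the invite hierarchy
    invites_remaining INTEGER NOT NULL
);
""")

# Bootstrap the very first account by hand (hypothetical step).
conn.execute(
    "INSERT INTO users (username, invited_by, invites_remaining) VALUES (?, NULL, ?)",
    ("admin", MONTHLY_INVITES),
)

def redeem_invite(inviter_id: int, new_username: str) -> int:
    """Consume one invite from the inviter and create the new account."""
    cur = conn.execute(
        "UPDATE users SET invites_remaining = invites_remaining - 1 "
        "WHERE id = ? AND invites_remaining > 0",
        (inviter_id,),
    )
    if cur.rowcount == 0:
        raise ValueError("inviter has no invites left")
    cur = conn.execute(
        "INSERT INTO users (username, invited_by, invites_remaining) VALUES (?, ?, ?)",
        (new_username, inviter_id, MONTHLY_INVITES),
    )
    conn.commit()
    return cur.lastrowid

def replenish_invites() -> None:
    """Run periodically (e.g. monthly): restore everyone's allowance."""
    conn.execute("UPDATE users SET invites_remaining = ?", (MONTHLY_INVITES,))
    conn.commit()
```

Because the inviter is stored for every account, walking up the chain to find who invited a misbehaving user (or an entire spambot subtree) remains a simple query.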

While this will certainly create a barrier to entry for signing up on lemm.ee, we are already one of the biggest instances, and I think at this point, such a barrier will do more good than harm.

5) Account requirements for specific activities

This is something that many admins and mods have been discussing for a while now, and I believe it would be an important feature for lemm.ee as well. Essentially, I would like to limit certain activities to users who meet specific requirements (maybe account age, number of comments, etc). These activities might include things like image uploads, community creation, perhaps even private messages.

This could, in theory, deter people from creating new accounts just to break rules (or laws).
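
As a rough sketch of what such gating might look like (the thresholds and activity names below are invented placeholders, not decided policy):

```python
# Hedged sketch of per-activity requirements; all thresholds are placeholders.
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

@dataclass
class Account:
    created_at: datetime
    comment_count: int

# (minimum account age, minimum number of comments) per restricted activity
REQUIREMENTS = {
    "upload_image":         (timedelta(days=14), 20),
    "create_community":     (timedelta(days=30), 50),
    "send_private_message": (timedelta(days=1), 0),
}

def may_perform(account: Account, activity: str) -> bool:
    """Allow the activity only once the account meets its requirements."""
    min_age, min_comments = REQUIREMENTS[activity]
    age = datetime.now(timezone.utc) - account.created_at
    return age >= min_age and account.comment_count >= min_comments
```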

6) Automated ML based NSFW scanning for all uploaded images

I think it makes sense to apply automatic scanning to all images before we save them on our servers, and if an image is flagged as NSFW, then we don’t accept the upload. While machine learning is not 100% accurate and will produce false positives, I believe this is a trade-off that we simply need to accept at this point. Not only will this help against any potential CSAM, it will also help us better enforce our “no pornography” rule.
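
In outline, the upload path would gain a step along these lines (just a sketch; the classifier call is a stand-in for whichever model or scanning service ends up being used, and the threshold is a made-up number):

```python
# Sketch of rejecting uploads before they ever reach storage; the classifier
# is a placeholder for a real NSFW model or an external scanning service.
NSFW_THRESHOLD = 0.7  # hypothetical cut-off; lower = stricter, more false positives

def nsfw_score(image_bytes: bytes) -> float:
    """Return the model's probability that the image is NSFW (placeholder)."""
    raise NotImplementedError("plug the actual classifier in here")

def accept_upload(image_bytes: bytes) -> bool:
    """Only write the image to our servers if it passes the scan."""
    return nsfw_score(image_bytes) < NSFW_THRESHOLD
```

Where exactly that threshold sits is the false-positive trade-off mentioned above.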

This would potentially also allow us to resume caching images from other instances, which will improve both performance and privacy on lemm.ee.


With all of the above in place, I believe we will be able to re-enable image uploads with a much higher degree of safety. Of course, most of these ideas come with some significant downsides, but please keep in mind that users posting CSAM present an existential threat to Lemmy (in addition to just being absolutely morally disgusting and actively harmful to the victims of the abuse). If the choice is between having a Lemmy instance with some restrictions, or not having a Lemmy instance at all, then I think the restrictions are the better option.

I also would appreciate your patience in this matter, as all of the long term plans require additional development, and while this is currently a high priority issue for all Lemmy admins, we are all still volunteers and do not have the freedom to dedicate huge amounts of time to working on new features.


As always, your feedback and thoughts are appreciated, so please feel free to leave a comment if you disagree with any of the plans or if you have any suggestions on how to improve them.

  • PlasmaDistortion@lemm.ee
    ↑128 ↓1 · 1 year ago

    Personally I say just leave hosting of images to dedicated sites for that purpose. Your efforts are better spent on dealing with how to render them. That being said, I used to be in charge of managing abuse on a site that has an average of 20 million posts a month (seriously).

    The way I essentially defeated these kinds of attacks was with an image scanning service. It scans for anything NSFW and blocks it. Sometimes things would make it through, but once an admin flagged it we could use that to block the user’s IP and account. It’s not cheap, but the volume is also not huge yet for lemm.ee so it might not be too bad.

    • TWeaK@lemm.ee
      ↑60 · 1 year ago

      This is my opinion also. Reddit turned to shit around the time they started self-hosting. Imgur only exists because people needed a place to host reddit images.

        • TWeaK@lemm.ee
          ↑36 · edited · 1 year ago

          No, but there’s nothing stopping you from using direct links from imgur, in traditional fashion.

          It’s a little bit convoluted, though. You have to post the image, then hover over and select “Get share links”, and then pick the option for BB code (forums). This has the [img] tags at the start and finish, but importantly it has the direct link to the image file. If you use this on lemmy then it will load in the instance, rather than directing to imgur itself.

          • JohnDClay@sh.itjust.works
            ↑9 · 1 year ago

            Imgur is deleting images over a certain age posted anonymously. And they might continue to decrease the number of images they keep to try to be closer to profitability. So that will be bad for longevity of content.

            • winterwulf@lemm.ee
              ↑3 · 1 year ago

              reddit is the new imgur. I post stuff to my reddit profile, grab the image link, and post it here. Let spez pay the bill for hosting our images.

              • tsonfeir@lemm.ee
                ↑0 · 11 months ago

                This is brilliant. Someone should make a way for us to provide login info to reddit that will just “log in” and “post” an image to some random private sub, then return the URL. A browser plugin would probably do this easily.

        • Hubi@feddit.de
          ↑15 · 1 year ago

          I’ve seen people link to uploads on Pixelfed, though this is probably not the intended use case.

        • Bongles@lemm.ee
          ↑1 · 1 year ago

          Not yet but I wish there was. I use imgur quite a lot and I like the idea of a fediverse version. Especially with the direction they’ve gone lately.

    • Franzia@lemmy.blahaj.zone
      ↑18 · 1 year ago

      Yeah genuinely we could all be hosting images for free or cheap on several image sites. Even NSFW images and videos! And it would save our instance admins a lot of headaches and probably some cost too.

    • JohnDClay@sh.itjust.works
      ↑9 ↓1 · 1 year ago

      Personally I say just leave hosting of images to dedicated sites for that purpose.

      They aren’t profitable, so they’ll eventually go down. If no one is looking at their site, why keep it going just to serve other sites?

  • bdesk@kbin.social
    ↑83 ↓1 · 1 year ago

    You forgot getting the authorities involved when somebody does upload CSAM.

    • nxfsi@lemmy.world
      ↑32 ↓1 · 1 year ago

      It’s a known tactic by trolls to upload cheese pizza and then notify the media/the authorities themselves because context doesn’t matter when it comes to CSAM

    • sunaurus@lemm.eeOPM
      ↑14 · 1 year ago

      The Lemmy.world team is getting some authorities involved already for this particular case. I am definitely in favor of notifying law enforcement or relevant organizations, and if anybody tries to use lemm.ee to spread such things, I will definitely be involving my local authorities as well.

    • TWeaK@lemm.ee
      ↑21 ↓10 · 1 year ago

      getting the authorities involved

      How do you imagine that playing out? This isn’t some paedophile ring trading openly, this is people using CSAM as an attack vector. Getting over-enthusiastic police involved is exactly their goal, and will likely do very little to help the victims in the CSAM itself.

      Yes, authorities should be notified and the material provided to the relevant agencies for examination. However that isn’t truly the focus of what’s happening here. There is no immediate threat to children with this attack.

      • WalkableProgrammer@lemmy.world
        ↑43 ↓2 · 1 year ago

        How do you imagine that playing out?

        FBI: Whoa that illegal

        Admin: Ya

        FBI: We’re going to look for this guy

        Admin: alright

        END ACT 1

        • TWeaK@lemm.ee
          ↑18 ↓6 · 1 year ago

          This isn’t something the FBI have much involvement with. The FBI deal with matters across states.

          This isn’t America, where you have a bunch of separate states unified under one American government. People haven’t been posting porn to lemm.ee. People have been posting porn to other instances, which has seeped through to lemm.ee.

          Getting the Estonian law enforcement involved is like trying to get the Californian government involved in dealing with a problem from Texas. Estonian law enforcement have no jurisdiction over lemmy.world or any other instance, and giving them an opportunity is only going to lead to locking down lawful association and communication in favour of some vague “think of the children” rhetoric. And, like I say, it won’t do anything to curtail the production of CSAM as the purpose of this attack has little to do with the promotion of CSAM.

          Frankly, it could easily be more like:

          lemm.ee: We’ve got a problem with illegal content

          Estonian law enforcement: Woah that’s illegal.

          Estonian law enforcement: You’ve admitted to hosting illegal content. We’re going to confiscate all your stuff.

          lemm.ee is shut down pending investigation.

          Meanwhile, if lemm.ee continues its current course of action, yet someone notifies law enforcement:

          Estonian law enforcement: Woah, we’ve got a report of something dodgy, that’s illegal.

          lemm.ee: People tried to post illegal content elsewhere that could have come to our site, we blocked and deleted it to the best of our ability.

          Estonian law enforcement: Fair enough, we’ll see what we can figure out.

          It really matters how and when the problem is presented to law enforcement. If you report yourself, they’re much more likely to take action against you than if someone else reports you. It doesn’t do yourself any favours to present your transgressions to them, not unless you’re absolutely certain you’re squeaky clean.

          At this stage and in these circumstances, corrective action is more important than reporting.

          • lagomorphlecture@lemm.ee
            ↑6 · 1 year ago

            You’re assuming that no American user saw any of the content. I think the FBI could absolutely get involved if the content was seen by anyone in the US, let alone by people in more than 1 state. I’m not going to pretend to be an expert on child abuse or cyber crimes but the FBI devotes massive resources to investigation of crimes against children and could potentially at least help other agencies investigate where this attack originated from. And if the FBI were able to determine that the attack originated from the US, I assure you the DOJ is far less kind to people who possess, commit or distribute that type of horrible child abuse than they are to rich old white men who commit a coup. You’re kind of acting like this is just another DDOS attack rather than the deliberate distribution of horrific images of child abuse to a platform that in no way encourages distribution of child abuse material.

            Anywhooooo the problem was much worse on lemmy.world since they were the main target of the attack. Does anyone know if they reported it?

            • barsoap@lemm.ee
              ↑1 · edited · 1 year ago

              Local authorities will be the contact point of the admins (or authorities of where the servers are hosted). They’ll investigate what they can and then ring up euro/inter/whatever pol as necessary to have other forces handle stuff in their respective jurisdictions. Cross-border law enforcement isn’t exactly uncharted waters, they’ve been doing it for quite a while.

              As to the current case the ball is clearly in the field of lemmy.world admins and their local authorities (Germany? Hetzner, I think, as so many) as they’re the ones with the IP logs. Even if the FBI gets a tip-off because an American saw anything they’re not exactly in a position to do anything but go via Interpol and ask the BKA if they’d like to share those IP logs.

  • ButtDrugs@lemm.ee
    ↑61 · 1 year ago

    For step 6 - are you aware of the tooling the admin at dbzero has built to automate the scanning of images in Lemmy instances? It looks pretty promising.

    • sunaurus@lemm.eeOPM
      ↑21 ↓1 · 1 year ago

      Yep, I’ve already tested it and it’s one of the options I am considering implementing for lemm.ee as well.

      • PriorProject@lemmy.world
        ↑5 · edited · 1 year ago

        It’s worth considering some commercially developed options as well: https://prostasia.org/blog/csam-filtering-options-compared/

        The Cloudflare tool in particular is freely and widely available: https://blog.cloudflare.com/the-csam-scanning-tool/

        I am no expert, but I’m quite skeptical of db0’s tool:

        • It repurposes a library designed for preventing the creation of synthetic CSAM using stable diffusion. This library is typically used in conjunction with prompt scanning and other inputs into the generation process. When run outside its normal context on non-AI images, it will lack all this input context, which I speculate reduces its effectiveness relative to the conditions under which it’s tested and developed.
        • AI techniques live and die by the quality of the dataset used to train them. There is not and cannot be an open-source test dataset of CSAM upon which to train such a tool. One can attempt workarounds, like separately detecting co-occurring features related to youth (trained on dataset A of non-sexualized images that include children) and sexuality (trained on dataset B containing only adult performers), but the efficacy of open-source solutions is going to be hamstrung by the inability to train, test, and assess the effectiveness of the open tools. Developers of major commercial CSAM scanners are better able to partner with NCMEC and other groups fighting CSAM to assess the effectiveness of their tools.

        I’m no expert, but my belief is that open tools are likely to be hamstrung permanently compared to the tools developed by big companies and the most effective solutions for Lemmy must integrate big company tools (or gov/nonprofit tools if they exist).

        PS: Really impressed by your response plan. I hope the lemmy.world admins are watching this post, I know you all communicate and collaborate. Disabling image uploads is, I think, a very effective temporary response until detection and response tooling can be improved.

        • iquanyin@lemmy.world
          ↑1 · 1 year ago

          you make some good points. this gave rise to a thought: seems like law enforcement would have such a dataset, and they should of course allow tools to be trained on it. but who knows? might be worth finding out.

          • Cubes@lemm.ee
            ↑1 · 1 year ago

            Tbh I’m kind of surprised no government has set up a service themselves to deal with situations like this since law enforcement is always dealing with CSAM, and it seems like it’d make their job easier.

            Plus with the flurry of hugely privacy-invading or anti-encryption legislation that shows up every few months under the guise of “protecting the children online”, it seems like that should be a top priority for them, right?! Right…?

            • PriorProject@lemmy.world
              ↑1 · edited · 1 year ago

              I replied to the parent comment here to say that governments HAVE set up CSAM detection services. I linked a review of them in my original comment.

              • They’ve set them up through commercial partnerships with technology companies… but that’s no accident. CSAM fighting orgs don’t have the tech reach of a major tech company so they ask for help there.
              • Those partnerships are limited to major/successful orgs… which makes it hard to participate as an OSS dev. But again, that’s on-purpose as the same access that would empower OSS devs to improve detection would enable CSAM producers to improve evasion. Secrecy is useful in this race, even if it has a high cost.

              Plus with the flurry of hugely privacy-invading or anti-encryption legislation that shows up every few months under the guise of “protecting the children online”, it seems like that should be a top priority for them, right?! Right…?

              This seems like inflammatory bait but I’ll bite once.

              • Improving CSAM detection is absolutely a top priority of these orgs, and in the last 10y the scope and reach of the detection tools they’ve created with partners has expanded in reach from scanning zero images to scanning hundreds of millions or billions of images annually. It’s a fairly massive success story even if it’s nowhere near perfect.
              • Building global internet infrastructure to scan all/most images posted to the internet is itself hugely privacy invading even if it’s for a good cause. Nothing prevents law-makers from coopting such infrastructure for less noble goals once it’s been created. Lemmy is in desperate need of help here, and CSAM detection tools are necessary in some form, but they are also very much scary scary privacy invading tools that are subject to “think of the children” abuse.

              • Cubes@lemm.ee
                ↑1 · 1 year ago

                Good info! Fwiw, I wasn’t intending for it to be “inflammatory bait”, but a jab at the congresspeople who use “for the children” as a way to sneak in bad legislation instead of actually doing things that could protect children

          • PriorProject@lemmy.world
            ↑1 · 1 year ago

            I’m not sure I follow the suggestion.

            • NCMEC, the US-based organization tasked with fighting CSAM, has already partnered with a list of groups to develop CSAM detection tools. I’ve already linked to an overview of the resulting toolsets in my original comment.
            • The datasets used to develop these tools are private, but that’s not an oversight. The datasets are… well… full of CSAM. Distributing them openly and without restriction would be contrary to NCMEC’s mission and to US law, so they limit the downside by partnering only with serious/capable partners who are able to commit to investing significant resources to developing and long-term maintaining detection tools, and who can sign onerous legal paperwork promising to handle appropriately the access they must be given to otherwise illegal material to do so.
            • CSAM detection tools are necessarily a cat and mouse game of CSAM producers attempting to evade detection vs detection experts trying to improve detection. In such a race, secrecy is a useful… if costly… tool. But as a result, NCMEC requires a certain amount of secrecy from their partners about how the detection tools work and who can run them in what circumstances. The goal of this secrecy is to prevent CSAM producers from developing test suites that allow them to repeatedly test image manipulation strategies that retain visual fidelity but thwart detection techniques.

            All of which is to say…

            … seems like law enforcement would have such a data set and seems they should of course allow tools to be trained on it. seems but who knows? might be worth finding out.)

            Law enforcement DOES have datasets, and DOES allow tools to be trained on them… I’ve linked the resulting tools. They do NOT allow randos direct access to the data or tools, which is a necessary precaution to prevent attackers from winning the circumvention race. A Red Hat or Mozilla scale organization might be able to partner with NCMEC or another organization to become a detection tooling partner, but db0, sunaurus, or the Lemmy devs likely cannot without the support of a large technology org with a proven track record of delivering and maintaining successful/impactful technology products. This has the big downside of making a true open-source detection tool more or less impossible… but that’s a well-understood tradeoff that CSAM-fighting orgs are not likely to change, as the same access that would empower OSS devs would empower CSAM producers. I’m not sure there’s anything more to find out in this regard.

          • barsoap@lemm.ee
            ↑1 · edited · 1 year ago

            If you have publicly available detection tools you can train models based on how well stuff they generate triggers those models, i.e. train an AI to generate CSAM (distillation in AI lingo). It also allows training of adversarial models which can imperceptibly change images to foil the detection tools. There’s no way to isolate knowledge and understanding so none of it is public and if you see public APIs they’re behind appropriate rate-limiting etc. so that you can’t use them for that purpose.

        • barsoap@lemm.ee
          ↑1 ↓1 · 1 year ago

          The neat thing is that it’s all much easier as lemm.ee doesn’t allow porn: The filter can just nuke nudity with extreme prejudice, adult or not.

    • Franzia@lemmy.blahaj.zone
      ↑16 ↓1 · 1 year ago

      It seems promising but also incomplete for US hosts, as our laws do not allow deletion of CSAM; rather, it must be saved and preserved and sent to a central authority, and not deleted until they give the okay. Rofl.

      I also wonder if this solution will use PHash or other hashing to filter out known and unaltered CSAM images (comparing hashes derived from the images rather than the images themselves).
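
      Something along those lines might look like the following in Python with the imagehash library (the hash list below is a made-up placeholder; real deployments would match against hashes supplied by a central authority rather than store anything themselves):

      ```python
      # Sketch of perceptual-hash matching; the known-hash set is a fake placeholder.
      import imagehash
      from PIL import Image

      KNOWN_BAD_HASHES = {imagehash.hex_to_hash("8f373714acfcf4d0")}  # placeholder
      MAX_DISTANCE = 4  # Hamming distance still treated as "the same image"

      def matches_known_image(path: str) -> bool:
          h = imagehash.phash(Image.open(path))
          return any(h - known <= MAX_DISTANCE for known in KNOWN_BAD_HASHES)
      ```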

    • ApeNo1@lemm.ee
      ↑1 · 1 year ago

      I blocked botart from their instance as some pretty disturbing stuff was added in the last few days.

  • net00@lemm.ee
    ↑55 ↓2 · 1 year ago

    IMO Lemmy shouldn’t have media uploading of any kind. Aside from the CSAM risk, it’s unsustainable, and I think one of the reasons Reddit went to shit is that it got into the whole image/video/gif hosting business.

    Dozens of media hosts exist out there, and the mobile/web clients should focus instead on showing remote content better.

    • billy@catata.fish
      ↑37 ↓1 · 1 year ago

      The flip side of the argument is that if you also host the media, you are not at risk of having broken links. I’ve seen a number of long-running forums whose post bodies contained external images that are now broken.

      Of course an argument can be made that the only reason that those forums have lived for so long was due to not having costs associated with hosting media.

      • TWeaK@lemm.ee
        ↑11 ↓1 · 1 year ago

        That’s no worse than a reddit link getting borked because it’s been cross-posted and someone managed to kill the original link with a DMCA notice.

        • billy@catata.fish
          ↑7 · edited · 1 year ago

          I would say that is a different issue. DMCA could go to whatever external host as well so that doesn’t change.

          My argument was about putting faith in external providers to stay alive to continue hosting media. You can also get in a situation where an external provider decides to do a mass delete like what Imgur did this past summer.

        • DMmeYourNudes@lemmy.world
          ↑6 · 1 year ago

          A post getting removed because someone threatened legal action is not the same as using an image host that goes under because no one visits their site to see the ads that pay for hosting it, or because they arbitrarily purged their content or changed their link format like imgur has. Unless Lemmy hosts its own images, it will be at risk of being purged, as has happened many times over.

    • Anonymousllama@lemmy.world
      ↑2 · 1 year ago

      I get we don’t trust these third party image hosting sites, but if it’s that or having local images that can potentially bring down instances, I’d say that’s a no-brainer of a compromise.

      These upload sites like imgur automatically handle image detection and take the load off smaller servers. It seems like a perfect solution for now.

    • kinttach@lemm.ee
      ↑2 · 1 year ago

      There is a privacy and tracking concern with loading images from 3rd-party hosts vs lemm.ee hosting or re-hosting them.

  • eee@lemm.ee
    ↑53 ↓3 · edited · 1 year ago

    Please please do not implement an invite system.

    The success of a forum like this depends on people being able to join and express their thoughts freely. Reddit and Digg would never have gotten where they are if they had a closed system.

    I almost didn’t join lemmy because the first two instances I heard about (lemmy.ml and beehaw) had closed registration. I think I applied and then forgot about it for 2 weeks. Thankfully I saw a post about lemmy on reddit yet again and finally found an open instance.

    Don’t let the actions of a few scumbags ruin a good thing for everyone. You’ll be giving them exactly what they want.

    • sunaurus@lemm.eeOPM
      ↑44 ↓2 · 1 year ago

      I agree that users should be able to join Lemmy freely, but I think it makes a lot of sense to try and spread users out more between instances - this spreads out the responsibilities between more admins, spreads out the load between more servers and also reduces the chance of a single point of failure for the whole system.

      It’s clear that there are seriously vile people out there who want to cause huge amounts of damage to Lemmy, and if we have unlimited growth in a few selected instances, then these people only have to target those specific instances for maximum damage.

      In a perfect world, none of this would be necessary, but then again, in a perfect world, we wouldn’t need a decentralized platform in the first place.

      • eee@lemm.ee
        ↑12 · edited · 1 year ago

        Thanks for responding!

        I agree that it’s best for the lemmyverse.net if there are many big instances too.

        Unfortunately, the concept of the fediverse isn’t as easy to understand. The average newcomer (who mostly just wants to consume content and occasionally ask a question or two) starts off by interacting within their instance, and it takes some time to figure out cross-instance communication (there are still posts about this on the nostupidquestions-type communities). For such users, landing on a small instance means they’ll poke around the Local active posts, think that “this forum is dead”, and never return.

        Like reddit, having a large userbase on lemmyverse is important to keep the conversation interesting (see https://i.imgur.com/4tXHAO0.png). Reddit has provided lemmy with a huge shot at success by injecting a large number of users. But if I’m being honest, the conversation on the lemmyverse isn’t as diverse and engaging as it is on reddit yet. This isn’t self-sustaining yet. I can point to 2 pieces of evidence to support this:

        1. Using Voat as an (imperfect) proxy - I don’t know if there are official stats of Voat, but the best dataset I’ve seen for Voat (https://ojs.aaai.org/index.php/ICWSM/article/download/19382/19154/23395) has 16.2M comments in 2.3M submissions from 113k users. Voat was shut down for lack of funding, but even in its heyday it wasn’t exactly thriving - many people on Voat were united in their toxicity and it never really got going. Compare these numbers to the lemmyverse, which has about 100k active users over the last 6 months. If the fediverse is to grow beyond “that niche forum for nerds”, this userbase isn’t enough.

        2. It’s already clear that the number of active users is decreasing - since mid-July, the number of monthly active users has dropped from 70k to 50k. This is expected (bunch of redditors who joined in June, poked around and said hi and left), but it means if the lemmyverse wants to have any chance of succeeding long term, you can’t alienate new users now.

        The approach I’ve been advocating since the beginning of lemmy is:

        • if you see a user who’s interested in lemmy but isn’t really tech savvy, just point them to one of the biggest instances. Don’t explain what federation is, leave it as a feature to be discovered once they’re engaged.
        • if you see a user who’s interested in the concept of a fediverse and wants to know how it works, explain federation and send them to a smaller instance.

        The way federation works now, it’s still disadvantageous to be on a smaller instance (discoverability of new communities is harder, syncing posts/comments isn’t always fast, it’s hard to know which community is more active). Many of these can be fixed with changes to ActivityPub and the Lemmy protocol, but in the meantime, sending casual users to small instances means they’ll likely never return.

        So to sum up, I think there should be an avenue for casual users to join the biggest instances, even as we encourage people to move to smaller ones (either targeting those who are more tech savvy, or those who have already been on Lemmy long enough to know how it works - I myself was on Lemmy.world and switched to this “smaller” instance).

        Anyway, you’re the admins here and I have no say over what you eventually do. I’m just hoping you’ll consider the practical realities of user behavior - everyone wants what’s best for the fediverse in the long term.

    • Blaze@discuss.tchncs.de
      ↑28 ↓3 · 1 year ago

      If I may, lemm.ee is now the second biggest instance. Redirecting people to register on local instances (feddit.country) or generalist ones (reddthat.com, Lemmy.today, discuss.online etc.) could be reasonable to make those ones grow as well.

      I agree that there should be a clear list of instances open for registration, but that probably needs to wait for the dust to settle a bit beforehand.

    • lagomorphlecture@lemm.ee
      ↑14 ↓3 · 1 year ago

      While I understand your concerns, this instance has gotten a fair bit larger and will start to suffer the same issues that lemmy.world does if registrations aren’t curbed. It can’t grow infinitely. That just isn’t feasible for one server. Having closed registrations on lemm.ee doesn’t stop anyone from signing up on different instances. A solution might be to temporarily limit registration here in some way, and for the devs and instance admins to find a better way of helping new users choose an instance. The initial sign up process was confusing, and could be streamlined to make it easier for people to choose an instance. In the long term, enhancing the way federation works so users who do sign up on smaller/newer instances don’t need to be lemmy savvy to find content would also help alleviate that type of issue.

    • iquanyin@lemmy.world
      ↑2 ↓1 · 1 year ago

      i get your point but some folks aren’t that put off by it, assuming they can ask for an invite and it doesn’t take ten years. i had to work at it a bit over on reddit but i took my time and just wrote about the difficulties and in a couple weeks hey, i got an invite. i’d prefer a nicer community once i’m in over a quick and easy entry that sucks thereafter (or is just chaotic and unhappy periodically). it’s like your house. do you just let everyone in from fear of being lonely? probably not. probably, if you’re not an outlier, you’ve taken steps to make it a bit hard for anyone not invited to enter. and it makes your home a better place to be.

  • AFK BRB Chocolate@lemmy.world
    ↑33 ↓1 · 1 year ago

    All of this seems good to me except 4 - I hate the thought of any instances being invite only. I’d much prefer it was just a verified user approach (even just an email) with a waiting period for doing things like posting images. Maybe even limit newish users after that period to a small number of image posts a day.

    Making an instance feel like a club is going to turn off a lot of people. For sure do what you need to do, but I hope you can avoid that one.

    • CoderKat@lemm.ee
      ↑10 ↓1 · 1 year ago

      Strongly agreed. Lemmy needs to grow. I badly miss many smaller communities that are only viable with Reddit’s size. Making prominent instances invite only (or requiring approvals or closing sign ups entirely – as some other instances have done) is just going to hurt Lemmy as a whole.

      Treating new accounts with a lot more scrutiny makes sense to me. We could require the first few comments to have mod approval to even show them (probably more of a per community setting since it would likely have to depend on community mods), restrict images for some period, have more aggressive content filters on young accounts, etc.

    • eyy@lemm.ee
      ↑4 ↓1 · 1 year ago

      I agree. The lemmyverse still needs to grow to have any chance of lasting beyond the next year or so.

      • Lyrl@lemm.ee
        ↑6 ↓1 · 1 year ago

        If volunteer admins are at their limits, tools to enable admins to manage larger communities need to come before further growth. Yes, lemmy needs an order of magnitude of growth to be able to seriously compete on content, but outgrowing admin capacity is not a sustainable path.

        • iquanyin@lemmy.world
          ↑3 ↓1 · 1 year ago

          yes. very sound thinking there. many problems in life can be skipped by some forethought and timing. don’t invite folks for dinner before your kitchen is fully built.

  • zeus ⁧ ⁧ ∽↯∼@lemm.ee
    ↑28 ↓1 · edited · 1 year ago

    thank you for your work sunaurus, and i’m sorry you had to sort through this

    (particularly annoying though, as i never got around to adding a user banner; and i had one in mind as well. i wish there was some way to externally host avatars and banners)

  • Awoo [she/her]@hexbear.net
    ↑30 ↓3 · edited · 1 year ago

    Forums have existed on the internet forever and have already dealt with this thousands of times previously. You don’t need to overthink it or reinvent the wheel. It didn’t stop forums existing very comfortably in the past and isn’t an issue that should be that different to deal with today.

    Simply limit image uploads to a certain account age threshold and karma threshold and you will eliminate 99% of the ability to abuse this.

    • AbsolutelyNotABot@lemm.ee
      ↑5 ↓2 · 1 year ago

      Forums have existed on the internet forever and and have already dealt with this thousands of times previously

      The main difference is that forums aren’t federated. On Lemmy you not only need to keep internal users in check, but also external instances, and as everyone can host one, federation adds extra complexity.

      • Awoo [she/her]@hexbear.net
        ↑4 ↓1 · 1 year ago

        Not really. We’ve had forums that literally allow you to post to them without even signing up with an account. Without being a “user” at all. This isn’t about “checking” anyone, it’s simply about limiting its ability to be used as a troll tool below the point at which it becomes too tedious to bother. At that point you have eliminated 99% of it.

        This CSAM poster is 1 single person among hundreds of thousands. Making it too tedious to perform eliminates them along with the problem entirely.

    • iByteABit [he/him]@lemm.ee
      ↑2 · 1 year ago

      I disagree with karma as a concept, but I agree that there should be restrictions of this kind in place. It’s not user friendly, but if it minimizes the chance of someone uploading sick stuff on Lemmy then I support it.

    • thangcuoi@lemm.ee
      ↑1 · 1 year ago

      Also, maybe some kind of autohide/minimise for posts that have been downvoted past a certain threshold.

      Let users contribute to a central database similar to Sponsorblock. Posts flagged by power/trusted users would be immediately hidden pending review.

  • Fibby@lemm.ee
    ↑25 ↓2 · 1 year ago

    I’m going to be a part of an invite only community?! Of course, given the circumstances, this is pretty fucked. But I feel kinda fancy right now.

    Thanks for all you do on lemm.ee

  • LoboAureo@lemm.ee
    ↑24 ↓1 · 1 year ago

    I left Twitter before Musk, when the security chief said that they knew they had CP but were doing nothing about it.

    I can forgive a measure that doesn’t work as expected or isn’t 100% effective, but not inaction.

    Therefore I agree with any measure you think can work, despite any inconvenience for me.

    Sorry for any misspelled or wrong word, English isn’t my main language

    Regards and thanks for all your efforts.

    • infinipurple@lemm.ee
      ↑2 · 1 year ago

      Your English is flawless and your sentiment is echoed. The last thing we should do is to ignore the problem.

  • thegiddystitcher@lemm.ee
    ↑18 · 1 year ago

    This has been a great instance since day one, and it’s good to see you once again being so proactive. Thank you for the update!

    There are downsides with all kinds of moderation, but ultimately most of us accept that the internet can’t function as a true free-for-all. Absolutely in support of whatever you feel is necessary to keep the server safe, but please watch out for yourself too and make sure you’re asking for help where needed.

    p.s. anyone reading this who doesn’t donate to the server yet, here’s a reminder that that’s a thing you can do.

  • gelframe@lemm.ee
    ↑17 · 1 year ago

    Could you post a guide on disabling the local image cache? I compile from scratch so I’m not afraid of making changes in the code, I just don’t really know rust. I shut down my personal instance and this would allow me to turn it back on.

  • NuPNuA@lemm.ee
    ↑18 ↓2 · 1 year ago

    Got to be honest, having an invite based system and locking certain features behind account age, karma, etc seems like the opposite of the freedom everyone promised me the Fediverse represented when we moved over.

    I personally don’t really care about images and would prefer image uploads just stay deactivated and we operate as a text only forum but with open membership.

    • sunaurus@lemm.eeOPM
      ↑11 ↓2 · 1 year ago

      Leaving image uploads completely disabled would also be an option to fight this particular type of attack, but there are also other issues with open registrations. For example, while our sign-up captcha seems to be preventing automated registrations, we are still having to ban advertiser accounts almost daily. I think an invite system would really help to reduce sign-ups by any kind of users intending to abuse the system.

      • infinipurple@lemm.ee
        ↑1 ↓1 · 1 year ago

        I’m all for an invite-based system, although we will need some way of combating ‘invite trees’, where one bad actor invites several others, who subsequently invite an exponentially increasing number. A reasonable delay on the invite allowance would go a long way, I think.

        • ToxicWaste@lemm.ee
          ↑7 · 1 year ago

          I have to say that an invite based signup system makes my toenails curl backwards. IMO this will let instances die out slowly. I didn’t know anyone using lemmy and just stumbled upon it. ppl like me won’t ever be able to join an instance if it is invite only.

          Don’t misunderstand me: I do understand how critical it is for the operators of instances to protect themselves. Lemmy is a rather young project and still needs better admin tools. However, there are some good discussions happening on GitHub. Until the operators and admins have the tooling to protect themselves, I see disabling img upload as preferable. It also took reddit some time to allow uploading images, instead of linking them.

          • 1024_Kibibytes@lemm.ee
            ↑4 · 1 year ago

            I 100% agree! An invite-based system means that a new user has to find some way of contacting someone in order to request an invite. I think that only allowing X posts per day for e.g. the first week or 2 for new accounts would be a way to combat companies and spammers. Not allowing images or limiting image posts for new accounts, and using automated CSAM detection methods, which I understand are in the works, seems to be a good way to combat that problem.

  • coffee@lemm.ee
    ↑17 ↓1 · 1 year ago

    This is something that many admins and mods have been discussing for a while now, and I believe it would be an important feature for lemm.ee as well. Essentially, I would like to limit certain activities to users who meet specific requirements (maybe account age, number of comments, etc). These activities might include things like image uploads, community creation, perhaps even private messages.

    Sounds like the old karma requirements some reddit subs had. While I’m not against that, it would restrict locally registered users more so than others who are posting on lemm.ee communities when their host instance has no such system in place. I’m aware that if they post images those would be uploaded to their home instance and linked here with the patch you mentioned above, but the downside is that local users might feel inconvenienced more so than others. Not saying it’s a bad idea though, if we are thinking from a “protect lemm.ee” angle first and foremost.

    Automated ML based NSFW scanning for all uploaded images

    You might want to reach out to the dev of Sync for Lemmy, ljdawson on !syncforlemmy@lemmy.world, who just implemented an anti-NSFW upload feature in the app to do his part. Essentially, Sync users currently can’t post any kind of porn. While I don’t think that the CP spammers were using his particular app, or any app to begin with, I do think it’s a neat feature to have, but it would make much more sense to run server-side.

    • eyy@lemm.ee
      ↑6 · 1 year ago

      he just implemented an anti-NSFW upload feature in the app to do his part. Essentially, Sync users currently can’t post any kind of porn

      but what about normal, legal, NSFW material?

      • barsoap@lemm.ee
        ↑1 ↓1 · 1 year ago

        Not allowed on lemm.ee in the first place. Well, you can see NSFW posts and subscribe to everything on lemmynsfw.com but you’re not supposed to post any porn from a lemm.ee account.

        Policing NSFW is a whole can of worms, it makes sense to leave it to specialised instances. They can nuke political drama from orbit, we can nuke nudity from orbit, both saving mod bandwidth to do the other thing right.

  • Varyag@lemm.ee
    ↑15 · 1 year ago

    I like almost everything in this plan, except for the last 2 items. The account requirements for “extra activities” had best be chosen carefully so as not to encourage the good old “karma farming” that we got away from in leaving Reddit.
    And the ML thing for recognizing NSFW is also something to be considered carefully. Too strict and it gets annoying with false positives and can restrict posting actual content; too lax and it won’t make a difference for the people actually looking to circumvent it. I think a “vetting” system like the previous item could be better in the long run, in only letting “trusted” people upload content.