• rtxn@lemmy.world · 2 days ago

    Anubis is a simple anti-scraper defense that weighs a web client’s soul by giving it a tiny proof-of-work workload (a calculation that has no efficient shortcut, like a cryptographic hash puzzle) before letting it pass through to the actual website. The workload is insignificant for human users, but very taxing for high-volume scrapers. The calculations are done on the client’s side using JavaScript code.

    (edit) For clarification: this works because the computation workload takes a relatively long time, not because it bogs down the CPU. Halting each request at the gate for only a few seconds adds up very quickly.
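
    To make the mechanism concrete, here’s a minimal sketch of a hash-based proof-of-work challenge. The scheme (SHA-256, leading-zero difficulty) is illustrative, not Anubis’s exact protocol: the client burns CPU searching for a nonce, and the server verifies the result with a single hash.

    ```typescript
    // Runs anywhere the Web Crypto API is available (browsers, Node 19+).
    async function sha256Hex(input: string): Promise<string> {
      const digest = await crypto.subtle.digest(
        "SHA-256",
        new TextEncoder().encode(input),
      );
      return [...new Uint8Array(digest)]
        .map((b) => b.toString(16).padStart(2, "0"))
        .join("");
    }

    // Client side: brute-force a nonce so the hash starts with `difficulty`
    // zero hex digits. Expected work grows ~16x per extra digit.
    async function solve(challenge: string, difficulty: number): Promise<number> {
      const target = "0".repeat(difficulty);
      for (let nonce = 0; ; nonce++) {
        if ((await sha256Hex(challenge + nonce)).startsWith(target)) {
          return nonce; // a few seconds once for a human, repeated millions of times for a scraper
        }
      }
    }

    // Server side: a single hash confirms the work was actually done.
    async function verify(challenge: string, difficulty: number, nonce: number): Promise<boolean> {
      return (await sha256Hex(challenge + nonce)).startsWith("0".repeat(difficulty));
    }
    ```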

    Recently, the FSF published an article that likened Anubis to malware because it’s basically arbitrary code that the user has no choice but to execute:

    […] The problem is that Anubis makes the website send out a free JavaScript program that acts like malware. A website using Anubis will respond to a request for a webpage with a free JavaScript program and not the page that was requested. If you run the JavaScript program sent through Anubis, it will do some useless computations on random numbers and keep one CPU entirely busy. It could take less than a second or over a minute. When it is done, it sends the computation results back to the website. The website will verify that the useless computation was done by looking at the results and only then give access to the originally requested page.

    Here’s the article, and here’s aussie linux man talking about it.

    • The Quuuuuill@slrpnk.net · 2 days ago

      fwiw Anubis is working on a more respectful update; this was their first-pass solution for what was basically a break-glass emergency. I understand the FSF’s concern, but Anubis is the only thing making a free and open internet remotely possible right now, and far better it than nightmare fuel like Cloudflare.

        • rtxn@lemmy.world · 2 days ago
          • A web server that can’t discriminate between a request made by a human and one made by a machine has to handle all requests. It may not be an issue for large companies like Amazon or Microsoft, but small websites will suffer timeouts and outages.
          • Without a locally hosted solution like Anubis, small websites would have to move behind a large centralized service like Cloudflare.
          • Otherwise they might not be able to continue operating and only large corporate-backed services like Twitter and Reddit would survive.

          The alternative is having to choose between Reddit and Cloudflare. Does that look “free” and “open” to you?

          • daniskarma@lemmy.dbzer0.com · 2 days ago

            That whole argument rests on two wrong suppositions.

            It assumes that websites are under constant DDoS, and that they cannot exist without DDoS protection.

            This is false.

            It assumes that Anubis is effective against DDoS attacks. It is not. It’s a mitigation, but any DDoS attack worth its name would have no trouble bringing down a site running Anubis, since the server still has to handle every request, even if the requests are smaller.

            Anubis’s only use case is making AI scrapers consume more energy while scraping, while also making many legitimate users consume more energy. It’s just being promoted in the anti-AI wave, but I don’t really see much usefulness in it.

            • rtxn@lemmy.world · 2 days ago

              It assumes that websites are under constant DDoS

              It is literally happening. https://www.youtube.com/watch?v=cQk2mPcAAWo https://thelibre.news/foss-infrastructure-is-under-attack-by-ai-companies/

              It assumes that Anubis is effective against DDoS attacks

              It’s being used by some little-known entities like the LKML, FreeBSD, SourceHut, UNESCO, and the fucking UN, so I’m assuming it probably works well enough. https://policytoolbox.iiep.unesco.org/ https://xeiaso.net/notes/2025/anubis-works/

              anti-AI wave

              Oh, you’re one of those people. Enough said. (edit) By the way, Anubis’ author seems to be a big fan of machine learning and AI.

              (edit 2 just because I’m extra cross that you don’t seem to understand this part)

              Do you know what a web crawler does when a process finishes grabbing the response from the web server? Do you think it takes a little break to conserve energy and let all the other remaining processes do their thing? No, it spawns another bloody process to scrape the next hyperlink.
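
              To illustrate, here’s a deliberately naive caricature of that behavior (hypothetical code, not any specific crawler): every finished fetch immediately fans out into more fetches, so a per-request proof-of-work cost gets paid over and over.

              ```typescript
              // A caricature of a scraper with no rate limiting: each fetched
              // page immediately spawns a fetch for every hyperlink it contains.
              async function crawl(url: string, seen = new Set<string>()): Promise<void> {
                if (seen.has(url)) return;
                seen.add(url);

                const html = await (await fetch(url)).text();
                // crude link extraction, just enough to show the fan-out
                const links = [...html.matchAll(/href="(https?:\/\/[^"]+)"/g)].map((m) => m[1]);

                // no pause, no budget: every link becomes a new concurrent request,
                // and with a PoW gate in front, every one of them pays the toll
                await Promise.all(links.map((link) => crawl(link, seen)));
              }
              ```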

              • daniskarma@lemmy.dbzer0.com · 2 days ago

                Some websites being under DDoS attack ≠ all sites being under constant DDoS attack, nor does it mean they cannot exist without protection.

                First, there’s a logical fallacy in there: being used by someone does not mean it’s useful. Many companies use AI for some task; does that make AI useful? No.

                The logic still stands: all Anubis can do against DDoS is raise the barrier a little before the site goes down. That’s called mitigation, not protection. If you are targeted by a DDoS, that mitigation is not going to do much, and your site is going down regardless.

                • CanadaPlus@lemmy.sdf.org · 2 days ago

                  If a request is taking a full minute of user CPU time, it’s one hell of a mitigation, and anybody who’s not a major corporation or government isn’t going to shrug it off.
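
                  Back-of-envelope, with illustrative numbers rather than measurements: a scrape that re-solves a 60-second challenge for every page pays a cost that only a very large operation can absorb.

                  ```typescript
                  const pages = 1_000_000;   // pages a large scrape might touch
                  const powSeconds = 60;     // per-request CPU cost, per the comment above
                  const cpuDays = (pages * powSeconds) / 86_400;
                  console.log(cpuDays.toFixed(0)); // ≈ 694 CPU-days, for a single site
                  ```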

                  • daniskarma@lemmy.dbzer0.com · 2 days ago

                    Precisely, that’s my point. It fits a very small risk profile: people who are going to be DDoSed, but not by a big actor.

                    That’s not the most common risk profile. Usually DDoS attacks are either very heavy or don’t happen at all; these “half-throttle” attacks are not really common.

                    I think that’s why, when I read about Anubis, it’s never in the context of DDoS protection. It’s always in the context of “let’s fuck AI”, like this precise comment thread.

    • CanadaPlus@lemmy.sdf.org · 2 days ago

      Well, that’s a typically abstract, to-the-letter take on the definition of software freedom from them. I think the practical necessity of doing something like this, especially for services like Invidious that are at risk, and the fact that it’s a harmless nonsense calculation, really deserve an exception.

      • rtxn@lemmy.world · 2 days ago

        Correct. Anubis’ goal is to decrease the web traffic that hits the server, not to prevent scraping altogether. I should also clarify that this works because it costs the scrapers time with each request, not because it bogs down the CPU.