With the proof-of-work approach, at least the client is forced to consume some resources, though the 'right' difficulty is a tricky question: either it's so trivial it hardly matters to the scrapers, or it's hard enough to put a dent in the scrapers' budget, in which case humans on low-end devices are royally screwed…
Here the crawler simply schedules a resumption and moves on to other work. It doesn't need the page right now, so waiting costs it nothing.
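To make that concrete, here's a minimal sketch of how a crawler might handle it: when a page answers with a timed challenge, push the URL onto a delay queue and keep working on other URLs. Everything here (the `Frontier` class and its methods) is hypothetical illustration, not any real crawler's code.

```python
# Hypothetical crawler frontier: defer challenged URLs, keep crawling.
import heapq
import time

class Frontier:
    def __init__(self):
        self._ready = []    # URLs fetchable right now
        self._delayed = []  # min-heap of (resume_at, url)

    def add(self, url):
        self._ready.append(url)

    def defer(self, url, delay_s):
        # Page served a meta-refresh challenge: schedule a resumption
        # after the refresh interval instead of blocking on it.
        heapq.heappush(self._delayed, (time.monotonic() + delay_s, url))

    def next_url(self):
        # Promote any deferred URLs whose wait has elapsed.
        now = time.monotonic()
        while self._delayed and self._delayed[0][0] <= now:
            _, url = heapq.heappop(self._delayed)
            self._ready.append(url)
        return self._ready.pop(0) if self._ready else None
```

The point is that the delay costs the crawler nothing but a heap entry; it never sits idle waiting for the refresh timer.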
https://anubis.techaro.lol/docs/admin/configuration/challenges/metarefresh/
Apparently it just tells your browser to refresh after x seconds.
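For anyone unfamiliar, a page that tells the browser to refresh itself after a delay looks roughly like this (an illustrative fragment only, not Anubis's actual markup):

```html
<!-- Illustrative only; refreshes the current page after 5 seconds. -->
<meta http-equiv="refresh" content="5">
```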
That's just protection by obscurity, then. Any targeted attack could pass that challenge at essentially zero cost.
Seems utterly pointless though…
Which doesn't work in some browsers, but that's acceptable, I guess.