this post was submitted on 04 Jul 2025

Selfhosted


A place to share alternatives to popular online services that can be self-hosted without giving up privacy or locking you into a service you don't control.


Every time I check my nginx logs there are more scrapers than I can count, and I couldn't find any good open-source solutions.

gandalf_der_12te@discuss.tchncs.de · 2 points · 2 days ago (edited)

What's bothering you?

  • Is it that your content is being handed out as AI training data? You can't fundamentally protect against that, except by limiting how much content is served to each address.
  • Or is it the resource strain it puts on your server? In that case I'd recommend limiting how many requests a single client / IP address can make in a day (see the sketch below).
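
Since the OP mentions nginx, here is a minimal sketch of per-IP request limiting with nginx's ngx_http_limit_req_module. The zone name, rate, hostname, and upstream address are placeholders, and nginx expresses rates per second or per minute rather than per day, so a sustained low rate only approximates a daily cap; the numbers would need tuning for your instances.

```nginx
# inside the http {} block: a shared zone keyed by client IP,
# allowing a sustained 10 requests/second per address
limit_req_zone $binary_remote_addr zone=perip:10m rate=10r/s;

server {
    listen 80;
    server_name example.org;               # hypothetical hostname

    location / {
        # queue up to 20 extra requests, reject the rest
        limit_req zone=perip burst=20 nodelay;
        limit_req_status 429;              # tell clients they are rate-limited
        proxy_pass http://127.0.0.1:8080;  # hypothetical upstream (your instance/frontend)
    }
}
```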
DrunkAnRoot@sh.itjust.works · 4 points · 2 days ago

It's the strain of it. I mostly run instances and frontends, so the training isn't a huge problem.

gandalf_der_12te@discuss.tchncs.de · 3 points · 2 days ago (edited)

The keyword you need is "DDoS protection", I guess.

It keeps the server from getting overloaded by too many requests.
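
Besides rate limiting, one simple form of overload protection is capping concurrent connections per client. A minimal nginx sketch using the ngx_http_limit_conn_module; the zone name, limit, hostname, and upstream are placeholders and would need tuning:

```nginx
# inside the http {} block: track open connections per client IP
limit_conn_zone $binary_remote_addr zone=peraddr:10m;

server {
    listen 80;
    server_name example.org;               # hypothetical hostname

    location / {
        limit_conn peraddr 10;             # at most 10 open connections per IP
        limit_conn_status 429;             # reject extras instead of letting them pile up
        proxy_pass http://127.0.0.1:8080;  # hypothetical upstream (your instance/frontend)
    }
}
```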