Thank you for this! Awesome work!
By the way, this looks easy to put in a container. Have you considered doing that?
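For what it's worth, a minimal sketch of what that container could look like, assuming the tool is a Python script with a requirements.txt at the repo root (adjust names to the actual repo layout):

```dockerfile
# Hypothetical layout: a Python entry script plus requirements.txt at the repo root.
FROM python:3.11-slim

WORKDIR /app

# Install dependencies first so this layer caches between code changes.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY . .

# Guessing the entry point; substitute the project's actual main script.
CMD ["python", "main.py"]
```

If it needs a GPU, you'd swap in an nvidia/cuda base image and run the container with the NVIDIA container toolkit on the host.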
Amazing work.
Just going to argue on behalf of the other users who apparently know way more than you and I do about this stuff:
WhY nOt juSt UsE thE FBi daTaBaSe of CSam?!
(because one doesn’t exist)
(because if one existed it would either be hosting CSAM itself or storing just the hashes of files - hashes which won't match if even one bit is changed due to transmission data loss / corruption, automated resizing from image hosting sites, etc; see the quick demo below)
(because this shit is hard to detect)
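To make that hash point concrete: flip a single bit and the cryptographic digest changes completely, which is why plain file-hash matching falls apart the moment an image host re-encodes anything.

```python
import hashlib

original = b"...pretend these are image bytes..."
# Flip a single bit in the first byte; the digest changes completely.
altered = bytes([original[0] ^ 0x01]) + original[1:]

print(hashlib.sha256(original).hexdigest())
print(hashlib.sha256(altered).hexdigest())
```

That's also why real systems in this space use perceptual hashes rather than cryptographic ones, and even those fall over on crops and heavier edits.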
Some sites have tried automated detection of CSAM images. YouTube, in an effort to protect children, continues to falsely flag 30-year-old women as children.
OP, I’m not saying you should give up, and maybe what you’re working on could be the beginning of something that truly helps in the field of CSAM detection. I’ve got only one question for you (which hopefully won’t be discouraging to you or others): what’s your false-positive (or false-negative) detection rate? Or, maybe a question you may not want to answer: how are you training this?
I'm not training it. I'm using publicly available CLIP models.
The false-positive rate is acceptable. But my method is open source, so feel free to validate on your end.
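For anyone wondering what "using a publicly available CLIP model" looks like in practice, here's a generic zero-shot classification sketch with Hugging Face transformers. The labels are benign placeholders for illustration; this is not my actual pipeline, prompts, or thresholds:

```python
# Generic CLIP zero-shot classification sketch (Hugging Face transformers).
# Benign example labels; not the project's actual prompts or thresholds.
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

image = Image.open("example.jpg")
labels = ["a photo of a cat", "a photo of a dog"]

inputs = processor(text=labels, images=image, return_tensors="pt", padding=True)
outputs = model(**inputs)

# Softmax over image-text similarity scores gives per-label probabilities.
probs = outputs.logits_per_image.softmax(dim=1)
for label, p in zip(labels, probs[0]):
    print(f"{label}: {p.item():.3f}")
```

Whatever probability threshold you pick on those scores is exactly where the false-positive/false-negative trade-off in the question above comes from.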
Acceptable isn't a percentage, but I take it that in your opinion it's acceptable. Thanks for making your content open source; I do wish your project the best of luck. I don't think I have what it takes to validate this myself, but if I end up hosting an instance I'll probably start using this tool myself. It's better than nothing, and at present I have zero instances but also zero mods lined up.
I think deleting images from the pict-rs storage can corrupt the pict-rs sled DB, so I would not advise it; you should go via the purge endpoint on the pict-rs API.
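For example, something like this (endpoint path and auth header are from memory, so double-check the pict-rs docs for your version):

```python
import requests

PICTRS_URL = "http://127.0.0.1:8080"   # your pict-rs instance
API_TOKEN = "your-api-key"             # the pict-rs server API key

# Ask pict-rs itself to purge the alias so its own records stay consistent.
resp = requests.post(
    f"{PICTRS_URL}/internal/purge",
    params={"alias": "someimagealias.webp"},
    headers={"X-Api-Token": API_TOKEN},
)
resp.raise_for_status()
print(resp.json())
```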
Nah. It will just not find those images to serve
Interesting. When I tried a while back it broke all images (not visible on the website due to service-worker caching, but visible if you put any pict-rs URL into Postman or something).
Well you can clearly see images still here ;)
True, you're correct. I'm just not sure how you did it without corrupting the sled db. Maybe I'm just unlucky
The sled DB is not touched. It's just that when pict-rs tries to download the file pointed to by the sled DB, it gets a 404.
Now if you can make this work with Mastodon, I'd be eternally grateful. 😁
It's software agnostic. So long as you're storing your images in object storage, it should work.
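The rough shape of the loop, sketched with boto3 against an S3-compatible bucket (endpoint, bucket name, and the check_image() stub are placeholders, not names from the actual project):

```python
import boto3

def check_image(data: bytes) -> bool:
    """Hypothetical stand-in for the CLIP-based classifier."""
    raise NotImplementedError

# endpoint_url and bucket name are placeholders for your object store.
s3 = boto3.client("s3", endpoint_url="https://objects.example.com")

# Walk every object in the bucket and run the classifier on each one.
paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket="pictrs-media"):
    for obj in page.get("Contents", []):
        body = s3.get_object(Bucket="pictrs-media", Key=obj["Key"])["Body"].read()
        if check_image(body):
            print("flagged:", obj["Key"])
```

Any S3-compatible store (or even a plain directory walk) slots into the same pattern.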
Can this be used with the Lemmy-easy-deploy method?
This shouldn't run on your Lemmy server (unless your Lemmy server has a GPU).
I can put one in...
I don't know your setup, but unless it's a very cheap GPU, it would be a bit of a waste to use it only for this purpose. But up to you