I can't speak specifically to the infosec bots, but I suspect it has something to do with all of the Lemmy instances mirroring every post. It could add a lot of SEO weight for various websites. So if they can get a post that doesn't get deleted, that's SEO fodder.
Seems like Lemmy should add a rel=canonical link when browsing federated communities - this would "solve" this issue (and would be the correct thing to do anyway).
I believe Lemmy instances disallow crawling by default, so SEO is probably not why. Would be nice to find Lemmy results in Google if they can sort out the canonical URL problem. Reddit was a great resource for random questions, and if people move here it should still be easy to find.
Nope, it's allowed.
The default robots.txt disallows access to a few paths but not /post or /comment.
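To illustrate the shape being described: a robots.txt that blocks a few app paths while leaving posts and comments crawlable might look like this (only /create_community is confirmed in this thread; the rest is a sketch):

```
User-Agent: *
Disallow: /create_community
# /post and /comment are deliberately absent,
# so crawlers are allowed to index post and comment pages.
```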
There are lots of crawler bots hitting my instance (ByteSpider being the most aggressive). I just have a list of User-Agent regexes I use to block them via Nginx. Some, like Semrush, have IP ranges I can block completely at the firewall (in addition to the UA filters).
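A minimal sketch of that kind of Nginx User-Agent filter (the bot names are the ones mentioned above; the regex list and 403 response are illustrative choices, not the poster's exact config):

```nginx
# In the http {} context: map the User-Agent to a flag.
map $http_user_agent $blocked_bot {
    default        0;
    ~*bytespider   1;
    ~*semrushbot   1;
}

server {
    # ... existing listen / server_name directives ...

    # Refuse requests from matched crawlers.
    if ($blocked_bot) {
        return 403;
    }
}
```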
What makes you say that? robots.txt just disallows things like /create_community, and there are no robots, googlebot, etc. meta tags in the source that I can see, and no nofollow apart from on a few things like feeds.
Also, I'm sure I've seen Lemmy appearing in search results already.
Do you mean rel="nofollow"?
No, I was referring to the bit about having lots of copies of the same content on each different instance. If example.com/c/comm@* pages had a meta tag giving the origin community as the rel=canonical link target, then only the origin copy would appear in search engine results.
rel=nofollow is a good idea too, but less interesting to this semantic html nerd.
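Concretely, the idea is that a mirrored post page would carry a canonical link pointing back at the origin instance's copy, something like this (URL and post ID are hypothetical):

```html
<!-- In the <head> of the mirrored page on example.com:
     tell search engines the origin instance's URL is canonical. -->
<link rel="canonical" href="https://infosec.pub/post/12345" />
```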
Also, one can create a personal instance of Lemmy without users, create a bot to subscribe to many communities, and end up with a whole database to simply build personalized recommenders targeted at every single user.
Don't know if anyone is doing it now, but it should be pretty easy: one has everything, subscriptions, upvotes, and all comments, all nicely served in a convenient relational database format.
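As a sketch of what that harvesting could look like, a query over a federated instance's database might pull one user's upvotes as recommender input (table and column names here are hypothetical, not Lemmy's actual schema):

```sql
-- Hypothetical schema: posts the user upvoted, as recommender features.
SELECT p.id, p.title, p.community_id
FROM post_vote v
JOIN post p ON p.id = v.post_id
WHERE v.person_id = 42
  AND v.score = 1;  -- 1 = upvote in this assumed encoding
```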
The SEO-angle is interesting, thank you for the insight!