Either the problems with lemmy.world's API responses are breaking the site, or a broken lemmy.world is causing the problematic API responses.
Currently, you can ask lemmy.world for page one billion of its community list and it will still return a non-empty response (listing the communities it thinks belong on that page), when it should return an empty one. For something like lemmyverse.net, this means its crawler can never reach the end of a scan, and some apps may be trying to load the list endlessly.
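To make the failure mode concrete, here is a minimal sketch of the kind of pagination loop a crawler like lemmyverse.net's might run (the `/api/v3/community/list` endpoint and its `page`/`limit` parameters are real Lemmy API; the stopping logic is my assumption about how such crawlers typically terminate). A crawler that stops only on an empty page never terminates against a server that answers every page number with content:

```python
import requests

BASE = "https://lemmy.world"

def crawl_communities():
    """Sketch of a naive paginated crawl over Lemmy's community list."""
    page = 1
    seen = []
    while True:
        resp = requests.get(
            f"{BASE}/api/v3/community/list",
            params={"page": page, "limit": 50},
            timeout=30,
        )
        communities = resp.json().get("communities", [])
        # Expected termination condition: an empty page means the data
        # ran out. On a server that returns content for *every* page
        # number, this condition is never met and the loop runs forever.
        if not communities:
            break
        seen.extend(communities)
        page += 1
    return seen
```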
References:
https://github.com/tgxn/lemmy-explorer/issues/139
https://lemmy.world/post/2651283
This is the same reason I had to turn off my search engine's crawler.
Changes were made to the API to ignore any page > 99, so if you ask for page 100 or page 1_000_000_000 you get the first page again. This meant my crawler would never finish fetching "new" posts.
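One defensive fix on the crawler side (a sketch of my own, not something either server implements) is to detect the wrap-around by remembering which IDs have already been seen and stopping as soon as a page contains nothing new. The `/api/v3/post/list` endpoint and the `posts`/`post.id` response shape are from Lemmy's v3 API:

```python
import requests

BASE = "https://lemmy.world"

def crawl_posts_defensively():
    """Sketch: stop when a page yields no unseen posts, which also
    catches servers that wrap high page numbers back to page 1."""
    page = 1
    seen_ids = set()
    posts = []
    while True:
        resp = requests.get(
            f"{BASE}/api/v3/post/list",
            params={"page": page, "limit": 50},
            timeout=30,
        )
        batch = resp.json().get("posts", [])
        new = [p for p in batch if p["post"]["id"] not in seen_ids]
        # If the server wrapped us back to page 1 (or the data truly
        # ran out), every post on this page was already stored: stop.
        if not new:
            break
        seen_ids.update(p["post"]["id"] for p in new)
        posts.extend(new)
        page += 1
    return posts
```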
lemm.ee, on the other hand, made a similar change, but anything over page 99 returns an empty response. lemm.ee also flat out ignores `sort=Old`, always returning an empty array. Both servers did it for, I assume, the same reason: using a high page number significantly increases the response time. Before they blocked pages over 99, those responses could take 8-10 seconds or more, while asking for a low page number would return in 300ms or less. Because the existing queries are a lot harder to optimize, and maybe can't be, the problematic APIs were simply disabled for now.
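The likely cause of that slowdown (an assumption on my part: Lemmy runs on PostgreSQL, and page-number APIs typically translate to LIMIT/OFFSET queries, where a large OFFSET forces the database to produce and discard every earlier row) can be observed with a quick measurement sketch of my own, comparing a shallow page against page 99, the deepest page these servers still serve:

```python
import time
import requests

BASE = "https://lemmy.world"

def time_page(page: int) -> float:
    """Measure how long one page of the post list takes to fetch."""
    start = time.monotonic()
    requests.get(
        f"{BASE}/api/v3/post/list",
        params={"page": page, "limit": 50},
        timeout=60,
    )
    return time.monotonic() - start

# Low page numbers come back quickly; before the >99 block, deep pages
# could take many seconds, consistent with OFFSET-based pagination.
print(f"page 1:  {time_page(1):.3f}s")
print(f"page 99: {time_page(99):.3f}s")
```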