this post was submitted on 28 Mar 2025
31 points (100.0% liked)
I would scrape them into individual JSON files with more info than you think you need, just for the sake of simplicity. Once you have them all, you can work out an ideal storage solution, probably some kind of SQL DB. Once that is done, you could turn the JSON files into a .tar.zst archive, or just delete them if you are confident in the processed representation.
Source: I completed a similar but much larger story site archive and found this to be the easiest way.
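To make the "one JSON file per story" idea concrete, here's a minimal sketch. The field names in the example record are hypothetical placeholders, not anything from the thread:

```python
import json
from pathlib import Path

def save_record(out_dir: Path, story_id: str, record: dict) -> Path:
    """Write one scraped story as its own JSON file.

    Store more fields than you think you need; disk is cheap
    and re-scraping is not.
    """
    out_dir.mkdir(parents=True, exist_ok=True)
    path = out_dir / f"{story_id}.json"
    path.write_text(json.dumps(record, ensure_ascii=False, indent=2))
    return path

# Example usage with made-up data:
record = {
    "id": "42",
    "title": "Example Story",
    "author": "anon",
    "tags": ["fantasy"],
    "text": "Once upon a time...",
}
```

Once the whole directory is processed, something like `tar --zstd -cf stories.tar.zst stories/` produces the .tar.zst archive mentioned above.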
That's a good idea! Would YAML be alright for this too? I like the readability and Python-style syntax compared to JSON.
I would stay away from YAML (almost at all costs).
Curious to hear your reasoning as to why YAML is less desirable? I would have thought the opposite; your strong opinion surprised me.
If you'll allow it (and have a few shot glasses handy), I could take a stab at changing your mind. But first, list all your reservations concerning YAML.
Relevant packages I wrote that rely on YAML:
pytest-logging-strict
sphinx-external-toc-strict
What's your reasoning for that?
At this point, I think I'll only use yaml as the scraper output and then create a database tool to convert that into whatever data format I end up using.
https://ruudvanasseldonk.com/2023/01/11/the-yaml-document-from-hell
JSON is a much simpler (and consequently safer) format. It's also more universally supported.
YAML (or TOML) is decent for configuration that is read and written by hand. But for scraper output, where storage and follow-up workflows go through code parsing anyway, I would go for JSON.
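A tiny illustration of why JSON is attractive for machine-to-machine output: the round trip is lossless and unambiguous, whereas YAML parsers can reinterpret bare scalars like `no` or `1.10` (the kind of surprise the linked article catalogs). The record here is made-up example data:

```python
import json

record = {"country": "no", "version": "1.10", "chapters": 3}
text = json.dumps(record)

# JSON keeps strings as strings: "no" never becomes False,
# and "1.10" never becomes the float 1.1.
assert json.loads(text) == record
```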
That's an interesting read. I'll definitely give json a try too.
Very wise idea. And if you want to up your game, you can validate the YAML against a schema.
Check out strictyaml. The author is ahead of his time: he uses validated YAML to build stories and weave those into web sites.
Unfortunately the author also does the same with strictyaml's tests, which can get frustrating because the tests are too simple.
Gonna be honest, I'll need to research a bit more what validating against a schema is, but I get the general idea, and I like it.
For initial testing and prototypes, I probably won't worry about validation, but once I get to the point of refining the system, validation like that would be a good idea.
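To give a feel for what "validating against a schema" means: a library like strictyaml does this properly, but the core idea can be sketched with a stdlib-only check. The required fields below are hypothetical — adjust them to whatever your scraper actually emits:

```python
def validate_story(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record passes.

    A hand-rolled stand-in for a schema validator, just to show the idea.
    """
    expected = {"title": str, "author": str, "tags": list, "text": str}
    problems = []
    for name, kind in expected.items():
        if name not in record:
            problems.append(f"missing field: {name}")
        elif not isinstance(record[name], kind):
            problems.append(f"wrong type for {name}: expected {kind.__name__}")
    return problems
```

A real schema library adds nesting, coercion, and good error messages on top of this, which is why it's worth picking one up once the prototype stabilizes.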
I see no reason you can't use yaml.
Yaml and json are essentially identical for basic purposes.
Once the scraper has been confirmed working, are you going to be doing a lot of reading/editing of the raw data? If not, it might as well be a binary blob (though that's a bad idea, since it couples the raw data to your specific implementation).
I'm not entirely sure yet, but probably yes to both. The story text will likely stay unchanged, but I'll likely experiment with various ways to analyze the stories.
The main idea I want to try is assigning stories "likely tags" based on the frequency of keywords. So castle and sword could indicate fantasy while robot and ship could indicate sci-fi. There are a lot of stories missing tags, so something like this would be helpful.
Yup, I think it'd work fine, especially if you want the ability to easily inspect individual items.
Any of the popular python yaml libraries will be more than sufficient. With a bit of work, you can marshal the input (when reading files back) into python (data)classes, making it easy to work with.
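As a sketch of the marshalling step: assuming a plain dict as produced by a YAML loader (e.g. PyYAML's `yaml.safe_load`), moving it into a dataclass only needs the stdlib. The `Story` fields here are hypothetical:

```python
from dataclasses import dataclass, field, fields

@dataclass
class Story:
    # Hypothetical fields -- match them to whatever your scraper emits.
    title: str
    author: str
    tags: list = field(default_factory=list)

def from_mapping(data: dict) -> Story:
    """Marshal a parsed YAML/JSON mapping into a Story, ignoring unknown keys."""
    known = {f.name for f in fields(Story)}
    return Story(**{k: v for k, v in data.items() if k in known})

# data = yaml.safe_load(path.read_text())  # with PyYAML, for example
# story = from_mapping(data)
```

Dropping unknown keys keeps the loader tolerant of extra scraped fields, while the dataclass gives you attribute access and nicer reprs than raw dicts.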