I’m dead 💀
dartos
Generally, training an LLM is a bad way to provide it with information. “In-context learning” is probably what you’re looking for. Basically just pasting relevant info and documents into your prompt.
You might try fine-tuning an existing model on a large dataset of legalese, but then it’ll be more likely to generate responses that sound like legalese, which defeats the purpose.
TL;DR: Use in-context learning to provide information to an LLM. Use training and fine-tuning to change how the language the LLM generates sounds.
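The in-context approach above can be sketched in a few lines: no model is trained, the reference material is simply stitched into the prompt string. This is a minimal illustration, not any particular library's API; the function name, prompt wording, and sample lease clauses are all made up for the example, and the resulting string would be sent as the user message to whatever chat-completion API you use.

```python
# Minimal sketch of in-context learning: instead of training the model,
# the relevant documents are pasted directly into the prompt it receives.

def build_prompt(question: str, documents: list[str]) -> str:
    """Assemble a prompt that carries its own reference material."""
    # Number each document so the model (and the reader) can tell them apart.
    context = "\n\n".join(
        f"Document {i + 1}:\n{doc}" for i, doc in enumerate(documents)
    )
    return (
        "Answer the question using only the context below.\n\n"
        f"{context}\n\n"
        f"Question: {question}\nAnswer:"
    )

# Hypothetical legalese snippets standing in for real documents.
docs = [
    "Section 12(b): The tenant must give 30 days written notice.",
    "Section 14(a): The deposit is returned within 21 days of move-out.",
]
prompt = build_prompt("How much notice must the tenant give?", docs)
print(prompt)
```

Because the model only ever sees the prompt, swapping in different documents changes what it “knows” without touching its weights.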
It def adds some flavor to the social media political scene
Probably money. Given enough money, I’m sure TikTok will ban any search term.
Lemmy has some very aggressive communists.
I’ve been lucky enough to dodge the crazy right wingers though.
That’s the open source life though :/
Almost nobody gets rich from open source. You’re explicitly granting rights that people usually pay for.
It’s noble, but it sucks.
That took time though.
SSH only started getting major industry support after Heartbleed, and it had been the go-to secure shell for at least a decade before that.
What “things in HTML, CSS, and JS” does Firefox not support that prevents you from using it?
WebGPU has been the biggest one for me, but most sites don’t even use it.
I think all religions are just fake copycats of the one true god.
Praise be Flying Spaghetti Monster
A programming language itself isn’t a marketable skill!
Learn the underlying concepts of programming and how computers work, and you’ll be able to move from one language/framework to pretty much any other easily.
Gotta capture every vote you can. Even from right leaning gay people.
Exactly