It's definitely not indexed. We use RAG architectures to add indexing to data stores we want the model to have direct access to; the relevant information is injected directly into the context (prompt). This can loosely be equated to short-term memory.
The rest of the information is approximated in the weights of the neural network, which gives the model general knowledge and intuition, akin to long-term memory.
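To make the short-term-memory point concrete, here's a minimal sketch of the RAG retrieve-then-inject loop. It's hypothetical and uses a toy word-count "embedding" and cosine similarity; real systems use learned embedding models and a vector database, but the shape of the pipeline is the same: score the indexed documents against the query, then paste the winners into the prompt.

```python
# Toy RAG sketch: retrieve relevant text, inject it into the context (prompt).
from collections import Counter
import math

def embed(text):
    # Stand-in for a learned embedding: a bag-of-words count vector.
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, documents, k=1):
    # "Index" step: score every stored document against the query.
    q = embed(query)
    ranked = sorted(documents, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

def build_prompt(query, documents):
    # Injection step: retrieved text becomes part of the context window,
    # acting like short-term memory for the model.
    context = "\n".join(retrieve(query, documents))
    return f"Context:\n{context}\n\nQuestion: {query}"

docs = [
    "The warehouse inventory system runs on PostgreSQL 14.",
    "Employee onboarding takes two weeks and includes security training.",
]
prompt = build_prompt("What database does the inventory system use?", docs)
print(prompt)
```

The model never "knows" the document; it just sees the retrieved snippet in its prompt for this one request, which is why the short-term-memory analogy fits.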
You seem a bit troll-y too, but I'll bite.
The only one suggesting we add women because they are women is the edgy beb. The change I want to see is edgies being made to feel small. I have achieved that, judging by their anemic comeback :)
The person starting this thread wants to see the world you're suggesting I create, and in doing so has opened up a whole new class of YouTuber for our dear OP to explore. The analogy about fish, fishing, and teachers comes to mind.
Ooh edgy, say more
This is an extremely odd outlook to have. Good luck with it.
Unfortunately, the answer to your question is to not post at all; though if your contributions are worthwhile, that isn't a great solution.
This sounds like a fun project: "samples of chemistry".
There is a huge difference though: one is making hardware, and the other is copying books into your training pipeline.
The copy occurs in the dataset preparation.
Privacy-preserving federated learning is a thing: essentially you train a local model and send the weight updates back to Google rather than the data itself. But it's also early days, so who knows what vulnerabilities may exist.
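A rough sketch of what that looks like, in the style of federated averaging (FedAvg): each client runs a few gradient steps on its own data and ships back only a weight delta; the server averages the deltas and never sees the raw data. Everything here (the linear model, the function names, the learning rate) is illustrative, not Google's actual system.

```python
# Hypothetical FedAvg sketch: clients send weight updates, never raw data.
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    # One client's local training: plain linear-regression gradient steps.
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w - weights  # only the delta leaves the device

def fed_avg(weights, client_data):
    # Server side: aggregate the deltas; X and y stay on the clients.
    deltas = [local_update(weights, X, y) for X, y in client_data]
    return weights + np.mean(deltas, axis=0)

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = [
    (X, X @ true_w)
    for X in (rng.normal(size=(50, 2)) for _ in range(3))
]

w = np.zeros(2)
for _ in range(30):
    w = fed_avg(w, clients)
```

After enough rounds `w` converges toward `true_w`, even though the server only ever handled averaged deltas. The privacy caveat in the comment above is real, though: weight updates can still leak information about the training data, which is why techniques like secure aggregation and differential privacy get layered on top.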
You need to rebase instead. Merge just creates useless merge commits and makes the diffs harder to comprehend (all changes are shown at once, whereas with rebase you fix conflicts in the commit where they happened).
Then, instead of your branch-of-branch strategy, you just rebase onto main daily and you're golden when it comes time to open a PR.
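The workflow above looks something like this in a throwaway repo (assumes git >= 2.28 for `init -b`; file names and messages are made up):

```shell
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q -b main
git config user.email demo@example.com
git config user.name demo

echo base > app.txt
git add app.txt && git commit -qm "initial"

# Start feature work on its own branch.
git checkout -q -b feature
echo feature >> app.txt
git add app.txt && git commit -qm "feature work"

# Meanwhile main moves on underneath you.
git checkout -q main
echo hotfix > fix.txt
git add fix.txt && git commit -qm "hotfix"

# Daily rebase: replay the feature commits on top of the new main tip.
git checkout -q feature
git rebase -q main

git log --oneline  # linear history, no merge commits
```

Any conflicts surface while replaying the specific commit that caused them, so you resolve each one in context instead of untangling everything in a single merge commit.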
For me the Infinity subscription bypass stopped working, so I finally made the switch.