all of the above + more???
AK1174
You could use LocalAI or Ollama, but neither is going to work with 300 MB of RAM, and they need a fair amount of compute for response speed to be usable. These models are also not very capable compared to OpenAI's GPTs, but that depends on what your goal is with the models.
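For a sense of what trying Ollama looks like in practice, here's a rough sketch (assumes Ollama is already installed; the model tag is just an example, and even a small ~1B-parameter model wants far more than 300 MB of RAM):

```shell
# pull a small model (example tag; still needs roughly a GB+ of RAM)
ollama pull llama3.2:1b

# run a one-off prompt against it
ollama run llama3.2:1b "Summarize what a reverse proxy does."
```

If it's slow or OOMs on your hardware, that's the compute/RAM wall mentioned above.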
Well, a web server is a pen-testable thing, and also a very commonly pen-tested thing, so the background knowledge is useful.