this post was submitted on 31 Dec 2024
73 points (70.6% liked)

Firefox


A place to discuss the news and latest developments on the open-source browser Firefox

founded 5 years ago
[–] adarza@lemmy.ca 139 points 6 months ago (10 children)
  • no account or login required.
  • it's an addon (and one you have to go get), not baked-in.
  • limited to queries about content you're currently looking at.
    (it's not a general 'search' or queries engine)
  • llm is hosted by mozilla, not a third party.
  • session histories are not retained or shared, not even with mistral (it's their model).
  • user interactions are not used to train.
[–] jeena@piefed.jeena.net 26 points 6 months ago (7 children)

Thanks for the summary. So it still sends the data to a server, even if it's Mozilla's. Then I still can't use it for work, because the data is private and they wouldn't appreciate me sending it to Mozilla.

[–] KarnaSubarna@lemmy.ml 21 points 6 months ago (1 children)

In such a scenario you need to host your choice of LLM locally.

[–] ReversalHatchery@beehaw.org 5 points 6 months ago (1 children)

Does the addon support usage like that?

[–] KarnaSubarna@lemmy.ml 7 points 6 months ago (1 children)

No, but the "AI" option available on the Mozilla Labs tab in Settings allows you to integrate with a self-hosted LLM.

I have had this setup running for a while now.
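For anyone curious what that integration looks like: in current Firefox builds the AI chatbot sidebar reads its endpoint from about:config. The pref names below match recent releases but may change between versions, and `http://localhost:3000` assumes an Open WebUI instance on its default port (not necessarily the commenter's exact values):

```
browser.ml.chat.enabled  = true
browser.ml.chat.provider = http://localhost:3000
```

With these set, the sidebar loads the local web UI instead of a hosted provider.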

[–] cmgvd3lw@discuss.tchncs.de 4 points 6 months ago (1 children)

Which model are you running? How much RAM?

[–] KarnaSubarna@lemmy.ml 3 points 6 months ago* (last edited 6 months ago)

My (Docker-based) configuration:

Software stack: Linux > Docker Container > Nvidia Runtime > Open WebUI > Ollama > Llama 3.1

Hardware: i5-13600K, Nvidia 3070 ti (8GB), 32 GB RAM

Docker: https://docs.docker.com/engine/install/

Nvidia Runtime for docker: https://docs.nvidia.com/datacenter/cloud-native/container-toolkit/latest/install-guide.html

Open WebUI: https://docs.openwebui.com/

Ollama: https://hub.docker.com/r/ollama/ollama
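The stack above can be sketched as a single docker-compose.yml. This is a minimal sketch assembled from the linked docs, not the commenter's exact config; the `deploy` GPU block assumes the NVIDIA Container Toolkit from the link above is installed:

```yaml
# Minimal sketch: Open WebUI in front of Ollama, GPU passed to Ollama.
services:
  ollama:
    image: ollama/ollama
    volumes:
      - ollama:/root/.ollama        # model storage
    ports:
      - "11434:11434"               # Ollama API
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia        # requires NVIDIA Container Toolkit
              count: all
              capabilities: [gpu]
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434
    ports:
      - "3000:8080"                 # web UI on localhost:3000
    volumes:
      - open-webui:/app/backend/data
    depends_on:
      - ollama
volumes:
  ollama:
  open-webui:
```

After `docker compose up -d`, a model can be fetched with `docker compose exec ollama ollama pull llama3.1` and the UI reached at http://localhost:3000.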
