this post was submitted on 31 Jan 2025
343 points (94.3% liked)

Article: https://proton.me/blog/deepseek

Calls it "Deepsneak", failing to make clear that the reason people love DeepSeek is that you can download it and run it securely on any of your own private devices or servers, unlike most of the competing SOTA AIs.

I can't speak for Proton, but the last couple of weeks have shown some very clear biases coming out.

(page 2) 50 comments
[–] thingsiplay@beehaw.org 19 points 2 days ago (2 children)

How is this open source? The official repository https://github.com/deepseek-ai/DeepSeek-R1 contains only images, a PDF file, and links to download the model. I don't see any code. What exactly is open source here? And if so, where can I get the source code?

[–] v_krishna@lemmy.ml 22 points 2 days ago (4 children)

In deep learning, "open source" generally doesn't include the actual training or inference code. Rather, it means they publish the model weights and parameters (necessary to run it locally on your own hardware) and academic papers explaining how the model was trained. I'm sure Stallman disagrees, but from the standpoint of deep learning research, DeepSeek definitely qualifies as an "open source model".
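To put "publishing the weights" in concrete terms, here is a rough back-of-the-envelope sketch of what the published artifact amounts to. The ~671B total-parameter count and FP8 (1 byte per parameter) weight format are assumptions drawn from DeepSeek's own reporting, not from this thread:

```python
# Rough back-of-the-envelope: what "publishing the weights" means in bytes.
# Assumption (not from this thread): DeepSeek-R1 is a ~671B total-parameter
# MoE model released with FP8 weights, i.e. roughly 1 byte per parameter.

def weight_footprint_gb(n_params: float, bytes_per_param: float) -> float:
    """Approximate on-disk size of the published weights, in gigabytes."""
    return n_params * bytes_per_param / 1e9

full_r1 = weight_footprint_gb(671e9, 1.0)  # FP8: ~1 byte/param
print(f"DeepSeek-R1 (assumed 671B @ FP8): ~{full_r1:.0f} GB")
```

So "you can run it on your own hardware" applies to the full model only if you have serious hardware; most people run the much smaller distilled variants instead.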

[–] AustralianSimon@lemmy.world 23 points 2 days ago

To be fair it's correct, but it's poor writing to skip the self-hosted component. These articles target the company, not the model.

[–] davel@lemmy.ml 21 points 2 days ago (3 children)
[–] JOMusic@lemmy.ml 7 points 2 days ago

Well you just made me choke on my laughter. Well done, well done.

[–] wuphysics87@lemmy.ml 12 points 2 days ago (1 children)

There are many LLMs you can use offline

[–] davel@lemmy.ml 14 points 2 days ago (1 children)
[–] spooky2092@lemmy.blahaj.zone 7 points 2 days ago

DeepSeek works reasonably well, even CPU-only in Ollama. I ran the 7B and 1.5B models and it wasn't awful. The 7B slowed down as the convo went on, but the 1.5B model felt pretty passable while I was playing with it
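A quick sketch of why those small distills are CPU-friendly: the RAM needed just to hold the weights at 4-bit quantization (about 0.5 bytes per parameter). The 7B/1.5B parameter counts come from the tags above; treating 4-bit as the default quantization for these Ollama tags is an assumption on my part:

```python
# Approximate RAM for the model weights alone at 4-bit quantization.
# Assumption (not stated in the thread): these Ollama tags ship 4-bit quants.
# This figure excludes the KV cache, which grows with conversation length.

def quantized_ram_gb(n_params: float, bits_per_param: float = 4.0) -> float:
    """Approximate memory for quantized weights alone, in gigabytes."""
    return n_params * (bits_per_param / 8) / 1e9

for name, params in [("7b", 7e9), ("1.5b", 1.5e9)]:
    print(f"deepseek-r1:{name} @ 4-bit: ~{quantized_ram_gb(params):.2f} GB")
```

The weights-only number also hints at why the 7B slowed over a long conversation: the KV cache grows with context length on top of this baseline, so memory pressure (and per-token compute) keeps climbing.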
