this post was submitted on 19 Dec 2024

Technology


Microsoft-owned GitHub announced on Wednesday a free version of its popular Copilot code completion/AI pair programming tool, which will also now ship by default with Microsoft’s popular VS Code editor. Until now, most developers had to pay a monthly fee, starting at $10 per month, with only verified students, teachers, and open source maintainers getting free access.

GitHub also announced that it now has 150 million developers on its platform, up from 100 million in early 2023.

“My first project [at GitHub] in 2018 was free private repositories, which we launched very early in 2019,” GitHub CEO Thomas Dohmke told me in an exclusive interview ahead of Wednesday’s announcement. “Then we had kind of a v2 with free private organizations in 2020. We have free [GitHub] Actions entitlements. I think at my first Universe [conference] as CEO, we announced free Codespaces. And so it felt natural, at some point, to get to the point where we also have a completely free Copilot, not just one that is for students and open source maintainers.”

[–] theherk@lemmy.world 13 points 1 day ago (2 children)

Run Copilot’s proprietary model locally? You’re dreaming. But you can do this with Ollama, and they aren’t forcing you. There are many local models that work pretty well.
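For anyone curious, here’s a minimal sketch of querying a local Ollama server through its `/api/generate` REST endpoint. It assumes the default port (11434) and that you’ve already pulled a code model (e.g. `ollama pull codellama`) — swap in whatever model you actually run:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model: str, prompt: str) -> bytes:
    # Ollama's /api/generate takes a JSON body; stream=False
    # returns the whole completion in a single response.
    return json.dumps(
        {"model": model, "prompt": prompt, "stream": False}
    ).encode()

def complete(model: str, prompt: str) -> str:
    # POST the prompt to the local server and pull the generated
    # text out of the "response" field of the JSON reply.
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_payload(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Then something like `complete("codellama", "# a Python function that reverses a string\n")` gets you a completion, no subscription involved. Needs the Ollama server actually running, obviously.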

[–] residentmarchant@lemmy.world 5 points 1 day ago

I used Ollama locally and it worked decently well. Code suggestions were fast and relatively accurate (as far as an LLM goes). The real issue was the battery hit. Oh man, it HALVED my battery life, which is already short enough when running a server locally.

[–] muntedcrocodile@lemm.ee 7 points 1 day ago (1 children)

No, I mean I assume they are shipping a VS Code extension by default. I was wondering if said extension allows me to point it at said locally run model.

[–] eager_eagle@lemmy.world 3 points 1 day ago

They aren't. Copilot is not a built-in extension. Can't say much about future plans though.