this post was submitted on 28 Feb 2024
91 points (76.6% liked)

This might also be an automatic response to prevent discussion, although I'm not sure, since it's MS's AI.

[–] Hotzilla@sopuli.xyz 4 points 8 months ago (1 children)

Copilot runs on GPT-4 Turbo. It isn't trained differently from OpenAI's GPT-4 Turbo, but it has different system prompts than OpenAI's, which tend to make it quicker to just quit a discussion. I have never seen OpenAI's model say it will stop the conversation, but Copilot does it daily.

[–] LWD@lemm.ee 1 points 8 months ago* (last edited 8 months ago) (1 children)

So by "different system prompts", you mean Microsoft injects something more akin to their own modifiers into the prompt before passing it over to OpenAI?

(The same way somebody might modify their own prompt, "explain metaphysics" with their own modifiers like "in the tone of a redneck"?)

I assumed OpenAI could slot in extra training data as a whole extra component, but that also makes sense to me... and would probably require less effort.
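The prompt "modification" being described can be sketched as simple message composition: the host prepends its own system prompt to the user's message before the request reaches the model. A minimal sketch (the system-prompt text here is invented for illustration, not Microsoft's actual prompt):

```python
# Sketch: a host "modifies" a prompt by prepending a system message
# (its own instructions) to the user's message. The system-prompt text
# below is hypothetical.

def build_messages(user_prompt: str, system_prompt: str) -> list[dict]:
    """Compose the message list a chat-completion API expects."""
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": user_prompt},
    ]

messages = build_messages(
    user_prompt="explain metaphysics",
    system_prompt="Answer in the tone of a redneck.",
)
```

The model sees both messages, but the end user only ever wrote the second one, which is why the same underlying model can behave very differently depending on who is hosting it.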

[–] Hotzilla@sopuli.xyz 3 points 8 months ago

Yeah, pretty much like that. Both Azure and the paid OpenAI API let you modify the system prompt. There is also a creativity (temperature) parameter that can be adjusted: set it too high and the model hallucinates more; too low and it gives the same output every time.

Retraining the model costs something like a hundred million dollars and takes weeks of computing power.
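Both the system prompt and the temperature are ordinary fields in a chat-completions request, which is why changing them is cheap compared to retraining. A sketch of the request body as sent to the OpenAI (or Azure OpenAI) HTTP API; the model name and prompt text are placeholders:

```python
import json

# Sketch of a chat-completions request body. "model" and the prompt
# text are placeholders; "temperature" is the creativity knob
# described above (0.0 -> near-deterministic output; higher values ->
# more varied and, past a point, more hallucination-prone output).
request_body = {
    "model": "gpt-4-turbo",  # placeholder model name
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "explain metaphysics"},
    ],
    "temperature": 0.7,
}

payload = json.dumps(request_body)  # serialized for the HTTP request
```

No weights change here: swapping the system prompt or nudging the temperature is just editing this payload, which is what makes per-host behavior differences (like Copilot's eagerness to end conversations) so easy to introduce.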