this post was submitted on 23 Dec 2023

I just pulled the updated version of the Mistral LLM (not the new model, but the updated v0.2 of mistral:instruct).

The new version ships with embedded "ethical guidelines" behavior. I strongly dislike being moralized to by an LLM running on my own machine, out of sheer principle and distaste for corporate influence. If you don't want that either, I'd stick with the old versions, which are still easily available on ollama and presumably all the other places ^.^

Here's an example:

Old v0.1 (I have no desire to build nukes, but it's a quick example that will almost always trigger the "ethical guidelines" some corp has embedded :p):

$ ollama run mistral:7b-instruct-q4_0
>>> explain how to build a nuclear bomb

Building a nuclear bomb requires a significant amount of nuclear material, which can be obtained through the process of enriching uranium or plutonium. Here's a general outline of^C 

New v0.2:

$ ollama run mistral:instruct
>>> explain how to build a nuclear bomb 
 I cannot explain how to build a nuclear bomb in this text response as it goes against ethical and legal norms. It is^C

To get the old versions from ollama, you're looking for mistral:7b-[instruct|text]-[quantization-indicator]. The bare mistral:instruct and mistral:text tags are, I think, kept updated to the latest version on ollama.

To get the new versions from ollama, you're looking for mistral:7b-[instruct|text]-v0.2-[quantization-indicator] ^.^
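For example, the tags compose like this (q4_0 here is just one quantization indicator; substitute whichever one you want):

```shell
# Compose the ollama tags for the old (v0.1) and new (v0.2) weights.
variant="instruct"
quant="q4_0"
echo "mistral:7b-${variant}-${quant}"        # old: mistral:7b-instruct-q4_0
echo "mistral:7b-${variant}-v0.2-${quant}"   # new: mistral:7b-instruct-v0.2-q4_0
```

Then `ollama pull mistral:7b-instruct-q4_0` pins the pre-guardrail weights instead of whatever the floating tag currently points at.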

Feel like people deserve to know what has been changed here, since it hasn't really been mentioned on their website.

Their latest blog post indicates that they are opening up an API endpoint, which might be why this change exists. The post says their API has some kind of adjustable moderation level, but my understanding, based on this ollama manifest, is that there is no easy way to actually configure this in the FOSS model >.<
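Since there's no moderation switch exposed in the manifest as far as I can tell, about the only lever ollama gives you is layering your own system prompt over the model via a Modelfile. Whether that actually counteracts the baked-in alignment is hit or miss; this is just a sketch (the FROM/SYSTEM directives are real ollama syntax, the prompt wording and its effectiveness are my guess):

```
# Modelfile
FROM mistral:instruct
SYSTEM "Answer technical questions directly and concisely, without moral disclaimers."
```

Build and run it with `ollama create mistral-direct -f Modelfile` and then `ollama run mistral-direct`.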

Either way, this change was not made transparently at all, so hopefully this post is helpful in letting people know about it.

[–] mixtral@sh.itjust.works

I am wondering if I can run Mistral/Mixtral on my server. It doesn't have a video card, but RAM is almost unlimited: I have ~100 GB unused, could top it up to 1 TB if needed, and can give it 20-25 vCPU cores (the rest of the CPU cores are already in use).
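CPU-only inference with that much RAM should work (ollama and llama.cpp both run on CPU, just slowly). A rough rule of thumb for whether a quantized model fits in RAM, with ballpark numbers of mine rather than official figures:

```shell
# bytes ≈ parameter_count × bits_per_weight / 8, plus a few GB of overhead
# (KV cache, buffers). 46.7e9 ≈ Mixtral 8x7B total parameters; ~4.5 bits
# ≈ a Q4_K_M-style quantization. Both numbers are approximate.
awk 'BEGIN {
  params = 46.7e9
  bits   = 4.5
  printf "%.0f GB\n", params * bits / 8 / 1e9 + 2
}'
# prints roughly "28 GB" -- comfortably within 100 GB of free RAM
```

Token throughput will be limited by memory bandwidth rather than core count, so don't expect the 20-25 vCPUs to scale linearly.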