# Pixtral-12B-GGUF Modelfile (IQ2_M)
# ---------------------------------
#
# Tested with: Ollama v0.11.x through v0.12.9 (latest)
# Quantization: IQ2_M (quant created by bartowski)
# Quality: Surprisingly usable
# ----------------------------------------------------
#
# Vision Notes:
# Some users may need to set the context -or- "num_ctx"
# value to at least ~12K-19K.
# -----------------------------------------------------------
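#
# If you hit vision/context issues, a larger context can be baked into
# the model by uncommenting the line below. (16384 is only an example
# value in the ~12K-19K range noted above, not a tested recommendation.)
# PARAMETER num_ctx 16384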
#
# Created by:
# EnlistedGhost (aka Jon Zaretsky)
# --------------------------------
#
# Goal:
# To provide the FIRST actually functional and usable
# GGUF version of Mistral's Pixtral-12B for
# direct use with Ollama!
# Currently, there are NO OTHER USABLE OR WORKING GGUF
# versions of this model...
# ---------------------------------------------------
#
# Big/Giant/Huge Thank You:
# (ggml-org, bartowski, and the Ollama team)
# ggml-org: Working mmproj-pixtral vision projector!
# bartowski: Working I-Matrix quants that can be paired with the ggml-org vision projector!
# Ollama team: Because without them, this wouldn't be possible in the first place!
# ------------------------------------------------------------------------------------
#
# Import our GGUF quant files:
# (Assuming: Linux operating system)
# (Assuming: the downloaded files are stored in the "Downloads" directory/folder)
FROM ~/Downloads/mmproj-Pixtral-12b-f16.gguf
FROM ~/Downloads/Pixtral-12B-IQ2_M.gguf
# ------------------------------------------------------------------------
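#
# Example build/run commands (assumptions: this file is saved as
# "Modelfile" in the current directory, and "pixtral-12b" is just an
# example model name; pick your own):
#   ollama create pixtral-12b -f Modelfile
#   ollama run pixtral-12b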
#
# Set default system message/prompt:
SYSTEM """
#
# !!!-WARNING-!!!
# (Do not modify for the "recommended" configuration and behavior)
#
# !!!-OPTIONAL-!!!
# Pixtral-12B does NOT ship with a system prompt by default; however, you can
# choose to add one in this section of the Ollama Modelfile. Be aware that a
# system prompt can break the linking between Pixtral and its vision
# projector; BE CAREFUL!
"""
# -------------------------------------------------------------------
#
# Define the model's chat template (thank you to @rick-github for this mic-drop)
# Link to @rick-github's post: https://github.com/ollama/ollama/issues/6748#issuecomment-3368146231
TEMPLATE """[INST] {{ if .System }}{{ .System }} {{ end }}{{ .Prompt }} [/INST]"""
#
# Below are stop parameters (required for proper "assistant-->user" multi-turn chat)
PARAMETER stop [INST]
PARAMETER stop [/INST]
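#
# With the template above, a single turn is rendered roughly like this
# (illustrative sketch, not output copied from Ollama):
#   [INST] <system text, if any> <user prompt> [/INST]
# The stop strings prevent the model from emitting its own [INST]/[/INST]
# markers and continuing the conversation by itself.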
#
# Enjoy Pixtral-12B-GGUF for the ppl!
# Erm, or at least for Ollama users...
# <3 (^.^) <3
#
# Notice: Please read the "Instructions.md" on HuggingFace or the Ollama website
# for a how-to guide on using this Modelfile!