How to use from Docker Model Runner

```shell
docker model run hf.co/maxcurrent/Peagle-9b
```

Peagle-9b

Hey there! 👋 Welcome to Peagle-9b! This is a merge of multiple models brought together using the awesome VortexMerge kit.

Let's see what we've got in this merge:

🧩 Configuration

```yaml
slices:
  - sources:
      - model: mlabonne/NeuralBeagle14-7B
        layer_range: [0, 20]
  - sources:
      - model: eldogbbhed/NeuralPearlBeagle
        layer_range: [12, 32]
merge_method: passthrough
```
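The passthrough merge above simply stacks layer slices from the two parent models. A minimal sketch of the resulting layer arithmetic, assuming mergekit-style half-open `layer_range` semantics (layers `a` up to but not including `b`); the per-layer parameter figures are rough estimates for a Mistral-7B-shaped model, not numbers from this card:

```python
# Hedged sketch: how a passthrough merge stacks decoder layers.
# Assumes layer_range: [a, b] covers layers a..b-1 (mergekit-style).
# Model names come from the card's config; parameter figures are
# illustrative approximations for a Mistral-7B-class architecture.

def merged_layer_count(slices):
    """Total decoder layers in the passthrough-merged model."""
    return sum(end - start for _, (start, end) in slices)

slices = [
    ("mlabonne/NeuralBeagle14-7B", (0, 20)),    # layers 0-19
    ("eldogbbhed/NeuralPearlBeagle", (12, 32)),  # layers 12-31
]

layers = merged_layer_count(slices)
print(layers)  # 40 layers, vs. 32 in each 7B parent

# Rough size estimate: ~0.218B params per decoder layer plus
# ~0.26B for embeddings (assumed figures, for illustration only).
approx_params_b = layers * 0.218 + 0.26
print(f"~{approx_params_b:.1f}B params")  # in line with the card's 9B
```

Note the overlap: layers 12-19 appear twice (once from each parent), which is how a passthrough merge of two 32-layer models grows to 40 layers.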
Model size: 9B params · Tensor type: F16 · Format: Safetensors · Downloads last month: 13,413
