# EEG-DINO (safetensors, Burn-compatible)

Converted weights for the EEG-DINO foundation model, packaged as safetensors for use with the `eegdino` Rust inference crate.

Derived from: `eegdino/EEG-DINO`

## Files

| File | Model | Params | d_model | Heads | Layers | Size |
|---|---|---|---|---|---|---|
| `eeg_dino_small.safetensors` | Small | 4.6 M | 200 | 8 | 12 | 17 MB |
| `eeg_dino_medium.safetensors` | Medium | 33 M | 512 | 16 | 16 | 129 MB |
| `eeg_dino_large.safetensors` | Large | 201 M | 1 024 | 16 | 24 | 770 MB |

## What changed from the original weights

The original PyTorch `.pt` checkpoints from `eegdino/EEG-DINO` were converted with the following transformations:

1. **Filtered:** only student encoder weights are kept; teacher, projector, and loss tensors are discarded.
2. **Renamed:** the `module.student.` prefix is stripped, and `Sequential` indices are mapped to descriptive names (e.g. `proj_in.0` → `proj_in.conv1`).
3. **Transposed:** linear weight matrices are converted from PyTorch's `[out, in]` layout to Burn's `[in, out]` layout.
4. **Format:** saved as float32 safetensors (no bf16).

The conversion script is at `scripts/convert_weights.py`.
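The core of such a conversion can be sketched as follows. This is a simplified illustration using NumPy arrays in place of a real PyTorch checkpoint; the tensor names in the toy example are hypothetical, and the `Sequential`-index renaming step is omitted for brevity:

```python
import numpy as np

def convert_checkpoint(state_dict):
    """Keep student tensors, strip the prefix, transpose 2-D linear weights, cast to f32."""
    out = {}
    for name, tensor in state_dict.items():
        # 1. Filter: drop teacher, projector, and loss tensors
        if not name.startswith("module.student."):
            continue
        # 2. Rename: strip the DDP/student prefix
        new_name = name[len("module.student."):]
        # 3. Transpose: PyTorch stores linear weights as [out, in], Burn as [in, out]
        if new_name.endswith(".weight") and tensor.ndim == 2:
            tensor = tensor.T
        # 4. Format: store as float32
        out[new_name] = np.ascontiguousarray(tensor, dtype=np.float32)
    return out

# Toy "checkpoint" with one student layer and one teacher layer
ckpt = {
    "module.student.encoder.linear.weight": np.zeros((512, 200), dtype=np.float16),
    "module.student.encoder.linear.bias": np.zeros(512, dtype=np.float32),
    "module.teacher.encoder.linear.weight": np.zeros((512, 200), dtype=np.float16),
}
converted = convert_checkpoint(ckpt)
print(sorted(converted))  # ['encoder.linear.bias', 'encoder.linear.weight']
print(converted["encoder.linear.weight"].shape)  # (200, 512)
```

The real script additionally maps `Sequential` indices to descriptive names and writes the result with the safetensors serializer.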

## Usage with Rust

```toml
# Cargo.toml
[dependencies]
eegdino = "0.1"
burn = { version = "0.20", features = ["ndarray"] }
```

```rust
use eegdino::prelude::*;
use burn::backend::NdArray;

type B = NdArray;

let encoder = EegDinoEncoder::<B>::builder()
    .weights("eeg_dino_small.safetensors")
    .device(Default::default())
    .build()
    .unwrap();

// 19 channels, 10 seconds @ 200 Hz
let signal = vec![0.0f32; 19 * 2000];
let result = encoder.encode_raw(&signal, 1, 19, 2000).unwrap();
// result.shape == [1, 191, 200]
```
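The 191 tokens in the output shape are consistent with a per-channel patch tokenization: 19 channels times 10 non-overlapping 200-sample (1 s) patches, plus one class token. This is an inference from the numbers above, not documented behavior; the arithmetic:

```python
# Assumed tokenization behind the Small model's output shape [1, 191, 200]
channels = 19
samples = 2000     # 10 s @ 200 Hz
patch_len = 200    # assumption: 1 s non-overlapping patches per channel
cls_tokens = 1     # assumption: a single class token
tokens = channels * (samples // patch_len) + cls_tokens
print(tokens)  # 191
```

The final dimension, 200, matches the Small model's d_model from the table above.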

## Download

```shell
# All weights
hf download eugenehp/eegdino --local-dir weights

# Single model
hf download eugenehp/eegdino eeg_dino_small.safetensors --local-dir weights
```

## Numerical parity

These converted weights produce outputs with NRMSE < 1e-6 compared to the original PyTorch model on identical inputs:

| Model | Max abs error | NRMSE |
|---|---|---|
| Small | 8.5e-7 | 5.5e-7 |
| Medium | 2.1e-6 | 8.8e-7 |
| Large | 4.8e-6 | 5.9e-7 |
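NRMSE is the root-mean-square error normalized by a scale of the reference signal; a minimal sketch of such a comparison, assuming RMS normalization (the parity script may normalize by range instead):

```python
import numpy as np

def nrmse(reference, candidate):
    """RMS error normalized by the reference's own RMS."""
    ref = np.asarray(reference, dtype=np.float64)
    cand = np.asarray(candidate, dtype=np.float64)
    return float(np.sqrt(np.mean((ref - cand) ** 2) / np.mean(ref ** 2)))

ref = np.linspace(-1.0, 1.0, 10_000)
print(nrmse(ref, ref))                 # 0.0
print(nrmse(ref, ref + 1e-7) < 1e-6)   # True
```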

## Citation

If you use these weights, please cite the original EEG-DINO paper:

```bibtex
@article{jiang2024eegdino,
    title={Learning EEG Foundation Models via Hierarchical Self-Distillation},
    author={Jiang, Yuqi and others},
    year={2024}
}
```

## License

MIT (conversion and crate). Original model weights are subject to the EEG-DINO license.
