# EEG-DINO (safetensors, Burn-compatible)

Converted weights for the EEG-DINO foundation model, packaged as safetensors for use with the `eegdino` Rust inference crate.

Derived from: `eegdino/EEG-DINO`
## Files

| File | Model | Params | d_model | Heads | Layers | Size |
|---|---|---|---|---|---|---|
| `eeg_dino_small.safetensors` | Small | 4.6 M | 200 | 8 | 12 | 17 MB |
| `eeg_dino_medium.safetensors` | Medium | 33 M | 512 | 16 | 16 | 129 MB |
| `eeg_dino_large.safetensors` | Large | 201 M | 1 024 | 16 | 24 | 770 MB |
## What changed from the original weights

The original PyTorch `.pt` checkpoints from `eegdino/EEG-DINO` were converted with the following transformations:

- Filtered --- only student encoder weights are kept; teacher, projector, and loss tensors are discarded
- Renamed --- `module.student.` prefix stripped; `Sequential` indices mapped to descriptive names (e.g. `proj_in.0` → `proj_in.conv1`)
- Transposed --- linear weight matrices converted from PyTorch `[out, in]` to Burn `[in, out]` layout
- Format --- saved as float32 safetensors (no bf16)

The conversion script is at `scripts/convert_weights.py`.
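The filter/rename/transpose steps above can be sketched roughly as follows. This is a minimal illustration, not the actual `scripts/convert_weights.py`: the key names and the rename map here are invented examples, and the "transpose 2-D `.weight` tensors" rule is an assumed heuristic for linear layers.

```python
import numpy as np

# Illustrative checkpoint: key names follow the patterns described above,
# not necessarily the real EEG-DINO key set.
checkpoint = {
    "module.student.proj_in.0.weight": np.zeros((200, 19, 7), dtype=np.float32),
    "module.student.blocks.0.attn.qkv.weight": np.zeros((600, 200), dtype=np.float32),
    "module.teacher.proj_in.0.weight": np.zeros((200, 19, 7), dtype=np.float32),
}

# Sequential index -> descriptive name (hypothetical subset of the real map)
RENAMES = {"proj_in.0": "proj_in.conv1"}

def convert(ckpt):
    out = {}
    for name, tensor in ckpt.items():
        if not name.startswith("module.student."):
            continue  # drop teacher/projector/loss tensors
        name = name.removeprefix("module.student.")
        for old, new in RENAMES.items():
            name = name.replace(old, new)
        if name.endswith(".weight") and tensor.ndim == 2:
            tensor = tensor.T  # PyTorch [out, in] -> Burn [in, out]
        out[name] = tensor.astype(np.float32)  # force float32, no bf16
    return out

converted = convert(checkpoint)
```

After conversion only the two student tensors remain, the `Sequential` index is renamed, and the 2-D attention weight is transposed to `[in, out]`.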
## Usage with Rust

```toml
# Cargo.toml
[dependencies]
eegdino = "0.1"
burn = { version = "0.20", features = ["ndarray"] }
```
```rust
use eegdino::prelude::*;
use burn::backend::NdArray;

type B = NdArray;

let encoder = EegDinoEncoder::<B>::builder()
    .weights("eeg_dino_small.safetensors")
    .device(Default::default())
    .build()
    .unwrap();

// 19 channels, 10 seconds @ 200 Hz
let signal = vec![0.0f32; 19 * 2000];
let result = encoder.encode_raw(&signal, 1, 19, 2000).unwrap();
// result.shape == [1, 191, 200]
```
## Download

```shell
# All weights
hf download eugenehp/eegdino --local-dir weights

# Single model
hf download eugenehp/eegdino eeg_dino_small.safetensors --local-dir weights
```
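A downloaded file can be sanity-checked without loading it into Rust by reading the safetensors header directly: the format begins with an 8-byte little-endian header length, followed by a JSON table mapping tensor names to dtype, shape, and byte offsets. A minimal reader (pure stdlib, no safetensors dependency):

```python
import json
import struct

def read_safetensors_header(path):
    """Return the safetensors JSON header: tensor name -> dtype/shape/offsets."""
    with open(path, "rb") as f:
        (header_len,) = struct.unpack("<Q", f.read(8))  # u64 little-endian
        header = json.loads(f.read(header_len))
    header.pop("__metadata__", None)  # optional metadata entry, not a tensor
    return header

# Example: list tensor names and shapes of a downloaded model
# for name, info in read_safetensors_header("weights/eeg_dino_small.safetensors").items():
#     print(name, info["dtype"], info["shape"])
```

Every dtype should report `F32`, matching the float32-only conversion described above.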
## Numerical parity
These converted weights produce outputs with NRMSE < 1e-6 compared to the original PyTorch model on identical inputs:
| Model | Max abs error | NRMSE |
|---|---|---|
| Small | 8.5e-7 | 5.5e-7 |
| Medium | 2.1e-6 | 8.8e-7 |
| Large | 4.8e-6 | 5.9e-7 |
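For reference, NRMSE here is taken to be the RMS error normalized by the RMS of the reference output; this is one common definition, and the exact normalization used for the table above is an assumption.

```python
import numpy as np

def nrmse(converted, reference):
    """RMS of the difference, normalized by the RMS of the reference tensor."""
    converted = np.asarray(converted, dtype=np.float64)
    reference = np.asarray(reference, dtype=np.float64)
    err = np.sqrt(np.mean((converted - reference) ** 2))
    return err / np.sqrt(np.mean(reference ** 2))
```

With this definition, identical outputs give exactly 0, and errors are reported relative to the overall signal scale rather than in absolute units.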
## Citation

If you use these weights, please cite the original EEG-DINO paper:

```bibtex
@article{jiang2024eegdino,
  title={Learning EEG Foundation Models via Hierarchical Self-Distillation},
  author={Jiang, Yuqi and others},
  year={2024}
}
```
## Links

- Original model: `eegdino/EEG-DINO`
- Original code: github.com/miraclefish/EEG-DINO
- Rust crate: crates.io/crates/eegdino
- Rust repo: github.com/eugenehp/eegdino-rs
## License
MIT (conversion and crate). Original model weights are subject to the EEG-DINO license.