FEMBA: Efficient and Scalable EEG Analysis with a Bidirectional Mamba Foundation Model
Anna Tegon,
Thorir Mar Ingolfsson,
Xiaying Wang,
Luca Benini,
Yawei Li
July 2025
Abstract
FEMBA is a self-supervised EEG foundation model built on a bidirectional Mamba state-space architecture that scales linearly with sequence length, avoiding the quadratic complexity of transformer attention. Trained on more than 21,000 hours of unlabeled EEG and fine-tuned on multiple downstream tasks, FEMBA reaches 81.82% balanced accuracy (0.8921 AUROC) on the TUAB abnormal-EEG dataset and 0.949 AUROC on the TUAR artifact-detection dataset, while a compact 7.8M-parameter variant demonstrates suitability for resource-constrained devices.
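To make the architecture concrete, below is a minimal sketch of a bidirectional Mamba block, assuming the open-source mamba_ssm package (https://github.com/state-spaces/mamba). The layer sizes, the two-scan wrapper, and the residual-sum fusion are illustrative assumptions, not FEMBA's exact configuration; the point is that each scan runs in time linear in the sequence length.

```python
# Sketch of a bidirectional Mamba block (assumed design, not FEMBA's exact code).
import torch
import torch.nn as nn
from mamba_ssm import Mamba  # requires the mamba_ssm package and a CUDA device


class BiMambaBlock(nn.Module):
    """Processes an EEG token sequence in both time directions with linear cost."""

    def __init__(self, d_model: int = 256, d_state: int = 16):
        super().__init__()
        self.norm = nn.LayerNorm(d_model)
        # Two causal Mamba scans: one over the original order, one over the
        # time-reversed order. Each scan is O(L) in the sequence length L.
        self.fwd = Mamba(d_model=d_model, d_state=d_state, d_conv=4, expand=2)
        self.bwd = Mamba(d_model=d_model, d_state=d_state, d_conv=4, expand=2)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model), e.g. embedded EEG patches.
        h = self.norm(x)
        out_fwd = self.fwd(h)                                 # left-to-right scan
        out_bwd = self.bwd(h.flip(dims=[1])).flip(dims=[1])   # right-to-left scan
        return x + out_fwd + out_bwd                          # residual fusion


if __name__ == "__main__":
    tokens = torch.randn(2, 1024, 256, device="cuda")  # toy batch of EEG tokens
    block = BiMambaBlock().cuda()
    print(block(tokens).shape)                          # torch.Size([2, 1024, 256])
```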
Key Highlights
- Bidirectional Mamba state-space architecture scales linearly with sequence length, avoiding the quadratic complexity of transformer attention.
- Pre-trained on more than 21,000 hours of EEG data; fine-tuning achieves 81.82% balanced accuracy on TUAB and 0.949 AUROC on TUAR (see the fine-tuning sketch after this list).
- Compact 7.8M-parameter variant is suitable for embedded devices and edge deployment.
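The sketch below illustrates how a pretrained FEMBA-style encoder could be fine-tuned for a downstream binary task such as normal/abnormal classification on TUAB. The `encoder` module (mapping token embeddings of shape (batch, seq_len, d_model) to the same shape), the mean-pooling head, and the single-step training loop are assumptions for illustration, not the paper's training recipe.

```python
# Hypothetical fine-tuning sketch for a FEMBA-style pretrained encoder.
import torch
import torch.nn as nn


class AbnormalEEGClassifier(nn.Module):
    def __init__(self, encoder: nn.Module, d_model: int = 256, num_classes: int = 2):
        super().__init__()
        self.encoder = encoder                 # pretrained (assumed) EEG encoder
        self.head = nn.Linear(d_model, num_classes)

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        feats = self.encoder(tokens)           # (batch, seq_len, d_model)
        pooled = feats.mean(dim=1)             # mean-pool over time
        return self.head(pooled)               # (batch, num_classes) logits


def finetune_step(model, tokens, labels, optimizer):
    """One supervised update on labeled recordings (e.g. normal vs. abnormal)."""
    logits = model(tokens)
    loss = nn.functional.cross_entropy(logits, labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```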