Train-On-Request: An On-Device Continual Learning Workflow for Adaptive Real-World Brain-Machine Interfaces
Lan Mei, Cristian Cioflan, Thorir Mar Ingolfsson, Victor Kartsch, Andrea Cossettini, Xiaying Wang, Luca Benini
October 2024
Abstract
Train-On-Request (TOR) is an on-device continual learning workflow that lets users update brain-machine interface (BMI) models on demand, addressing inter-session variability while maintaining high accuracy in real-world settings. Evaluated on a motor-movement dataset collected with a wearable headband, TOR achieves up to 92% accuracy with calibration times as low as 1.6 minutes, reducing calibration effort by 46% compared to naive retraining. On a GAP9 RISC-V SoC, an on-device training step runs in 21.6 ms and consumes about 1 mJ, demonstrating feasibility for ultra-low-power edge devices.
Key Highlights
- Enables user-initiated on-device continual learning, cutting recalibration time by 46% while sustaining up to 92% accuracy.
- Runs on GAP9 with 21.6 ms training steps consuming about 1 mJ per update, suitable for battery-powered wearables.
- Demonstrates a practical workflow for adaptive BMIs deployed in real-world environments.
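The core idea of the workflow, a short burst of supervised training steps on freshly recorded calibration data triggered only when the user requests it, can be summarized in a few lines. The sketch below is a minimal, hypothetical illustration in PyTorch; the function, model, and buffer names are assumptions for exposition and do not reflect the paper's actual implementation on the GAP9 SoC.

```python
# Minimal sketch of a user-triggered "train on request" update.
# CompactBMINet, calibration_buffer, and on_request_update are
# illustrative names, not the paper's actual code.
import torch
import torch.nn.functional as F

def on_request_update(model: torch.nn.Module,
                      calibration_buffer,   # small set of recent labeled windows (x, y)
                      lr: float = 1e-3,
                      steps: int = 50):
    """Run a short, bounded burst of on-device training when the user asks for it."""
    optimizer = torch.optim.SGD(model.parameters(), lr=lr)
    model.train()
    for _ in range(steps):
        for x, y in calibration_buffer:          # bounded buffer keeps memory predictable
            optimizer.zero_grad()
            loss = F.cross_entropy(model(x), y)  # standard classification loss
            loss.backward()
            optimizer.step()
    model.eval()                                 # return to low-power inference mode
    return model
```

Bounding both the number of steps and the buffer size keeps the energy and time cost of each recalibration predictable, which is what makes the reported per-step figures (21.6 ms, about 1 mJ on GAP9) meaningful budgets for a battery-powered wearable.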