Thorir Mar Ingolfsson

Postdoctoral Researcher

ETH Zurich

Biography

I am a Postdoctoral Researcher at ETH Zurich working with Prof. Dr. Luca Benini, where I lead the machine learning research direction within the group. My work focuses on making AI accessible on the most resource-constrained devices: enabling foundation models and advanced ML systems to run on wearable biomedical devices that consume microwatts of power.

I develop efficient neural architectures that bridge the gap between large-scale foundation models and ultra-low-power edge deployment. My recent work on LUNA (NeurIPS 2025) achieves 300× reduction in computational costs while maintaining state-of-the-art performance for EEG analysis. I’m particularly interested in tiny recursion models, deep supervision techniques for time-series signals, and scalable foundation models for biosignals.

I’m looking for students and collaborators to work with me at the intersection of TinyML and biomedical AI. If you’re passionate about making AI work on edge devices, I’d love to hear from you.

Interests
  • Foundation Models for Biosignals
  • Tiny Recursion Models & Deep Supervision
  • TinyML & Edge AI Deployment
Education
  • Ph.D. in Electrical Engineering and Information Technology, 2025

    ETH Zurich

  • M.Sc. in Electrical Engineering and Information Technology, 2020

    ETH Zurich

  • B.Sc. in Electrical and Computer Engineering, 2018

    University of Iceland

717 citations (Google Scholar) · h-index 12

What I’m Currently Working On

As a researcher at the intersection of TinyML and biomedical AI, I'm currently exploring:

🧠 Foundation Models for Biosignals

I’m developing large-scale pre-trained models that can understand diverse biomedical signals with minimal fine-tuning. Our LUNA model (NeurIPS 2025) achieves topology-agnostic EEG analysis with 300× fewer FLOPs and 10× less memory than traditional approaches, enabling more robust and generalizable health monitoring systems.

Recent work: LUNA at NeurIPS 2025 | Code on GitHub

🔄 Tiny Recursion Models

I’m investigating how deep recursion and supervision techniques can be applied to time-series biosignals to improve model efficiency and accuracy. This approach enables more sophisticated temporal modeling while maintaining the ultra-low computational budgets required for edge deployment.

Focus areas: Deep supervision, recurrent architectures, temporal feature learning
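
As a toy illustration of the deep-supervision idea (my own sketch, not actual TRM training code), the snippet below applies a shared refinement rule recursively and averages a loss over every recursion step, rather than supervising only the final output; the hand-written update rule stands in for a small learned block:

```python
def mse(pred, target):
    """Mean squared error between two equal-length sequences."""
    return sum((p - t) ** 2 for p, t in zip(pred, target)) / len(pred)

def recursive_refine_with_deep_supervision(x, target, steps=4, lr=0.5):
    """Toy recursion: the same update (standing in for a shared tiny
    network block) is applied repeatedly, and the training loss is the
    mean of the per-step losses instead of only the final-step loss."""
    est = list(x)
    step_losses = []
    for _ in range(steps):
        # shared "block": nudge the estimate toward the target
        est = [e + lr * (t - e) for e, t in zip(est, target)]
        step_losses.append(mse(est, target))
    total_loss = sum(step_losses) / len(step_losses)
    return est, step_losses, total_loss
```

Because every recursion step contributes to the loss, intermediate estimates receive a gradient signal as well, which is the core benefit of deep supervision for recursive models.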

🚀 Edge AI Deployment

I’m developing hardware-aware methods to deploy foundation models and advanced ML systems on resource-constrained wearable devices. This involves co-designing algorithms and implementations to achieve microwatt-level power consumption while maintaining clinical-grade performance for applications like seizure detection and physiological monitoring.

Technologies: GAP9, RISC-V processors, TinyML optimization

Flagship Foundation Models

Scaling EEG intelligence from topology-agnostic transformers to linear-time state-space architectures

LUNA · NeurIPS 2025

Topology-agnostic EEG foundation model trained on 21k+ hours that achieves 0.921 AUROC with 300× fewer FLOPs and 10× lower memory.

FEMBA · EMBC 2025

Bidirectional Mamba EEG foundation model that scales linearly with sequence length and reaches 0.949 AUROC on TUAR.

Available Projects for Students

🎓 Looking for Students & Collaborators

I'm looking for motivated students to work with me on TinyML and biomedical AI research. If you're passionate about making AI work on edge devices and have a strong background in machine learning, I'd love to hear from you.

Why work with me?

  • 🔬 Publish at top-tier venues (NeurIPS, ICML, IEEE journals)
  • 🛠️ Access to cutting-edge hardware (GAP9, embedded ML platforms)
  • 🌍 Collaborate with leading research groups and industry partners
  • 🎯 Work on real-world applications with practical impact
  • 📚 Regular mentorship and career guidance

📋 Open MSc Thesis Topics (Snippet View)

Tiny Recursive Models for Time-Series

Adapt TRMs to non-visual domains (e.g., UCR, EEG) and analyse how deep supervision and adaptive halting impact accuracy and compute.
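
For intuition on the adaptive-halting part of this topic, here is a minimal ACT-style sketch (an illustration under my own assumptions, not code from any specific TRM paper): the model emits a halting probability at each recursion step, and inference stops once the cumulative probability crosses a threshold.

```python
def adaptive_halt(halt_probs, threshold=0.99):
    """Return the 1-indexed recursion step at which to stop: the first
    step where the cumulative halting probability reaches the threshold,
    or the last step if it never does."""
    cumulative = 0.0
    for step, p in enumerate(halt_probs, start=1):
        cumulative += p
        if cumulative >= threshold:
            return step
    return len(halt_probs)
```

Easy inputs can then exit after one or two recursions while hard ones use the full budget, which is exactly the accuracy-compute trade-off this thesis would measure.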

Quantized TRMs for Edge Deployment

Quantise TRMs to INT8/INT4, deploy them on GAP9 or Cortex-M, and study accuracy–energy trade-offs and on-device adaptive halting.
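
As a rough sketch of the quantisation step itself (assuming symmetric per-tensor INT8 with a single scale; an actual GAP9 or Cortex-M deployment would go through a proper toolchain):

```python
def quantize_int8(weights):
    """Symmetric per-tensor INT8 quantisation: map floats to integer
    codes in [-127, 127] using a single scale factor."""
    scale = max(abs(w) for w in weights) / 127 or 1.0  # guard all-zero tensors
    q = [max(-127, min(127, round(w / scale))) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from the integer codes."""
    return [v * scale for v in q]
```

The accuracy-energy trade-off the topic mentions comes from this rounding error: a symmetric INT4 code shrinks the range to [-7, 7], halving memory and energy per MAC again at the cost of much coarser weights.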


💡 Not Seeing Your Dream Topic?

If you’re interested in working with me but don’t see a perfect fit above, feel free to reach out. I’m always open to discussing new ideas at the intersection of:

  • Foundation models for biosignals
  • Tiny recursive models and deep supervision
  • Hardware-aware neural architecture search
  • TinyML deployment and optimisation

Projects

Recent Publications

Tip: Explore the full archive and filter by venue, topic, or year on the publications page.
(2025). Finetuning and Quantization of EEG-Based Foundational BioSignal Models on ECG and PPG Data for Blood Pressure Estimation. In EMBC 2025.

(2025). SzCORE: Seizure Community Open-Source Research Evaluation Framework for EEG-Based Seizure Detection. In Epilepsia 66 (Suppl. 3).

(2025). A Wearable Ultra-Low-Power System for EEG-Based Speech-Imagery Interfaces. In IEEE TBioCAS 2025.

(2025). CEReBrO: Compact Encoder for Representations of Brain Oscillations Using Efficient Alternating Attention. arXiv:2501.10885 (2025).

(2024). Train-On-Request: An On-Device Continual Learning Workflow for Adaptive Real-World Brain-Machine Interfaces. In IEEE BioCAS 2024.

Contact