Nobuyuki Oishi

Engineer, researcher, and maker of things that work.

I believe technology can and should make people more aware, more active, more alive. This conviction is what drives my research in human sensing × AI.

Now open to industry and research opportunities in sensing × AI.
01

About

Hi there, I'm Nobu!

I am originally from Shizuoka, Japan. I received my B.Eng and M.Eng from UEC Tokyo, and recently completed my PhD at the University of Sussex under the supervision of Prof. Daniel Roggen.

I am a hands-on researcher who loves translating ideas into reality. I enjoy the process of developing systems end-to-end—from getting my hands dirty with raw data collection and pre-processing, to training machine learning models and deploying them in the real world. I find joy in watching it all finally click into place.

Lately, I have been diving deep into how we can use LLMs and AI agents to accelerate research and development. If you are working on something similar, or just want to chat about the intersection of AI, wearables, and wellness, I'd love to connect!

Motto
自律して漸進
Evolve through self-mastery
Traveling around Japan by bicycle. Cape Sōya, the northernmost point in Japan.

Based in
Shizuoka,
Japan,
Earth
Languages
Japanese (native)
English (working proficiency)
Chinese (beginner)
Outside research
Guitar
Workout
Tea
02

Projects

PPDA project
PhD research at University of Sussex · 2021–2026
Physics-informed Data Augmentation via Virtual IMU Simulation

Collecting and annotating IMU data for human activity recognition is costly and hard to scale. We address this through Physically Plausible Data Augmentation — physics-informed augmentation enabled by WIMUSim, a differentiable simulation framework that minimises the gap between virtual and real IMU signals via gradient-based optimisation. Evaluated across supervised and self-supervised settings, PPDA consistently outperforms conventional augmentation, enabling more efficient and ethically responsible use of human-subject data.

Wearable Computing Human Activity Recognition Data Augmentation Physics Simulation
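The gradient-based fitting idea behind this can be illustrated with a toy sketch. Everything here — the one-parameter "simulator", the tilt angle, the numbers — is illustrative and not the actual WIMUSim implementation: a differentiable simulator maps a sensor-tilt parameter to a virtual accelerometer trace, and gradient descent on the mean squared error against a "real" recording recovers the tilt that minimises the virtual-to-real gap.

```python
import numpy as np

# Toy, self-contained sketch of gradient-based simulator fitting.
t = np.linspace(0.0, 1.0, 200)

def simulate(theta):
    # Virtual accelerometer trace for a sensor tilted by `theta` radians:
    # gravity projected onto the sensor axis plus a fixed motion component.
    return 9.81 * np.cos(theta) + 0.5 * np.sin(2 * np.pi * 2.0 * t)

# "Real" recording: generated with a ground-truth tilt of 0.3 rad plus noise.
rng = np.random.default_rng(0)
real = simulate(0.3) + rng.normal(0.0, 0.01, t.size)

theta, lr = 0.1, 1e-2            # initial guess and learning rate
for _ in range(300):
    resid = simulate(theta) - real                     # virtual-vs-real gap
    # Analytic gradient of the mean squared error w.r.t. theta
    grad = np.mean(2.0 * resid * (-9.81 * np.sin(theta)))
    theta -= lr * grad

print(f"recovered tilt: {theta:.3f} rad")   # close to the true 0.3 rad
```

In the real framework an autodiff library would supply the gradients for a full kinematic model; the principle of fitting simulation parameters by descending the signal-space error is the same.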
Collaboration with Sheba Medical Centre · 2021–2022
Detecting Parkinson's Freezing of Gait with Earables

"Can earables detect Freezing of Gait (FoG)?" Earables are ideally positioned to sense FoG and deliver immediate audio feedback — making them a promising assistive tool for Parkinson's disease patients. To explore this without burdening vulnerable patients with new data collection, we simulated earable IMU signals from existing VR motion capture data — the first simulation-based proof-of-concept for FoG detection with earables.

Freezing of Gait Parkinson's Disease VR Motion Capture Virtual IMU Earables
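The mocap-to-virtual-IMU step can be sketched in a few lines. This is a simplified illustration, not the actual pipeline — the trajectory, sampling rate, and flat vertical gravity handling are all assumptions: double-differentiate a tracked position to obtain linear acceleration, then add gravity to get the specific force an accelerometer at that point would report.

```python
import numpy as np

# Toy stand-in: synthesize a vertical head trajectory ("motion capture" of a
# 2 Hz bob during walking), then derive the virtual accelerometer reading an
# earable at that position would see. Values are illustrative only.
fs = 100.0                                        # mocap sampling rate (Hz)
t = np.arange(0.0, 2.0, 1.0 / fs)
z = 1.70 + 0.02 * np.sin(2 * np.pi * 2.0 * t)     # head height (m)

# Linear acceleration by double numerical differentiation
acc = np.gradient(np.gradient(z, 1.0 / fs), 1.0 / fs)

# An accelerometer measures specific force, so subtract gravity
# (z axis up, gravity pointing down): f = a - g  ->  a_z + 9.81
virtual_imu_z = acc + 9.81

print(virtual_imu_z[:3])
```

A full pipeline would also rotate the signal into the sensor frame from the tracked head orientation; this sketch keeps the sensor axis aligned with the vertical for clarity.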
Collaboration with Concordia University · 2023
Benchmarking HAR Model Robustness to Real-World Variability

How robust are HAR models to real-world variabilities — different sensor placements, mixed hardware, different users? We collected a multimodal dataset capturing controlled walking and uncontrolled salad preparation from participants in their 20s to 70s, to benchmark deep learning models' robustness against such variabilities. Using this dataset, we co-organised the "From Labels to Text" Open-ended Idea Challenge at ABC 2026.

Multimodal Open Dataset LLM+HAR Concordia
03

Selected Publications

[1]
Physically Plausible Data Augmentations for Wearable IMU-based Human Activity Recognition Using Physics Simulation
Oishi, N., Birch, P., Roggen, D., & Lago, P.
IEEE Sensors Journal · 2026 journal
[2]
WIMUSim: Simulating Realistic Variabilities in Wearable IMUs for Human Activity Recognition
Oishi, N., Birch, P., Roggen, D., & Lago, P.
Frontiers in Computer Science · 2025 journal
[3]
Detecting Freezing of Gait with Earables Trained from Motion Capture Data
Oishi, N., Heimler, B., Pellatt, L., Plotnik, M., & Roggen, D.
Proceedings of ISWC 2021 — International Symposium on Wearable Computers
04

Contact

I'm always happy to talk — about research, collaboration, or whatever's on your mind.