Research

Publications from the Accessible IoT project (2021-2025) under JST PRESTO Grant JPMJPR2132

30 Publications · 5 Years · 29 Research Themes

2025

Pro's Eyes: A Wearable System for Synchronous and Asynchronous Observational Pattern Learning

SIGGRAPH Asia 2025 Emerging Technologies

Qing Zhang, Jing Huang, Yuta Itoh, Thad Starner, Kai Kunze, Jun Rekimoto

Key Contribution: Wearable system enabling both synchronous and asynchronous observational pattern learning from experts

Pro’s Eyes is a wearable system that enables observational pattern learning from experts in both synchronous and asynchronous modes. The system captures and analyzes expert behavior patterns, facilitating skill transfer and learning through smart glasses technology.

smart-eyewear skill-learning educational-tools

A Multimodal Wearable Sensing System for Vocal Muscle Biofeedback in Singing Pitch Training

UbiComp 2025

Kanyu Chen, Akira Kato, Kai Kunze

Key Contribution: Multimodal wearable system providing real-time biofeedback for vocal muscle control during singing pitch training

This paper presents a multimodal wearable sensing system designed to provide vocal muscle biofeedback during singing pitch training. The system helps users develop better control over their vocal muscles through real-time physiological feedback.

wearable-iot physiological-sensing skill-learning

Biosensing, Enhanced Senses and Experience Design for Augmented Humans

UbiComp 2025

Jonna Häkkilä, Jani Mäntyjärvi, Zhengya Gong, Heiko Müller, Kati Pettersson, Ashley Colley, Roope Raisamo, Kai Kunze, Albrecht Schmidt

Key Contribution: Workshop exploring biosensing and enhanced senses for augmented human experiences

This workshop paper explores the intersection of biosensing, enhanced senses, and experience design for augmented humans. It brings together researchers and practitioners to discuss how physiological sensing can create novel human augmentation experiences.

physiological-sensing human-augmentation experience-design

Conscious or Unconscious Meditation? Haptic Interaction Design in Meditation Augmentation Using Physiological Sensing

UbiComp 2025

Danyang Peng, Kanyu Chen, Yun Suen Pai, Giulia Barbareschi, Kouta Minamizawa, Kai Kunze

Key Contribution: Exploration of haptic interaction design for meditation augmentation using physiological sensing to support both conscious and unconscious practices

This paper explores haptic interaction design for meditation augmentation, examining how physiological sensing can guide both conscious and unconscious meditation practices. The work investigates how different haptic feedback patterns can enhance meditative states.

haptic-feedback physiological-sensing wellbeing

Exploring Singing Breath: Physiological Insights and Directions for Breath-Aware Augmentation in Mixed Reality Design

UbiComp 2025

Kanyu Chen, Zhuang Chang, Qianyuan Zou, Kai Kunze

Key Contribution: Physiological study of singing breath patterns with design directions for breath-aware mixed reality augmentation

This paper explores the physiological aspects of singing breath control and proposes design directions for breath-aware augmentation in mixed reality. The work provides insights into respiratory patterns during singing and how they can be leveraged for immersive training experiences.

physiological-sensing mixed-reality skill-learning

Introduction of the VCSD Dataset: A Vocal Cords Dataset Using EMG and Ultrasonography for Singing Pitch Skill Recognition

UbiComp 2025

Kanyu Chen, Erwin Wu, Chen-Chieh Liao, Daichi Saito, Yichen Peng, Akira Kato, Hideki Koike, Kai Kunze

Key Contribution: First public dataset of vocal cord activity combining EMG and ultrasonography for singing pitch skill recognition

This paper introduces VCSD (Vocal Cords for Singing Dataset), a novel dataset combining EMG and ultrasonography data for singing pitch skill recognition. The dataset provides a valuable resource for researchers working on vocal training systems and physiological sensing.

physiological-sensing datasets skill-learning

2nd International Workshop on Mobile Cognitive-Augmenting and Cognition-Altering Technologies (CAT) in Human-Centered AI

MobileHCI 2025

Agnes Gruenerbl, Jan Spilski, Giulia Barbareschi, Kai Kunze, Passant ElAgroudy, Thomas Lachmann, Paul Lukowicz

Key Contribution: Second edition of workshop advancing research in mobile cognitive augmentation and human-centered AI

This workshop paper presents the second International Workshop on Mobile Cognitive-Augmenting and Cognition-Altering Technologies (mobiCHAI). Building on the first workshop, it continues to explore mobile technologies for cognitive augmentation within human-centered AI frameworks.

cognitive-augmentation mobile-computing human-centered-ai

Spread Your Wings: Demonstrating a Soft Floating Robotic Avatar with Flapping Wings for Novel Physical Interactions

SIGGRAPH 2025 Emerging Technologies

Mingyang Xu, Yulan Ju, Qing Zhang, Christopher Changmok Kim, Qingyuan Gao, Yun Suen Pai, Giulia Barbareschi, Matthias Hoppe, Kai Kunze, Kouta Minamizawa

Key Contribution: Soft floating robotic avatar with flapping wings enabling novel affective physical interactions

This demonstration presents a soft floating robotic avatar with flapping wings that enables novel forms of physical interaction. Building on the earlier Cuddle-Fish robot, it explores how wing-based movements can create engaging and affective human-robot interactions.

assistive-robotics haptic-feedback human-robot-interaction

BodyPursuits: Exploring Smooth Pursuit Gaze Interaction Based on Body Motion Targets

ETRA 2025

Anja Hansen, Sarah Makarem, Kai Kunze, Yexu Zhou, Michael Thomas Knierim, Christopher Clarke, Hans Gellersen, Michael Beigl, Tobias Röddiger

Key Contribution: Novel gaze interaction technique based on smooth pursuit eye movements tracking body motion, enabling hands-free interaction

This research explores how the natural eye movement pattern of smooth pursuit can be used for interaction with IoT systems and wearable devices. By tracking how users’ eyes follow their own body motions, the system creates new opportunities for hands-free control.

The technique has particular relevance for accessibility, enabling people with limited hand mobility to interact with technology using natural body movements and gaze. It represents an innovative approach to inclusive interface design in the context of ubiquitous computing.
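For readers unfamiliar with smooth-pursuit selection, the matching step is often implemented by correlating the gaze trajectory with each candidate target's trajectory over a short sliding window. The sketch below illustrates that general idea in Python; it is not the BodyPursuits implementation, and the window length, correlation threshold, and function names are assumptions.

```python
# Hypothetical sketch of correlation-based smooth pursuit matching.
# Illustrative only; not the BodyPursuits implementation.
import numpy as np

def pursuit_score(gaze_xy: np.ndarray, target_xy: np.ndarray) -> float:
    """Mean Pearson correlation between gaze and target trajectories (x and y)."""
    rx = np.corrcoef(gaze_xy[:, 0], target_xy[:, 0])[0, 1]
    ry = np.corrcoef(gaze_xy[:, 1], target_xy[:, 1])[0, 1]
    return (rx + ry) / 2.0

def select_target(gaze_xy, targets, threshold=0.8):
    """Return the index of the body-motion target the gaze follows most closely,
    or None if no target exceeds the correlation threshold."""
    scores = [pursuit_score(gaze_xy, t) for t in targets]
    best = int(np.argmax(scores))
    return best if scores[best] >= threshold else None

# Example: gaze that follows target 0 within a ~1 s window at 60 Hz.
t = np.linspace(0, 1, 60)
target0 = np.stack([np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)], axis=1)
target1 = np.stack([t, -t], axis=1)
gaze = target0 + 0.05 * np.random.randn(*target0.shape)  # noisy pursuit of target 0
print(select_target(gaze, [target0, target1]))  # -> 0
```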

cognitive-augmentation

Eye-Tracking for Cognitive Well-Being: Balancing Detection and Ethical Feedback

ETRA 2025

Christopher Changmok Kim, Matthias Hoppe, Kai Kunze

Key Contribution: Framework for ethical eye-tracking-based cognitive well-being monitoring that balances detection accuracy with user privacy and autonomy

As eye-tracking becomes increasingly integrated into wearable devices and IoT systems, questions arise about how to ethically monitor and provide feedback about users’ cognitive states. This research addresses these concerns head-on, proposing design principles that prioritize user autonomy and well-being.

The work is particularly relevant for accessibility applications where cognitive monitoring could support people with attention difficulties or mental health challenges, as well as neurodivergent individuals, while respecting their privacy and agency.

cognitive-augmentation smart-eyewear

TIEboard: A Digital Educational Tool for Kids Geometric Learning

Proc. ACM IMWUT

Arooj Zaidi, Giulia Barbareschi, Kai Kunze, Yun Suen Pai, Junichi Yamaoka

Key Contribution: Interactive digital tool that makes geometric learning accessible and engaging for children

TIEboard represents an innovative approach to making mathematics education more accessible and inclusive for children. By combining tangible interaction with digital feedback, the tool creates an engaging learning environment that supports diverse learning styles.

The research demonstrates how IoT-enabled educational tools can bridge the gap between abstract mathematical concepts and concrete, hands-on learning experiences.

educational-tools inclusive-design

Cuddle-Fish: Exploring a Soft Floating Robot with Flapping Wings for Physical Interactions

Augmented Humans 2025

Mingyang Xu, Jiayi Shao, Yulan Ju, Ximing Shen, Qingyuan Gao, Weijen Chen, Qing Zhang, Yun Suen Pai, Giulia Barbareschi, Matthias Hoppe, Kouta Minamizawa, Kai Kunze

Key Contribution: Novel soft floating robot design for safe and engaging physical interaction

Cuddle-Fish presents an innovative soft floating robot with flapping wings designed for safe physical interactions. The soft materials and gentle motion make it suitable for therapeutic and assistive applications, particularly in contexts requiring calming, non-threatening robotic presence.

assistive-robotics

2024

mobiCHAI 2024: 1st International Workshop on Mobile Cognitive-Augmenting and Cognition-Altering Technologies (CAT) in Human-Centered AI

MobileHCI 2024

Passant ElAgroudy, Jan Spilski, Giulia Barbareschi, Kai Kunze, Thomas Lachmann, Paul Lukowicz

Key Contribution: First workshop establishing the research agenda for mobile cognitive augmentation technologies

This workshop paper introduces the first International Workshop on Mobile Cognitive-Augmenting and Cognition-Altering Technologies (mobiCHAI). The workshop explores how mobile technologies can augment human cognitive capabilities within human-centered AI frameworks.

cognitive-augmentation mobile-computing human-centered-ai

2023

Soma Express Kit: Understanding the Somaesthetic Experience of People with Visual Impairment

IoT 2023

Michi Kanda, Kai Kunze

Key Contribution: Novel IoT toolkit that leverages heightened somatosensory capacities of people with visual impairments for cross-ability interaction

This research introduces an innovative approach to understanding and sharing the embodied experiences of people with visual impairments. The Soma Express Kit uses IoT sensors and multi-sensory feedback to create new channels of communication between people with and without visual impairments.

By embracing the somaesthetic potential of people with visual impairments, this work fosters empathy and enables richer collaborative cross-ability interactions. The toolkit represents a significant contribution to inclusive IoT design, moving beyond traditional assistive technology paradigms.

visual-accessibility wearable-iot

Affective Umbrella: A Wearable System to Visualize Heart and Electrodermal Activity, towards Emotion Regulation through Somaesthetic Appreciation

Augmented Humans 2023

Kanyu Chen, Jiawen Han, Holger Baldauf, Ziyue Wang, Dunya Chen, Akira Kato, Jamie A Ward, Kai Kunze

Key Contribution: Wearable system that visualizes physiological signals to support emotion regulation through somaesthetic appreciation

The Affective Umbrella is a wearable system that visualizes heart and electrodermal activity in real-time. The work explores how somaesthetic appreciation of one’s own physiological signals can support emotion regulation and self-awareness.

wearable-iot physiological-sensing affective-computing

First Bite/Chew: distinguish typical allergic food by two IMUs

Augmented Humans 2023

Juling Li, Xiongqi Wang, Junyu Chen, Thad Starner, George Chernyshov, Jing Huang, Yifei Huang, Kai Kunze, Qing Zhang

Key Contribution: Novel IMU-based system for detecting allergic foods through bite and chew pattern recognition

This paper presents a wearable system that uses two IMU sensors to recognize common allergenic foods from the very first bite by analyzing bite and chew motion patterns. The work offers a novel approach to food allergy detection that could help users avoid allergen exposure.
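As a rough illustration of how such bite/chew recognition pipelines are commonly structured, the sketch below extracts simple statistics from windowed IMU data and feeds them to an off-the-shelf classifier. It is not the authors' pipeline; the features, food classes, and choice of a random forest are assumptions.

```python
# Hypothetical sketch: classifying chewing windows from two IMU streams.
# Not the First Bite/Chew pipeline; features and labels are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def window_features(jaw_imu: np.ndarray, head_imu: np.ndarray) -> np.ndarray:
    """Simple per-window statistics over 6-axis IMU data (accel + gyro)."""
    feats = []
    for stream in (jaw_imu, head_imu):
        feats += [stream.mean(axis=0), stream.std(axis=0),
                  np.abs(np.diff(stream, axis=0)).mean(axis=0)]
    return np.concatenate(feats)

# Toy training data: windows of shape (samples, 6) per IMU, labels = food class.
rng = np.random.default_rng(0)
X = np.stack([window_features(rng.normal(size=(100, 6)), rng.normal(size=(100, 6)))
              for _ in range(40)])
y = rng.integers(0, 3, size=40)   # e.g. 0=nuts, 1=shellfish, 2=other (illustrative)

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
print(clf.predict(X[:3]))
```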

wearable-iot health-monitoring food-detection

2022

EyeMove: Towards Mobile Authentication using EOG Glasses

Augmented Humans 2022

Kirill Ragozin, Karola Marky, Jie Lu, Kai Kunze

Key Contribution: Novel mobile authentication approach using electrooculography captured by smart glasses

EyeMove explores using EOG (electrooculography) glasses for mobile authentication. The system leverages eye movement patterns captured through wearable glasses as a biometric authentication method, offering a novel approach to secure access control.
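Eye-movement biometrics of this kind are often framed as template matching: features extracted from the EOG channels during an enrolled gesture are compared against a stored template at login. The sketch below shows that general pattern; it is not the EyeMove method, and the features, distance metric, and threshold are assumptions.

```python
# Hypothetical sketch of template-based verification from EOG eye-movement features.
# Not the EyeMove method; signal processing and threshold are illustrative assumptions.
import numpy as np

def eog_features(h_eog: np.ndarray, v_eog: np.ndarray) -> np.ndarray:
    """Crude eye-movement descriptors from horizontal/vertical EOG channels."""
    vel_h, vel_v = np.diff(h_eog), np.diff(v_eog)
    return np.array([vel_h.std(), vel_v.std(),
                     np.abs(vel_h).max(), np.abs(vel_v).max(),
                     (np.abs(vel_h) > 3 * vel_h.std()).sum()])  # rough saccade count

def verify(sample_feats: np.ndarray, template: np.ndarray, tau: float = 2.0) -> bool:
    """Accept if the sample's features are close enough to the enrolled template."""
    return bool(np.linalg.norm(sample_feats - template) < tau)

# Enrollment: average features over a few gesture repetitions (toy signals).
rng = np.random.default_rng(1)
enroll = [eog_features(rng.normal(size=500), rng.normal(size=500)) for _ in range(5)]
template = np.mean(enroll, axis=0)
probe = eog_features(rng.normal(size=500), rng.normal(size=500))
print(verify(probe, template))
```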

smart-eyewear authentication security

Ethereal Phenomena - Interactive Art, Meditation, and Breathing Biofeedback: From Mind and Body Wellness Towards Self-Transcendence

TEI 2022

Silvana Malaver Turbay, Igor Igorevich Segrovets, George Chernyshov, Jiawen Han, Christopher Changmok Kim, Kai Kunze

Key Contribution: Extended exploration of breathing biofeedback in interactive art, connecting mind-body wellness to self-transcendence

This extended work on Ethereal Phenomena presents an interactive art installation that uses breathing biofeedback to facilitate meditation and self-transcendence. The system bridges mind-body wellness practices with technological augmentation through real-time physiological sensing and visualization.

artistic-expression physiological-sensing cognitive-augmentation

2021

Ethereal Phenomena

SIGGRAPH Asia 2021 Art Gallery

S Malaver, N Nieto, I Segrovets, C Rizzi, G Chernyshov, C Kim, Kai Kunze

Key Contribution: Interactive art installation exploring meditation and breathing through physiological sensing and visualization

Ethereal Phenomena is an interactive art installation that explores the connection between meditation, breathing patterns, and visual representation. The work uses biofeedback to create immersive experiences that visualize physiological states.

artistic-expression physiological-sensing interactive-media

Affective Umbrella: Towards a Novel Sensor Integrated Multimedia Platform Using Electrodermal and Heart Activity in an Umbrella Handle

MUM 2021

Kanyu Chen, Jiawen Han, George Chernyshov, Christopher Kim, Ismael Rasa, Kai Kunze

Key Contribution: Novel sensor-integrated platform embedded in an umbrella handle for capturing physiological signals in everyday contexts

This paper introduces the Affective Umbrella, a novel platform that integrates electrodermal and heart activity sensors into an umbrella handle. The system explores unobtrusive physiological monitoring in everyday situations, turning a common object into a wearable sensing device.

wearable-iot physiological-sensing affective-computing