


A Lightweight Method to Detect the Insufficient Brushing Regions Using a Six-Axis Inertial Sensor

2017 IEEE 6th Global Conference on Consumer Electronics (GCCE 2017)

Magic Ring: A Self-contained Gesture Input Device on Finger

Lei Jing, Zixue Cheng, Yinghui Zhou, Junbo Wang, Tongjun Huang
Conference: Proceedings of the 12th International Conference on Mobile and Ubiquitous Multimedia

Welcome to UoA iSensingLab!

We envision a future where highly personalized services are seamlessly integrated into daily activities. To achieve this vision, we:

1. Develop sensing technologies to digitize human motions and perceptions
2. Design intelligent fusion methods for multi-sensory data
3. Create proof-of-concept systems as cornerstones for further practical applications

We are actively seeking students at all levels. If you are interested in joining us, please read the “Application” page first.




Lei Jing (荊 雷), Professor
Special Researcher
Daisuke MIYATA
Chenghong LU (盧 成鸿)
Michiko Hoshi (星 美智子), Secretary
Wei GUO (郭 葳), D3
Zeping YU (于 泽平), D2
Kazuma SATO (佐藤 一摩), M2
Shunsuke SUZUKI (鈴木 竣介), M1
Tiantian GENG (耿 甜甜), M1
Ye LI (李 燁), M1
Yuren ZHANG (張 馭仁), M1
Zhongnan PU (蒲 中南), M1
Yunhao ZHANG (張 雲豪), M1
Charles Chisom PRINCE, M1
Ryoma SAKASAI (逆井 竜馬), S4
Tsubasa IWAMOTO (岩本 翼), S4
Taise YAKUSHINJI (薬真寺 泰生), S4
Atsuya WATANABE (渡邉 温也), S4
Ryoma HASHIMOTO (橋本 凌真), S4
Issei HYAKUDA (百田 一星), S3
Andy Takuya ROSSI (ロッシ アンディ拓也), S3
Taiki ICHIKAWA (市川 大輝), S3
Kenshin OUTAKE (大竹 健心), S3
Youchen ZHANG (張 友辰), S1



We are looking for highly motivated students with an interest in electronics engineering, control theory, machine learning, and cyber-physical systems. The ideal candidate will be enrolled at the PhD level in the Computer Science and Engineering Department at The University of Aizu. The student will be supervised by Dr. Lei JING and will join JLab, based at The University of Aizu, Fukushima, Japan.


The envisaged research will focus on one of the following aspects.

E-textile sensor design and energy harvesting

How to use e-textile materials to design new 3D tactile sensors and piezoelectric energy-harvesting methods.

Data gloves development and applications

How to find the optimal combination and deployment of different kinds of sensors in a glove for a given application scenario.
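One naive way to frame this kind of sensor-selection problem is an exhaustive search over sensor subsets. The sketch below is purely illustrative: the sensor names, "information scores", and costs are made-up stand-ins, not the lab's actual glove hardware or evaluation metric.

```python
from itertools import combinations

# Toy sensor catalog for a data glove: each entry has an assumed information
# score (how much it helps the target application) and a hardware cost.
sensors = {
    "imu_index":     {"score": 5, "cost": 3},
    "imu_thumb":     {"score": 4, "cost": 3},
    "flex_middle":   {"score": 3, "cost": 1},
    "pressure_palm": {"score": 2, "cost": 2},
}

def best_subset(sensors, target_score):
    """Exhaustively search all subsets; return the cheapest one whose
    summed score meets the target, as (cost, names)."""
    best = None
    names = list(sensors)
    for r in range(1, len(names) + 1):
        for combo in combinations(names, r):
            score = sum(sensors[n]["score"] for n in combo)
            cost = sum(sensors[n]["cost"] for n in combo)
            if score >= target_score and (best is None or cost < best[0]):
                best = (cost, combo)
    return best

cost, chosen = best_subset(sensors, target_score=8)
```

Exhaustive search is only feasible for a handful of candidate sensors; with larger catalogs one would switch to greedy or combinatorial-optimization methods.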

Intelligent data fusion with multimodal sensors

How to improve measurement in dynamic, time-variant systems using nonlinear state-estimation methods (e.g., the extended Kalman filter, EKF) or computational learning methods (e.g., convolutional neural networks, CNNs).
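To make the state-estimation idea concrete, here is a minimal scalar EKF sketch: it estimates a constant hidden state from noisy nonlinear measurements. The measurement model h(x) = x² and all parameter values are illustrative assumptions, not a system actually used in the lab's sensors.

```python
import numpy as np

def ekf_step(x, P, z, Q=1e-4, R=1e-2):
    """One predict/update cycle of a 1-D EKF with identity state transition."""
    # Predict: f(x) = x (constant state), so the transition Jacobian F = 1.
    x_pred = x
    P_pred = P + Q

    # Update: linearize the measurement h(x) = x**2 around the prediction.
    H = 2.0 * x_pred                 # dh/dx, measurement Jacobian
    y = z - x_pred**2                # innovation
    S = H * P_pred * H + R           # innovation covariance
    K = P_pred * H / S               # Kalman gain
    return x_pred + K * y, (1.0 - K * H) * P_pred

rng = np.random.default_rng(0)
true_x = 0.5
x_est, P = 1.0, 1.0                  # start deliberately far from the truth
for _ in range(50):
    z = true_x**2 + rng.normal(0.0, 0.1)   # noisy nonlinear measurement
    x_est, P = ekf_step(x_est, P, z)
```

After a few dozen updates the estimate settles near the true state, and the covariance P shrinks accordingly; a real IMU application would use a multi-dimensional state and a motion model in place of the identity transition.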

Multiple models fusion method

How to improve prediction with less training data, for example by combining multiple hand models such as a kinematic biomechanical model, a dynamic kinetic model, and a nonlinear learning model.
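One simple fusion scheme for combining such models is inverse-variance weighting of their outputs. In the sketch below the three "models" are stand-ins (random perturbations of a ground-truth joint-angle vector), not the lab's actual kinematic, kinetic, or learned models, and the variance values are assumed.

```python
import numpy as np

def fuse(predictions, variances):
    """Combine model outputs, weighting each by its inverse error variance."""
    w = 1.0 / np.asarray(variances)
    w /= w.sum()
    return np.average(predictions, axis=0, weights=w)

rng = np.random.default_rng(1)
truth = np.array([30.0, 45.0, 10.0])   # e.g. three joint angles in degrees
variances = [4.0, 1.0, 9.0]            # assumed per-model error variances
# Simulate three imperfect models as noisy views of the truth.
preds = [truth + rng.normal(0.0, v**0.5, size=3) for v in variances]

fused = fuse(preds, variances)
```

The fused estimate has lower expected error than any single model because accurate models receive proportionally larger weights; in practice the variances would be estimated from validation data.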

Robust hand-motion sensing solutions are expected to be developed and applied in practical applications such as sign language recognition, robot control, and peach harvesting.


The candidate needs an undergraduate or master's degree in engineering, ideally with a concentration in mechatronics, automation, or machine learning, or an (applied) mathematics degree, ideally with a concentration in optimization, dynamics, and control. Previous research experience in electronics, control theory, or machine/deep learning is desired. We require an excellent academic track record and extraordinary passion that demonstrate the potential to carry out the project successfully together.

The envisaged starting date is Fall 2023 or Spring 2024. Applications will be reviewed as they are received, and the positions will remain open until filled.


Applications should include a CV and supporting documents. These should be sent electronically as a single PDF file to jing_lab at . Submitted documents should include:

A curriculum vitae

An academic transcript

A statement of interest, including a summary of past research activities, explaining why you are interested and how you fit the project. Please keep it brief (two pages maximum).

Names and contact information of two references.