From muscle signals to motor learning: multisensor-based gesture recognition and serious game design
Human motion recognition is a key component of human-machine interaction. It is typically performed using wearable sensors attached to the human body, which collect motion data and enable accurate detection and interpretation of human movements.
Hand gesture recognition using electromyography (EMG) signals combined with machine learning can support the development of assistive devices for upper limb amputees. Similarly, lower limb motion recognition using inertial measurement units (IMUs) and EMG can facilitate the development of home-based motor learning platforms and enable effective monitoring of training outcomes for both clinical populations and athletes.
This thesis explores the development of human-machine interfaces that leverage bio-signals and multisensory feedback to enable hand gesture recognition and support knee joint motor learning through interactive exergame environments.