The current goal of this project is to encode skilled musicians' hand motion in a robotic model that demonstrates efficient biomechanical movements, helping musicians prevent hand injuries. We have set up a motion capture system to measure the principal biomechanical components of hand motion in two groups: musicians with disabling injuries such as focal dystonia (commonly known as writer's cramp) and healthy musicians. Kinematic and dynamic features of the motion models of the two groups are characterized and compared.
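As a minimal sketch of what such a group comparison could look like (not the project's actual analysis), the snippet below compares one hypothetical kinematic feature, such as mean finger flexion speed per trial, between the two groups using Welch's t-test. The numeric arrays are placeholders; real values would come from the motion capture recordings.

```python
import numpy as np
from scipy import stats

# Hypothetical per-trial feature values for each group (placeholders only).
healthy = np.array([1.8, 2.1, 1.9, 2.3, 2.0])
dystonic = np.array([2.9, 3.4, 3.1, 2.7, 3.3])

# Welch's t-test: does the feature differ between groups (unequal variances allowed)?
t_stat, p_value = stats.ttest_ind(healthy, dystonic, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```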
The hand motion of the musicians is captured with a 14-sensor 5DT dataglove (Figure 1) and a Natural Point passive motion tracking camera system (Figure 2). Musicians perform on a Yamaha AvantGrand N3 hybrid piano (Figure 3) equipped with MIDI (Musical Instrument Digital Interface) output. The hand, arm, and shoulder motions during a performance are recorded along with the MIDI data. Computer software is being developed to analyze the motion data and the MIDI data and to automatically identify and evaluate motion disorders. The motion model is then applied to a simulated robot system to better understand these disorders.
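To make the analysis step concrete, here is a minimal sketch of how recorded glove data and MIDI events might be aligned and summarized. It is not the project's software: the file names, the CSV layout (a timestamp followed by 14 sensor values per row), the 50 ms analysis window, and the use of the `mido` library to read MIDI files are assumptions for illustration only.

```python
import csv
import numpy as np
import mido

def load_glove_csv(path):
    """Load glove samples as (timestamps, 14-column sensor matrix)."""
    times, samples = [], []
    with open(path, newline="") as f:
        for row in csv.reader(f):
            times.append(float(row[0]))                # seconds since recording start
            samples.append([float(v) for v in row[1:15]])
    return np.array(times), np.array(samples)

def load_midi_note_times(path):
    """Return onset times (seconds) of note_on events from a MIDI file."""
    onsets, t = [], 0.0
    for msg in mido.MidiFile(path):                    # iteration yields messages with delta times in seconds
        t += msg.time
        if msg.type == "note_on" and msg.velocity > 0:
            onsets.append(t)
    return np.array(onsets)

def flexion_speed(times, samples):
    """Per-sensor finite-difference speed of the glove flexion signals."""
    return np.abs(np.gradient(samples, times, axis=0))

if __name__ == "__main__":
    t, x = load_glove_csv("glove_recording.csv")       # hypothetical file
    onsets = load_midi_note_times("performance.mid")   # hypothetical file
    speed = flexion_speed(t, x)
    window = 0.05                                      # assumed 50 ms window around each keystroke
    for onset in onsets[:10]:
        mask = (t >= onset - window) & (t <= onset + window)
        if mask.any():
            print(f"note at {onset:.3f}s: peak flexion speed {speed[mask].max():.2f} units/s")
```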
This study, focused on the biomechanical demands of performance on the musician's hand, has a broader impact on school musicians and teachers, popular and amateur musicians, and other occupational hand users. It can have a profound impact on music education, since it introduces a robotic system into music learning for the first time. Non-musicians who perform repetitive, complex hand tasks (e.g., computer use) are often unaware of biomechanically incorrect technique, and many experience hand problems.