Following Atau Tanaka’s Meta Gesture Music project, the BioMusic Project is developing musical instruments controlled by muscular activity, sensed through electromyography (EMG). Data from these sensors are rich but can be difficult to interpret: once the signals are captured, relating muscular activity to specific physical and musical gestures demands both technical and aesthetic sophistication if genuine expressivity is to emerge. This talk presents an overview of the state of the art in interactive machine learning algorithms for sensor interpretation, feature extraction, and gesture recognition, and describes how BioMusic is adapting these techniques for low-level data acquisition as well as higher-level musical control, creating unique and customisable musical instruments for a broad range of performers.
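
To give a flavour of the low-level feature extraction the talk touches on, the sketch below computes a sliding-window root-mean-square (RMS) envelope, one of the most widely used EMG features, over a synthetic signal. This is a generic illustration rather than the BioMusic pipeline itself; the sampling rate, window and hop sizes, and the simulated contraction are all assumptions chosen for the example.

```python
import numpy as np

def rms_envelope(signal: np.ndarray, window: int = 200, hop: int = 50) -> np.ndarray:
    """Sliding-window RMS envelope of a raw EMG signal.

    The RMS of each window tracks overall muscular effort,
    smoothing out the raw high-frequency oscillation. The
    resulting envelope is a typical input to gesture classifiers.
    """
    frames = []
    for start in range(0, len(signal) - window + 1, hop):
        frame = signal[start:start + window]
        frames.append(np.sqrt(np.mean(frame ** 2)))
    return np.array(frames)

# Synthetic stand-in for one second of single-channel EMG at 1 kHz:
# zero-mean noise whose amplitude rises mid-signal, mimicking a contraction.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 1000)
amplitude = 0.1 + 0.5 * np.exp(-((t - 0.5) ** 2) / 0.01)
emg = rng.normal(0.0, amplitude, size=t.shape)

features = rms_envelope(emg)
print(features.round(3))  # envelope peaks around the simulated contraction
```

In an interactive machine learning setting, envelopes like this (often computed per channel) would be paired with performer-supplied gesture labels to train a recogniser, which is what allows each instrument to be customised to an individual performer's physiology and playing style.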