Despite significant advances in the engineering of biomechatronic devices, such as multiarticulated prosthetic hands, their functional utility remains limited by the difficulty of sensing the volitional intent of the human user. Traditionally, electromyography using surface electrodes has been the dominant method for sensing volitional intent from myoelectric signals. A major challenge with surface electromyography is achieving intuitive and robust proportional control of multiple degrees of freedom. To address this problem, our group has been one of the pioneers of a new control method, sonomyographic control, that overcomes several limitations of surface electromyography. In sonomyography, ultrasound senses mechanical muscle deformation rather than electrical activation, so the resulting control signals can directly control the position of the end effector. Compared to myoelectric signals, which control end-effector velocity, sonomyographic control may be more congruent with the proprioception remaining in the residual limb. In addition, ultrasound imaging can non-invasively resolve individual muscles, including those deep within the tissue, and detect dynamic activity in different functional compartments in real time. This alignment between activation and proprioception, together with the ability to distinguish small but significant differences in deep muscle activation (e.g., different grasp patterns), may provide more intuitive control signals for prosthetic devices. In this talk, I will describe results from our current studies with amputees and future directions. I will also describe our work on developing miniaturized, low-power ultrasound instrumentation for wearable systems, as well as extensions of this research to exoskeletons and other applications in rehabilitation.