A major goal for brain machine interfaces (BMIs) is to allow patients to control prosthetic devices with many independent degrees of freedom. Devices such as robotic arms and hands require this high dimensionality of control to restore the full range of actions found in natural movement. Current BMI strategies fall well short of this goal, allowing control of only a few degrees of freedom at a time. In this paper we present work towards decoding 27 joint angles of the shoulder, arm, and hand as subjects perform reach-and-grasp movements. We also extend previous work on examining and optimizing electrode recording depth to maximize the movement information that can be extracted from recorded neural signals.
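To make the decoding problem concrete, the sketch below shows one common way such continuous kinematic decoding can be set up: a linear (ridge-regression) mapping from lagged, binned neural firing rates to the 27 joint angles. This is only an illustrative assumption, not the method used in this paper; the array names, bin counts, lag window, and regularization value are all hypothetical.

```python
# Minimal sketch (assumed, not the authors' decoder) of linear joint-angle
# decoding from binned neural activity.
#   rates  : (n_time_bins, n_channels) spike counts per bin  -- assumed input
#   angles : (n_time_bins, 27) joint angles (shoulder, arm, hand) -- assumed input
import numpy as np

def build_lagged_features(rates: np.ndarray, n_lags: int = 10) -> np.ndarray:
    """Stack the current bin and the previous n_lags-1 bins of firing rates
    so the decoder can use recent neural history at each time step."""
    n_bins, n_channels = rates.shape
    padded = np.vstack([np.zeros((n_lags - 1, n_channels)), rates])
    return np.hstack([padded[i:i + n_bins] for i in range(n_lags)])

def fit_linear_decoder(rates: np.ndarray, angles: np.ndarray,
                       n_lags: int = 10, ridge: float = 1e-3) -> np.ndarray:
    """Fit joint-angle weights by ridge regression on lagged firing rates."""
    X = build_lagged_features(rates, n_lags)
    X = np.hstack([X, np.ones((X.shape[0], 1))])  # bias column
    # Closed-form ridge solution: (X'X + lambda*I)^-1 X'Y
    W = np.linalg.solve(X.T @ X + ridge * np.eye(X.shape[1]), X.T @ angles)
    return W

def decode_angles(rates: np.ndarray, W: np.ndarray, n_lags: int = 10) -> np.ndarray:
    """Predict all 27 joint angles for each time bin from neural activity."""
    X = build_lagged_features(rates, n_lags)
    X = np.hstack([X, np.ones((X.shape[0], 1))])
    return X @ W
```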