Prosthetic Control Systems and Grace of Movement
There is a large group of malfunctions of the human body collectively known as neuromuscular disorders: disorders that persistently disrupt the reliable transmission of neural signals from the brain to the muscles of one part of the body or another. The problem can lie anywhere in the chain, from a damaged cerebellum, as in cerebral palsy, to deterioration of the muscles themselves, as in muscular dystrophy, with the peripheral neuropathies lying in between.
Whilst the underlying causes of the signal degradation are numerous, the result is the same: a body that is not entirely under the direct mental control of its resident tenant. This is a severe problem, frequently bringing a marked reduction in quality of life, and indeed, in many cases, in the length of life available.
For a great many, being graceful and smooth of movement - or even simply not scattering every item on a desk they try to lean over - is a fondly held dream. For others, even being able to walk unaided holds a similar place of reverence.
The beginnings of one potential solution to this issue, whilst still distant, have begun to rear their head. It comes from a relatively unlikely source: prosthetic arm research. Specifically, work carried out by the Motorlab at the University of Pittsburgh in 2009.
The researchers were experimenting with increasing the sensitivity of neural reading on a robotic arm grafted to a research monkey. The codes for manipulating the wrist had only been completed that same year, and experiments were underway to have the monkey manipulate the wrist joint of the robotic arm with its mind. These experiments are still ongoing at the time of writing.
They are, however, not the focus of interest here; part of the robot arm is. One side effect of an increased level of control is that the sensors have to be far more sensitive, which brings the danger of the arm misinterpreting a signal and moving too suddenly, or beyond safe reach. Likewise, there is a problem somewhat akin to a jiggling camera, as the processors attempt to make sense of data sent to them rapid-fire by the subconscious brain. In short, natively connected, the arm would be a juddering mess, utterly useless for actual interaction.
So it is not attached to the brain - not directly, anyway. Between the nerve-signal identification system and the arm motors sits an expert system, a basic AI. It examines the incoming signals and attempts to determine what the arm is trying to do. From this it can filter out, in near real-time, all the little judders and sub-movements that are not part of the main aim - reaching up and around, for example. It is this filtered data, rather than the original mess of nervous signals, that is passed on to the motors.
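The filtering stage described above can be sketched very roughly in code: smooth a noisy stream of decoded motor commands, and rate-limit the output so a misread signal cannot produce a sudden, unsafe jump. Everything here - the function name, the smoothing constant, the step limit - is an illustrative assumption, not the Motorlab design:

```python
# Illustrative sketch only: damp high-frequency judder with an
# exponential moving average, then rate-limit each step so a single
# misinterpreted signal cannot fling the arm suddenly or unsafely.
# All names and constants are assumptions made for illustration.

def smooth_commands(samples, alpha=0.2, max_step=0.05):
    """Return a smoothed, rate-limited copy of `samples` (arm positions)."""
    if not samples:
        return []
    out = [samples[0]]
    estimate = samples[0]
    for raw in samples[1:]:
        # Exponential moving average: damps the little sub-movements.
        estimate = alpha * raw + (1 - alpha) * estimate
        # Rate limiter: never move further than max_step per tick.
        step = max(-max_step, min(max_step, estimate - out[-1]))
        out.append(out[-1] + step)
    return out
```

A real controller would of course operate on multi-axis signals in real time, and the system described above reportedly infers intent rather than merely smoothing; but the principle - suppressing the judder while preserving the overall trajectory - is the same.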
For the monkey, the result is the arm smoothly and fluidly doing what he or she wanted it to do: reach up, grasp a knob, and turn it so that water flows or a piece of food is plucked. The same concept, applied to a human with CP, could quite plausibly smooth out the movement aberrations, producing a grace and poise that the native brain is simply not capable of commanding without help.
Obviously there is a world of difference between an AI running as a timeshare on a modern mainframe and an AI running on a small device strapped to someone's upper arm, but the concept is there, and it is sound. Whether ultimately incorporated into an exoskeleton frame, tied back into the peripheral nervous system to override the brain's own signals, or operating a human prosthetic for that matter, the concept of a learning AI as a correction centre for the brain's signalling pathways is an idea that works in practice.
It is important to note that the intent would never be to dictate movement, but only to learn and adapt to each individual user, extrapolating over time, from its own experience and the data being sent to it, what the user is trying to do. In time and with practice, even fine and deft movements would be possible in this manner.
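The adapt-to-the-user idea can also be sketched: rather than filtering with a fixed strength, the system estimates how much jitter this particular user's signals carry, and loosens or tightens its smoothing to match - heavier damping for a tremulous signal, lighter for a steady one, so it never dictates more than it must. The class, the window size, and the adaptation rule below are all assumptions invented for illustration, not a description of any deployed system:

```python
import statistics

# Illustrative sketch only: a smoother that adapts to an individual
# user by estimating their typical signal jitter over a sliding window
# and adjusting its own smoothing strength accordingly. The adaptation
# rule is an assumption made up for illustration.

class AdaptiveSmoother:
    def __init__(self, window=50):
        self.window = window
        self.recent = []       # recent raw samples, for the jitter estimate
        self.estimate = None   # current smoothed output

    def _alpha(self):
        # More observed jitter -> smaller alpha -> heavier smoothing.
        if len(self.recent) < 2:
            return 0.5
        jitter = statistics.pstdev(self.recent)
        return max(0.05, 0.5 / (1.0 + 10.0 * jitter))

    def update(self, raw):
        """Feed one raw sample; return the smoothed command."""
        self.recent.append(raw)
        if len(self.recent) > self.window:
            self.recent.pop(0)
        if self.estimate is None:
            self.estimate = raw
        else:
            a = self._alpha()
            self.estimate = a * raw + (1 - a) * self.estimate
        return self.estimate
```

Because the output is always a weighted average of the user's own signals, the filter can only follow what the user does, never substitute its own movement - which is the distinction the paragraph above draws between correcting and dictating.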
The hardware necessary to run such an AI is, at present, impractical to say the least. It is plausible that sufficiently advanced software could run on a dedicated computing platform the size of a laptop; however, strapping a laptop to your arm, along with the necessary interface hardware, is not a feasible option - not even for a modern exoskeletal system. What this concept most needs to see it to fruition is the steady increase in computing power, and reduction in size, of high-end processing components, until the software can fit in a form easily worn on the human body for extended periods at a time.