Even the simplified two-link structure of the limbs during constrained reaching exhibits complex nonlinear dynamics (Hollerbach and Flash, 1982). These nonlinearities within the motor system would pose a major challenge for the sensorimotor control system even if none of the other problems existed; the presence of those other problems makes the task more challenging still. Progress has been made in understanding the computations that the sensorimotor system can perform to alleviate these problems. The remainder of the review will examine these computations
and how they relate to the five problems of sensorimotor control we have highlighted.

Bayesian decision theory is a framework for understanding how the nervous system performs optimal estimation and control in an uncertain world. It is composed of two components: Bayesian statistics and decision theory. Bayesian statistics involves the use of probabilistic reasoning to make inferences based on uncertain data, both combining uncertain sensory estimates with prior beliefs and combining information from multiple sensory modalities, in order to produce optimal estimates. We use the term Bayesian inference in this review to refer to probabilistic reasoning and not simply to the application of Bayes' rule (see below). Based on these inferences, decision theory is used to determine the optimal actions to take given the task objectives.
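To make these two components concrete, the sketch below combines a Gaussian prior over hand position with a noisy visual observation and then selects the action that minimizes an expected squared-error loss. It is a minimal illustration only: the numbers, variable names, and choice of loss function are hypothetical and are not taken from the studies discussed in this review.

```python
import numpy as np

# --- Bayesian statistics: combine a prior belief with a noisy observation ---
# All quantities are hypothetical, one-dimensional, and Gaussian for simplicity.
prior_mean, prior_var = 0.0, 4.0   # prior belief about hand position (cm)
obs, obs_var = 2.0, 1.0            # noisy visual observation and its variance

# For Gaussians, the posterior precision is the sum of the precisions, and the
# posterior mean is the precision-weighted average of prior and observation.
post_var = 1.0 / (1.0 / prior_var + 1.0 / obs_var)
post_mean = post_var * (prior_mean / prior_var + obs / obs_var)

# --- Decision theory: pick the action that minimizes expected loss ---
# Here the "action" is an aim point and the loss is squared error relative to
# the true (uncertain) hand position, approximated by posterior samples.
rng = np.random.default_rng(0)
samples = rng.normal(post_mean, np.sqrt(post_var), size=10_000)
candidate_actions = np.linspace(-5.0, 5.0, 201)
expected_loss = [np.mean((a - samples) ** 2) for a in candidate_actions]
best_action = candidate_actions[int(np.argmin(expected_loss))]

print(post_mean, post_var, best_action)  # best_action is close to post_mean
```

Under a squared-error loss the optimal action coincides with the posterior mean; a different loss function would select a different action, which is the role decision theory plays on top of the Bayesian estimate.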
Different sensory modalities can often sample the same information about the state of our body (e.g., the proprioceptive and visual location of the hand) or the state of the external world (e.g., the auditory and visual location of a bird). When these different modalities are experimentally put in conflict, for example by mismatching the vision and sound of a person speaking, the percept corresponds to something intermediate between the percepts of each modality alone (McGurk and MacDonald, 1976). Recent work has developed and tested the computational framework that underlies such multisensory integration. Even for normal sensory inputs, our sensory apparatus is variable and can have biases; therefore, the estimates from different modalities are unlikely to be the same. Within the Bayesian framework, we can ask which state of the world most probably gave rise to the multiple sensory inputs. Such a Bayesian model predicts that scalar estimates from two different modalities, such as the visual and haptic width of a held object, should be combined in a weighted average to produce an optimal estimate. Critically, the weight given to each modality should depend on its reliability (the inverse of its variance due to noise), with the more reliable modality contributing more to the final estimate.
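This reliability weighting can be written out in a minimal sketch: each cue is weighted by its inverse variance, so the combined estimate lies between the two cues and is less variable than either cue alone. The numbers below are made up for illustration rather than taken from the experimental studies cited in this review.

```python
# Hypothetical single-trial width estimates (cm) and their noise variances.
visual_est, visual_var = 5.2, 0.25   # vision: relatively reliable
haptic_est, haptic_var = 4.4, 1.00   # haptics: relatively noisy

# Reliability (inverse-variance) weights: the more reliable cue counts for more.
w_visual = (1.0 / visual_var) / (1.0 / visual_var + 1.0 / haptic_var)
w_haptic = 1.0 - w_visual

# The combined estimate is a weighted average of the two cues, and its
# variance is lower than that of either cue alone.
combined_est = w_visual * visual_est + w_haptic * haptic_est
combined_var = 1.0 / (1.0 / visual_var + 1.0 / haptic_var)

print(w_visual, combined_est, combined_var)  # approx. 0.8, 5.04, 0.2
```

One testable consequence of this scheme is that degrading one cue (for example, by adding noise to it) increases its variance and shifts the weighting toward the other cue, while the combined estimate remains less variable than either cue alone.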
Such a model of multisensory integration is supported by experimental studies of size estimation from visual and haptic cues (Ernst and Banks, 2002), location from visual and auditory cues (Körding et al.