Sliman Bensmaia, PhD.
Professor of Organismal Biology and Anatomy
University of Chicago
Biological and bionic hands: Natural neural coding and artificial perception
Wednesday, May 29, 2019
102 Benedum Hall
University of Pittsburgh
I will discuss three different lines of research:
Texture: The sense of touch affords a remarkable sensitivity to the microstructure of surfaces, allowing us to sense elements ranging in size from tens of nanometers to tens of millimeters. The hand sends signals about texture to the brain using three classes of nerve fibers through two neural codes: coarse features in spatial patterns of activation and fine features in precise temporal spiking patterns. These nerve signals culminate in a complex, high-dimensional representation of texture in somatosensory cortex, whose structure can account for the structure of texture perception. This complexity arises from neurons that act as idiosyncratic detectors of spatial and/or temporal motifs in the afferent input.
Hand proprioception – the sense of hand movements and postures – plays a key role in manual dexterity and is an integral component of stereognosis, the haptic perception of the three-dimensional structure of objects. To investigate the neural basis of hand proprioception in somatosensory cortex, we recorded the time-varying kinematics of the hand and the neural activity evoked in somatosensory cortex as monkeys performed a grasping task. We found that the activity of individual proprioceptive neurons in somatosensory cortex tracks the postures of multiple joints distributed over the entire hand. This configural representation of the hand is well suited to support stereognosis.
Bionic touch: Hands play a critical role in our ability to manipulate objects and to sense their physical properties. Losing a hand through amputation has adverse consequences for quality of life. One way to restore a measure of independence to these individuals is to equip them with prosthetic arms. Achieving a dexterous prosthesis, however, requires not only the acquisition of control signals to drive the movements of the robotic hand but also the transmission of sensory signals to convey information to the user about the consequences of these movements. One way to restore the sense of touch is to electrically stimulate tactile fibers through electrodes chronically implanted in the nerve. To the extent that we can reproduce natural patterns of activation, the resulting tactile sensations will be natural and intuitive.
Sliman Bensmaia is a Professor in the Department of Organismal Biology and Anatomy and in the Committee on Computational Neuroscience. The main objective of his research is to discover how sensory information is encoded in the activity of neurons along the somatosensory neuraxis, spanning the senses of touch and proprioception, in primates. To this end, his team records neuronal responses, measures the elicited percepts, and develops mathematical models to link neuronal representations to behavior. Bensmaia’s team is also working toward restoring the sense of touch in bionic hands, for amputees through electrical interfaces with the nerves, and for people with tetraplegia through electrical interfaces with the central nervous system. A widely published author, Bensmaia has given dozens of invited talks and symposium presentations and holds four patents. He is a member of the Society for Neuroscience, the American Physiological Society, and the Institute of Electrical and Electronics Engineers.
Hartmann SeNSE Lab
Interdepartmental Neuroscience Program
Peripheral representations of 3D tactile stimuli in the whisker system
Thursday, May 30, 2019
102 Benedum Hall
University of Pittsburgh
To reveal the full representational capabilities of sensory neurons, it is essential to observe their responses to complex, naturalistic stimuli. The rodent whisker system is one of the premier models for studying tactile processing and cortical function, and rodents are capable of a wide variety of behaviors that require complex sensory input from the array of facial whiskers. However, most descriptions of sensory neural representations in this system are based on experiments with reduced stimulus sets: the stimuli tend to be discrete, have few features, span a small dynamic range, and are limited to motion in a 2D plane. In particular, our understanding of how tactile information is first represented – in the primary sensory neurons of the trigeminal ganglion (Vg) – and thus shaped for more central processing comes from studies that use these types of reduced stimuli.
To address this shortcoming, we developed a stereo-vision 3D whisker imaging technique and a model of 3D whisker mechanics that together allow us to quantify the full mechanical consequences of complex whisker-object contacts. We show that individual Vg neurons represent many features of a stimulus simultaneously, and that the distinction between rapidly and slowly adapting neurons breaks down during complex stimulation. We then fit generalized linear models to show that neurons’ tactile feature representations vary continuously and tile the available mechanical information across the population, in contrast with proposed codes in which Vg neurons segregate into discrete functional classes.