Flexible, microfluidic tactile sensor skin for artificial fingertips
Student Lead (past): Dr. Ruben Ponce Wong
Funded by the National Science Foundation (Award #1264444)
Co-investigator: Jonathan D. Posner, University of Washington
Description: The exploration and manipulation of unstructured environments via tactile feedback still presents many challenges. We are developing a capacitance-based MEMS tactile sensor skin capable of detecting normal and shear forces as well as local vibrations. By arranging sensing elements in an array and tuning the material properties of the elastomer PDMS at different sites, localized measurements of varying magnitude can be obtained simultaneously. PDMS has the desirable properties of being waterproof, chemically inert, and non-toxic. In addition, the sensor will conform to the curvilinear, deformable fingertip, which is crucial for performance. Computer simulations are being developed, and prototypes are under construction. Mapping of capacitance readings to external stimuli will be performed using nonlinear regression models, such as artificial neural networks.
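The calibration idea in the last sentence can be sketched in a few lines. The forward model below (a parallel-plate taxel whose gap shrinks under normal force) and all numeric values are illustrative assumptions, and a simple polynomial least-squares fit stands in for the artificial neural network; it is a sketch of the mapping concept, not the lab's actual calibration procedure.

```python
import numpy as np

# Hypothetical forward model: capacitance of a parallel-plate taxel rises
# as the elastomer dielectric is compressed by a normal force.
# C(F) = eps_A / (d0 - k*F); all parameter values here are illustrative.
eps_A = 1.0e-12   # permittivity times plate area (F*m)
d0 = 100e-6       # undeformed dielectric gap (m)
k = 2e-6          # compliance (m/N), assumed linear for small loads

def capacitance(force_n):
    """Capacitance (F) produced by a given normal force (N)."""
    return eps_A / (d0 - k * force_n)

# Synthetic calibration data: known applied forces and their readings.
forces = np.linspace(0.0, 10.0, 50)     # N
readings = capacitance(forces)

# Nonlinear regression stand-in for the neural network: fit force as a
# cubic polynomial of the normalized capacitance reading.
x = (readings - readings.mean()) / readings.std()
coeffs = np.polyfit(x, forces, deg=3)

# Invert a new reading back to an estimated external force.
c_new = capacitance(4.0)
x_new = (c_new - readings.mean()) / readings.std()
f_est = float(np.polyval(coeffs, x_new))
print(f_est)  # close to the true 4.0 N
```

In practice the regression would be trained on measured, not simulated, force-capacitance pairs, and a neural network would also let one taxel's shear and normal components be separated from multiple electrode readings.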
Figure: Prototype of a capacitive microfluidic normal force sensor skin composed of a fluidic metal alloy embedded within an elastomeric skin.
Characteristics of a three-fingered grasp during a pouring task requiring dynamic stability
Student Lead: Ryan Manis
Funded by the National Science Foundation (Graduate Research Fellowship to R. Manis)
Description: We are characterizing digit control in a three-fingered grasp for a task that closely relates to an activity of daily living. The pouring task requires coordination of adduction/abduction and flexion/extension degrees of freedom for successful completion. Dynamic stability is required for purposeful translation and rotation of the fluid-filled container against gravity.
Figure: Instrumented containers of various shapes.
Human grip responses to rotational perturbations of grasped objects
Student Lead: Michael De Gregorio
Funded in part by the National Science Foundation (CAREER Award #0954254)
Description: We are characterizing the reflex-like grip responses of the human hand to rotational disturbances of a grasped object. The goal is to better understand grip responses that require simultaneous coordination of adduction/abduction and flexion/extension across fingers. By better understanding patterned responses in humans, we can develop bio-inspired artificial reflexes to enhance the functionality of anthropomorphic robotic and prosthetic hands without adding to the user’s cognitive burden.
Figure: Motion and force data are collected using retro-reflective markers and 6DOF load cells, respectively. Surface EMG records the timing of the first dorsal interosseus response.
Sensory challenges for human-machine systems
Student Team: Kevin Bair, Stephanie Naufel, Justin Tanner, Ben Teplitzky
Co-investigator: Stephen I. Helms Tillery, Arizona State University
Funded by the National Science Foundation (Award #0932389)
Description: Human users of teleoperated manipulators lack a conscious perception of rich tactile feedback, even when using devices as intimately connected to the human body as a neuroprosthesis. In collaboration with the ASU Sensorimotor Research Group, we are working to close the sensory portion of the human-machine loop. Using nonhuman primates and a sensorized robot hand, we are mapping the relationships between biological somatosensory cortex recordings and artificial tactile sensor readings. The ultimate goal is to understand how and when to stimulate the brain to provide the user with a conscious perception of tactile events occurring at an artificial fingertip.
Figure: Brain-machine interface experiments.
Sensory-event driven artificial reflexes
Student Lead: Kevin Bair
Description: We are currently developing a multi-level control system that maintains voluntary control at the higher human level, but implements autonomous “survival behaviors” at the lower machine level to address communication delays between human and machine. These survival behaviors can be implemented with sensory-event driven artificial reflexes that are inspired by human grip responses. The current research testbed uses two of three fingers on a robot hand (Barrett Technology) with flexion/extension and spread capabilities to mimic the thumb and index finger of a human hand. An anthropomorphic research testbed is forthcoming.
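The division of labor described above can be sketched as a low-level reflex rule that acts on a tactile event without waiting for the delayed human command channel. The slip signal, thresholds, and gains below are hypothetical placeholders, not the testbed's actual controller.

```python
# Illustrative sketch of a sensory-event driven artificial reflex: the
# machine-level loop boosts grip force when a slip-like tactile event is
# detected, while the (possibly delayed) human command sets the baseline.
# All numeric values are assumptions for illustration.

SLIP_THRESHOLD = 0.5   # slip-signal magnitude above which the reflex fires
REFLEX_GAIN = 2.0      # extra newtons of grip per unit of slip signal
MAX_GRIP = 20.0        # N, actuator force limit

def reflex_grip(commanded_grip, slip_signal):
    """Return the grip force (N) to apply this control cycle.

    commanded_grip: delayed force command from the human operator (N)
    slip_signal:    magnitude of the slip-indicating tactile event
    """
    if slip_signal > SLIP_THRESHOLD:
        # Reflex layer acts autonomously: grip boost scales with the event
        boost = REFLEX_GAIN * (slip_signal - SLIP_THRESHOLD)
        return min(commanded_grip + boost, MAX_GRIP)
    # No event: pass the human-level command through unchanged
    return commanded_grip

print(reflex_grip(5.0, 0.1))   # 5.0  (voluntary command passes through)
print(reflex_grip(5.0, 2.5))   # 9.0  (reflexive boost before the operator could react)
```

The key design point is that the reflex runs locally on the machine at high rate, so a grasped object is stabilized within one control cycle even when the human-machine round trip takes hundreds of milliseconds.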
Figure: Robot hand (Barrett Technology) using a two-fingered precision grasp.