NRI: INT: Optoacoustic Material and Structure Pretouch Sensing at Robot Fingertip
Abstract
As robots move from factory floors to a wider service market, it is imperative to enable them to grasp objects about which they have no prior knowledge. Contactless detection of an object's material type, shape, and close-to-surface interior structure can provide vital information, such as the friction coefficient and applicable grasping force, for planning successful grasps. Unfortunately, no existing sensor can provide this information. Imaging and ranging devices such as cameras or lidars can neither see through surfaces nor distinguish material types. Tactile sensing requires physical contact between the robot finger and the object surface, which risks damaging the object or shifting its position; either case may lead to a grasping failure. Experts in microelectromechanical systems and robot perception will develop systems and algorithms to create a new type of miniature fingertip-mounted sensor that can detect and map object material type, shape, and close-to-surface interior structure without physical contact. The project will benefit a wide range of robotic applications that require grasping and manipulation, such as manufacturing, service robotics, and search and rescue.

Building on the optoacoustic effect, the formation of acoustic waves following light absorption in a solid material, the investigators propose to send modulated laser pulses to probe material type and structure through acoustic-spectrum, time-of-flight, and intensity analyses of the received ultrasound signals. The proposed sensor will be enabled by new and efficient material recognition and surface/interior structure mapping algorithms, so that recommended grasping points and an applicable force range are available before the robot fingers close. The integrated research and educational effort, named the Optoacoustic Material And Structure Sensor (OMASS) project, focuses on three main tasks: 1) development of OMASS devices, an iterative study of design, fabrication, packaging, calibration, testing, and device control; 2) pretouch perception algorithms that enable the core functions of OMASS devices, namely material type recognition, surface shape and interior structure mapping, and grasping point planning; and 3) building a material database of raw signals and signatures for common household items.

The OMASS project will share its research and educational outcomes with scientists, students, underrepresented groups, and the public worldwide via journal and conference publications, seminars, research experiences for undergraduates and teachers, open-house activities, and the Internet, demonstrating state-of-the-art robotics to the public. The research team will distribute hardware designs, source code (e.g., ROS stacks), application programming interfaces, experimental data, and documentation via the project website so that other groups can learn from the project team's experience.

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
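To make the sensing principle concrete, the sketch below illustrates, under stated assumptions, how time-of-flight and acoustic-spectrum analyses of a received ultrasound trace might be combined: the first strong echo gives a contactless range estimate, and the normalized magnitude spectrum is matched against stored spectral signatures to guess the material. This is not the project's actual pipeline; the function names, thresholds, speed-of-sound constant, and signature database are all hypothetical placeholders.

```python
# Illustrative sketch only: minimal time-of-flight ranging and spectral-signature
# matching for a received ultrasound trace. All names, parameters, and the
# signature database are hypothetical, not the OMASS project's implementation.
import numpy as np

SPEED_OF_SOUND_AIR = 343.0  # m/s, assumes propagation through air at ~20 C


def estimate_distance(trace, fs, threshold_ratio=0.5):
    """Estimate sensor-to-surface distance from the first strong echo.

    trace: 1-D array of received ultrasound samples (laser emission at t = 0)
    fs: sampling rate in Hz
    """
    envelope = np.abs(trace)
    threshold = threshold_ratio * envelope.max()
    first_echo = int(np.argmax(envelope >= threshold))  # first sample above threshold
    tof = first_echo / fs                                # round-trip time of flight (s)
    return SPEED_OF_SOUND_AIR * tof / 2.0                # one-way distance (m)


def classify_material(trace, signature_db):
    """Pick the material whose stored spectral signature best matches the trace.

    signature_db: dict mapping material name -> reference magnitude spectrum,
    assumed to use the same FFT length and sampling rate as `trace`.
    """
    spectrum = np.abs(np.fft.rfft(trace))
    spectrum /= np.linalg.norm(spectrum) + 1e-12         # normalize out overall intensity
    best_name, best_score = None, -np.inf
    for name, ref in signature_db.items():
        ref = ref / (np.linalg.norm(ref) + 1e-12)
        score = float(np.dot(spectrum, ref))              # cosine similarity of spectra
        if score > best_score:
            best_name, best_score = name, score
    return best_name, best_score
```

A real pretouch pipeline would presumably replace the fixed threshold with matched filtering, calibrate the propagation speed, and use learned classifiers over richer spectral, time-of-flight, and intensity features, as the abstract describes; the sketch only conveys the basic signal-analysis idea.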