CHS: Medium: Collaborative Research: Augmenting Human Cognition with Collaborative Robots

abstract

  • Collaborative robotics is a growing application area of robot technology in manufacturing, mining, construction, and energy industrial settings. A recent report by the International Federation of Robotics indicates that global robotics spending will reach $13 billion in 2025. The largest consumers of industrial robotics have been in the Asian market (China, Japan, and the Republic of Korea), with the U.S. lagging behind both Europe (Germany, France, Spain) and Asia. It is in the national economic and strategic interest to ensure that U.S. industry and workers regain leadership in collaborative manufacturing robotics. Toward that goal, this project will develop an understanding of the technical and socio-technical requirements needed to accelerate the use of collaborative robotics. The project will contribute new knowledge and theory to Human-Computer Interaction and Human-Robot Interaction (HRI) by augmenting human cognition for safer and more efficient collaborative robot interaction. The resulting design principles for collaborative robotic technologies will support the growth and progress of both workers and employers. Fundamental knowledge gained here will be directly applicable in other high-risk domains that use collaborative robots, such as offshore oil rigs, the military, and construction. The project seeks to empower new populations of workers (e.g., workers with disabilities), allow older workers to remain in the workforce, and assist novice workers, thereby reducing skills gaps and improving work efficiency. The team will focus on broadening the participation of women in computing. In addition to traditional academic venues (e.g., conferences and journals), research results will be disseminated through workplace workshops and seminars via existing state, regional, and national networks of employers and industry partners.

    To meet these goals, the team of researchers plans to: (1) develop a novel HRI task/scenario classification scheme for collaborative robotics environments that are vulnerable to system failures; (2) establish fundamental neurophysiological, cognitive, and socio-behavioral models (workload, cognitive load, fatigue/stress, affect, and trust) to monitor and model the mind-motor-machine nexus; (3) use these models to determine when and how a human's cognitive, social, behavioral, and environmental states require adjustment via technology to enhance HRI for efficient and safe work performance; and (4) create an innovative and transformative Work 4.0 architecture (AMELIA: AugMEnted Learning InnovAtion) that includes a layer of augmented reality through which humans and robots mutually learn and communicate their current states. The team will characterize worker cognitive states inferred from physiological data and eye tracking, and will characterize robot states from embedded sensor readings, error codes, and surveillance cameras. Through augmented reality, AMELIA will provide these data to both the worker and the robot for effective real-time behavioral adjustment that mitigates failure sources and errors while adding minimal cognitive load. The team also plans a novel communication scheme based on artificial emotional intelligence in which robots and humans collaborate in potentially dangerous situations: the robot will detect the worker's cognitive state using machine learning techniques and then take appropriate action (a simplified, illustrative sketch of this idea follows the abstract).
    Ultimately, AMELIA seeks to empower the worker to focus on complex cognitive problem-solving tasks, performed safely and efficiently, while ensuring that the system adapts to both the worker's attitudes and cognitive states. This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
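
    The abstract does not specify AMELIA's actual features, algorithms, or data, so the sketch below is only a minimal illustration of the general idea of inferring a worker's cognitive state from physiological and eye-tracking signals and mapping it to a robot behavior. The feature set, labels, synthetic data, and the choice of a random forest classifier (via scikit-learn) are assumptions made for illustration, not the project's implementation.

    # Illustrative sketch only: feature names, labels, synthetic data, and the
    # classifier choice are hypothetical placeholders, not AMELIA's actual design.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import classification_report

    rng = np.random.default_rng(0)
    n = 600

    # Synthetic stand-ins for physiological and eye-tracking measurements.
    X = np.column_stack([
        rng.normal(75, 10, n),    # heart rate (bpm)
        rng.normal(3.5, 0.6, n),  # pupil diameter (mm)
        rng.normal(250, 60, n),   # mean fixation duration (ms)
        rng.normal(15, 5, n),     # blink rate (per minute)
    ])

    # Hypothetical labels: 0 = nominal workload, 1 = elevated cognitive load.
    # In a real study these would come from validated workload measures.
    y = (X[:, 0] + 20 * (4.0 - X[:, 1]) + rng.normal(0, 10, n) > 90).astype(int)

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.25, random_state=0, stratify=y)

    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(X_train, y_train)
    print(classification_report(y_test, clf.predict(X_test)))

    def robot_action(features):
        """Map a predicted worker state to a hypothetical robot behavior."""
        state = clf.predict(np.asarray(features, dtype=float).reshape(1, -1))[0]
        return "slow_down_and_confirm" if state == 1 else "proceed_normally"

    # Likely "slow_down_and_confirm" given these synthetic high-load features.
    print(robot_action([95, 2.8, 180, 25]))

    In a deployed system, the predicted state would feed the augmented-reality layer and the robot controller rather than a simple rule as above; the rule here only illustrates how a detected state could gate robot behavior.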

date/time interval

  • 2019 - 2023