Autonomous learning of the semantics of internal sensory states based on motor exploration (Academic Article)

abstract

  • What is available to developmental programs in autonomous mental development, and what should be learned at the very early stages of mental development? Our observation is that sensory and motor primitives are the most basic components present at the beginning, and that what developmental agents need to learn from these resources is what their internal sensory states stand for. In this paper, we investigate this question in the context of a simple, biologically motivated visuomotor agent. We observe and acknowledge, as many other researchers do, that action plays a key role in providing content to the sensory state. We propose a simple yet powerful learning criterion, that of invariance, where invariance simply means that the internal state does not change over time. We show that after reinforcement learning based on the invariance criterion, the properties of the action sequence generated in response to an internal sensory state accurately reflect the properties of the stimulus that triggered that internal state. In this way, the meaning of the internal sensory state can be firmly grounded in the properties of that particular action sequence. We expect the framing of the problem and the proposed solution presented in this paper to help shed new light on autonomous understanding in developmental agents such as humanoid robots.
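  The core idea in the abstract, rewarding action sequences that keep the internal sensory state invariant over time, can be illustrated with a minimal sketch. The toy setup below (an oriented-line stimulus, four orientation filters, eight gaze moves, and tabular Q-learning) is an illustrative assumption only, not the authors' implementation; names such as sensory_state, the 0.5 window threshold, and the action set are hypothetical.

```python
import numpy as np

# Minimal sketch (not the paper's code) of an invariance-based reward:
# the agent's internal state is the index of the best-matching orientation
# filter for the stimulus under its gaze, and a gaze move earns reward +1
# only if that internal state is unchanged after the move.

ORIENTATIONS = np.array([0.0, 45.0, 90.0, 135.0])          # filter angles (deg)
ACTIONS = [(-1, -1), (-1, 0), (-1, 1), (0, -1),             # gaze moves (dx, dy)
           (0, 1), (1, -1), (1, 0), (1, 1)]

def sensory_state(gaze, line_angle_deg):
    """Index of the orientation filter matching the local stimulus,
    or None if the gaze window no longer covers the line."""
    x, y = gaze
    theta = np.deg2rad(line_angle_deg)                      # line through origin
    dist = abs(x * np.sin(theta) - y * np.cos(theta))       # distance to the line
    if dist > 0.5:                                          # off the stimulus
        return None
    return int(np.argmin(np.abs(ORIENTATIONS - line_angle_deg)))

def train(line_angle_deg, episodes=500, alpha=0.1, gamma=0.9, eps=0.1):
    """Tabular Q-learning where the reward is sensory invariance:
    +1 if the internal state is the same after the gaze move, else 0."""
    rng = np.random.default_rng(0)
    Q = np.zeros((len(ORIENTATIONS), len(ACTIONS)))
    for _ in range(episodes):
        gaze = (0, 0)
        s = sensory_state(gaze, line_angle_deg)
        for _ in range(20):
            a = rng.integers(len(ACTIONS)) if rng.random() < eps else int(np.argmax(Q[s]))
            dx, dy = ACTIONS[a]
            gaze = (gaze[0] + dx, gaze[1] + dy)
            s_next = sensory_state(gaze, line_angle_deg)
            r = 1.0 if s_next == s else 0.0                 # invariance criterion
            if s_next is None:                              # fell off the stimulus
                Q[s, a] += alpha * (r - Q[s, a])
                break
            Q[s, a] += alpha * (r + gamma * np.max(Q[s_next]) - Q[s, a])
            s = s_next
    return Q

if __name__ == "__main__":
    # Train on a 45-degree line; the internal state for that stimulus is
    # filter index 1, and its highest-valued actions should be the two
    # diagonal gaze moves that run along the line.
    Q = train(45.0)
    print("actions ranked for the 45-degree state:", np.argsort(Q[1])[::-1])
```

  In this sketch, the only actions that preserve the internal state for a 45-degree line are the two diagonal gaze moves along it, so the learned action sequence ends up tracing the orientation of the stimulus, which is the sense in which the action sequence can ground the meaning of the internal sensory state.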

published proceedings

  • INTERNATIONAL JOURNAL OF HUMANOID ROBOTICS

author list (cited authors)

  • Choe, Y., Yang, H., & Eng, D.

citation count

  • 19

complete list of authors

  • Choe, Yoonsuck||Yang, Huei-Fang||Eng, Daniel Chern-Yeow

publication date

  • June 2007