Motor-Based Autonomous Grounding in a Model of the Fly Optic Flow System (Conference Paper)

abstract

  • © 2016 IEEE. The fly visual system, although tiny compared to the mammalian visual system, can perform highly sophisticated spatial tasks such as collision avoidance, landing on objects, and pursuit of prey. Flies far outperform human-made autonomous flying systems at such spatial tasks. This is partly due to their ability to perceive and respond to optical flow generated by motion in the environment. They are also known to take actions that alter the image flow on their eyes. Higher-level neurons in the fly visual system respond to different types of complex optical flow arising from rotation and translation by pooling information from lower-level local motion detectors, the elementary motion detectors (EMDs). In this sense, neuronal responses (spikes) from these optical flow detectors carry highly encoded signals: a single spike can represent a complex dynamic pattern of movement in the visual field. In this paper, we investigate how such highly encoded signals can be interpreted and utilized within the fly's brain while operating solely on the encoded internal spike patterns, with no direct access to external (unencoded) sensory information, i.e., a form of grounding. Using a computational model of optical flow detectors based on those in the fly, we show that a specific pattern of action (coordinated motor output) is the only way a fly can decode its internal spikes and generate meaningful, relevant behavior from them.
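
  The abstract describes wide-field optic-flow neurons that pool the outputs of elementary motion detectors (EMDs), but the record does not include the paper's model equations. The following is only a minimal sketch of the standard Hassenstein-Reichardt correlator EMD with weighted pooling, written to illustrate the idea; the function names, the filter time constant tau, and the uniform pooling weights are illustrative assumptions, not the authors' implementation.

    import numpy as np

    def low_pass(signal, tau, dt=1.0):
        # First-order low-pass filter standing in for the "delay" arm of the correlator.
        out = np.zeros_like(signal, dtype=float)
        alpha = dt / (tau + dt)
        for t in range(1, len(signal)):
            out[t] = out[t - 1] + alpha * (signal[t] - out[t - 1])
        return out

    def reichardt_emd(a, b, tau=10.0):
        # Hassenstein-Reichardt correlator on two neighboring photoreceptor signals
        # a(t), b(t); positive output indicates motion in the a-to-b direction.
        return low_pass(a, tau) * b - low_pass(b, tau) * a

    def pooled_flow_response(luminance, weights, tau=10.0):
        # Crude stand-in for a wide-field optic-flow neuron: sum local EMD outputs
        # (one per adjacent photoreceptor pair), weighted by a flow-field template.
        n_pix = luminance.shape[1]
        emds = [reichardt_emd(luminance[:, i], luminance[:, i + 1], tau)
                for i in range(n_pix - 1)]
        return np.sum([w * e for w, e in zip(weights, emds)], axis=0)

    # Usage: a sinusoidal grating drifting across 8 photoreceptors in the preferred direction.
    t = np.arange(500)
    luminance = np.array([np.sin(2 * np.pi * 0.02 * t - 0.5 * i) for i in range(8)]).T
    response = pooled_flow_response(luminance, weights=np.ones(7))
    print("mean pooled response (positive = preferred direction):", response.mean())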

name of conference

  • 2016 International Joint Conference on Neural Networks (IJCNN)

published proceedings

  • 2016 International Joint Conference on Neural Networks (IJCNN)

author list (cited authors)

  • Parulkar, A., & Choe, Y.

citation count

  • 0

complete list of authors

  • Parulkar, Amey; Choe, Yoonsuck

publication date

  • July 2016