Judging Distance by Motion-Based Visually Mediated Odometry

Igor Katsman, Alfred M. Bruckstein, Robert J. Holt, and Ehud Rivlin.
Judging distance by motion-based visually mediated odometry.
In IEEE/RSJ International Conference on Intelligent Robots and Systems, 2:1357--1362, 2003.

Online Version

A PDF version is available for download.

Abstract

Inspired by the abilities of both the praying mantis and the pigeon to judge distance by motion-based visually mediated odometry, we create miniature models for depth estimation that mimic the head movements of these animals. We develop mathematical models of praying mantis and pigeon visual behavior and describe our implementation and experimental environment. We investigate structure-from-motion problems in which images are taken from a camera whose focal point is translating. In the first case the motion is reminiscent of a praying mantis peering its head left and right, apparently to obtain depth perception, hence the moniker 'mantis head camera.' In the second case the motion is reminiscent of a pigeon bobbing its head back and forth, also apparently to obtain depth perception, hence the moniker 'pigeon head camera.' We present the performance of the mantis head camera and pigeon head camera models and provide experimental results for the algorithms. We also compare the definitiveness of the results obtained by the two models. The precision of our mathematical models and their implementation is consistent with experimental findings from various biological studies.
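For intuition, the two head movements correspond to the two classical translation-only depth cues: sideways peering gives depth by triangulation over a known lateral baseline, and forward bobbing gives depth from the radial expansion of image features. The Python sketch below illustrates both with standard pinhole-camera geometry; the focal length, baselines, and image coordinates are made-up numbers, and these textbook formulas are an assumption for illustration, not the paper's actual estimation models.

    # Minimal sketch (assumed pinhole model, not the paper's exact algorithms):
    # depth from two translation-only views of a camera with focal length f_px (pixels).

    def depth_from_lateral_peer(f_px, baseline, x_left, x_right):
        """'Mantis-style' sideways peering: the camera translates by `baseline`
        (meters) parallel to the image plane. A feature seen at horizontal
        image coordinates x_left and x_right (pixels) gives depth by
        triangulation: Z = f_px * baseline / disparity."""
        disparity = x_left - x_right
        return f_px * baseline / disparity

    def depth_from_forward_bob(forward_step, r_before, r_after):
        """'Pigeon-style' forward bobbing: the camera advances by `forward_step`
        (meters) along the optical axis. A feature's radial image distance
        grows from r_before to r_after (pixels); similar triangles give the
        depth before the step: Z = forward_step * r_after / (r_after - r_before)."""
        return forward_step * r_after / (r_after - r_before)

    if __name__ == "__main__":
        # Hypothetical numbers: 800 px focal length, 20 mm peer, 10 mm bob.
        print(depth_from_lateral_peer(f_px=800.0, baseline=0.02, x_left=415.0, x_right=385.0))  # ~0.53 m
        print(depth_from_forward_bob(forward_step=0.01, r_before=100.0, r_after=102.0))         # ~0.51 m

Both estimates degrade as the feature displacement approaches the pixel-measurement noise, which is why the size of the peer or bob relative to the target distance matters; the paper's comparison of the two camera models addresses exactly this kind of trade-off.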

Co-authors

Bibtex Entry

@inproceedings{KatsmanBHR03i,
  title = {Judging distance by motion-based visually mediated odometry},
  author = {Igor Katsman and Alfred M. Bruckstein and Robert J. Holt and Ehud Rivlin},
  year = {2003},
  booktitle = {IEEE/RSJ International Conference on Intelligent Robots and Systems},
  volume = {2},
  pages = {1357--1362},
  abstract = {Inspired by the abilities of both the praying mantis and the pigeon to judge distance by motion-based visually mediated odometry, we create miniature models for depth estimation that mimic the head movements of these animals. We develop mathematical models of praying mantis and pigeon visual behavior and describe our implementation and experimental environment. We investigate structure-from-motion problems in which images are taken from a camera whose focal point is translating. In the first case the motion is reminiscent of a praying mantis peering its head left and right, apparently to obtain depth perception, hence the moniker 'mantis head camera.' In the second case the motion is reminiscent of a pigeon bobbing its head back and forth, also apparently to obtain depth perception, hence the moniker 'pigeon head camera.' We present the performance of the mantis head camera and pigeon head camera models and provide experimental results for the algorithms. We also compare the definitiveness of the results obtained by the two models. The precision of our mathematical models and their implementation is consistent with experimental findings from various biological studies.}
}