
Position estimation and multiple obstacles tracking method based on stereo vision system

Lim, Young-Chul; Lee, Chung-Hee; Kwon, Soon; Lee, Jong-Hun
DGIST Authors
Lim, Young-Chul; Lee, Chung-Hee; Kwon, Soon; Lee, Jong-Hun
Issued Date
Article Type
Conference Paper
In this paper, we present a method to estimate obstacles' positions and track multiple obstacles on the road based on a stereo vision system. A stereo vision system can measure the distance to an obstacle using disparity. However, such a system suffers from several problems, such as sampling error, geometric errors due to the installation of the stereo camera, and image distortion introduced in the calibration and rectification processes, all of which degrade accuracy and reliability. We utilize a multi-layer perceptron (MLP) to correct the mean error, and we propose a strong tracking interacting multiple model (ST-IMM) Kalman filter to minimize the error variance. The ST-IMM is robust to target maneuvers and non-stationary error variance; because it runs several sub-models, each model can compensate for another model's shortcomings. A simple data association method based on nearest-neighbor filtering is proposed to track multiple obstacles. The experimental results show that our algorithms can estimate the target's position and track multiple objects within about 4% distance error in the range of 10 to 50 m, even when the target vehicle maneuvers rapidly. © 2009 IEEE.
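The two geometric building blocks mentioned in the abstract can be sketched briefly. The snippet below shows (1) the standard rectified-stereo range equation Z = f·B / d, whose quantized disparity d is the source of the sampling error the paper corrects, and (2) a simple gated nearest-neighbor data association step. This is a minimal illustration under assumed camera parameters, not the paper's actual MLP correction or ST-IMM filter.

```python
import math


def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Range from a rectified stereo pair: Z = f * B / d.

    focal_px and baseline_m are illustrative assumptions, not the
    paper's calibration. Because disparity is quantized to pixels,
    the recovered depth has a sampling error that grows with range.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px


def nearest_neighbor_associate(tracks, detections, gate_m=2.0):
    """Greedy gated nearest-neighbor association (a plausible reading
    of 'nearest-neighbor filtering'): each track takes the closest
    unassigned detection within gate_m meters."""
    assignments = {}
    used = set()
    for ti, (tx, ty) in enumerate(tracks):
        best, best_dist = None, gate_m
        for di, (dx, dy) in enumerate(detections):
            if di in used:
                continue
            dist = math.hypot(dx - tx, dy - ty)
            if dist < best_dist:
                best, best_dist = di, dist
        if best is not None:
            assignments[ti] = best
            used.add(best)
    return assignments
```

For example, with an 800 px focal length and a 0.4 m baseline, a 32 px disparity corresponds to a 10 m range; at 50 m the disparity drops to 6.4 px, so a half-pixel quantization step already shifts the estimate by several meters, which is why the paper adds learned correction and filtering on top.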
Institute of Electrical and Electronics Engineers Inc.
Files in This Item:

There are no files associated with this item.

Appears in Collections:
Division of Automotive Technology Advanced Radar Tech. Lab 2. Conference Papers
Convergence Research Center for Future Automotive Technology 2. Conference Papers
Intelligent Devices and Systems Research Group 2. Conference Papers



Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.