Registration of CAD-models to images by iterative inverse perspective matching

A new algorithm is presented for registering a 3D model to a single 2D image or to a sequence of 2D images without a priori knowledge of correspondences. The algorithm iterates two steps: the first determines correspondences between image features and model data, while the second computes the 3D rigid-body transformation that minimizes the displacement of the matched points. Since camera images do not yield full 3D information, the key idea is to exploit the inverse perspective, i.e., the projection ray determined by the focal point of the camera and an image feature point. The inverse perspective yields constraints for both the correspondence search and the pose estimation in 3D space rather than in the image plane, which not only simplifies the motion parameter estimation but also reduces matching ambiguities due to occlusions. In addition, a robust M-estimation technique reduces the impact of false correspondences. Experimental results show that complex 3D CAD models can be registered efficiently and accurately to images, even when the image features are incomplete, fragmented, and noisy.

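To make the two iterated steps concrete, the sketch below shows one plausible point-to-ray variant of iterative closest point matching in Python/NumPy. It is an illustration under stated assumptions, not the authors' implementation: a pinhole camera with intrinsic matrix K and optical center at the origin, image features back-projected to viewing rays, correspondences taken as the closest point on the nearest ray, a Huber M-estimator for down-weighting outliers, and an Arun/Kabsch-style SVD pose fit. All function names (backproject_rays, fit_pose, register) and the Huber threshold are hypothetical choices made for this example.

```python
import numpy as np

def backproject_rays(features_px, K):
    """Unit direction of the viewing ray through each image feature (camera at origin)."""
    ones = np.ones((features_px.shape[0], 1))
    dirs = np.hstack([features_px, ones]) @ np.linalg.inv(K).T
    return dirs / np.linalg.norm(dirs, axis=1, keepdims=True)

def closest_points_on_rays(points, ray_dirs):
    """For each 3D point, the closest point on its nearest ray and the distance to it."""
    t = np.maximum(points @ ray_dirs.T, 0.0)        # projection length onto each ray, clamped in front of the camera
    foot = t[:, :, None] * ray_dirs[None, :, :]     # candidate closest points on every ray
    dist = np.linalg.norm(points[:, None, :] - foot, axis=2)
    best = np.argmin(dist, axis=1)
    idx = np.arange(points.shape[0])
    return foot[idx, best], dist[idx, best]

def huber_weights(residuals, k=1.0):
    """Huber M-estimator weights to reduce the impact of false correspondences."""
    w = np.ones_like(residuals)
    large = residuals > k
    w[large] = k / residuals[large]
    return w

def fit_pose(src, dst, w):
    """Weighted least-squares rigid transform (R, t) with src mapped onto dst, via SVD."""
    w = w / w.sum()
    mu_s = (w[:, None] * src).sum(0)
    mu_d = (w[:, None] * dst).sum(0)
    H = (src - mu_s).T @ (w[:, None] * (dst - mu_d))
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # guard against reflections
    R = Vt.T @ D @ U.T
    return R, mu_d - R @ mu_s

def register(model_pts, features_px, K, R0=np.eye(3), t0=np.zeros(3), iters=50):
    """Iterate correspondence search on viewing rays and robust pose estimation."""
    R, t = R0, t0
    rays = backproject_rays(features_px, K)
    for _ in range(iters):
        p = model_pts @ R.T + t                     # model points in the current pose estimate
        targets, res = closest_points_on_rays(p, rays)
        dR, dt = fit_pose(p, targets, huber_weights(res))
        R, t = dR @ R, dR @ t + dt                  # compose the incremental update
    return R, t
```

The point of matching against rays rather than against reprojected 2D features is that the residuals live in 3D, so a closed-form rigid-body fit can be reused at every iteration instead of a nonlinear minimization over image-plane errors.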