%0 Conference Paper
%B Robotics and Automation (ICRA), 2010 IEEE International Conference on
%D 2010
%T Pose estimation in heavy clutter using a multi-flash camera
%A Liu, Ming-Yu
%A Tuzel, O.
%A Veeraraghavan, A.
%A Chellappa, Rama
%A Agrawal, A.
%A Okuda, H.
%K 3D distance transforms
%K angular estimation
%K binary depth edge maps matching
%K cost function
%K depth edges
%K integral images
%K location estimation
%K multiflash camera
%K multiview based pose-refinement algorithm
%K object detection
%K object localization
%K pose estimation
%K robot vision
%K texture edges
%K cameras
%K image matching
%K image texture
%K transforms
%X We propose a novel solution to object detection, localization and pose estimation with applications in robot vision. The proposed method is especially applicable when the objects of interest may not be richly textured and are immersed in heavy clutter. We show that a multi-flash camera (MFC) provides accurate separation of depth edges and texture edges in such scenes. Then, we reformulate the problem as one of finding matches between the depth edges obtained in one or more MFC images and the rendered depth edges that are computed offline using 3D CAD models of the objects. In order to facilitate accurate matching of these binary depth edge maps, we introduce a novel cost function that respects both the position and the local orientation of each edge pixel. This cost function is significantly superior to traditional Chamfer cost and leads to accurate matching even in heavily cluttered scenes where traditional methods are unreliable. We present a sub-linear time algorithm to compute the cost function using techniques from 3D distance transforms and integral images. Finally, we also propose a multi-view based pose-refinement algorithm to improve the estimated pose. We implemented the algorithm on an industrial robot arm and obtained location and angular estimation accuracy of the order of 1 mm and 2° respectively for a variety of parts with minimal texture.
%P 2028 - 2035
%8 2010/05//
%G eng
%R 10.1109/ROBOT.2010.5509897