Self-Localization in Urban Environment from One Image

H.-M. Chim and R. Chung (PRC)

Keywords

Camera pose estimation, junctions, correspondence establishment.

Abstract

We address the problem of how a camera-equipped device, and in turn the person or vehicle carrying it, can have its position and orientation determined automatically from the image data it captures. The work is aimed at the application in which a remote service provider informs a person of his current location and orientation once the person takes an image of his surroundings using a PDA or cellular phone and sends the image over. The problem is particularly meaningful in urban environments, where the GPS signal can be blocked by densely packed buildings. The problem amounts to registering the 2D scene image with the 3D database of the buildings that the service provider maintains for the target environment. We propose a solution mechanism that exploits certain corner features we refer to as junctions, which are generally amply available on buildings in the urban environment. It can be shown that three trihedral junctions, if matched between the 2D scene image and the 3D database, already constitute an adequate set for solving the localization problem. An effective hypothesis-and-confirmation mechanism, implemented in a hashing scheme, is proposed to find such a minimal correspondence set. Experimental results on real image data of both a laboratory scene and outdoor scenes illustrate the performance of the solution mechanism.
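The confirmation step of a hypothesis-and-confirmation scheme of this kind can be illustrated with a minimal sketch: given a candidate camera pose (R, t) hypothesized from a matched junction triple, the 3D model points are reprojected into the image and the hypothesis is accepted only if the reprojection error is small. The function names, the pinhole-projection formulation with known intrinsics K, and the pixel threshold below are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def project(K, R, t, X):
    """Project 3D world points X (N, 3) into the image under pose (R, t).

    Assumes a pinhole camera with known intrinsic matrix K (3x3);
    this is an illustrative sketch, not the paper's implementation.
    """
    Xc = X @ R.T + t            # world frame -> camera frame
    x = Xc @ K.T                # apply intrinsics
    return x[:, :2] / x[:, 2:3] # perspective division to pixels

def confirm_pose(K, R, t, X, u, thresh_px=3.0):
    """Accept a pose hypothesis if the mean reprojection error of the
    matched 3D points X against their observed pixels u (N, 2) is small.
    The 3-pixel threshold is an arbitrary illustrative choice."""
    err = np.linalg.norm(project(K, R, t, X) - u, axis=1)
    return err.mean() < thresh_px

# Hypothetical example: a synthetic camera 5 units from three points.
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0,   0.0,   1.0]])
R = np.eye(3)
t = np.array([0.0, 0.0, 5.0])
X = np.array([[0.5, 0.2, 1.0], [-0.3, 0.4, 0.5], [0.1, -0.6, 0.8]])
u = project(K, R, t, X)                     # noise-free observations
print(confirm_pose(K, R, t, X, u))          # correct pose: accepted
print(confirm_pose(K, R, t + [1, 0, 0], X, u))  # wrong pose: rejected
```

In a full pipeline this test would be run over the pose hypotheses generated from each candidate junction-triple match, keeping only hypotheses consistent with the remaining features.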
