
Matching affine-distorted images

Raghavan Manmatha, University of Massachusetts Amherst


Many visual tasks involve matching image patches derived from imaging a scene from different viewpoints. Matching two image patches can, however, be a difficult task, because changes in the relative orientation of a surface with respect to a camera cause deformations in the image of the surface. These deformations must therefore be taken into account when matching or registering two image patches of an object under changes in viewpoint. Up to first order, these deformations can be described by an affine transform. Here, a computational scheme to match two image patches under an affine transform is presented. The two image patches are filtered with Gaussian and derivative-of-Gaussian filters. The problem of matching the two image patches is then recast as one of finding the amount by which these filters must be deformed so that the filtered outputs from the two images are equal. For robustness, it is necessary to use the filter outputs from many points in a small region to obtain an overconstrained system of equations. The resulting equations are linearized with respect to the affine transform and then solved iteratively for its parameters. The method is local and can match image patches in situations where other algorithms fail. It is also shown that the same framework may be used to match points and lines.
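To make the idea concrete, the following is a minimal sketch, not the dissertation's actual algorithm: one linearized least-squares step of affine registration between two patches, using Gaussian-smoothed intensities and derivative-of-Gaussian gradients (via SciPy), with the brightness change at each pixel modeled as the gradient dotted with an affine displacement field u(x) = A x + d. All function and parameter names are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, affine_transform

def estimate_affine(img1, img2, sigma=2.0, margin=5):
    """One linearized least-squares step estimating the 6 affine
    parameters (a11, a12, a21, a22, dx, dy) of a displacement field
    u(x) = A x + d such that img2(x) ~= img1(x + u(x)).

    Illustrative sketch only: a single step of a Lucas-Kanade-style
    affine fit, not the dissertation's filter-deformation scheme.
    """
    # Gaussian-smoothed patches and derivative-of-Gaussian gradients
    s1 = gaussian_filter(img1, sigma)
    s2 = gaussian_filter(img2, sigma)
    Iy = gaussian_filter(img1, sigma, order=(1, 0))  # d/dy of smoothed img1
    Ix = gaussian_filter(img1, sigma, order=(0, 1))  # d/dx of smoothed img1

    # Pixel coordinates centred on the patch
    h, w = img1.shape
    ys, xs = np.mgrid[0:h, 0:w].astype(float)
    ys -= (h - 1) / 2.0
    xs -= (w - 1) / 2.0

    # Linearization: s2 - s1 ~= Ix*(a11*x + a12*y + dx) + Iy*(a21*x + a22*y + dy)
    # Stack one equation per interior pixel -> overconstrained linear system.
    m = (slice(margin, h - margin), slice(margin, w - margin))
    cols = [Ix * xs, Ix * ys, Iy * xs, Iy * ys, Ix, Iy]
    M = np.stack([c[m].ravel() for c in cols], axis=1)
    b = (s2 - s1)[m].ravel()
    params, *_ = np.linalg.lstsq(M, b, rcond=None)
    return params  # a11, a12, a21, a22, dx, dy

# Usage: recover a small synthetic translation between two smooth patches.
rng = np.random.default_rng(0)
img1 = gaussian_filter(rng.standard_normal((64, 64)), 3.0)
# img2(y, x) = img1(y + 0.5, x + 1.0), i.e. true displacement dy=0.5, dx=1.0
img2 = affine_transform(img1, np.eye(2), offset=(0.5, 1.0), mode="nearest")
p = estimate_affine(img1, img2)
```

In practice (and in the method described above) this linearized step would be applied iteratively, warping one patch by the current affine estimate and re-solving until convergence, since the first-order expansion is only valid for small deformations.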

Subject Area

Computer science|Electrical engineering

Recommended Citation

Manmatha, Raghavan, "Matching affine-distorted images" (1997). Doctoral Dissertations Available from Proquest. AAI9809365.