Scale- and orientation-invariant scene similarity metrics for image queries

Anthony Stefanidis, Peggy Agouris, Charalambos Georgiadis, Michela Bertolotto, James D. Carswell

Research output: Contribution to journal › Article › peer-review

Abstract

In this paper we extend our previous work on shape-based queries to support queries on configurations of image objects. Here we consider spatial reasoning, especially directional and metric object relationships. Existing models for spatial reasoning tend to rely on pre-identified cardinal directions and minimal scale variations, assumptions that cannot be taken as given in our image applications, where orientations and scale may vary substantially and are often unknown. Accordingly, we have developed the method of varying baselines to identify similarities in direction and distance relations. Our method allows us to evaluate directional similarities without a priori knowledge of cardinal directions, and to compare distance relations even when query scene and database content differ in scale by unknown amounts. We use our method to evaluate similarity between a user-defined query scene and object configurations. Here we present this new method, and discuss its role within a broader image retrieval framework.
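To illustrate the general idea of scale- and orientation-invariant configuration matching (this is a minimal sketch under simplifying assumptions, not the paper's varying-baselines method: the function names relative_signature and scene_dissimilarity, the fixed baseline pair, and the known object correspondence are all illustrative), one can express inter-object distances as ratios of a baseline length and directions as angles relative to the baseline's orientation, so that uniform scaling and rotation of a scene leave the description unchanged.

```python
# Minimal sketch (illustrative only): compare two object configurations using
# distance ratios and baseline-relative directions, which are invariant to
# uniform scaling and rotation of the scene.
import math
from itertools import combinations

def relative_signature(points, baseline=(0, 1)):
    """Describe a configuration by distance ratios and directions measured
    against the segment joining the two baseline points (hypothetical helper)."""
    i, j = baseline
    bx, by = points[j][0] - points[i][0], points[j][1] - points[i][1]
    base_len = math.hypot(bx, by)
    base_ang = math.atan2(by, bx)
    sig = []
    for a, b in combinations(range(len(points)), 2):
        dx, dy = points[b][0] - points[a][0], points[b][1] - points[a][1]
        # Distance as a ratio of the baseline length: unaffected by uniform scaling.
        ratio = math.hypot(dx, dy) / base_len
        # Direction relative to the baseline direction: unaffected by rotation.
        ang = (math.atan2(dy, dx) - base_ang) % (2 * math.pi)
        sig.append((ratio, ang))
    return sig

def scene_dissimilarity(query_pts, candidate_pts, w_dist=1.0, w_dir=1.0):
    """Weighted sum of differences between the two signatures.
    Assumes equal object counts and a known object-to-object correspondence."""
    q, c = relative_signature(query_pts), relative_signature(candidate_pts)
    total = 0.0
    for (rq, aq), (rc, ac) in zip(q, c):
        d_ang = min(abs(aq - ac), 2 * math.pi - abs(aq - ac))  # angular wrap-around
        total += w_dist * abs(rq - rc) + w_dir * d_ang
    return total

# Example: the candidate is the query configuration scaled 3x and rotated 40 degrees,
# so the dissimilarity is (numerically) zero.
query = [(0, 0), (2, 0), (1, 2)]
theta, s = math.radians(40), 3.0
cand = [(s * (x * math.cos(theta) - y * math.sin(theta)),
         s * (x * math.sin(theta) + y * math.cos(theta))) for x, y in query]
print(scene_dissimilarity(query, cand))  # ~0.0
```

A full retrieval setting would additionally have to handle unknown correspondences, partial matches, and the choice of baseline, which is where the varying-baselines approach described in the abstract comes in.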

Original language: English
Pages (from-to): 749-772
Number of pages: 24
Journal: International Journal of Geographical Information Science
Volume: 16
Issue number: 8
DOIs
Publication status: Published - Dec 2002
