David W. Jacobs, Daphna Weinshall, et al.
IEEE Transactions on Pattern Analysis and Machine Intelligence
Image-based search has become an increasingly active area of research. Despite the substantial effort spent on creating new ideas and improving existing algorithms and designs, image-based search engines still do not approach the scalability or accuracy of today's text-based search engines. The challenges in image-based search stem from several factors, including the computational intensity of existing algorithms and the difficulty of detecting and recognizing objects. In this paper, we present a system called Proxima, which leverages contextual information provided by a mobile device, such as time, location, and user data, to search a stored database for people who are "similar" to the person in the input image. Unlike other systems that automatically associate metadata with images, Proxima uses the image itself as part of the database query. We also describe a prototype that implements the Proxima algorithm to provide an image-based search service for social networking. © 2009 IEEE.
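The abstract's core idea, combining cheap mobile-context filtering (e.g. location) with an expensive visual-similarity ranking over the image query itself, can be sketched as follows. This is a minimal illustrative sketch, not the paper's actual algorithm: the `Entry` record, the haversine radius filter, and the toy negative-L2 `similarity` stand-in for a real appearance matcher are all assumptions introduced here.

```python
from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt

@dataclass
class Entry:
    # Hypothetical database record: identity plus capture context
    # and a precomputed image feature vector (names are illustrative).
    name: str
    lat: float
    lon: float
    feature: list

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * asin(sqrt(a))

def similarity(f1, f2):
    """Toy stand-in for an appearance matcher: negative L2 distance."""
    return -sum((a - b) ** 2 for a, b in zip(f1, f2)) ** 0.5

def search(db, query_feature, lat, lon, radius_km=5.0, k=3):
    # Context filter first (cheap), visual ranking second (expensive),
    # mirroring the abstract's use of device context to narrow the query.
    nearby = [e for e in db
              if haversine_km(lat, lon, e.lat, e.lon) <= radius_km]
    nearby.sort(key=lambda e: similarity(query_feature, e.feature),
                reverse=True)
    return nearby[:k]
```

The two-stage design matters for the scalability concern the abstract raises: the context filter prunes the candidate set before any per-image similarity computation is run.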
Minerva M. Yeung, Fred Mintzer
ICIP 1997
Graham Mann, Indulis Bernsteins
DIMEA 2007
Fearghal O'Donncha, Albert Akhriev, et al.
Big Data 2021