Visual information retrieval environments
Publisher:
  • University of Pittsburgh
  • Computer Science 322 Alumni Hall Pittsburgh, PA
  • United States
ISBN: 978-0-599-27812-7
Order Number: AAI9928029
Pages: 251
Abstract

This dissertation presents two visual information retrieval models: the distance-angle-based visual retrieval model and the angle-angle-based visual retrieval model. Within these visual retrieval environments, semantic relationships among documents and reference points are made visible; five traditional information retrieval models are interpreted and five new information retrieval models are developed; documents can be browsed; three different metrics and three different similarity measures are provided; reference points are generated and adjusted based on feedback information; and ambiguity is alleviated. A distance-angle integrated similarity measure, which combines the strengths of a distance-based similarity measure and an angle-based similarity measure, is developed and integrated into the distance-angle-based visual retrieval environment. A prototype of the distance-angle-based visual retrieval tool, the DARE application for Windows, is implemented in Visual C++ using the MFC framework.
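The abstract does not give the exact form of the distance-angle integrated similarity measure, so the following is only a minimal sketch of the general idea: score a document against a query by combining a distance-based similarity (here, Euclidean distance mapped into (0, 1]) with an angle-based similarity (here, the cosine of the angle between the vectors), using a hypothetical weight parameter `w`. The function names and the convex-combination form are illustrative assumptions, not the dissertation's actual formula.

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// Distance-based similarity: map Euclidean distance into (0, 1],
// so identical vectors score 1 and far-apart vectors approach 0.
double distance_similarity(const std::vector<double>& a,
                           const std::vector<double>& b) {
    double d2 = 0.0;
    for (std::size_t i = 0; i < a.size(); ++i)
        d2 += (a[i] - b[i]) * (a[i] - b[i]);
    return 1.0 / (1.0 + std::sqrt(d2));
}

// Angle-based similarity: cosine of the angle between the two vectors.
double angle_similarity(const std::vector<double>& a,
                        const std::vector<double>& b) {
    double dot = 0.0, na = 0.0, nb = 0.0;
    for (std::size_t i = 0; i < a.size(); ++i) {
        dot += a[i] * b[i];
        na  += a[i] * a[i];
        nb  += b[i] * b[i];
    }
    return dot / (std::sqrt(na) * std::sqrt(nb));
}

// Hypothetical integrated measure: a convex combination of the two,
// weighted by w in [0, 1]. (Illustrative only; the dissertation's
// actual integration formula is not stated in the abstract.)
double integrated_similarity(const std::vector<double>& a,
                             const std::vector<double>& b,
                             double w) {
    return w * distance_similarity(a, b)
         + (1.0 - w) * angle_similarity(a, b);
}
```

An integrated measure of this kind lets the environment reward documents that are both close to a reference point (distance) and oriented like it in term space (angle), which neither component captures alone.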

The effects on retrieval performance within DARE of the information retrieval models, the similarity measures, and their interactions are investigated, along with comparisons of the newly developed information retrieval models vs. the traditional models, the two-reference-point-based models vs. the one-reference-point-based models, and the distance-based models vs. the angle-based models. The overall assessment of the DARE application for Windows is positive.

Future research directions are proposed, including implementing a new zoom function in the visual space, updating the current DARE version to a web-based version, and comparing the retrieval performance of DARE against a traditional information retrieval system and against other visual retrieval tools.

Contributors
  • The University of British Columbia
  • University of Pittsburgh