Data-driven iterative image online annotation method

A data-driven, iterative technology, applied in the fields of metadata-based still image retrieval, electronic digital data processing, still image data retrieval, etc., which can solve problems such as the dependence of annotation on prior knowledge of labeled data, the inability to fully meet the application requirements of image annotation, and the lack of optimization of the image feature representation.

Active Publication Date: 2016-10-12
NANJING UNIV
Cites: 8, Cited by: 13

AI Technical Summary

Problems solved by technology

[0005] Among the image annotation methods above, model-driven methods have shortcomings such as relying on hypothetical models, requiring large amounts of manually annotated training data, and lacking flexibility in the annotation scheme, which makes them unsuitable for large-scale annotation of user images; existing data-driven methods likewise depend on prior knowledge of the annotated data and do not optimize the feature representation of the image, so their representational power is poor.
Therefore, none of the above image annotation methods can fully meet the application requirements of image annotation.



Examples


Embodiment

[0082] In this example, Figure 2 shows a sample image, in which the black box marks the image target area to be annotated. Figure 3a shows an example of selecting salient-category members in the user interface. The left panel displays the salient categories pushed to the user in each cycle; an image the user selects is surrounded by a black border to confirm the selection, and when the top of an image is selected it is magnified so that the user can examine its details. The right panel is the category label page, where clicking the gray box creates a new category label. Figure 3b shows an example of category labeling after the salient-category members have been selected: once the user has selected the members of the salient category, they choose the label category on the right and then submit the annotation result as positive or negative to complete the labeling ...
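The positive/negative confirmations collected through this interface presumably feed the constraint-propagation step mentioned in the abstract. The following is a minimal, hypothetical sketch of how one round of such feedback could be recorded as pairwise constraints; the PairwiseConstraints class, its field names, and the example label are illustrative assumptions, not taken from the patent.

```python
# Hedged sketch: recording one round of user feedback from the labeling
# interface as pairwise constraints. The data structure is an assumption
# for illustration, not the patent's own representation.
from dataclasses import dataclass, field
from itertools import combinations
from typing import List, Set, Tuple


@dataclass
class PairwiseConstraints:
    must_link: List[Tuple[int, int]] = field(default_factory=list)
    cannot_link: List[Tuple[int, int]] = field(default_factory=list)

    def add_round(self, pushed: Set[int], confirmed: Set[int]) -> None:
        """Samples the user confirmed as one category become must-link pairs;
        confirmed vs. unselected samples in the same push become cannot-link."""
        rejected = pushed - confirmed
        self.must_link += list(combinations(sorted(confirmed), 2))
        self.cannot_link += [(a, b) for a in sorted(confirmed)
                             for b in sorted(rejected)]


# Example: the user was shown samples {3, 7, 9, 12} and confirmed {3, 9}
# under some label (the sample IDs and label are made up for illustration).
constraints = PairwiseConstraints()
constraints.add_round(pushed={3, 7, 9, 12}, confirmed={3, 9})
print(constraints.must_link)    # [(3, 9)]
print(constraints.cannot_link)  # [(3, 7), (3, 12), (9, 7), (9, 12)]
```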



Abstract

The invention discloses a data-driven iterative image online annotation method. The method comprises the following steps: 1) data preparation and recovery: the user successively feeds samples to be annotated into a sample buffer pool, scale detection is performed on the buffer pool, features are extracted, and the samples to be annotated are fed into a circular annotation module; 2) circular annotation: at the start of each round of the cycle, a similarity matrix of the features is computed by an online similarity measurement model, the similarity matrix is optimized through constraint propagation, the data are clustered on the similarity matrix by a spectral clustering algorithm, and a salient category is selected and pushed to the user for annotation; the cycle repeats until the remaining samples are insufficient to continue, at which point the loop exits and the personalized labels given by the user for the input image regions are obtained. With this method, images can be annotated quickly and accurately, annotation efficiency is improved, and the method adapts to a user's personalized annotation scheme.
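To make the circular annotation flow concrete, the following is a minimal runnable sketch under stated assumptions: an RBF kernel stands in for the online similarity measurement model, scikit-learn's spectral clustering for the clustering step, and a much-simplified similarity adjustment for constraint propagation. All function names, parameters, and the choice of the largest cluster as the "salient category" are illustrative assumptions, not the patent's actual algorithm.

```python
# Minimal sketch of the loop: similarity -> constraint propagation ->
# spectral clustering -> salient-category push -> user feedback -> next round.
import numpy as np
from sklearn.cluster import SpectralClustering
from sklearn.metrics.pairwise import rbf_kernel


def propagate_constraints(sim, must_link, cannot_link, weight=0.5):
    """Simplified stand-in for constraint propagation: raise similarity for
    must-link pairs and lower it for cannot-link pairs (local indices)."""
    s = sim.copy()
    for i, j in must_link:
        s[i, j] = s[j, i] = min(1.0, s[i, j] + weight)
    for i, j in cannot_link:
        s[i, j] = s[j, i] = max(0.0, s[i, j] - weight)
    return s


def annotate(features, ask_user, n_clusters=4, min_batch=5):
    """features: (n_samples, n_dims) array already extracted from the buffer pool.
    ask_user(member_ids) -> (label, confirmed_ids) simulates the labeling UI."""
    labels, must_link, cannot_link = {}, [], []
    remaining = list(range(len(features)))

    while len(remaining) >= min_batch:
        pos = {g: i for i, g in enumerate(remaining)}   # global -> local index
        sim = rbf_kernel(features[remaining])            # online similarity stand-in
        sim = propagate_constraints(
            sim,
            [(pos[a], pos[b]) for a, b in must_link if a in pos and b in pos],
            [(pos[a], pos[b]) for a, b in cannot_link if a in pos and b in pos],
        )
        k = max(2, min(n_clusters, len(remaining) - 1))
        model = SpectralClustering(n_clusters=k, affinity="precomputed",
                                   random_state=0).fit(sim)
        # Treat the largest cluster as the "salient category" (an assumption).
        largest = np.bincount(model.labels_).argmax()
        pushed = [remaining[i] for i in np.flatnonzero(model.labels_ == largest)]

        label, confirmed = ask_user(pushed)
        confirmed = list(confirmed)
        if not confirmed:
            break                                        # avoid looping forever
        for g in confirmed:
            labels[g] = label
        rejected = [g for g in pushed if g not in confirmed]
        must_link += [(a, b) for i, a in enumerate(confirmed) for b in confirmed[i + 1:]]
        cannot_link += [(a, b) for a in confirmed for b in rejected]
        remaining = [g for g in remaining if g not in confirmed]

    return labels, remaining   # leftover samples would go back to the buffer pool
```

In this sketch the loop terminates when fewer than min_batch samples remain, mirroring the abstract's "remaining samples are not enough to continue annotation"; the constraint handling simply nudges similarities rather than performing a full constraint-propagation optimization.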

Description

Technical field

[0001] The invention relates to an image annotation method, belonging to the technical field of image processing, and in particular to a data-driven iterative image online annotation method.

Background technique

[0002] In recent years, with the rapid development of Internet technology, multimedia images, and storage devices, the number of images that users come into contact with has grown explosively. How to quickly and effectively classify and organize large amounts of user image data to meet users' further application needs is an important research direction. Traditional image annotation is aimed at classification and retrieval, and is concerned with information such as scenes, objects, and the relationships between objects, in the image as a whole or in its parts. Therefore, against the background of ever-growing user image data, image annotation should be more flexible, with a variety of annotation schemes to adapt to the different applications of different users, while maintaining...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F17/30, G06K9/62
CPC: G06F16/58, G06F16/5838, G06F18/22
Inventor: 孙正兴李博胡佳高杨崴
Owner: NANJING UNIV