Multi-scale automatic labeling method for images

An automatic multi-scale labeling technology, applied in the field of machine learning, which can solve the problem that existing labeling cannot achieve global semantics, and achieves the effect of improving the efficiency of image labeling.

Active Publication Date: 2020-12-29
北京邮电大学世纪学院

AI Technical Summary

Problems solved by technology

[0004] In the prior art, semantic segmentation is an effective method for automatically annotating the local semantics of an image: it realizes local semantic annotation by assigning a semantic label to each pixel of the image, thereby establishing a correspondence between pixels and semantics. Semantic segmentation can realize local semantic annotation relatively accurately, but it cannot achieve global semantic annotation.
At present, there is still a lack of research on methods for multi-scale semantic annotation based on semantic segmentation.

Method used


Image

  • Multi-scale automatic labeling method for images

Examples


Embodiment Construction

[0016] As shown in Figure 1, the multi-scale automatic labeling method for images in this embodiment includes the following steps:

[0017] Step 1. Find the K-nearest-neighbor images of the image to be labeled in the training set. The training set includes N images; each image corresponds to several global labels, and each pixel of each image corresponds to a local label.
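As a minimal illustration of the data layout this step assumes, each training image can be represented by a global descriptor, its global labels, and a per-pixel local-label map. The names and types below are illustrative assumptions, not taken from the patent:

```python
from dataclasses import dataclass
from typing import List
import numpy as np

@dataclass
class TrainingImage:
    """One image of the training set, as assumed in Step 1."""
    feature: np.ndarray        # global descriptor (e.g. GIST), shape (d,)
    global_labels: List[str]   # several global labels per image
    local_labels: np.ndarray   # one local label per pixel, shape (H, W)
```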

[0018] In this step, extract the GIST feature vector of the image to be labeled and of every image in the training set, compute the Euclidean distance between the GIST feature vector of the image to be labeled and that of each training image, and select the K images with the smallest Euclidean distances as the K-nearest-neighbor images of the image to be labeled. This embodiment uses GIST feature vectors; HOG feature vectors or bag-of-visual-words feature vectors could be used instead.
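A minimal sketch of this retrieval step, assuming the GIST (or HOG / bag-of-visual-words) descriptors have already been extracted as fixed-length vectors; the function name and signature are illustrative, not from the patent:

```python
import numpy as np

def k_nearest_neighbors(query_feature: np.ndarray,
                        train_features: np.ndarray,
                        k: int) -> np.ndarray:
    """Return the indices of the k training images whose descriptors have
    the smallest Euclidean distance to the query descriptor.

    query_feature:  shape (d,)   descriptor of the image to be labeled
    train_features: shape (N, d) descriptors of the N training images
    """
    distances = np.linalg.norm(train_features - query_feature, axis=1)
    return np.argsort(distances)[:k]
```

Because only Euclidean distance between fixed-length descriptors is used, the same routine works unchanged when HOG or bag-of-visual-words vectors are substituted for GIST.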

[0019] Step 2. The frequency of each global label ap...
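Step 2 is truncated here, but the Abstract below states that the occurrence frequency of each global label among the K-nearest-neighbor images is adopted as the first weight. A hedged sketch of that counting, with illustrative names only:

```python
from collections import Counter
from typing import Dict, List

def first_weights(neighbor_global_labels: List[List[str]]) -> Dict[str, float]:
    """First weight of each global label: its occurrence frequency among
    the K-nearest-neighbor images (as described in the Abstract)."""
    counts = Counter(label
                     for labels in neighbor_global_labels
                     for label in labels)
    k = len(neighbor_global_labels)
    return {label: count / k for label, count in counts.items()}
```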



Abstract

The invention relates to a multi-scale automatic labeling method for images. The method includes the steps of: searching a training set for the K-nearest images of the image to be labeled; adopting the occurrence frequency of the global labels in the K-nearest images as the first weight; selecting the M K-nearest images with the highest matching degree and adding them to a candidate set; updating the first weight according to the occurrence frequency of each global label in the candidate-set images to obtain the second weight; using the candidate-set images to conduct local labeling of the image to be labeled; calculating, over the training-set images, an average relevancy coefficient between each local label of the image to be labeled and all the global labels of the training-set images; conducting a weighted summation of the first weight and the second weight to obtain the third weight; and taking the t global labels with the largest third weight as the global labels of the image to be labeled. With this multi-scale automatic labeling method, multi-scale labeling of the global and local semantics of images can be achieved. During global labeling, the relevancy information between the local labels and the global labels is utilized, which improves the accuracy of the global labels.
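To make the final selection described in the Abstract concrete, below is a hedged sketch of the last stage: the first and second weights are combined by a weighted summation into a third weight, and the t global labels with the largest third weight are returned. How the average relevancy coefficient enters the summation is not spelled out in the text shown here, so the sketch simply mixes the two weights with a coefficient alpha; alpha and all names are illustrative assumptions, not taken from the patent:

```python
from typing import Dict, List

def top_t_global_labels(first_weight: Dict[str, float],
                        second_weight: Dict[str, float],
                        t: int,
                        alpha: float = 0.5) -> List[str]:
    """Weighted summation of the first and second weights (third weight),
    then keep the t labels with the largest third weight."""
    labels = set(first_weight) | set(second_weight)
    third_weight = {
        label: alpha * first_weight.get(label, 0.0)
               + (1.0 - alpha) * second_weight.get(label, 0.0)
        for label in labels
    }
    return sorted(third_weight, key=third_weight.get, reverse=True)[:t]
```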

Description

Technical field

[0001] The invention relates to the technical field of machine learning, in particular to an automatic image labeling technology.

Background technique

[0002] Automatic image annotation is one of the research hotspots in machine learning and computer vision. It automatically assigns semantic information to images in the form of words and vocabulary, and has great application value in many fields. The basic idea is to train a model on already-labeled images and then apply that model to unlabeled images in order to infer their semantics.

[0003] The semantics of an image can be divided into two scales: local semantics and global semantics. Local semantics refers to the semantics presented by a certain part of the image, which can be mapped to pixels at certain positions in the image; global semantics refers to the semantics reflected by ...

Claims


Application Information

Patent Type & Authority Patents(China)
IPC IPC(8): G06K9/62
Inventor 赵海英贾耕云
Owner 北京邮电大学世纪学院