
Automatic image annotation method integrating depth features and semantic neighborhood

An automatic image annotation and depth feature technology, applied in character and pattern recognition, instruments, computer parts, etc., achieving a simple method, flexible implementation, and strong practicability.

Active Publication Date: 2016-12-21
FUZHOU UNIV

AI Technical Summary

Problems solved by technology

[0005] In view of this, the purpose of the present invention is to provide an automatic image annotation method that combines deep features and semantic neighborhoods, so as to overcome the defects of the prior art and solve the problem of automatically annotating images with multiple objects and multiple labels.




Embodiment Construction

[0029] The present invention will be further described below in conjunction with the accompanying drawings and embodiments.

[0030] The present invention provides an automatic image annotation method that combines depth features and semantic neighborhoods, as shown in Figure 1. In view of the fact that manual feature selection is time-consuming and labor-intensive, and that the traditional label propagation algorithm ignores semantic similarity, making it difficult to apply the annotation model to real image environments, an image annotation method combining deep features and semantic neighborhoods is proposed. The method first uses a multi-layer CNN deep feature extraction network to achieve general and effective deep feature extraction. Then, the training set is divided into semantic groups according to keywords, and the visual neighbors are restricted to these groups, ensuring that the images in the neighborhood image set are both semantically and visually adjacent. Finally, the contribution value of each label of the neighborhood images is calculated according to the visual distance, and the contribution values are sorted to obtain the annotation keywords.
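The patent text gives no pseudocode for the semantic-neighborhood stage; the sketch below, in Python, shows one plausible reading of it: training images are grouped by keyword, visual neighbors are searched only inside each semantic group, and label contributions are accumulated from visual distances and then sorted. The grouping scheme, the inverse-distance weight, and all function names are illustrative assumptions, not the inventors' exact formulation.

```python
# Sketch of the semantic-neighborhood labeling step described above.
# The inverse-distance weight and the per-group neighbor count are
# assumptions for illustration only.
from collections import defaultdict
import numpy as np

def build_semantic_groups(train_labels):
    """Group training-image indices by keyword: one semantic group per label."""
    groups = defaultdict(list)
    for idx, labels in enumerate(train_labels):
        for label in labels:
            groups[label].append(idx)
    return groups

def annotate(query_feat, train_features, train_labels, groups,
             neighbors_per_group=5, num_keywords=5):
    """Score labels by distance-weighted votes from semantically grouped neighbors."""
    contribution = defaultdict(float)
    for label, member_ids in groups.items():
        # Visual neighbors are searched only inside this semantic group,
        # so every neighbor is both semantically and visually adjacent.
        feats = np.asarray([train_features[i] for i in member_ids])
        dists = np.linalg.norm(feats - query_feat, axis=1)
        nearest = np.argsort(dists)[:neighbors_per_group]
        for j in nearest:
            weight = 1.0 / (1.0 + dists[j])  # assumed inverse-distance weighting
            for lab in train_labels[member_ids[j]]:
                contribution[lab] += weight
    # Sort contribution values and keep the top keywords as the annotation.
    ranked = sorted(contribution.items(), key=lambda kv: kv[1], reverse=True)
    return [lab for lab, _ in ranked[:num_keywords]]
```

In this reading, `train_features` holds the deep feature vectors of the labeled training images and `train_labels` their keyword lists; the returned list is the predicted annotation for the query image.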



Abstract

The invention relates to an automatic image annotation method integrating depth features and semantic neighborhoods. Traditional image annotation methods require time-consuming manual feature selection, and traditional label propagation algorithms ignore the semantic neighborhood, so that visually similar but semantically dissimilar images degrade the annotation result. To address these problems, the invention puts forward an automatic image annotation method integrating depth features and semantic neighbors. First, a unified and adaptive depth feature extraction framework based on a deep convolutional neural network (CNN) is built; then, the training set is grouped semantically and a neighborhood image set for the image to be annotated is built; finally, the contribution value of each label of the neighborhood images is calculated according to the visual distance, and the contribution values are sorted to obtain the annotation keywords. The method is simple, flexible, and highly practical.
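The abstract does not name a specific CNN architecture for the depth feature extraction framework. The snippet below is a minimal sketch of generic deep-feature extraction with a pretrained backbone; the choice of ResNet-50 via torchvision is an assumption made here only for illustration, and any multi-layer CNN feature extractor could stand in.

```python
# Minimal sketch of deep-feature extraction with a pretrained CNN backbone
# (ResNet-50 from torchvision, assumed here; the patent does not specify one).
import torch
from torchvision import models, transforms
from PIL import Image

# Load a pretrained backbone and drop the classification head so the
# network outputs a fixed-length feature vector for any input image.
backbone = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
backbone.fc = torch.nn.Identity()
backbone.eval()

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

def extract_feature(path: str) -> torch.Tensor:
    """Return a 2048-d deep feature vector for one image."""
    img = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        return backbone(img).squeeze(0)
```

Features extracted this way would feed the semantic grouping and neighbor-scoring step sketched earlier.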

Description

Technical Field

[0001] The invention relates to an automatic image annotation method that combines depth features and semantic neighborhoods.

Background Technique

[0002] With the rapid development of multimedia image technology, image information on the Internet is growing explosively. These digital images are widely used in business, news media, medicine, education, and so on. Therefore, how to help users quickly and accurately find the desired image has become one of the hot topics in multimedia research in recent years. The key technologies for solving this problem are image retrieval and automatic image annotation.

[0003] Automatic image annotation is a key step in image retrieval and image understanding. It is a technique for adding keywords to an unknown image that describe its semantic content. This technology mainly uses an image training set that has already been labeled with keywords to train an annotation model, and then uses the trained model to predict annotation keywords for unknown images.


Application Information

IPC(8): G06K9/62
CPC: G06F18/214
Inventor: 柯逍, 周铭柯
Owner: FUZHOU UNIV