
Deep model training method and device and image retrieval method and device

A deep model training method and device technology, applied in neural learning methods, biological neural network models, special data processing applications, etc.; it addresses problems such as heavy consumption of system resources, weak constraints on negative examples, and false detections, achieving the effects of reducing system resource usage, improving accuracy, and lowering the false detection rate.

Active Publication Date: 2017-10-20
BAIDU ONLINE NETWORK TECH (BEIJING) CO LTD

AI Technical Summary

Problems solved by technology

[0003] Although the above-mentioned first type of technical solution has a high accuracy rate, it has the defect of insufficient generalization ability, especially with respect to semantic similarity; the second type of technical solution has good generalization ability, but its accuracy rate still needs to be improved.
[0004] Specifically, within the second type of technical solution, methods that use the contrastive embedding or triplet embedding loss-function design pattern can only train on the constructed pairs or triplets, so the trained model has the defect of not converging easily; methods that use the lifted structured feature embedding loss-function design pattern impose weak constraints on negative examples (that is, dissimilar samples), so that in most cases the final prediction is a negative example, which has the defect of easily causing false detections.
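For reference, the contrastive and triplet loss-function design patterns named above can be sketched as follows (a minimal, illustrative sketch assuming PyTorch tensors and Euclidean distances; the margin values and function names are assumptions, not taken from the patent). Both variants only see the pairs or triplets that were explicitly constructed, which is the convergence limitation noted above; a sketch of the lifted structured feature embedding loss, and of the combination described in the abstract, is given after the Abstract section below.

import torch
import torch.nn.functional as F

def contrastive_loss(f1, f2, label, margin=1.0):
    # label: 1.0 for a similar (positive) pair, 0.0 for a dissimilar (negative) pair
    d = F.pairwise_distance(f1, f2)
    # positives are pulled together, negatives are pushed out past the margin
    return (label * d.pow(2) + (1 - label) * F.relu(margin - d).pow(2)).mean()

def triplet_loss(anchor, positive, negative, margin=1.0):
    # the anchor-positive distance should be smaller than the
    # anchor-negative distance by at least the margin
    d_pos = F.pairwise_distance(anchor, positive)
    d_neg = F.pairwise_distance(anchor, negative)
    return F.relu(d_pos - d_neg + margin).mean()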
[0005] In addition, in practical applications, when the database to be retrieved is very large, the amount of feature data is correspondingly large, and storing and operating on it consumes a great deal of system resources.
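As a rough illustration of the scale involved (the database size and feature dimension below are illustrative assumptions, not figures from the patent):

num_images = 100_000_000      # assumed number of images in the database to be retrieved
feature_dim = 2048            # assumed dimensionality of each extracted feature vector
bytes_per_value = 4           # float32 storage per feature value

total_bytes = num_images * feature_dim * bytes_per_value
print(f"{total_bytes / 1024 ** 4:.2f} TiB of raw feature storage")  # about 0.75 TiB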




Detailed Description of the Embodiments

[0037] The present application will be further described in detail below in conjunction with the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only used to explain the relevant invention and are not intended to limit the invention. It should also be noted that, for ease of description, only the parts related to the invention are shown in the drawings.

[0038] It should be noted that, in the case of no conflict, the embodiments in the present application and the features in the embodiments can be combined with each other. The present application will be described in detail below with reference to the accompanying drawings and embodiments.

[0039] Figure 1 is a flowchart of a deep model training method provided by an embodiment of the present invention.

[0040] As shown in Figure 1, in this embodiment, the deep model training method provided by the present invention includes:

[0041] S13: According to the feature data e...



Abstract

The invention provides a deep model training method and device and an image retrieval method and device. The deep model training method includes the steps of: according to feature data extracted by a deep model, calculating a first loss value of a structural-feature optimization embedding design mode and a second loss value of a contrastive embedding design mode respectively; according to the first loss value and the second loss value, generating a combined loss value; and according to the combined loss value, training the deep model. In the deep model training method, the loss layer of the contrastive embedding design mode is combined on top of the loss layer of the structural-feature optimization embedding design mode, so that during training the penalty weight on negative examples is increased; the model remains easy to converge, while the accuracy of the feature data is improved and the false detection rate is reduced.
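Read literally, the abstract describes a combined loss: a lifted-structured-feature-embedding term (the "structural-feature optimization embedding design mode") plus a contrastive-embedding term, with the combined value used to train the deep model. The following is a minimal sketch of such a combination, assuming PyTorch, Euclidean distances, a mini-batch labelled by class, and an illustrative weighting factor; the exact combination rule and hyper-parameters are not given in the text shown here.

import torch
import torch.nn.functional as F

def combined_loss(features, labels, margin=1.0, weight=1.0):
    # features: (n, d) batch of feature data extracted by the deep model
    # labels:   (n,) class labels; the batch is assumed to contain both
    #           similar and dissimilar pairs
    n = labels.size(0)
    D = torch.cdist(features, features, p=2)           # pairwise Euclidean distances
    same = labels.unsqueeze(0) == labels.unsqueeze(1)
    eye = torch.eye(n, dtype=torch.bool, device=features.device)
    pos_mask = same & ~eye                              # similar pairs
    neg_mask = ~same                                    # dissimilar pairs

    # first loss value: lifted structured feature embedding
    # log-sum-exp of (margin - distance) over each sample's negatives
    neg_lse = torch.logsumexp((margin - D).masked_fill(~neg_mask, float("-inf")), dim=1)
    idx = torch.arange(n, device=features.device)
    upper = idx.unsqueeze(0) > idx.unsqueeze(1)         # count each positive pair once
    i_idx, j_idx = torch.where(pos_mask & upper)
    J = torch.logaddexp(neg_lse[i_idx], neg_lse[j_idx]) + D[i_idx, j_idx]
    first_loss = F.relu(J).pow(2).mean() / 2

    # second loss value: contrastive embedding over the same batch,
    # which adds extra penalty weight on the negative (dissimilar) pairs
    second_loss = D[pos_mask].pow(2).mean() + F.relu(margin - D[neg_mask]).pow(2).mean()

    # combined loss value used to train the deep model
    return first_loss + weight * second_loss

A training step would then look like: features = model(images); loss = combined_loss(features, labels); loss.backward(); optimizer.step(), where model and optimizer are assumed to be defined elsewhere.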

Description

Technical Field

[0001] The present application relates to the technical field of image retrieval, and in particular to a deep model training method and device and an image retrieval method and device.

Background Technique

[0002] At present, existing image retrieval technical solutions for retrieving similar images generally include the following two types of methods: the first type uses traditional computer vision methods to extract image features, then measures the distances between the features and sorts them to give the retrieval results; the second type uses a deep learning model to extract image features, then measures the distances between the features and sorts them to give the retrieval results. [0003] Although the above-mentioned first type of technical solution has a high accuracy rate, it has the defect of insufficient generalization ability, especially with respect to semantic similarity; the second type of technical solution has good generalization abilit...
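Both types of solutions share the same retrieval step: measure the distances between the query image's feature and the database features, then sort them to produce the retrieval results. A minimal sketch of that step, assuming NumPy arrays of fixed-length feature vectors (the array names and top_k parameter are illustrative):

import numpy as np

def retrieve(query_feature, db_features, top_k=10):
    # Euclidean distance between the query feature and every database feature
    dists = np.linalg.norm(db_features - query_feature, axis=1)
    # sort ascending: smaller distances mean more similar images
    order = np.argsort(dists)[:top_k]
    return order, dists[order]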


Application Information

IPC (8): G06F17/30, G06N3/04, G06N3/08
CPC: G06F16/583, G06N3/08, G06N3/048
Inventor: 邓玥琳, 高光明, 丁飞, 胡先军
Owner: BAIDU ONLINE NETWORK TECH (BEIJING) CO LTD