
Model construction method based on transfer learning, image recognition method and device

A technique relating to learning models and model construction methods, applied in the field of image processing, intended to solve problems such as the limited application scope of learning models, a poor object recognition experience, and object recognition that cannot distinguish between features.

Active Publication Date: 2019-12-20
SHENZHEN ZTE NETVIEW TECH +1

AI Technical Summary

Problems solved by technology

[0004] Currently, an effective way to address this problem is transfer learning. In actual neural network training, however, large differences between data sets often cause strong fluctuations and make the learning model difficult to converge. As a result, the same learning model can only be used to recognize objects whose features differ slightly, not objects whose features differ greatly, which limits the application scope of the learning model and ultimately leads to a poor object recognition experience.



Examples


Embodiment 1

[0058] Please refer to Figure 1. The present application provides a method for building a model based on transfer learning, comprising steps S100 to S300, which are described below.

[0059] Step S100: train a deep neural network on the first training set to obtain a first learning model.

[0060] It should be noted that the deep neural network here may be a long short-term memory network (LSTM), a recurrent neural network (RNN), a generative adversarial network (GAN), a deep convolutional neural network (DCNN) or a deep convolutional inverse graphics network (DCIGN); it is not limited here.

[0061] It should be noted that the first training set here includes multiple images of objects with the first type of features; for example, the first training set includes multiple face images of Asians, where the first type of feature...
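A minimal sketch of step S100 in PyTorch is given below. The dataset path, backbone, hyperparameters and epoch count are illustrative assumptions rather than values taken from the application; any of the deep networks listed in paragraph [0060] could be substituted for the convolutional backbone used here.

```python
# Hypothetical sketch of step S100: train a deep network on the first training set.
# Paths, backbone and hyperparameters are assumptions, not values from the application.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms, models

transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

# First training set: images of objects with the first type of features
# (e.g. Asian face images in the example of paragraph [0061]).
first_train_set = datasets.ImageFolder("data/first_training_set", transform=transform)
loader = DataLoader(first_train_set, batch_size=32, shuffle=True)

# A convolutional backbone is chosen purely for illustration.
first_model = models.resnet18(num_classes=len(first_train_set.classes))

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(first_model.parameters(), lr=1e-3, momentum=0.9)

for epoch in range(10):                      # assumed epoch count
    for images, labels in loader:
        optimizer.zero_grad()
        loss = criterion(first_model(images), labels)
        loss.backward()
        optimizer.step()

torch.save(first_model.state_dict(), "first_model.pt")
```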

Embodiment 2

[0113] Please refer to Figure 5. On the basis of the model construction method disclosed in Embodiment 1, the present application also discloses an image recognition method, which includes steps S410 to S430, described separately below.

[0114] Step S410: acquire an image of an object to be detected, where the object to be detected is an object having the first type of feature and/or the second type of feature.

[0115] For example, if the object to be detected is an Asian person, the first type of features may include feature information such as yellow skin, black eyes, black hair and a diamond-shaped facial contour. If the object to be detected is a European person, the second type of features may include feature information such as white skin, blue eyes, blond hair and a square facial contour. If the object to be detected is of mixed Eurasian descent, then his or her face may include several items of feature information in t...
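Below is a hedged sketch of the recognition flow of steps S410 to S430 as it can be inferred from this embodiment and the abstract: acquire an image of the object to be detected, extract its feature information with the third learning model, and recognize the object by comparing the extracted features against stored features. The file handling, cosine-similarity matching and threshold value are assumptions introduced here for illustration only.

```python
# Hedged sketch of steps S410-S430; matching strategy and threshold are assumptions.
import torch
import torch.nn.functional as F
from PIL import Image
from torchvision import transforms

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

def extract_features(third_model, image_path):
    """S410 + S420: load the image and extract its feature information."""
    image = preprocess(Image.open(image_path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        return F.normalize(third_model(image), dim=1)

def recognise(third_model, probe_path, gallery, threshold=0.6):
    """S430: match the probe features against a dictionary of known identities."""
    probe = extract_features(third_model, probe_path)
    best_id, best_score = None, -1.0
    for identity, feat in gallery.items():
        score = F.cosine_similarity(probe, feat).item()
        if score > best_score:
            best_id, best_score = identity, score
    return best_id if best_score >= threshold else None
```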

Embodiment 3

[0123] Please refer to Figure 6. On the basis of the image recognition method disclosed in Embodiment 2, the present application correspondingly discloses an image recognition device 1, which mainly includes an image acquisition unit 11, a feature extraction unit 12 and an object recognition unit 13, each described below.

[0124] The image acquisition unit 11 is configured to acquire an image of an object to be detected, where the object to be detected is an object having the first type of feature and/or the second type of feature. Specifically, the image acquisition unit 11 may acquire images of the object to be detected by means of imaging devices such as cameras and video cameras, or even from media videos. For the specific functions of the image acquisition unit 11, reference may be made to step S410 in Embodiment 2, which will not be repeated here.
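As a small illustration of how such an acquisition unit might obtain a frame from a camera or a media video, the sketch below uses OpenCV; the source argument and error handling are assumptions, not part of the application.

```python
# Illustrative acquisition helper; OpenCV source selection is an assumption.
import cv2

def acquire_image(source=0):
    """Open a camera (integer index) or a video file (path) and grab one frame."""
    capture = cv2.VideoCapture(source)
    ok, frame = capture.read()
    capture.release()
    if not ok:
        raise RuntimeError("Failed to acquire an image from the given source")
    return frame
```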

[0125] The feature extraction unit 12 is used to extract feature information in the image of th...
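To make the division of responsibilities concrete, the sketch below models the device as a class wiring together the three units described in this embodiment. All class and method names are hypothetical; only the structure mirrors the application.

```python
# Hypothetical structural sketch of image recognition device 1 and its units.
class ImageRecognitionDevice:
    def __init__(self, acquisition_unit, extraction_unit, recognition_unit):
        self.acquisition_unit = acquisition_unit    # unit 11: obtains the image
        self.extraction_unit = extraction_unit      # unit 12: extracts feature information
        self.recognition_unit = recognition_unit    # unit 13: recognizes the object

    def run(self, source):
        image = self.acquisition_unit.acquire(source)       # step S410
        features = self.extraction_unit.extract(image)      # step S420
        return self.recognition_unit.recognise(features)    # step S430
```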



Abstract

The invention discloses a model construction method based on transfer learning, and an image recognition method and device. The method comprises: training a deep neural network on a first training set to obtain a first learning model; fine-tuning the parameters of each network layer of the first learning model one by one using a second training set, until the result of the loss function of the first learning model no longer decreases or all network layers have been traversed, thereby constructing a second learning model; and combining the first training set and the second training set and training the second learning model on the combined set to obtain a third learning model, where the third learning model is used to extract feature information from images of objects having the first type of features and/or the second type of features. Because both the first training set and the second training set are used during model construction, the constructed second learning model can possess the feature learning capability of both training sets and achieve a better convergence effect.
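The following is a hedged sketch of the construction flow described in the abstract, assuming a PyTorch model whose layers can be iterated in order. Unfreezing one layer at a time, the per-layer epoch count, and the concrete test for "the loss no longer decreases" are illustrative interpretations rather than the exact procedure disclosed.

```python
# Illustrative interpretation of the layer-by-layer fine-tuning; not the exact claimed procedure.
import copy
import torch

def fine_tune_layerwise(first_model, second_loader, criterion, epochs_per_layer=1):
    """Construct the second learning model: adjust the parameters of each network
    layer one by one on the second training set, stopping when the loss no longer
    decreases or every layer has been traversed."""
    second_model = copy.deepcopy(first_model)
    prev_loss = float("inf")

    for layer in second_model.children():          # traverse the layers in order
        params = list(layer.parameters())
        if not params:
            continue                               # skip parameter-free layers (e.g. ReLU)

        for p in second_model.parameters():
            p.requires_grad = False
        for p in params:
            p.requires_grad = True                 # fine-tune only the current layer

        optimizer = torch.optim.SGD(params, lr=1e-4, momentum=0.9)
        running_loss = 0.0
        for _ in range(epochs_per_layer):
            for images, labels in second_loader:
                optimizer.zero_grad()
                loss = criterion(second_model(images), labels)
                loss.backward()
                optimizer.step()
                running_loss += loss.item()

        if running_loss >= prev_loss:              # loss no longer decreases: stop early
            break
        prev_loss = running_loss

    return second_model

def build_third_model(second_model, combined_loader, criterion, epochs=5):
    """Train the second model on the merged first and second training sets to
    obtain the third learning model used for feature extraction."""
    for p in second_model.parameters():
        p.requires_grad = True
    optimizer = torch.optim.SGD(second_model.parameters(), lr=1e-4, momentum=0.9)
    for _ in range(epochs):
        for images, labels in combined_loader:
            optimizer.zero_grad()
            loss = criterion(second_model(images), labels)
            loss.backward()
            optimizer.step()
    return second_model
```

A combined loader for the two training sets can be built with `torch.utils.data.ConcatDataset` wrapped in a `DataLoader`.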

Description

Technical field

[0001] The invention relates to the technical field of image processing, and in particular to a model construction method based on transfer learning, an image recognition method and a device.

Background technique

[0002] In recent years, deep learning has attracted increasing attention and has been successfully applied in many fields. Deep learning algorithms can learn high-level features from massive data, which gives deep learning an advantage over traditional machine learning.

[0003] However, data dependence is one of the most serious problems in deep learning. Compared with traditional machine learning methods, deep learning is extremely dependent on large-scale training data, because it requires a large amount of data to understand the underlying data patterns. Since producing a large number of training data sets is generally subject to certain limitations, and a network trained for a specific training set often performs differently after changing the data ...


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06K9/62; G06N3/04; G06N3/08
CPC: G06N3/08; G06N3/044; G06N3/045; G06F18/214; Y02T10/40
Inventors: 尉桦, 邵新庆, 刘强
Owner: SHENZHEN ZTE NETVIEW TECH