Objective positioning and classifying algorithm based on deep learning

A target positioning and classification algorithm in the field of deep learning. It addresses the problems that traditional methods increase the difficulty of designing a real system, cannot achieve practical value, and cannot share features between different classifiers; the algorithm yields good economic benefits, improves detection efficiency, and improves information processing.

Pending Publication Date: 2016-09-21
ENBOTAI TIANJIN TECH CO LTD
Cites: 2 · Cited by: 26

AI Technical Summary

Problems solved by technology

[0003] Traditional methods can only perform single-target detection. Multi-target detection requires different features and different classifiers, which increases the difficulty of designing the entire system; moreover, features cannot be shared among the different classifiers, so computation is duplicated and detection efficiency cannot be improved. In addition, the generalization ability of traditional algorithms in complex scenes is weak, so they cannot achieve practical value.
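The feature-sharing argument can be illustrated with a minimal sketch: one shared backbone pass feeds several class-specific heads, so adding another detector does not repeat feature extraction. All names, weights, and class labels below are invented for the example and are not from the patent:

```python
import numpy as np

rng = np.random.default_rng(0)

def extract_features(image):
    """Stand-in for a convolutional backbone: one costly pass per image."""
    return np.tanh(image @ W_backbone)

W_backbone = rng.standard_normal((64, 16))   # toy "conv" weights
heads = {name: rng.standard_normal((16, 2))  # one small head per target class
         for name in ("pedestrian", "vehicle", "sign")}

image = rng.standard_normal(64)

# Shared scheme: features are computed once and reused by every head.
feats = extract_features(image)
scores = {name: feats @ W for name, W in heads.items()}

# The traditional scheme would call extract_features once per detector
# (3x the cost here) while producing exactly the same per-class scores.
for name, W in heads.items():
    assert np.allclose(scores[name], extract_features(image) @ W)
```

The point of the sketch is the cost model, not the architecture: the backbone pass dominates, so sharing it across heads makes multi-target detection roughly as expensive as single-target detection.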


Image

  • Objective positioning and classifying algorithm based on deep learning

Examples


Embodiment

[0023] Referring to Figures 1-4, this embodiment proposes a target positioning and classification algorithm based on deep learning, comprising the following steps:

[0024] S1: Input a picture into the first network, which outputs a series of target positioning boxes and scores;

[0025] S2: Input the picture and a series of sub-windows into the second network;

[0026] S3: Propagate the second network forward to its last convolutional layer to generate a feature map;

[0027] S4: Apply a coordinate transformation to each sub-window using the zoom factor, so that its coordinates are mapped onto the feature map;

[0028] S5: Use the zoomed sub-windows to extract features from the feature map, and down-sample (pool) them to a fixed size;

[0029] S6: Classify the pooled data to obtain the classification result and score of each region;

[0030] S7: Input the target positioning boxes and the region classification results into the classifier, and output the category and coordinates of the target.
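Steps S4 and S5 can be sketched numerically: a sub-window given in image coordinates is scaled by the zoom factor onto the feature map, then the cropped region is max-pooled to a fixed size regardless of the box's dimensions. This is a minimal NumPy sketch; the function names, zoom value, and toy feature map are illustrative assumptions, not from the patent:

```python
import numpy as np

def map_window(window, zoom):
    """S4: scale an image-space box (x0, y0, x1, y1) onto the feature map."""
    x0, y0, x1, y1 = window
    return (int(x0 * zoom), int(y0 * zoom),
            max(int(x0 * zoom) + 1, int(x1 * zoom)),
            max(int(y0 * zoom) + 1, int(y1 * zoom)))

def roi_pool(feature_map, window, out_size=2):
    """S5: crop the mapped window and max-pool it to out_size x out_size."""
    x0, y0, x1, y1 = window
    region = feature_map[y0:y1, x0:x1]
    h, w = region.shape
    ys = np.linspace(0, h, out_size + 1).astype(int)
    xs = np.linspace(0, w, out_size + 1).astype(int)
    return np.array([[region[ys[i]:ys[i + 1], xs[j]:xs[j + 1]].max()
                      for j in range(out_size)]
                     for i in range(out_size)])

# A 16x16 feature map from a 64x64 input gives zoom = 16 / 64 = 0.25.
fmap = np.arange(256, dtype=float).reshape(16, 16)
box = (8, 8, 40, 56)                 # sub-window in image coordinates
mapped = map_window(box, zoom=0.25)  # -> (2, 2, 10, 14) on the feature map
pooled = roi_pool(fmap, mapped)      # fixed 2x2 output for any box size
assert pooled.shape == (2, 2)
```

Because every sub-window is pooled to the same fixed size, one classifier (S6) can score all regions, which is what lets the feature map be computed once and shared.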



Abstract

The invention discloses a target positioning and classification algorithm based on deep learning. The algorithm comprises the following steps: S1, inputting a picture into a first network, which outputs a series of target positioning boxes and scores; S2, inputting the picture and a series of sub-windows into a second network; S3, propagating the network forward to its last convolutional layer to generate a feature map; S4, performing a coordinate transformation on the sub-windows using a zoom coefficient so that their coordinates are mapped onto the feature map; S5, extracting features from the feature map using the zoomed sub-windows and pooling them to a fixed size; S6, classifying the pooled data to obtain a classification result and score for each region; and S7, inputting the target positioning boxes and the region classification results into a classifier and outputting the category and coordinates of the target. The algorithm extracts features, classifies targets, and performs positioning and identification within a single unified network, greatly increasing computing speed.
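The final classification stage (S6-S7) can be sketched as a softmax classifier over the flattened pooled feature, emitting a category together with the box coordinates produced in S1. The weights, class names, and pooled values below are invented for the example; the patent does not specify the classifier's form:

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over a score vector."""
    e = np.exp(z - z.max())
    return e / e.sum()

rng = np.random.default_rng(1)
W = rng.standard_normal((4, 3))       # 2x2 pooled feature -> 3 classes
classes = ("background", "pedestrian", "vehicle")

pooled = np.array([[0.9, 0.1],        # S5 output for one sub-window
                   [0.4, 0.7]])
box = (8, 8, 40, 56)                  # matching positioning box from S1

probs = softmax(pooled.ravel() @ W)   # S6: region class scores
label = classes[int(np.argmax(probs))]
print(label, box)                     # S7: category plus coordinates
assert np.isclose(probs.sum(), 1.0)
```

In the unified network described by the abstract, this head runs once per pooled sub-window, so localization (the box) and recognition (the label) come out of a single forward pass.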

Description

technical field

[0001] The present invention relates to the technical field of deep learning, and in particular to a target positioning and classification algorithm based on deep learning.

Background technique

[0002] Deep learning algorithms organize the processing of semantic segmentation, target detection and tracking, and scene understanding and analysis for vehicle vision under a convolutional neural network framework, forming an end-to-end processing solution. Optimizing the neural network model enables it to complete visual processing tasks quickly, accurately, and effectively on a vehicle embedded system.

[0003] Traditional methods can only perform single-target detection. Multi-target detection requires different features and different classifiers, which increases the difficulty of designing the entire system; moreover, features cannot be shared among the different classifiers, so computation is duplicated and detection efficiency cannot be improved. In addition, the generalization ability of traditional algorithms in complex scenes is weak, so they cannot achieve practical value.

Claims


Application Information

Patent Timeline: no application data
Patent Type & Authority: Applications (China)
IPC (8): G06K9/62
CPC: G06F18/241
Inventors: 王曦, 宋健明, 谢晓靓, 周冕, 李皓
Owner: ENBOTAI TIANJIN TECH CO LTD