Weak and small object recognition method based on deep neural network

A deep neural network and weak-target technology, applied to neural learning methods, biological neural network models, and character and pattern recognition. It addresses problems such as the low accuracy, overly complex structure, and under-fitting of existing networks in weak and small target recognition, and achieves improved saliency and accurate classification.

Publication Status: Inactive
Publication Date: 2018-06-05
NORTHWESTERN POLYTECHNICAL UNIV

AI Technical Summary

Problems solved by technology

[0004] The technical problem solved by the present invention: existing deep neural networks are designed for large-scale targets and their structures are too complex, so under-fitting readily occurs when recognizing weak and small targets, resulting in low accuracy.



Examples


Embodiment Construction

[0016] See figure 1. The deep neural network structure proposed by the present invention has 23 layers in total, of which 21 are hidden layers: 5 convolutional layers, 3 fully connected layers, 7 ReLU layers, 3 normalization layers and 3 pooling layers. The input of the neural network is a 65×65 RGB color image, and the output is two probability values, representing respectively the probability that the input image is the target and the probability that it is not.
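For orientation, below is a minimal PyTorch sketch of a 23-layer structure matching the layer counts given above (5 convolutional, 3 fully connected, 7 ReLU, 3 normalization, 3 pooling layers, 65×65 RGB input, two output probabilities). Only the first stage follows dimensions stated in the text; the kernel sizes, channel widths and padding of the later stages, and the class/module names, are assumptions introduced here for illustration, not the patented configuration.

```python
# A sketch of the 23-layer structure described in [0016]: 5 conv, 3 FC,
# 7 ReLU, 3 normalization and 3 pooling layers. Stages 2-5 are assumed.
import torch
import torch.nn as nn

class WeakSmallTargetNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            # stage 1 (dimensions as in paragraph [0017])
            nn.Conv2d(3, 64, kernel_size=5, stride=2, padding=2),   # conv 1
            nn.ReLU(inplace=True),                                   # ReLU 1
            nn.MaxPool2d(kernel_size=3, stride=2, padding=1),        # pool 1
            nn.LocalResponseNorm(5),                                  # norm 1
            # stages 2-5 (assumed configuration)
            nn.Conv2d(64, 128, kernel_size=3, padding=1),            # conv 2
            nn.ReLU(inplace=True),                                    # ReLU 2
            nn.MaxPool2d(kernel_size=3, stride=2, padding=1),         # pool 2
            nn.LocalResponseNorm(5),                                   # norm 2
            nn.Conv2d(128, 192, kernel_size=3, padding=1),             # conv 3
            nn.ReLU(inplace=True),                                      # ReLU 3
            nn.Conv2d(192, 192, kernel_size=3, padding=1),              # conv 4
            nn.ReLU(inplace=True),                                       # ReLU 4
            nn.Conv2d(192, 128, kernel_size=3, padding=1),               # conv 5
            nn.ReLU(inplace=True),                                        # ReLU 5
            nn.MaxPool2d(kernel_size=3, stride=2, padding=1),             # pool 3
            nn.LocalResponseNorm(5),                                       # norm 3
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(128 * 5 * 5, 512), nn.ReLU(inplace=True),  # FC 1, ReLU 6
            nn.Linear(512, 512), nn.ReLU(inplace=True),          # FC 2, ReLU 7
            nn.Linear(512, 2),                                   # FC 3: two classes
        )

    def forward(self, x):
        # x: (N, 3, 65, 65) RGB batch; returns target / non-target probabilities
        return torch.softmax(self.classifier(self.features(x)), dim=1)
```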

[0017] The input layer of the deep neural network accepts an RGB color image of 65×65 pixels. The first convolutional layer has 64 convolution kernels of size 5×5×3 and performs convolution with a stride of 2; after a pooling layer with a stride of 2 and a window of 3×3 pixels, followed by a normalization layer, it outputs a set of feature maps of size 17×17×64.
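As a quick check of the dimensions quoted above, the following sketch reproduces the 17×17×64 first-stage output; the padding values (2 for the convolution, 1 for the pooling window) are assumptions, since the paragraph does not state them.

```python
# Shape check for the first stage of [0017], under assumed padding values.
import torch
import torch.nn as nn

stage1 = nn.Sequential(
    nn.Conv2d(3, 64, kernel_size=5, stride=2, padding=2),  # 65x65x3 -> 33x33x64
    nn.MaxPool2d(kernel_size=3, stride=2, padding=1),       # 33x33x64 -> 17x17x64
    nn.LocalResponseNorm(5),                                  # normalization, shape unchanged
)

x = torch.randn(1, 3, 65, 65)   # one 65x65 RGB image
print(stage1(x).shape)          # torch.Size([1, 64, 17, 17])
```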

[0018] The second convolutional layer is com...



Abstract

The invention relates to a weak and small object recognition method based on a deep neural network; the method is an application of deep neural networks to computer vision. Aiming at the problems that existing neural networks recognize weak and small objects with low precision and at low speed, the invention proposes a novel deep neural network structure. The structure focuses on extracting the features of weak and small objects and accurately describing small-scale image features by increasing the image data depth and repeatedly extracting features at the same scale, thereby achieving accurate recognition of weak and small objects.

Description

Technical field
[0001] The invention belongs to the technical field of image processing, relates to the application of deep neural networks in computer vision, and relates to a weak and small target recognition method based on a deep neural network.
Background technique
[0002] Deep neural networks have been a very popular research direction in artificial intelligence in recent years, with breakthroughs in image processing, target recognition, and speech recognition. The deep convolutional neural network is the most representative network structure in the field of image processing. It has two characteristics: (1) a deep, multi-hidden-layer structure: the deep structure enables the neural network to learn complex functions that represent high-level abstract features, and the multiple hidden layers greatly improve its learning ability, so that more abstract and essential features are extracted; (2) it adopts weight sharing ne...


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06K9/62, G06N3/08
CPC: G06N3/084, G06V2201/07, G06F18/24, G06F18/214
Inventor: 王靖宇, 王霰禹, 姜海旭, 张科, 王佩, 吕梅柏, 张彦华
Owner: NORTHWESTERN POLYTECHNICAL UNIV