
Target detection method based on dense connection structure

A dense-connection-based target detection technology in the fields of deep convolutional neural networks and computer vision. It addresses problems such as low accuracy and weak feature extraction that degrade detection performance, with the effect of enhancing feature extraction ability and improving detection efficiency and accuracy.

Pending Publication Date: 2021-03-23
CHANGSHA UNIVERSITY OF SCIENCE AND TECHNOLOGY
Cites: 0 · Cited by: 8

AI Technical Summary

Problems solved by technology

Since the basic network was originally proposed for classification tasks, when it is used for target detection it suffers from weak feature extraction ability and an inability to make full use of multi-scale regional features. This impairs the classification and localization performed by the subsequent detection network, thereby lowering detection accuracy.

Method used



Examples


Embodiment 1

[0032] Referring to Figures 1 to 5, this embodiment provides a target detection method based on a densely connected structure, comprising the following steps:

[0033] S1: Define the target category to be detected, collect a large amount of image data, classify and label the collected image data according to the defined target category, and obtain a data set.

[0034] Define the target categories to be detected according to the detection requirements. Collect the required image data by manual shooting, by installed shooting equipment, or by crawling the relevant data from web pages with crawler technology. Classify the collected data according to the defined target categories, and use an image labeling tool to annotate the target objects in the image data, obtaining the actual bounding box of each target object and marking its target category, thereby producing the data set. Following the principle of random division, the labeled data is divided according to the ra...
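The labeling-and-splitting step above can be sketched as a small script. This is a minimal illustration only: the patent text truncates before stating the actual split ratio, so the 8:1:1 ratio below is a hypothetical placeholder, as are the sample filenames.

```python
import random

def split_dataset(samples, ratios=(0.8, 0.1, 0.1), seed=42):
    """Randomly partition labeled samples into train/val/test subsets.

    `ratios` is a hypothetical example; the patent truncates before
    stating the ratio actually used.
    """
    assert abs(sum(ratios) - 1.0) < 1e-9, "ratios must sum to 1"
    rng = random.Random(seed)          # fixed seed for reproducibility
    shuffled = samples[:]
    rng.shuffle(shuffled)              # "principle of random division"
    n = len(shuffled)
    n_train = int(n * ratios[0])
    n_val = int(n * ratios[1])
    train = shuffled[:n_train]
    val = shuffled[n_train:n_train + n_val]
    test = shuffled[n_train + n_val:]
    return train, val, test

# Example: 100 labeled images split into 80 / 10 / 10
train, val, test = split_dataset([f"img_{i}.jpg" for i in range(100)])
```

Each sample would in practice pair an image with its annotations (bounding box and category) produced by the labeling tool.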

Embodiment 2

[0063] This embodiment provides a method for object detection based on a densely connected structure, comprising the following steps:

[0064] S1: Exactly the same as Embodiment 1: define the target categories to be detected, collect a large amount of image data, classify and label the collected image data according to the defined target categories, and obtain a data set.

[0065] S2: Construct the target detection network model and determine the loss function. The target detection network model in this embodiment consists of a basic network module, a feature fusion module, a dense connection module and a feature aggregation module. Each module is composed of convolutional layers and pooling layers. Each convolutional layer performs convolution operations on the input image data, and each operation extracts different features from the image. The lower convolutional layers extract simple image stru...
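The dense connection module described above can be sketched as follows. This is a minimal, dependency-light illustration of DenseNet-style connectivity, in which each layer receives the concatenation of the block input and all preceding layers' outputs along the channel axis; the `conv_like` stand-in, the layer count, and the growth rate are illustrative assumptions, not details taken from the patent.

```python
import numpy as np

def conv_like(x, out_channels):
    # Stand-in for a convolution + activation: a fixed linear mix
    # across channels, so the example stays dependency-free.
    c = x.shape[0]
    w = np.ones((out_channels, c)) / c
    return np.tensordot(w, x, axes=([1], [0]))  # (out_channels, H, W)

def dense_block(x, num_layers=3, growth_rate=2):
    """Dense connectivity: each layer sees the concatenation of the
    block input and all preceding outputs; the block output is the
    concatenation of everything produced so far."""
    features = [x]
    for _ in range(num_layers):
        inp = np.concatenate(features, axis=0)  # concat on channel axis
        out = conv_like(inp, growth_rate)
        features.append(out)
    return np.concatenate(features, axis=0)

x = np.zeros((4, 8, 8))  # (channels, height, width)
y = dense_block(x)       # channels grow: 4 + 3 layers * 2 = 10
```

Because every layer's output is reused by all later layers, features are propagated directly, which is what makes dense connections effective at strengthening feature extraction and easing gradient flow.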

Embodiment 3

[0073] This embodiment provides a method for object detection based on a densely connected structure, comprising the following steps:

[0074] S1: Define the target category to be detected, collect a large amount of image data, classify and label the collected image data according to the defined target category, and obtain a data set.

[0075] S2: Construct the target detection network model and determine the loss function. The target detection network model in this embodiment consists of a basic network module, a feature fusion module, a dense connection module and a feature aggregation module. Each module is composed of convolutional layers and pooling layers. Each convolutional layer performs convolution operations on the input image data, and each operation extracts different features from the image. The lower convolutional layers extract simple image structures such as the edges and lines of the image, the...



Abstract

The invention provides a target detection method based on a dense connection structure. The method comprises the steps of: defining the target categories to be detected, labeling the target objects in collected image data, obtaining the actual bounding box of each target object and marking its target category, thereby obtaining a data set; constructing a target detection network model composed of a basic network module, a feature fusion module, a dense connection module and a feature aggregation module, and determining a loss function; training the constructed model on the data set until the loss function converges, then finishing training and storing the corresponding weight parameters to obtain a trained target detection network model; and inputting images of the target categories to be detected into the trained model to realize target detection. By combining a dense connection mode with feature fusion and aggregation, the method improves feature extraction capability, alleviates the vanishing gradient problem, and effectively improves detection efficiency and accuracy.
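The train-until-convergence step in the abstract can be sketched generically. The patent only says training stops when the loss function converges, so the concrete stopping criterion (loss change below a tolerance) and the toy quadratic loss below are assumptions for illustration.

```python
def train_until_converged(step_fn, params, tol=1e-6, max_steps=10000):
    """Iterate a training step until the loss change falls below `tol`,
    then return the final parameters (the "stored weights").
    The convergence test is an assumption, not from the patent."""
    prev = float("inf")
    loss = prev
    for _ in range(max_steps):
        params, loss = step_fn(params)
        if abs(prev - loss) < tol:   # loss has stopped changing
            break
        prev = loss
    return params, loss

# Toy example: gradient descent on loss(w) = (w - 3)^2
def gd_step(w, lr=0.1):
    grad = 2 * (w - 3.0)
    w = w - lr * grad
    return w, (w - 3.0) ** 2

w, loss = train_until_converged(gd_step, 0.0)  # w approaches 3
```

In the patent's setting, `step_fn` would be one optimizer update of the detection network's weights against the chosen loss function.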

Description

Technical field

[0001] The invention relates to the technical fields of deep convolutional neural networks and deep-learning-based computer vision, in particular to a target detection method based on a dense connection structure.

Background technique

[0002] With the rapid development of information technology, massive amounts of data are generated in every aspect of people's lives. Images are one of many such data types; understanding the information conveyed by image data is a basic research goal and a fundamental task of computer vision.

[0003] In recent years, with the advancement of high-performance computing technology, the emergence of high-efficiency computing equipment, and the development of deep learning and artificial intelligence, object detection has been widely applied in robot vision, consumer electronics, intelligent video surveillance, content-based image retrieval, and many other areas. However, when using traditional...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06K9/62; G06N3/04; G06N3/08; G06T3/60; G06T7/11; G06T7/13
CPC: G06T7/13; G06T7/11; G06T3/60; G06N3/084; G06T2207/20132; G06V2201/07; G06N3/045; G06F18/214; G06F18/253; Y02T10/40
Inventor: 蒋加伏, 蒋利佳, 颜丹
Owner: CHANGSHA UNIVERSITY OF SCIENCE AND TECHNOLOGY