Crowd counting method based on coding-decoding structure multi-scale convolutional neural network

A crowd counting technology based on a multi-scale convolutional neural network with an encoding-decoding structure. It addresses the problems of low density-map quality, loss of multi-scale feature information, and poor feature fusion, achieving effective feature fusion and improved density-map output quality.

Active Publication Date: 2020-06-05
XI'AN UNIVERSITY OF ARCHITECTURE AND TECHNOLOGY

AI Technical Summary

Problems solved by technology

[0005] The purpose of the present invention is to provide a crowd counting method based on a multi-scale convolutional neural network with an encoding-decoding structure, so as to solve the problems of multi-scale feature information loss, poor feature fusion, and low density-map quality found in crowd counting methods based on multi-column convolutional neural networks.




Embodiment Construction

[0055] To make the purpose, technical effects and technical solutions of the embodiments of the present invention clearer, the technical solutions in the embodiments are described fully below in conjunction with the accompanying drawings. Obviously, the described embodiments are only some of the embodiments of the present invention; all other embodiments obtained by persons of ordinary skill in the art, based on the disclosed embodiments and without creative effort, shall fall within the protection scope of the present invention.

[0056] A crowd counting method based on an encoding-decoding structure multi-scale convolutional neural network in an embodiment of the present invention comprises the following steps:

[0057] Step 1: Collect image information from the actual scene through a monitoring camera, considering the viewing angle distortion of the i...
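The adaptive Gaussian filtering used to build the ground-truth density map is not fully specified in this excerpt. A common realization is the geometry-adaptive kernel, in which each annotated head is smoothed with a Gaussian whose spread is proportional to the mean distance to its k nearest neighbours, approximating head size under perspective distortion. A minimal sketch under that assumption (the parameters `k` and `beta` are illustrative, not quoted from the patent):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def density_map(points, shape, k=3, beta=0.3):
    """Ground-truth density map from head annotations.

    Each head point (x, y) is smoothed with a Gaussian whose sigma is
    beta times the mean distance to its k nearest neighbours -- a
    geometry-adaptive kernel (an assumption, not quoted from the patent).
    Each Gaussian integrates to 1, so the map's sum approximates the count.
    """
    h, w = shape
    dm = np.zeros((h, w), dtype=np.float64)
    pts = np.asarray(points, dtype=np.float64)
    n = len(pts)
    for i, (x, y) in enumerate(pts):
        if n > 1:
            d = np.sqrt(((pts - pts[i]) ** 2).sum(axis=1))
            # sort distances, skip the zero distance to self
            sigma = beta * np.sort(d)[1:k + 1].mean()
        else:
            sigma = 15.0  # fallback spread for a lone annotation
        impulse = np.zeros((h, w), dtype=np.float64)
        impulse[min(int(y), h - 1), min(int(x), w - 1)] = 1.0
        dm += gaussian_filter(impulse, sigma)
    return dm
```

With scipy's default reflective boundary handling the total mass is preserved, so the map's integral equals the number of annotated heads.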


Abstract

The invention discloses a crowd counting method based on a coding-decoding structure multi-scale convolutional neural network, comprising the following steps: considering the visual angle distortion of an image, employing an adaptive Gaussian filter to calculate a true-value density map of the image; building a multi-scale convolutional neural network model with an encoding-decoding structure, wherein the loss function of the model comprises a pixel-space loss and a counting-error term; training and testing the built model to obtain a trained multi-scale convolutional neural network model; inputting a to-be-estimated image into the trained model and predicting a crowd density map; and performing regression estimation on the crowd density map to obtain the number of people in the to-be-estimated image. The method preserves the scale features and contextual information of the image and improves the output quality of the density map.
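The abstract states that the loss combines a pixel-space term with a counting-error term, and that the final count is obtained by regression over the predicted density map; the exact forms and weighting are not given in this excerpt. A hedged sketch, assuming an MSE pixel term, a squared count-difference term, and a hypothetical weight `lam`:

```python
import numpy as np

def combined_loss(pred, gt, lam=0.001):
    """Loss with a pixel-space term plus a counting-error term.

    pred, gt: predicted and ground-truth density maps of shape (H, W).
    lam: weight on the counting term -- a hypothetical value; the
    patent excerpt does not give the actual form or weighting.
    """
    pixel_loss = np.mean((pred - gt) ** 2)        # pixel-space (MSE) loss
    count_loss = (pred.sum() - gt.sum()) ** 2     # squared count error
    return pixel_loss + lam * count_loss

def estimate_count(density_map):
    """Head count by regression over the density map: its integral."""
    return float(density_map.sum())
```

The counting term penalizes a network that matches the map pixel-wise but drifts in total mass, which is what the regression step ultimately reads out.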

Description

Technical field

[0001] The invention belongs to the technical field of image processing, and in particular relates to a crowd counting method based on an encoding-decoding structure multi-scale convolutional neural network.

Background technique

[0002] With the rapid development of the national economy and the continuous acceleration of urbanization, the urban population has increased sharply, and the resulting social problems have multiplied: for example, unsafe incidents such as stampedes caused by overcrowding, and the pressure of traffic scheduling. To solve these problems, the number of people in a scene must be predicted accurately. Since images clearly and intuitively reflect crowd changes in an actual scene, crowd density estimation and counting based on image information is of significant research value.

[0003] The Convolutional Neural Network (CNN) model has developed rapidly in the fields of semantic segme...


Application Information

IPC(8): G06K9/00, G06K9/62, G06N3/04
CPC: G06V20/53, G06N3/045, G06F18/214
Inventors: 孟月波, 刘光辉, 徐胜军, 纪拓
Owner: XI'AN UNIVERSITY OF ARCHITECTURE AND TECHNOLOGY