
Crowd counting method based on scene depth information

A crowd counting technology based on scene depth information, applied in computing, image data processing, computer components, etc. It addresses the insufficient multi-scale adaptability, high degree of overfitting, and low counting accuracy of existing crowd counting methods, and achieves suppression of the insufficient-sample-size problem, a low degree of overfitting, and accurate counting.

Inactive Publication Date: 2019-07-26
CHANGSHU INSTITUTE OF TECHNOLOGY

AI Technical Summary

Benefits of technology

By segmenting the scene according to depth and assigning each region a counting model suited to it (target detection for the close-range area, density estimation for the distant-view area), the method adapts to the wide range of head scales present in a single scene, giving good multi-scale adaptability. Because each model handles only a narrow range of scales, the degree of overfitting stays low and the effect of insufficient sample size is suppressed, yielding accurate counts.

Problems solved by technology

The patent addresses the shortcomings of existing crowd counting systems. Current solutions rely either on static pattern-recognition (detection) algorithms or on fully trained regression models, and typically require extensive prior knowledge of the observed scene; a single such model must cope with head sizes that vary widely with scene depth, resulting in insufficient multi-scale adaptability, a high degree of overfitting, and low counting accuracy.



Embodiment Construction

[0037] As shown in Figure 1, the crowd counting method based on scene depth information of the present invention comprises the following steps:

[0038] (10) Image segmentation: extract the depth information of the input image using a monocular image depth estimation algorithm, and segment the input image into near-view and distant-view areas according to the depth information;

[0039] As shown in Figure 2, the image segmentation step (10) comprises:

[0040] (11) Extract depth information: use a fully convolutional residual network to predict the depth map of a single RGB input image, mapping the input image to a corresponding depth map, and restore the depth map to the size of the input image;
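The patent names a fully convolutional residual network for the depth prediction itself; only the size-restoration part of step (11) is sketched below, as plain bilinear upsampling in NumPy. The network, its weights, and its output resolution are assumptions, not specified by this excerpt.

```python
import numpy as np

def resize_bilinear(depth_small: np.ndarray, out_h: int, out_w: int) -> np.ndarray:
    """Upsample a low-resolution depth prediction back to the input image size.

    Fully convolutional depth networks typically emit a map at a fraction of
    the input resolution; this restores it with bilinear interpolation.
    """
    h, w = depth_small.shape
    ys = np.linspace(0, h - 1, out_h)            # fractional source rows
    xs = np.linspace(0, w - 1, out_w)            # fractional source cols
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, h - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, w - 1)
    wy = (ys - y0)[:, None]                       # row interpolation weights
    wx = (xs - x0)[None, :]                       # col interpolation weights
    top = depth_small[np.ix_(y0, x0)] * (1 - wx) + depth_small[np.ix_(y0, x1)] * wx
    bot = depth_small[np.ix_(y1, x0)] * (1 - wx) + depth_small[np.ix_(y1, x1)] * wx
    return top * (1 - wy) + bot * wy

# toy 2x2 "network output" restored to a 4x4 input size
small = np.array([[1.0, 3.0], [5.0, 7.0]])
full = resize_bilinear(small, 4, 4)
print(full.shape)  # (4, 4)
```

Corner values of the small map are preserved exactly, which is the usual sanity check for this kind of resize.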

[0041] (12) Segmentation of image area: using the simple linear iterative clustering method, divide the depth map into two parts, then map the segmentation result back onto the input image, so that the input image is divided into a near-view area and a distant-view area;
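Step (12)'s two-way split can be sketched as follows. The patent uses simple linear iterative clustering (SLIC) on the depth map; as a stand-in, this sketch omits the superpixel stage and runs a 1-D k-means (k = 2) directly on the per-pixel depths, labeling the lower-depth cluster as the near-view area.

```python
import numpy as np

def two_cluster_depth(depth: np.ndarray, iters: int = 20):
    """Partition per-pixel depths into two clusters: near-view and distant-view.

    A 1-D k-means with k=2 stands in for the patent's SLIC-based split;
    cluster 0 is seeded at the minimum depth, so it ends up as the near mask.
    """
    vals = depth.ravel()
    centers = np.array([vals.min(), vals.max()], dtype=float)
    for _ in range(iters):
        labels = np.abs(vals[:, None] - centers[None, :]).argmin(axis=1)
        for k in (0, 1):
            if np.any(labels == k):
                centers[k] = vals[labels == k].mean()
    near_mask = (labels == 0).reshape(depth.shape)
    return near_mask, ~near_mask

# toy depth map: left half ~1 m (near), right half ~9 m (far)
depth = np.array([[1.0, 1.2, 9.0, 9.5],
                  [0.9, 1.1, 8.8, 9.2]])
near, far = two_cluster_depth(depth)
print(near.sum(), far.sum())  # 4 4
```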



Abstract

The invention discloses a crowd counting method based on scene depth information, which comprises the following steps: (10) image segmentation: extracting depth information of an input image by adopting a monocular image depth estimation algorithm, and segmenting the input image into a close-range region and a distant-view region according to the depth information; (20) close-range area crowd counting: predicting head bounding boxes in the close-range area by adopting a crowd counting method based on target detection, to obtain the number of people in the close-range area; (30) crowd counting in the distant view area: predicting a crowd density map of the distant view area by adopting a crowd counting method based on density estimation, and integrating the crowd density map of the distant view area to obtain the number of people in the distant view area; and (40) confirmation of the number of people in the input image: adding the number of people in the distant view area and the number of people in the close-range area to obtain the total number of people in the input image. The crowd counting method based on scene depth information has good multi-scale adaptability, a low degree of overfitting, and accurate counting.
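As a minimal sketch of how steps (20) through (40) combine: the close-range count is the number of predicted head boxes, the distant-view count is the integral (sum) of the predicted density map, and the two are added. The detector and density-estimation network themselves are assumed here, not shown.

```python
import numpy as np

def total_count(near_boxes, far_density: np.ndarray) -> float:
    """Steps (20)-(40): combine per-region counts into the image total.

    near_boxes: head bounding boxes detected in the close-range region
    far_density: density map over the distant-view region; summing it
    approximates the integral that yields the far-region head count.
    """
    return len(near_boxes) + float(far_density.sum())

# hypothetical outputs: 3 detected heads near, density map integrating to 5 far
boxes = [(10, 10, 30, 30), (50, 12, 70, 32), (90, 8, 110, 28)]
density = np.full((8, 8), 5.0 / 64)
print(total_count(boxes, density))  # 8.0
```

Summing the density map is the standard way to read out a count from a density-estimation model, since each head contributes a unit-mass blob to the map.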

