Overlapped particulate matter layered counting method based on a color image and a depth image

A depth-image and color-image technology, applied in image analysis, image data processing, image enhancement, etc., addressing problems such as low counting accuracy and failure to count under multi-layer overlapping occlusion.

Active Publication Date: 2019-04-05
JIANGSU UNIV

AI Technical Summary

Problems solved by technology

In the process of particle counting, there will be multiple overlapping occlusions, which seriously affects the accuracy of counting. Therefore, the counting of overlapping particles has become a difficult problem in automatic counting systems, and has also attracted the attention of many scholars.
Addressing the current state of research on automatic counting of overlapping particles, in which counting accuracy is low and counting fails entirely when particles completely overlap, this paper proposes a layered counting method for overlapping particles.



Examples


Embodiment Construction

[0049] The present invention will be described in detail below in conjunction with the accompanying drawings and specific embodiments, but the protection scope of the present invention is not limited thereto.

[0050] (1) Collect the target image.

[0051] The specific method is as follows: a Microsoft Kinect camera captures the color image and the depth image of the same scene simultaneously. The pixel intensity of the depth image corresponds to the distance from the camera. By adjusting to a suitable shooting distance, the best target image can be obtained.
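The Kinect depth stream reports per-pixel distance (typically in millimetres), which must be mapped to intensity before grayscale processing. A minimal sketch of that mapping, using NumPy only; the near/far range values here are illustrative assumptions, not taken from the patent:

```python
import numpy as np

def depth_to_gray(depth_mm, near=500, far=2000):
    """Map raw Kinect-style depth (millimetres) to an 8-bit grayscale image.

    Distances are clipped to [near, far]; closer objects become brighter.
    The range values are illustrative, not from the patent.
    """
    d = np.clip(depth_mm.astype(np.float32), near, far)
    gray = (far - d) / (far - near) * 255.0  # invert: near -> bright
    return gray.astype(np.uint8)

# toy 2x2 depth frame in millimetres
frame = np.array([[500, 2000], [1250, 1250]], dtype=np.uint16)
print(depth_to_gray(frame))  # → [[255 0] [127 127]]
```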

[0052] (2) Image preprocessing operation.

[0053] The specific method is as follows: to enhance the original depth image, a grayscale transformation is applied, and a multi-frame improved median filter method is proposed to denoise the depth image. The algorithm steps are: 1. While shooting, continuously save multiple frames of ima...
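The multi-frame denoising idea can be sketched as a per-pixel temporal median over consecutively captured frames; depth dropouts (zero readings) in a single frame are then out-voted by the other frames. This is a generic sketch of the concept, not the patent's exact "improved" variant, whose details are truncated above:

```python
import numpy as np

def multiframe_median(frames):
    """Denoise a depth image via the per-pixel median across several
    consecutively captured frames (a temporal median filter)."""
    stack = np.stack(frames, axis=0)          # shape: (n_frames, H, W)
    return np.median(stack, axis=0).astype(stack.dtype)

# three noisy 1x3 depth frames; the middle pixel has a dropout (0) in one frame
f1 = np.array([[100, 0,   100]], dtype=np.uint16)
f2 = np.array([[100, 102, 100]], dtype=np.uint16)
f3 = np.array([[100, 101, 100]], dtype=np.uint16)
print(multiframe_median([f1, f2, f3]))  # → [[100 101 100]]
```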



Abstract

The invention provides an overlapped particulate matter layered counting method based on a color image and a depth image. Color and depth images are acquired by a Kinect camera and registered, and the color image target area is extracted with an improved K-means algorithm. Using the contour characteristics of the target area, targets are divided into single particles, adhesion types, and overlapping types, and extractable contours are reconstructed with an interpolation algorithm to obtain the target count of the corresponding area. For overlapped target contours that cannot be extracted in this way, the average area of a single particle is estimated, and the corresponding number is obtained by dividing the total area of the target area by the average area. For upper-layer targets that cannot be extracted from the color image, threshold segmentation is performed on their depth image to complete the extraction and classified counting of the upper layer. The method takes paper-shell walnuts of uniform size and near-circular shape as research objects; counting randomly placed walnuts, the average accuracy reaches 99.38%, indicating that the method is effective and offering a new idea for counting overlapped particulate matter.
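The area-division step in the abstract, i.e. estimating the count of a fully overlapped clump from the average area of isolated particles, can be sketched as follows. The pixel-area values are hypothetical examples, not measurements from the patent:

```python
def count_by_area(total_area, single_areas):
    """Estimate the particle count of an overlapped region by dividing
    its total area by the average area of a single particle.

    `single_areas` are areas measured from isolated particles elsewhere
    in the image; values here are illustrative.
    """
    avg = sum(single_areas) / len(single_areas)
    return round(total_area / avg)

# isolated walnuts measured at ~1200 px^2 each; a clump covers 6100 px^2
print(count_by_area(6100, [1180, 1210, 1205, 1195]))  # → 5
```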

Description

technical field

[0001] The invention belongs to the technical field of digital image processing, and relates to the collection and processing of depth images, in particular to the classification, identification, and layered counting of overlapping particles based on color images and depth images.

Background technique

[0002] In recent years, automatic counting methods using image processing technology have been applied in many fields, especially agricultural automation. Counting methods based on machine vision effectively overcome many shortcomings of manual counting. In the process of particle counting, there will be multiple overlapping occlusions, which seriously affect the accuracy of counting. Therefore, the counting of overlapping particles has become a difficult problem in automatic counting systems, and has also attracted the attention of many scholars. Aiming at the research status of the current automatic counting method of overlapping par...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06T7/33; G06T7/11; G06T7/12; G06T5/00; G06T7/60; G06T7/80; G06K9/62
CPC: G06T5/002; G06T7/11; G06T7/12; G06T7/337; G06T7/60; G06T7/80; G06T2207/30242; G06T2207/10024; G06T2207/10028; G06F18/23213; Y02A90/10
Inventor: 朱伟兴, 司艳丽, 李新城
Owner JIANGSU UNIV