Disparity map DBSCAN clustering-based region of interest extraction method

A region of interest extraction technology, applied in the field of region of interest extraction based on disparity map DBSCAN clustering, which addresses the problems that existing methods are difficult to apply where sample acquisition and labeling are costly, and that their accuracy is strongly affected by the image itself.

Active Publication Date: 2019-11-26
CHINESE AERONAUTICAL RADIO ELECTRONICS RES INST

AI Technical Summary

Problems solved by technology

[0003] Existing region of interest extraction is mostly based on monocular images, usually using manual interaction or image threshold segmentation to find the corresponding target region and treat it as the region of interest. Its accuracy is greatly affected by the image itself: uneven illumination or small differences between the foreground and the background make extraction difficult.
Target detection can also serve as a means of extracting regions of interest, with the detected target boxes used as regions of interest for subsequent processing, but it requires a large number of samples as offline training data and is therefore difficult to apply where sample acquisition and labeling are costly.




Embodiment Construction

[0027] In order to further illustrate the technical content of the present invention, the invention is described in detail below in conjunction with the accompanying drawings and embodiments.

[0028] As shown in figure 1, a method for extracting regions of interest based on disparity map DBSCAN clustering includes the following steps:

[0029] Step 1: Perform matching calculation on the binocular view to obtain a disparity image. The disparity image may come from a sparse disparity map method based on SAD matching, or from a dense disparity map calculation method based on SGM, graph cuts, or deep learning. The disparity map should be quantized to a pixel value range of 0 to 255.
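A minimal sketch of step 1, assuming OpenCV's SGBM matcher as one of the dense methods mentioned above; the matcher choice, its parameters, and the file names are illustrative assumptions, not values taken from the patent:

```python
# Step 1 sketch: compute a disparity image from a rectified binocular pair and
# quantize it to 0-255. Any matching method allowed by the patent (SAD, SGM,
# graph cut, deep learning) could be substituted here.
import cv2
import numpy as np

left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)    # hypothetical input pair
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

# SGM-style block matcher; parameter values are illustrative only.
matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=64, blockSize=7)
raw = matcher.compute(left, right).astype(np.float32) / 16.0  # SGBM returns fixed-point disparities

# Quantize the disparity map to pixel values 0-255, as required by step 1.
disp = cv2.normalize(raw, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
```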

[0030] Step 2: Preprocess the disparity image obtained in step 1 and reshape it into an image matrix with 1 row and N columns, where N is the product of the image width W and height H; the background in the disparity image is, in the image matrix, ...
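A minimal sketch of the reshaping in step 2, continuing from the `disp` array of the step 1 sketch; how background pixels are represented in the matrix is elided in the excerpt above, so only the reshaping itself is shown:

```python
# Step 2 sketch: flatten the H x W disparity image into a 1-row, N-column
# matrix, where N = W * H, as described in paragraph [0030].
import numpy as np

H, W = disp.shape
N = W * H
image_matrix = disp.reshape(1, N)  # 1 row, N columns of disparity values
```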



Abstract

The invention discloses a disparity map DBSCAN clustering-based region of interest extraction method, which comprises the following steps: 1, performing matching calculation on a binocular view to obtain a disparity image; 2, reshaping the disparity image into an image matrix with one row and N columns, wherein N is the product of the width W and the height H of the image; 3, taking the absolute value of the disparity difference as the DBSCAN sample distance, and carrying out cluster analysis of the image matrix using the DBSCAN clustering algorithm; and 4, mapping the clustering result back onto one of the views in the original binocular pair, and obtaining the regions of interest from the clustering result. The method requires neither manual interaction nor training on a sample set, can automatically extract the region of interest at each depth, is not affected by a complex background, and facilitates further analysis and processing of the image.
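Read together, steps 2-4 of the abstract can be sketched end to end as follows. This is a hedged illustration, not the patented implementation: scikit-learn's DBSCAN is assumed for step 3 (clustering one-dimensional disparity values with the Euclidean metric reduces to the absolute-value distance the abstract specifies), and the `eps`, `min_samples`, and background-threshold values are placeholder assumptions.

```python
# Hedged sketch of steps 2-4: reshape the disparity image, cluster pixels by
# absolute disparity difference with DBSCAN, and map each cluster back to the
# image grid to obtain a bounding-box region of interest per depth layer.
import numpy as np
from sklearn.cluster import DBSCAN

def extract_rois(disp, eps=3.0, min_samples=50, background_thresh=1):
    H, W = disp.shape
    flat = disp.reshape(-1).astype(np.float32)   # step 2: 1 x N image matrix
    fg = flat > background_thresh                # assumed rule for skipping background pixels

    # Step 3: DBSCAN with |d_i - d_j| as the sample distance (1-D Euclidean).
    labels = np.full(flat.shape, -1, dtype=int)
    labels[fg] = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(flat[fg].reshape(-1, 1))

    # Step 4: map each cluster back to image coordinates; take its bounding box as an ROI.
    rois = []
    label_map = labels.reshape(H, W)
    for k in range(label_map.max() + 1):
        ys, xs = np.nonzero(label_map == k)
        rois.append((xs.min(), ys.min(), xs.max(), ys.max()))  # (x0, y0, x1, y1)
    return rois
```

Because the clustering operates on disparity values rather than intensities, each cluster corresponds to a depth layer, which is what lets the method separate targets from a cluttered background without interaction or training.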

Description

Technical field

[0001] The invention relates to the technical field of image processing, in particular to a method for extracting regions of interest based on disparity map DBSCAN clustering.

Background technique

[0002] A region of interest (ROI) is an image region selected, in some form, from the image being processed; this region is the focus of subsequent image analysis. Generally speaking, the region of interest is the region where the target object is located in the image, rather than the background region. Region of interest extraction can reduce image processing time and improve processing accuracy.

[0003] Existing region of interest extraction is mostly based on monocular images, usually using manual interaction or image threshold segmentation to find the corresponding target region and treat it as the region of interest. Its accuracy is greatly affected by the image itself: uneven illumination or small differences between the fore...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/32; G06K9/62
CPC: G06V10/25; G06F18/2321
Inventor: 李威, 魏大洲, 邢波涛
Owner: CHINESE AERONAUTICAL RADIO ELECTRONICS RES INST