
Foreground interference-reduction extraction method in three-dimensional reconstruction

An extraction method for three-dimensional reconstruction, applied in the field of foreground interference extraction in three-dimensional reconstruction. It addresses the problems that existing methods fail to account for both image optimization and gap filling and suffer from background interference, and achieves well-designed, reasonable steps, fewer error points, and excellent overall performance.

Pending Publication Date: 2020-06-30
青岛联合创智科技有限公司

AI Technical Summary

Problems solved by technology

[0006] The purpose of the present invention is to overcome the shortcomings of existing foreground image processing. Current modeling methods in the motion reconstruction process retain a large amount of background information, so background interference is pronounced, and existing foreground extraction cannot account for both image optimization and gap filling. The invention therefore provides a method for extracting foreground interference in 3D reconstruction.



Examples


Embodiment 1

[0031] This embodiment concerns a method for extracting foreground interference in 3D reconstruction; the specific process steps are as follows:

[0032] Step 1, calculate the image depth maps: as shown in Figure 1, S represents the modeling object and Pi (i = 1, 2, 3) represent the images taken from different angles. A structure-from-motion algorithm performs the feature extraction, feature matching, and bundle adjustment steps on the captured images and calculates the position of each image in space. After dense matching, the depth map of each image is obtained;

[0033] Step 2, segment the depth map into blocks according to depth value: since the surface of an object is continuous and the foreground object occupies a large proportion of the image, the depth values of background objects are normally relatively large, so deleting points with large depth values removes the larger background blocks. When there a...
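The large-depth deletion described above (combined with the twice-the-average threshold stated in the abstract) can be sketched in a few lines of NumPy. This is a minimal illustration, not the patented implementation: the zero-as-invalid encoding, the toy depth map, and the `factor` parameter are assumptions made for the example.

```python
import numpy as np

def remove_background_by_depth(depth, factor=2.0):
    """Delete pixels whose depth exceeds `factor` times the mean valid depth.

    Invalid (unmatched) pixels are encoded as 0. Returns a copy with
    the distant background pixels zeroed out.
    """
    out = depth.astype(float)
    valid = out > 0
    if not valid.any():
        return out
    mean_depth = out[valid].mean()
    out[out > factor * mean_depth] = 0.0  # drop far-away background points
    return out

# Toy 4x4 depth map: foreground around depth 1, background at depth 10.
depth = np.array([
    [1.0, 1.2, 10.0, 10.0],
    [1.1, 1.3, 10.0, 10.0],
    [1.0, 1.2,  1.1,  0.0],
    [1.1, 1.0,  1.2,  0.0],
])
cleaned = remove_background_by_depth(depth)
print((cleaned > 0).sum())  # 10 foreground pixels survive; the four 10.0s are removed
```

Because the foreground dominates the image, the mean valid depth stays close to the foreground depth, so the 2x threshold cuts off the background cluster cleanly.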

Embodiment 2

[0039] In this embodiment, the foreground interference extraction method in 3D reconstruction described in Embodiment 1 is used to process specific images. Taking large-scale scene modeling as an example, the specific processing steps are as follows:

[0040] (1) Calculate the image depth maps: obtain the original images, two of which are shown in Figure 6(a) and Figure 6(b), and calculate their depth maps. Processing Figure 6(a) yields Figure 7(a), and processing Figure 6(b) yields Figure 7(b);

[0041] (2) Further process the depth maps of all original images according to steps 2-4 of Embodiment 1 to obtain depth maps with the interference removed. Processing Figure 7(a) yields Figure 8(a), and processing Figure 7(b) yields Figure 8(b);

[0042] (3) According to step 5 of Embodiment 1, the depth maps of all original images are fused to obtain a three-dimensional point cloud model, as shown in Figure 9. Figure 9 The image...
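The fusion in this step can be approximated by back-projecting each cleaned depth map through its camera and concatenating the resulting 3D points. This is a naive sketch under a pinhole camera model, not the patent's fusion algorithm; the intrinsics `K`, the world-to-camera pose convention `(R, t)`, and the identity-camera example are illustrative assumptions.

```python
import numpy as np

def backproject(depth, K, R, t):
    """Back-project a depth map into world-space 3D points (pinhole model).

    depth : (H, W) array, 0 marks invalid pixels
    K     : (3, 3) camera intrinsics
    R, t  : world-to-camera rotation and translation (x_cam = R @ x_world + t)
    Returns an (N, 3) array of world points, one per valid pixel.
    """
    v, u = np.nonzero(depth > 0)
    z = depth[v, u]
    pix = np.stack([u, v, np.ones_like(u)], axis=0).astype(float)  # homogeneous pixels
    cam = np.linalg.inv(K) @ pix * z             # pixel rays scaled by depth
    world = R.T @ (cam - t.reshape(3, 1))        # invert the rigid transform
    return world.T

def fuse_depth_maps(views):
    """Naive fusion: back-project every view and concatenate the point sets."""
    return np.vstack([backproject(d, K, R, t) for d, K, R, t in views])

# Identity camera at the origin, 2x2 depth map with one invalid pixel.
K = np.eye(3)
R = np.eye(3)
t = np.zeros(3)
depth = np.array([[2.0, 0.0],
                  [2.0, 2.0]])
cloud = fuse_depth_maps([(depth, K, R, t)])
print(cloud.shape)  # (3, 3): three valid pixels, three coordinates each
```

A real multi-view fusion would also merge redundant points and check cross-view depth consistency; the concatenation above only shows the geometric back-projection step.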



Abstract

The invention belongs to the technical field of foreground image processing, and relates to a foreground interference-reduction extraction method in three-dimensional reconstruction. The method comprises the following process steps: 1, calculating an image depth map; 2, segmenting the depth map into blocks according to the depth value; 3, deleting the image blocks with small areas, and filling small gaps in the depth map; 4, counting depth values and deleting pixel points whose depth values are greater than twice the average depth value; 5, calculating a three-dimensional point cloud by using a depth map fusion algorithm. The method can effectively reduce the interference of background objects on the foreground during three-dimensional reconstruction; detect, delete, and cleanly repair small interference factors mixed into the foreground; effectively reduce error points in the three-dimensional point cloud; and narrow the modeling range, facilitating the modeling, observation, and analysis of the foreground object in the image, with a wide range of application scenarios.
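Steps 3 and 4 of this summary (deleting small blocks and filling small gaps) could look roughly like the following pure-Python/NumPy sketch. The 4-connected component labeling, the `min_area` threshold, and the rule that only isolated zero pixels count as "small gaps" are illustrative assumptions, not the exact procedure claimed in the patent.

```python
import numpy as np
from collections import deque

def label_components(mask):
    """4-connected component labeling via BFS; returns a label map and sizes."""
    H, W = mask.shape
    labels = np.zeros((H, W), dtype=int)
    sizes, current = {}, 0
    for i in range(H):
        for j in range(W):
            if mask[i, j] and labels[i, j] == 0:
                current += 1
                labels[i, j] = current
                q, count = deque([(i, j)]), 0
                while q:
                    y, x = q.popleft()
                    count += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < H and 0 <= nx < W and mask[ny, nx] and labels[ny, nx] == 0:
                            labels[ny, nx] = current
                            q.append((ny, nx))
                sizes[current] = count
    return labels, sizes

def drop_small_blocks(depth, min_area=3):
    """Step 3a: delete depth-map blocks whose pixel count is below min_area."""
    out = depth.copy()
    labels, sizes = label_components(out > 0)
    for lab, size in sizes.items():
        if size < min_area:
            out[labels == lab] = 0.0
    return out

def fill_small_gaps(depth):
    """Step 3b: fill isolated zero pixels with the mean of their 4-neighbors.

    A zero pixel is filled only if every in-bounds neighbor is valid;
    filling happens in scan order, so earlier fills can enable later ones.
    """
    out = depth.copy()
    H, W = out.shape
    for i in range(H):
        for j in range(W):
            if out[i, j] == 0:
                coords = [(y, x) for y, x in ((i + 1, j), (i - 1, j), (i, j + 1), (i, j - 1))
                          if 0 <= y < H and 0 <= x < W]
                nb = [out[y, x] for y, x in coords if out[y, x] > 0]
                if len(nb) == len(coords):        # all in-bounds neighbors are valid
                    out[i, j] = sum(nb) / len(nb)
    return out

# Toy example: a ring of foreground pixels with a one-pixel gap in the
# middle, plus a stray one-pixel block (9.0) that should be deleted.
depth = np.array([
    [1.0, 1.0, 1.0, 0.0],
    [1.0, 0.0, 1.0, 0.0],
    [1.0, 1.0, 1.0, 0.0],
    [0.0, 0.0, 0.0, 9.0],
])
cleaned = fill_small_gaps(drop_small_blocks(depth))
print(cleaned[1, 1], cleaned[3, 3])  # 1.0 0.0: gap filled, stray block removed
```

In practice a library routine such as `scipy.ndimage.label` would replace the hand-rolled BFS; it is written out here only to keep the sketch self-contained.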

Description

Technical field: [0001] The invention belongs to the technical field of foreground image processing, and relates to a foreground extraction method used when a computer performs motion reconstruction on photographed images, especially a foreground interference extraction method in three-dimensional reconstruction, which can reduce the interference of background objects on the foreground in three-dimensional reconstruction and achieves a good foreground extraction effect. Background technique: [0002] At present, motion reconstruction is required when processing surveillance images. The goal of structure from motion (SfM) is to automatically recover camera motion and scene structure from two or more views; it is a self-calibration technique capable of automatic camera tracking and motion matching. A commonly used motion reconstruction solution is to first calculate the camera pose of each image, then calculate the depth map of each image, and finally restore the 3D point cloud ...

Claims


Application Information

IPC(8): G06T17/00; G06T5/50; G06T7/579; G06T7/11; G06T7/155; G06T7/194
CPC: G06T17/00; G06T5/50; G06T7/579; G06T7/11; G06T7/155; G06T7/194; G06T2207/20021; G06T2207/20221
Inventor 纪刚杜靖安帅杨丰拓
Owner 青岛联合创智科技有限公司