Underwater scene reconstruction method based on motion recovery, and storage medium

A technology for scene reconstruction and motion recovery, applied in 3D modeling, 3D image processing, image analysis, etc. It addresses problems such as reduced precision, increased operation difficulty and cost, and the lack of interconnection between point clouds reconstructed from different perspectives, achieving improved accuracy and robustness together with good efficiency and precision.

Active Publication Date: 2018-10-12
JILIN UNIV

AI Technical Summary

Problems solved by technology

However, the existing algorithms also have many disadvantages: the binocular stereo vision algorithm based on feature point matching has high accuracy and low time complexity, but it can only obtain a sparse point cloud of the scene, so the reconstruction effect is not ideal; the binocular stereo vision algorithm based on dense matching can obtain dense point cloud data, but its accuracy is lower and its time complexity is much higher than that of feature point matching, so it takes too long when performing 3D reconstruction of large scenes.
At the same time, because the traditional binocular stereo vision algorithm only considers the relationship between corresponding frames, the reconstructed point clouds from different viewpoints lack interconnection and join unnaturally, which affects the reconstruction effect.
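
As a point of reference for the feature-point-matching approach discussed above, the sketch below shows a minimal binocular feature matching step using OpenCV's ORB detector and a brute-force matcher. The detector, matcher, and ratio-test threshold are illustrative assumptions; the patent does not specify which detector or matcher it uses.

```cpp
// Minimal sketch (assumption): feature point matching between the left and
// right frames of a binocular pair, using ORB + brute-force Hamming matching
// with a Lowe-style ratio test. This illustrates the feature-point-matching
// stereo step discussed above, not the patent's exact algorithm.
#include <opencv2/opencv.hpp>
#include <vector>

std::vector<cv::DMatch> matchStereoPair(const cv::Mat& left, const cv::Mat& right)
{
    cv::Ptr<cv::ORB> orb = cv::ORB::create(2000);           // up to 2000 keypoints
    std::vector<cv::KeyPoint> kpL, kpR;
    cv::Mat descL, descR;
    orb->detectAndCompute(left,  cv::noArray(), kpL, descL);
    orb->detectAndCompute(right, cv::noArray(), kpR, descR);

    cv::BFMatcher matcher(cv::NORM_HAMMING);
    std::vector<std::vector<cv::DMatch>> knn;
    matcher.knnMatch(descL, descR, knn, 2);                  // two nearest neighbours

    std::vector<cv::DMatch> good;
    for (const auto& m : knn)                                // ratio test
        if (m.size() == 2 && m[0].distance < 0.75f * m[1].distance)
            good.push_back(m[0]);
    return good;                                             // sparse correspondences only
}
```

Such matching yields only sparse correspondences, which is exactly the limitation the problem statement above points out.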

Examples

Embodiment 1

[0091] The underwater scene reconstruction method based on motion recovery of the present invention can be implemented in a computer programming language and run on a development platform to realize the functions described above.

[0092] In Embodiment 1, the underwater scene reconstruction method based on motion recovery is developed on the Visual Studio 2010 platform using the C++ language, and verified on a set of seabed videos captured by a strictly calibrated binocular GoPro Hero2 camera.

[0093] Two 7 s AVI-format videos are used, taken by the left-eye camera and the right-eye camera respectively. Each video contains 210 frames; after removing redundant images, a set of 34 frames to be reconstructed is obtained. Figure 6(a)-(d) shows four of the frames remaining after redundancy removal from the underwater video to be reconstructed, Figure 7 shows the 3D point cloud model of the seabed obtained by reconstructing the above images through the scene reconstruction, and Figure 8 ...
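
The culling from 210 frames down to 34 can be pictured with a simple overlap criterion: a new frame is kept only when it no longer shares enough matched features with the last kept frame. The sketch below is only an illustration under that assumption; the overlap threshold and the ORB-based matching are illustrative choices, not the culling rule disclosed by the patent.

```cpp
// Minimal sketch (assumption): cull redundant video frames by keeping a frame
// only when its ORB-feature overlap with the last kept frame drops below a
// threshold. The 0.6 overlap ratio is an illustrative value, not a figure
// taken from the patent.
#include <opencv2/opencv.hpp>
#include <vector>

std::vector<cv::Mat> cullRedundantFrames(const std::vector<cv::Mat>& frames,
                                         double maxOverlap = 0.6)
{
    cv::Ptr<cv::ORB> orb = cv::ORB::create(1000);
    cv::BFMatcher matcher(cv::NORM_HAMMING, /*crossCheck=*/true);

    std::vector<cv::Mat> kept;
    cv::Mat lastDesc;
    for (const cv::Mat& f : frames) {
        std::vector<cv::KeyPoint> kp;
        cv::Mat desc;
        orb->detectAndCompute(f, cv::noArray(), kp, desc);
        if (desc.empty()) continue;

        if (kept.empty()) {                       // always keep the first frame
            kept.push_back(f);
            lastDesc = desc;
            continue;
        }
        std::vector<cv::DMatch> matches;
        matcher.match(desc, lastDesc, matches);   // cross-checked matches
        double overlap = static_cast<double>(matches.size()) / desc.rows;
        if (overlap < maxOverlap) {               // little overlap left -> keep frame
            kept.push_back(f);
            lastDesc = desc;
        }
    }
    return kept;
}
```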

Embodiment 2

[0095] In order to further evaluate the effect of the underwater scene reconstruction method of the present invention, two sets of multi-view model images with corresponding laser-scanned data are reconstructed using the method of the present invention and analyzed quantitatively against the laser scans.

[0096] Fig. 9 is a comparison between a dinosaur model reconstructed by the reconstruction method of the present invention and a laser scanning model, wherein Fig. 9(a) is a laser scanning model, and Fig. 9(b) is a model reconstructed by the algorithm of the present invention.

[0097] Fig. 10 is a comparison between the temple model reconstructed by the reconstruction method of the present invention and the laser scanning model, wherein Fig. 10(a) is the laser scanning model, and Fig. 10(b) is the model reconstructed by the algorithm of the present invention.

[0098] It can be seen from this that the present invention can still achieve better reconstruction results even when ...
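
A quantitative comparison of this kind is commonly reported as a nearest-neighbour distance between the reconstructed point cloud and the laser-scanned reference. The sketch below shows one way such an error could be computed; the brute-force nearest-neighbour search and the mean-distance metric are assumptions, not the patent's stated evaluation protocol.

```cpp
// Minimal sketch (assumption): mean nearest-neighbour distance from a
// reconstructed point cloud to a laser-scanned reference cloud, as one
// possible quantitative comparison metric. Brute force, O(n * m).
#include <algorithm>
#include <cmath>
#include <limits>
#include <vector>

struct Point3 { double x, y, z; };

double meanNearestNeighbourDistance(const std::vector<Point3>& reconstructed,
                                    const std::vector<Point3>& reference)
{
    if (reconstructed.empty() || reference.empty()) return 0.0;
    double sum = 0.0;
    for (const Point3& p : reconstructed) {
        double best = std::numeric_limits<double>::max();
        for (const Point3& q : reference) {       // find nearest reference point
            double dx = p.x - q.x, dy = p.y - q.y, dz = p.z - q.z;
            best = std::min(best, dx * dx + dy * dy + dz * dz);
        }
        sum += std::sqrt(best);
    }
    return sum / reconstructed.size();            // average point-to-cloud error
}
```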

Abstract

The invention discloses an underwater scene reconstruction method based on motion recovery, and a storage medium. The reconstruction method introduces an improved motion recovery algorithm, extracts the motion matrix, and establishes interconnections between the video images. After redundant images are culled, feature point matching and point cloud generation are performed in two steps: feature point matching is first performed on the binocular images, and patches are generated from the matched feature points in order to obtain denser point cloud data; the patches are then spread to all viewing angles to complete the reconstruction of the scene model, and finally color correction is performed on the point cloud model according to the imaging characteristics of the underwater scene. The invention can still achieve good reconstruction results when only a few input images are available, has relatively good efficiency and precision, and at the same time improves the accuracy and robustness of the reconstructed scene.
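
The final color correction step responds to the fact that water attenuates red light much faster than green and blue, so raw underwater point cloud colors look blue-green. The patent abstract does not spell out its correction formula; the sketch below uses a simple gray-world per-channel gain as an assumed stand-in for that step.

```cpp
// Minimal sketch (assumption): gray-world color correction of per-point RGB
// colors in an underwater point cloud. Each channel is scaled so its mean
// matches the overall mean brightness, compensating the strong red
// attenuation typical of underwater imaging. This is an illustrative
// correction, not the formula claimed by the patent.
#include <algorithm>
#include <vector>

struct ColoredPoint {
    double x, y, z;
    double r, g, b;   // colors in [0, 1]
};

void grayWorldCorrection(std::vector<ColoredPoint>& cloud)
{
    if (cloud.empty()) return;
    double meanR = 0, meanG = 0, meanB = 0;
    for (const ColoredPoint& p : cloud) {
        meanR += p.r; meanG += p.g; meanB += p.b;
    }
    meanR /= cloud.size(); meanG /= cloud.size(); meanB /= cloud.size();
    double gray = (meanR + meanG + meanB) / 3.0;   // target channel mean

    for (ColoredPoint& p : cloud) {                // rescale and clamp each channel
        p.r = std::min(1.0, p.r * gray / std::max(meanR, 1e-6));
        p.g = std::min(1.0, p.g * gray / std::max(meanG, 1e-6));
        p.b = std::min(1.0, p.b * gray / std::max(meanB, 1e-6));
    }
}
```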

Description

Technical field

[0001] The invention relates to a three-dimensional reconstruction method, in particular to an underwater scene reconstruction method based on motion recovery and a storage medium, which takes into account both efficiency and accuracy.

Background technique

[0002] The real world is three-dimensional. In order to facilitate the observation and analysis of the real world, and even to expand it, it is necessary to reconstruct three-dimensional models in a computer environment. In recent years, with the rapid advancement of computer hardware and the rapid development of software, there are more and more methods for constructing 3D models, and related software is widely used in medical image processing, 3D printing, computer games, virtual reality, mapping, simulated military training, film and television entertainment, and other fields. According to the way the reconstruction data is obtained, 3D model construction technology is mainly divided into th...

Claims

Application Information

IPC (IPC-8): G06T17/00; G06T7/246; G06T7/33; G06T7/80; G06T7/90; G06T15/50
CPC: G06T7/248; G06T7/337; G06T7/85; G06T7/90; G06T15/50; G06T17/00; G06T2207/10021; G06T2207/10028; G06T2207/20021
Inventor: 王欣 (Wang Xin); 杨熙 (Yang Xi)
Owner JILIN UNIV