Three-dimensional tree image fusion method based on environmental perception

A technique combining environment perception and image fusion, applicable to image enhancement, image analysis, image data processing, and related fields; it addresses problems such as the inability to achieve natural fusion, occlusion, and "paper-fragmentation".

Active Publication Date: 2020-02-04
ZHEJIANG UNIV OF TECH

AI Technical Summary

Problems solved by technology

The structure of a 2D tree image is complex, making it difficult to separate the tree object from the background with simple user interaction.
In addition, fusing 2D images into 3D scenes is prone to "paper-fragmentation", occlusion, inconsistent depth, and similar artifacts, so natural fusion cannot be achieved.



Examples


Embodiment Construction

[0066] The technical solution of the present invention will be further described below in conjunction with the accompanying drawings.

[0067] A three-dimensional tree image fusion method based on environmental perception, comprising the following steps:

[0068] 1. Get the tree object;

[0069] (11) Obtain the trimap (three-color map), as follows:

[0070] Any pixel of a natural image is a linear combination of its corresponding foreground and background colors:

[0071] I_z = α_z F_z + (1 - α_z) B_z   (1)

[0072] Here z denotes any pixel in the image, I_z is the observed pixel value at z, F_z is the foreground color at z, B_z is the background color at z, and α_z is the blending weight of the foreground color relative to the background color at z, with a value in [0, 1]. When α_z is 0, pixel z is absolute background; when α_z is 1, it is absolute foreground. The trimap divides the image into three regions: absolute foreground, absolute background, and an unknown region.
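Equation (1) is the standard alpha-compositing model used in image matting. As a minimal illustration (the function name and array layout below are our own, not from the patent), it can be evaluated per pixel with NumPy broadcasting:

```python
import numpy as np

def composite(foreground, background, alpha):
    """Per-pixel linear blend: I_z = alpha_z * F_z + (1 - alpha_z) * B_z.

    foreground, background: float arrays of shape (H, W, 3), values in [0, 1].
    alpha: float array of shape (H, W), values in [0, 1];
           1 = absolute foreground, 0 = absolute background.
    """
    a = alpha[..., np.newaxis]          # broadcast alpha across color channels
    return a * foreground + (1.0 - a) * background

# Tiny 1x2 example: left pixel is pure foreground, right pixel a 50/50 blend.
F = np.array([[[1.0, 0.0, 0.0], [1.0, 0.0, 0.0]]])   # red foreground
B = np.array([[[0.0, 0.0, 1.0], [0.0, 0.0, 1.0]]])   # blue background
alpha = np.array([[1.0, 0.5]])
I = composite(F, B, alpha)
# I[0, 0] == [1.0, 0.0, 0.0]; I[0, 1] == [0.5, 0.0, 0.5]
```

Solving for α_z (and F_z, B_z) given only I_z is the matting problem; the trimap constrains it by fixing α_z = 1 in the absolute-foreground region and α_z = 0 in the absolute-background region, leaving only the unknown region to be estimated.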



Abstract

The invention discloses a three-dimensional tree image fusion method based on environmental perception. The method comprises the following steps: step 1, acquiring the tree object; step 2, carrying out depth-consistency parallax fusion; and step 3, adjusting image color differences and processing occlusion. The tree disparity map constructed by the method reflects the multi-level characteristics of the tree, and the fused disparity map ensures overall consistency between the fused tree and the target scene in the depth domain. The resulting tree fusion effect is more natural than Poisson fusion or transparency fusion, and the occlusion relationship between the fused foreground and background is handled effectively.

Description

technical field

[0001] The invention relates to a method for the seamless fusion of three-dimensional tree images.

Background technique

[0002] At present, 3D multimedia technology has received increasing attention, and the requirements for stereoscopic image and video editing technology are correspondingly rising. Tree objects are widely used in virtual city construction, virtual games, and similar applications, so intuitively integrating 2D tree objects into 3D scenes would greatly simplify and speed up the construction of three-dimensional scenes. The main purpose of the present invention is therefore the natural fusion of 2D tree images into 3D target stereoscopic scenes.

[0003] Achieving a more natural and seamless fusion is the key issue in fusion research. 2D tree images have complex structures, and it is difficult to separate objects from background images with simple user interaction. Moreover, when 2D images are...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06T5/50, G06T5/00, G06T19/20, G06K9/62
CPC: G06T5/50, G06T5/007, G06T19/20, G06T2207/10024, G06T2207/20221, G06T2207/30188, G06T2219/2012, G06F18/23213
Inventors: 董天阳, 程银婷, 杨丽锦
Owner: ZHEJIANG UNIV OF TECH