
Image space-based image field depth simulation method

An image-space depth-of-field simulation method, applied in the field of image processing; it addresses problems such as artifacts left by manual processing and achieves the effect of reducing image errors.

Inactive Publication Date: 2017-02-15
SHANGHAI JIAO TONG UNIV

AI Technical Summary

Problems solved by technology

However, since the image is processed manually, some edge cases remain problematic, and traces of manual processing are easily introduced.

Method used



Examples


Embodiment

[0020] As shown in Figure 1, an algorithm based on the original scene image and a relatively uniform disparity image is improved and optimized following the algorithm proposed in the article "Real-Time Depth of Field Rendering via Dynamic Light Field Generation and Filtering", synthesizing the best possible depth-of-field effect image.

[0021] The method of this embodiment is realized through the following technical solution, comprising the steps below:

[0022] In the first step, the scene image is processed: OpenCV functions are used to generate a mipmap, and a Gaussian filter with a 5×5 kernel is applied as preprocessing before the mipmap is generated.
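A minimal sketch of this first step. In practice OpenCV's cv2.GaussianBlur and cv2.pyrDown would be used, as the text says; the pure-NumPy version below only illustrates the idea, and its function names and parameters are illustrative, not the patent's code:

```python
import numpy as np

def gaussian_kernel(size=5, sigma=1.0):
    # Normalized 1-D Gaussian; the 5x5 filter is applied separably.
    ax = np.arange(size) - size // 2
    g = np.exp(-(ax ** 2) / (2 * sigma ** 2))
    return g / g.sum()

def gaussian_filter(img, size=5, sigma=1.0):
    # 5x5 Gaussian pre-filter: two 1-D convolutions with edge padding.
    k = gaussian_kernel(size, sigma)
    pad = size // 2
    out = np.pad(img.astype(float), pad, mode="edge")
    out = np.apply_along_axis(lambda r: np.convolve(r, k, mode="valid"), 1, out)
    out = np.apply_along_axis(lambda c: np.convolve(c, k, mode="valid"), 0, out)
    return out

def build_mipmap(img, levels=4):
    # Pre-filter once, then halve resolution by 2x2 averaging per mip level.
    pyramid = [gaussian_filter(img)]
    for _ in range(levels - 1):
        p = pyramid[-1]
        h, w = p.shape[0] // 2 * 2, p.shape[1] // 2 * 2
        pyramid.append(p[:h, :w].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3)))
    return pyramid
```

For a 64×64 grayscale input, build_mipmap yields levels of shape 64×64, 32×32, 16×16, and 8×8.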

[0023] In the second step, warping is performed on the current image: through the mapping function L_in(u, v, s, t), the pixel values of the current reference image are mapped into each R_st.

[0024] In the third step, the closest point is retained by analyzing and comparing the ...
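Steps two and three together amount to forward warping with a z-test: each reference pixel is pushed to the target view, and when several pixels land on the same location only the one with minimum depth survives, after which holes are filled. A minimal NumPy sketch under the assumption of a purely horizontal, depth-inversely-proportional disparity; the function name `forward_warp` and this disparity model are illustrative, not the patent's exact mapping L_in(u, v, s, t):

```python
import numpy as np

def forward_warp(color, depth, baseline):
    """Map reference-view pixels into a target view; on collisions keep
    the closest sample (minimum depth), as in the third step."""
    h, w = depth.shape
    out = np.zeros_like(color, dtype=float)
    zbuf = np.full((h, w), np.inf)  # z-buffer: nearest depth seen per target pixel
    for y in range(h):
        for x in range(w):
            # Assumed disparity model: shift inversely proportional to depth,
            # so nearer points move farther between views.
            tx = x + int(round(baseline / depth[y, x]))
            if 0 <= tx < w and depth[y, x] < zbuf[y, tx]:
                zbuf[y, tx] = depth[y, x]
                out[y, tx] = color[y, x]
    return out, zbuf

def fill_holes(out, zbuf):
    # Naive hole filling: copy the nearest valid pixel from the left.
    filled = out.copy()
    for y in range(out.shape[0]):
        for x in range(1, out.shape[1]):
            if np.isinf(zbuf[y, x]):
                filled[y, x] = filled[y, x - 1]
    return filled
```

When two reference pixels project to the same target column, the z-buffer comparison discards the farther one, which is exactly the "retain the closest point" rule.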



Abstract

The invention relates to an image space-based image field depth simulation method comprising the following steps. In step S1, a scene image at a reference viewing angle is obtained, and MATLAB is used to generate a texture mipmap. In step S2, the texture mipmap is subjected to an affine transformation, and pixels in the reference-view scene image are mapped onto an image at a target viewing angle via a mapping function. In step S3, among the multiple pixels mapped to the same point location in the result of step S2, only the one with minimum depth is kept; holes in the images are filled, and the point-location information of the images is stored. In step S4, a point is set as the focal point, the corresponding light rays are extracted from the images generated for all preset viewing angles, the focus and color information of each point location is determined according to the level function of the texture mipmap, and an image with a depth-of-field effect is obtained after a weighted stacking operation. Compared with prior-art technologies, the method is characterized by high precision and simplicity.
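Step S4's weighted stacking resembles synthetic-aperture refocusing over the preset views: each view is shifted according to its camera offset and the chosen focal plane, then weighted and summed, so points at the focal depth align (sharp) while other depths spread out (blurred). A minimal NumPy sketch; the name `refocus`, the horizontal-offset model, and the focus parameter `alpha` are illustrative assumptions, not the patent's exact formulation:

```python
import numpy as np

def refocus(views, offsets, alpha, weights=None):
    """Weighted shift-and-add of preset-view images. alpha selects the
    focal plane; offsets are per-view camera positions along one axis."""
    n = len(views)
    w = np.full(n, 1.0 / n) if weights is None else np.asarray(weights, float)
    acc = np.zeros_like(np.asarray(views[0], dtype=float))
    for view, off, wi in zip(views, offsets, w):
        shift = int(round(off * alpha))  # realign this view onto the focal plane
        acc += wi * np.roll(np.asarray(view, dtype=float), shift, axis=1)
    return acc
```

With alpha = 0 all views are simply averaged; increasing |alpha| moves the focal plane, so scene content whose disparity matches alpha stays aligned across views while everything else is averaged into blur.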

Description

Technical field

[0001] The invention relates to an image processing method, in particular to an image space-based image depth-of-field simulation method.

Background technique

[0002] Depth of field refers to the range of distance in front of and behind the subject, measured from the camera lens or other imager, within which the image appears acceptably sharp. After focusing, a clear image is formed within a range before and after the focal point; this range of distance is called the depth of field. There is a certain length of space in front of the lens (before and after the focusing point): when the subject lies within this space, its image on the film falls between the two circles of confusion before and behind the focus, and the length of that space is likewise the depth of field. In other words, for subjects within this space, the degree of blur of the image on the negative plane stays within the limits of the permissib...

Claims


Application Information

IPC(8): G06T15/04, G06T15/50
CPC: G06T15/04, G06T15/50
Inventor: 盛斌
Owner: SHANGHAI JIAO TONG UNIV