Virtual-reality fusion fuzzy consistency processing method based on line spread function standard deviation

A technology combining the line spread function with virtual-real fusion, applied in the field of virtual-real fusion blur-consistency processing, which addresses the problems of defocus blur and inaccurate camera focusing.

Active Publication Date: 2018-11-16
Assignee: CHANGCHUN UNIV OF SCI & TECH

AI Technical Summary

Problems solved by technology

Defocus blur is generally caused by inaccurate camera focusing or by the subject lying outside the focal plane.



Examples


Embodiment Construction

[0031] The present invention is further described below in conjunction with the accompanying drawings and an embodiment. As shown in figure 1, in this virtual-real fusion blur-consistency processing method based on the standard deviation of the line spread function (LSF), camera 2 is connected to computer 1 through a cable, and real scene 3 contains the Hiro square black-and-white marker card 4 from ARToolkit. The method is characterized in that the specific steps are as follows:

[0032] Step 1: Use camera 2 to photograph real scene 3, obtaining the real-scene image, denoted I1.

[0033] Step 2: Use MATLAB's rgb2gray function to convert I1 to grayscale, obtaining the real-scene grayscale image I2, and then, according to the formula

[0034] Gx(f(x, y)) = (f(x+1, y) - f(x-1, y)) / 2

[0035] compute the horizontal gradient Gx of I2, where (x, y) denotes a pixel of image I2, f is the gray value of I2 at that pixel, and f(x+1, y) - f(x-1, y) is the gray-value difference between the two pixels adjacent to (x, y) along the x direction...
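
A minimal MATLAB sketch of Steps 1 and 2 follows. It assumes the captured frame has been saved as real_scene.png (a hypothetical filename; in the patent the frame comes directly from camera 2) and treats x as the horizontal (column) index:

    % Step 1: load the real-scene image I1 (hypothetical filename).
    I1 = imread('real_scene.png');

    % Step 2: grayscale conversion, then the horizontal gradient Gx via
    % the central difference Gx(f(x,y)) = (f(x+1,y) - f(x-1,y)) / 2.
    I2 = rgb2gray(I1);
    f  = double(I2);

    Gx = zeros(size(f));                                 % border columns left at 0
    Gx(:, 2:end-1) = (f(:, 3:end) - f(:, 1:end-2)) / 2;  % difference along x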



Abstract

The invention relates to a virtual-real fusion blur-consistency processing method based on the standard deviation of the line spread function (LSF). A camera is connected to a computer by a cable, and an Hiro square black-and-white marker card from ARToolkit is placed in the real scene. The method is characterized in that a real-object region located at a depth similar or identical to that of the virtual object in the fusion scene is obtained; the LSF standard deviation of that region's edge is estimated; blur processing is applied to the virtual object by combining the standard deviation with an image degradation model; and a fusion scene with blur consistency between virtual and real objects is then generated.
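
To illustrate the final stage of this pipeline, here is a hedged MATLAB sketch that blurs the rendered virtual object with a Gaussian point spread function whose standard deviation is the estimated LSF standard deviation. The filename virtual_object.png, the value of sigmaLSF, and the choice of a Gaussian PSF as the degradation model are illustrative assumptions, not details confirmed by the patent:

    % Assumed inputs: the rendered virtual-object image and the LSF
    % standard deviation estimated from the edge of the real region.
    virtualObj = im2double(imread('virtual_object.png'));  % hypothetical file
    sigmaLSF   = 1.8;                                      % example value only

    % Gaussian PSF as the image degradation model (one common choice),
    % applied to the virtual object so its blur matches the real region.
    psf     = fspecial('gaussian', 2*ceil(3*sigmaLSF) + 1, sigmaLSF);
    blurred = imfilter(virtualObj, psf, 'replicate');
    % 'blurred' would then be composited into the real-scene frame to
    % obtain a fusion image with consistent virtual/real blur.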

Description

Technical field

[0001] The invention relates to a virtual-real fusion blur-consistency processing method based on the line spread function (LSF) standard deviation, and belongs to the technical field of computer vision.

Background technique

[0002] In an augmented reality (AR) system, when the camera captures the scene image, blur distortion arises from the external environment and human factors. Virtual objects, however, always remain sharp in the fused scene, so an obvious splicing artifact appears when they are composited with the naturally rendered real scene; this robs the scene of immersion and can even leave users' eyes uncomfortable and fatigued. Therefore, adding a depth-of-field effect to virtual objects in the AR system, so that they show the same blur as the real objects in the scene, improves the realism and immersion of the system and enhances the user's depth perception in the fusion scene...


Application Information

IPC(8): G06T5/00; G06T7/13; G06T7/136; G06T7/50; G06T7/90
CPC: G06T7/13; G06T7/136; G06T7/50; G06T7/90; G06T5/00
Inventor: 韩成, 张超, 白利娟, 李华, 杨帆, 胡汉平, 权巍, 薛耀红, 徐超
Owner: CHANGCHUN UNIV OF SCI & TECH