
Method and device for obtaining depth camera reference diagram

A method and device for acquiring depth camera reference images, applied in the field of depth camera reference image acquisition. The technology solves the problems of laser secondary-scattering noise and increased equipment cost, achieving the effects of reducing equipment cost and avoiding laser secondary-scattering noise.

Active Publication Date: 2016-02-03
LENOVO (BEIJING) LTD

AI Technical Summary

Problems solved by technology

[0004] However, the above method suffers from laser secondary-scattering noise: any physically flat surface necessarily has some roughness, so the camera also captures light scattered by the surface, known as "secondary-scattering speckle".
At present, this secondary speckle can be suppressed by horizontally translating the shooting plane, but doing so requires adding guide rails and camera fixtures, which increases equipment cost.



Examples


Embodiment 1

[0059] An embodiment of the present invention provides a method for acquiring a reference image of a depth camera. The method is applied to the depth camera shown in Figure 1a.

[0060] Referring to Figure 2, the method includes:

[0061] Step 101: Form a speckle pattern on a reference plane by using a laser emitting device and a diffractive optical element.

[0062] Step 102: Photograph the speckle pattern with a camera device to obtain a first reference image.

[0063] Step 103: Determine the mapping relationship between the spots in the first reference image and the white points in the pattern of the diffractive optical element.

[0064] Step 104: Simulate and synthesize the depth camera reference image according to the mapping relationship and the pattern of the diffractive optical element.

[0065] In the embodiment of the present invention, the speckle pattern is photographed and collected by the camera device as the first reference image, and the mapping relationsh...
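The four steps of Embodiment 1 can be sketched in code. This is a minimal illustration, not the patent's implementation: it assumes spot centroids have already been detected in the first reference image (Step 102), models the Step 103 mapping as an affine transform fitted by least squares, and renders each mapped DOE white point as a single pixel in Step 104. All function names and the affine model are assumptions.

```python
import numpy as np

def fit_mapping(doe_points, image_spots):
    """Least-squares affine map from DOE white points to image spots (Step 103)."""
    A = np.hstack([doe_points, np.ones((len(doe_points), 1))])  # rows [x, y, 1]
    params, *_ = np.linalg.lstsq(A, image_spots, rcond=None)    # 3x2 affine matrix
    return params

def synthesize_reference(doe_points, params, shape):
    """Render an ideal, noise-free reference image by projecting DOE points (Step 104)."""
    img = np.zeros(shape, dtype=np.uint8)
    A = np.hstack([doe_points, np.ones((len(doe_points), 1))])
    for x, y in A @ params:                     # mapped spot positions
        xi, yi = int(round(x)), int(round(y))
        if 0 <= yi < shape[0] and 0 <= xi < shape[1]:
            img[yi, xi] = 255                   # one bright pixel per white point
    return img

# Toy demonstration: a 3x3 DOE grid imaged with a known scale and offset.
doe = np.array([[i, j] for j in range(3) for i in range(3)], float)
spots = doe * 10 + 5                # pretend these centroids were detected in Step 102
M = fit_mapping(doe, spots)
ref = synthesize_reference(doe, M, (40, 40))
print(int(ref.sum() // 255))        # number of synthesized spots: 9
```

Because the output is rendered from the DOE pattern rather than re-photographed, no surface scattering enters the synthesized reference image, which is the effect the embodiment claims.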

Embodiment 2

[0067] An embodiment of the present invention provides a method for acquiring a reference image of a depth camera, applied to the depth camera shown in Figure 1a. Referring to Figure 3, the method includes:

[0068] Step 201: Form a speckle pattern on a reference plane by using a laser emitting device and a diffractive optical element.

[0069] Figure 4 shows the pattern of the diffractive optical element in this embodiment. As can be seen from Figure 4, the white points are uniform in area and regularly arranged.

[0070] Step 202: Photograph the speckle pattern with a camera device to obtain a first reference image.

[0071] Figure 5 shows the first reference image obtained by photographing the speckle pattern in this embodiment. The spots in the first reference image can be considered to correspond one-to-one with the white points in the pattern of the diffractive optical element, establishing a mapping relationship. From Figure 5 ...
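The one-to-one correspondence described above can be established, for example, by nearest-neighbour matching once the detected spots are expressed in the DOE pattern's coordinate frame. This is a hypothetical sketch (greedy assignment, no outlier handling), not the patent's actual procedure:

```python
import numpy as np

def match_spots(doe_points, detected_spots):
    """Greedily pair each DOE white point with its nearest unmatched spot.

    Assumes spots are already roughly aligned to the DOE coordinate frame and
    that every white point produced exactly one spot (as in Figure 5).
    Returns {doe_index: spot_index}.
    """
    mapping, free = {}, list(range(len(detected_spots)))
    for i, p in enumerate(doe_points):
        dists = np.linalg.norm(detected_spots[free] - p, axis=1)
        mapping[i] = free.pop(int(np.argmin(dists)))   # claim nearest free spot
    return mapping

# Toy check: a shuffled, slightly perturbed copy of a 3x3 grid.
doe = np.array([[i, j] for j in range(3) for i in range(3)], float)
perm = [4, 0, 8, 2, 6, 1, 7, 3, 5]
detected = doe[perm] + 0.05          # detection noise well below the grid spacing
mapping = match_spots(doe, detected)
print(all(perm[j] == i for i, j in mapping.items()))  # True: permutation recovered
```

Greedy matching is adequate here only because the perturbation is far smaller than the spacing between white points; a robust implementation would use a global assignment.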

Embodiment 3

[0096] An embodiment of the present invention provides a device for obtaining a depth camera reference image, which can be applied to the depth camera shown in Figure 1a. Referring to Figure 9, the device includes a first reference image acquisition module 301, a determination module 302, and a depth camera reference image acquisition module 303.

[0097] The first reference image acquisition module 301 is configured to acquire a first reference image obtained by photographing, with the camera device, the speckle pattern formed on the reference plane;

[0098] The determination module 302 is configured to determine the mapping relationship between the spots in the first reference image and the white points in the pattern of the diffractive optical element;

[0099] The depth camera reference image acquisition module 303 is configured to simulate and synthesize the depth camera reference image according to the mapping relationship and the pattern of the diffractive optical element....
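The three modules above form a simple pipeline: module 301 feeds module 302, whose mapping feeds module 303. A hypothetical composition in code (class and callable names are assumptions, not from the patent):

```python
class ReferenceImagePipeline:
    """Wires together the three modules described in paragraphs [0097]-[0099]."""

    def __init__(self, acquire_first_ref, determine_mapping, synthesize_ref):
        self.acquire_first_ref = acquire_first_ref    # module 301
        self.determine_mapping = determine_mapping    # module 302
        self.synthesize_ref = synthesize_ref          # module 303

    def run(self, doe_pattern):
        first_ref = self.acquire_first_ref()
        mapping = self.determine_mapping(first_ref, doe_pattern)
        return self.synthesize_ref(mapping, doe_pattern)

# Stub usage showing only the data flow; real modules would do the image work.
pipeline = ReferenceImagePipeline(
    acquire_first_ref=lambda: "first_reference_image",
    determine_mapping=lambda ref, doe: {"ref": ref, "doe": doe},
    synthesize_ref=lambda mapping, doe: ("reference_image", mapping),
)
result = pipeline.run("doe_pattern")
print(result[0])  # reference_image
```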



Abstract

The invention discloses a method for obtaining a depth camera reference image, belonging to the technical field of image processing. The method comprises the following steps: forming a speckle pattern on a reference plane using a laser emitting device and a diffractive optical element; photographing the speckle pattern with a camera device to obtain a first reference image; determining a mapping relationship between the spots in the first reference image and the white points in the pattern of the diffractive optical element; and simulating and synthesizing the depth camera reference image according to the mapping relationship and the pattern of the diffractive optical element. Because the reference image is synthesized from the photographed first reference image and this mapping relationship rather than photographed directly, laser secondary-scattering noise is effectively avoided; furthermore, no additional external equipment is needed, so equipment cost is reduced. The invention further discloses a device for obtaining the depth camera reference image.

Description

Technical field

[0001] The present invention relates to the technical field of image processing, and in particular to a method and device for acquiring a depth camera reference image.

Background technique

[0002] Image processing technology processes images to meet human psychological, visual, or other application requirements. With the development of science and technology and growing demand, the application range of image processing technology is becoming ever wider.

[0003] Depth cameras are gradually emerging due to their ability to measure distance in 3D images. One key technology of existing depth cameras is obtaining the reference image: from the position of each speckle in the reference image, the spatial coordinates of other points in the actual space are calculated. In the current method for obtaining the reference image, the laser projector emits tens of thousands...
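For context, the depth computation mentioned in [0003] typically follows the standard structured-light triangulation relation: depth is recovered from the pixel disparity between a live speckle and its position in the reference image. A sketch of that textbook relation (the sign convention and the example calibration values are assumptions, not taken from the patent):

```python
def depth_from_disparity(disparity_px, focal_px, baseline_m, ref_depth_m):
    """Relative triangulation against a reference plane at ref_depth_m.

    Uses 1/Z = 1/Z0 + d / (f * b); with this sign convention a positive
    disparity means the point lies nearer than the reference plane.
    """
    return 1.0 / (1.0 / ref_depth_m + disparity_px / (focal_px * baseline_m))

# Zero disparity: the point lies exactly on the reference plane.
print(depth_from_disparity(0.0, 580.0, 0.075, 1.0))  # 1.0
```

This is why the quality of the reference image matters: any noise spot in it (such as secondary-scattering speckle) corrupts the disparity measurement and hence the computed depth.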

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): H04N13/02
Inventor: 王琳
Owner: LENOVO (BEIJING) LTD