Real-time registration method for depth maps shot by kinect and video shot by color camera

A color camera and depth map registration technology, applied in image communication, electrical components, stereo systems, etc. It addresses the problems of sensitivity to depth-data noise and low computational efficiency, and achieves the effect of reducing their impact.

Active Publication Date: 2014-02-26
ZHEJIANG UNIV

AI Technical Summary

Problems solved by technology

However, existing methods are easily affected by noise in the depth data and suffer from low computational efficiency.




Embodiment Construction

[0035] First define the notation that will be used in the following description:

[0036] K: the camera intrinsic parameter matrix;

[0037] R: the 3D rotation matrix of the checkerboard coordinate system relative to the camera coordinate system;

[0038] r_3: the third column vector of the 3D rotation matrix of the checkerboard coordinate system relative to the camera coordinate system;

[0039] t: the translation of the checkerboard coordinate system relative to the camera coordinate system;

[0040] q_ij: the product of the depth of the jth checkerboard point and the homogeneous representation of its two-dimensional coordinates in the depth image captured by the kinect in the ith sample;

[0041] M: the mixed parameter obtained by combining the rotation between the camera coordinate system and the kinect coordinate system with the kinect intrinsic parameters;

[0042] T: the translation between the camera coordinate system and the kinect coordinate system.

[00...
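With these definitions, mapping the kinect depth map onto the color video amounts to lifting each depth pixel by its depth value and transforming it into the color image through the mixed parameter and the translation. The sketch below illustrates that mapping under the model implied by the definitions above (q = d·[u, v, 1]^T, a point in the camera frame obtained as M·q + T, then projected with the camera intrinsics K); the function and variable names, and all numeric values, are illustrative and not taken from the patent.

```python
# Minimal sketch (not the patent's code): map one kinect depth pixel into the
# color image using the mixed parameter M, translation T, and camera intrinsics K.
# Assumed model: q = d * [u, v, 1]^T,  X = M @ q + T,  p ~ K @ X.
import numpy as np

def depth_pixel_to_color_pixel(u, v, d, M, T, K):
    """Map a kinect depth pixel (u, v) with depth d to color-image coordinates."""
    q = d * np.array([u, v, 1.0])   # homogeneous pixel coordinates scaled by depth
    X = M @ q + T                   # 3D point expressed in the color-camera frame
    p = K @ X                       # project with the color-camera intrinsics
    return p[:2] / p[2]             # dehomogenize to 2D pixel coordinates

# Example usage with placeholder calibration values (purely illustrative):
K = np.array([[525.0, 0.0, 320.0],
              [0.0, 525.0, 240.0],
              [0.0,   0.0,   1.0]])
M = np.eye(3) / 570.0               # stand-in mixed parameter
T = np.array([0.025, 0.0, 0.0])     # stand-in translation (meters)
print(depth_pixel_to_color_pixel(310, 245, 1.2, M, T, K))
```

Because the mapping is a single matrix multiply and add per pixel, it can be applied to every pixel of each incoming depth frame in real time.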



Abstract

The invention provides a stable and fast method for registering, in real time, depth maps shot by a kinect with video shot by a color camera. The method removes the step of estimating the internal parameters of the depth camera, which reduces the number of parameters to be solved and improves the stability of the algorithm. A linear optimization framework is used for the solution, so the globally optimal solution is obtained in a single step and the computational efficiency of the algorithm is greatly improved. Although the depth-camera intrinsic estimation step of traditional algorithms is removed, the efficiency of mapping depth information onto the video is not affected, and the depth information captured by the kinect can still be mapped onto the video in real time. Moreover, because the defined mixed parameters have good mathematical properties, the internal parameters of the depth camera can still be recovered afterwards with a matrix QR decomposition and used by other algorithms.
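The abstract states that the depth-camera intrinsics, although never estimated directly, can be recovered from the mixed parameters by a QR-type matrix decomposition. The sketch below shows one way this can work under an assumed convention that is not spelled out in this listing: if the mixed parameter has the form M = R · inv(K_d), with R the rotation between the two coordinate systems and K_d the upper-triangular kinect intrinsics, then inv(M) = K_d · R^T factors into an upper-triangular matrix times an orthogonal matrix (an RQ decomposition). Names and values are illustrative.

```python
# Minimal sketch (assumed convention, not the patent's exact formulation):
# recover the kinect intrinsics K_d from the mixed parameter M = R @ inv(K_d).
import numpy as np
from scipy.linalg import rq

def recover_depth_intrinsics(M):
    """Factor inv(M) = K_d @ R.T into upper-triangular K_d and orthogonal R.T."""
    A = np.linalg.inv(M)
    K_d, Q = rq(A)                            # RQ decomposition: A = K_d @ Q
    S = np.diag(np.sign(np.diag(K_d)))        # fix signs so diag(K_d) > 0
    K_d, Q = K_d @ S, S @ Q
    return K_d / K_d[2, 2], Q.T               # normalized intrinsics, rotation R

# Round-trip check with made-up values (purely illustrative):
K_true = np.array([[580.0, 0.0, 314.0],
                   [0.0, 580.0, 252.0],
                   [0.0,   0.0,   1.0]])
R_true, _ = np.linalg.qr(np.random.randn(3, 3))   # a random orthogonal matrix
M = R_true @ np.linalg.inv(K_true)
K_rec, R_rec = recover_depth_intrinsics(M)
print(np.allclose(K_rec, K_true), np.allclose(R_rec, R_true))
```

The linear optimization framework mentioned in the abstract presumably stacks constraints that are linear in the entries of M and T, one per checkerboard point and sample, so the whole system can be solved in closed form by least squares; that is what makes a globally optimal solution attainable in a single step.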

Description

technical field

[0001] The invention relates to a real-time registration method for depth maps captured by a kinect and video captured by a color camera.

Background technique

[0002] With improvements in data storage and transmission technology and advances in camera lens technology, high-definition cameras and high-definition network cameras have gradually entered people's field of vision. With these devices, people can easily obtain high-quality video material and communicate with others over high-definition video. In recent years, however, with the development of augmented reality and stereoscopic display technology, traditional two-dimensional high-definition video can no longer meet people's needs: highly realistic interaction with computer-synthesized virtual objects during communication, for example, relies on depth video generation technology.

[0003] Depth generation for video is a classic problem in the field of computer vision. I...


Application Information

Patent Type & Authority: Application (China)
IPC(8): H04N13/00
Inventors: 童若锋, 琚旋, 成可立
Owner: ZHEJIANG UNIV