
Information fusion method of Kinect depth camera and thermal infrared camera

A technology relating to depth cameras and fusion methods, applied in image data processing, instrumentation, computing, etc. It addresses the problems of poor imaging quality, unsatisfactory reconstruction results, and limited calibration-parameter accuracy, and achieves the effect of optimized intrinsic and extrinsic (internal and external) parameters.

Active Publication Date: 2018-08-24
SOUTHWEST UNIV OF SCI & TECH +1

AI Technical Summary

Problems solved by technology

However, owing to the limitations of the Kinect imaging principle, Kinect produces poor image quality in rain, fog, and low-light conditions; in addition, the factory calibration parameters of Kinect's depth camera and RGB camera have limited accuracy, which leads to unsatisfactory reconstruction results. In traditional machine learning and deep learning, texture information and three-dimensional information alone can hardly meet the requirements for input information. It is therefore necessary to provide more reliable conditions as input information during machine learning and deep learning in order to improve the recognition rate of the system.



Examples


Embodiment Construction

[0027] Specific embodiments of the present invention are described below so that those skilled in the art can understand the invention, but it should be clear that the invention is not limited to the scope of these specific embodiments. For those of ordinary skill in the art, various changes are obvious as long as they remain within the spirit and scope of the invention as defined and determined by the appended claims, and all inventions and creations that make use of the inventive concept fall within the scope of protection.

[0028] Referring to figure 1, which shows the flow chart of the information fusion method for the Kinect depth camera and the thermal infrared camera 5; as shown in figure 1, the method 100 includes steps 101 to 107.

[0029] In step 101, a calibration plate 1 with a flat surface and several circular pieces of heating material is selected.

[0030] During implementation, the material of the calibration plate 1 is preferably heat insulating mater...
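The excerpt above breaks off here, but the role of the heated circular markers is to stay visible to all of the sensors: after heating they appear as bright blobs in the thermal image, while remaining a printed circle pattern for the near-infrared and RGB cameras. The following is a minimal, hypothetical sketch of how such heated circle centers could be located with OpenCV; the grid size, file name, and blob-detector settings are assumptions made for illustration and are not taken from the patent.

# Hypothetical sketch (not from the patent): locating the heated circular markers
# of calibration plate 1 in a thermal or near-infrared image with OpenCV.
# The grid size, file name, and blob settings are assumed for illustration.
import cv2

PATTERN_SIZE = (4, 5)  # assumed layout of the circular markers (columns, rows)

img = cv2.imread("thermal_view.png", cv2.IMREAD_GRAYSCALE)

# After heating, the circles appear as bright blobs on a cooler background,
# so the blob detector is configured to look for light regions.
params = cv2.SimpleBlobDetector_Params()
params.filterByColor = True
params.blobColor = 255
detector = cv2.SimpleBlobDetector_create(params)

found, centers = cv2.findCirclesGrid(
    img, PATTERN_SIZE, flags=cv2.CALIB_CB_SYMMETRIC_GRID, blobDetector=detector)
if found:
    # Sub-pixel centers of the heated circles: these become the 2D image points
    # used when calibrating each camera pair against the RGB camera.
    print(centers.reshape(-1, 2))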



Abstract

The invention discloses an information fusion method for a Kinect depth camera and a thermal infrared camera. The information fusion method comprises the steps of: selecting a calibration plate; acquiring images of the calibration plate from different angles with the near-infrared short-wave camera and the RGB camera of the Kinect depth camera; heating the calibration plate and acquiring images of the calibration plate from different angles with the thermal infrared camera and the RGB camera of the Kinect depth camera; acquiring the extrinsic parameters of the near-infrared short-wave camera, the RGB camera, and the thermal infrared camera by means of the pinhole imaging model and the binocular (stereo) calibration principle; acquiring the geometric relation between the near-infrared short-wave camera and the RGB camera; acquiring the geometric relation between the thermal infrared camera and the RGB camera; and, according to these two geometric relations, performing information fusion on the thermal infrared camera, the near-infrared short-wave camera, and the RGB camera.
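The abstract describes calibrating each camera pair against the RGB camera and then chaining those geometric relations so that thermal data can be fused with the depth and texture data. The sketch below illustrates only that chaining step under assumed, illustrative intrinsics and extrinsics (every matrix is a placeholder, not a value from the patent): a depth pixel is back-projected through the pinhole model, transformed into the thermal camera frame via the two RGB-referenced extrinsics, and reprojected into the thermal image.

import numpy as np

# Assumed pinhole intrinsics and RGB-referenced extrinsics; every value below is
# an illustrative placeholder, not a number from the patent.
K_depth   = np.array([[365.0, 0.0, 256.0], [0.0, 365.0, 212.0], [0.0, 0.0, 1.0]])
K_thermal = np.array([[420.0, 0.0, 320.0], [0.0, 420.0, 240.0], [0.0, 0.0, 1.0]])
R_rgb_to_depth,   t_rgb_to_depth   = np.eye(3), np.array([[-25.0], [0.0], [0.0]])  # mm
R_rgb_to_thermal, t_rgb_to_thermal = np.eye(3), np.array([[40.0], [0.0], [0.0]])   # mm

# Chain the two RGB-referenced relations into a depth-to-thermal transform:
# X_thermal = R @ X_depth + t, with R = R_rt @ R_rd.T and t = t_rt - R @ t_rd.
R = R_rgb_to_thermal @ R_rgb_to_depth.T
t = t_rgb_to_thermal - R @ t_rgb_to_depth

def depth_pixel_to_thermal(u, v, depth_mm):
    """Back-project a depth pixel and project the 3D point into the thermal image."""
    ray = np.linalg.inv(K_depth) @ np.array([[u], [v], [1.0]])
    point_depth = ray * depth_mm            # 3D point in depth-camera coordinates
    point_thermal = R @ point_depth + t     # same point in thermal-camera coordinates
    uvw = K_thermal @ point_thermal
    return uvw[0, 0] / uvw[2, 0], uvw[1, 0] / uvw[2, 0]

# Example: map the centre pixel of the depth image at 1.5 m into the thermal image.
print(depth_pixel_to_thermal(256, 212, 1500.0))

In practice the two extrinsic pairs would come from a stereo calibration routine such as OpenCV's cv2.stereoCalibrate applied to the image sets collected in the preceding steps; the sketch only shows how the resulting relations are combined for fusion.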

Description

Technical Field

[0001] The invention relates to the field of structured-light multi-sensor information fusion, in particular to an information fusion method for a Kinect depth camera and a thermal infrared camera.

Background Technique

[0002] Compared with traditional depth-information acquisition methods, Kinect is portable and inexpensive. Kinect can be used to directly obtain the depth information of indoor scenes and target areas, and, combined with RGB information, can reconstruct the real 3D information of the scene and the target area. However, owing to the limitations of the Kinect imaging principle, Kinect produces poor image quality in rain, fog, and low-light conditions; in addition, the factory calibration parameters of Kinect's depth camera and RGB camera have limited accuracy, which leads to unsatisfactory reconstruction results. In traditional machine learning and deep learning, texture information and three-dimensional in...


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06T7/80
CPC: G06T7/80
Inventors: 刘桂华, 包川, 张华, 徐锋, 龙惠民
Owner: SOUTHWEST UNIV OF SCI & TECH