
3D (three-dimensional) visualization method for coverage range based on quick estimation of attitude of camera

A coverage and camera technology, applied in the field of rapid camera attitude estimation and 3D visualization of coverage, that addresses the problems of low precision, calibration objects that cannot be placed on site, and the resulting inapplicability of traditional methods, thereby helping to avoid coverage dead spots.

Active Publication Date: 2013-11-20
HUAZHONG NORMAL UNIV
Cites: 3 · Cited by: 57

AI Technical Summary

Problems solved by technology

Existing calibration methods rely on precise calibration objects, or require complex attitude-control operations on the camera, to compute the camera parameters.
In many applications such calibration objects cannot be placed on site, and attitude-control operations cannot be performed on a fixed bullet camera, so the traditional methods cannot be applied.
[0005] 2. The coverage calculation is not sufficiently accurate or intuitive.
Calibration methods based on active vision require the camera to be driven through special, known motions; the algorithm is simple, but it cannot be applied when the camera's motion is unknown or uncontrollable.
Camera self-calibration methods solve equations using only constraints on the camera's intrinsic parameters; the calibration is independent of the scene and of camera motion and is flexible to apply, but its accuracy is low.



Examples


Embodiment

[0055] The embodiment comprises the following steps:

[0056] Step 1: Perform 3D scene modeling to obtain the scene model. This mainly includes reconstructing and enhancing the 3D scene model and fusing real-time video with the 3D model, so as to reconstruct the real scene accurately. Real-time video sequence frames are processed to obtain the scene's texture, illumination, depth, and geometric information; this information is used to recover 3D geometric and motion information and to complete reconstruction of the 3D scene model. The 3D scene model is then further enhanced, mainly to resolve geometric consistency, illumination consistency, and occlusion consistency between the video and the model.
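
To make the geometry-recovery part of Step 1 concrete, here is a minimal Python/OpenCV sketch of recovering relative camera motion and sparse 3D structure from two video frames via feature matching, the essential matrix, and triangulation. The patent does not prescribe a specific reconstruction pipeline, so the function name, the intrinsic matrix K, and the tuning values below are illustrative assumptions.

```python
# A minimal two-frame geometry-recovery sketch (not the patent's exact pipeline).
import cv2
import numpy as np

def sparse_structure_from_two_frames(frame_a, frame_b, K):
    """Recover relative camera motion and sparse 3D points from two frames."""
    orb = cv2.ORB_create(2000)
    kp_a, des_a = orb.detectAndCompute(frame_a, None)
    kp_b, des_b = orb.detectAndCompute(frame_b, None)

    # Match descriptors and keep the strongest correspondences.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_a, des_b), key=lambda m: m.distance)[:500]
    pts_a = np.float32([kp_a[m.queryIdx].pt for m in matches])
    pts_b = np.float32([kp_b[m.trainIdx].pt for m in matches])

    # Essential matrix and relative pose (R, t), with RANSAC rejecting outliers.
    E, mask = cv2.findEssentialMat(pts_a, pts_b, K, method=cv2.RANSAC, threshold=1.0)
    _, R, t, mask = cv2.recoverPose(E, pts_a, pts_b, K, mask=mask)

    # Triangulate inlier correspondences into sparse 3D scene points.
    P_a = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
    P_b = K @ np.hstack([R, t])
    inl = mask.ravel() > 0
    pts4d = cv2.triangulatePoints(P_a, P_b, pts_a[inl].T, pts_b[inl].T)
    return R, t, (pts4d[:3] / pts4d[3]).T   # Nx3 points, up to scale
```

Run over successive frame pairs (together with dense depth estimation and texturing), this kind of output supplies the geometric and motion information the step describes.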

[0057] Step 2: According to the 3D scene model and the camera's installation position, register the camera into the 3D scene model, then use 2D-3D point-pair correspondences to compute the camera pose and optimize the camera parameters to...
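
The 2D-3D point-pair computation in Step 2 is the classical Perspective-n-Point (PnP) problem. Below is a minimal sketch using OpenCV's standard PnP solvers; the helper name, the RANSAC reprojection threshold, and the Levenberg-Marquardt refinement (standing in for the "optimize the camera parameters" clause) are assumptions rather than the patent's prescribed procedure.

```python
# A minimal PnP pose-estimation sketch from 3D-2D point pairs.
import cv2
import numpy as np

def estimate_camera_pose(object_points, image_points, K, dist_coeffs=None):
    """Compute camera rotation/translation from 3D-2D correspondences (PnP)."""
    if dist_coeffs is None:
        dist_coeffs = np.zeros(5)  # assumption: negligible lens distortion

    # RANSAC-based PnP tolerates a few mismatched point pairs.
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(
        np.float32(object_points), np.float32(image_points), K, dist_coeffs,
        flags=cv2.SOLVEPNP_ITERATIVE, reprojectionError=3.0)
    if not ok:
        raise RuntimeError("pose estimation failed")

    # Nonlinear refinement on the inliers, playing the role of the
    # parameter-optimization step mentioned in the text.
    rvec, tvec = cv2.solvePnPRefineLM(
        np.float32(object_points)[inliers.ravel()],
        np.float32(image_points)[inliers.ravel()], K, dist_coeffs, rvec, tvec)

    R, _ = cv2.Rodrigues(rvec)  # rotation vector -> 3x3 rotation matrix
    return R, tvec
```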



Abstract

The invention provides a 3D (three-dimensional) visualization method for a camera's coverage range based on quick estimation of the camera's attitude, comprising the following steps: performing 3D scene modeling to obtain a 3D scene model and enhancing that model, with a depth image restored from each frame of a video during the modeling; registering the camera into the 3D scene model according to the model and the camera's installation position, estimating the camera parameters from the camera's attitude, and determining the projection mapping relation between the video and the 3D scene model; and displaying the coverage in the 3D scene model according to the depth shadow texture principle, interactively picking a target in the 3D scene model, and, according to the projection mapping, calculating the imaging area of the target under the current camera parameters.
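
The "depth shadow texture principle" mentioned in the abstract corresponds to the standard shadow-mapping visibility test: a scene point is covered by the camera only if it projects inside the image and its depth agrees with the camera's depth map at that pixel. A minimal numpy sketch of that test follows; the function name, tolerance eps, and inputs are hypothetical.

```python
# Shadow-map style coverage test for 3D scene points (illustrative sketch).
import numpy as np

def coverage_mask(points_world, K, R, t, depth_map, eps=0.05):
    """Boolean mask: True where a 3D world point is seen (covered) by the camera."""
    h, w = depth_map.shape
    cam = R @ points_world.T + t.reshape(3, 1)   # world -> camera frame
    z = cam[2]
    in_front = z > 1e-6

    # Perspective projection to pixel coordinates (only for points in front).
    uv = np.zeros((2, cam.shape[1]))
    uv[:, in_front] = (K @ cam)[:2, in_front] / z[in_front]
    u = np.round(uv[0]).astype(int)
    v = np.round(uv[1]).astype(int)

    mask = in_front & (u >= 0) & (u < w) & (v >= 0) & (v < h)
    idx = np.flatnonzero(mask)
    # Shadow-map test: visible only if the point's depth agrees (within a
    # relative tolerance) with the depth the camera records at that pixel.
    mask[idx] = np.abs(depth_map[v[idx], u[idx]] - z[idx]) < eps * z[idx]
    return mask
```

Coloring the scene model by this mask yields the intuitive coverage display, and the masked points of an interactively picked target give its imaging area under the current camera parameters.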

Description

Technical field

[0001] The invention relates to the technical fields of computer vision and augmented reality, and in particular to a method for quickly estimating camera pose and visualizing coverage in 3D.

Background technique

[0002] With the continuous development of computer graphics and computer vision, it has become practical to use computer technology to simulate the real world efficiently, realistically, and interactively. Augmented reality technology is used ever more widely in the real world, and its role is becoming increasingly important. The rapid growth in the number of cameras not only brings massive volumes of video data, but also imposes higher requirements on scientific and efficient resource management.

[0003] Existing camera pose estimation mostly adopts camera calibration and feature tracking. Camera calibration is widely used with desktop cameras, in robotics, and in industrial control. However, in the security field based on the augmented ...


Application Information

IPC(8): G06T15/00; H04N13/00
Inventors: 赵刚 (Zhao Gang), 何彬 (He Bin), 李洋洋 (Li Yangyang), 陈凌云 (Chen Lingyun), 徐忠成 (Xu Zhongcheng), 潘瑞雪 (Pan Ruixue)
Owner: HUAZHONG NORMAL UNIV