
A Visual Tracking and Localization Method Based on Dense Point Cloud and Synthetic View

A visual tracking and positioning method, applied in the field of information technology, that achieves rapid initialization and positioning

Active Publication Date: 2021-08-06
BEIJING INSTITUTE OF TECHNOLOGY

AI Technical Summary

Problems solved by technology

[0004] The purpose of the present invention is to overcome the defects of the prior art by solving the problem of associating lidar three-dimensional scanning data with image data, so that the scan data can be used for visual tracking and positioning, and to propose a visual tracking and positioning method based on dense point clouds and synthetic views.




Embodiment Construction

[0024] To make the purpose, technical solutions, and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments are described below completely and in detail, with reference to the accompanying drawings.

[0025] The design idea of the present invention is as follows: transform the three-dimensional point cloud obtained by the lidar through spatial back-projection to generate a synthetic image under a known viewpoint, then match the synthetic image against the real-time image acquired by the camera to estimate the camera's real-time six-degree-of-freedom (6-DoF) pose. This method can be used for tracking, navigation, or auxiliary positioning in fields such as robotics, unmanned vehicles, unmanned aerial vehicles, virtual reality, and augmented reality.
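The projection step described above can be sketched with a standard pinhole camera model: given a known viewpoint (rotation and translation) and camera intrinsics, each 3D point in the lidar point-cloud frame maps to a pixel in the synthetic view. The function name and the toy intrinsics below are illustrative assumptions, not the patent's actual implementation:

```python
import numpy as np

def project_points(points_w, K, R, t):
    """Project world-frame 3D points into an image with a pinhole model.

    points_w : (N, 3) points in the point-cloud (world) frame
    K        : (3, 3) camera intrinsic matrix
    R, t     : rotation (3, 3) and translation (3,) of the world-to-camera transform
    Returns (N, 2) pixel coordinates and (N,) camera-frame depths.
    """
    p_cam = points_w @ R.T + t          # transform into the camera frame
    depths = p_cam[:, 2]
    p_img = p_cam @ K.T                 # apply intrinsics
    uv = p_img[:, :2] / p_img[:, 2:3]   # perspective divide
    return uv, depths

# Toy example: one point 5 m straight ahead of a camera at the origin
K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0, 0.0, 1.0]])
R = np.eye(3)
t = np.zeros(3)
uv, d = project_points(np.array([[0.0, 0.0, 5.0]]), K, R, t)
print(uv[0], d[0])  # a point on the optical axis lands at the principal point
```

Rendering every point of the dense cloud this way (with depth buffering to resolve occlusions) yields the synthetic image for a candidate viewpoint; estimating the live camera's pose from 2D-3D matches is then a standard PnP problem.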

[0026] The visual tracking and positioning method of the present invention, based on dense point clouds and s...



Abstract

The present invention provides a visual tracking and positioning method based on dense point clouds and synthetic views. The process is as follows: three-dimensionally scan the real scene to obtain color key-frame images and corresponding depth images, repair the key-frame images, and compute an image code for each; compute an image code for the current frame acquired by the camera in real time, and select the synthetic image whose code is closest to that of the current frame as the current frame's reference image; obtain a stable set of matched feature points across the two images and process them to obtain the six-degree-of-freedom pose of the current-frame camera relative to the 3D scanning point-cloud coordinate system; finally, validate the result with an optical-flow algorithm, and if the requirements are not met, update the current frame to the next frame acquired by the camera and match again. The invention solves the problem of associating the three-dimensional point cloud acquired by the lidar with heterogeneous visual images, and achieves rapid initialization and positioning for visual navigation.
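The "closest coding distance" selection step can be illustrated with compact binary image codes compared by Hamming distance. The excerpt does not specify the actual encoding, so the code width, the helper names, and the toy values below are all assumptions:

```python
# Hypothetical sketch of reference-frame selection: each synthetic key-frame
# image is reduced to a compact binary code, and the key frame whose code is
# nearest (in Hamming distance) to the live frame's code becomes the reference.
def hamming(a: int, b: int) -> int:
    """Count the bit positions where two binary codes differ."""
    return bin(a ^ b).count("1")

def select_reference(frame_code: int, keyframe_codes: dict) -> str:
    """Return the key-frame id whose code is closest to the live frame's code."""
    return min(keyframe_codes, key=lambda k: hamming(frame_code, keyframe_codes[k]))

# Toy 8-bit codes standing in for real image descriptors
keyframes = {"kf0": 0b10110010, "kf1": 0b01101100, "kf2": 0b10110011}
ref = select_reference(0b10110111, keyframes)
print(ref)  # kf2 differs from the live code in only one bit
```

In practice such codes would come from a global image descriptor computed over the repaired synthetic views, so that a coarse match can be found quickly before the finer feature-point matching and pose estimation stages.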

Description

Technical field

[0001] The invention belongs to the field of information technology, and in particular relates to a visual tracking and positioning method based on dense point clouds and synthetic views.

Background technique

[0002] Visual real-time tracking and positioning in large-scale outdoor scenes has long been an important research direction in computer vision. Complex and changeable outdoor environmental factors, such as lighting, viewing angle, occlusion, weak texture, and objects that move over time, all strongly affect the accuracy and robustness of visual tracking and positioning algorithms. Outdoor combat scenes such as deserts, grasslands, and mountains cover large areas with little texture information, posing still higher technical challenges. At present, the commonly used visual tracking and positioning algorithm for large outdoor scenes is SLAM (real-time positioning and m...

Claims


Application Information

Patent Timeline
no application
Patent Type & Authority: Patents (China)
IPC(8): G06T7/246; G06T7/269; G06T7/73; G01S17/02; G01S17/66; G01S17/89
CPC: G01S17/66; G01S17/89; G06T2207/10016; G06T2207/10024; G06T2207/10028; G06T7/246; G06T7/269; G06T7/73
Inventor: 陈靖, 缪远东
Owner: BEIJING INSTITUTE OF TECHNOLOGY