
Multi-feature fusion visual localization method based on back projection

A multi-feature fusion visual positioning technology, applied in the fields of visual positioning and visual measurement, which addresses the problem of low positioning precision.

Active Publication Date: 2022-08-02
Unit 63920 of the Chinese People's Liberation Army (中国人民解放军63920部队)

AI Technical Summary

Problems solved by technology

[0006] Aiming at the low accuracy of vision-guided operations during the sampling, sample placement, and sealed-canister grasping and releasing operations performed by the robotic arm in extraterrestrial-body sampling tasks, and at the absence of a unified optimization model for solving these problems, the present invention proposes a multi-feature fusion visual positioning method. By abstractly modeling both the measurement of dynamic targets by static cameras and the measurement of static targets by dynamic cameras, the method establishes a combined positioning optimization model in which binocular and monocular cameras observe natural features (such as circular targets) and artificially set features (such as target markers). This realizes combined positioning over multiple feature types, suited to different numbers of cameras in different operating scenarios, and effectively improves the accuracy and automation of key links in the teleoperation of extraterrestrial-body sampling, such as collecting samples with the robotic arm, placing samples, and grasping and placing sealed canisters.
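The combined positioning model described above reduces, at its core, to minimizing reprojection error: given known feature geometry and pixel observations, iteratively refine a pose so that projected features match their observed pixel coordinates. The following is a minimal sketch of that idea, not the patent's actual implementation: it estimates only the 3-DoF translation of a target with known rotation, observed by a single pinhole camera with known intrinsics, using Gauss-Newton with finite-difference Jacobians. All numbers, feature layouts, and function names here are illustrative assumptions.

```python
import numpy as np

def project(K, R, t, P):
    """Pinhole projection of Nx3 target points P into pixel coordinates."""
    Pc = (R @ P.T).T + t           # target frame -> camera frame
    uv = (K @ Pc.T).T              # apply intrinsics
    return uv[:, :2] / uv[:, 2:3]  # perspective divide

def residuals(K, R, t, P, obs):
    """Stacked reprojection residuals (projected minus observed pixels)."""
    return (project(K, R, t, P) - obs).ravel()

def estimate_translation(K, R, P, obs, t0, iters=20):
    """Gauss-Newton on the 3-DoF translation; Jacobian by finite differences."""
    t = np.asarray(t0, dtype=float).copy()
    eps = 1e-6
    for _ in range(iters):
        r = residuals(K, R, t, P, obs)
        J = np.zeros((r.size, 3))
        for j in range(3):
            dt = np.zeros(3)
            dt[j] = eps
            J[:, j] = (residuals(K, R, t + dt, P, obs) - r) / eps
        delta = np.linalg.lstsq(J, -r, rcond=None)[0]
        t += delta
        if np.linalg.norm(delta) < 1e-10:
            break
    return t

# Hypothetical demo: five coplanar features (four marker corners plus one
# circle center), known rotation, unknown translation.
K = np.array([[800., 0., 320.], [0., 800., 240.], [0., 0., 1.]])
R = np.eye(3)
P = np.array([[-0.1, -0.1, 0.], [0.1, -0.1, 0.], [0.1, 0.1, 0.],
              [-0.1, 0.1, 0.], [0., 0., 0.]])
t_true = np.array([0.1, -0.05, 2.0])
obs = project(K, R, t_true, P)               # synthetic "observed" pixels
t_est = estimate_translation(K, R, P, obs, t0=np.array([0., 0., 1.5]))
```

A full implementation of the patent's model would optimize rotation as well and would stack residuals from several cameras and feature types into one cost, but the iterate-on-reprojection-error structure is the same.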



Examples


Embodiment Construction

[0044] To enable those skilled in the art to better understand the solutions of the present application, the technical solutions in the embodiments of the present application are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present application. Based on the embodiments in the present application, all other embodiments obtained by a person of ordinary skill in the art without creative effort shall fall within the protection scope of the present application.

[0045] As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, a system, or a computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects.



Abstract

A multi-feature fusion visual positioning method based on back projection. The method includes: 1) extracting the pixel coordinates of the features formed by the forward projection of the spatial target in the camera images; 2) determining that there are N cameras in the scene, of which N1 are static and N2 are dynamic, and, according to whether each camera's pose is a fixed parameter, setting the appropriate camera pose or spatial target pose as the initial iteration value of the algorithm; 3) reconstructing the three-dimensional coordinates of the features in space from the known camera pose or target pose information and the feature pixel coordinates; 4) establishing, from the reconstructed spatial coordinates and the fixed constraints between the known features, a unified pose optimization model for the camera's relative measurement of multiple feature types; 5) iteratively solving the pose measurement optimization model with a nonlinear optimization method to obtain precise pose information of the spatial target or of the camera on the robotic arm, thereby realizing step-by-step, precise guidance and control of the robotic arm as it executes the predetermined operating procedures and tasks.

Description

Technical field [0001] The invention relates to the technical fields of visual measurement and visual positioning, and in particular to a multi-feature fusion visual positioning method based on back projection. Background technique [0002] With the development of science and technology, robotic arms are not only used ever more widely on the ground, but also play an increasingly important role in the exploration of extraterrestrial body surfaces, and especially in surface sampling operations. [0003] Landing, sampling, and return from the surfaces of extraterrestrial bodies is an important means for the world's spacefaring powers to explore deep space and to expand human understanding of extraterrestrial planets and the solar system. The robotic arm is an indispensable piece of key equipment in such surface landing and sampling missions. It can perform sampling tasks autonomously according...


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06T7/73; G06T5/00; G06V10/44; G06V10/80
CPC: G06T7/73; G06T2207/10004; G06T2207/20221; G06F18/253; G06T5/80
Inventor: 刘传凯, 李东升, 谢剑锋, 王俊魁, 袁春强, 张济韬, 刘茜, 王晓雪, 何锡明, 胡晓东
Owner: Unit 63920 of the Chinese People's Liberation Army (中国人民解放军63920部队)