Pose estimation method suitable for monocular vision camera in dynamic environment

A monocular-vision technique for dynamic environments, applied in the field of computer vision. It addresses the problems of motion inconsistency between the background and moving objects, inaccurate camera poses, and the resulting loss of accuracy in the SLAM system, and achieves the effect of improving the accuracy and precision of pose estimation.

Active Publication Date: 2019-10-01
重庆高开清芯科技产业发展有限公司


Problems solved by technology

[0003] In real application scenarios, moving objects other than the robot are inevitably present in the environment. The background in the video image sequence captured by the robot therefore moves inconsistently with the moving objects in the scene. Using the video image sequence directly, without motion segmentation, causes the visual odometry to produce an inaccurate camera pose, which degrades the accuracy of the entire SLAM system.
In existing camera pose estimation techniques, the Faster R-CNN detection network structure has difficulty segmenting target pixels accurately, which in turn limits the accuracy of camera pose estimation.




Embodiment Construction

[0037] The objects and functions of the present invention, and the methods for achieving them, will be clarified by reference to the exemplary embodiments. However, the present invention is not limited to the exemplary embodiments disclosed below; it can be implemented in various forms. The description is provided only to help those skilled in the relevant art comprehensively understand the specific details of the present invention.

[0038] Hereinafter, embodiments of the present invention will be described with reference to the accompanying drawings. In the drawings, the same reference numerals represent the same or similar components, or the same or similar steps.

[0039] The content of the present invention is described in detail below through specific embodiments. Figure 1 shows an overall flow chart of a pose estimation method suitable for a monocular vision camera in a dynamic environment according to the present invention. According to an embodimen...
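The constraint check in this flow is typically the epipolar constraint: for a static scene point, a matched pair (p1, p2) should satisfy p2ᵀ F p1 ≈ 0 for the fundamental matrix F, so correspondences whose distance to their epipolar line exceeds a threshold are marked as dynamic points. A minimal NumPy sketch of that check follows; the function names and the pixel threshold are illustrative, not taken from the patent.

```python
import numpy as np

def epipolar_distances(F, pts1, pts2):
    """Distance of each point in image 2 to its epipolar line F @ p1."""
    ones = np.ones((len(pts1), 1))
    p1 = np.hstack([pts1, ones])               # homogeneous coordinates
    p2 = np.hstack([pts2, ones])
    lines = p1 @ F.T                           # epipolar lines l = F p1, one per row
    num = np.abs(np.sum(lines * p2, axis=1))   # |p2 . l|
    den = np.sqrt(lines[:, 0] ** 2 + lines[:, 1] ** 2)
    return num / den

def mark_dynamic(F, pts1, pts2, thresh=1.0):
    """Flag correspondences that violate the epipolar constraint as dynamic."""
    return epipolar_distances(F, np.asarray(pts1, float),
                              np.asarray(pts2, float)) > thresh
```

Points on independently moving objects generally do not lie on their epipolar lines, so this test separates them from the static background before pose estimation.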



Abstract

The invention provides a pose estimation method suitable for a monocular vision camera in a dynamic environment. The method comprises the steps of: obtaining images and performing ORB feature point detection on each frame; performing local feature point matching between two consecutive frames; judging whether each feature point satisfies the constraint conditions and, if not, marking it as a dynamic point; performing instance-level segmentation on each frame to obtain object contours, and marking all feature points within an object contour as dynamic points when the number of dynamic points inside that contour exceeds a threshold; and performing feature matching on the feature points not marked as dynamic using the RANSAC algorithm, calculating the fundamental matrix of the camera pose transformation, and obtaining the camera rotation matrix and translation vector, thereby realizing camera pose estimation. The invention improves the accuracy of the camera pose and thus the precision of the whole SLAM system.
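The fundamental-matrix step can be sketched with the classical normalized eight-point algorithm, which a RANSAC loop would wrap by repeatedly fitting random samples of the surviving static correspondences. This is a sketch of the standard technique, not the patent's exact implementation (in practice OpenCV's `cv2.findFundamentalMat` and `cv2.recoverPose` would typically be used); all function names here are illustrative.

```python
import numpy as np

def _normalize(pts):
    """Translate points to their centroid and scale so the mean distance is sqrt(2)."""
    c = pts.mean(axis=0)
    d = np.sqrt(((pts - c) ** 2).sum(axis=1)).mean()
    s = np.sqrt(2) / d
    T = np.array([[s, 0, -s * c[0]],
                  [0, s, -s * c[1]],
                  [0, 0, 1.0]])
    ph = np.hstack([pts, np.ones((len(pts), 1))])
    return ph @ T.T, T

def eight_point(pts1, pts2):
    """Estimate the fundamental matrix from >= 8 point correspondences."""
    p1, T1 = _normalize(np.asarray(pts1, float))
    p2, T2 = _normalize(np.asarray(pts2, float))
    # Each correspondence contributes one row of the linear system A f = 0.
    A = np.column_stack([
        p2[:, 0] * p1[:, 0], p2[:, 0] * p1[:, 1], p2[:, 0],
        p2[:, 1] * p1[:, 0], p2[:, 1] * p1[:, 1], p2[:, 1],
        p1[:, 0], p1[:, 1], np.ones(len(p1))])
    _, _, Vt = np.linalg.svd(A)
    F = Vt[-1].reshape(3, 3)          # f = null vector of A
    # Enforce the rank-2 constraint on F.
    U, S, Vt = np.linalg.svd(F)
    F = U @ np.diag([S[0], S[1], 0.0]) @ Vt
    return T2.T @ F @ T1              # undo the normalization
```

Given the estimated F and the camera intrinsics K, the essential matrix E = Kᵀ F K is decomposed to recover the rotation matrix and translation vector described in the abstract.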

Description

technical field

[0001] The invention relates to the technical field of computer vision, and in particular to a pose estimation method suitable for a monocular vision camera in a dynamic environment.

Background technique

[0002] With the rapid development of intelligent information technology and sensor technology, smart devices such as robots have advanced rapidly and are widely used in many fields of social life. Compared with traditional robots, intelligent robots have many unique characteristics, which also impose higher requirements, such as intelligent perception, autonomous decision-making, and motion control. Intelligentization requires a robot to localize itself in a complex scene, build a map of the surrounding scene, perceive information about that scene, and complete tasks autonomously. Simultaneous Localization and Mapping (SLAM) is a basic problem in the field of computer vision and smart d...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/73, G06T7/50
CPC: G06T7/73, G06T7/50, Y02T10/40
Inventors: 林孝康, 罗一鸣, 傅嵩
Owner: 重庆高开清芯科技产业发展有限公司