Vision and IMU sensor fusion positioning system based on dynamic object semantic segmentation

A semantic segmentation and fusion positioning technology applied in the field of visual positioning. It addresses problems such as feature mismatching, scale deviation, and misalignment between the camera and world coordinate systems, thereby improving accuracy and robustness, reducing cumulative error, and overcoming mismatching and poor data association.

Pending Publication Date: 2021-08-06
北京数研科技发展有限公司


Problems solved by technology

[0006] In practical applications, monocular positioning often suffers from several problems: the camera image is easily disturbed by the external environment (occlusion, moving objects, textureless scenes, lighting changes, etc.); fast motion blurs the image and causes positioning failure; a monocular camera cannot recover real-world scale, so the camera coordinate system cannot be aligned with the world coordinate system and a scale deviation remains; and traditional monocular positioning algorithms cannot handle dynamic scenes, which produces erroneous data associations and leads to poor positioning accuracy or even tracking loss.
In a dynamic environment, mismatches easily occur during data association, degrading positioning accuracy.
In addition, ghosting appears in the constructed point cloud or mesh map, which limits applications such as map-based positioning, navigation, obstacle avoidance, and interaction.




Embodiment Construction

[0031] The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those of ordinary skill in the art based on these embodiments fall within the scope of protection of the present invention. To facilitate understanding of the above technical solutions, they are described in detail below through specific usage scenarios.

[0032] In the present invention, dynamic object instance segmentation is used to remove dynamic feature points, and the result is fused with IMU sensor data as the front end of a visual-inertial positioning algorithm.
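The front-end step described in paragraph [0032] — discarding feature points that land on a segmented dynamic object before they enter positioning and mapping — can be sketched as follows. This is a minimal illustration only, not the patented implementation; the function name, array shapes, and the instance-to-class mapping are assumptions made for the example.

```python
import numpy as np

def remove_dynamic_points(points, instance_mask, dynamic_class_ids, instance_classes):
    """Drop feature points that fall inside the mask of a dynamic object.

    points           : (N, 2) array of (x, y) pixel coordinates
    instance_mask    : (H, W) int array, 0 = background, k = instance id k
    dynamic_class_ids: set of class ids considered dynamic (e.g. person, car)
    instance_classes : dict mapping instance id -> class id
    """
    kept = []
    for x, y in points:
        inst = instance_mask[int(y), int(x)]
        # keep the point if it lies on background or on a static object
        if inst == 0 or instance_classes.get(inst) not in dynamic_class_ids:
            kept.append((x, y))
    return np.array(kept)
```

In a full pipeline, `instance_mask` would come from the instance segmentation network and the surviving points would be handed to the tracking and IMU-fusion stages.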

[0033] According to...



Abstract

The invention discloses a vision and IMU sensor fusion positioning system based on dynamic object semantic segmentation. The system comprises a front-end algorithm divided into a feature extraction module and a tracking module, which obtain the data association of feature points and track feature points between adjacent frames with the KLT method; an instance segmentation and tracking module, which tracks instances with the Deep SORT algorithm and provides the data association of semantic information; a dynamic object processing module, which identifies dynamic feature points and rejects them during positioning and mapping; and an IMU pre-integration module, which integrates IMU measurements, adopts the pre-integration result as an observation value, and performs direct integration after converting from the world coordinate system to a local coordinate system. By combining the advantages of the two sensors, the IMU mitigates the positioning failure caused by image blur when the monocular camera moves fast, while the visual sensor corrects the relatively large cumulative error of the IMU.
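As a rough illustration of the IMU pre-integration idea mentioned in the abstract — accumulating accelerometer and gyroscope samples between two camera frames into relative position, velocity, and rotation increments — a first-order sketch might look like the following. Gravity compensation and bias estimation are omitted, and the simple Euler integration is an assumption for clarity, not the patent's formulation.

```python
import numpy as np

def preintegrate_imu(accel, gyro, dt):
    """Integrate IMU samples between two camera frames.

    accel, gyro : (M, 3) arrays of accelerometer / gyroscope samples (body frame)
    dt          : sampling interval in seconds
    Returns (delta_p, delta_v, R): position and velocity increments and the
    3x3 rotation increment, all relative to the body frame at the first frame.
    """
    delta_p = np.zeros(3)
    delta_v = np.zeros(3)
    R = np.eye(3)
    for a, w in zip(accel, gyro):
        # first-order (Euler) update; real systems use midpoint or RK4 schemes
        delta_p += delta_v * dt + 0.5 * (R @ a) * dt**2
        delta_v += (R @ a) * dt
        # rotation update via the exponential map (Rodrigues' formula)
        theta = w * dt
        angle = np.linalg.norm(theta)
        if angle > 1e-12:
            k = theta / angle
            K = np.array([[0.0, -k[2], k[1]],
                          [k[2], 0.0, -k[0]],
                          [-k[1], k[0], 0.0]])
            R = R @ (np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * (K @ K))
    return delta_p, delta_v, R
```

Because the increments depend only on the IMU samples and not on the absolute pose, they can be reused as a fixed observation value when the estimator relinearizes, which is the practical appeal of pre-integration.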

Description

Technical field [0001] The invention relates to the technical field of visual positioning, and in particular to a vision and IMU sensor fusion positioning system based on semantic segmentation of dynamic objects. Background [0002] With the gradual improvement of global satellite navigation systems and the rapid development of the mobile Internet and wireless communication technology, navigation and location-based services (LBS) are of great value to emergency response, national defense, logistics, transportation, advertising, and social networking. According to the "2020 White Paper on the Development of China's Satellite Navigation and Location-Based Service Industry", the annual output value of China's navigation and location-based service industry reaches hundreds of billions of yuan. With the increasing popularity of smartphones and wearable devices, demand for navigation and location services is still in a stage of expl...


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06T7/215, G06T7/246, G06T7/73, G06T7/13, G06T5/30, G06K9/62, G06N3/04, G06N3/08
CPC: G06T7/215, G06T7/248, G06T7/74, G06T7/13, G06T5/30, G06N3/08, G06T2207/20081, G06T2207/20084, G06T2207/20164, G06V2201/07, G06N3/045, G06F18/22
Inventors: 郭金辉, 赵明乐
Owner: 北京数研科技发展有限公司