Self-adaptive reinforced fusion real-time instance segmentation method based on camera and laser radar

A lidar and camera technology, applied in image analysis, image data processing, and character and pattern recognition. It addresses the problems that a single-modality sensor cannot adequately perceive the external environment, that existing fusion methods cannot adaptively learn the complementary features of different modalities, and that target positioning accuracy is low, and achieves the effects of avoiding computational overhead, high positioning accuracy, and good adaptability.

Active Publication Date: 2020-11-20
SOUTHEAST UNIV
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

[0006] In short, the problems with the existing technology are: relying only on single-modality sensors cannot accurately and robustly perceive the external environment; and current camera-lidar fusion perception methods suffer from low target positioning accuracy and poor real-time performance, while the fusion schemes they adopt cannot adaptively learn the complementary features of the different modalities.




Embodiment Construction

[0048] As shown in Figure 1, the adaptive reinforced fusion real-time instance segmentation method based on a camera and lidar according to the present invention comprises the following steps:

[0049] S10, feature extraction: use a convolutional neural network to extract multi-scale high-level semantic features from the spatiotemporally consistent camera image and lidar point cloud projection image;
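The excerpt does not specify the backbone architecture. As a minimal sketch, assuming a small PyTorch CNN (layer counts and channel widths are illustrative, not the patent's network), multi-scale features for either input could be produced as follows; the two inputs themselves are prepared in steps S11 and S12 below.

```python
# Illustrative multi-scale backbone (assumed, not the patent's architecture):
# returns feature maps at 1/8, 1/16 and 1/32 of the input resolution for
# either the camera image or the lidar projection map.
import torch
import torch.nn as nn

class MultiScaleBackbone(nn.Module):
    def __init__(self, in_channels=3):
        super().__init__()
        def block(c_in, c_out):
            return nn.Sequential(
                nn.Conv2d(c_in, c_out, 3, stride=2, padding=1),
                nn.BatchNorm2d(c_out),
                nn.ReLU(inplace=True),
            )
        self.stem = block(in_channels, 32)                          # 1/2
        self.stage1 = nn.Sequential(block(32, 64), block(64, 128))  # 1/8
        self.stage2 = block(128, 256)                               # 1/16
        self.stage3 = block(256, 512)                               # 1/32

    def forward(self, x):
        p3 = self.stage1(self.stem(x))   # finest retained scale
        p4 = self.stage2(p3)
        p5 = self.stage3(p4)             # most semantic scale
        return {"p3": p3, "p4": p4, "p5": p5}

# one backbone per modality: RGB camera image and lidar projection map
cam_feats = MultiScaleBackbone(3)(torch.randn(1, 3, 384, 1248))
lidar_feats = MultiScaleBackbone(3)(torch.randn(1, 3, 384, 1248))
```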

[0050] In combination with Figure 2, the feature extraction step includes:

[0051] S11, camera and lidar time synchronization:

[0052] The camera image frame closest in time to the current lidar frame is found by a time synchronization algorithm, yielding a time-synchronized pair of lidar point cloud frame and camera image frame.
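A minimal sketch of such nearest-timestamp matching is given below; the tolerance value and function name are assumptions, since the excerpt does not give the exact algorithm.

```python
# Nearest-timestamp matching between lidar frames and camera frames (sketch).
import numpy as np

def synchronize(lidar_ts, camera_ts, max_offset=0.05):
    """For each lidar timestamp, pick the closest camera frame.

    lidar_ts, camera_ts: 1-D arrays of timestamps in seconds.
    max_offset: pairs further apart than this (in seconds) are discarded.
    Returns a list of (lidar_index, camera_index) pairs.
    """
    camera_ts = np.asarray(camera_ts)
    pairs = []
    for i, t in enumerate(lidar_ts):
        j = int(np.argmin(np.abs(camera_ts - t)))
        if abs(camera_ts[j] - t) <= max_offset:
            pairs.append((i, j))
    return pairs

# example: 10 Hz lidar matched against a 30 Hz camera
pairs = synchronize(np.arange(0.0, 1.0, 0.1), np.arange(0.0, 1.0, 1.0 / 30.0))
```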

[0053] S12, generating a lidar point cloud projection map:

[0054] According to the extrinsic parameter matrix M_e from the lidar coordinate system to the camera coordinate system, and the intrinsic parameter matrix M_i of the camera, a point (X, Y, Z) o...
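The standard pinhole projection implied by M_e and M_i can be sketched as follows; the matrix values and image size in the example are placeholders, not calibration data from the patent.

```python
# Project lidar points into the image plane with extrinsic M_e (4x4, lidar ->
# camera) and intrinsic M_i (3x3); keep points in front of the camera and
# inside the image.
import numpy as np

def project_points(points_xyz, M_e, M_i, img_w, img_h):
    n = points_xyz.shape[0]
    pts_h = np.hstack([points_xyz, np.ones((n, 1))])   # homogeneous (n, 4)
    cam = (M_e @ pts_h.T)[:3]                          # camera frame (3, n)
    in_front = cam[2] > 0
    uvw = M_i @ cam                                    # pixel frame (3, n)
    u, v = uvw[0] / uvw[2], uvw[1] / uvw[2]
    valid = in_front & (u >= 0) & (u < img_w) & (v >= 0) & (v < img_h)
    return u[valid], v[valid], cam[2][valid]           # pixel coords and depth

# placeholder calibration: identity extrinsic, simple pinhole intrinsic
M_e = np.eye(4)
M_i = np.array([[700.0, 0.0, 620.0], [0.0, 700.0, 190.0], [0.0, 0.0, 1.0]])
pts = np.random.rand(1000, 3) * np.array([20.0, 10.0, 30.0])
u, v, depth = project_points(pts, M_e, M_i, img_w=1242, img_h=375)
# Rasterizing (u, v, depth) into dense channels yields the projection map
# that is aligned with the camera image.
```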



Abstract

The invention discloses an adaptive reinforced fusion real-time instance segmentation method based on a camera and a laser radar. The method comprises the steps of: extracting image features of a camera image and a laser radar projection image of a target using a convolutional neural network, obtaining a first image feature and a second image feature respectively; adaptively allocating weights to the first image feature and the second image feature, weighting the first image feature according to the allocated first weight to obtain a third image feature, weighting the second image feature according to the allocated second weight to obtain a fourth image feature, and performing reinforced fusion on the third image feature and the fourth image feature; and, according to the fused image features, outputting the category, confidence, bounding box and mask of the target using a real-time instance segmentation network to obtain the instance segmentation result of the target. The method can accurately and robustly perform target instance segmentation in real time in complex environments, and has broad application prospects in the field of intelligent networked vehicle perception.
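The abstract does not give the exact gating architecture; a minimal sketch of the adaptive weighting and fusion idea, assuming a small learned gate over pooled features (all layer sizes are illustrative), could look like this:

```python
# Adaptive weighting of the two modality features followed by fusion (sketch).
import torch
import torch.nn as nn

class AdaptiveFusion(nn.Module):
    def __init__(self, channels):
        super().__init__()
        # predicts one weight per modality (camera, lidar) from pooled features
        self.gate = nn.Sequential(
            nn.Linear(2 * channels, channels),
            nn.ReLU(inplace=True),
            nn.Linear(channels, 2),
        )

    def forward(self, f_cam, f_lidar):
        # global descriptors of each modality, shape (B, C) each
        g = torch.cat([f_cam.mean(dim=(2, 3)), f_lidar.mean(dim=(2, 3))], dim=1)
        w = torch.softmax(self.gate(g), dim=1)     # (B, 2), weights sum to 1
        w_cam = w[:, 0:1, None, None]              # first (camera) weight
        w_lidar = w[:, 1:2, None, None]            # second (lidar) weight
        third = w_cam * f_cam                      # weighted camera feature
        fourth = w_lidar * f_lidar                 # weighted lidar feature
        return third + fourth                      # fused feature

fused = AdaptiveFusion(256)(torch.randn(2, 256, 48, 156),
                            torch.randn(2, 256, 48, 156))
# 'fused' would then be fed to a real-time instance segmentation head that
# outputs category, confidence, bounding box and mask.
```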

Description

Technical field
[0001] The invention belongs to the technical field of target instance segmentation for intelligent networked vehicles, and in particular relates to an adaptive reinforced fusion real-time instance segmentation method based on a camera and a laser radar.
Background technique
[0002] Intelligent networked vehicles have great potential for improving road safety and traffic efficiency, and accurate perception of the traffic environment is the basis for the planning, decision-making and control of intelligent networked vehicles. As the most commonly used sensor in the perception system of intelligent networked vehicles, the camera can obtain detailed shape and texture information about the surrounding environment. In recent years, vision-based deep learning algorithms have achieved impressive results in environmental perception. However, the camera is easily affected by lighting and weather conditions, which can cause the performance of these algorithms to degrade severely or even fail. Therefor...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00, G06K9/34, G06T7/70
CPC: G06T7/70, G06V20/56, G06V10/267
Inventors: 殷国栋, 彭湃, 庄伟超, 耿可可, 徐利伟, 王金湘, 张宁, 卢彦博
Owner: SOUTHEAST UNIV