
Image information and radar information fusion method and system for traffic scene

A technology that fuses image information and radar information, applicable to radio wave measurement systems, character and pattern recognition, radio wave reflection/re-radiation, etc. It addresses the problems that existing approaches pay no attention to the construction of the fusion structure or to overall performance optimization, adapt poorly to changes in the application scene, and do not consider the respective advantages of the sensors; the effect achieved is improved scene adaptability, improved robustness and reliability, and high real-time performance.

Active Publication Date: 2018-12-11
北京踏歌智行科技有限公司
Cites: 8 | Cited by: 57

AI Technical Summary

Problems solved by technology

Although these machine learning methods improve the accuracy of detection while reducing the computational intensity, their accuracy largely depends on the training data set in the experimental environment.
In 2015, Alencar fused millimeter-wave radar and camera data to identify and classify multiple road targets, applying k-means cluster analysis, support vector machines and kernel principal component analysis to jointly analyze the camera data and millimeter-wave radar data and extract road targets; the accuracy is high, but the approach is only suitable for identifying close-range targets in good weather.
[0004] To sum up, most existing research only addresses the detection problem in specific traffic scenarios, without considering how to exploit the advantages of different sensors in different traffic scenarios; the fusion methods adapt poorly to changes in the application scene, and pay no attention to the construction of the fusion structure or to overall performance optimization.
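As an illustration of the prior-art style of fusion attributed to Alencar above (and not the method claimed by this patent), the following sketch concatenates hypothetical camera and millimeter-wave radar features, reduces them with kernel principal component analysis, and classifies road targets with a support vector machine. All array shapes, class labels and parameters are assumptions made for the example.

```python
# Sketch of a prior-art style feature-level fusion pipeline (illustrative only).
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Hypothetical training data: 200 detections, 64 camera features plus 4 radar
# features (range, azimuth, radial velocity, RCS) per detection.
X_cam = rng.normal(size=(200, 64))
X_radar = rng.normal(size=(200, 4))
X = np.hstack([X_cam, X_radar])          # early (feature-level) fusion
y = rng.integers(0, 3, size=200)         # e.g. 0 = car, 1 = pedestrian, 2 = cyclist

clf = make_pipeline(
    StandardScaler(),
    KernelPCA(n_components=16, kernel="rbf"),  # nonlinear dimensionality reduction
    SVC(kernel="rbf", C=1.0),                  # multi-class target classification
)
clf.fit(X, y)
print(clf.predict(X[:5]))
```

As the patent notes, such a pipeline is tuned to the data set it was trained on, which is why it degrades outside good-weather, close-range conditions.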




Detailed Description of the Embodiments

[0025] The technical solutions in the embodiments of the present invention will be described clearly and completely below. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present invention. All other embodiments obtained by persons of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.

[0026] Referring to Figure 1, Figure 1 is a schematic diagram of the principle framework of an embodiment of the method and system for fusing image information and radar information of a traffic scene according to the present invention. First, a deep learning method is used to classify the typical traffic scenes extracted from everyday driving, such as a straight road on a sunny day, a straight road on a rainy day, a ramp on a sunny day, a curve on a clear night, and a curve on a rainy night; according to this classification information, the corresponding fusion method of...
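To make the scene-classification step in [0026] concrete, here is a minimal sketch, assuming PyTorch, of a small convolutional network that assigns a preprocessed front-camera frame to one of several typical traffic-scene categories. The category list, network size and input resolution are illustrative assumptions rather than values taken from the patent.

```python
# Sketch: deep-learning scene classification of a front-camera frame (illustrative).
import torch
import torch.nn as nn

SCENE_CLASSES = [
    "sunny_straight", "rainy_straight", "sunny_ramp",
    "clear_night_curve", "rainy_night_curve",
]

class SceneClassifier(nn.Module):
    def __init__(self, num_classes=len(SCENE_CLASSES)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, num_classes)

    def forward(self, x):                       # x: (N, 3, H, W) preprocessed frames
        return self.head(self.features(x).flatten(1))

model = SceneClassifier().eval()
frame = torch.rand(1, 3, 224, 224)              # one preprocessed camera frame
with torch.no_grad():
    scene = SCENE_CLASSES[model(frame).argmax(1).item()]
print(scene)  # the scene label is then used to select the fusion algorithm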



Abstract

The invention provides an image information and radar information fusion method and system for a traffic scene. The method comprises the following steps: preprocessing the image information of the area in front of a vehicle obtained by a camera; extracting the feature information in the image information and comparing it with pre-stored traffic scene information; classifying the current traffic scene according to the comparison result; executing the corresponding fusion algorithm according to a preset fusion method of image information and radar information that is adapted to the current traffic scene category; and outputting the result of the fusion algorithm. The method judges the scene from the collected image information and switches between different fusion algorithms, which uses resources effectively and improves scene adaptability. It makes full use of the redundancy and complementarity between data from different sensors to improve system robustness and reliability. Image information is processed with a deep learning algorithm, which offers higher real-time performance and more accurate target recognition.
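The switching behaviour summarized in the abstract can be sketched as a simple dispatch from scene category to fusion routine. The routines, weights and scene labels below are placeholders chosen for illustration and are not the patent's actual fusion algorithms.

```python
# Sketch: scene-dependent selection of a camera/radar fusion routine (illustrative).
from typing import Callable, Dict, List

def camera_weighted_fusion(cam_objs: List[dict], radar_objs: List[dict]) -> List[dict]:
    """Good visibility: trust camera detections, use radar mainly for range."""
    return [{**c, "range_m": r.get("range_m")} for c, r in zip(cam_objs, radar_objs)]

def radar_weighted_fusion(cam_objs: List[dict], radar_objs: List[dict]) -> List[dict]:
    """Rain or night: trust radar tracks, use camera only to confirm the class."""
    return [{**r, "label": c.get("label", "unknown")} for c, r in zip(cam_objs, radar_objs)]

# Scene category -> fusion routine (illustrative mapping).
FUSION_BY_SCENE: Dict[str, Callable] = {
    "sunny_straight":    camera_weighted_fusion,
    "sunny_ramp":        camera_weighted_fusion,
    "rainy_straight":    radar_weighted_fusion,
    "clear_night_curve": radar_weighted_fusion,
    "rainy_night_curve": radar_weighted_fusion,
}

def fuse(scene: str, cam_objs: List[dict], radar_objs: List[dict]) -> List[dict]:
    # Fall back to the radar-weighted routine for unknown scene labels.
    return FUSION_BY_SCENE.get(scene, radar_weighted_fusion)(cam_objs, radar_objs)

print(fuse("rainy_straight",
           [{"label": "car", "bbox": (100, 80, 60, 40)}],
           [{"range_m": 35.2, "speed_mps": -1.4}]))
```

Keeping the per-scene routines behind a single dispatch point is what lets the system exploit the redundancy and complementarity of the two sensors without re-tuning the whole pipeline for each scene.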

Description

Technical Field

[0001] The present invention relates to the field of safe driving, in particular to a method and system for fusing image information and radar information of a traffic scene.

Background Art

[0002] With the continuous improvement of vehicle electrification, intelligence and networking, advanced driver assistance systems (ADAS) have become a key research direction of major enterprises, universities and research institutes, and environmental perception is the most basic and key technology in an ADAS system. Accurate acquisition of effective target information in front of the vehicle can provide strong technical support for active safety technologies such as adaptive cruise control (ACC) and automatic emergency braking (AEB), and is of great significance to the development of ADAS systems. Existing environmental perception technologies mostly use a single sensor, or a simple superposition of multi-sensor data; such methods cannot meet the high-precision...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G01S13/86
CPC: G01S13/867; G01S2013/93271; G01S7/417; G01S13/931; G06V20/56; G06V10/454; G06V10/82; G06F18/24133; G06F18/256
Inventors: 余贵珍, 张思佳, 王章宇, 张艳飞, 吴新开
Owner: 北京踏歌智行科技有限公司