
Multi-sensor fused intelligent parking system and method

A multi-sensor fusion and sensor subsystem technology, applied in the fields of driving systems, intelligent parking systems, parking space detection and autonomous parking path planning. It addresses problems such as support for only a single type of parking scene, inability to detect the types of obstacles, and low parking space detection accuracy, and achieves the effect of a wide recognition range.

Pending Publication Date: 2021-01-05
ZONGMU TECH SHANGHAI CO LTD
Cites: 0 | Cited by: 10

AI Technical Summary

Problems solved by technology

[0004] At present, the common parking systems on the market mainly use ultrasonic sensors to perceive the surrounding environment and find suitable parking spaces. However, their parking space detection accuracy is low: they can only identify horizontal and vertical parking spaces and cannot identify inclined parking spaces, three-dimensional parking spaces, and the like. At the same time, relying only on ultrasonic sensors gives insufficient perception of the surrounding environment and cannot detect the specific types of various obstacles. The applicable parking scenes are relatively limited, the operating conditions are relatively strict, and the actual user experience is poor.

Method used

Figure 1 shows the sensor subsystem provided by the invention; Figure 2 is a flow chart of the information processing performed on the collected sensor data; Figure 3 shows the steps of the multi-sensor fused intelligent parking method.

Examples

Embodiment 1

[0070] As shown in Figure 1, the sensor subsystem (unit) includes a (high-definition) camera and a radar positioning detection component. Further, the (high-definition) camera includes a front-view camera and multiple surround-view (fisheye / wide-angle) cameras; the radar positioning detection component includes multiple ultrasonic radars and multiple millimeter-wave radars.

[0071] 1. Sensor unit (subsystem):

[0072] The sensor unit includes: four fisheye (wide-angle) cameras located at the front, rear, left, and right sides of the vehicle and a front-view camera at the front of the vehicle; four ultrasonic ranging sensors located on the two sides of the vehicle and four ultrasonic ranging sensors located at the front and rear of the vehicle, for a total of eight ultrasonic ranging sensors; as well as four millimeter-wave radars located at the four corners of the vehicle body and a forward-facing main millimeter-wave radar. Among them, the installation positions of the four cameras ...
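
For orientation only, the sensor layout described above can be written down as a simple configuration structure. The following Python sketch is not taken from the patent; the names (Sensor, build_sensor_suite, the position labels) are hypothetical and only mirror the counts and placements stated in [0072].

    # Illustrative sketch of the sensor suite in Embodiment 1 ([0072]).
    # All names and position labels are hypothetical, not from the patent.
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class Sensor:
        kind: str      # "fisheye_camera", "front_camera", "ultrasonic", "mmwave_radar"
        position: str  # mounting position on the vehicle body

    def build_sensor_suite() -> List[Sensor]:
        suite: List[Sensor] = []
        # Four surround-view fisheye cameras plus one front-view camera.
        suite += [Sensor("fisheye_camera", p) for p in ("front", "rear", "left", "right")]
        suite.append(Sensor("front_camera", "windshield"))
        # Ultrasonic ranging sensors: four on the two sides, four at the front and rear.
        suite += [Sensor("ultrasonic", f"side_{i}") for i in range(4)]
        suite += [Sensor("ultrasonic", f"front_rear_{i}") for i in range(4)]
        # Four corner millimeter-wave radars and one forward-facing main radar.
        suite += [Sensor("mmwave_radar", f"corner_{i}") for i in range(4)]
        suite.append(Sensor("mmwave_radar", "front_main"))
        return suite

    if __name__ == "__main__":
        print(len(build_sensor_suite()), "sensors in this illustrative layout")  # prints 18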

Embodiment 2

[0125] The information processing subsystem also includes an original target detection component and a target fusion detection component. Further, the original target detection component includes a graphic target detection module, a relationship detection module, an ID number tagging module, a tag merging module and a tag filtering (preprocessing) module; the target fusion detection component includes a point cloud target detection module, a multi-sensor fusion module and a path planning module.
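
As a reading aid, the two components and their submodules listed in [0125] can be summarized in a minimal sketch. The class and attribute names below are placeholders chosen for this illustration, not an API defined by the patent.

    # Illustrative composition of the information processing subsystem ([0125]).
    # Class names are hypothetical; only the module lists follow the text.
    class OriginalTargetDetection:
        """Image-side component."""
        modules = [
            "graphic_target_detection",  # detects targets in fisheye images
            "relationship_detection",    # detects relationships between targets
            "id_number_tagging",         # assigns an ID number to each detection
            "tag_merging",               # merges tags across views/frames
            "tag_filtering",             # preprocessing / filtering of tags
        ]

    class TargetFusionDetection:
        """Point-cloud and fusion component."""
        modules = [
            "point_cloud_target_detection",  # targets from ultrasonic / mmWave point clouds
            "multi_sensor_fusion",           # fuses image and point-cloud results
            "path_planning",                 # plans the autonomous parking path
        ]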

[0126] As shown in Figure 2:

[0127] (1) Obtain fisheye images collected by at least four fisheye cameras located at the front, rear, left, and right sides of the vehicle, together with point cloud data from the ultrasonic sensors and millimeter-wave radars;

[0128] (2) The graphic target detection module feeds the fisheye images collected by the cameras into a convolutional neural network, and the ID number tagging module uses the target detection algorithm to obtain the category and position ...
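
Paragraph [0128] is truncated, but the step it describes (running a convolutional detector over each fisheye frame to obtain per-target category and position, then tagging the detections with ID numbers) can be sketched under assumptions. Everything below, including detect_and_tag and the Detection layout, is a hypothetical illustration; the patent does not specify the network or its interface.

    # Hypothetical sketch of the image branch in [0127]-[0128]: run a CNN-based
    # detector on each fisheye frame and attach an ID number to every detection.
    from itertools import count
    from typing import Any, Callable, Dict, List

    Detection = Dict[str, Any]  # e.g. {"category": "vehicle", "box": (x1, y1, x2, y2)}

    def detect_and_tag(frames: List[Any],
                       detector: Callable[[Any], List[Detection]]) -> List[Detection]:
        """Graphic target detection + ID number tagging (illustrative only)."""
        id_gen = count(start=1)
        tagged: List[Detection] = []
        for frame in frames:
            for det in detector(frame):    # category and position from the CNN
                det = dict(det)
                det["id"] = next(id_gen)   # ID number tagging module
                tagged.append(det)
        return tagged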

Embodiment 3

[0138] As shown in Figure 3, the steps of the multi-sensor fused intelligent parking method are as follows:

[0139] S01: Obtain perception information from the vehicle's different sensors, such as the surround-view cameras, ultrasonic sensors, and millimeter-wave radars;

[0140] S02: Feed each sensor's data into its respective recognition algorithm to obtain per-sensor perception results;

[0141] S03: Post-process the per-sensor perception results to obtain time- and space-synchronized recognition results;

[0142] S04: Send the synchronized recognition result to the decision-making module to carry out the parking process of the vehicle.
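
The four steps S01-S04 map naturally onto a small processing loop. The sketch below is a minimal illustration; sensors, recognizers, synchronizer and decision_module are assumed interfaces standing in for the patent's subsystems, not functions defined by the patent.

    # Minimal sketch of the S01-S04 flow in Embodiment 3.
    # Every name used here is a hypothetical placeholder.
    def parking_cycle(sensors, recognizers, synchronizer, decision_module):
        # S01: collect perception data from the surround-view cameras,
        # ultrasonic sensors and millimeter-wave radars.
        raw = {name: sensor.read() for name, sensor in sensors.items()}

        # S02: feed each sensor's data into its own recognition algorithm.
        results = {name: recognizers[name](data) for name, data in raw.items()}

        # S03: post-process the per-sensor results into a time- and
        # space-synchronized recognition result.
        fused = synchronizer(results)

        # S04: hand the synchronized result to the decision-making module,
        # which carries out the parking maneuver.
        return decision_module(fused)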

Abstract

The invention provides a multi-sensor fused intelligent parking system. The system comprises sensor subsystems and a signal processing subsystem: the sensor subsystems are arranged on the periphery of a vehicle and collect environment information around the vehicle; the signal processing subsystem performs information fusion according to the environment information provided by the different sensor subsystems, determines the space and position information of parking spaces of different shapes from that environment information, and carries out autonomous parking path planning. This provides safer, more reliable supporting information for the automatic driving function in parking scenes.

Description

Technical field

[0001] The invention relates to the field of automobile driving and to driving (assist) systems, and in particular to a parking space detection and autonomous parking path planning method for a multi-sensor fused intelligent parking (assist) system.

Background technique

[0002] In recent years, with the development of technology, autonomous driving has become a closely watched and challenging field. Among its directions, the automatic driving function in parking scenes is a particularly important one to explore. The autonomous parking system is a branch of the automatic driving system that helps to solve the problems of scarce parking spaces and difficult parking in large parking lots. Perception during the parking process depends on the collection and processing of sensor data, and the obtained perception information is used to realize functions such as parking space search and obstacle detection.

[0003] Defects and ...

Claims

Application Information

IPC(8): G01S13/931; G01S15/931; G01S17/931; G01S13/86; B60W30/06
CPC: G01S13/931; G01S13/862; G01S13/865; G01S13/867; G01S15/931; G01S17/931; B60W30/06; G01S2013/9314; G01S2015/932; B60W2420/403
Inventor: 丁丽珠, 张文凯, 王晓权, 徐文浩, 吴子章, 王凡, 宋宇
Owner ZONGMU TECH SHANGHAI CO LTD