
Autonomous parking system and method based on multi-vision inertial navigation integration

An autonomous parking technology in the field of autonomous parking systems based on multi-vision inertial navigation fusion. It addresses the problems that existing systems are inconvenient and troublesome for the driver and that visual SLAM technology alone has difficulty tracking and computing the vehicle's current position reliably, with the effect of improving positioning accuracy.

Active Publication Date: 2018-01-19
SUN YAT SEN UNIV
Cites: 7 · Cited by: 58

AI Technical Summary

Problems solved by technology

However, both radar and ultrasonic sensors have blind spots and cannot collect complete data about the surroundings.
At the same time, such sensors are expensive and are generally fitted only to high-end models of premium brands, so they do little to promote the popularization of intelligent vehicles.
[0003] Moreover, because the number of vehicles has grown rapidly, it is difficult to find a parking space in the city during busy hours, and the search itself is tedious and troublesome. Unfortunately, most autonomous parking systems on the market still require the driver to find a parking space and drive the car close to it before self-parking can begin, and some even require the driver to mark the parking space manually, which is very unfriendly to the user.
[0004] In addition, visual SLAM technology is rarely used on the market to provide positioning information for autonomous parking systems, because visual SLAM alone has difficulty tracking and computing the current position stably: it places extremely high demands on the scene and on vehicle speed, and is poorly suited to the highly changeable environment of a real parking lot.
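
The patent addresses this by fusing inertial measurements with the multi-camera visual estimate. As an illustration only (not the algorithm disclosed in the patent), the following Python sketch shows the basic idea behind visual-inertial pose tracking: a high-rate inertial prediction keeps the pose estimate alive, and a visual correction is blended in only when feature tracking is reliable. All class, method, and parameter names are hypothetical.

```python
import numpy as np

# Minimal, hypothetical sketch of visual-inertial pose tracking.
# Orientation handling, sensor biases, and noise models are omitted for brevity.

class PoseEstimator:
    def __init__(self):
        self.position = np.zeros(3)              # metres, in the map frame
        self.velocity = np.zeros(3)              # m/s
        self.gravity = np.array([0.0, 0.0, -9.81])

    def propagate_imu(self, accel_world, dt):
        """Dead-reckon with an accelerometer sample already rotated into
        the world frame; runs at the IMU rate (e.g. 200 Hz)."""
        self.velocity += (accel_world + self.gravity) * dt
        self.position += self.velocity * dt

    def correct_with_vision(self, visual_position, track_quality, gain=0.8):
        """Blend in a visual SLAM position fix only when enough features
        were tracked; otherwise keep the inertial prediction."""
        if track_quality > 0.5:                  # hypothetical quality threshold
            self.position += gain * (visual_position - self.position)


# Usage sketch: IMU at 200 Hz, stereo frames at 20 Hz.
estimator = PoseEstimator()
for step in range(200):
    estimator.propagate_imu(accel_world=np.array([0.1, 0.0, 9.81]), dt=0.005)
    if step % 10 == 0:                           # a camera frame arrives
        visual_fix = estimator.position.copy()   # stand-in for a SLAM result
        estimator.correct_with_vision(visual_fix, track_quality=0.9)
```

In a production system the correction would come from a filter or factor-graph optimizer rather than a fixed blending gain, but the sketch shows why inertial propagation keeps the pose usable even when the cameras momentarily lose track.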




Detailed Description of the Embodiments

[0031] The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.

[0032] It should be noted that any directional indication (such as up, down, left, right, front, rear, etc.) in the embodiments of the present invention is used only to explain the relative position of components in a particular posture (as shown in the accompanying drawings); if that posture changes, the directional indication changes accordingly.

[0033] In addition, if descriptions involving "first", "second" and the like appear in the embodiments of the present invention, they are used for descriptive purposes only and shall not be understood as indicating or implying relative importance.



Abstract

The invention discloses an autonomous parking system based on multi-vision inertial navigation fusion. The system comprises a binocular vision SLAM module, a surround-view parking space detection module, and a decision and control module. The binocular vision SLAM module generates a simulated map of the parking lot from acquired parking lot images through multi-vision inertial navigation fusion, obtains real-time positioning information of the vehicle in the coordinate system of that map, and transmits the positioning information to the decision and control module. The surround-view parking space detection module acquires data around the vehicle's current position, performs parking space detection for the current environment, and transmits the detection result to the decision and control module. The decision and control module formulates an autonomous parking strategy according to the received positioning information of the vehicle in the simulated parking lot map and the detected parking space parameters of the current environment. The invention also discloses an autonomous parking method based on multi-vision inertial navigation fusion, which is used to implement the autonomous parking system.
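
The abstract thus describes a three-module pipeline: binocular visual-inertial SLAM for mapping and localization, surround-view parking space detection, and decision and control. The patent text does not include code; the following Python sketch only illustrates how the data flow between such modules could be organized, and every class, field, and method name is invented for the illustration.

```python
from dataclasses import dataclass
from typing import List, Optional
import numpy as np

# Hypothetical data-flow sketch of the three modules named in the abstract.
# All interfaces are illustrative, not taken from the patent.

@dataclass
class Pose:
    position: np.ndarray        # (x, y, z) in the simulated parking-lot map frame
    yaw: float                  # heading, radians

@dataclass
class ParkingSpace:
    corners: List[np.ndarray]   # four corner points in the map frame
    free: bool                  # whether the space is unoccupied

class BinocularVisualInertialSLAM:
    """Builds the simulated parking-lot map and tracks the vehicle in it."""
    def localize(self, stereo_images, imu_samples) -> Pose:
        raise NotImplementedError   # visual-inertial fusion would run here

class SurroundViewSpaceDetector:
    """Detects candidate parking spaces around the current position."""
    def detect(self, surround_images, pose: Pose) -> List[ParkingSpace]:
        raise NotImplementedError   # parking-space detection would run here

class DecisionAndControl:
    """Formulates the autonomous parking strategy."""
    def plan(self, pose: Pose, spaces: List[ParkingSpace]) -> Optional[dict]:
        free = [s for s in spaces if s.free]
        if not free:
            return None             # no free space detected yet; keep cruising
        return {"target": free[0], "vehicle_pose": pose}   # placeholder strategy

def parking_step(slam, detector, controller, stereo, surround, imu):
    """One cycle of the pipeline: localize, detect spaces, decide."""
    pose = slam.localize(stereo, imu)           # real-time pose in the map frame
    spaces = detector.detect(surround, pose)    # spaces near the current position
    return controller.plan(pose, spaces)        # parking strategy, or None
```

The point of the sketch is the separation of concerns the abstract states: localization and mapping feed the decision module, which combines the pose with the detection result to produce the parking strategy.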

Description

Technical Field

[0001] The invention relates to the technical field of vehicle autonomous parking, and in particular to an autonomous parking system and method based on multi-vision inertial navigation fusion.

Background

[0002] Against the background of the rapid popularization of automobiles, vehicle intelligence has become a major trend in the development of the automobile industry. Because the parking process is cumbersome, demands considerable skill from the driver, and is prone to accidents, autonomous parking systems have attracted major automobile manufacturers to invest heavily in research and development. So far, most autonomous parking systems fitted to high-end models on the market rely on radar and ultrasonic sensors, achieving parking by using these sensors to obtain the distance between the car and surrounding objects. However, both radar and ultrasonic sensors have certain ...


Application Information

Patent Type & Authority: Application (China)
IPC(8): B60W30/06; B60W40/02
Inventors: 陈龙, 麦灏, 黄国杰, 杨腾
Owner: SUN YAT SEN UNIV