
Automatic driving lane changing scene classification method and recognition method based on clustering

A scene classification and automatic driving technology, applied in character and pattern recognition, instruments, and computing. It addresses problems such as scene classification that is labor-intensive and cannot be performed automatically, and achieves the effects of avoiding classification confusion, saving computing time and space, and covering a wide range of scenes.

Pending Publication Date: 2022-05-27
CHONGQING CHANGAN AUTOMOBILE CO LTD

AI Technical Summary

Problems solved by technology

The above method realizes automatic recognition of driving scenes, but it still cannot automatically classify the scenes according to the recognized information; it therefore still requires manual classification, which is labor-intensive.

Embodiment Construction

[0049] The present invention will be further described below with reference to the accompanying drawings and embodiments.

[0050] The factors acting on an automatic driving scene take into account information such as driving capability, functional characteristics, the physical environment, the behavior of traffic participants, and evaluation criteria. Scenes are divided into static and dynamic. Static scenes usually include road facilities, traffic accessories, the surrounding environment, and so on; dynamic scenes include traffic management and control, motor vehicles, non-motor vehicles, and pedestrians. When describing a scene, the present invention often requires multiple parameters to accurately describe a scene object. The parameters considered in the data description of a scene include: the geometric structure and topology of the driving road and the interaction data with other traffic participants; the relative positions and relative motion trends of traffic participants such as surround...
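As an illustration of how such scene-description parameters could be organized for clustering, here is a minimal sketch of a per-sample feature record in Python; the field names, units, and the particular choice of features are assumptions made for demonstration and are not specified in the patent text.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class LaneChangeSample:
    """Hypothetical feature record for one lane-changing sample point.

    The text above only says that road geometry/topology and the relative
    positions and motion trends of surrounding traffic participants are
    described; the concrete fields and units here are illustrative.
    """
    # Static scene parameters: driving-road geometry and topology
    lane_width_m: float
    road_curvature: float
    num_lanes: int

    # Dynamic scene parameters: surrounding traffic participants
    gap_to_lead_vehicle_m: float          # longitudinal gap in the current lane
    rel_speed_lead_vehicle_mps: float     # relative speed to the lead vehicle
    gap_to_target_rear_vehicle_m: float   # gap to the rear vehicle in the target lane
    rel_speed_target_rear_mps: float

    def as_vector(self) -> List[float]:
        """Flatten the record into a numeric feature vector for clustering."""
        return [
            self.lane_width_m,
            self.road_curvature,
            float(self.num_lanes),
            self.gap_to_lead_vehicle_m,
            self.rel_speed_lead_vehicle_mps,
            self.gap_to_target_rear_vehicle_m,
            self.rel_speed_target_rear_mps,
        ]
```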

Abstract

The invention discloses a clustering-based automatic driving lane-changing scene classification method and a clustering-based automatic driving lane-changing scene identification method. The classification method comprises the following steps: 1) collecting data for each sample point that requires lane-changing scene classification; 2) preprocessing the data from step 1); 3) taking each preprocessed sample point as a cluster, using an agglomerative hierarchical clustering algorithm to calculate the distance between the sample points of each cluster and those of all other clusters, and merging the two closest clusters; and 4) taking the merged clusters as lane-changing scenes and outputting the clustering results. The classification and identification methods provide an accurate and reliable basis for further improving the automatic lane-changing algorithm and its optimization function, and realize scene classification and identification with high efficiency, low cost, and high scene coverage.
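As a minimal sketch of steps 1) to 4), the example below uses scikit-learn's AgglomerativeClustering in place of the agglomerative hierarchical clustering described above; the preprocessing choice (z-score standardization), the linkage criterion, and the distance threshold used as a stopping rule are assumptions for illustration rather than values taken from the patent.

```python
# Sketch of classification steps 1)-4) from the abstract, under assumed
# preprocessing and clustering parameters (not specified by the patent).
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import AgglomerativeClustering

# 1) Collected data: one row per sample point needing lane-changing scene
#    classification (columns are scene-description features).
samples = np.array([
    [3.5, 0.00, 3, 45.0, -1.2, 30.0, 2.0],
    [3.5, 0.00, 3, 12.0, -4.5, 25.0, 1.5],
    [3.2, 0.02, 2, 60.0,  0.5,  8.0, 3.8],
    [3.2, 0.02, 2, 55.0,  0.3,  9.0, 4.1],
])

# 2) Preprocess: put all features on a comparable scale.
features = StandardScaler().fit_transform(samples)

# 3) Agglomerative hierarchical clustering: each sample point starts as its
#    own cluster, and the two closest clusters are merged repeatedly until
#    the inter-cluster distance exceeds the (assumed) threshold.
model = AgglomerativeClustering(
    n_clusters=None,
    distance_threshold=2.0,   # assumed stopping criterion
    linkage="average",
)
labels = model.fit_predict(features)

# 4) Each resulting cluster is treated as one lane-changing scene.
for scene_id in np.unique(labels):
    members = np.where(labels == scene_id)[0]
    print(f"Lane-changing scene {scene_id}: sample points {members.tolist()}")
```

The distance threshold here stands in for whatever merging stop rule the method actually uses; the abstract only states that the two closest clusters are merged, so the sketch could equally let merging continue until a chosen number of scenes remains.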

Description

Technical field

[0001] The invention relates to the technical field of automatic driving, and in particular to a clustering-based automatic driving lane-changing scene classification method and identification method.

Background technique

[0002] With the continuous improvement of the level of automatic driving and the continuous enrichment of its functions, automatic driving lane-changing scenarios need to be identified and classified in order to ensure that automatic driving technology is implemented safely and reliably. Scene recognition and classification are usually performed directly from manual expert experience, which is labor-intensive and may cause problems such as inaccurate scene division and incomplete scene coverage.

[0003] Application number CN202010707401.6 discloses an automatic driving scene classification and identification system and method. The system is provided with a scene data acquisition module, a driving scene identification module, ...

Application Information

Patent Type & Authority: Application (China)
IPC(8): G06V20/56; G06K9/62; G06V10/764; G06V10/762
CPC: G06F18/2321; G06F18/24
Inventor: 谭鑫, 谯睿智
Owner: CHONGQING CHANGAN AUTOMOBILE CO LTD