
Driver intention recognition method considering human-vehicle-road characteristics

A driver intention identification technology applied to driver input parameters, vehicle components, external-condition input parameters, and the like, which addresses the problems of poor applicability and low reliability in existing methods.

Active Publication Date: 2021-09-14
HANGZHOU DIANZI UNIV

AI Technical Summary

Problems solved by technology

[0004] Aiming at the problems of low reliability and poor applicability of existing driver intention recognition methods, the present invention proposes a driver intention recognition method that considers human-vehicle-road characteristics, intended to provide a basis for research on human-machine co-driving technology and a solution for improving driving safety.




Embodiment Construction

[0038] Specific embodiments are described as follows.

[0039] The technical solution of the present invention is described in detail below with reference to the attached Figures 1-6.

[0040] As shown in Figure 1, this embodiment provides a driver intention recognition method that considers human-vehicle-road characteristics, comprising the following steps.

[0041] Step 1: Obtain data on the vehicle and its surrounding vehicles, driver behavior, and the scene outside the cab, as recorded by the driving simulator.

[0042] Step 1 is achieved as follows:

[0043] Step 1.1: Arrange the two cameras and the display screens. Place camera No. 1 directly in front of the driver and camera No. 2 facing the screen viewed by the driver. No fewer than three display screens should be used, so as to cover the driver's field of vision.

[0044] Step 1.2: After arranging the hardware, use the driving simulator's supporting software to construct the driving scene according to the requirements of the intention types to be recognized...
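The paragraphs above only describe the hardware arrangement in outline. As an illustration only, the following Python sketch shows one way the step-1 data collection could look: reading frames from the two cameras while logging simulator signals. The patent does not specify an implementation; the OpenCV device indices, the read_simulator_state() placeholder, and the file layout are all assumptions.

```python
# Hypothetical sketch of step-1 data collection: reading frames from the two
# cameras while logging simulator signals. Device indices, the simulator API
# placeholder, and the output format are assumptions, not from the patent.
import csv
import time

import cv2  # pip install opencv-python


def read_simulator_state():
    """Placeholder for the driving-simulator API that would return vehicle and
    surrounding-vehicle data (speed, steering angle, lane position, ...)."""
    return {"t": time.time(), "speed": 0.0, "steer": 0.0}


cam_driver = cv2.VideoCapture(0)  # camera No. 1, facing the driver
cam_screen = cv2.VideoCapture(1)  # camera No. 2, facing the driver's screen

with open("simulator_log.csv", "w", newline="") as log:
    writer = csv.DictWriter(log, fieldnames=["t", "speed", "steer"])
    writer.writeheader()
    for i in range(1000):  # record a fixed number of samples
        ok1, frame_driver = cam_driver.read()
        ok2, frame_screen = cam_screen.read()
        if not (ok1 and ok2):
            break  # stop if either camera fails to deliver a frame
        writer.writerow(read_simulator_state())
        cv2.imwrite(f"driver_{i:05d}.png", frame_driver)
        cv2.imwrite(f"screen_{i:05d}.png", frame_screen)

cam_driver.release()
cam_screen.release()
```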



Abstract

The invention provides a driver intention recognition method considering human-vehicle-road characteristics. The method comprises the following steps: step 1, acquiring data on the vehicle and its surrounding vehicles, driver behavior, and the scene outside the cab, as recorded in a driving simulator; step 2, preprocessing the vehicle and surrounding-environment data acquired from the driving simulator and inputting it into a trained GrowNet network to obtain probability values Pi (P1, P2, ..., P5) for five categories; step 3, storing and processing the video data acquired by the two cameras separately to finally obtain probability values P'i (P'1, P'2, ..., P'5) for the five categories; and step 4, performing a weighted summation of the Pi and P'i obtained in steps 2 and 3 and taking the category with the maximum summed value as the finally recognized driving intention. The method makes full use of the driving simulator, so data can be collected without relying on vehicle-mounted sensors, which makes experiments more convenient. In addition, both offline training and online testing can be carried out, which improves applicability.
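As a reading aid, here is a minimal sketch of the step-4 fusion described in the abstract: a weighted sum of the two five-class probability vectors Pi and P'i followed by taking the category with the largest summed value. The function name and the equal weights are assumptions; the patent only states that a weighted summation is performed.

```python
# Minimal sketch of the step-2/3/4 fusion described in the abstract.
# Weights and names are illustrative assumptions, not taken from the patent.
import numpy as np

NUM_CLASSES = 5  # five driving-intention categories


def fuse_intention(p_vehicle, p_video, w_vehicle=0.5, w_video=0.5):
    """Weighted sum of the two 5-class probability vectors, then argmax.

    p_vehicle : probabilities Pi from the GrowNet branch (step 2)
    p_video   : probabilities P'i from the camera/video branch (step 3)
    """
    p_vehicle = np.asarray(p_vehicle, dtype=float)
    p_video = np.asarray(p_video, dtype=float)
    assert p_vehicle.shape == (NUM_CLASSES,) and p_video.shape == (NUM_CLASSES,)

    fused = w_vehicle * p_vehicle + w_video * p_video
    return int(np.argmax(fused)), fused  # index of the recognized intention


# Example usage with made-up probability values
intent, fused = fuse_intention([0.1, 0.6, 0.1, 0.1, 0.1],
                               [0.2, 0.3, 0.3, 0.1, 0.1])
print(intent, fused)  # -> 1, the category with the largest summed score
```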

Description

Technical field

[0001] The present invention relates to the field of human-machine co-driving, and in particular to a driver intention recognition method that considers human-vehicle-road characteristics.

Background technique

[0002] With the development of transportation, people pay increasing attention to driving safety. Studies have shown that most traffic accidents are caused by improper driver operation, and driving safety has long been a focus of regulators and car manufacturers. Passive safety systems developed in the past can no longer meet current needs, while the widespread adoption of advanced driver assistance systems has made travel more secure. If the driver's intention can be identified in advance, advanced driver assistance systems will become more intelligent, better warning the driver of potential dangers during driving and further enhancing active vehicle safety.

[0003] Although the field of unmanned driving is develop...


Application Information

Patent Type & Authority: Application (China)
IPC (8): B60W40/08; B60W40/09; B60W50/00
CPC: B60W40/08; B60W40/09; B60W50/00; B60W2050/0029; B60W2050/0033; B60W2520/10; B60W2540/18; B60W2520/06; B60W2554/4042; B60W2554/80
Inventors: 陈慧勤, 陈海龙, 刘昊, 陈勇
Owner: HANGZHOU DIANZI UNIV