
Driver vision fusion method for automatic driving trajectory tracking

A trajectory tracking and automatic driving technology, applied in the fields of instruments, computing, and character and pattern recognition.

Active Publication Date: 2020-10-20
JILIN UNIV

AI Technical Summary

Problems solved by technology

Although these studies can obtain models whose control accuracy is close to, or even better than, that of real drivers, they lack further research on human-vehicle-road interaction and on more human-like control effects. Therefore, it is necessary to consider the driver's visual cognition characteristics.

Method used



Examples


Embodiment Construction

[0051] The present invention will be further described in detail below in conjunction with the accompanying drawings, so that those skilled in the art can implement it with reference to the description.

[0052] As shown in Figure 1, the present invention provides a driver vision fusion method for automatic driving trajectory tracking, comprising the following implementation process:

[0053] Step 1, experimental design and data collection.

[0054] Experimental equipment: a glasses-type eye tracker, to obtain the driver's gaze point (in the image coordinate system) and the image from the driver's perspective; a driving recorder, to eliminate image shake; an experimental vehicle with an open low-level CAN protocol, from which driving manipulation behavior information and vehicle kinematics and dynamics information can be collected in real time; and a CAN bus signal collector, to record CAN signals and collect experimental monitoring data.
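A practical requirement of this setup is that the eye-tracker gaze samples and the CAN vehicle signals share a common time base, so that each fixation can later be related to the vehicle state. The sketch below shows one way to pair the two streams by nearest timestamp; the record layouts (`GazeSample`, `CanSample`) and the matching tolerance are illustrative assumptions, not part of the patent.

```python
import bisect
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class GazeSample:
    t: float   # timestamp (s), assumed synchronized with the CAN clock
    u: float   # gaze point x in image coordinates (pixels)
    v: float   # gaze point y in image coordinates (pixels)

@dataclass
class CanSample:
    t: float            # timestamp (s)
    speed: float        # vehicle speed (m/s)
    steer_angle: float  # steering wheel angle (rad)

def match_gaze_to_can(gaze: List[GazeSample],
                      can: List[CanSample],
                      max_dt: float = 0.05) -> List[Tuple[GazeSample, CanSample]]:
    """Pair each gaze sample with the nearest-in-time CAN sample.

    Gaze samples with no CAN record within max_dt seconds are dropped.
    The CAN list is assumed sorted by timestamp.
    """
    can_times = [c.t for c in can]
    pairs = []
    for g in gaze:
        i = bisect.bisect_left(can_times, g.t)
        # candidates: the CAN samples just before and just after the gaze timestamp
        candidates = [c for c in (can[i - 1] if i > 0 else None,
                                  can[i] if i < len(can) else None) if c is not None]
        best = min(candidates, key=lambda c: abs(c.t - g.t), default=None)
        if best is not None and abs(best.t - g.t) <= max_dt:
            pairs.append((g, best))
    return pairs
```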

[0055] Since the glasses-type eye...



Abstract

The invention discloses a driver vision fusion method for automatic driving trajectory tracking. The method comprises the following steps: 1, acquiring a front-view image and eye movement data from the driver's view angle; 2, separating driver fixation points from the front-view image of the driver's view angle; 3, after the driver fixation points are positioned in a fixed coordinate system, obtaining the fixed coordinates of the driver's effective fixation points; 4, determining the normal distribution characteristics of the fixed coordinates of the driver's effective fixation points, and determining a driver preview point according to the fitting parameters of the normal distribution of the fixation points; 5, converting the driver preview point into a ground coordinate system to obtain the ground coordinates of the preview point, calculating the forward-looking preview time corresponding to the preview point according to the ground coordinates of the preview point, and obtaining a forward-looking preview time probability density graph according to the vehicle speed, the probability density of the preview point and the forward-looking preview time corresponding to the preview point; and 6, correcting a predictive control driver model according to the forward-looking preview time probability density graph.
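Steps 4 and 5 amount to fitting a normal distribution to the effective fixation coordinates, deriving a preview point from the fitted parameters, and converting the preview point's longitudinal ground distance into a forward-looking preview time through the vehicle speed (roughly T = d / v). The following is a minimal numerical sketch of that arithmetic using NumPy/SciPy; treating only the longitudinal coordinate and using the fitted mean as the preview point are simplifying assumptions for illustration, not the patent's exact procedure.

```python
import numpy as np
from scipy.stats import norm

def preview_time_density(preview_dist_m: np.ndarray,
                         speed_mps: np.ndarray,
                         grid: np.ndarray = None):
    """Estimate the forward-looking preview time distribution.

    preview_dist_m : longitudinal ground distance from vehicle to each
                     effective fixation (preview) point, in metres
    speed_mps      : vehicle speed recorded at each fixation, in m/s
    grid           : preview-time values (s) at which the density is evaluated
    """
    if grid is None:
        grid = np.linspace(0.0, 5.0, 501)

    # Step 4 (simplified): fit a normal distribution to the preview distances;
    # the fitted mean serves as the representative driver preview point.
    mu_d, sigma_d = norm.fit(preview_dist_m)

    # Step 5: convert each preview distance into a preview time T = d / v,
    # then evaluate the preview-time probability density on the grid.
    t_preview = preview_dist_m / np.maximum(speed_mps, 1e-3)  # avoid divide-by-zero
    mu_t, sigma_t = norm.fit(t_preview)
    density = norm.pdf(grid, loc=mu_t, scale=sigma_t)

    return mu_d, (mu_t, sigma_t), grid, density

# Illustrative synthetic data (not experimental results from the patent):
rng = np.random.default_rng(0)
d = rng.normal(25.0, 5.0, size=500)   # preview distances (m)
v = rng.normal(15.0, 1.0, size=500)   # vehicle speeds (m/s), ~54 km/h
mean_dist, (mu_t, sigma_t), grid, pdf = preview_time_density(d, v)
print(f"mean preview distance ≈ {mean_dist:.1f} m, "
      f"preview time ≈ {mu_t:.2f} ± {sigma_t:.2f} s")
```

The resulting density over preview times is what step 6 would use to correct the predictive control driver model.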

Description

Technical Field

[0001] The invention belongs to the technical field of automatic driving, and in particular relates to a driver vision fusion method for automatic driving trajectory tracking.

Background

[0002] As the core decision-making and control unit in the human-vehicle-road closed-loop system, the driver is of great significance to research aimed at understanding human-vehicle-road interaction and at optimizing the overall system. An effective research method is to establish a driver model, that is, to abstract the actual driver's process of manipulating the car into a mathematical expression.

[0003] Research on driver models has been carried out for more than half a century and has been applied in many areas. In the vehicle design process, introducing a driver model makes it possible to objectively, accurately, comprehensively and deeply evaluate the control performance of the overall human-vehicle-road closed-loop...

Claims


Application Information

IPC(8): G06K9/00, G06K9/62
CPC: G06V20/56, G06F18/22, G06F18/25
Inventor: 胡宏宇, 程铭, 盛愈欢
Owner: JILIN UNIV