
Driver action recognition method and device based on three-dimensional convolutional neural network

A neural network and three-dimensional convolution technology, applied in the field of rail transit, addressing problems such as inaccurate judgment results and susceptibility to occlusion or environmental interference.

Active Publication Date: 2019-12-06
TRAFFIC CONTROL TECH CO LTD

AI Technical Summary

Problems solved by technology

[0005] The embodiments of the present invention provide a driver action recognition method and device based on a three-dimensional convolutional neural network. They address the problem that prior-art methods, which mostly judge from images of the driver's face, are easily affected by occlusion or the environment, leading to inaccurate judgment results.




Embodiment Construction

[0032] In order to make the purpose, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be described clearly and completely below in conjunction with the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by persons of ordinary skill in the art without creative effort fall within the protection scope of the present invention.

[0033] Figure 1 is a schematic flowchart of the driver action recognition method based on a three-dimensional convolutional neural network provided in this embodiment. Referring to Figure 1, the method includes the following steps:

[0034] 101: Obtain video of the driver captured while the train is running;

[0035] 102: Extract ...
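The step list above is truncated in this extract. As a rough, hypothetical illustration of steps 101 and 102, the sketch below reads a cab video and computes dense optical flow between consecutive frames, since the abstract states that the feature data contains optical flow features reflecting time-dependent changes of the driver's actions. The file name, frame size, and the use of OpenCV's Farneback flow are assumptions for illustration, not details taken from the patent.

```python
# Rough illustration only: one way to realize steps 101-102, assuming the
# feature data includes dense optical flow between consecutive frames.
# The video path and frame size below are placeholder assumptions.
import cv2
import numpy as np

def extract_flow_features(video_path, size=(112, 112)):
    """Read a cab video and return per-frame Farneback optical flow fields."""
    cap = cv2.VideoCapture(video_path)
    flows = []
    prev_gray = None
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(cv2.resize(frame, size), cv2.COLOR_BGR2GRAY)
        if prev_gray is not None:
            # Dense optical flow captures the time-dependent change of the
            # driver's motion between consecutive frames.
            flow = cv2.calcOpticalFlowFarneback(
                prev_gray, gray, None, 0.5, 3, 15, 3, 5, 1.2, 0)
            flows.append(flow)  # shape: (H, W, 2) -> horizontal/vertical motion
        prev_gray = gray
    cap.release()
    return np.stack(flows) if flows else np.empty((0, size[1], size[0], 2))

# flows = extract_flow_features("cab_camera.mp4")  # hypothetical file name
```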



Abstract

The embodiments of the invention provide a driver action recognition method and device based on a three-dimensional convolutional neural network. Feature data is extracted, through preset feature engineering, from video in which the driver is filmed, and a target model recognizes the driver's behavior from the feature data. The target model is obtained by training a constructed three-dimensional convolutional neural network; the network comprises a plurality of combined layer structures connected in sequence, and each combined layer structure comprises a convolutional layer and a pooling layer. By improving the structure of the three-dimensional convolutional neural network, the trained target model recognizes the driver's actions more accurately. In addition, compared with facial feature acquisition, capturing the driver's actions is less easily disturbed by the environment; the feature data contains optical flow features reflecting time-dependent changes of the driver's actions, and this action-continuity data further improves the accuracy of the recognition result.
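The abstract describes only the high-level structure: several combined layer structures connected in sequence, each containing a convolutional layer and a pooling layer. The sketch below is one plausible PyTorch realization of that idea; the number of combined layers, channel widths, kernel sizes, input channels, and number of action classes are illustrative assumptions, not values disclosed in this extract.

```python
# Hypothetical sketch of the "combined layer" idea: a 3D CNN built from
# sequential blocks, each containing one 3D convolution and one 3D pooling
# layer. All hyperparameters here are illustrative assumptions.
import torch
import torch.nn as nn

class CombinedLayer(nn.Module):
    """One combined layer structure: 3D convolution followed by 3D pooling."""
    def __init__(self, in_channels, out_channels):
        super().__init__()
        self.conv = nn.Conv3d(in_channels, out_channels, kernel_size=3, padding=1)
        self.relu = nn.ReLU(inplace=True)
        self.pool = nn.MaxPool3d(kernel_size=2)

    def forward(self, x):
        return self.pool(self.relu(self.conv(x)))

class DriverAction3DCNN(nn.Module):
    """Several combined layers connected in sequence, then a classifier head."""
    def __init__(self, in_channels=5, num_classes=4):
        # in_channels=5 could hold grayscale plus optical-flow channels per
        # frame (an assumption; the patent's exact feature channels are not
        # shown in this extract).
        super().__init__()
        self.features = nn.Sequential(
            CombinedLayer(in_channels, 32),
            CombinedLayer(32, 64),
            CombinedLayer(64, 128),
        )
        self.classifier = nn.Sequential(
            nn.AdaptiveAvgPool3d(1),   # collapse remaining time/space dims
            nn.Flatten(),
            nn.Linear(128, num_classes),
        )

    def forward(self, x):
        # x: (batch, channels, frames, height, width)
        return self.classifier(self.features(x))

# Example: a batch of 2 clips, 16 frames of 112x112 feature maps.
if __name__ == "__main__":
    model = DriverAction3DCNN()
    clips = torch.randn(2, 5, 16, 112, 112)
    print(model(clips).shape)  # torch.Size([2, 4])
```

Stacking convolution and pooling inside each combined block progressively shrinks the temporal and spatial dimensions, so the final linear layer operates on a compact clip-level descriptor of the driver's motion.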

Description

Technical Field

[0001] The present invention relates to the technical field of rail transit, and in particular to a driver action recognition method and device based on a three-dimensional convolutional neural network.

Background

[0002] In ensuring the driving safety of urban rail transit, drivers shoulder important responsibilities; their accurate actions and clear awareness often determine the safety of passenger transportation. Small driver crews, monotonous driving actions, and the high automation of train driving are important causes of driver fatigue. At the same time, the driver's personal living habits, workload, and working hours also affect whether the driver becomes fatigued. Some traditional methods alleviate the fatigue of train drivers by improving the management system and work schedule. On-board "anti-dozing" equipment also reduces driver fatigue to a certain extent, but habitual movements are not sensitive to the "anti-dozing" ...


Application Information

IPC (8): G06K9/00, G06N3/04, G06N3/08
CPC: G06N3/08, G06V40/20, G06V40/172, G06V20/46, G06V20/52, G06N3/045
Inventors: 罗铭, 肖骁
Owner: TRAFFIC CONTROL TECH CO LTD