Two-stream neural network-based human body image action identification method

A neural network and action recognition technology, applied in the field of computer vision, that achieves improved recognition accuracy

Status: Inactive; Publication Date: 2018-03-30
SUN YAT SEN UNIV


Problems solved by technology

[0008] Aiming at the technical defect that the prior art cannot extract enough temporal information for action recognition, the present invention provides a human body image action recognition method based on a dual-stream neural network, which can extract information at different granularities and thereby improve recognition accuracy.



Examples


Embodiment 1

[0041] The invention relates to a human body image action recognition method based on a dual-stream neural network, comprising the following steps:

[0042] S1. Construct a temporal neural network and a spatial neural network (a minimal construction sketch follows);
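A minimal sketch, assuming PyTorch and purely illustrative layer sizes, class count, and flow-stack depth (the excerpt does not disclose the actual architectures): the spatial stream takes a single RGB frame, the temporal stream a stack of 2-channel optical flow maps.

import torch
import torch.nn as nn

def make_stream(in_channels: int, num_classes: int) -> nn.Sequential:
    # One stream: a deliberately small CNN ending in per-class scores.
    return nn.Sequential(
        nn.Conv2d(in_channels, 32, kernel_size=3, padding=1), nn.ReLU(),
        nn.MaxPool2d(2),
        nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
        nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        nn.Linear(64, num_classes),
    )

NUM_CLASSES = 10                                # hypothetical action-class count
spatial_net = make_stream(3, NUM_CLASSES)       # input: one RGB frame (3 channels)
temporal_net = make_stream(2 * 5, NUM_CLASSES)  # input: 5 stacked (dx, dy) flow maps

rgb = torch.randn(1, 3, 224, 224)               # dummy RGB input
flow = torch.randn(1, 10, 224, 224)             # dummy stacked-flow input
print(spatial_net(rgb).shape, temporal_net(flow).shape)  # both: torch.Size([1, 10])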

[0043] S2. Prepare a sufficient number of training videos for the temporal and spatial neural networks, then extract information from the training videos to train both networks. As shown in Figure 1, the steps for extracting the information are as follows (a code sketch of one possible reading appears after the list):

[0044] S21. Set the number of video frame segments to k, with an initial value of k = 1;

[0045] S22. Divide the video frames of the training video into 3 segments, then collect the RGB information and optical flow map information of multiple video frames from each segment, respectively;

[0046] S23. Let k = k + 1 and repeat the processing of step S22: divide each video frame segment into 2 sections again, then collect RGB information and optical flow map information…
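A sketch of one possible reading of steps S21-S23: split the frame-index range into 3 coarse segments, then split each into 2 finer ones, and pick a representative frame per segment at both granularities. Sampling the segment midpoint is an assumption; the excerpt only says that RGB and optical flow information is collected per segment.

from typing import List, Tuple

def split(lo: int, hi: int, parts: int) -> List[Tuple[int, int]]:
    # Split the half-open frame-index range [lo, hi) into near-equal pieces.
    bounds = [lo + round(i * (hi - lo) / parts) for i in range(parts + 1)]
    return list(zip(bounds[:-1], bounds[1:]))

def sample_indices(num_frames: int) -> List[int]:
    coarse = split(0, num_frames, 3)                          # S22: 3 segments
    fine = [sub for seg in coarse for sub in split(*seg, 2)]  # S23: 2 sub-segments each
    return [(a + b) // 2 for a, b in coarse + fine]           # midpoint of every segment

print(sample_indices(90))  # -> [15, 45, 75, 7, 22, 37, 52, 67, 82]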



Abstract

The method provided by the invention can extract RGB information and optical flow map information at different granularities. For the same video, the method can extract more video information for training, so that, compared with a conventional model, long and complex actions are processed better; and for RGB human body action recognition as a whole, the method further improves recognition accuracy.
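The abstract does not spell out how the two streams' outputs are combined. A hedged sketch of the classic two-stream fusion rule from the literature, averaging softmax scores over segments and then over streams, reusing the hypothetical spatial_net and temporal_net from the earlier sketch; this fusion rule is an assumption, not a claim about the patent's exact method.

import torch
import torch.nn.functional as F

def fuse_predictions(rgb_clips, flow_clips, spatial_net, temporal_net):
    # Average class probabilities over all sampled segments, then over
    # the two streams.
    spatial = torch.stack([F.softmax(spatial_net(c), dim=1) for c in rgb_clips]).mean(0)
    temporal = torch.stack([F.softmax(temporal_net(c), dim=1) for c in flow_clips]).mean(0)
    return (spatial + temporal) / 2  # shape: (batch, num_classes)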

Description

technical field

[0001] The present invention relates to the technical field of computer vision, and more specifically, to a human body image action recognition method based on a dual-stream neural network.

Background technique

[0002] Image recognition has long been a popular research field in computer vision, and RGB human body image action recognition has remained a challenging research topic, owing to factors such as easy overfitting and the scarcity of representative datasets available for training models.

[0003] Since the recognition accuracy achievable from a single RGB image has been difficult to improve, [1] proposed a new neural network model for recognition. The model consists of two neural networks: the first is a spatial neural network whose input is a traditional single RGB image; the second is a temporal neural network whose input is the optical flow map corresponding to the RGB images of the first network, where the optical flow map is synthesized from two adjacent RGB images. By calc…
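Since the background describes the optical flow map as synthesized from two adjacent RGB frames, here is a minimal sketch of computing such a map, assuming OpenCV's Farneback method (the excerpt does not name a specific flow algorithm):

import cv2
import numpy as np

def flow_map(prev_bgr: np.ndarray, next_bgr: np.ndarray) -> np.ndarray:
    # Dense optical flow between two adjacent frames (Farneback method).
    prev_gray = cv2.cvtColor(prev_bgr, cv2.COLOR_BGR2GRAY)
    next_gray = cv2.cvtColor(next_bgr, cv2.COLOR_BGR2GRAY)
    # The result is an (H, W, 2) array: per-pixel horizontal and vertical displacement.
    return cv2.calcOpticalFlowFarneback(
        prev_gray, next_gray, None,
        pyr_scale=0.5, levels=3, winsize=15,
        iterations=3, poly_n=5, poly_sigma=1.2, flags=0)

Reading consecutive frames with cv2.VideoCapture and stacking several such (H, W, 2) maps would produce the temporal stream's input.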


Application Information

IPC(8): G06N3/04, G06N3/08, G06K9/00
CPC: G06N3/04, G06N3/049, G06N3/08, G06V40/23
Inventor: 吴昊宣 (Wu Haoxuan), 吴贺俊 (Wu Hejun)
Owner: SUN YAT SEN UNIV