
Single frame representation model based human activity identification method

A human activity recognition technology applied in the field of activity recognition, which addresses problems such as the inability to accurately recognize group activities, long computation time, and low recognition accuracy.

Inactive Publication Date: 2018-09-14
SHENZHEN WEITESHI TECH

AI Technical Summary

Problems solved by technology

[0004] In view of the low accuracy of past human activity recognition, the long computation time required for recognition, and the inability to accurately recognize group activities, the purpose of the present invention is to provide a human activity recognition method based on a single-frame representation model. First, an optical flow image is generated for each video frame of the input video. Then, all video frames and their corresponding optical flow images are input into a single-frame representation model to generate a model representation. Next, a long short-term memory (LSTM) model generates a predicted final activity label from that model representation. Finally, a fully connected layer with a Softmax activation function determines the final activity label.
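The final classification step described above (a fully connected layer followed by Softmax over the pooled representation) can be sketched as follows. This is a minimal NumPy illustration, not the patent's implementation; the representation size, label set, and random weights are all hypothetical.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax: subtract the max before exponentiating.
    e = np.exp(z - z.max())
    return e / e.sum()

def predict_label(representation, W, b, labels):
    """Fully connected layer + Softmax over a model representation vector."""
    probs = softmax(W @ representation + b)
    return labels[int(np.argmax(probs))], probs

rng = np.random.default_rng(0)
labels = ["walking", "running", "group_talking"]   # hypothetical activity labels
rep = rng.standard_normal(8)                       # hypothetical 8-d representation
W, b = rng.standard_normal((3, 8)), np.zeros(3)    # untrained illustrative weights
label, probs = predict_label(rep, W, b, labels)
```

In practice `W` and `b` would be learned jointly with the rest of the network; the argmax over Softmax probabilities yields the predicted activity label.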




Embodiment Construction

[0027] It should be noted that, where no conflict arises, the embodiments of the present application and the features in those embodiments may be combined with one another. The present invention is further described in detail below in conjunction with the drawings and specific embodiments.

[0028] Figure 1 is a system flowchart of the human activity recognition method based on a single-frame representation model of the present invention. It mainly comprises preprocessing, the single-frame representation model, the activity recognition model, and model optimization and training.

[0029] Here, preprocessing refers to taking as input the original frames (which contain environment information) and their corresponding optical flow images (which provide motion information): the video frame at time t and the video frame at time t-1 are input to FlowNet 2.0 to compute the optical flow, because FlowNet 2.0 has the best performance in generating opti...
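The frame-pairing step of this preprocessing (feeding each frame together with its predecessor to an optical-flow estimator) can be sketched as below. A learned flow network such as FlowNet 2.0 is far beyond a short example, so `crude_motion_map` stands in with a simple absolute temporal difference; it is an illustrative placeholder, not the patent's flow computation.

```python
import numpy as np

def frame_pairs(frames):
    """Yield (frame at t-1, frame at t) pairs to feed an optical-flow estimator."""
    for t in range(1, len(frames)):
        yield frames[t - 1], frames[t]

def crude_motion_map(prev, cur):
    # Absolute per-pixel temporal difference: a crude stand-in for a
    # learned optical-flow estimate (the patent uses a flow network here).
    return np.abs(cur.astype(np.float32) - prev.astype(np.float32))

# Three tiny synthetic 4x4 frames with uniform intensities 0, 1, 2.
frames = [np.full((4, 4), t, dtype=np.uint8) for t in range(3)]
flows = [crude_motion_map(p, c) for p, c in frame_pairs(frames)]
```

Note that N input frames yield N-1 motion images, which is why the flow for the first frame is typically duplicated or omitted in practice.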



Abstract

The invention provides a human activity identification method based on a single-frame representation model. The method mainly comprises preprocessing, a single-frame representation model, an activity identification model, and model optimization and training. The process is as follows: first, an optical flow image is generated for each frame of the input video; then, all video frames and the corresponding optical flow images are input into a single-frame representation model to generate a model representation; next, a long short-term memory model generates a predicted final activity label from the model representation produced in the previous step; finally, the final activity label is determined by a fully connected layer with a Softmax activation function. The method solves the problems that previous human activity identification had low precision, required long computation time, and could not accurately identify group activities: it can identify group activities, its identification precision is relatively high, and the computation time required is low.
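The temporal step of the pipeline, an LSTM consuming per-frame representations in order, can be sketched with a single hand-written LSTM cell. This is a minimal NumPy sketch with illustrative sizes and untrained weights; the patent does not specify these dimensions.

```python
import numpy as np

def lstm_step(x, h, c, Wx, Wh, b):
    """One LSTM step over a frame representation x; gates stacked as i, f, g, o."""
    z = Wx @ x + Wh @ h + b
    H = h.size
    i = 1.0 / (1.0 + np.exp(-z[:H]))        # input gate
    f = 1.0 / (1.0 + np.exp(-z[H:2*H]))     # forget gate
    g = np.tanh(z[2*H:3*H])                 # candidate cell state
    o = 1.0 / (1.0 + np.exp(-z[3*H:]))      # output gate
    c = f * c + i * g
    h = o * np.tanh(c)
    return h, c

rng = np.random.default_rng(1)
D, H = 6, 4                                 # hypothetical input / hidden sizes
Wx = rng.standard_normal((4 * H, D)) * 0.1  # untrained illustrative weights
Wh = rng.standard_normal((4 * H, H)) * 0.1
b = np.zeros(4 * H)

h, c = np.zeros(H), np.zeros(H)
for x in rng.standard_normal((5, D)):       # 5 per-frame representations
    h, c = lstm_step(x, h, c, Wx, Wh, b)
```

The final hidden state `h` summarizes the whole frame sequence and would be passed to the fully connected Softmax layer to predict the activity label.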

Description

Technical Field

[0001] The invention relates to the field of activity recognition, and in particular to a human activity recognition method based on a single-frame representation model.

Background Technique

[0002] With advances in embedded system design, powerful cameras have been embedded in a wide variety of smart devices, and wireless cameras can be easily deployed at street corners, traffic lights, large stadiums, railway stations, and the like. The resulting large volume of video has attracted many researchers to the study of human activity recognition. Human activity recognition can be applied to law-enforcement tasks responsible for monitoring large-scale crowd activities: through rapid recognition and analysis of human activity in the large number of videos captured by street surveillance, suspicious or criminal behavior can be quickly identified. Similarly, television broadcasts could automatically identify and broadcast the highlights of a game rather than the en...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (IPC8): G06K9/00; G06K9/62; G06N3/04
CPC: G06V20/42; G06V20/46; G06N3/045; G06F18/214
Inventor: 夏春秋
Owner: SHENZHEN WEITESHI TECH