First-person perspective video interaction behavior identification method based on interaction modeling

A first-person perspective video interaction technology, applied in the fields of character and pattern recognition, neural learning methods, and biological neural network models; it addresses the problem that existing models cannot describe interactive behavior well.

Pending Publication Date: 2020-06-05
SUN YAT SEN UNIV

Problems solved by technology

[0003] The main disadvantage of the above prior art is that it does not explicitly model the interaction between the camera wearer and the interactor. Existing technologies usually learn the overall characteristics of the interaction behavior directly, but a first-person perspective interaction involves two parties, the camera wearer and the interactor, whose individual features and mutual relationship are not separated and modeled.

Method used

the structure of the environmentally friendly knitted fabric provided by the present invention; figure 2 Flow chart of the yarn wrapping machine for environmentally friendly knitted fabrics and storage devices; image 3 Is the parameter map of the yarn covering machine
View more


Examples


Embodiment

[0059] The problem addressed by the present invention is as follows: given a video clip, an intelligent video analysis system must identify the behavior category of the people in the video. In a system based on a wearable device, the camera is worn by a person and the video is captured from the first-person perspective, so the system must recognize the type of interaction between the wearer and the other people in view. Current first-person perspective interaction recognition methods mainly follow the approach of third-person behavior recognition: they learn features directly from the video's overall static appearance and dynamic motion information, without explicitly separating the camera wearer from the interactors and modeling the relationship between them. The present invention focuses on the first-person perspective interactive behavior recognition problem...
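The separation idea described above can be illustrated with a minimal sketch. This is not the patent's actual network; it is a hypothetical PyTorch module (the class name `AttentionSeparator` and all layer sizes are assumptions) showing how a soft spatial attention mask can split a frame's feature map into interactor ("foreground") features and remaining egocentric context:

```python
import torch
import torch.nn as nn

class AttentionSeparator(nn.Module):
    """Hypothetical sketch: a 1x1-conv attention head produces a soft
    spatial mask over a convolutional feature map; masked pooling yields
    interactor features, and the complement yields the remaining
    (wearer-side) context. Layer sizes are illustrative only."""

    def __init__(self, channels=64):
        super().__init__()
        # One attention logit per spatial location.
        self.att = nn.Conv2d(channels, 1, kernel_size=1)

    def forward(self, fmap):
        # fmap: (batch, channels, H, W) feature map of one video frame
        mask = torch.sigmoid(self.att(fmap))          # soft mask in [0, 1]
        fg = (fmap * mask).mean(dim=(2, 3))           # interactor features
        bg = (fmap * (1.0 - mask)).mean(dim=(2, 3))   # remaining context
        return fg, bg, mask
```

In the patent's full method, the abstract states that the learning of such an attention mask is additionally supervised by a human body parsing model, which this sketch omits.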


Abstract

The invention discloses a first-person perspective video interaction behavior identification method based on interaction modeling. It proposes to separate the camera wearer from the interactor, learn the corresponding static appearance and dynamic motion features for each, and then explicitly model the interaction relationship between them. To separate the interactor from the background, a mask is generated by an attention model, whose learning is assisted by a human body parsing model. A motion module is proposed to predict the motion information matrices corresponding to the camera wearer and the interactor respectively, and its learning is assisted by reconstructing the next frame. Finally, a dual long short-term memory (LSTM) module for interaction modeling is proposed, on the basis of which the interaction relationship is explicitly modeled. The method describes and recognizes first-person perspective interactive behavior well and achieves state-of-the-art recognition results on common first-person perspective interaction behavior research datasets.

Description

technical field

[0001] The invention belongs to the technical field of behavior recognition, and in particular relates to a first-person perspective video interactive behavior recognition method based on interaction modeling.

Background technique

[0002] At present, the main first-person perspective behavior recognition methods fall into two categories. The first uses manually designed motion features, such as motion trajectories and optical flow, combined with traditional classifiers such as support vector machines. The second uses deep learning for feature learning, typically adopting a model similar to third-person video action recognition that uses convolutional neural networks and long short-term memory models to learn behavioral features directly from video frames.

[0003] A major shortcoming of the prior art described above is that the interaction relationship between the camera wearer and the interactor is not explicitly modeled. Existing technologi...
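The prior-art baseline in paragraph [0002], a per-frame CNN feeding a single LSTM that learns one holistic behavior feature with no wearer/interactor separation, can be sketched as follows. All layer sizes and the class name `CNNLSTMBaseline` are illustrative assumptions, not a specific published network:

```python
import torch
import torch.nn as nn

class CNNLSTMBaseline(nn.Module):
    """Sketch of the prior-art scheme: a small per-frame CNN feeds a
    single LSTM, which classifies the whole clip from one holistic
    feature sequence (no explicit interaction modeling)."""

    def __init__(self, num_classes=8):
        super().__init__()
        # Tiny illustrative CNN: conv + global average pool per frame.
        self.cnn = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten())
        self.lstm = nn.LSTM(16, 32, batch_first=True)
        self.fc = nn.Linear(32, num_classes)

    def forward(self, frames):
        # frames: (batch, time, 3, H, W) RGB clip
        b, t = frames.shape[:2]
        feats = self.cnn(frames.flatten(0, 1)).view(b, t, -1)
        _, (h, _) = self.lstm(feats)     # final hidden state of the clip
        return self.fc(h[-1])
```

The patent's criticism in [0003] is precisely that such a single holistic stream cannot separate the camera wearer's motion from the interactor's.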


Application Information

IPC(8): G06K9/00, G06K9/62, G06N3/04, G06N3/08
CPC: G06N3/08, G06V40/20, G06V20/41, G06N3/045, G06F18/2411, Y02T10/40
Inventor: 郑伟诗, 蔡祎俊, 李昊昕, 陈立
Owner: SUN YAT SEN UNIV