
Human body action detection and positioning method based on space-time combination

A human-motion detection and positioning technology, applied in character and pattern recognition, instruments, computer components, etc.; it addresses the problem that the detection results of existing methods still need improvement, and achieves the effect of coping with large differences in video length

Pending Publication Date: 2019-05-21
CHINA UNIV OF PETROLEUM (EAST CHINA)

AI Technical Summary

Problems solved by technology

[0003] Existing algorithms such as SSN, TAG, and CBR generate and refine candidate regions and then classify them. They exploit temporal structure information to some extent and perform well in simple, specific scenes, but in real engineering scenes, where the human body is heavily occluded, postures vary widely, and many distracting objects are present, the detection results of these methods still need improvement.




Embodiment Construction

[0044] The present invention is described in further detail below in conjunction with the accompanying drawings and a specific embodiment:

[0045] A human action detection and localization method based on space-time combination is shown in Figure 1, a flow chart of the method of the present invention; the method includes:

[0046] S1, data preprocessing: divide an input untrimmed video into K equal-length video sequences, and divide the entire data set into a training set, a validation set, and a test set at a ratio of 7:2:1. The original data come from the streaming-media server of an offshore oil production platform. The monitoring equipment on each offshore platform remains static, with the working platform as the monitored scene; the real-time surveillance video is transmitted over microwave and stored on the streaming-media server.
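
To make step S1 concrete, below is a minimal sketch in Python, assuming each video is handled by its frame count and the data set by a list of video IDs; the helper names `split_into_units` and `split_dataset` are illustrative, not from the patent.

```python
import random

def split_into_units(num_frames: int, k: int):
    """Divide an untrimmed video of num_frames frames into K equal-length
    units, returned as (start, end) frame intervals; any remainder frames
    at the tail are dropped."""
    unit_len = num_frames // k
    return [(i * unit_len, (i + 1) * unit_len) for i in range(k)]

def split_dataset(video_ids, seed=0):
    """Shuffle the data set and split it into training, validation, and
    test subsets at the 7:2:1 ratio described in S1."""
    ids = list(video_ids)
    random.Random(seed).shuffle(ids)
    n_train = int(0.7 * len(ids))
    n_val = int(0.2 * len(ids))
    return ids[:n_train], ids[n_train:n_train + n_val], ids[n_train + n_val:]

# Example: a 3000-frame surveillance clip cut into K = 10 equal units.
units = split_into_units(3000, 10)  # [(0, 300), (300, 600), ..., (2700, 3000)]
```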

[0047] S2, sparse sampling...
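
Based on the abstract, step S2 performs random sparse sampling over the K equal-length units, so that every video is represented by the same number of snippets regardless of its duration; this is also what resolves the large differences in video length. A minimal sketch, assuming one randomly chosen frame per unit (the name `sparse_sample` is illustrative, not from the patent):

```python
import random

def sparse_sample(units, seed=None):
    """Pick one random frame index from each (start, end) unit, so an
    untrimmed video of any length is reduced to K sampled frames."""
    rng = random.Random(seed)
    return [rng.randrange(start, end) for start, end in units]

# The K sampled frames (plus the motion stream around them) would then be
# fed to the two-stream convolutional network for feature extraction.
frame_indices = sparse_sample([(0, 300), (300, 600), (600, 900)], seed=0)
```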



Abstract

The invention discloses a human motion detection and positioning method based on space-time combination. The method takes an untrimmed video as input; through data preprocessing the video is divided into several short units of equal length, sparse random sampling is performed, and spatio-temporal features are extracted by a two-stream convolutional neural network. The features then enter a space-time joint network that judges the intervals in which actions occur, yielding a set of action score waveforms; these waveforms are fed into a GTAG network, where different thresholds are set to meet different positioning-precision requirements and obtain action proposals of different granularities. All action proposals pass through an action classifier to detect the action type, and a completeness filter finely corrects the temporal boundaries of each action, achieving human action detection and positioning in complex scenes. The method can be applied to real scenes with severe occlusion of the human body, highly variable postures, and many distracting objects, and handles activity categories with different temporal structures well.
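
The threshold-based grouping described above can be illustrated with a short sketch: the space-time joint network produces a per-unit action score curve (the "oscillogram"), consecutive units scoring above a threshold are grouped into a proposal interval, and varying the threshold yields proposals of different granularities. This is a simplified stand-in for the GTAG grouping step, assuming unit-level scores in [0, 1]; `group_proposals` is an illustrative name, not from the patent.

```python
def group_proposals(scores, threshold):
    """Group consecutive units whose action score meets `threshold`
    into (start_unit, end_unit) proposal intervals."""
    proposals, start = [], None
    for i, s in enumerate(scores):
        if s >= threshold and start is None:
            start = i                      # a proposal opens here
        elif s < threshold and start is not None:
            proposals.append((start, i))   # the proposal closes
            start = None
    if start is not None:
        proposals.append((start, len(scores)))
    return proposals

scores = [0.1, 0.7, 0.9, 0.8, 0.2, 0.6, 0.65, 0.1]
print(group_proposals(scores, 0.5))   # [(1, 4), (5, 7)]  -- coarser
print(group_proposals(scores, 0.75))  # [(2, 4)]          -- finer
```

A higher threshold keeps only the most confident core of an action (higher positioning precision), while a lower one recovers longer, more complete intervals; the resulting proposals are then passed to the action classifier and the completeness filter.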

Description

technical field

[0001] The invention belongs to the field of computer graphics and image processing, and relates to a human motion detection and positioning method based on space-time combination.

Background technique

[0002] Human motion detection has long been an important research topic in computer vision. The ability to automatically determine the time interval of an action in a video sequence, including its start time, end time, and action type, is crucial to the understanding and application of human behavior. In recent years, owing to the explosive growth of video data and the urgent need for intelligent video processing, temporal action detection methods have received increasing attention. Temporal action detection can be divided into two stages: generating time-interval proposals and classifying those proposals. Current methods mostly focus on the decision among candidate proposals while ignoring the temporal structure...


Application Information

IPC(8): G06K9/00, G06K9/62
Inventors: 宫法明, 马玉辉, 李昕, 袁向兵, 宫文娟, 丁洪金
Owner: CHINA UNIV OF PETROLEUM (EAST CHINA)