Automatic labeling method for human joint based on monocular video

A method for the automatic labeling of human joints, applied in the field of computer vision, addressing problems such as difficult data processing and a large state space

Active Publication Date: 2012-07-25
BEIJING UNIV OF POSTS & TELECOMM


Problems solved by technology

However, human skeleton reconstruction faces problems such as difficult data processing and a large state space.




Detailed Description of the Embodiments

[0031] The technical solutions in the embodiments of the present invention will be described clearly and completely below in conjunction with the accompanying drawings. The described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by persons of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.

[0032] Figure 1 is a flow chart of the method for automatically labeling human joints provided by an embodiment of the present invention. The steps of the method include:

[0033] S101. Foreground detection;

[0034] A camera captures video of human body movement, which is stabilized and denoised. Background modeling is then performed to obtain the foreground region. To reduce the amount of computation in subsequent steps, the foreground region in the origi...
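As an illustration of this step only, the sketch below performs background modeling with OpenCV's MOG2 subtractor, cleans up the foreground mask, and crops the frame to the bounding box of the largest foreground blob as the region of interest. The video filename, parameter values, and the choice of MOG2 are assumptions for the example, not details specified by the patent; OpenCV 4.x is assumed.

```python
import cv2

# Hypothetical input video and parameter values; a sketch of step S101 only.
cap = cv2.VideoCapture("human_motion.mp4")
subtractor = cv2.createBackgroundSubtractorMOG2(history=200, varThreshold=25,
                                                detectShadows=True)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Denoise before background modeling (the patent's anti-shake/denoising
    # step is approximated here by a simple Gaussian blur).
    blurred = cv2.GaussianBlur(frame, (5, 5), 0)
    fg_mask = subtractor.apply(blurred)
    # Drop shadow pixels (MOG2 marks them as 127) and remove small speckles.
    _, fg_mask = cv2.threshold(fg_mask, 200, 255, cv2.THRESH_BINARY)
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    fg_mask = cv2.morphologyEx(fg_mask, cv2.MORPH_OPEN, kernel)
    # Keep only the bounding box of the largest blob as the region of
    # interest, so later steps work on a smaller image.
    contours, _ = cv2.findContours(fg_mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        x, y, w, h = cv2.boundingRect(max(contours, key=cv2.contourArea))
        roi = frame[y:y + h, x:x + w]

cap.release()
```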



Abstract

The invention provides an automatic labeling method for human joints based on monocular video. The method comprises the following steps: detecting the foreground and storing it as a region of interest; locating the human body region, segmenting the body parts, and obtaining the silhouette contour; extracting the human body skeleton and its key points; using the relative positions of the face and hands to roughly estimate the posture of the human body; and automatically labeling the human joint points. During the automatic labeling process, silhouette contour information, skin color information, and skeleton information derived from the human silhouette are used together, which ensures the accuracy of joint point extraction. The method performs accurate and efficient segmentation of human body parts, obtains posture information for each limb, and provides favorable conditions for the subsequent extraction and processing of human body feature vectors.
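Two of the ingredients named in the abstract, a skeleton derived from the silhouette and skin-color cues for locating the face and hands, can be sketched as below. The erosion-based skeletonization and the YCrCb thresholds are common heuristics chosen here for illustration under stated assumptions; the patent does not specify these particular values or algorithms.

```python
import cv2
import numpy as np

def silhouette_skeleton(mask):
    """Erosion-based morphological skeleton of a binary silhouette mask
    (an illustrative stand-in for the skeleton-extraction step)."""
    skel = np.zeros_like(mask)
    kernel = cv2.getStructuringElement(cv2.MORPH_CROSS, (3, 3))
    while cv2.countNonZero(mask) > 0:
        eroded = cv2.erode(mask, kernel)
        opened = cv2.dilate(eroded, kernel)
        skel = cv2.bitwise_or(skel, cv2.subtract(mask, opened))
        mask = eroded
    return skel

def skin_blob_centroids(bgr_roi):
    """Rough skin-color segmentation in YCrCb space; the centroids of the
    largest skin blobs (face, hands) give the relative positions used for a
    coarse posture estimate. Thresholds are assumed, not from the patent."""
    ycrcb = cv2.cvtColor(bgr_roi, cv2.COLOR_BGR2YCrCb)
    mask = cv2.inRange(ycrcb, (0, 133, 77), (255, 173, 127))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    blobs = sorted(contours, key=cv2.contourArea, reverse=True)[:3]
    centroids = []
    for c in blobs:
        m = cv2.moments(c)
        if m["m00"] > 0:
            centroids.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    return centroids
```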

Description

Technical Field

[0001] The invention belongs to the field of computer vision and relates to an automatic initialization method for motion capture, which can be used for human posture estimation and action recognition.

Background Technique

[0002] Research on vision-based markerless motion capture technology began in the 1980s. Markerless motion capture integrates research from computer vision, computer graphics, image processing, human kinematics, and artificial intelligence, and is a highly interdisciplinary and extremely challenging field. Human motion capture technology has strong practical value and can be applied widely, including in intelligent monitoring systems, novel human-computer interaction, medical diagnosis and analysis, film and animation production, game production, virtual reality, content-based video sequence indexing and retrieval, and auxiliary training for athletes. For example, in current human-computer inter...


Application Information

IPC(8): G06K9/00, G06K9/54
Inventors: 顾仁涛, 张俊杰, 纪越峰
Owner: BEIJING UNIV OF POSTS & TELECOMM