Method and system for automatically extracting gesture candidate region in video sequence

A technology for automatically extracting gesture candidate regions from video sequences, applied in digital video image analysis and understanding. It addresses problems of existing approaches such as heavy computation, reliance on assumptions that realistic scenes fail to satisfy, and difficulty achieving good results in practice, with the effect of narrowing the range of gesture candidates, improving detection accuracy, and reducing the possibility of missed detections.

Publication Date: 2012-01-18 (Inactive)
ZHEJIANG UNIV

AI Technical Summary

Problems solved by technology

Most other gesture recognition technologies rely only on skin color information for gesture detection and segmentation, and typically assume that the hand is the only or the largest skin-color region in the entire image. In addition, some researchers use motion information for gesture detection and localization, which likewise assumes that the hand is the only or the largest motion region in the entire image.
However, both of these approaches are effective only in simple application scenarios; real application scenarios are generally more complex and do not satisfy these assumptions, so the desired effect is difficult to achieve.
In some gesture recognition systems based on template matching, there is no separate detection and localization stage: preset gesture templates are used to traverse the entire image to find the best matching position, completing detection, localization, and recognition at the same time. However, this traversal approach is computationally heavy.
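
As a rough illustration of the traversal cost criticized above, the sketch below slides a preset gesture template over a whole frame using OpenCV's generic template matching; the function names and parameters are assumptions for illustration and are not taken from this patent.

```python
# Illustrative sketch (not the patent's method): template matching traverses
# every valid position of the frame, which is why this approach is
# computationally heavy, especially with multiple templates and scales.
import cv2

def best_template_match(frame_gray, template_gray):
    """Return the top-left corner and score of the best matching position."""
    # Normalized cross-correlation evaluated at every valid pixel position.
    result = cv2.matchTemplate(frame_gray, template_gray, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(result)
    return max_loc, max_val
```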


Embodiment Construction

[0046] In order to make the objectives, technical solutions, and advantages of the present invention clearer, the present invention is further described in detail below in conjunction with the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are intended only to explain the present invention, not to limit it.

[0047] The method for automatically extracting gesture candidate regions in a video sequence of the present invention includes: starting a gesture video image acquisition system to collect video images; constructing a reference background image B_t; calculating and generating a motion description image; calculating a motion segmentation threshold and converting the motion description image into a binary motion image BM_t; performing skin color segmentation to obtain a binary skin color image BS_t; performing a point-by-point logical AND operation on the binary motion image BM_t and the binary skin color image BS_t to obtain a binary fused image; and performing connected-region analysis on the binary fused image to select the gesture candidate region.
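
As a minimal sketch of the pipeline outlined in paragraph [0047], the code below builds a running-average reference background, derives the motion description image by frame differencing, thresholds it (Otsu thresholding is used here, whereas the patent computes its own motion segmentation threshold), segments skin color with fixed YCrCb bounds, and fuses the two masks with a point-by-point logical AND. The background model, thresholding choice, and color bounds are assumptions for illustration, not the exact procedures claimed by the invention.

```python
# Hypothetical sketch of the motion + skin-color fusion steps; the parameter
# values and the specific background/threshold models are illustrative only.
import cv2
import numpy as np

def update_background(background, frame_gray, alpha=0.05):
    """Running-average reference background image B_t (one possible model)."""
    return (1.0 - alpha) * background + alpha * frame_gray.astype(np.float32)

def binary_motion_image(background, frame_gray):
    """Motion description image |frame - B_t|, thresholded into BM_t."""
    diff = cv2.absdiff(frame_gray, background.astype(np.uint8))
    _, bm = cv2.threshold(diff, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    return bm

def binary_skin_image(frame_bgr):
    """Skin-color segmentation in YCrCb space into BS_t (textbook bounds)."""
    ycrcb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb)
    lower = np.array([0, 133, 77], dtype=np.uint8)
    upper = np.array([255, 173, 127], dtype=np.uint8)
    return cv2.inRange(ycrcb, lower, upper)

def binary_fused_image(bm, bs):
    """Point-by-point logical AND of BM_t and BS_t."""
    return cv2.bitwise_and(bm, bs)
```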


Abstract

The invention discloses a method for automatically extracting a gesture candidate region in a video sequence. The method comprises the following steps: starting a gesture video image acquisition system to acquire video images; constructing a reference background image; computing a motion description image; computing a motion segmentation threshold and converting the motion description image into a binary motion image; performing skin color segmentation to obtain a binary skin color image; performing a point-by-point logical AND operation on the binary motion image and the binary skin color image to obtain a binary fused image; and performing connected-region analysis on the binary fused image and selecting the gesture candidate region. The method combines motion information with skin color information to find the gesture candidate region; the two kinds of information complement each other, the detection accuracy is improved, the method is practical and effective, and it provides a very good foundation for the segmentation, localization, and recognition of gestures.
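
For the final step described in the abstract, a minimal connected-region analysis could look like the sketch below, which labels the binary fused image and keeps components whose area is plausible for a hand; the area bounds and selection rule are hypothetical, since the abstract does not state the exact criterion.

```python
# Hypothetical connected-region analysis on the binary fused image; the area
# bounds are placeholders, not values specified by the invention.
import cv2

def gesture_candidate_regions(fused, min_area=500, max_area=50000):
    """Return bounding boxes (x, y, w, h) of candidate gesture regions."""
    num, labels, stats, centroids = cv2.connectedComponentsWithStats(
        fused, connectivity=8)
    boxes = []
    for i in range(1, num):  # label 0 is the image background
        x, y, w, h, area = stats[i]
        if min_area <= area <= max_area:
            boxes.append((int(x), int(y), int(w), int(h)))
    return boxes
```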

Description

Technical Field

[0001] The invention belongs to the technical field of intelligent information processing and relates to a method for automatically extracting gesture candidate regions in a video sequence, applied to digital video image analysis and understanding.

Background Technique

[0002] Traditional human-computer interaction methods, such as the mouse, keyboard, and remote control, require humans to adapt to computers and to complete interactive tasks according to preset specifications. In recent years, with the continuous development of technology, the processing power of computers has grown ever stronger, and researchers have begun to study natural human-computer interaction technologies that conform to human communication habits, gradually shifting from computer-centered to human-centered interaction. These studies include speech recognition, face and expression recognition, head movement tracking, gaze tracking, gesture recognition, body gesture recognition, and m...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00; G06K7/00
Inventors: 王维东, 赵亚飞
Owner: ZHEJIANG UNIV