
Gesture recognition method based on vision

A vision-based gesture recognition technology, applied in the field of gesture recognition, intended to solve problems such as interference from lighting changes and skin color, where the recognition effect falls short of expectations.

Inactive Publication Date: 2017-01-11
BRIGHTVISION TECH

Problems solved by technology

[0005] Vision-based gesture recognition methods generally use a web camera as the image-capture device, but such methods are highly susceptible to lighting changes and skin-color interference, so the recognition effect falls short of expectations. With the emergence of the Kinect depth camera, a new direction has opened up for human-computer interaction, and how to realize gesture recognition on the basis of Kinect has become a hot topic of exploration.



Embodiment Construction

[0072] In order to make the object, technical solution and advantages of the present invention clearer, the present invention will be further described in detail below in conjunction with the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only used to explain the present invention, not to limit the present invention.

[0073] The vision-based gesture recognition method of the embodiment of the present invention, as shown in figure 1, specifically includes the following steps:

[0074] Step 1, capture the video stream through the Kinect sensor, and use OpenCV functions to obtain the depth data and bone data. OpenCV (Open Source Computer Vision Library) is a cross-platform computer vision library released under the BSD (open source) license.

[0075] Step 2, preprocess the obtained depth image with the median filtering method.

[0076] Step 3, extract the hand area according to the depth d...
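Based on the abstract, step 3 produces a hand-region binary map from the depth data and bone data. A hedged sketch of one common approach: keep the pixels whose depth lies within a band around the depth of the tracked hand joint. The band width, the joint depth, and the toy frame below are all illustrative assumptions, not parameters from the patent:

```python
import numpy as np

HAND_DEPTH_BAND = 100  # mm on either side of the hand joint (assumed value)

def extract_hand_region(depth_mm, hand_joint_depth_mm, band=HAND_DEPTH_BAND):
    """Return a binary map (0/255) of pixels near the hand joint's depth."""
    near = np.abs(depth_mm.astype(np.int32) - hand_joint_depth_mm) <= band
    valid = depth_mm > 0  # zero marks invalid (dropout) depth pixels
    return np.where(near & valid, 255, 0).astype(np.uint8)

# Toy frame: background at 2000 mm, a 10x10 'hand' patch at 800 mm.
frame = np.full((48, 48), 2000, dtype=np.uint16)
frame[10:20, 10:20] = 800
mask = extract_hand_region(frame, hand_joint_depth_mm=800)
```

Thresholding on depth rather than color is what makes the Kinect-based method robust to the lighting and skin-color interference described in [0005].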



Abstract

The invention discloses a gesture recognition method based on vision, comprising the following steps: step 1, capturing a video stream through a Kinect sensor, and getting depth image data and bone data; step 2, preprocessing the obtained depth image data using a median filtering method; step 3, extracting a hand region according to the depth image data and the bone data to get a hand region binary map; step 4, binarizing the edge image of the hand region binary map through edge detection, and extracting palm contour information; step 5, resetting the palm center according to the hand region binary map; step 6, calculating convex hull vertices and a circumcircle according to the palm contour information; and step 7, matching against a variety of gestures in a gesture library using a classification decision tree, according to the reset palm center and the calculated convex hull vertices and circumcircle, to recognize the gesture.

Description

technical field

[0001] The invention relates to the field of image processing, in particular to a gesture recognition method based on the vision-based acquisition and processing of depth image data.

Background technique

[0002] With the rapid development of computers, the mode of human-computer interaction is constantly changing, and lifelike interactive experiences have become available. Body language is also very important in interaction between people, and using the hands to realize human-computer interaction is more natural and smooth than using media such as keyboards and mice, which is one of the reasons it is becoming mainstream.

[0003] The principle of using the hands to realize human-computer interaction is the machine's interpretation of the hand, such as the shape of the hand, the spatial position of the hand, and the posture of the hand. There are many recognition methods, some of which use supporting equipment, such as acceleration sensors,...

Claims


Application Information

IPC(8): G06K9/00; G06K9/46
CPC: G06V40/28; G06V10/443
Inventor: 施松新, 杨勇
Owner: BRIGHTVISION TECH