Real-time gesture recognition method and device

A gesture recognition technology, applied in character and pattern recognition, instruments, and user/computer interaction input/output. It addresses problems of traditional approaches such as increased recognition-algorithm complexity, the need for template updates, poor versatility, and greater processing difficulty.

Publication date: 2014-04-09 (status: inactive)
SICHUAN COC DISPLAY DEVICES

AI Technical Summary

Problems solved by technology

[0004] However, the traditional gesture recognition algorithm has many shortcomings. First, in the gesture target extraction stage, the image is captured under visible light and is therefore sensitive to illumination, which easily degrades target extraction and gives poor versatility. Moreover, the frame-difference method requires a background template to be trained, which increases the complexity of the recognition algorithm and requires the template to be updated; storing a full frame of data for comparison makes the process more difficult and increases the memory footprint, which is unfavorable for porting to embedded platforms. Second, the splicing of edg...



Examples


Embodiment 1

[0082] Embodiment 1: the specific steps, as shown in Figure 3, are as follows:

[0083] Step 1: Obtain infrared images through the left camera and the right camera of the real-time gesture recognition device;

[0084] Step 2: The processor converts each infrared image into a grayscale image, separates the gesture area from the background area by adaptive threshold processing, and obtains the binarized image of the gesture target through an adaptive image binarization method (see Figure 4); the white area is the target area, i.e. the gesture area, and the black area is the background area, which is updated regularly over time (a rough code sketch of this per-frame flow follows this embodiment);

[0085] Step 3: Predict the coordinate positions of relevant features (such as edges, skeletons, etc.) of each finger in the gesture area through the target-boundary-based prediction and classification algorithm, and separate and classify the feature data of the different fingers (see Figure 5 and Figur...
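As a rough illustration of Steps 1 to 3, the per-frame flow is: capture the two infrared images, binarize each one adaptively, and then extract per-row boundary features of the fingers. The snippet below is only a minimal sketch of that flow, not the patented implementation: it substitutes OpenCV's built-in Otsu thresholding for the adaptive binarization spelled out in Embodiment 2, and the function names binarize and binarize_frame_pair are hypothetical.

```python
# Minimal, illustrative per-frame sketch of Steps 1-3 (not the patented implementation).
import cv2
import numpy as np

def binarize(ir_image: np.ndarray) -> np.ndarray:
    """Step 2: grayscale conversion followed by adaptive binarization
    (OpenCV's Otsu thresholding is used here for brevity)."""
    gray = ir_image if ir_image.ndim == 2 else cv2.cvtColor(ir_image, cv2.COLOR_BGR2GRAY)
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    return binary  # white pixels = gesture area, black pixels = background

def binarize_frame_pair(left_ir: np.ndarray, right_ir: np.ndarray):
    """Step 1 supplies the left and right infrared images; this returns their
    binarized versions. Step 3 (per-finger boundary features) is sketched
    after Step 33 in Embodiment 3."""
    return [binarize(img) for img in (left_ir, right_ir)]
```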

Embodiment 2

[0090] Embodiment 2: in Step 2, the gesture area is separated from the background area and the binarized image of the gesture target is obtained through an adaptive image binarization method. The specific process is as follows (the binarization of the left-camera data and the right-camera data is identical, so the process below applies to both), as shown in Figure 4; a code sketch of these steps is given after Step 25:

[0091] Step 21: Let the input image be f(x, y), the binarized image be p(x, y), and the threshold be T. Calculate the normalized grayscale histogram of the input image and denote it h(i).

[0092] Step 22: Calculate the global gray mean μ_T = Σ_{i=0}^{255} i·h(i), where i = 0, 1, ..., 255;

[0093] Step 23: Calculate the zero-order cumulative moment w(k) and the first-order cumulative moment μ(k) of the histogram:

[0094] w(k) = Σ_{i=0}^{k} h(i)

[0095] μ(k) = Σ_{i=0}^{k} i·h(i)

[0096] Step 24: Calculate the class separation metric σ_B²(k) = [μ_T·w(k) − μ(k)]² / {w(k)·[1 − w(k)]} for k = 0, 1, ..., 255;

[0097] Step 25: Find the value of k that maximizes σ_B²(k), and ...
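Steps 21 to 25 correspond to the classical Otsu thresholding computation. Below is a minimal NumPy sketch of that computation, assuming an 8-bit grayscale input; the function name otsu_threshold is illustrative and not taken from the patent.

```python
import numpy as np

def otsu_threshold(gray: np.ndarray) -> int:
    """Steps 21-25: choose the threshold T that maximizes the class separation metric."""
    # Step 21: normalized grayscale histogram h(i)
    h = np.bincount(gray.ravel(), minlength=256).astype(np.float64)
    h /= h.sum()
    i = np.arange(256)
    # Step 22: global gray mean
    mu_t = float((i * h).sum())
    # Step 23: zero-order w(k) and first-order mu(k) cumulative moments
    w = np.cumsum(h)
    mu = np.cumsum(i * h)
    # Step 24: class separation metric (between-class variance) for every candidate k
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b2 = (mu_t * w - mu) ** 2 / (w * (1.0 - w))
    sigma_b2[~np.isfinite(sigma_b2)] = 0.0
    # Step 25: the k that maximizes the metric is taken as the threshold T
    return int(np.argmax(sigma_b2))

# Example use for the binarized image p(x, y):
# T = otsu_threshold(gray); p = (gray > T).astype(np.uint8) * 255
```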

Embodiment 3

[0100] Embodiment 3: the specific steps of the target-boundary-based prediction and classification algorithm in Step 3 are as follows (a code sketch of these steps is given after Step 33):

[0101] Step 31: When a suspected gesture target is detected, i.e. a white pixel of the binarized image, record the starting point a_i of the gesture target in row i, with coordinate value (xa_i, ya_i);

[0102] Step 32: When the width of the suspected target in the row, i.e. the number of consecutive white pixels, is greater than the threshold p, the gesture target is considered detected rather than a noise point, and the end point b_i of the gesture target in that row, with coordinate value (xb_i, yb_i), is recorded, where p = 10.

[0103] Step 33: Obtain the starting edge (xa_{i-1}, ya_{i-1}) and the terminal edge (xb_{i-1}, yb_{i-1}) of the previous row, i.e. points a and b in Figure 5 (points a and b are also the edges of the target in row i), where ya_{i-1} = yb_{i-1}; from the two points (xa_{i-1}, ya_{i-1}) and (xb_{i-1}, yb_{i-1}), find the midpoint coor...
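Steps 31 to 33 amount to a row-by-row run-length scan of the binarized image: a run of white pixels wider than the threshold p = 10 is accepted as a gesture target, and its start point a_i, end point b_i and midpoint are recorded for use with the next row. The sketch below is a plain illustration of that scan; the function name scan_row_boundaries is hypothetical.

```python
import numpy as np

P_MIN_WIDTH = 10  # threshold p from Step 32

def scan_row_boundaries(binary: np.ndarray):
    """Steps 31-33: record start, end and midpoint of each sufficiently wide
    run of white pixels in every row of the binarized image."""
    rows = []
    for y, row in enumerate(binary):
        runs = []
        x, width = 0, row.shape[0]
        while x < width:
            if row[x] > 0:                         # Step 31: white pixel -> suspected target starts
                start = x
                while x < width and row[x] > 0:
                    x += 1
                end = x - 1                        # last white pixel of the run
                if end - start + 1 > P_MIN_WIDTH:  # Step 32: wide enough -> target, not noise
                    mid_x = (start + end) / 2.0    # Step 33: midpoint of edge points a and b
                    runs.append({"a": (start, y), "b": (end, y), "mid": (mid_x, y)})
            else:
                x += 1
        rows.append(runs)
    return rows
```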



Abstract

The invention relates to the field of somatosensory control, and in particular to a real-time gesture recognition method and device. The method is real-time and accurate: image input and algorithmic image processing are carried out within a single frame, so that rich data such as edges, skeletons, tips and nodes of various two-dimensional or three-dimensional image targets can be obtained, and this data can be sorted in real time into an ordered array. On this basis, accurate real-time tracking and matching of targets, as well as skeleton-feature-based arteriovenous trajectory analysis in medicine, can be further achieved. The real-time gesture recognition device comprises a gesture recognition control device and two degree-of-freedom mechanical rods.

Description

Technical field

[0001] The invention relates to the field of somatosensory control, and in particular to a real-time gesture recognition method and device.

Background technique

[0002] Gesture recognition is an important technology in the field of somatosensory control. Through this technology, users can interact with a computer in the most natural way via a gesture recognition controller, for example creating music by playing with the fingers in the air, performing remote medical surgery, carrying out remote hazardous operations, building 3D models, and other operations.

[0003] The traditional gesture recognition process first extracts the gesture target from a visible-light image by frame differencing or skin-color recognition; it then performs target edge detection and sorts and stitches the edge sequence; finally, a curvature corner algorithm is used to obtain the points of maximum curvature, whose coordinates are fingertip-like,...


Application Information

IPC(8): G06K9/00; G06F3/01
Inventors: 李翔 (Li Xiang), 符赞宣 (Fu Zanxuan), 王付生 (Wang Fusheng)
Owner: SICHUAN COC DISPLAY DEVICES