
Method and device for man-machine interaction

A human-computer interaction technology, applied in the field of human-computer interaction, which addresses the problems that a machine cannot recognize gestures accurately and promptly and cannot distinguish the color of a human hand from the background well

Inactive Publication Date: 2013-09-25
LENOVO (BEIJING) LTD

AI Technical Summary

Problems solved by technology

However, with this method, since the machine cannot distinguish the color of the human hand from the color of some backgrounds well, it cannot recognize gestures accurately and in a timely manner



Examples


Embodiment 1

[0021] An embodiment of the present invention provides a human-computer interaction method, as shown in Figure 1, which includes:

[0022] S101. Obtain the gesture features of the current user by collecting the vein image information of the user's hand.

[0023] Vein recognition is based on the property that hemoglobin in the blood absorbs near-infrared light. A camera sensitive to near-infrared light photographs the hand; the veins appear as shadows in the captured image, which is then digitally processed to extract the feature values of the vein pattern.
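
A minimal sketch of this feature-extraction step, assuming OpenCV and NumPy are available and that the near-infrared image is already a grayscale array; the blur, adaptive threshold, and 16×16 occupancy grid are illustrative stand-ins for the digital processing described here, not the patent's specific algorithm.

```python
# Sketch: derive vein-pattern feature values from a grayscale NIR hand image.
# The filtering and thresholding choices are illustrative, not prescribed.
import cv2
import numpy as np

def extract_vein_features(nir_image: np.ndarray) -> np.ndarray:
    """Return a fixed-length feature vector describing the vein pattern."""
    # Veins absorb near-infrared light, so they appear darker (as shadows)
    # than the surrounding tissue in the captured image.
    blurred = cv2.GaussianBlur(nir_image, (5, 5), 0)
    veins = cv2.adaptiveThreshold(
        blurred, 255, cv2.ADAPTIVE_THRESH_MEAN_C,
        cv2.THRESH_BINARY_INV, 21, 7)

    # Downsample the binary vein map into a coarse occupancy grid; this simple
    # vector stands in for the extracted vein-pattern feature values.
    grid = cv2.resize(veins, (16, 16), interpolation=cv2.INTER_AREA)
    return grid.astype(np.float32).flatten() / 255.0
```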

[0024] Based on this principle, after the user makes a certain gesture, a recognition device equipped with a near-infrared sensitive camera captures the vein image information of the user's hand to obtain the gesture features of the current user.
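
A companion sketch of this capture step, treating the near-infrared sensitive camera as an ordinary OpenCV video device (device index 0 is a placeholder) and reusing the extract_vein_features helper sketched above.

```python
# Sketch: grab one frame from the NIR-sensitive camera after the user makes a
# gesture and convert it into gesture features. Device index 0 is a placeholder.
import cv2

def capture_gesture_features(device_index: int = 0):
    cap = cv2.VideoCapture(device_index)
    try:
        ok, frame = cap.read()
        if not ok:
            raise RuntimeError("could not read a frame from the NIR camera")
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        return extract_vein_features(gray)  # helper from the previous sketch
    finally:
        cap.release()
```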

[0025] The camera with near-infrared sensitivity may be an infrared CCD...

Embodiment 2

[0042] An embodiment of the present invention provides a human-computer interaction method, as shown in Figure 3, which includes:

[0043] S301. Preset an indication command corresponding to a gesture feature of a user, and acquire a correspondence between the preset user gesture feature and an indication command.

[0044] Vein recognition is based on the property that hemoglobin in the blood absorbs near-infrared light. A camera sensitive to near-infrared light photographs the hand; the veins appear as shadows in the captured image, which is then digitally processed to extract the feature values of the vein pattern.

[0045] Based on this principle, the user can make various gestures to interact with the recognition device. For example, the recognition device can preset the indication command corresponding to the gesture feature of an outstretched right palm as the "power on" command of the recognition d...
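
A sketch of how such a preset correspondence between gesture features and indication commands might be stored and looked up, assuming the feature vectors produced by the earlier sketches; the nearest-match threshold and the "power on" entry in the usage comment are illustrative, not values taken from the patent.

```python
# Sketch: preset gesture-feature -> indication-command mapping with a simple
# nearest-match lookup. Threshold and entries are illustrative.
import numpy as np

# Each entry pairs an enrolled feature vector (e.g. the "right palm stretched
# out" gesture) with the indication command it should trigger.
preset_gestures: list[tuple[np.ndarray, str]] = []

def register_gesture(features: np.ndarray, command: str) -> None:
    """Preset an indication command for a gesture feature vector."""
    preset_gestures.append((features, command))

def match_command(current: np.ndarray, max_distance: float = 2.0):
    """Return the command whose enrolled gesture is closest to the current
    features, or None if nothing is within max_distance."""
    best_cmd, best_dist = None, max_distance
    for stored, command in preset_gestures:
        dist = float(np.linalg.norm(stored - current))
        if dist < best_dist:
            best_cmd, best_dist = command, dist
    return best_cmd

# Usage, following the example in the text (names are hypothetical):
#   register_gesture(enrolled_right_palm_features, "power on")
#   command = match_command(capture_gesture_features())
```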

Embodiment 3

[0061] An embodiment of the present invention provides a device 40 for human-computer interaction which, as shown in Figure 4, includes: a setting unit 41, an acquiring unit 42, a determining unit 43, and an executing unit 44.

[0062] The setting unit 41 is configured to preset an indication command corresponding to a user's gesture feature, and obtain a correspondence between the preset user gesture feature and an indication command.

[0063] Vein recognition is based on the property that hemoglobin in the blood absorbs near-infrared light. A camera sensitive to near-infrared light photographs the hand; the veins appear as shadows in the captured image, which is then digitally processed to extract the feature values of the vein pattern.

[0064] Based on this principle, the user can make various gestures to interact with the device 40. For example, the setting unit 41 ...
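
A structural sketch of device 40 decomposed into the four units named above, assuming the capture helper from the earlier sketches; the class layout and matching threshold are illustrative, not the patent's implementation.

```python
# Sketch: device 40 as a class with the four units named in the text.
import numpy as np

class InteractionDevice:
    def __init__(self) -> None:
        # Setting unit 41: stores preset gesture-feature -> command pairs.
        self._presets: list[tuple[np.ndarray, str]] = []

    def set_command(self, gesture_features: np.ndarray, command: str) -> None:
        """Setting unit 41: preset an indication command for a gesture feature."""
        self._presets.append((gesture_features, command))

    def acquire(self) -> np.ndarray:
        """Acquiring unit 42: collect the current user's hand vein image and
        return the resulting gesture features."""
        return capture_gesture_features()  # helper sketched in Embodiment 1

    def determine(self, current: np.ndarray, max_distance: float = 2.0):
        """Determining unit 43: compare the current gesture features with the
        presets and return the corresponding indication command, if any."""
        best_cmd, best_dist = None, max_distance
        for stored, command in self._presets:
            dist = float(np.linalg.norm(stored - current))
            if dist < best_dist:
                best_cmd, best_dist = command, dist
        return best_cmd

    def execute(self, command) -> None:
        """Executing unit 44: carry out the determined indication command."""
        if command is not None:
            print(f"executing command: {command}")  # placeholder action
```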



Abstract

Embodiments of the invention provide a method and device for man-machine interaction that make hand-based man-machine interaction more accurate. The method comprises the following steps: obtaining the gesture features of a current user by collecting the hand vein image information of the current user; determining the instruction corresponding to the gesture features of the current user by comparing them with preset gesture features; and executing the instruction. The embodiments of the invention are applicable to the technical field of man-machine interaction.

Description

Technical field

[0001] The invention relates to the technical field of human-computer interaction, and in particular to a method and device for human-computer interaction based on vein recognition.

Background technique

[0002] In the prior art, interaction between users and machine equipment mainly takes place through conventional input devices (such as keyboards, mice, and touch screens), which has become a bottleneck of human-computer interaction.

[0003] To better allow the human hand to be used directly as a human-computer interaction device with a computer, gesture-interaction methods have also been proposed in the prior art, for example using specially defined gestures to realize interaction between the user and the device. However, with such methods, since the machine cannot distinguish the color of the human hand from the color of some backgrounds well, it cannot recognize gestures accurately and in a timely manner.

Contents of the invention ...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F3/01; G06K9/00
Inventor: 陈柯, 杨锦平
Owner: LENOVO (BEIJING) LTD