
A semantic human-computer natural interaction control method and system

A human-computer interaction control technology, applied to user/computer interaction input/output, semantic analysis, and mechanical mode conversion. It addresses the problems that existing gesture coding has low reliability and is prone to false triggering, and achieves a low false-trigger rate, high reliability, and improved control accuracy and reliability.

Active Publication Date: 2019-05-17
COMP APPL TECH INST OF CHINA NORTH IND GRP

AI Technical Summary

Problems solved by technology

In addition, in existing human-computer interaction recognition, the coding reliability of gestures is low and false triggering occurs easily.



Examples


Embodiment 1

[0046] Embodiment 1: as shown in figure 1, a semantic human-computer natural interaction control method according to an embodiment of the present invention includes:

[0047] Step 1: according to the control modes of the controlled equipment, represent these control modes with the operator's gestures and establish the mapping relationship between the operator's gesture library and the control command library of the controlled equipment.

[0048] The control modes of the controlled equipment are divided into basic control, compound control, and mission control.

[0049] Basic control refers to control of the most commonly used states of the controlled equipment, using basic gestures that conform to human expression habits to express these states directly, which is vivid, simple, and convenient. When the controlled equipment is a drone, its most commonly used states include takeoff, landing, forward, backward, return,...
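As a minimal sketch of the Step 1 mapping for basic drone control: the gesture names and command identifiers below are illustrative assumptions (the excerpt does not specify them); only the idea of a gesture-library-to-command-library lookup follows the patent text.

```python
# Illustrative sketch of the Step 1 mapping between the operator's gesture
# library and the drone's basic control command library. The gesture names
# and command strings are hypothetical placeholders; the patent only states
# that such a mapping is established.
BASIC_GESTURE_TO_COMMAND = {
    "palm_raise":    "TAKEOFF",
    "palm_lower":    "LAND",
    "point_forward": "FORWARD",
    "point_back":    "BACKWARD",
    "beckon":        "RETURN",
    "swipe_left":    "TURN_LEFT",
    "swipe_right":   "TURN_RIGHT",
}

def gesture_to_command(gesture_name):
    """Return the command mapped to a recognized gesture, or None.

    Returning None for gestures outside the library means unknown or noisy
    recognitions trigger no command, which keeps the false-trigger rate low.
    """
    return BASIC_GESTURE_TO_COMMAND.get(gesture_name)
```

For instance, gesture_to_command("beckon") would yield "RETURN", while a gesture outside the library yields no command at all.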

Embodiment 2

[0060] Embodiment 2: as shown in figure 2, the present invention also provides a semantic human-computer natural interaction control system, the system including:

[0061] The interactive coding module represents the control modes of the controlled equipment through the operator's gestures and establishes the mapping relationship between the operator's gesture library and the control command library of the controlled equipment.

[0062] The control modes of the controlled equipment are divided into basic control, compound control, and mission control.

[0063] Basic control refers to control of the most commonly used states of the controlled equipment, using basic gestures in line with human expression habits to express these states directly. The most commonly used states include takeoff, landing, forward, backward, return, left turn, right turn, the correspond...
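Read together with the four steps summarized in the abstract below, the system can be pictured as one module per step. Only the interactive coding module is named in this excerpt; the other module names and interfaces in the sketch are assumptions for illustration.

```python
# Hypothetical decomposition of the control system into one module per step.
# Only the interactive coding module is named in the excerpt; the remaining
# module names and interfaces are assumed for illustration.
class InteractiveCodingModule:
    """Step 1: hold the gesture-library-to-command-library mapping."""
    def __init__(self, mapping):
        self.mapping = mapping

class GestureRecognitionModule:
    """Step 2: fuse EMG and inertial sensor data and recognize the gesture."""
    def recognize(self, emg_window, imu_window):
        raise NotImplementedError  # classifier details are not in the excerpt

class CommandConversionModule:
    """Step 3: convert a recognized gesture into a control instruction."""
    def __init__(self, coding):
        self.coding = coding
    def convert(self, gesture):
        return self.coding.mapping.get(gesture)

class CommandTransmissionModule:
    """Step 4: send the instruction to the controlled equipment in real time."""
    def send(self, instruction):
        raise NotImplementedError  # the data link to the equipment is unspecified
```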



Abstract

The invention discloses a semantic human-computer natural interaction control method, comprising: step 1, establishing a mapping relationship between the operator's gesture library and the control command library of the controlled equipment; step 2, collecting and fusing information from the myoelectric (EMG) sensor and the inertial navigation sensor worn by the operator to obtain the operator's gesture information and recognize the operator's gesture; step 3, according to the mapping relationship in step 1, converting the gesture recognized in step 2 into a control instruction for the controlled equipment; step 4, transmitting the control instruction from step 3 to the controlled equipment to realize real-time control of the controlled equipment. The invention also provides a semantic human-computer natural interaction control system. The beneficial effects of the present invention are as follows: a mapping relationship conforming to human expression habits is established; the encoding method is vivid, simple, convenient, and highly reliable, with a low rate of false triggering, which improves the control accuracy and reliability of the controlled equipment.
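Step 2 of the abstract fuses data from the worn myoelectric (EMG) sensor and the inertial navigation sensor to recognize the gesture, but the excerpt does not disclose the feature set or classifier. The sketch below is therefore only an assumed stand-in: it concatenates simple per-channel statistics from both sensors, defers to any pre-trained classifier, and then applies steps 3 and 4.

```python
import numpy as np

def fuse_features(emg, imu):
    """Fuse one EMG window and one inertial window into a feature vector.

    emg: array of shape (emg_channels, samples) from the worn EMG sensor.
    imu: array of shape (inertial_axes, samples) from the inertial sensor.
    The statistics used here are illustrative; the patent does not specify them.
    """
    emg_feats = np.concatenate([np.mean(np.abs(emg), axis=1), np.std(emg, axis=1)])
    imu_feats = np.concatenate([np.mean(imu, axis=1), np.std(imu, axis=1)])
    return np.concatenate([emg_feats, imu_feats])

def control_step(emg, imu, classifier, mapping, send):
    """One pass of steps 2-4: recognize the gesture, map it, transmit the command."""
    features = fuse_features(emg, imu).reshape(1, -1)
    gesture = classifier.predict(features)[0]   # any trained gesture classifier
    command = mapping.get(gesture)              # step 1 mapping (see Embodiment 1)
    if command is not None:                     # ignore gestures outside the library
        send(command)                           # step 4: real-time transmission
```

Here classifier, mapping, and send are placeholders for whatever recognizer, gesture-to-command table, and data link a concrete system would use.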

Description

Technical field

[0001] The present invention relates to the technical field of intelligent control, and in particular to a semantic human-computer natural interaction control method and system.

Background technique

[0002] Gesture recognition technology is a key technology of the new generation of natural human-computer interaction. Compared with traditional contact-based operation methods such as the mouse and keyboard, gestures are natural, intuitive, easy to understand, and simple to operate, offer a good user experience, and are more in line with humans' daily communication habits; gesture recognition has therefore become a research hotspot in human-computer interaction solutions. As a natural and convenient language, gesture is very suitable for human-computer interaction both emotionally and practically. The research significance of gesture recognition technology is to apply gestures, a natural and intuitive way of communication, to the interface technology of human-computer int...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC (8): G06F3/01, G06F17/27, G06K9/00
CPC: G06F3/017, G06F2203/012, G06F40/30, G06V40/107, G06V40/113
Inventors: 赵小川 (Zhao Xiaochuan), 付成龙 (Fu Chenglong), 胡雄文 (Hu Xiongwen)
Owner: COMP APPL TECH INST OF CHINA NORTH IND GRP