
An intelligent human-computer interaction method and device based on graphic-text matching

A human-computer interaction and image-text matching technology, applied in the field of computer vision. It addresses the problems of limited human-computer interaction modes and low interaction efficiency, achieving simple and efficient operation, efficient feature extraction, and an accurate and reliable matching algorithm.

Active Publication Date: 2022-04-08
NAT INNOVATION INST OF DEFENSE TECH PLA ACAD OF MILITARY SCI
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

The main purpose is to solve the problem that human-computer interaction in the prior art is limited to single-mode command interaction or direct-contact interaction, and that interaction efficiency is therefore low.



Examples


Embodiment 1

[0035] Please refer to Figure 1, which shows a block diagram of an intelligent human-computer interaction method based on image-text matching provided by an embodiment of the present invention.

[0036] As shown in Figure 1, the method of the embodiment of the present invention mainly includes the following steps:

[0037] S1, speech recognition: collect the user's speech information and convert it into a text sequence with a template-matching speech recognition algorithm. The template matching uses dynamic time warping (DTW) for feature training and recognition; a hidden Markov model (HMM) is used to build a statistical model of the time-series structure of the speech signal, and the signal is compressed by vector quantization;
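The template-matching step in S1 can be illustrated with a minimal DTW sketch. Feature extraction, the HMM time-series model, and vector quantization are omitted here; the command labels and 1-D "feature" sequences are hypothetical stand-ins, not the patent's actual templates.

```python
# Minimal sketch of template-matching recognition via dynamic time
# warping (DTW). Real systems would compare multi-dimensional acoustic
# features (e.g. MFCC frames), not 1-D toy sequences.

def dtw_distance(a, b):
    """Classic DTW distance between two 1-D feature sequences."""
    n, m = len(a), len(b)
    INF = float("inf")
    d = [[INF] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # Allow match, insertion, or deletion steps.
            d[i][j] = cost + min(d[i - 1][j], d[i][j - 1], d[i - 1][j - 1])
    return d[n][m]

def recognize(features, templates):
    """Return the template label with the smallest DTW distance."""
    return min(templates, key=lambda label: dtw_distance(features, templates[label]))

# Hypothetical command templates (1-D feature sequences).
templates = {
    "pick up": [1.0, 2.0, 3.0, 2.0],
    "put down": [3.0, 2.0, 1.0, 0.0],
}
print(recognize([1.1, 2.1, 2.9, 3.1, 2.0], templates))  # → pick up
```

DTW absorbs the speaking-rate variation between the query and the stored template, which is why it suits template matching better than a rigid frame-by-frame comparison.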

[0038] After the sound signal is collected with a microphone, it is converted into a digital signal and denoised in a preprocessing operation, and then us...
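The denoising preprocessing mentioned in [0038] can be sketched with a simple moving-average smoother. This is an assumption for illustration only; the patent does not specify the filter, and practical systems commonly use spectral subtraction or band-pass filtering instead.

```python
# Minimal sketch of signal denoising: a centered moving-average filter
# applied to digitized microphone samples. Window size and the sample
# values are hypothetical.

def moving_average(signal, window=3):
    """Smooth a sampled signal with a centered moving average."""
    half = window // 2
    out = []
    for i in range(len(signal)):
        lo = max(0, i - half)
        hi = min(len(signal), i + half + 1)
        out.append(sum(signal[lo:hi]) / (hi - lo))
    return out

noisy = [0.0, 1.0, 0.0, 1.0, 0.0, 1.0]
print(moving_average(noisy))
```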

Embodiment 2

[0050] Furthermore, another embodiment of the present invention provides an intelligent human-computer interaction device based on image-text matching as an implementation of the method shown in the above embodiments. This device embodiment corresponds to the foregoing method embodiment; for ease of reading, it does not repeat the details of the method embodiment one by one, but it should be clear that the device in this embodiment can implement everything in the foregoing method embodiment. Figure 3 shows a composition block diagram of an intelligent human-computer interaction device based on image-text matching provided by an embodiment of the present invention. As shown in Figure 3, the device of this embodiment includes the following modules:

[0051] 1. Voice input module: used to collect the user's voice information;

[0052] Including a sound acquisition unit and a signal denoising unit; the ...
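The modular device structure of this embodiment can be sketched as a pipeline of composed modules. Only the voice input module (with its acquisition and denoising units) is visible in the text above, so the class names, the identity denoising unit, and the placeholder samples here are all hypothetical.

```python
# Minimal structural sketch of the device embodiment: modules composed
# into one interaction device. Internals are placeholders.

class VoiceInputModule:
    """Collects the user's voice: sound acquisition unit + denoising unit."""

    def acquire(self):
        # Placeholder for microphone sampling.
        return [0.0, 1.0, 0.0, 1.0]

    def denoise(self, samples):
        # Identity placeholder for the signal denoising unit.
        return list(samples)

    def collect(self):
        return self.denoise(self.acquire())

class Device:
    """Composes the modules into one interaction device."""

    def __init__(self):
        self.voice_input = VoiceInputModule()

    def run(self):
        return self.voice_input.collect()

print(Device().run())  # → [0.0, 1.0, 0.0, 1.0]
```

Mirroring the method steps as device modules is the usual pattern in such patent pairs: each module wraps one method step so the device claims track the method claims.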


PUM

None.

Abstract

The invention discloses an intelligent human-computer interaction method and device based on image-text matching, belonging to the field of computer vision. The method includes: collecting the user's voice information and converting it into a text sequence, then using natural language processing technology to extract target features from the text sequence; collecting real-environment images and using a deep convolutional neural network to extract natural image features from the original image data; and calculating the matching degree between each target in the original image and the target in the text sequence, taking the image target with the highest matching degree as the matching result and converting it into a machine instruction. The invention combines computer vision and natural language processing technology to match complex commands with real images, and can automatically locate the relevant entity targets in an image according to a natural-language command, making the interaction process more natural. It can be applied to a wide range of scenarios such as assistive robots for the disabled, rescue robots, and special-purpose robots.
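The matching step at the heart of the abstract (score every image target against the text-derived target feature, take the argmax, emit an instruction) can be sketched as follows. The feature vectors, target names, and instruction format are hypothetical; in the actual method the features would come from an NLP model and a deep CNN, and cosine similarity is only one plausible matching-degree measure.

```python
# Minimal sketch of the image-text matching step: cosine similarity as
# the matching degree, argmax over detected image targets, then a
# machine instruction for the winner.

import math

def cosine(u, v):
    """Cosine similarity used here as the matching degree."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def match_command(text_feature, image_targets):
    """Return (name, score) of the image target best matching the text."""
    name = max(image_targets, key=lambda k: cosine(text_feature, image_targets[k]))
    return name, cosine(text_feature, image_targets[name])

# Hypothetical features for a command like "pick up the red cup"
# against two detected image targets.
text_feature = [0.9, 0.1, 0.4]
image_targets = {"red cup": [0.8, 0.2, 0.5], "blue book": [0.1, 0.9, 0.3]}
target, score = match_command(text_feature, image_targets)
print(f"GRASP {target}")  # machine instruction for the matched target
```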

Description

Technical Field

[0001] The invention relates to the field of computer vision, in particular to an intelligent human-computer interaction method and device based on image-text matching.

Background Technique

[0002] With the increasing intelligence of computers and robots, machines can now assist humans in completing complex tasks in medical care, industrial production, entertainment, family services, special services, and other fields. This wide range of application scenarios places higher demands on traditional human-computer interaction systems, especially on how intelligent machines can better interact and collaborate with humans. The purpose of human-computer interaction is to combine the respective advantages of humans and machines to better complete complex human-machine collaborative tasks; the ultimate goal is the natural integration of humans and machines in application scenarios. However, at present, human-computer interactio...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC (8): G06K9/62; G06V10/75; G06V10/40; G06F40/284; G06F40/30; G06N3/04; G06N3/08; G10L15/26
CPC: G06F40/284; G06F40/30; G06N3/08; G10L15/26; G06N3/044; G06N3/045; G06F18/295
Inventor: 印二威, 谢良, 张珺倩, 张敬, 闫慧炯, 罗治国, 张亚坤, 艾勇保, 闫野
Owner NAT INNOVATION INST OF DEFENSE TECH PLA ACAD OF MILITARY SCI