
Auxiliary semantic recognition system based on gesture recognition

A semantic recognition and gesture recognition technology, applied in the field of human-computer interaction, addressing problems such as insufficient thinking time and narration errors during voice input.

Active Publication Date: 2020-05-15
Applicant: 重庆大牛认知科技有限公司

AI Technical Summary

Problems solved by technology

[0005] However, compared with keyboard input, voice input is more direct: ideas are expressed immediately, without a secondary conversion step. This leaves less time for thinking and makes narration errors more likely.



Examples


Embodiment 1

[0037] The auxiliary semantic recognition system based on gesture recognition of this embodiment, as shown in Figure 1, includes an input module, an image acquisition module, an image processing module, a text recognition module, a gesture recognition module, a semantic recognition module, and a demonstration module.

[0038] The input module is used for collecting voice information and converting the voice information into the first text.

[0039] The image acquisition module is used to collect an image of the user's disability certificate, and the text recognition module is used to recognize the text in that image and extract personal data from the recognized text. In this embodiment, the personal data includes name, gender, age, and disability type, where the disability type includes hearing, speech, physical, intellectual, multiple, etc.
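As an illustration of the extraction step in paragraph [0039], the following Python sketch pulls the personal-data fields out of OCR'd certificate text. The field labels and regular expressions are assumptions, since the patent does not specify the certificate layout.

```python
import re

# Hypothetical field labels; the patent only names the fields to
# extract (name, gender, age, disability type), not the layout.
# The character class [::] accepts both ASCII and fullwidth colons.
FIELDS = {
    "name": r"Name[::]\s*(\S+)",
    "gender": r"Gender[::]\s*(\S+)",
    "age": r"Age[::]\s*(\d+)",
    "disability_type": r"Disability Type[::]\s*(\S+)",
}

def extract_personal_data(ocr_text: str) -> dict:
    """Extract the personal-data fields of paragraph [0039] from raw OCR text."""
    data = {}
    for field, pattern in FIELDS.items():
        match = re.search(pattern, ocr_text)
        if match:
            data[field] = match.group(1)
    return data
```

In practice the OCR output of a real certificate would need fuzzier matching, but the sketch shows the shape of the recognized-text-to-structured-data step.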

[0040] The demonstration module is used to play a gesture demonstration video before the image a...

Embodiment 2

[0054] This embodiment differs from Embodiment 1 in that, after the demonstration module plays a reminder that the gesture speed is too fast, the gesture recognition module also continues to judge whether the motion amplitude is below the second threshold. If it is below the second threshold,

[0055] the gesture recognition module is also used to send an amplitude guide instruction to the demonstration module, and the demonstration module is also used to play an amplitude guide file according to that instruction. In this embodiment, the amplitude guide file is amplitude guide music or an amplitude guide video. Specifically, if the user is hearing-impaired, the amplitude guide video is played; if not, the amplitude guide music is played. The volume of the amplitude guide music is inversely proportional to the motion amplitude, and the brightness of the amplitude guide video is inversely proportional to the m...
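The guidance logic of paragraph [0055] can be sketched as below. The function names, the proportionality constant `k`, and the concrete formula `volume = k / amplitude` are assumptions; the patent states only that the volume is inversely proportional to the motion amplitude and that hearing-impaired users receive the video cue.

```python
def choose_guide(hearing_impaired: bool) -> str:
    """Per [0055]: hearing-impaired users get the video cue, others the audio cue."""
    return "amplitude guide video" if hearing_impaired else "amplitude guide music"

def guide_music_volume(amplitude: float, k: float = 1.0) -> float:
    """Volume inversely proportional to motion amplitude (assumed form V = k / A)."""
    # Clamp the amplitude to avoid division by zero for a motionless user.
    return k / max(amplitude, 1e-6)
```

The same inverse relation would drive the video brightness for hearing-impaired users; smaller gestures produce a louder (or brighter) cue, nudging the user toward a larger motion amplitude.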

Embodiment 3

[0058] This embodiment differs from Embodiment 2 in that the amplitude guide music and the speed guide music are the same piece of music, the difference being that the volume varies when it is used as amplitude guide music. Likewise, the amplitude guide video and the speed guide video are the same video, the difference being that the brightness varies when it is used as the amplitude guide video. Therefore, when the user's movement speed exceeds the first threshold and the movement amplitude is below the second threshold at the same time, there is no need to play two different pieces of music or two different videos, and no conflict arises.
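Embodiment 3's conflict-free behaviour can be sketched as a single decision function: both cues resolve to the same media file, so triggering both at once never requests two files. The threshold parameters, cue names, and return format are illustrative assumptions, not the patent's interface.

```python
def guidance_action(speed, amplitude, speed_limit, amplitude_floor):
    """Decide which guide cues to apply; both map onto one shared media file."""
    cues = []
    if speed > speed_limit:          # first threshold: movement too fast
        cues.append("speed")
    if amplitude < amplitude_floor:  # second threshold: movement too small
        cues.append("amplitude")
    if not cues:
        return None  # gestures are within range; no guidance needed
    # One shared file covers both cues, so no playback conflict when both fire;
    # only the volume/brightness is modulated for the amplitude cue.
    return {"file": "shared guide media", "cues": cues}
```

Because the return value always names the single shared file, the simultaneous-trigger case in paragraph [0058] is handled without any arbitration between competing media.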



Abstract

The invention relates to the technical field of human-computer interaction, and particularly discloses an auxiliary semantic recognition system based on gesture recognition. The system comprises an input module and an image acquisition module; the input module is used for collecting voice information and converting it into a first text. The system further comprises an image processing module, a gesture recognition module, and a semantic recognition module. The semantic recognition module is used for judging whether the content of the first text is consistent with the content of a second text; if so, it is further used for extracting consulting keywords from the first text, matching them against legal keywords in a legal wordbank, acquiring the legal provisions corresponding to the successfully matched legal keywords, and outputting those provisions. By adopting this technical scheme, whether the input statement contains an error can be accurately identified.

Description

Technical field

[0001] The invention relates to the technical field of human-computer interaction, in particular to an auxiliary semantic recognition system based on gesture recognition.

Background technique

[0002] Due to the highly professional nature of law, it is difficult for non-professionals to master and apply it flexibly. Therefore, when people encounter legal problems, they often need to seek help from lawyers for consultation on related legal issues. However, because the number of lawyers is relatively small and consulting fees are relatively expensive, lawyers' consulting services cannot meet everyone's consulting needs. To solve the legal consultation problems of the general public, legal consultation robots that provide self-service came into being.

[0003] When people consult legal issues through the legal consultation robot, the consultee needs to input the consultation questions, and the system can automatically match the ...


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06F3/01; G06K9/00; G06F40/30; G06F40/194
CPC: G06F3/017; G06V40/28
Inventor: 吴怡
Owner: 重庆大牛认知科技有限公司