
Human-computer interaction method and device

A human-computer interaction technology based on analyzing images to be detected, applied in the field of image analysis. It addresses problems such as difficulty of embedding into existing devices, low flexibility, and large product size, and achieves good recognition results, enhanced stability, and good flexibility.

Active Publication Date: 2015-09-09
SHENZHEN INST OF ADVANCED TECH CHINESE ACAD OF SCI

AI Technical Summary

Problems solved by technology

[0003] However, gesture and somatosensory control based on 3D technology, represented by the Microsoft Kinect system, performs real-time three-dimensional reconstruction of the scene through dynamic 3D reconstruction and extends visual detection algorithms from 2D to 3D space. This reduces the difficulty of recognition, but it increases hardware cost and computational load, and the product is relatively bulky, making it difficult to embed into existing intelligent terminal devices.
[0004] Moreover, traditional technology presets a specific target image, so that during use the user must provide a target matching the preset image, which results in low flexibility.

Method used




Embodiment Construction

[0044] In order to solve the problem of low flexibility for users in traditional technology, a human-computer interaction method and device supporting user-defined target images are proposed.

[0045] As shown in Figure 1, a flow chart of the steps of the human-computer interaction method of one embodiment, the method includes the following steps:

[0046] Step S201, receiving a learning instruction input by a user, and starting a learning mode.

[0047] Step S202, acquiring the target image specified by the user, collecting positive samples and negative samples, and building a classifier including multiple decision trees based on random forest training.

[0048] Acquiring the target image specified by the user means that the target image is user-defined. Suppose the user wants to use the palm to realize human-computer interaction; then, in the learning mode, a palm image can be provided as the target image through the camera. In order to make the collection of positive sampl...
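The patent text does not fix a feature representation or library for this training step. Purely as an illustration, the sketch below (Python, assuming scikit-learn and a hypothetical extract_features helper, since neither is named in the source) shows how a classifier comprising multiple decision trees could be built from user-collected positive and negative samples via random forest training.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def extract_features(patch):
    # Hypothetical feature extractor: a normalized gray-level histogram of the
    # image patch. The patent does not specify which features are used.
    hist, _ = np.histogram(patch, bins=32, range=(0, 255), density=True)
    return hist

def train_target_classifier(positive_patches, negative_patches, n_trees=50):
    # Build a classifier made of multiple decision trees (a random forest)
    # from the user's positive samples (target) and negative samples (background).
    X = np.array([extract_features(p) for p in positive_patches + negative_patches])
    y = np.array([1] * len(positive_patches) + [0] * len(negative_patches))
    forest = RandomForestClassifier(n_estimators=n_trees)
    forest.fit(X, y)
    return forest
```

In a learning mode like the one described in steps S201-S202, the positive patches would come from camera frames of the user-specified target (for example, the palm) and the negative patches from surrounding background regions.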



Abstract

A man-machine interaction method comprises the following steps: first, obtaining a target image specified by a user and building classifiers comprising a plurality of decision trees through random forest training; second, storing the classifiers and a set of positive and negative samples; third, obtaining an image to be detected; fourth, calculating the probability that the image to be detected matches the target image; fifth, judging whether the image to be detected is the target image according to a preset judgment threshold; sixth, analyzing the correlation between the image to be detected and the positive-negative sample set; seventh, judging whether the image to be detected is the target image according to a first correlation threshold; eighth, finally determining that the image to be detected is the target image when it passes both the judgment threshold and the first correlation threshold; and ninth, using the image to be detected to adjust the parameters of the classifiers and to supplement the positive-negative sample set. The invention further provides a man-machine interaction device. With this method and device, user-defined target images are supported, and recognition precision and system stability are continuously enhanced during use.
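The abstract describes a two-stage decision (classifier probability against a judgment threshold, plus correlation with the stored positive-negative sample set against a correlation threshold) followed by an online update. The sketch below is only a rough reading of that logic, not the patented implementation: the threshold values, the correlation measure, and the update rule are assumptions, and it reuses the hypothetical extract_features helper from the earlier sketch.

```python
import numpy as np

def detect_and_update(forest, sample_set, candidate_patch,
                      prob_threshold=0.6, corr_threshold=0.5):
    # Decide whether the candidate image is the target, then use it to
    # supplement the sample set when both tests agree (sketch only).
    feat = extract_features(candidate_patch)
    # Test 1: probability that the candidate matches the target, estimated
    # by the ensemble of decision trees.
    prob = forest.predict_proba([feat])[0][1]
    # Test 2: correlation between the candidate and the stored positive
    # samples (here, the best feature-vector correlation; an assumption).
    corr = max(np.corrcoef(feat, extract_features(p))[0, 1]
               for p in sample_set["positive"])
    is_target = prob >= prob_threshold and corr >= corr_threshold
    if is_target:
        # Online update: keep the confirmed detection as a new positive
        # sample so recognition can keep improving during use; the forest
        # could also be retrained periodically from the enlarged set.
        sample_set["positive"].append(candidate_patch)
    return is_target
```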

Description

【Technical field】
[0001] The invention relates to the field of image analysis, and in particular to a human-computer interaction method and device.
【Background technique】
[0002] In recent years, with the popularization of smart terminal devices, seeking a more natural and simple way of human-computer interaction has become a hot topic in both scientific research and industry. Over the development of human-computer interaction technology, input has gradually evolved from the mouse, keyboard and remote control to non-contact methods such as vision, voice and gesture, among which visual technology is the most important means: images are captured by a camera, the operator's actions and intentions are judged through intelligent image analysis, and the machine is then controlled accordingly. The biggest problem this approach faces is the complexity and uncertainty of the environment, which means the technology is not yet fully mature. With the...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06F3/01; G06K9/66
Inventors: 郑锋, 赵颜果, 宋展
Owner: SHENZHEN INST OF ADVANCED TECH CHINESE ACAD OF SCI