Non-contact type intelligent inputting method based on video images and device using the same

A non-contact technology based on video images, applied to user/computer interaction input/output, graphic reading, mechanical mode conversion, etc. It addresses the problems of degraded user experience and input that does not conform to usage habits, with the effects of improving interactive response time, removing restrictions on venues and auxiliary materials, and avoiding stimulation of the user.

Inactive Publication Date: 2013-05-15
INST OF AUTOMATION CHINESE ACAD OF SCI

AI Technical Summary

Problems solved by technology

In addition to the above two types of virtual input methods, there are also virtual input methods based on cameras and video image processing. However, existing video-processing-based methods impose various restrictions on how the user operates, and some require an auxiliary positioning object, such as a template with keyboard keys or a sheet of paper with printed images of keyboard keys.
Some methods do not need an object for auxiliary positioning, but still ne...



Examples


Example Embodiment

[0023] To make the objectives, technical solutions, and advantages of the present invention clearer, the present invention is described in further detail below in conjunction with specific embodiments and with reference to the accompanying drawings.

[0024] Figure 1 shows a design example of the non-contact intelligent input system based on video images proposed by the present invention, which is composed of a display device, a three-dimensional video acquisition system, and a three-dimensional video analysis system.
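
The patent does not give an implementation for these three subsystems. Purely as an illustrative sketch, the composition described in paragraph [0024] could be expressed as the following Python skeleton, in which every class, method, and function name is a hypothetical placeholder rather than anything disclosed in the patent.

```python
# Illustrative skeleton only: hypothetical names for the three subsystems
# listed in paragraph [0024] (display device, 3D video acquisition system,
# 3D video analysis system). None of these names appear in the patent.

class StereoAcquisitionSystem:
    """Three-dimensional video acquisition system built from two cameras."""
    def grab_frame_pair(self):
        # Return one synchronized (left_frame, right_frame) pair.
        raise NotImplementedError

class VideoAnalysisSystem:
    """Three-dimensional video analysis system: tracks the hands and
    turns finger pressing actions into key events."""
    def analyze(self, left_frame, right_frame):
        # Return fingertip positions and any detected key presses.
        raise NotImplementedError

class DisplayDevice:
    """Shows the virtual keyboard and the user's virtual fingers."""
    def render(self, fingertips, key_events):
        raise NotImplementedError

def run_once(acquisition, analysis, display):
    # One pass of the end-to-end pipeline.
    left, right = acquisition.grab_frame_pair()
    fingertips, key_events = analysis.analyze(left, right)
    display.render(fingertips, key_events)
```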

[0025] The three-dimensional video capture system includes two cameras.

[0026] The video analysis system consists of a CPU, ROM, DDR3 SDRAM, a camera interface, an HDMI interface, and a serial port. The video acquisition system and the video analysis system can be integrated into a display device, or built from hardware resources already present in a display device (such as a TV).

[0027] The two cameras simultaneously collect video from two angles and transfer t...
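
Paragraph [0027] only states that the two cameras capture video simultaneously from two angles and hand the frames on for analysis. A minimal sketch of such a capture loop, assuming OpenCV (`cv2`) and two USB cameras at indices 0 and 1 (assumptions not stated in the patent), might look like this, where `analyze` stands in for the entry point of the video analysis system of paragraph [0026]:

```python
# Minimal sketch, assuming OpenCV and two USB cameras; camera indices and the
# analyze() callback are illustrative assumptions, not part of the patent.
import cv2

def capture_stereo_pairs(analyze, left_id=0, right_id=1):
    """Continuously grab roughly synchronized frame pairs and pass them on."""
    left = cv2.VideoCapture(left_id)
    right = cv2.VideoCapture(right_id)
    try:
        while True:
            # grab() on both devices first keeps the pair close in time;
            # retrieve() then decodes the buffered frames.
            if not (left.grab() and right.grab()):
                break
            ok_l, frame_l = left.retrieve()
            ok_r, frame_r = right.retrieve()
            if not (ok_l and ok_r):
                break
            analyze(frame_l, frame_r)  # hand the pair to the analysis system
    finally:
        left.release()
        right.release()
```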



Abstract

The invention discloses an interactive intelligent input method based on video images that enables a user to operate without contact. By collecting information such as the motion paths, speed, and acceleration of the user's two hands, it reproduces the input function of a physical keyboard and outputs the collected information as characters for upper-layer software to invoke in customized applications. The method can further dynamically adjust the operating range of the virtual keyboard as the range of motion of the user's hands changes. When the user's hands are within the working area of the image acquisition device, a display device shows the relative positions of the virtual keyboard and the user's virtual fingers, and the user triggers the corresponding virtual keys through finger pressing actions. To remain compatible with the key-pressing habits of different users, the device provides a supervision function for virtual keyboard input and corrects accuracy errors introduced by actions that are not keyboard input actions. The method requires no auxiliary positioning area or auxiliary positioning object; input can be performed at any position within the working area of the image acquisition device. The method is low in cost, widely applicable, and intelligently interactive. The invention further provides a device implementing the method.
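
The abstract states that key presses are derived from the motion paths, speed, and acceleration of the fingers, but the detection rule itself is not disclosed here. Purely as a hedged sketch, a simple detector could threshold the fingertip's downward velocity near the virtual keyboard plane; the thresholds, coordinate convention, and key-layout lookup below are all assumptions made for illustration, not the patented method.

```python
# Hedged sketch of one possible press detector. Assumes fingertip positions are
# already tracked per frame as (x, y, z), with x and y normalized to [0, 1] over
# the virtual keyboard and z the height above the keyboard plane in metres.
# All constants and names are illustrative assumptions.
from collections import deque

PRESS_VELOCITY = -0.15   # assumed: downward speed (m/s) that counts as a press
PRESS_DEPTH = 0.02       # assumed: how close to the plane the fingertip must be

class PressDetector:
    def __init__(self, key_layout, fps=30):
        self.key_layout = key_layout   # nested list: key_layout[row][col] -> char
        self.dt = 1.0 / fps
        self.heights = deque(maxlen=2) # last two fingertip heights above the plane

    def update(self, x, y, z):
        """Feed one fingertip sample; return the pressed character or None."""
        self.heights.append(z)
        if len(self.heights) < 2:
            return None
        # Finite-difference estimate of vertical velocity.
        vz = (self.heights[-1] - self.heights[-2]) / self.dt
        if vz < PRESS_VELOCITY and z < PRESS_DEPTH:
            return self._lookup_key(x, y)
        return None

    def _lookup_key(self, x, y):
        # Map the normalized fingertip position onto the virtual keyboard grid.
        rows, cols = len(self.key_layout), len(self.key_layout[0])
        row = min(max(int(y * rows), 0), rows - 1)
        col = min(max(int(x * cols), 0), cols - 1)
        return self.key_layout[row][col]

# Example use with a toy 2x3 layout (illustrative only):
detector = PressDetector([["a", "b", "c"], ["d", "e", "f"]])
```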

Description

Technical Field

[0001] The invention relates to the field of non-contact intelligent input, and in particular to a non-contact intelligent input method and device based on video images.

Background Technique

[0002] With the rapid development of smart TVs, smart phones, and other fields, touch screens have penetrated every corner of society. An excellent user interaction experience greatly enhances the added value and technological content of products in the consumer field, providing more differentiated and intelligent experiences. Currently, the two mainstream virtual input methods are soft keyboards and additional keyboards. Most smart phones (such as Apple's iPhone series) use a touch screen to draw a virtual full keyboard as the medium for user input. This input method has also been applied and promoted in portable devices such as tablet computers, but it requires the support of touch screen technology, and there is still a gap between the accurac...


Application Information

IPC(8): G06F3/01
Inventor 王东琳杜学亮郭若杉林啸蒿杰倪素萍张森林忱
Owner INST OF AUTOMATION CHINESE ACAD OF SCI