Virtual touch screen human-computer interaction method, system and device based on depth sensor

A depth-sensor and human-computer interaction technology, applied in the field of depth-sensor-based virtual touch screen human-computer interaction, which addresses the problems of low commercialization, low reliability and poor stability, so as to improve reliability, flexibility and accuracy and to reduce misoperation.

Status: Inactive. Publication Date: 2018-02-09
广东广业开元科技有限公司


Problems solved by technology

However, current non-contact touch technology developed on the basis of depth sensors is prone to misoperation, which results in low reliability, and it also suffers from low accuracy of interaction recognition, poor stability and a poor operating experience. As a result, it cannot adequately meet users' requirements for non-contact touch interaction, which makes the technology difficult to promote and apply widely and leaves its commercial availability low.


Image

  • Virtual touch screen human-computer interaction method, system and device based on depth sensor (three drawings)

Examples


Embodiment 1

[0068] As shown in Figure 1, the virtual touch screen human-computer interaction method based on a depth sensor comprises the following steps:

[0069] Dynamically locate the anchor points of the virtual touch screen, and construct a virtual touch screen interaction space from the anchor points obtained by dynamic positioning;

[0070] Spatially divide the virtual touch screen interaction space into at least two virtual touch screen interaction subspace regions;

[0071] Monitor interactive touch points and calculate the touch force factor corresponding to each monitored interactive touch point, where an interactive touch point is the three-dimensional coordinate position of a limb within the virtual touch screen interaction space;

[0072] According to the two-dimensional coordinate position corresponding to the interactive touch point and the tou...
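The flow of paragraphs [0068] to [0072] can be illustrated with a minimal Python sketch. The class name, the axis-aligned box model of the interaction space, the 40/60 depth split and the force formula are illustrative assumptions, not the patent's published implementation.

```python
import numpy as np

class VirtualTouchScreenSketch:
    """Minimal sketch of the method of Embodiment 1 (assumed data model:
    the interaction space is an axis-aligned box in sensor coordinates)."""

    def __init__(self, anchor_points):
        # [0069] Construct the interaction space from dynamically located
        # anchor points (an N x 3 array of 3D corner positions in metres).
        pts = np.asarray(anchor_points, dtype=float)
        self.min_corner = pts.min(axis=0)
        self.max_corner = pts.max(axis=0)
        # [0070] Divide the space along the depth (z) axis into two
        # subspace regions, an outer "hover" region and an inner "press"
        # region; the 40/60 split is an illustrative choice.
        self.depth = self.max_corner[2] - self.min_corner[2]
        self.press_plane_z = self.min_corner[2] + 0.4 * self.depth

    def classify(self, point_3d):
        # [0071] An interactive touch point is a limb's 3D coordinate that
        # falls inside the interaction space.
        p = np.asarray(point_3d, dtype=float)
        if np.any(p < self.min_corner) or np.any(p > self.max_corner):
            return None  # outside the virtual touch screen space
        # Project onto the screen plane as normalised 2D coordinates.
        size = self.max_corner[:2] - self.min_corner[:2]
        xy = (p[:2] - self.min_corner[:2]) / size
        # Assumed touch force factor: how far the point has penetrated the
        # inner "press" subspace, clipped to [0, 1].
        force = float(np.clip((self.press_plane_z - p[2]) / (0.4 * self.depth),
                              0.0, 1.0))
        return xy, force
```

A control step would then map the pair (xy, force) to hover, click or press events depending on which subspace region the point lies in, which corresponds to the operation-control step of paragraph [0072].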

Embodiment 2

[0107] A program system corresponding to the above method: as shown in Figure 2, the virtual touch screen human-computer interaction system based on a depth sensor includes:

[0108] A construction module, used to dynamically locate the anchor points of the virtual touch screen and construct a virtual touch screen interaction space from the anchor points obtained by dynamic positioning;

[0109] A division module, used to spatially divide the virtual touch screen interaction space into at least two virtual touch screen interaction subspace regions;

[0110] A calculation module, used to monitor interactive touch points and calculate the touch force factor corresponding to each monitored interactive touch point, where an interactive touch point is the three-dimensional coordinate position of a limb in the virtual touch screen interacti...
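Paragraphs [0107] to [0110] describe the same pipeline split into modules. A hypothetical skeleton of that module structure is sketched below; the interfaces and type hints are assumptions for illustration only, not the patented implementation.

```python
from typing import Callable, List, Optional, Tuple

Point3D = Tuple[float, float, float]

class ConstructionModule:
    def build_space(self, anchor_points: List[Point3D]):
        """Dynamically locate the anchor points and construct the
        virtual touch screen interaction space."""
        raise NotImplementedError

class DivisionModule:
    def divide(self, space) -> list:
        """Split the interaction space into at least two subspace regions."""
        raise NotImplementedError

class CalculationModule:
    def monitor(self, depth_frame, space) -> Optional[Tuple[Tuple[float, float], float]]:
        """Detect an interactive touch point (a limb's 3D coordinate inside
        the space) and compute its touch force factor."""
        raise NotImplementedError

class ControlModule:
    def __init__(self, on_event: Callable[[Tuple[float, float], float], None] = print):
        self.on_event = on_event

    def dispatch(self, touch_point_2d: Tuple[float, float], force_factor: float) -> None:
        """Issue operation control from the 2D position and the force factor."""
        self.on_event(touch_point_2d, force_factor)
```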

Embodiment 3

[0135] A combined software and hardware device corresponding to the above method: a virtual touch screen human-computer interaction device based on a depth sensor, which includes:

[0136] a memory for storing a program;

[0137] a processor for loading said program and performing the following steps:

[0138] Dynamically locate the anchor points of the virtual touch screen, and construct a virtual touch screen interaction space from the anchor points obtained by dynamic positioning;

[0139] Spatially divide the virtual touch screen interaction space into at least two virtual touch screen interaction subspace regions;

[0140] Monitor interactive touch points and calculate the touch force factor corresponding to each monitored interactive touch point, where an interactive touch point is the three-dimensional coordinate position of a limb in the virtual tou...
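Paragraphs [0135] to [0140] package the same steps as a device: a memory holding the program and a processor that executes it. A minimal, assumed main loop tying the earlier sketches together might look as follows; `sensor`, `track_hand` and the frame rate are placeholders, not part of the disclosure, while `screen` and `control` stand for the VirtualTouchScreenSketch and ControlModule sketched above.

```python
import time

def run_device(sensor, screen, control, track_hand, hz=30):
    """Hypothetical processor loop: the loaded program repeatedly performs
    the method's steps on each depth frame."""
    period = 1.0 / hz
    while True:
        frame = sensor.read_depth_frame()       # depth image from the depth sensor
        hand_3d = track_hand(frame)             # limb 3D coordinate, or None
        if hand_3d is not None:
            result = screen.classify(hand_3d)   # interaction space + force factor
            if result is not None:
                xy, force = result
                control.dispatch(xy, force)     # operation control step
        time.sleep(period)
```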



Abstract

The invention discloses a virtual touch screen human-computer interaction method, system and device based on a depth sensor. The system comprises a construction module, a division module, a calculation module and a control module. The method comprises the steps that a virtual touch screen interaction space is constructed according to virtual touch screen anchor points; the virtual touch screen interaction space is divided; interactive touch points are monitored, and the touch force factors corresponding to the interactive touch points are calculated; and operation control is performed according to the two-dimensional coordinate positions and the touch force factors corresponding to the interactive touch points. The device comprises a memory and a processor for implementing the virtual touch screen human-computer interaction method. By use of the method, system and device, the reliability and flexibility of virtual touch screen interaction can be greatly improved, different touch screen operations can be responded to on the basis of region boundaries, misoperation is reduced, and the accuracy of virtual touch screen operation is greatly improved. The virtual touch screen human-computer interaction method, system and device based on the depth sensor can be widely applied in the field of human-computer interaction.

Description

technical field

[0001] The invention relates to non-contact human-computer interaction technology, and in particular to a virtual touch screen human-computer interaction method, system and device based on a depth sensor.

Background technique

[0002] Explanation of technical terms:

[0003] The camera of the first-generation Kinect system: a depth sensor using PrimeSense's structured-light technology. It perceives the surrounding environment through a CMOS infrared sensor, projecting a one-dimensional or two-dimensional pattern of discrete light spots onto the measured object; from the size and distortion of this pattern, the surface shape of the measured object, i.e. its depth information, can be determined. In the Microsoft Kinect somatosensory device, two infrared cameras collect and provide depth data by emitting/receiving infrared rays.

[0004] The camera of the second-generation Kinect system: it belongs to the depth sensor,...
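The background explains that a depth sensor such as the Kinect camera returns a depth value per pixel. Recovering the three-dimensional coordinate position of a limb point from such a depth image uses the standard pinhole back-projection, sketched below; the intrinsic parameter values in the example are rough, illustrative Kinect-v1-like numbers, not figures from the patent.

```python
def depth_pixel_to_3d(u, v, depth_m, fx, fy, cx, cy):
    """Back-project depth pixel (u, v) with depth Z (in metres) to a 3D point
    (X, Y, Z) in the sensor's camera frame using pinhole intrinsics."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return (x, y, depth_m)

# Example with illustrative intrinsics (approximately Kinect-v1-like values):
point = depth_pixel_to_3d(u=320, v=240, depth_m=1.2,
                          fx=580.0, fy=580.0, cx=319.5, cy=239.5)
```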

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F3/01; G06F3/041
CPC: G06F3/011; G06F3/0416
Inventors: 蔡禹, 贾义动, 夏雷
Owner: 广东广业开元科技有限公司