Operation identification method of non-contact screen

A non-contact recognition method, applied in the field of operation recognition for non-contact screens, which can solve the problems of limited gesture recognition, hand occlusion, and strong sensitivity to lighting.

Active Publication Date: 2020-04-10
SAMSUNG ELECTRONICS CHINA R&D CENT +1

AI Technical Summary

Problems solved by technology

[0003] Specifically, the machine vision method uses a camera to capture the user's operations and recognizes them from the captured images. However, this approach depends on the camera being turned on, is strongly affected by lighting, judges distance inaccurately, and suffers from defects such as hand occlusion. The electromagnetic signal method uses electromagnetic induction between a tool such as a stylus and the screen to recognize the operations the user performs with the tool, but it requires additional physical equipment and an electromagnetic induction layer under the screen, does not support multi-touch, and recognizes only relatively simple gestures.



Examples


Example 1

[0105] Example 1: Identify click operation

[0106] a) A single finger hovers above the button to be clicked; the system detects the finger and indicates its position on the screen, as shown in Figure 8a;

[0107] b) The finger moves down a certain distance in the vertical direction and quickly returns to its initial position; the system recognizes this as a click, as shown in Figure 8b.
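
As an illustration of Example 1, the minimal sketch below infers a click from a short history of hover heights for a single finger; the dip and return thresholds are assumed values for illustration, not parameters taken from the patent.

```python
def is_click(heights, dip_mm=8.0, tolerance_mm=2.0):
    """Return True if the height trace dips by at least `dip_mm` below its
    starting value and then comes back close to that starting value."""
    if len(heights) < 3:
        return False
    start, lowest, end = heights[0], min(heights), heights[-1]
    dipped = (start - lowest) >= dip_mm           # finger pressed down far enough
    returned = abs(end - start) <= tolerance_mm   # finger came back up
    return dipped and returned

# Finger hovers at ~40 mm, dips to ~28 mm, then returns: recognized as a click.
print(is_click([40.0, 36.0, 31.0, 28.0, 33.0, 39.5]))  # True
```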

Example 2

[0108] Example 2: Identify sliding page operations

[0109] a) On a slidable page, such as a web page or the main interface, the finger hovers somewhere above it; the system recognizes and indicates the position of the touch point on the screen, as shown in Figure 9a;

[0110] b) With the height of the finger kept unchanged, the finger moves in the horizontal direction to drag the page, as shown in Figure 9b.
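
A rough sketch of the Example 2 behavior: a drag is assumed when the hover height stays roughly constant while the projected position travels horizontally; the jitter and travel thresholds are illustrative assumptions.

```python
def is_drag(points, height_jitter_mm=3.0, min_travel_mm=15.0):
    """`points` is a list of (x, y, height) samples for a single hovering finger."""
    heights = [p[2] for p in points]
    if max(heights) - min(heights) > height_jitter_mm:
        return False  # vertical motion dominates, so this is not a drag
    (x0, y0, _), (x1, y1, _) = points[0], points[-1]
    travel = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
    return travel >= min_travel_mm

# Height stays near 40 mm while the finger moves ~25 mm horizontally: a drag.
print(is_drag([(10, 10, 40), (18, 10, 41), (27, 11, 40), (35, 11, 39)]))  # True
```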

Example 3

[0111] Example 3: 3D object scaling and rotation operation

[0112] a) For a 3D object that can be scaled and rotated, the finger hovers over it; the system recognizes and marks the touch point, as shown in Figure 10a;

[0113] b) With the vertical distance of the fingers kept unchanged, the two fingers spread apart or pinch together in the horizontal direction to scale the target object, as shown in Figure 10b;

[0114] c) The two fingers rotate in the same direction to rotate the target object, as shown in Figure 10c.
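
An illustrative sketch of distinguishing the two gestures in Example 3 from the two fingers' projected screen positions; the ratio and angle thresholds are assumptions made for the example, not values from the patent.

```python
import math

def classify_two_finger(f1, f2, scale_thresh=1.15, angle_thresh_deg=10.0):
    """f1 and f2 are lists of (x, y) samples for the two hovering fingers."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    def angle(a, b):
        return math.atan2(b[1] - a[1], b[0] - a[0])

    d_start, d_end = dist(f1[0], f2[0]), dist(f1[-1], f2[-1])
    ratio = d_end / d_start if d_start else 1.0
    dtheta = math.degrees(angle(f1[-1], f2[-1]) - angle(f1[0], f2[0]))

    if ratio >= scale_thresh or ratio <= 1.0 / scale_thresh:
        return "scale"   # fingers spread apart or pinched together
    if abs(dtheta) >= angle_thresh_deg:
        return "rotate"  # the line between the fingers has turned
    return "none"

print(classify_two_finger([(0, 0), (-5, 0)], [(20, 0), (25, 0)]))  # scale
print(classify_two_finger([(0, 0), (2, -8)], [(20, 0), (18, 8)]))  # rotate
```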

[0115] It can be seen from the specific implementations above that this application uses algorithms to accurately calculate the distance and coordinates of the finger relative to the screen, and recognizes the user's operation mode through a trained AI neural network, providing users with an accurate and intelligent non-contact screen operation experience.
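
The description only states that the operation mode is recognized by a neural network trained on the contact's dynamic characteristics; the tiny PyTorch classifier below is a hypothetical stand-in, and its input features, layer sizes, and gesture labels are assumptions rather than the patent's actual model.

```python
import torch
from torch import nn

OPERATIONS = ["click", "drag", "scale", "rotate", "none"]  # assumed label set

class GestureClassifier(nn.Module):
    """Maps a vector of dynamic characteristics (e.g. vertical displacement,
    horizontal travel, change in inter-finger distance) to an operation class."""
    def __init__(self, n_features: int = 12, n_classes: int = len(OPERATIONS)):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_features, 32),
            nn.ReLU(),
            nn.Linear(32, n_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

# Untrained example: one feature vector in, one predicted operation label out.
model = GestureClassifier()
features = torch.randn(1, 12)
print(OPERATIONS[model(features).argmax(dim=1).item()])
```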



Abstract

The invention discloses an operation identification method for a non-contact screen. The method comprises the following steps: sending a detection signal and receiving the corresponding return signal; carrying out contact detection using the sent detection signal and the received return signal; for each detected contact, calculating the distance R between the contact and the emission source of the detection signal and the distance D between the projection of the contact onto the screen and the emission source; determining the coordinate information of the projection relative to the emission source according to the distance D and the direction angle theta of the return signal, and correcting the coordinate information with a preset correction coordinate to obtain corrected coordinate information; calculating the dynamic characteristics of the contact; and identifying the operation category from the dynamic characteristics of the contact and the operation object from the corrected coordinate information of the contact's projection on the screen. By applying the method, the user's operation can be accurately identified without adding extra physical equipment; the method is not affected by light interference and supports multi-touch.
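
As a sketch of the geometry described in the abstract: given R, D, and the direction angle theta for a detected contact, one plausible computation is shown below. The polar-to-Cartesian mapping, the additive correction offset, and the numeric values are assumptions, not details taken from the patent.

```python
import math
from dataclasses import dataclass

@dataclass
class Contact:
    R: float      # distance from the emission source to the hovering contact
    D: float      # distance from the emission source to the contact's screen projection
    theta: float  # direction angle of the return signal, in radians

CORRECTION = (1.5, -0.8)  # hypothetical "preset correction coordinate" offsets

def project_contact(c: Contact):
    """Return corrected (x, y) screen coordinates of the projection and the hover height."""
    x = c.D * math.cos(c.theta) + CORRECTION[0]  # assumed polar-to-Cartesian mapping
    y = c.D * math.sin(c.theta) + CORRECTION[1]
    # Hover height follows from the right triangle formed by R, D, and the screen normal.
    height = math.sqrt(max(c.R ** 2 - c.D ** 2, 0.0))
    return x, y, height

print(project_contact(Contact(R=120.0, D=95.0, theta=math.radians(30))))
```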

Description

Technical field

[0001] The present application relates to smart terminal technology, and in particular to an operation recognition method for a non-contact screen.

Background technique

[0002] Non-contact operation of the screen has been introduced in existing smart terminal technology. At present, non-contact screen operation generally adopts either a machine vision method or an electromagnetic signal method.

[0003] Specifically, the machine vision method uses a camera to capture the user's operations and recognizes them from the captured images. However, this approach depends on the camera being turned on, is strongly affected by lighting, judges distance inaccurately, and suffers from defects such as hand occlusion. The electromagnetic signal method uses electromagnetic induction between a tool such as a stylus and the screen to recognize the operations the user performs with the tool, but this processing meth...


Application Information

IPC(8): G06F3/041
CPC: G06F3/0412; G06F2203/04108
Inventor: 张晋, 孙静, 杨建军, 李斌, 杨泗群, 史志建
Owner: SAMSUNG ELECTRONICS CHINA R&D CENT