
Input recognition method and device in virtual scene and storage medium

A virtual-input and virtual-scene technology, applied to the input/output processes of data processing, user/computer interaction input/output, and character and pattern recognition. It addresses the poor immersion and realism of controller-based interaction and achieves the effect of improved immersion and realism in the virtual scene.

Pending Publication Date: 2022-07-01
中数元宇数字科技(上海)有限公司

AI Technical Summary

Problems solved by technology

[0004] However, in this way, the user still needs to interact with a controller or special sensor device in the real world, which weakens the user's sense of immersion and realism.



Examples


Embodiment 1

[0080] Embodiment 1: If the bending angle of the finger is less than 90 degrees, the fingertip coordinates can be calculated from the position of the starting joint point of the second phalanx, the bending angle of the finger, the actual length of the second phalanx, and the actual length of the fingertip segment.

[0081] As shown in Figure 8, the starting joint point of the second phalanx is R2. If R2 can be observed, its position can be calculated with the binocular positioning algorithm. Given the position of R2, the actual length of the second phalanx, and the bending angle b of the second phalanx, the position of the starting joint point R5 of the fingertip segment can be obtained. The fingertip position R6 can then be calculated from the position of R5, the bending angle c of the fingertip segment, and the actual length of the fingertip segment.
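The two-step chain in [0081] can be sketched as planar trigonometry. The patent excerpt gives no formulas, so the function below is a hypothetical 2D model: each bending angle rotates the next segment relative to the horizontal, and the segments are chained from R2 to R5 to R6.

```python
import math

def fingertip_position(r2, len_second, len_tip, angle_b_deg, angle_c_deg):
    """Estimate R5 and the fingertip R6 from the second-phalanx start
    joint R2, segment lengths, and bending angles b and c.
    Hypothetical 2D model; y grows downward (toward the palm)."""
    b = math.radians(angle_b_deg)
    # R5: walk along the second phalanx from R2 at angle b.
    r5 = (r2[0] + len_second * math.cos(b),
          r2[1] - len_second * math.sin(b))
    # R6: the fingertip segment bends a further c relative to the
    # second phalanx, so its direction is b + c.
    c = math.radians(angle_c_deg)
    r6 = (r5[0] + len_tip * math.cos(b + c),
          r5[1] - len_tip * math.sin(b + c))
    return r5, r6

# Straight finger (b = c = 0): the fingertip lies on the x axis.
r5, r6 = fingertip_position((0.0, 0.0), 3.0, 2.0, 0.0, 0.0)
print(r6)  # → (5.0, 0.0)
```

With b = 90 degrees the same chain points the finger straight down, matching the 90-degree boundary between the two embodiments.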

Embodiment 2

[0082] Embodiment 2: If the bending angle of the finger is 90 degrees, the fingertip position is calculated from the position of the starting joint point of the second knuckle and the distance the first knuckle moves toward the at least one virtual input interface.

[0083] It should be noted that when the bending angle of the finger is 90 degrees, the user's fingertip moves together with the first knuckle: for example, if the first knuckle moves down by 3 cm, the fingertip also moves down by 3 cm. Accordingly, when the position of the starting joint point of the second phalanx and the distance the first phalanx moves toward the at least one virtual input interface are known, calculating the fingertip position reduces to the geometric problem of finding an end position given a starting position, a moving direction, and a moving distance.
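The "same movement" observation in [0083] makes the 90-degree case a rigid translation: end position = start + unit(direction) × distance. A minimal sketch, assuming the fingertip's starting position and the first knuckle's movement vector are known (names are hypothetical):

```python
import math

def translate_point(start, direction, distance):
    """End position of a point that moves `distance` along `direction`
    from `start`. Hypothetical helper for the 90-degree case, where the
    fingertip translates rigidly with the first knuckle."""
    # Normalise the movement direction, then scale by the distance moved.
    norm = math.sqrt(sum(d * d for d in direction))
    return tuple(s + d / norm * distance for s, d in zip(start, direction))

# First knuckle moves straight down (negative y) by 3 cm, so a fingertip
# at (1.0, 5.0, 2.0) ends at (1.0, 2.0, 2.0).
print(translate_point((1.0, 5.0, 2.0), (0.0, -1.0, 0.0), 3.0))  # → (1.0, 2.0, 2.0)
```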


Abstract

The embodiment of the invention provides an input recognition method and device in a virtual scene, and a storage medium. In the input recognition method, fingertip coordinates can be calculated with a binocular positioning algorithm from the recognized positions of key points of a hand, and the fingertip coordinates are compared with at least one virtual input interface in the virtual scene; if the fingertip position and a target virtual input interface among the at least one virtual input interface satisfy a set position rule, it is determined that the user has executed an input operation through the target virtual input interface. In this way, the user's fingertip position can be calculated through a binocular positioning algorithm without the user interacting with a controller or special sensor device in the real world, further improving the immersion and realism of the virtual scene.
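The abstract's "binocular positioning algorithm" is not spelled out in this excerpt. A standard rectified-stereo triangulation (an assumption, not the patent's own formula) recovers a key point's 3D position from its pixel coordinates in two cameras: depth z = f·B/d, where d is the horizontal disparity, f the focal length in pixels, and B the baseline between the cameras.

```python
def triangulate(uv_left, uv_right, f, baseline, cx, cy):
    """Rectified-stereo triangulation of one hand key point.

    uv_left, uv_right -- pixel coordinates (u, v) in the two cameras
    f                 -- focal length in pixels (shared intrinsics)
    baseline          -- distance between the camera centres, in metres
    cx, cy            -- principal point of the cameras
    """
    disparity = uv_left[0] - uv_right[0]   # horizontal pixel shift
    z = f * baseline / disparity           # depth from disparity
    x = (uv_left[0] - cx) * z / f          # back-project the left pixel
    y = (uv_left[1] - cy) * z / f
    return (x, y, z)

# Joint seen at u=370 in the left image and u=320 in the right one:
# disparity 50 px with f=500 px and a 6 cm baseline gives depth ~0.6 m.
print(triangulate((370.0, 240.0), (320.0, 240.0), 500.0, 0.06, 320.0, 240.0))
```

In practice this only applies after the images are rectified so that corresponding points share the same image row; real implementations typically use a calibrated stereo pipeline rather than this bare formula.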

Description

Technical field

[0001] The embodiments of the present application relate to the technical field of virtual reality and augmented reality, and in particular to an input recognition method, device, and storage medium in a virtual scene.

Background technique

[0002] With the rapid development of related technologies such as virtual reality, augmented reality, and mixed reality, head-mounted smart devices such as head-mounted virtual reality glasses and head-mounted mixed reality glasses are constantly being introduced, and the user experience is gradually improving.

[0003] In the prior art, smart glasses can generate virtual interfaces such as holographic keyboards and holographic screens, and a controller or a special sensor device can be used to determine whether the user has interacted with the virtual interface, which enables the user to use a keyboard and screen in the virtual world.

[0004] However, in this way, the user still needs to interact with a controller or special sensor device in the real world, which weakens the user's sense of immersion and realism.
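The "set position rule" that decides whether the fingertip has pressed a virtual key (a holographic keyboard, per the background section) is not defined in this excerpt. One plausible rule, purely an assumption for illustration, is that the fingertip lies inside the key's rectangle and within a small depth tolerance of the key's plane:

```python
def pressed(fingertip, key_center, half_w, half_h, depth_tol):
    """Hypothetical 'set position rule' check: the fingertip (x, y, z)
    triggers an input on a virtual key when it lies within the key's
    rectangle (centred at key_center, in the plane z = key_center[2])
    and within depth_tol of that plane. All lengths in metres."""
    dx = abs(fingertip[0] - key_center[0])
    dy = abs(fingertip[1] - key_center[1])
    dz = abs(fingertip[2] - key_center[2])
    return dx <= half_w and dy <= half_h and dz <= depth_tol

# A fingertip 2 mm in front of the centre of a 4x4 cm key, with a 5 mm
# depth tolerance: the input fires.
print(pressed((0.0, 0.0, 0.598), (0.0, 0.0, 0.6), 0.02, 0.02, 0.005))  # → True
```

A real system would likely also debounce (require the fingertip to cross the plane and retreat) so that hovering does not register repeated key presses, but that is beyond what this excerpt specifies.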


Application Information

IPC(8): G06F3/01; G06F3/04886; G06F3/042; G06V40/20
CPC: G06F3/017; G06F3/04886; G06F3/042
Inventor 潘仲光魏铂秦
Owner 中数元宇数字科技(上海)有限公司