Method and device for interaction between VR/AR system and user

A method and device for interaction between a VR/AR system and a user, applied in the field of VR/AR human-computer interaction. It addresses the problems of cumbersome interaction steps, long interaction time and low interaction efficiency, with the effects of reducing cost, reducing interaction steps and data, and improving interaction efficiency.

Publication Date: 2017-03-22 (status: Inactive)
深圳市原点创新有限公司

AI Technical Summary

Problems solved by technology

[0004] The purpose of the embodiments of the present invention is to provide a method for interaction between a VR/AR system and a user, aiming to solve the problems of cumbersome interaction steps between the VR/AR system and the user, long interaction time, and low interaction efficiency.

Examples

Embodiment 1

[0027] Figure 1 is an implementation flowchart of a method for interaction between a VR/AR system and a user provided by an embodiment of the present invention; it is described in detail as follows:

[0028] In step S101, a gesture command transmission channel is embedded in a virtual reality (VR) system or an augmented reality (AR) system;

[0029] In step S102, a camera is used to capture a gesture image;

[0030] In step S103, the captured gesture image is matched against pre-stored glyph features;

[0031] Step S103 specifically includes:

[0032] analyzing the captured gesture image to obtain gesture features; and

[0033] matching the gesture features against the pre-stored glyph features.

[0034] In step S104, when the matching succeeds, the gesture command corresponding to the matched glyph feature is transmitted to the processor through the gesture command transmission channel.

[0035] Step S104 specifically includes:

[0036] When the matching is successful, acquire t...
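
Taken together, steps S101 to S104 describe a capture-match-transmit pipeline. The following Python sketch illustrates one possible reading of that flow. It assumes OpenCV (cv2) is available for camera capture; every other name in it (make_transmission_channel, extract_gesture_features, GLYPH_TEMPLATES, processor.handle) is a hypothetical placeholder rather than part of the patent or of any particular VR/AR SDK.

# Minimal sketch of the S101-S104 flow; all names are illustrative placeholders.
import cv2  # assumed dependency, used only for camera capture (step S102)

# S101: model the "gesture command transmission channel" as a callback that
# forwards a command string to the VR/AR processor (hypothetical interface).
def make_transmission_channel(processor):
    def send_command(command):
        processor.handle(command)
    return send_command

# Pre-stored glyph features mapped to example gesture commands
# (the bindings are illustrative only; the patent does not specify them).
GLYPH_TEMPLATES = {
    "V": "confirm",
    "inverted-V": "cancel",
    "mountain": "open_menu",
    "1": "select",
    "C": "back",
}

def extract_gesture_features(frame):
    # S103: analyze the captured image and return a glyph label, or None if no
    # gesture is recognized. A real system would segment the hand and classify
    # its contour; this stub only fixes the interface.
    return None

def interact_once(processor):
    send_command = make_transmission_channel(processor)  # S101
    capture = cv2.VideoCapture(0)                         # S102: camera capture
    grabbed, frame = capture.read()
    capture.release()
    if not grabbed:
        return
    glyph = extract_gesture_features(frame)               # S103: feature matching
    if glyph in GLYPH_TEMPLATES:                          # matching succeeded
        send_command(GLYPH_TEMPLATES[glyph])              # S104: transmit command

In this sketch the transmission channel is simply a function closure; the patent leaves the concrete form of the channel to the implementation.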

Embodiment 2

[0040] This embodiment of the present invention describes the types of glyph features, which are detailed as follows:

[0041] The glyph features include at least one of a V-shaped feature, an inverted-V-shaped feature, an OVV-shaped feature, a mountain-shaped feature, a 1-shaped feature, and a C-shaped feature.

[0042] In this embodiment of the present invention, multiple types of glyph features are provided so that multiple different gesture commands can subsequently be matched.
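
As an illustration only, the six glyph-feature types listed in paragraph [0041] could be organised in software as an enumeration, with each type bound to a gesture command. The bindings below are hypothetical and are not specified by the patent.

from enum import Enum

class GlyphFeature(Enum):
    # The six glyph-feature types listed in paragraph [0041].
    V_SHAPE = "V"
    INVERTED_V_SHAPE = "inverted-V"
    OVV_SHAPE = "OVV"
    MOUNTAIN_SHAPE = "mountain"
    ONE_SHAPE = "1"
    C_SHAPE = "C"

# Hypothetical binding of each glyph feature to a gesture command.
GESTURE_COMMANDS = {
    GlyphFeature.V_SHAPE: "confirm",
    GlyphFeature.INVERTED_V_SHAPE: "cancel",
    GlyphFeature.OVV_SHAPE: "zoom",
    GlyphFeature.MOUNTAIN_SHAPE: "open_menu",
    GlyphFeature.ONE_SHAPE: "select",
    GlyphFeature.C_SHAPE: "back",
}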

Embodiment 3

[0044] The implementation flow of step S103 of the method for interaction between the VR/AR system and the user provided by an embodiment of the present invention is described in detail as follows:

[0045] matching the captured gesture image with the pre-stored V-shaped feature; or

[0046] matching the captured gesture image with the pre-stored inverted-V-shaped feature; or

[0047] matching the captured gesture image with the pre-stored OVV-shaped feature; or

[0048] matching the captured gesture image with the pre-stored mountain-shaped feature; or

[0049] matching the captured gesture image with the pre-stored 1-shaped feature; or

[0050] matching the captured gesture image with the pre-stored C-shaped feature.

[0051] In this embodiment of the present invention, the captured gesture image is matched against the pre-stored glyph features according to a preset matching sequence, so as to meet the interaction requirements between the VR/AR sy...
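
A minimal sketch of such a preset matching sequence is shown below. The order, the similarity threshold and the match_glyph function (standing in for whatever contour or template comparison a concrete implementation would use) are all illustrative assumptions, not part of the patent.

# Preset matching order over the pre-stored glyph features (illustrative).
MATCHING_ORDER = ["V", "inverted-V", "OVV", "mountain", "1", "C"]

def match_in_sequence(gesture_features, templates, match_glyph, threshold=0.8):
    # Compare the captured gesture features against each pre-stored glyph
    # template in the preset order and return the first glyph whose similarity
    # score reaches the threshold, or None if no pre-stored feature matches.
    for glyph in MATCHING_ORDER:
        score = match_glyph(gesture_features, templates[glyph])
        if score >= threshold:
            return glyph
    return None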

Abstract

The invention is applicable to the field of VR or AR and provides a method and device for interaction between a VR/AR system and a user. The method comprises the steps of: embedding a gesture command transmission channel in a VR system or an AR system; capturing a gesture image with a camera; matching the captured gesture image against pre-stored glyph features; and, when the matching succeeds, transmitting the gesture command corresponding to the matched glyph feature to a processor through the gesture command transmission channel. According to the invention, gestures can satisfy the demand for interaction between the VR/AR system and the user. The beneficial effects are that, on the one hand, interaction time is shortened and interaction efficiency is improved by reducing interaction steps and data, and, on the other hand, the cost of adding external input equipment is reduced.

Description

Technical field

[0001] The invention belongs to the field of virtual reality (VR) or augmented reality (AR), and in particular relates to a method and a device for interaction between a VR/AR system and a user.

Background technique

[0002] Augmented reality (English: Augmented Reality, abbreviation: AR) is a new technology developed on the basis of virtual reality (English: Virtual Reality, abbreviation: VR). It superimposes virtual electronic information on the real world that people perceive, augmenting real-world information so as to help people carry out various activities.

[0003] However, the current interaction steps between VR/AR systems and users are cumbersome and time-consuming, resulting in low interaction efficiency. The reason is that a typical VR/AR system interacts with users through peripheral input devices; because of the large number of interaction steps and the large amount of data, the processing algorithms involved in...

Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F3/01
CPC: G06F3/017
Inventors: 李兵, 张元元
Owner: 深圳市原点创新有限公司