Virtual reality interaction method and system

A virtual reality interaction method and system, applied in the fields of telephone communication, electrical components, and equipment with camera functions. It addresses the problems that existing interaction methods for low-cost phone-box VR glasses require hand and mouth operation and depend heavily on the surrounding environment, and achieves low dependence on the surrounding environment, high efficiency, and high accuracy.

Inactive Publication Date: 2017-04-26
SHANGHAI CONSTRUCTION GROUP

AI Technical Summary

Problems solved by technology

[0002] Existing interaction methods for mobile phone box-style VR glasses, which are very low cost and used together with a mobile phone, include external handles, phone buttons, the phone microphone, the gyroscope, and the rear camera. All of these existing methods require the user's hands or mouth to operate, and they depend heavily on the surrounding environment.



Examples


Embodiment 1

[0029] As shown in Figure 1, the present invention provides a virtual reality interaction method, including:

[0030] Step S1, displaying operation items on the screen of the mobile phone;

[0031] Step S2, identifying the operation item corresponding to the user's eye movement position through the front camera of the mobile phone;

[0032] Step S3, acquiring the blinking action of the user through the front camera of the mobile phone;

[0033] Step S4, performing the operation corresponding to the selected operation item based on the blinking action. The present invention first recognizes the eyeball's moving position through the front camera to select an operation item, and then recognizes a blink through the front camera to confirm it. No device other than the mobile phone is needed, and operation requires only one eye: the user experience is not interrupted, no hand or mouth operation is required, and dependence on the surrounding environment is low, with high efficiency and accuracy.
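Steps S1–S4 above can be sketched as a per-frame loop. The Python code below is a minimal, hypothetical illustration: the operation-item regions, gaze coordinates, and blink events are simulated values, whereas the patent derives the gaze position and blink from front-camera eye tracking. All names (`OperationItem`, `select_item`, `on_frame`) are illustrative assumptions, not part of the patent.

```python
# Hypothetical sketch of the four-step interaction loop (steps S1-S4).
# Gaze coordinates and blink events are simulated; a real implementation
# would obtain them from front-camera eye tracking.

from dataclasses import dataclass
from typing import Callable, List, Optional, Tuple


@dataclass
class OperationItem:
    name: str
    region: Tuple[int, int, int, int]   # screen-space box: (x0, y0, x1, y1)
    action: Callable[[], str]           # operation to run on confirmation


def select_item(items: List[OperationItem],
                gaze_xy: Tuple[int, int]) -> Optional[OperationItem]:
    """Step S2: map the tracked eyeball position to the operation item
    whose on-screen region contains the gaze point."""
    x, y = gaze_xy
    for item in items:
        x0, y0, x1, y1 = item.region
        if x0 <= x <= x1 and y0 <= y <= y1:
            return item
    return None


def on_frame(items: List[OperationItem],
             gaze_xy: Tuple[int, int],
             blink_detected: bool) -> Optional[str]:
    """Steps S2-S4 for one camera frame: select by gaze; if a blink is
    detected (step S3), execute the selected item's operation (step S4)."""
    item = select_item(items, gaze_xy)
    if item is not None and blink_detected:
        return item.action()
    return None


# Step S1: the phone screen displays operation items (simulated here as
# labelled rectangles).
items = [
    OperationItem("open_menu", (0, 0, 100, 50), lambda: "menu opened"),
    OperationItem("go_back", (0, 60, 100, 110), lambda: "went back"),
]

print(on_frame(items, (50, 25), blink_detected=False))  # gaze only -> None
print(on_frame(items, (50, 80), blink_detected=True))   # blink confirms -> went back
```

Selection (gaze) and confirmation (blink) are deliberately separate events, so dwelling on an item without blinking never triggers its operation.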

Embodiment 2

[0041] According to another aspect of the present application, a virtual reality interactive system is also provided, including:

[0042] The display module is used for displaying operation items on the screen of the mobile phone;

[0043] The selection module is used to identify the operation item corresponding to the user's eyeball movement position through the front camera of the mobile phone;

[0044] The confirmation module is used to obtain the blinking action of the user through the front camera of the mobile phone;

[0045] An execution module, configured to execute a corresponding operation of the corresponding operation item based on the blinking action.

[0046] In the virtual reality interactive system according to an embodiment of the present application, in the display module, the selection items displayed on the screen of the mobile phone are operation areas.

[0047] In the virtual reality interactive system according to an embodiment of the present application, t...
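The module decomposition in paragraphs [0041]–[0045] can be sketched as four cooperating classes. All class names and interfaces below are illustrative assumptions, and the camera frame is stubbed as a dictionary rather than real front-camera input.

```python
# Hypothetical sketch of the claimed system: display, selection,
# confirmation, and execution modules. Interfaces are assumptions.

class DisplayModule:
    """Displays operation items on the phone screen (here: records their
    screen regions)."""
    def __init__(self):
        self.items = {}
    def show(self, name, region):
        self.items[name] = region  # region = (x0, y0, x1, y1)


class SelectionModule:
    """Identifies the item whose screen region contains the gaze point."""
    def select(self, items, gaze_xy):
        x, y = gaze_xy
        for name, (x0, y0, x1, y1) in items.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                return name
        return None


class ConfirmationModule:
    """Reports whether the current camera frame contains a blink
    (stubbed: reads a flag instead of analysing an image)."""
    def blinked(self, frame):
        return frame.get("blink", False)


class ExecutionModule:
    """Runs the operation bound to a confirmed item."""
    def __init__(self, handlers):
        self.handlers = handlers
    def run(self, name):
        return self.handlers[name]()


class VRInteractionSystem:
    """Wires the four modules into one per-frame pipeline."""
    def __init__(self, handlers):
        self.display = DisplayModule()
        self.selection = SelectionModule()
        self.confirmation = ConfirmationModule()
        self.execution = ExecutionModule(handlers)
    def process_frame(self, gaze_xy, frame):
        name = self.selection.select(self.display.items, gaze_xy)
        if name is not None and self.confirmation.blinked(frame):
            return self.execution.run(name)
        return None


system = VRInteractionSystem({"confirm": lambda: "operation executed"})
system.display.show("confirm", (0, 0, 10, 10))
print(system.process_frame((5, 5), {"blink": True}))
```

Keeping each responsibility in its own module means, for example, that the stubbed `ConfirmationModule` could be swapped for a real blink detector without touching selection or execution.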



Abstract

The invention provides a virtual reality interaction method and system. An eyeball movement position is recognized through the front camera and an operation item is selected; a blink is then recognized through the front camera and the operation item is confirmed. Interaction is therefore realized by means of the mobile phone alone, requires only one-eye operation, does not interrupt the user experience, and requires no hand or mouth operation by the user; thus the method and system have low dependence on the surrounding environment, high efficiency, and high accuracy.

Description

technical field

[0001] The invention relates to a virtual reality interaction method and system.

Background technique

[0002] Existing interaction methods for mobile phone box-style VR glasses, which are very low cost and used together with a mobile phone, include external handles, phone buttons, the phone microphone, the gyroscope, and the rear camera. All of these existing methods require the user's hands or mouth to operate, and they depend heavily on the surrounding environment.

Contents of the invention

[0003] The purpose of the present invention is to provide a virtual reality interaction method and system that requires no external equipment, requires no hand or mouth operation by the user, and has low dependence on the surrounding environment.

[0004] In order to solve the above problems, the present invention provides a virtual reality interaction method, including:

[0005] Displaying operation items on the phone screen;...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): H04M1/725
CPC: H04M2250/52; H04M1/72454
Inventor: 李晨露
Owner: SHANGHAI CONSTRUCTION GROUP