[0065] In order to have a clearer understanding of the technical features, objectives and effects of the present invention, the specific embodiments of the present invention will be described in detail below with reference to the accompanying drawings. It should be understood that the following description is only a specific description of the embodiments of the present invention, and should not be used to limit the protection scope of the present invention.
[0066] The present invention provides a virtual reality interactive system and method. Its purpose is to adapt to most smart watches 1 on the market through a multi-sensor real-time data collection module, collecting data from the sensors on the smart watch and transmitting them to the virtual reality device side. The real-time transmission of the collected sensor data places high demands on latency. To achieve intelligent interaction, the system intelligently analyzes and understands the sensor data, thereby understanding human commands and states, and combines this rich input with high-quality applications to form a compelling virtual reality experience environment. By providing multiple collection terminals, whose devices can be replaced at any time, several collection terminals can be matched to work at the same time, creating a multi-person online scene. Cloud-based voice semantic recognition is added, with real-time cloud analysis of voice data and intelligent processing of voice input.
[0067] This application relates to a virtual reality device interaction system based on the smart watch 1; it is a virtual reality interactive system comprising both hardware and software.
[0068] The hardware part is mainly composed of a virtual reality device host 2 and a virtual reality display helmet. The former is mainly responsible for running the entire software system, and the latter is mainly responsible for display functions. The software part is mainly composed of the software on the smart watch 1 side and the software on the virtual reality device side. Among them, the former is mainly responsible for data collection and device control, and the latter is mainly responsible for data processing, software operation, display and audio signal output.
[0069] This system realizes the collection and processing of multi-sensor data, and realizes interaction between the human and the smart watch 1, and between the watch and the virtual reality system. Voice, acceleration, gyroscope and direction can all become interactive factors and integrate with the immersive 3D display, embodying a human-centered interaction concept. Users can issue instructions with gestures and voice; the system presents corresponding 3D stereo displays and gives action feedback through sound and vibration. In addition to the above explicit interaction methods, there are also implicit interaction methods such as heart rate and ambient light, which collect relevant information according to the actual situation to form a rich, real and natural interactive environment.
[0070] Referring to figure 1, which is a structural block diagram of a virtual reality interactive system provided by the present invention, the virtual reality interactive system includes:
[0071] The smart watch 1 is used to collect user interaction information of the wearer; the user uses the hand wearing the watch to make corresponding actions or issue corresponding voice commands.
[0072] The host 2 is communicatively connected to the smart watch 1, and is used to receive the user interaction information, analyze and identify it, use the analyzed and identified user interaction information as interactive input information, and, after processing the interactive input information, output the corresponding audio signal, video signal and vibration signal, send the audio signal and video signal to the head-mounted device 3, and feed the audio signal and vibration signal back to the smart watch 1. The smart watch 1 outputs the audio signal and the vibration signal; a user wearing the head-mounted helmet and the smart watch 1 can see the corresponding scene and hear the corresponding sound. That is, the user sees a new picture and hears a new sound through the helmet display 31, and feels the feedback vibration through the smart watch 1.
[0073] The head-mounted device 3 is communicatively connected to the host 2, and the head-mounted device 3 is used to output the video signal and the audio signal.
[0074] Regarding the above communication connection, the communication between the smart watch 1 and the virtual reality device mainly consists of the transmission of the collected sensor data and is mainly wireless, so the connection can be of the following two types (an illustrative transport sketch for the WIFI case follows the list):
[0075] (1) Bluetooth wireless communication technology connection
[0076] (2) Wireless WIFI communication technology connection
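By way of illustration only, the WIFI case above may be realized as a plain TCP stream between the watch end and the virtual reality device end. The class name SensorLink, the port number and the newline-delimited text framing below are assumptions of this sketch rather than features required by the system; the Bluetooth case would use an equivalent serial channel in place of the socket.

```kotlin
import java.io.BufferedReader
import java.io.BufferedWriter
import java.net.Socket

// Illustrative sketch only: a minimal WIFI (TCP) link between the smart watch 1 side
// and the virtual reality device side. Host address, port and newline-delimited
// framing are assumptions of this sketch, not requirements of the system.
class SensorLink(private val hostAddress: String, private val port: Int = 9300) {
    private lateinit var socket: Socket
    private lateinit var reader: BufferedReader
    private lateinit var writer: BufferedWriter

    fun connect() {
        socket = Socket(hostAddress, port)
        reader = socket.getInputStream().bufferedReader()
        writer = socket.getOutputStream().bufferedWriter()
    }

    // Watch -> host: send one formatted sensor record.
    fun send(record: String) {
        writer.write(record)
        writer.newLine()
        writer.flush()
    }

    // Host -> watch: read one line of feedback data (null when the link closes).
    fun readFeedback(): String? = reader.readLine()

    fun close() = socket.close()
}
```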
[0077] Specifically, the smart watch 1 includes:
[0078] The sensor component 11 is used to collect user interaction information of the wearer;
[0079] The data processing and standardization module 12 is electrically connected to the sensor assembly 11. The data processing and standardization module 12 is used to filter and/or format the user interaction information; it actively collects the sensor data and applies simple filtering. On some devices, the data are also reformatted so as to unify the standardized input to the virtual reality device.
[0080] The watch data wireless transmission module 13 is electrically connected to the data processing and standardization module 12 and wirelessly communicates with the host 2. The watch data wireless transmission module 13 is used to send the filtered and formatted user interaction information to the host 2; it realizes wireless two-way data transmission between the watch end and the virtual reality device end.
[0081] The device control module 14 is electrically connected to the watch data wireless transmission module 13. The device control module 14 is used to parse the audio signal and vibration signal fed back by the host 2 and control the watch output module 15 to output the parsed audio signal and vibration signal; the device control module 14 analyzes the content of the feedback data and controls the vibration motor and the speaker to perform the corresponding feedback operations. It is responsible for receiving instructions and controlling the motor vibration and speaker on the smart watch 1 (an illustrative sketch of this feedback handling follows the module list below).
[0082] The watch output module 15 is electrically connected to the device control module 14 for outputting the parsed audio signal and vibration signal.
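As a non-limiting sketch of how the device control module 14 and the watch output module 15 could cooperate, the fragment below parses a simple key=value feedback message and drives a vibration motor and a speaker behind hypothetical interfaces. The message format and the VibrationMotor and WatchSpeaker interfaces are assumptions of this sketch, not part of the claimed system.

```kotlin
// Hypothetical hardware interfaces standing in for the vibration motor 151
// and the watch speaker 152; real drivers depend on the particular smart watch 1.
interface VibrationMotor { fun vibrate(durationMs: Long) }
interface WatchSpeaker { fun play(audioId: String) }

// Sketch of the device control module 14: parse feedback such as
// "vibrate=200;audio=hit_confirm" and dispatch it to the watch output module 15.
class DeviceController(
    private val motor: VibrationMotor,
    private val speaker: WatchSpeaker
) {
    fun handleFeedback(feedback: String) {
        feedback.split(";")
            .mapNotNull { part ->
                val kv = part.split("=", limit = 2)
                if (kv.size == 2) kv[0].trim() to kv[1].trim() else null
            }
            .forEach { (key, value) ->
                when (key) {
                    "vibrate" -> motor.vibrate(value.toLongOrNull() ?: 0L) // vibration signal
                    "audio" -> speaker.play(value)                         // audio signal
                }
            }
    }
}
```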
[0083] Among them, the data processing and standardization module 12, the watch data wireless transmission module 13 and the device control module 14 form the data acquisition module. The data acquisition module is customized according to the mainstream smart watches on the market, and the corresponding data acquisition software is designed according to the different operating systems they carry. At present, there are two main operating systems for mainstream smart watches 1: one is Android and the other is iOS. The data acquisition module can also be customized according to the number of sensors with which different smart watches 1 are equipped.
[0084] Specifically, the user interaction information includes heart rate, gravitational acceleration, angular velocity, direction, ambient light intensity and voice. That is, the data acquisition module automatically collects the data of the heart rate sensor 111, the gravity accelerometer 112, the gyroscope sensor 113, the electronic compass 114, the light sensor 115 and the sound sensor 116, simply filters and formats them, and then passes them to the data wireless transmission module. The data wireless transmission module on the smart watch 1 side continuously transmits real-time data to the data wireless transmission module on the virtual reality device side, and the smart watch 1 side receives the feedback data and sends it to the device control module 14 (an illustrative sketch of such a standardized record follows the sensor list below). The sensor assembly 11 includes:
[0085] The heart rate sensor 111 is used to collect the heart rate of the wearer;
[0086] The gravity accelerometer 112 is used to collect the gravity acceleration of the smart watch 1;
[0087] The gyroscope sensor 113 is used to collect the angular velocity of the smart watch 1;
[0088] The electronic compass 114 is used to collect the direction of the smart watch 1;
[0089] The light sensor 115 is used to collect the ambient light intensity of the smart watch 1;
[0090] The sound sensor 116 is used to collect the voice of the wearer.
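Purely as an illustration of the standardized record produced by the data processing and standardization module 12, the sketch below gathers the six quantities into one record, applies a simple moving-average filter such as might be used for the acceleration components, and formats the record as a single text line for the wireless transmission module. The field names, window size and text format are assumptions of this sketch.

```kotlin
// Sketch of one standardized sensor record carrying the six interaction quantities.
data class SensorRecord(
    val heartRateBpm: Int,                 // heart rate sensor 111
    val acceleration: DoubleArray,         // gravity accelerometer 112 (x, y, z)
    val angularVelocity: DoubleArray,      // gyroscope sensor 113 (x, y, z)
    val headingDeg: Double,                // electronic compass 114
    val ambientLightLux: Double,           // light sensor 115
    val voiceChunk: ByteArray              // sound sensor 116 (raw audio frame)
)

// Simple moving-average filter, one instance per filtered axis (window size assumed).
class MovingAverage(private val window: Int = 5) {
    private val samples = ArrayDeque<Double>()
    fun filter(value: Double): Double {
        samples.addLast(value)
        if (samples.size > window) samples.removeFirst()
        return samples.average()
    }
}

// Format one record as a single text line for the watch data wireless transmission module 13.
fun formatRecord(r: SensorRecord): String =
    "hr=${r.heartRateBpm};acc=${r.acceleration.joinToString(",")};" +
    "gyro=${r.angularVelocity.joinToString(",")};dir=${r.headingDeg};" +
    "lux=${r.ambientLightLux};voiceBytes=${r.voiceChunk.size}"
```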
[0091] Specifically, the watch output module 15 includes:
[0092] The vibration motor 151 is used to output the vibration signal;
[0093] The watch speaker 152 is used to output the audio signal.
[0094] Specifically, the host 2 includes:
[0095] The host data wireless transmission module 21, which is wirelessly communicatively connected to the smart watch 1, is used to receive the user interaction information;
[0096] An analysis and identification module 22, which is in communication connection with the host data wireless transmission module 21, and is used to analyze and identify the user interaction information;
[0097] The operating system module 23, which is communicatively connected with the analysis and recognition module 22, is used to take the analyzed and recognized user interaction information as interactive input information and to process the interactive input information. As shown in the figure, the operating system module 23 includes an operating system and applications. The applications run on the operating system and are responsible for interacting with users, processing interactive data, and relying on the operating system to output the corresponding audio and video signals. The operating system is responsible for interacting with the hardware; it is the computer program that manages and controls hardware and software resources, and any other software must run with its support. On the virtual reality device side, the application runs on the operating system, the application is responsible for interacting with the user, and the operating system is responsible for interacting with the hardware. The application is developed on the basis of the interactive interface and is responsible for processing the input and outputting the corresponding audio and video signals, so that the user can see the corresponding image in the head-mounted device 3 and hear the corresponding sound. On the smart watch 1 side, data collection software matched to the different watches must be installed, which is responsible for collecting data in real time and transmitting them to the virtual reality device. The application receives the input, performs the corresponding processing according to its logic, and outputs the corresponding audio signal, video signal and vibration signal to the operating system.
[0098] The output module 24 is communicatively connected to the operating system module 23 and the host data wireless transmission module 21, and is used to output the corresponding audio signals, video signals and vibration signals. The operating system module 23 uses the output module 24 to send the audio signal and video signal to the head-mounted device 3, and to feed the audio signal and vibration signal back to the smart watch 1. The operating system distributes the video signal and audio signal to the helmet hardware, and passes the vibration signal and audio signal to the data wireless transmission module for feedback to the smart watch 1. The data wireless transmission module on the virtual reality device side transmits the feedback data in real time to the data wireless transmission module on the smart watch 1 side.
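To make the data path on the virtual reality device side concrete, the following sketch shows one possible way of chaining the analysis and identification module 22, the application running in the operating system module 23 and the output module 24 for each received record. The interface names and the InteractiveInput and AppOutput types are illustrative assumptions; the concrete implementations depend on the application and the operating system.

```kotlin
// Hypothetical result of analysis and identification.
data class InteractiveInput(val kind: String, val payload: String)

// Hypothetical output produced by the application for one interactive input.
data class AppOutput(val video: ByteArray, val audio: ByteArray, val watchFeedback: String)

// Illustrative roles of the host 2 modules; concrete implementations are not shown.
interface AnalysisModule { fun analyze(rawRecord: String): InteractiveInput }  // module 22
interface Application { fun process(input: InteractiveInput): AppOutput }      // part of module 23
interface OutputModule {                                                        // module 24
    fun sendVideoToHeadset(video: ByteArray)
    fun sendAudioToHeadset(audio: ByteArray)
    fun feedbackToWatch(feedback: String)   // audio + vibration feedback via module 21
}

// One pass of the host-side loop: receive, analyze, process, distribute outputs.
fun handleRecord(raw: String, analysis: AnalysisModule, app: Application, out: OutputModule) {
    val input = analysis.analyze(raw)         // analyzed and recognized interaction information
    val result = app.process(input)           // application logic on the operating system
    out.sendVideoToHeadset(result.video)      // video signal to the helmet display 31
    out.sendAudioToHeadset(result.audio)      // audio signal to the head-mounted device speaker 32
    out.feedbackToWatch(result.watchFeedback) // audio and vibration feedback to the smart watch 1
}
```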
[0099] Specifically, after receiving the data, the virtual reality device side performs analysis and recognition, including heart rate analysis, acceleration analysis, gyroscope analysis, direction signal recognition, photosensitive signal recognition and voice recognition. The analyzed and recognized signal is transmitted to the application as interactive input (an illustrative analysis sketch follows the unit list below). The analysis and identification module 22 includes:
[0100] The heart rate analysis unit 221 is configured to analyze the heart rate; analyze the received heart rate data to evaluate the physiological state.
[0101] The acceleration analysis unit 222 is configured to analyze the gravity acceleration; analyze the received gravity acceleration sensor data.
[0102] The gyroscope analysis unit 223 is used to analyze the angular velocity; analyze the received gyroscope sensor data.
[0103] The direction signal identification unit 224 is used to identify the direction; and analyze the received electronic compass sensor data.
[0104] The photosensitive signal identification unit 225 is used to identify the ambient light intensity; analyze the received data of the photosensitive sensor.
[0105] The voice recognition unit 226 is used to recognize the voice; recognize the received voice data and parse out the instructions.
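As a hedged illustration of what the above analysis and recognition units could compute, the sketch below classifies the heart rate into coarse physiological states, flags a simple shake-like gesture from the magnitude of the gravity acceleration, and maps a compass heading to a direction label. The thresholds, boundaries and labels are assumptions of this sketch and not claimed values.

```kotlin
import kotlin.math.sqrt

// Sketch of a heart rate analysis (unit 221): coarse physiological state from beats per minute.
// The boundaries used here are illustrative assumptions only.
fun classifyHeartRate(bpm: Int): String = when {
    bpm < 60 -> "resting"
    bpm < 100 -> "normal"
    bpm < 140 -> "active"
    else -> "strenuous"
}

// Sketch of an acceleration analysis (unit 222): flag a shake-like gesture when the
// acceleration magnitude clearly exceeds standing gravity (threshold assumed).
fun isShakeGesture(ax: Double, ay: Double, az: Double, thresholdMs2: Double = 15.0): Boolean {
    val magnitude = sqrt(ax * ax + ay * ay + az * az)
    return magnitude > thresholdMs2
}

// Sketch of a direction signal identification (unit 224): compass heading to a sector label.
fun headingToLabel(headingDeg: Double): String {
    val sectors = listOf("N", "NE", "E", "SE", "S", "SW", "W", "NW")
    val index = (((headingDeg % 360 + 360) % 360 + 22.5) / 45.0).toInt() % 8
    return sectors[index]
}
```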
[0106] Specifically, the output module 24 includes:
[0107] The display signal output unit 241 is used to output video signals;
[0108] The sound signal output unit 242 is used to output audio signals;
[0109] The vibration signal output unit 243 is used to output vibration signals.
[0110] Specifically, the head-mounted device 3 includes:
[0111] The display 31, which is communicatively connected to the display signal output unit, is used to output the video signal; and is responsible for receiving and displaying the video signal.
[0112] The head-mounted device speaker 32 is communicatively connected to the sound signal output unit for outputting the audio signal. Responsible for receiving audio signals and playing them.
[0113] Referring to figure 2, which is a flowchart of a virtual reality interaction method provided by the present invention, the virtual reality interaction method adopts the virtual reality interactive system described above, and the method includes:
[0114] The smart watch 1 collects user interaction information of the wearer;
[0115] The host 2 receives the user interaction information, analyzes and recognizes it, uses the analyzed and recognized user interaction information as interactive input information, processes the interactive input information, outputs the corresponding audio signal, video signal and vibration signal, sends the audio signal and video signal to the headset 3, and feeds the audio signal and vibration signal back to the smart watch 1;
[0116] The smart watch 1 outputs the audio signal and the vibration signal; the head-mounted device 3 outputs the video signal and the audio signal.
[0117] Wherein, the user interaction information includes heart rate, gravitational acceleration, angular velocity, direction, ambient light intensity and voice.
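Tying the earlier sketches together, the following non-limiting fragment walks through one interaction cycle of the method using the illustrative SensorLink, formatRecord, SensorRecord and DeviceController defined above; it is intended only to show the order of the three steps, not a definitive implementation.

```kotlin
// One illustrative interaction cycle of the method (all types are from the earlier sketches).
fun interactionCycle(
    link: SensorLink,
    controller: DeviceController,
    sampleRecord: SensorRecord
) {
    // Step 1: the smart watch 1 collects and standardizes the wearer's interaction information.
    val line = formatRecord(sampleRecord)

    // Step 2: the record is transmitted to the host 2, which analyzes it, runs the
    // application, drives the head-mounted device 3, and returns feedback data.
    link.send(line)
    val feedback = link.readFeedback()

    // Step 3: the smart watch 1 outputs the fed-back audio and vibration signals.
    if (feedback != null) controller.handleFeedback(feedback)
}
```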
[0118] Although the present invention is disclosed above in terms of preferred embodiments, they are not intended to limit the present invention. Any person skilled in the art can make possible changes and modifications without departing from the spirit and scope of the present invention. Therefore, the scope of protection of the present invention shall be subject to the scope defined by the claims of the present invention.