
A sound waveform processing method, device, mobile terminal and VR headset

A head-mounted device and processing technology, applied in the field of communications, which solves problems such as audio that cannot be adjusted in real time, dizziness, and poor user experience, and achieves better immersion and a better user experience

Active Publication Date: 2018-01-09
NUBIA TECHNOLOGY CO LTD

AI Technical Summary

Problems solved by technology

As a result, during audio playback, the distance to a sounding object perceived by the human ear deviates to some degree from the image distance of that object seen by the human eye, and cannot be adjusted in real time to match the perceived image distance. For VR users, this factor greatly reduces immersion.
Clearly, current technology cannot meet users' growing demand for immersion, and the deviation between sound and image easily causes fatigue and dizziness, resulting in a poor user experience.

Method used



Examples


Embodiment 1

[0110] Figure 5 is a flow chart of a sound waveform processing method in an embodiment of the present invention; the method of this embodiment is applicable to a VR head-mounted device. A sound waveform processing method of this embodiment is described below with reference to Figure 5. As shown in Figure 5, the method of this embodiment includes:

[0111] S10. Collect the sound wave signal, and acquire sound source angle data of the sound wave signal;

[0112] S20. Determine, according to the optical parameters of the optical lens of the virtual reality head-mounted device, the actual distance from the sound source of the sound wave signal to the designated object, and determine the proportional relationship between this actual distance and the default distance from the sound source to the designated object;

[0113] S30. Correct the sound wave signal in real time according to the sound source angle data and the proportional relationship.

[0114] The method of this embodimen...
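Steps S10 to S30 can be sketched in a few lines of Python. This is only a minimal illustration under assumed formulas, since the patent text shown here does not disclose its concrete correction math: an inverse-distance gain and a constant-power pan are stand-ins, and the function and parameter names are hypothetical.

```python
import math

def correct_waveform(samples, angle_deg, actual_dist, default_dist):
    """Correct a mono waveform using the source angle (S10) and the
    actual/default distance ratio (S20), as in step S30.

    Hypothetical sketch: the patent does not give concrete formulas,
    so an inverse-distance gain and a constant-power pan are assumed.
    """
    ratio = actual_dist / default_dist
    # A source farther away than the default distance is attenuated.
    gain = 1.0 / ratio
    # Constant-power pan: map angle_deg in [-90, 90] onto [0, 90] degrees.
    theta = math.radians((angle_deg + 90.0) / 2.0)
    left = [s * gain * math.cos(theta) for s in samples]
    right = [s * gain * math.sin(theta) for s in samples]
    return left, right
```

Under these assumptions, a source straight ahead (angle 0°) at twice the default distance is played equally on both channels at half amplitude times the pan factor.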

Embodiment 2

[0143] Figure 11 is a schematic diagram of a sound waveform processing device according to an embodiment of the present invention. As shown in Figure 11, the sound waveform processing device 0001 of this embodiment includes:

[0144] An acquisition module 1000, configured to acquire a sound wave signal, and acquire sound source angle data of the sound wave signal;

[0145] A determination module 2000, configured to determine, according to the optical parameters of the optical lens of the virtual reality head-mounted device, the actual distance and angle from the sound source of the sound wave signal to the specified object, and to determine the proportional relationship and angle between that actual distance and the default distance from the sound source to the specified object;

[0146] A correction module 3000, configured to correct the sound wave signal in real time according to the sound source angle data ...

Embodiment 3

[0149] Figure 12 is a schematic diagram of a sound waveform processing device according to an embodiment of the present invention. As shown in Figure 12, the acquisition module 1000 includes:

[0150] A recording start unit 1010, configured to start the recording function of the mobile terminal and collect sound wave signals in real time;

[0151] The parameter collection unit 1020 is configured to collect the frequency response, time difference between microphones and intensity difference between microphones of the sound wave signal through a microphone array composed of two microphones with a relative angle of 180° on the specified object;

[0152] The comparison unit 1030 is configured to compare against a preset sound source angle database and obtain the sound source angle data of the sound wave signal according to the frequency response, the inter-microphone time difference and the inter-microphone intensity difference.

[0153] In this embodimen...
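For a two-microphone array like the one unit 1020 reads from, the inter-microphone time difference alone already constrains the source angle under a far-field model. The following sketch shows that textbook time-difference-of-arrival relation; it is an illustrative substitute for the database lookup performed by comparison unit 1030, not the patent's disclosed method, and the names are hypothetical.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly room temperature

def source_angle_from_tdoa(time_diff_s, mic_spacing_m):
    """Estimate the source angle (degrees off-center) from the
    inter-microphone time difference, assuming a far-field source
    and two microphones mic_spacing_m apart."""
    # Extra path length the sound travels to reach the far microphone.
    path_diff = SPEED_OF_SOUND * time_diff_s
    # Clamp to the physically possible range before taking arcsin.
    s = max(-1.0, min(1.0, path_diff / mic_spacing_m))
    return math.degrees(math.asin(s))
```

A zero time difference maps to a source straight ahead (0°), while the maximum possible time difference (spacing divided by the speed of sound) maps to a source directly to one side (90°).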



Abstract

A sound waveform processing method comprises the steps of: collecting sound wave signals and obtaining sound source angle data of the sound wave signals; determining, according to optical parameters of an optical lens of a virtual reality head-mounted device, the actual distance from the sound source of the sound wave signals to an assigned object, and determining the proportional relation of the actual distance to a default distance from the sound source to the assigned object; and correcting the sound wave signals in real time according to the sound source angle data and the proportional relation. With this scheme, the audio playback distance and relative angle of a sounding object are adjusted in real time according to the optical parameters of the VR head-mounted device, so that the sounding object seen by the user's eyes and the sounding object heard by the ears are at the same position, no audio-visual deviation is produced, and the user experience is improved.
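The abstract ties the correction to the lens's optical parameters. One plausible reading is that the headset computes the virtual-image distance of the display panel through its lens; a textbook thin-lens sketch of that idea (an assumption, not the patent's disclosed formula) would be:

```python
def perceived_image_distance(panel_dist_m, focal_length_m):
    """Thin-lens estimate of how far away the user perceives the panel.

    In a VR headset the panel sits just inside the focal length, so the
    lens forms a magnified virtual image. This is a hypothetical
    stand-in for the 'optical parameters' step of the method.
    """
    # Thin-lens equation 1/f = 1/d_o + 1/d_i (real-is-positive signs):
    # with d_o < f, d_i comes out negative, meaning a virtual image.
    d_i = 1.0 / (1.0 / focal_length_m - 1.0 / panel_dist_m)
    return abs(d_i)
```

For example, a panel 45 mm behind a 50 mm focal-length lens is perceived about 0.45 m away; such a perceived distance could then feed the actual/default distance ratio used to scale the audio.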

Description

technical field [0001] This application relates to, but is not limited to, the field of communication technology, and in particular to a sound waveform processing method, device, mobile terminal and VR headset. Background technique [0002] With current technology, in the virtual scene provided by a VR (Virtual Reality) head-mounted device, the sound emitted by a virtual object is not associated with the optical parameters of the current VR head-mounted device, and the direction and volume of the sound are not adjusted in real time according to the image distance of the object as perceived by the user's eyes. As a result, during audio playback, the distance to the sounding object perceived by the human ear deviates to some degree from the image distance seen by the human eye and cannot be adjusted in real time to match it. For VR experience users, this factor gr...

Claims


Application Information

Patent Type & Authority: Patents (China)
IPC(8): H04S1/00; G10L19/008
CPC: G10L19/008; H04S1/002; H04S1/005
Inventor 张圣杰
Owner NUBIA TECHNOLOGY CO LTD