Method and apparatus for sound object following

A sound-object and sound-processing technology applied in the field of multimedia signal processing. It addresses the problems that conventional 3D audio processing technologies such as Dolby Atmos are difficult to apply to content for broadcasting and real-time streaming, cost considerable time and money, and are therefore limited in use, with the effect of improving the sense of immersion.

Inactive Publication Date: 2020-09-10
LG ELECTRONICS INC
View PDF · 0 Cites · 2 Cited by
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Benefits of technology

[0007]An object of the present disclosure is to provide a method and apparatus for localizing a sound image of an audio object based on the location of a video object related to the audio object to improve the sense of immersion regarding audio.
[0008]Another object of the present disclosure is to provide a method and apparatus for effectively determining a relationship between an audio object and a video object contained in a multimedia signal.
[0026]According to the present disclosure, the sense of immersion in audio may be improved by localizing a sound image of an audio object based on the location of a video object related to the audio object.
[0027]In addition, according to the present disclosure, a relationship between an audio object and a video object contained in a multimedia signal may be effectively determined.
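As an illustration of the localization described above (not taken from the patent; the function and its mapping are hypothetical), the following sketch pans a mono audio object between two stereo channels according to the normalized horizontal screen position of its related video object, using standard constant-power panning:

```python
import math

def pan_gains(x_norm: float) -> tuple[float, float]:
    """Constant-power stereo gains for a sound image at normalized
    screen position x_norm (0.0 = left edge, 1.0 = right edge).

    Hypothetical helper: the patent does not specify this mapping."""
    theta = x_norm * math.pi / 2  # map [0, 1] -> [0, pi/2]
    return math.cos(theta), math.sin(theta)  # (left gain, right gain)

# A centered object (x_norm = 0.5) gets equal gains of cos(pi/4) ~= 0.707;
# total power gL**2 + gR**2 stays 1 for any position.
gl, gr = pan_gains(0.5)
```

Constant-power (rather than linear) panning keeps the perceived loudness of the object constant as it moves across the screen.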

Problems solved by technology

Accordingly, creating content based on the conventional 3D audio processing technology may take a lot of time and money.
Further, it may be difficult to apply the conventional 3D audio processing technology to content for broadcasting and real-time streaming.
Thus, 3D audio processing technologies such as Dolby Atmos are limited to some content, such as movies, and not yet applied to most broadcast and streaming content.
For audio content contained in most broadcast and streaming content, a sound image is not properly localized according to the location of the sound source or video object, and therefore the sense of immersion created is limited.




Embodiment Construction

[0048]Reference will now be made in detail to the preferred embodiments of the present disclosure, examples of which are illustrated in the accompanying drawings. Wherever possible, the same reference numbers will be used throughout the drawings to refer to the same or like parts.

[0049]In order to apply 3D audio processing technologies such as Dolby Atmos to broadcast or real-time streaming content, audio mixing technicians need to generate and transmit mixing parameters for 3D effects in real time. Current technology has difficulty performing such real-time processing. In particular, to properly apply a 3D audio processing technology such as Dolby Atmos, the locations of the loudspeakers set up on the user (or decoder) side must be accurately identified. However, it is rarely possible for the content producer and supplier (or encoder) to identify all the information on the loudspeaker locations in a typical house. Therefore, there is a technical difficulty in applyi...



Abstract

The present disclosure relates to a method and apparatus for processing a multimedia signal. More specifically, the present disclosure relates to a method comprising obtaining a video frame and an audio frame from the multimedia signal; obtaining at least one video object from the video frame and at least one audio object from the audio frame; determining a correlation between the at least one video object and the at least one audio object; and performing a directional rendering on a specific audio object of the at least one audio object, based on a screen location of a specific video object related to the specific audio object from among the at least one video object according to the determined correlation; and to an apparatus therefor.
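The processing chain in the abstract (extract objects, correlate them, render directionally) could be sketched as follows. This is an illustrative reconstruction, not the patent's algorithm: the activity-envelope features, data layout, and function names are all assumptions.

```python
import math

def correlate(a: list[float], b: list[float]) -> float:
    """Pearson correlation between two equal-length activity envelopes
    (hypothetical features, e.g. mouth-motion energy vs. loudness)."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = math.sqrt(sum((x - ma) ** 2 for x in a)
                    * sum((y - mb) ** 2 for y in b))
    return num / den if den else 0.0

def match_objects(video_objs: dict, audio_objs: dict) -> list[tuple]:
    """For each audio object, pick the video object whose activity
    envelope correlates best; return (audio_id, video_id, screen_x)
    triples that a renderer could then use for directional panning."""
    out = []
    for aid, a_env in audio_objs.items():
        best = max(video_objs,
                   key=lambda vid: correlate(video_objs[vid][1], a_env))
        out.append((aid, best, video_objs[best][0]))
    return out

# Hypothetical data: video objects map id -> (screen_x, activity envelope).
video = {"face1": (0.2, [0, 1, 0, 1, 0, 1]),
         "face2": (0.8, [1, 0, 1, 0, 1, 0])}
audio = {"voice": [0.1, 0.9, 0.1, 0.9, 0.1, 0.9]}
pairs = match_objects(video, audio)  # "voice" matches "face1" at x = 0.2
```

The returned screen position would then feed the directional-rendering step so the voice appears to come from the matching on-screen face.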

Description

CROSS-REFERENCE TO RELATED APPLICATIONS[0001]Pursuant to 35 U.S.C. § 119(e), this application claims the benefit of U.S. Provisional Application No. 62/815,361, filed on Mar. 8, 2019, the contents of which are all hereby incorporated by reference herein in its entirety.FIELD[0002]The present disclosure relates to multimedia signal processing, and more particularly, to a method and apparatus for providing sound with a three-dimensional sound effect and a sense of immersion through an apparatus configured to output a video signal and an audio signal.BACKGROUND[0003]With advances in technology, devices equipped with a larger and higher-resolution display and multiple speakers have become widespread. In addition, research on video coding technology for transmitting and receiving more vivid images and audio coding technology for transmitting and receiving immersive audio signals has been actively conducted, and multimedia contents produced or created based on the video coding technology ...

Claims


Application Information

Patent Type & Authority: Applications (United States)
IPC(8): H04R29/00; G06K9/00; G06V10/75
CPC: G06K9/00718; H04R29/001; G06K9/00744; H04R29/008; H04W4/40; H04W4/02; H04W4/027; G06V40/171; G06V20/46; H04S2400/11; H04S7/30; H04R2499/15; H04R2499/13; H04H20/89; H04H20/95; G06V40/20; G06V10/806; G06V10/75; G06F18/22; G06F18/253; H04N21/4394; H04N21/44008; H04N21/4341; H04N5/607; H04S3/008; H04N21/47205; G10L25/24; G10L25/57; H04S2400/01; G06V20/41; G06V40/168
Inventor: JUNG, SUNGWON; CHOI, TACKSUNG; KANG, DONGHYUN; LEE, SEUNGSU; CHO, TAEGIL
Owner LG ELECTRONICS INC