
Multimedia digital fusion method and device

A multimedia data fusion method, applied in the field of multimedia digital fusion methods and devices, addressing the problems that existing audio and video fusion approaches are limited in variety, have low fusion accuracy, and lack ease of use and practicality.

Active Publication Date: 2020-06-05

AI Technical Summary

Problems solved by technology

However, current audio and video fusion methods are relatively limited: they cannot express content in multiple ways accurately and quickly, and the precision of audio and video fusion is low, so existing approaches lack ease of use and practicality. How to make the underlying fused features more representative of visual, auditory, and semantic information is a problem that needs to be solved.



Examples


Embodiment 1

[0026] Figure 1(a)-(c) shows a schematic flowchart of a multimedia digital fusion method in one embodiment, which specifically includes the following steps:

[0027] Step 11: Acquire the multimedia data set to be fused.

[0028] Step 12: Analyze each piece of audio and video data in the multimedia data set according to a preset strategy, and determine the classification information of the multimedia data set through a preset classification model based on the generated analysis results.
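To make steps 11 and 12 concrete, the sketch below walks a small multimedia data set through a preset analysis strategy and a preset classification model. The summary does not specify either component, so the AVItem container, the string-length "strategy", and the threshold "model" are purely illustrative stand-ins.

```python
# Minimal sketch of steps 11-12: analyze each item with a preset strategy,
# then derive classification information for the whole set with a preset model.
# All names and the toy strategy/model below are assumptions, not the patent's.
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class AVItem:
    path: str                                     # one audio/video file in the set to be fused
    features: Dict = field(default_factory=dict)  # analysis results filled in later

def analyze(item: AVItem, strategy: Callable[[str], Dict]) -> AVItem:
    """Apply the preset strategy to one piece of audio/video data."""
    item.features = strategy(item.path)
    return item

def classify(items: List[AVItem], model: Callable[[Dict], str]) -> Dict[str, List[str]]:
    """Map each item's analysis results to a label and group the data set by label."""
    info: Dict[str, List[str]] = {}
    for it in items:
        info.setdefault(model(it.features), []).append(it.path)
    return info

# Step 11: acquire the data set; Step 12: analyze and classify it.
dataset = [AVItem("clip_a.mp4"), AVItem("long_interview_b.mp4")]
analyzed = [analyze(it, strategy=lambda p: {"name_len": len(p)}) for it in dataset]
print(classify(analyzed, model=lambda f: "short" if f["name_len"] < 12 else "long"))
```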

[0029] In one embodiment, before each piece of audio and video data in the multimedia data set is analyzed according to the preset strategy and the classification information of the multimedia data set is determined through the preset classification model based on the generated analysis results, the method further includes:

[0030] Step 111: Acquire multiple types, multiple categories under each type, and multiple image samples corresponding to each category as a training data set.

[0031] Step 211: Train a prese...
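The truncated steps above describe building a training data set organized as types, categories under each type, and image samples per category, and then training a preset classification model on it. As a rough illustration only (the actual feature extraction and model are not given in this summary), the sketch below stores per-category feature vectors and trains a nearest-centroid classifier as a stand-in for the preset model:

```python
# Illustrative training-set layout for step 111 and a toy "model" for training.
# training_set[type_name][category_name] -> list of image samples (here: feature vectors).
# The nearest-centroid classifier is an assumption, not the patent's actual model.

training_set = {
    "speech": {"news": [[0.1, 0.9], [0.2, 0.8]], "interview": [[0.3, 0.7]]},
    "music":  {"live": [[0.9, 0.1]], "studio": [[0.8, 0.2], [0.7, 0.3]]},
}

def train_centroids(training_set):
    """Collapse each (type, category) into a mean feature vector (centroid)."""
    centroids = {}
    for type_name, categories in training_set.items():
        for cat_name, samples in categories.items():
            dim = len(samples[0])
            centroids[(type_name, cat_name)] = [
                sum(s[i] for s in samples) / len(samples) for i in range(dim)
            ]
    return centroids

def predict(centroids, feature):
    """Return the (type, category) whose centroid is nearest to the feature vector."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda k: dist2(centroids[k], feature))

model = train_centroids(training_set)
print(predict(model, [0.75, 0.25]))   # -> ('music', 'studio')
```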

Embodiment 2

[0046] The following embodiment further considers the recognition performance of visual information in an acoustically noisy environment, so as to further improve the accuracy of multimedia digital fusion and the practicality of the operation.

[0047] Figure 2(a)-(b) shows a schematic flowchart of a multimedia digital fusion method in another embodiment, which specifically includes the following steps:

[0048] Step 21: Acquire the multimedia data set to be fused.

[0049] Step 22: Find the category of each piece of audio and video data in the multimedia data set from the preset multimedia database according to the preset strategy, and count the frequency of occurrence of each category.

[0050] In step 22, the preset strategy may be pre-configured and is used to find the category of each piece of audio and video data. The preset strategy includes: presetting one or more keywords used to identify the category of each audio an...
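Assuming the keyword-based form of the preset strategy described above, step 22 can be pictured as a lookup of each item's category in a preset multimedia database followed by a frequency count. The database contents, titles, and keyword matching below are illustrative assumptions only:

```python
# Sketch of step 22 under a keyword-based preset strategy: look up each item's
# category in a preset multimedia database, then count category frequencies.
from collections import Counter

# Preset multimedia database, assumed here to map keyword -> category.
preset_db = {"concert": "music", "lecture": "speech", "match": "sport"}

def lookup_category(title: str, db: dict, default: str = "unknown") -> str:
    """Return the category of the first preset keyword found in the item's title."""
    for keyword, category in db.items():
        if keyword in title.lower():
            return category
    return default

titles = ["Evening concert 2019", "Physics lecture 03",
          "Cup match highlights", "Street concert clip"]
categories = [lookup_category(t, preset_db) for t in titles]
frequency = Counter(categories)      # frequency of occurrence of each category
print(frequency)                     # Counter({'music': 2, 'speech': 1, 'sport': 1})
```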



Abstract

The invention provides a multimedia digital fusion method. The method comprises the following steps: obtaining a to-be-fused multimedia data set; analyzing each piece of audio and video data of the multimedia data set according to a preset strategy, and determining classification information of the multimedia data set through a preset classification model according to a generated analysis result; extracting at least two to-be-processed audio and video data frame sequences under the same classification information of the multimedia data set; defining the at least two to-be-processed audio and video data frame sequences as fusion frames, and defining the other audio and video data frame sequences as calibration frames; and fusing the fusion frames and the calibration frames to complete the fusion operation of the multimedia data set. According to the method, the multimedia digital fusion operation can be completed accurately and quickly, and the fusion operation is easy to use and practical. The invention further provides a multimedia digital fusion device.
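For orientation only, here is a highly simplified sketch of the abstract's final step: two frame sequences extracted under the same classification information are designated fusion frames and calibration frames and then blended. Representing frames as numeric vectors and fusion as a weighted average is an assumption; the summary does not disclose the actual fusion operator.

```python
# Toy rendering of "fuse the fusion frames and the calibration frames".
# Frames are simplified to numeric vectors; fusion is a weighted average.
from typing import List

Frame = List[float]

def fuse_sequences(fusion_frames: List[Frame],
                   calibration_frames: List[Frame],
                   weight: float = 0.7) -> List[Frame]:
    """Blend each fusion frame with the corresponding calibration frame."""
    fused = []
    for f, c in zip(fusion_frames, calibration_frames):
        fused.append([weight * fv + (1.0 - weight) * cv for fv, cv in zip(f, c)])
    return fused

# Two frame sequences extracted under the same classification information:
seq_a = [[1.0, 2.0], [3.0, 4.0]]   # designated as fusion frames
seq_b = [[0.0, 0.0], [1.0, 1.0]]   # remaining sequence used as calibration frames
print(fuse_sequences(seq_a, seq_b))  # [[0.7, 1.4], [2.4, 3.1]]
```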

Description

Technical Field

[0001] The present disclosure relates to the technical field of multimedia and image processing, and in particular to a multimedia digital fusion method and device.

Background Technique

[0002] With the development of science and technology, multimedia technology occupies an irreplaceable position in people's daily life. Displaying corresponding text information and picture information while playing audio can make the audio presentation more expressive. However, current audio and video fusion methods are relatively limited: they cannot express content in multiple ways accurately and quickly, the precision of audio and video fusion is low, and existing approaches lack ease of use and practicality. How to make the underlying fused features more representative of visual, auditory, and semantic information is a problem that needs to be solved.

Contents of the Invention

[0003] In order to solve the technical problems in the prior art, the embodiment of the present disclosur...

Claims


Application Information

IPC(8): H04N5/262, H04N5/265, G06F16/483, G06F16/45
CPC: G06F16/45, G06F16/483, H04N5/262, H04N5/265
Inventors: 焦彦柱, 张浩
Owner: 三亚至途科技有限公司