A multimedia digital fusion method and device
A fusion method for multimedia data, applied in the field of multimedia digital fusion methods and devices, which can solve problems such as the singleness of existing fusion methods, their inability to express content in multiple ways, and their lack of ease of use and practicality.
Active Publication Date: 2021-05-25
海南风语筑数字科技有限公司
AI Technical Summary
Problems solved by technology
However, current audio and video fusion methods are relatively limited: they cannot express content in multiple ways accurately and quickly, and the precision of audio-video fusion is low, so they lack ease of use and practicality. How to make the underlying fused visual, auditory, and semantic features more representative is a problem that needs to be solved.
Method used
Examples
Embodiment 1
[0026] As shown in Figure 1(a)-(c), which is a schematic flowchart of a multimedia digital fusion method in an embodiment, the method specifically includes the following steps:
[0027] Step 11, acquire the multimedia data set to be fused.
[0028] Step 12: Analyze each audio and video data of the multimedia data set according to a preset strategy, and determine the classification information of the multimedia data set through a preset classification model based on the generated analysis results.
[0029] In one embodiment, before each audio and video data item of the multimedia data set is analyzed according to the preset strategy and the classification information of the multimedia data set is determined through the preset classification model from the generated analysis results, the method further includes:
[0030] Step 111, acquiring multiple types, multiple categories of each type, and multiple image samples corresponding to each category as a training data set.
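Steps 11, 12, and 111 above can be sketched as follows. The hierarchy (type → category → image samples) follows step 111; the sample labels and data structures are illustrative assumptions, not the patent's actual implementation.

```python
from collections import defaultdict

def build_training_set(samples):
    """Group labelled image samples into a type -> category -> [samples]
    hierarchy, as in step 111. `samples` is an iterable of
    (type, category, image) tuples."""
    training = defaultdict(lambda: defaultdict(list))
    for typ, category, image in samples:
        training[typ][category].append(image)
    return training

# Hypothetical labelled samples; images are stand-in strings.
samples = [
    ("audio", "speech", "clip_a"),
    ("audio", "music", "clip_b"),
    ("video", "landscape", "frame_c"),
]
training_set = build_training_set(samples)
print(sorted(training_set["audio"].keys()))  # ['music', 'speech']
```

The preset classification model of step 12 would then be trained on this hierarchy; the patent does not disclose the model architecture here.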
[0046] The following embodiments further consider the recognition performance of visual information in an acoustic noise environment, especially in a noisy environment, to further improve the accuracy of multimedia digital fusion and the applicability of operation.
[0047] As shown in Figure 2(a)-(b), which is a schematic flowchart of a multimedia digital fusion method in another embodiment, the method specifically includes the following steps:
[0048] Step 21, acquire the multimedia data set to be fused.
[0049] Step 22: Look up the category of each audio and video data item in the multimedia data set from the preset multimedia database according to the preset strategy, and count the frequency of occurrence of each category.
[0050] In step 22, the preset strategy may be pre-configured and is used to determine the category of each audio and video data item. The preset strategy includes: presetting one or more keywords used to identify the category of each audio an...
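The keyword lookup and frequency count of step 22 could look like the sketch below. The keyword table, item titles, and matching rule are illustrative assumptions; the patent only specifies that preset keywords identify each category.

```python
from collections import Counter

# Hypothetical keyword table mapping category -> identifying keywords.
CATEGORY_KEYWORDS = {
    "news": ["broadcast", "report"],
    "music": ["concert", "song"],
}

def find_category(title, keywords=CATEGORY_KEYWORDS):
    """Return the first category whose keyword appears in the item's title."""
    for category, words in keywords.items():
        if any(w in title.lower() for w in words):
            return category
    return "unknown"

titles = ["Evening news report", "Live concert recording", "Pop song demo"]
freq = Counter(find_category(t) for t in titles)
print(freq)  # Counter({'music': 2, 'news': 1})
```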
Abstract
The disclosure provides a multimedia digital fusion method. The method acquires a multimedia data set to be fused; analyzes each audio and video data item of the multimedia data set according to a preset strategy, and determines the classification information of the multimedia data set through a preset classification model based on the generated analysis results; extracts at least two audio and video data frame sequences to be processed under the same classification information of the multimedia data set; defines at least two of the audio and video data frame sequences to be processed as fusion frames, and defines the other audio and video data frame sequences as calibration frames; and fuses the fusion frames and the calibration frames to complete the fusion operation on the multimedia data set. The method can complete the multimedia digital fusion operation accurately and quickly, and the fusion operation is easy to use and practical. The disclosure also proposes a multimedia digital fusion device.
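The abstract's pipeline ends by fusing each fusion frame with a calibration frame. A minimal sketch is shown below; the frame representation (flat lists of pixel intensities) and the simple averaging rule are illustrative assumptions, since the abstract does not disclose the actual fusion operator.

```python
def fuse(fusion_frames, calibration_frames):
    """Blend each fusion frame with its paired calibration frame by
    element-wise averaging (an assumed fusion rule, for illustration)."""
    return [
        [(a + b) / 2 for a, b in zip(f, c)]
        for f, c in zip(fusion_frames, calibration_frames)
    ]

fusion = [[0.0, 0.5], [1.0, 1.0]]
calibration = [[1.0, 0.5], [0.0, 0.0]]
print(fuse(fusion, calibration))  # [[0.5, 0.5], [0.5, 0.5]]
```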
Description
Technical field
[0001] The present disclosure relates to the technical field of multimedia and image processing, and in particular to a multimedia digital fusion method and device.
Background technique
[0002] With the development of science and technology, multimedia technology occupies an irreplaceable position in people's daily life. Displaying corresponding text and picture information while playing audio can make the audio presentation more expressive. However, current audio and video fusion methods are relatively limited: they cannot express content in multiple ways accurately and quickly, and the precision of audio-video fusion is low, so they lack ease of use and practicality. How to make the underlying fused visual, auditory, and semantic features more representative is a problem that needs to be solved.
Contents of the invention
[0003] In order to solve the technical problems in the prior art, the embodiment of the present disclosur...
Claims