
Mobile inspection video quality correction method based on significance multi-feature fusion

A multi-feature fusion and video quality technology, applied to digital video signal modification, TV, and image data processing, which solves the problem that mobile inspection video quality cannot be effectively evaluated and corrected, and achieves improved user experience, high recognition accuracy, and accurate results.

Active Publication Date: 2019-10-08
CHINA UNIV OF MINING & TECH

AI Technical Summary

Problems solved by technology

[0005] In view of the above analysis, the embodiments of the present invention aim to provide a mobile inspection video quality correction method based on salient multi-feature fusion, to solve the problem that the existing technology cannot effectively evaluate and correct the quality of mobile inspection videos.

Method used



Examples


Embodiment 1

[0089] A specific embodiment of the present invention discloses a mobile inspection video quality correction method based on salient multi-feature fusion. As shown in Figure 1, the method includes the following steps:

[0090] S1. Partition any still image containing the object to be detected in the mobile inspection video into blocks, determine all macroblocks in the result that contain identification features of the object to be detected, and compute the saliency factor of each macroblock;

[0091] S2. Use each of the above macroblocks to traverse the other images in the mobile inspection video, find the image block most similar to the macroblock in each frame, and thereby obtain the motion vector of each macroblock; combine it with the macroblock's saliency factor to obtain the saliency matrix of each frame (a sketch of the block matching is given below);

[0092] S3. According to the obtained saliency matrix, determine the block-effect eigenvalue, blur-effect eigenvalue, and information-entropy eigenvalue of the mobile inspection video...
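As a rough illustration of step S2, the sketch below performs exhaustive block matching of one macroblock against a search window in the next frame, using the sum of absolute differences (SAD) as the similarity measure; the resulting displacement is the macroblock's motion vector. The function name, the search radius, and SAD as the metric are illustrative assumptions, not values fixed by the patent.

```python
import numpy as np

def match_macroblock(macroblock, next_frame, top_left, search_radius=16):
    """Find the block in `next_frame` most similar to `macroblock` (hypothetical helper).

    macroblock    : 2-D grayscale array (e.g. 16x16) taken from the reference frame
    next_frame    : 2-D grayscale array of the frame being searched
    top_left      : (row, col) of the macroblock in the reference frame
    search_radius : half-width of the square search window (assumed value)

    Returns the motion vector (d_row, d_col) and the best SAD score.
    """
    h, w = macroblock.shape
    r0, c0 = top_left
    best_sad, best_mv = np.inf, (0, 0)

    for dr in range(-search_radius, search_radius + 1):
        for dc in range(-search_radius, search_radius + 1):
            r, c = r0 + dr, c0 + dc
            # Skip candidate positions that fall outside the frame.
            if r < 0 or c < 0 or r + h > next_frame.shape[0] or c + w > next_frame.shape[1]:
                continue
            candidate = next_frame[r:r + h, c:c + w]
            sad = np.abs(candidate.astype(np.int32) - macroblock.astype(np.int32)).sum()
            if sad < best_sad:
                best_sad, best_mv = sad, (dr, dc)

    return best_mv, best_sad
```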

Embodiment 2

[0096] Embodiment 2 is an optimization of Embodiment 1; its overall flow is shown in Figure 2. In step S1, the saliency factor of each macroblock can be determined by the following formula:

[0097] S_SDSP(i, j) = S_F(x) · S_C(x) · S_D(x)    (1)

[0098] In the formula, S_F(x) is the frequency prior, S_C(x) is the color prior, S_D(x) is the position prior, x is the pixel matrix corresponding to the macroblock, and (i, j) denotes a pixel in the macroblock.
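As a small illustration of formula (1), the sketch below fuses the three priors into a per-pixel saliency map. The Gaussian position prior used as an example follows the common SDSP formulation and is an assumption, since the patent's own definitions of S_C(x) and S_D(x) are not reproduced in this excerpt.

```python
import numpy as np

def position_prior(shape, sigma_d=114.0):
    """Example position prior S_D(x): a Gaussian centered on the image centre.

    This particular form (and sigma_d) follows the published SDSP method and is
    assumed here for illustration only; the patent's definition is not shown above.
    """
    rows, cols = shape
    y, x = np.mgrid[0:rows, 0:cols]
    cy, cx = (rows - 1) / 2.0, (cols - 1) / 2.0
    dist2 = (y - cy) ** 2 + (x - cx) ** 2
    return np.exp(-dist2 / (sigma_d ** 2))

def saliency_factor(freq_prior, color_prior, pos_prior):
    """Formula (1): S_SDSP(i, j) = S_F(x) * S_C(x) * S_D(x), element-wise per pixel."""
    return freq_prior * color_prior * pos_prior
```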

[0099] Specifically, S_F(x) can be obtained by the following steps:

[0100] S111. Convert the macroblock to the Lab color space to obtain the three color channels L, a, and b.

[0101] S112. Apply band-pass filtering to the three color channels obtained above, and compute the frequency prior S_F(x) by the following formula:

[0102] (formula (2) appears only as an image in the source and is not reproduced here)

[0103] where

[0104] g(x) = f(G(u))    (3)

[0105] (formula (4) appears only as an image in the source and is not reproduced here)

[0106] In the formula, u = (u, v) ∈ R² is the frequency-domain coordinate of t...
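Because formulas (2) and (4) appear only as images in the source, the sketch below fills in the frequency prior in the spirit of the SDSP method: a log-Gabor band-pass transfer function G(u) applied to the L, a, and b channels, with the filtered responses combined by a Euclidean norm. The filter parameters omega0 and sigma_f and the combination rule are assumptions, not the patent's published values.

```python
import numpy as np
import cv2  # OpenCV, assumed available for the Lab conversion of step S111

def log_gabor_transfer(shape, omega0=0.02, sigma_f=0.55):
    """Log-Gabor band-pass transfer function G(u) on a frequency grid (assumed form)."""
    rows, cols = shape
    v = np.fft.fftfreq(rows)[:, None]
    u = np.fft.fftfreq(cols)[None, :]
    radius = np.sqrt(u ** 2 + v ** 2)
    radius[0, 0] = 1e-9  # avoid log(0) at the DC component
    return np.exp(-(np.log(radius / omega0) ** 2) / (2 * sigma_f ** 2))

def frequency_prior(macroblock_bgr):
    """SDSP-style frequency prior S_F(x) for a BGR macroblock (sketch, not the patent's formula)."""
    # Step S111: convert to the Lab color space and split the three channels.
    lab = cv2.cvtColor(macroblock_bgr, cv2.COLOR_BGR2LAB).astype(np.float64)
    G = log_gabor_transfer(lab.shape[:2])
    # Step S112: band-pass each channel in the frequency domain; g(x) = f(G(u)) is the
    # spatial-domain filter obtained by the inverse transform of G(u).
    filtered = [np.real(np.fft.ifft2(np.fft.fft2(lab[:, :, c]) * G)) for c in range(3)]
    # Combine the filtered L, a, b responses; the Euclidean norm is the SDSP choice,
    # assumed here because the patent's formula (2) is not reproduced.
    s_f = np.sqrt(sum(f ** 2 for f in filtered))
    # Normalize to [0, 1] for use as a prior map.
    return (s_f - s_f.min()) / (s_f.max() - s_f.min() + 1e-12)
```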



Abstract

The invention relates to a mobile inspection video quality correction method based on saliency multi-feature fusion, belongs to the technical field of video quality correction, and solves the problem that effective quality evaluation and correction cannot be carried out on a mobile inspection video in the prior art. The method comprises the following steps: partitioning any static image comprising a to-be-detected object in a mobile inspection video, and determining all macro blocks comprising identification characteristics of the to-be-detected object and the significance factors of the macro blocks; traversing other images in the mobile inspection video by using each macro block to obtain an image block most similar to the macro block in each frame of image, so as to obtain a motion vector of each macro block and a significance matrix of each frame of image; determining a block effect characteristic value, a fuzzy effect characteristic value and an information entropy characteristic value of the mobile inspection video according to the obtained significance matrix; and establishing a video quality evaluation model, judging whether the video quality is qualified or not, and if not, correcting camera parameters until the video quality is qualified.
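To make the final evaluate-and-correct stage concrete, here is a minimal sketch of the judge-and-correct loop, assuming the three feature values are fused by a weighted linear score with a pass threshold. The weights, threshold, and helper callables are hypothetical placeholders, since the patent's evaluation model is not reproduced in this excerpt.

```python
def evaluate_and_correct(grab_features, adjust_camera, weights=(0.4, 0.4, 0.2),
                         threshold=0.6, max_rounds=10):
    """Sketch of the judge-and-correct loop described in the abstract.

    grab_features(): returns (blockiness, blurriness, entropy), the three
        saliency-weighted feature values of step S3, each scaled to [0, 1].
    adjust_camera(score): corrects camera parameters (focus, exposure, ...).

    The linear weights and pass threshold are illustrative assumptions only.
    """
    for _ in range(max_rounds):
        blockiness, blurriness, entropy = grab_features()
        # Hypothetical linear model: less blocking/blur and more entropy = better quality.
        score = (weights[0] * (1 - blockiness)
                 + weights[1] * (1 - blurriness)
                 + weights[2] * entropy)
        if score >= threshold:
            return True   # video quality is qualified
        adjust_camera(score)  # otherwise correct camera parameters and re-check
    return False
```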

Description

Technical field

[0001] The invention relates to the technical field of video quality correction, and in particular to a mobile inspection video quality correction method based on salient multi-feature fusion.

Background technique

[0002] Current mobile inspection systems, such as mine systems, involve a large number of coal conveyor belts and coal flows, and the working conditions are complicated, so it is difficult for humans to reliably judge whether belt transportation is safe. Generally, the coal flow transport condition is detected automatically through mobile inspection videos. However, during detection, various camera and environment problems make the video image unclear and distorted, which degrades the detection effect, so real-time quality evaluation and correction of the mobile inspection video is necessary.

[0003] Carry out real-time quality evaluation and correction on the mobile inspection video. On the one ...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): H04N17/00; H04N19/86; H04N19/176; H04N19/137; G06T7/13; G06T7/246
CPC: G06T7/13; G06T7/246; G06T2207/10016; G06T2207/20221; G06T2207/30168; H04N17/00; H04N19/137; H04N19/176; H04N19/86; H04N19/865
Inventor: 程德强, 许超, 寇旗旗, 陈亮亮, 赵凯
Owner: CHINA UNIV OF MINING & TECH