
A video coding method based on a JND model

A video coding and modeling technology, applied in the field of video coding, which solves the problem of wasted coding and transmission effort, achieves simple and convenient calculation, and improves the efficiency of video coding

Active Publication Date: 2022-04-05
JINAN UNIVERSITY

AI Technical Summary

Problems solved by technology

It is wasteful for a device terminal to encode and transmit information that the human eye cannot perceive.

Method used



Examples


Detailed Description of the Embodiments

[0071] The present invention will be further described in detail below in conjunction with the embodiments and the accompanying drawings, but the embodiments of the present invention are not limited thereto.

[0072] A video coding method based on the JND model is a perceptual video coding method based on joint estimation of the pixel domain and the DCT domain, including:

[0073] S1. Establish a pixel-domain JND model. The pixel-domain JND model uses a nonlinear superposition model, combined with background brightness adaptation and texture masking effects, to obtain a pixel-domain JND threshold;

[0074] The pixel-domain JND model uses the nonlinear superposition model (NAMM), a relatively mature JND model that effectively combines background brightness adaptation with texture masking effects and applies to color images and videos. The spatial-domain JND model of each pixel can be used as an approximation of this nonlinear model, and the pixel-domain JND threshol...
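The per-pixel formulas are not reproduced in this summary, so the following is only a minimal sketch of how a NAMM-style pixel-domain JND threshold could be computed from a luma frame. The 5x5 background window, the Sobel-based texture measure, and every numeric constant are illustrative assumptions, not the patent's values.

```python
import numpy as np
from scipy.ndimage import uniform_filter, sobel

def luminance_adaptation(bg):
    """Background-luminance adaptation threshold T_l (a commonly used piecewise
    form; the patent's exact parameters are not reproduced here)."""
    return np.where(
        bg <= 127,
        17.0 * (1.0 - np.sqrt(bg / 127.0)) + 3.0,  # dark regions tolerate larger changes
        3.0 / 128.0 * (bg - 127.0) + 3.0,          # bright regions: roughly linear growth
    )

def texture_masking(gradient):
    """Texture-masking threshold T_t: stronger local activity hides larger errors."""
    return 0.117 * gradient  # illustrative gain

def pixel_domain_jnd(luma_frame, c=0.3):
    """NAMM-style nonlinear superposition of the two effects:
    JND = T_l + T_t - c * min(T_l, T_t), where c accounts for their overlap."""
    frame = luma_frame.astype(np.float64)
    bg = uniform_filter(frame, size=5)                                 # crude background luminance
    grad = np.hypot(sobel(frame, axis=0), sobel(frame, axis=1)) / 8.0  # crude texture measure
    t_l = luminance_adaptation(bg)
    t_t = texture_masking(grad)
    return t_l + t_t - c * np.minimum(t_l, t_t)
```

One plausible use of this map in the preprocessing step of S1 is to pull each pixel toward its local mean by at most JND(x, y), so that the change stays below the visibility threshold while reducing the signal energy the encoder has to spend.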



Abstract

The invention belongs to the technical field of video coding and relates to a video coding method based on a JND model, including: establishing a pixel-domain JND model; establishing an improved DCT-domain JND model; preprocessing the original video with the pixel-domain JND model to remove visual redundancy from the video; processing blocks coded in the normal (non-skip) transform mode with the improved DCT-domain JND model, removing distortions that cannot be perceived by the human eye; and, for the transform-skip mode used when prediction residuals are small, applying a computationally simple luminance masking model to reduce computational complexity. The present invention preprocesses the video with the pixel-domain JND model, which removes visual redundancy for the human eye with a simple and convenient calculation; the improved DCT-domain JND model makes the processing result better match human vision; and using different models for the different modes further removes perceptual redundancy in the video coding process, greatly improving video coding efficiency.
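Since the abstract distinguishes a normal transform path from a transform-skip path, a short control-flow sketch may help show how the two JND stages could be applied per residual block. The threshold functions, the frequency weighting, and all constants below are illustrative assumptions, not the patent's improved DCT-domain model.

```python
import numpy as np
from scipy.fft import dctn, idctn

def luminance_masking_threshold(bg_luminance):
    """Computationally simple luminance-masking threshold (illustrative constants)."""
    if bg_luminance <= 127:
        return 3.0 + 17.0 * (1.0 - bg_luminance / 127.0)
    return 3.0 + (bg_luminance - 127.0) / 42.0

def dct_jnd_thresholds(n, base=4.0):
    """Illustrative DCT-domain JND thresholds: tolerance grows with spatial frequency."""
    u, v = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    return base * (1.0 + 0.25 * np.sqrt(u ** 2 + v ** 2))

def process_residual(residual, bg_luminance, use_transform_skip):
    """Suppress imperceptible residual energy according to the block's transform mode."""
    if use_transform_skip:
        # Small residual, transform-skip mode: cheap pixel-domain luminance masking only.
        out = residual.copy()
        out[np.abs(out) < luminance_masking_threshold(bg_luminance)] = 0.0
        return out
    # Normal transform path: zero DCT coefficients that fall below the DCT-domain threshold.
    coeffs = dctn(residual, norm="ortho")
    coeffs[np.abs(coeffs) < dct_jnd_thresholds(residual.shape[0])] = 0.0
    return idctn(coeffs, norm="ortho")

# Toy usage on a square 8x8 residual block (hypothetical values).
block = np.random.default_rng(0).normal(scale=2.0, size=(8, 8))
filtered = process_residual(block, bg_luminance=80.0, use_transform_skip=False)
```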

Description

Technical Field

[0001] The invention belongs to the technical field of video coding and relates to a video coding method based on a JND model.

Background Art

[0002] In recent years, with the rapid development of Internet technology and smart devices, multimedia video and images have become widely used, affecting and changing all aspects of human life. Early computer and communication systems focused primarily on processing and transmitting text or voice messages. However, with the popularity of video applications, the limits of computer processing power have also become apparent. Because the amount of video information is huge, unprocessed video has very little application value, whether it is stored or transmitted. For example, to transmit a standard-definition color video with a frame rate of 25 frames per second and a resolution of 720*576 over the network in real time, a bandwidth of 720*576*24*25 = 248832000 bits per second is require...
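The raw-bitrate figure quoted above follows directly from resolution, bit depth, and frame rate; a one-line check:

```python
width, height, bits_per_pixel, fps = 720, 576, 24, 25
print(width * height * bits_per_pixel * fps)  # 248832000 bits per second, roughly 249 Mbit/s
```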

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): H04N19/85, H04N19/154, H04N19/625, H04N19/80
CPC: H04N19/85, H04N19/154, H04N19/625, H04N19/80
Inventors: 易清明, 范文卉, 石敏
Owner: JINAN UNIVERSITY