Method for detecting spliced and tampered images
A detection method applying image technology in the field of image analysis. It solves problems of the prior art, such as the inability to achieve end-to-end pixel-level localization and tampering features that are single and incomplete, and achieves accurate detection and precise feature extraction with high accuracy.
Inactive Publication Date: 2020-04-24
HEBEI UNIV OF TECH
AI Technical Summary
Problems solved by technology
[0009] The technical problem to be solved by the present invention is to provide a method for detecting spliced and tampered images, based on light source maps and pyramid feature maps. Two convolutional neural networks with the same structure respectively extract multi-stage features from the spliced tampered image and from its corresponding light source map. Combining multi-scale information, the two sets of multi-stage features are fused and up-sampled to obtain a pyramid feature map. The different layers of the pyramid feature map are passed through a region proposal network to obtain tamper candidate regions, from which fixed-size feature maps are generated via ROI Align. Classification, bounding-box regression, and mask prediction are performed on the fixed-size feature maps, finally yielding the bounding box and pixel-level localization of the tampered region and completing the detection of the spliced tampered image. This overcomes the defects of the prior art, whose extracted tampering features are single and incomplete, which easily overlooks tampered targets covering a small area, and which cannot achieve end-to-end pixel-level localization.
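The fuse-and-upsample step described above resembles a feature pyramid built over two backbone branches. Below is a minimal NumPy sketch of that reading; the function names, the element-wise-sum fusion, and the nearest-neighbour upsampling are illustrative assumptions, not the patent's exact operations.

```python
import numpy as np

def upsample2x(x):
    """Nearest-neighbour 2x upsampling of a (C, H, W) feature map."""
    return x.repeat(2, axis=1).repeat(2, axis=2)

def build_pyramid(rgb_feats, light_feats):
    """Fuse two sets of multi-stage features (finest stage first) and build
    a top-down feature pyramid, roughly as the patent describes."""
    # Stage-wise fusion of the image branch and the light-source-map branch
    fused = [f + g for f, g in zip(rgb_feats, light_feats)]
    # Top-down pathway: start from the coarsest stage, upsample, add laterals
    pyramid = [fused[-1]]
    for f in reversed(fused[:-1]):
        pyramid.append(f + upsample2x(pyramid[-1]))
    return pyramid[::-1]  # finest stage first

# Toy multi-stage features: 4 stages, 8 channels, spatial sizes 32/16/8/4
rgb = [np.ones((8, 32 // 2**i, 32 // 2**i)) for i in range(4)]
lsm = [np.ones((8, 32 // 2**i, 32 // 2**i)) for i in range(4)]
pyr = build_pyramid(rgb, lsm)
print([p.shape for p in pyr])
```

Each pyramid level keeps the spatial size of its stage, so the region proposal network can be run on every level to catch tampered targets of different sizes.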
Method used
Examples
Embodiment 1
[0072] The first step is to generate the corresponding light source map from the input image:
[0073] First, the input image is divided into similar regions, namely superpixels; then the light source color is estimated on each superpixel, and the GGE light source map containing the different light source color estimates is generated by the generalized gray world estimation method.
[0074] In this way, the corresponding light source map is generated from the input image;
[0075] The generalized gray world estimation method is as follows:
[0076] This method assumes that the average color of a scene illuminated by a white light source is gray. The RGB color observed at pixel coordinate x of the image is then f(x), as shown in the following formula (1),
[0077] f(x) = ∫_Ω e(λ, x) s(λ, x) c(λ) dλ (1),
[0078] where Ω represents the visible-light spectrum, λ represents the wavelength of light, e(λ, x) represents the spectrum of the light source, s(λ, x) is ...
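The per-region gray-world estimation in the first step can be sketched as follows. This is a minimal NumPy illustration: square blocks stand in for the superpixel segmentation, only the zeroth-order (no-derivative) case of the generalized gray world family is shown, and the function names are hypothetical.

```python
import numpy as np

def gray_world_illuminant(region, p=1):
    """Gray-world estimate (generalized family, derivative order n = 0):
    the illuminant color is the per-channel Minkowski p-norm mean of the
    region, normalized to unit length."""
    est = (np.abs(region.reshape(-1, 3)) ** p).mean(axis=0) ** (1.0 / p)
    return est / np.linalg.norm(est)

def gge_map(image, block=8, p=1):
    """Build a light source map: estimate the illuminant on each region
    and paint the region with that estimate. Square blocks are used here
    as a stand-in for the superpixels described in the patent."""
    h, w, _ = image.shape
    out = np.zeros_like(image, dtype=float)
    for y in range(0, h, block):
        for x in range(0, w, block):
            reg = image[y:y+block, x:x+block]
            out[y:y+block, x:x+block] = gray_world_illuminant(reg, p)
    return out
```

Regions pasted in from another photograph tend to carry a different illuminant estimate, which is what makes the resulting map a useful cue for splicing detection.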
Embodiment 2
[0111] The same as Embodiment 1, except for the following:
[0112] The first step is to generate the corresponding light source map from the input image:
[0113] First, the input image is divided into similar regions, namely superpixels; then the light source color is estimated on each superpixel, and the IIC light source map containing the different light source color estimates is generated by the inverse intensity chromaticity estimation method.
[0114] In this way, the corresponding light source map is generated from the input image;
[0115] The inverse intensity chromaticity estimation method is as follows:
[0116] This method is based on the physical process of reflection from an object surface: the observed image intensity is considered to be a mixture of diffuse reflection and specular reflection. The color intensity at image pixel coordinate x on channel c ∈ {R, G, B} is p_c(x), as shown in the following formula (3),
[0117] p_c(x) = ∫_Ω (e(λ, x) s(λ, x) + e(λ, x)) c(λ) dλ (3),
[011...
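Under the dichromatic model of formula (3), the pixel chromaticity p_c / Σ_i p_i is a linear function of the inverse intensity 1 / Σ_i p_i, and the line's intercept recovers the illuminant chromaticity. A minimal NumPy sketch of this idea follows; the function name is hypothetical, and a least-squares line fit stands in for the more robust voting schemes often used in practice.

```python
import numpy as np

def iic_illuminant(pixels):
    """Inverse intensity chromaticity estimate from an (N, 3) array of
    RGB pixel intensities: for each channel, the chromaticity
    sigma_c = p_c / sum(p) varies linearly with the inverse intensity
    1 / sum(p); the intercept of a least-squares line fit (at inverse
    intensity 0, i.e. pure specular reflection) is the illuminant
    chromaticity for that channel."""
    total = pixels.sum(axis=1)
    inv = 1.0 / total
    gamma = []
    for c in range(3):
        sigma = pixels[:, c] / total
        slope, intercept = np.polyfit(inv, sigma, 1)
        gamma.append(intercept)
    return np.array(gamma)
```

On pixels sharing one surface color but varying amounts of highlight, the fitted intercepts converge to the light source color, so the estimate is most reliable on superpixels containing specular highlights.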
Abstract
The invention discloses a method for detecting spliced and tampered images and relates to the technical field of image analysis. The method detects spliced and tampered images based on a light source map and a pyramid feature map. It comprises the steps of: using two convolutional neural networks with the same structure to respectively extract multi-stage features from the spliced tampered image and from the corresponding light source map; combining multi-scale information and fusing and up-sampling the two groups of multi-stage features to obtain pyramid feature maps; passing the different layers of the pyramid feature map through a region proposal network to obtain tamper candidate regions; generating fixed-size feature maps through ROI Align; carrying out classification, bounding-box regression, and mask prediction on the fixed-size feature maps; and finally obtaining the bounding box and pixel-level localization of the tampered region, completing detection of the spliced and tampered image. This overcomes the defects of the prior art, in which the extracted tampering features are single and incomplete, tampered targets covering a relatively small region are easily overlooked, and end-to-end pixel-level localization cannot be realized.
Description
technical field
[0001] The technical solution of the present invention relates to image analysis, in particular to a method for detecting spliced and tampered images.
Background technique
[0002] In recent years, image editing software such as Photoshop has developed rapidly, allowing people to tamper with images according to their own wishes, even to the point of being indistinguishable from real images. If such tampered pictures are used by people with ulterior motives in serious contexts such as military affairs, politics, court testimony, or scientific research, they will distort the truth, mislead the public, and have an adverse impact on society. Therefore, image forensics technology for determining whether a digital image has been tampered with is of great significance.
[0003] Splicing is the most common means of image tampering. It splices a certain part of one or more images into another image to achieve the purpose of falsifying facts. After some post-processing...
Claims