
A lightweight color depth learning model for near-infrared images with fusion layer

A near-infrared image colorization technology, applied in image data processing, 2D image generation, biological neural network models, etc. It addresses the problems that images of certain scenes cannot be colorized, that existing models place extremely high demands on device hardware, and that training time grows with large datasets, achieving rich image detail, consistent object color, and a simple, practical method.

Inactive Publication Date: 2019-01-18
TIANJIN POLYTECHNIC UNIV

AI Technical Summary

Problems solved by technology

First, existing deep-learning colorization models require training a very deep network, which places extremely high demands on device hardware and makes mass deployment difficult.
In addition, the colorization quality of deep-learning models depends heavily on the quality and richness of the training set. Most existing deep-learning colorization models use large datasets such as ImageNet for training. Although such training sets contain a large number of images, they also greatly increase the network's training time, and for certain special scenes (rare animals and plants, special foods, etc.) no corresponding images exist, so images of those scenes cannot be colorized.

Method used



Embodiment Construction

[0021] The process of this method is shown in Figure 1. The method first uses a lightweight Faster R-CNN image recognition module to recognize the near-infrared image, then searches the web for 500 images of similar scenes and uses them as a training set to train the colorization network module. Finally, the near-infrared image is input as the test set to complete image colorization. The network structure, process, and steps of the technical solution are described below in conjunction with the accompanying drawings.
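The four stages above can be sketched as follows. This is an illustrative outline only, not the patent's implementation: the function names and the placeholder return values are hypothetical stand-ins for the real Faster R-CNN recognizer, web image search, and colorization CNN.

```python
# Hypothetical sketch of the pipeline described in paragraph [0021].
# Each stub stands in for a real component (recognizer, search, CNN).

def recognize_scene(nir_image):
    """Stage 1: lightweight Faster R-CNN assigns a scene/object label."""
    return "zebra"  # placeholder label

def search_similar_images(label, count=500):
    """Stage 2: collect `count` color images of similar scenes from the web."""
    return [f"{label}_{i}.jpg" for i in range(count)]

def train_colorization_cnn(training_images):
    """Stage 3: train the fusion-layer colorization CNN on the small set."""
    return {"trained_on": len(training_images)}  # stand-in for a trained model

def colorize(model, nir_image):
    """Stage 4: run the trained model on the NIR image (the test set)."""
    return {"input": nir_image, "output": "colorized image"}

nir = "scene.nir"
label = recognize_scene(nir)
train_set = search_similar_images(label, count=500)  # 500 images, per the text
model = train_colorization_cnn(train_set)
result = colorize(model, nir)
```

The key design point the text emphasizes is that the recognition stage keeps the training set small (500 scene-matched images instead of a general-purpose corpus like ImageNet), which is what makes the model "lightweight."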

[0022] 1. Image recognition network module

[0023] The image recognition network model structure is shown in Figure 2. ① The NIR image is first input to VGG-16; ② the NIR image is propagated forward through the CNN to the last shared convolutional layer, producing the feature map used as input to the RPN network, and is then propagated forward to the last convolutional layer to generate higher-dimensional feature maps; ③ the fe...
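The fusion layer named in the title can be illustrated with a minimal NumPy sketch. This is an assumption about its structure (the text is truncated before specifying it): a common design, as in Iizuka et al.'s colorization work, replicates an image-level global feature vector at every spatial position and concatenates it with the mid-level local feature maps.

```python
# Minimal sketch of a fusion layer (assumed Iizuka-style design;
# the patent text does not specify the exact layout).
import numpy as np

def fusion_layer(local_feats, global_feats):
    """Fuse global context into local features.

    local_feats:  (C_l, H, W) mid-level feature maps.
    global_feats: (C_g,) image-level feature vector.
    Returns (C_l + C_g, H, W): the global vector is broadcast
    to every spatial position and concatenated channel-wise.
    """
    _, h, w = local_feats.shape
    tiled = np.broadcast_to(global_feats[:, None, None],
                            (global_feats.shape[0], h, w))
    return np.concatenate([local_feats, tiled], axis=0)

local = np.random.rand(256, 28, 28)   # hypothetical mid-level features
global_vec = np.random.rand(256)      # hypothetical global descriptor
fused = fusion_layer(local, global_vec)
print(fused.shape)  # (512, 28, 28)
```

Fusing global scene information this way is what lets the colorization network keep object colors consistent across the whole image, one of the stated effects of the invention.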



Abstract

The invention relates to an automatic near-infrared image colorization model composed of a lightweight image recognition network module and an image colorization CNN module with a fusion layer. Faster R-CNN performs image recognition on the near-infrared image; images of similar scenes are then collected and selected from the web. After the colorization CNN module with the fusion layer is trained on them, the near-infrared image is input for colorization, and the colorized near-infrared image is output. This method combines the lightweight image recognition model with the fusion-layer colorization model, which not only colorizes most near-infrared images but also mitigates problems such as color bleeding across boundaries, gray and dull colors, and coloring errors in the colorized images.

Description

Technical field

[0001] The invention relates to a method for the automatic colorization of near-infrared images. The method achieves good colorization for low-resolution and high-resolution near-infrared images, and for infrared images taken during the day or at night. It belongs to image processing technology and can be applied to the automatic colorization of near-infrared images in the security field.

Background technique

[0002] Since a near-infrared (NIR) image can reflect the thermal radiation information of targets in the scene and is insensitive to changes in scene brightness, many important night-vision or low-illumination sites, such as mines, wild animal observation points, and military bases, use near-infrared imaging for comprehensive monitoring. However, since the captured near-infrared image is grayscale, it also suffers from blurred edges and insufficient detail, so it needs to be colorized to add color and texture information to ...

Claims


Application Information

IPC(8): G06T11/00; G06N3/04
CPC: G06T11/001; G06N3/045
Inventor: 汤春明, 郑鑫毅, 朱雯彦
Owner TIANJIN POLYTECHNIC UNIV