High-frame-rate video generation method and device based on data fusion

A data-fusion, high-frame-rate video technology, applied to neural learning methods, biological neural network models, neural architectures, etc. It addresses problems such as low image quality, the lack of an initial brightness value for each pixel, and poor brightness stability, thereby ensuring image quality, meeting production requirements, and improving cost-effectiveness.

Active Publication Date: 2022-08-09
TSINGHUA UNIV
AI Technical Summary

Problems solved by technology

[0005] This application provides a high-frame-rate video generation method and device based on data fusion, to solve the technical problem in the related art that, when only event streams are used as data input, the initial brightness value of each pixel is lacking and brightness estimation that relies solely on brightness-change records is unstable, resulting in lower-quality generated images.



Embodiment Construction

[0037] The following describes in detail the embodiments of the present application, examples of which are illustrated in the accompanying drawings, wherein the same or similar reference numerals refer to the same or similar elements or elements having the same or similar functions throughout. The embodiments described below with reference to the accompanying drawings are exemplary, and are intended to be used to explain the present application, but should not be construed as a limitation to the present application.

[0038] The method and apparatus for generating high-frame-rate video based on data fusion according to the embodiments of the present application are described below with reference to the accompanying drawings. As noted in the Background Art above, the related art uses only the event stream as data input, lacks the initial brightness value of each pixel, and brightness estimation that relies solely on the ...



Abstract

The invention discloses a high-frame-rate video generation method and device based on data fusion. The method comprises the steps of: obtaining a low-frame-rate video and event data from an event camera; for every two adjacent image frames in the video, calculating a timestamp set of all desired intermediate frames, intercepting a first event stream and a second event stream running from the two boundary frames to each desired intermediate frame, and inputting the first and second event streams into a preset spiking neural network for forward propagation to obtain a first and a second event-stream data feature vector; splicing the two feature vectors with the adjacent image frames and inputting the result into a preset multi-modal fusion network for forward propagation to obtain all intermediate frames; and generating a high-frame-rate video based on all the intermediate frames. This solves the technical problem in the prior art that the quality of the generated image is low because only the event stream is used as data input, the initial brightness value of each pixel is lacking, and brightness estimation that relies solely on brightness-change records is unstable.
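The pipeline in the abstract can be sketched in code. This is a minimal illustrative sketch, not the patent's implementation: the spiking neural network and the multi-modal fusion network are replaced by placeholder functions (a temporal event histogram and an activity-weighted blend), and all function names, shapes, and parameters are assumptions for illustration.

```python
import numpy as np

def intermediate_timestamps(t0, t1, n):
    """Evenly spaced timestamps of the n desired intermediate frames,
    strictly between the boundary-frame timestamps t0 and t1."""
    return [t0 + (t1 - t0) * (i + 1) / (n + 1) for i in range(n)]

def slice_events(events, t_start, t_end):
    """Intercept the events whose timestamps fall in [t_start, t_end).
    `events` is an (N, 4) array of (t, x, y, polarity) rows."""
    mask = (events[:, 0] >= t_start) & (events[:, 0] < t_end)
    return events[mask]

def snn_features(event_slice, dim=8):
    """Placeholder for the spiking-neural-network forward pass: here we
    simply histogram event timestamps into a fixed-size vector."""
    if len(event_slice) == 0:
        return np.zeros(dim)
    hist, _ = np.histogram(event_slice[:, 0], bins=dim)
    return hist.astype(float)

def fuse(frame0, frame1, feat_fwd, feat_bwd):
    """Placeholder for the multi-modal fusion network: blend the two
    boundary frames with a weight derived from relative event activity
    (illustrative only; the patent uses a learned network)."""
    a, b = feat_fwd.sum(), feat_bwd.sum()
    w = b / (a + b) if (a + b) > 0 else 0.5
    return w * frame0 + (1 - w) * frame1

def interpolate(frame0, frame1, t0, t1, events, n_mid):
    """Produce n_mid intermediate frames between two adjacent frames."""
    mids = []
    for t in intermediate_timestamps(t0, t1, n_mid):
        ev_fwd = slice_events(events, t0, t)  # first event stream: frame0 -> t
        ev_bwd = slice_events(events, t, t1)  # second event stream: t -> frame1
        mids.append(fuse(frame0, frame1,
                         snn_features(ev_fwd), snn_features(ev_bwd)))
    return mids
```

Concatenating each pair of adjacent frames with the intermediate frames produced between them yields the high-frame-rate output sequence.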

Description

Technical Field

[0001] The present application relates to the technical field of computer vision and neuromorphic computing, and in particular to a method and device for generating high-frame-rate video based on data fusion.

Background Technique

[0002] The event camera is a bio-inspired sensor whose working principle differs greatly from that of a traditional camera. Unlike a traditional camera, which captures the absolute light intensity of the scene at a fixed frame rate, the event camera outputs data if and only if the scene light intensity changes; this output data is called an event stream. Compared with traditional cameras, event cameras have the advantages of high dynamic range, high temporal resolution, and freedom from motion blur.

[0003] In the related art, the video generation method uses an event camera as a sensor, uses a pure event stream to generate video, converts the event stream into a grid-like tensor representation by stacking, and then uses a deep lear...
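The "stacking" step the background describes can be illustrated with a common grid representation of event data. This is a hedged sketch of one such representation (a signed-polarity voxel grid); the function name, binning scheme, and shapes are assumptions for illustration, not taken from the patent.

```python
import numpy as np

def events_to_voxel_grid(events, n_bins, height, width):
    """Accumulate (t, x, y, polarity) events into an (n_bins, H, W) tensor.
    Each event's signed polarity is added at its pixel location, in the
    temporal bin its timestamp falls into, giving a grid-like tensor that
    a conventional deep network can consume."""
    grid = np.zeros((n_bins, height, width))
    if len(events) == 0:
        return grid
    t = events[:, 0]
    # Normalise timestamps to [0, n_bins); clip the latest event into the last bin.
    t_norm = (t - t.min()) / max(t.max() - t.min(), 1e-9) * n_bins
    bins = np.clip(t_norm.astype(int), 0, n_bins - 1)
    x = events[:, 1].astype(int)
    y = events[:, 2].astype(int)
    p = events[:, 3]
    # np.add.at handles repeated (bin, y, x) indices correctly.
    np.add.at(grid, (bins, y, x), p)
    return grid
```

A pure-event-stream pipeline of this kind has no access to the absolute brightness of each pixel, which is precisely the limitation the present application addresses by fusing the event stream with low-frame-rate image frames.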

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): H04N7/01, G06N3/08, G06N3/04
CPC: H04N7/0127, G06N3/049, G06N3/08, G06N3/045, Y02D10/00
Inventors: 高跃, 李思奇, 别林
Owner TSINGHUA UNIV