Vision fusion method based on FPGA

A fusion method in the field of visual technology that addresses the inability of general-purpose processors to meet real-time requirements, improving computing speed and accuracy while supporting high resolution.

Status: Inactive · Publication Date: 2018-07-24
HARBIN UNIV OF SCI & TECH

AI Technical Summary

Problems solved by technology

Although certain optimization techniques can reduce redundant calculations and bring the complexity down to O(N²D), a general-purpose processor still cannot meet real-time requirements, so hardware acceleration or a dedicated hardware circuit is necessary.
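The O(N²D) figure above (N² pixels, D candidate disparities) assumes window sums are computed incrementally rather than from scratch. A minimal sketch of that standard optimization, not taken from the patent itself, is the integral image: it makes every window sum O(1), so the per-disparity aggregation cost no longer depends on the window size.

```python
# Sketch (generic optimization, not the patent's circuit): integral-image
# window sums. Naive aggregation over a w x w window costs O(N^2 * w^2)
# per disparity; with an integral image each window sum is O(1), so the
# total over D disparities drops to O(N^2 * D), the complexity quoted above.

def integral_image(img):
    """2-D prefix sums: ii[y][x] = sum of img[0..y-1][0..x-1]."""
    h, w = len(img), len(img[0])
    ii = [[0] * (w + 1) for _ in range(h + 1)]
    for y in range(h):
        row_sum = 0
        for x in range(w):
            row_sum += img[y][x]
            ii[y + 1][x + 1] = ii[y][x + 1] + row_sum
    return ii

def window_sum(ii, y0, x0, y1, x1):
    """Sum of img[y0..y1-1][x0..x1-1] in O(1) via four lookups."""
    return ii[y1][x1] - ii[y0][x1] - ii[y1][x0] + ii[y0][x0]
```

On an FPGA the same idea is typically realized with row buffers and running sums rather than a full prefix-sum table, but the arithmetic identity is identical.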


Examples


Embodiment Construction

[0012] The following clearly and completely describes the technical solutions in the embodiments of the present invention. Obviously, the described embodiments are only some of the embodiments of the present invention, but not all of them. Based on the embodiments of the present invention, all other embodiments obtained by persons of ordinary skill in the art without making creative efforts belong to the protection scope of the present invention.

[0013] In the embodiment of the present invention, an FPGA-based visual fusion method includes the following steps: a vision system is established with the FPGA as the core of operation processing, and the left and right views of a stereo camera are input through a SERDES interface. The system includes six external SRAMs, each with a capacity of 1024K×8 bits, in which every two SRAMs form a ping-pong buffer pair, realizing seamless transmission and pipelined processing of the left and right views and the disparity map; a NAND-Flash ROM chip is used ...
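The SRAM pairing described above is a classic ping-pong (double-buffer) scheme. The following is a hypothetical software model of the control logic only, a sketch under the assumption that one buffer is filled while the other is drained; the patent's actual implementation is RTL driving external SRAM chips.

```python
# Sketch (hypothetical software model, not the patent's hardware design):
# a ping-pong buffer pair. While the producer fills one buffer, the
# consumer reads the other; swap() exchanges their roles each frame,
# so image transfer and processing overlap seamlessly.

class PingPongBuffer:
    def __init__(self, size):
        self._bufs = [[0] * size, [0] * size]
        self._write = 0  # index of the buffer currently being filled

    @property
    def write_buf(self):
        """Buffer the producer (e.g. SERDES input) writes into."""
        return self._bufs[self._write]

    @property
    def read_buf(self):
        """Buffer the consumer (e.g. matching pipeline) reads from."""
        return self._bufs[1 - self._write]

    def swap(self):
        """Called once per frame: the filled buffer becomes readable."""
        self._write = 1 - self._write
```

With three such pairs (six SRAMs), the left view, right view, and disparity map can each stream through their own pair concurrently.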



Abstract

The invention discloses an FPGA-based vision fusion method. The method comprises the step of establishing a vision system with the FPGA as the operation-processing core, wherein the left and right views of a stereo camera are input through a SERDES interface. The system comprises six external SRAMs, each of 1024K×8 bit capacity, wherein every two SRAMs form a ping-pong buffer pair, thereby realizing seamless transmission and pipelined processing of the left and right views and the disparity map. The requirements of high-speed operation and high-speed image transmission can be satisfied, and multiple general-purpose interfaces and surplus hardware resources are provided for further extension and enhancement. Meanwhile, Census stereo fusion is a non-parametric fusion method that is more robust to conditions such as non-uniform brightness and gain, so system precision is improved. Compared with a traditional general-purpose processor, parallel computing and a reasonable pipeline design are fully utilized and the algorithm is mapped directly onto the hardware structure, greatly improving system operation speed and satisfying the requirements of high resolution, high precision and high speed.
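The abstract's robustness claim follows from how the Census transform works: each pixel is encoded by the ordering of its neighbours, not their absolute values, so any monotonic brightness or gain change leaves the code untouched. A minimal sketch of the standard 3×3 Census transform with a Hamming-distance matching cost (window size and details are illustrative, not taken from the patent):

```python
# Sketch (standard Census transform, cited in the abstract; the 3x3
# window is an illustrative choice): each pixel becomes an 8-bit code
# recording which neighbours are darker than the centre. Because only
# orderings matter, the code is invariant to monotonic brightness/gain
# changes, which is the robustness property claimed above.

def census_3x3(img):
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            bits = 0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    if dy == 0 and dx == 0:
                        continue
                    bits = (bits << 1) | (img[y + dy][x + dx] < img[y][x])
            out[y][x] = bits
    return out

def hamming(a, b):
    """Matching cost between two census codes: count of differing bits."""
    return bin(a ^ b).count("1")
```

On an FPGA both stages map naturally to hardware: the comparisons are parallel comparators, and the Hamming distance is an XOR followed by a population count.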

Description

technical field

[0001] The invention relates to a visual fusion method, in particular to an FPGA-based visual fusion method.

Background technique

[0002] The main task of a stereo vision system is to obtain the 3D information of a scene, and such systems are widely used in mobile robots, target tracking, 3D reconstruction and other fields. Within a stereo vision system, stereo fusion is the key core. Generally speaking, the scenes encountered by mobile robots such as outdoor unmanned vehicles and lunar rovers usually lack regular features such as points and lines; because of scene uncertainty and the influence of factors such as lighting, feature extraction is often unstable and yields only a sparse disparity map, so a dense disparity map must be obtained through interpolation to reconstruct the scene. Therefore, region matching algorithms are preferred in real-time stereo vision systems.

[0003] In addition, the stereo vision applie...
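The region matching favored in paragraph [0002] can be illustrated by a generic winner-take-all block matcher, a sketch that is not the patent's circuit: for each pixel on a rectified left scanline, it searches D candidate disparities in the right scanline and keeps the one with the lowest sum of absolute differences (SAD) over a small window, producing a dense disparity estimate without feature extraction.

```python
# Sketch (generic winner-take-all block matching on one rectified
# scanline, not the patent's implementation; window size and cost
# function are illustrative assumptions).

def match_scanline(left, right, max_disp, half_win=1):
    w = len(left)
    disp = [0] * w
    for x in range(half_win, w - half_win):
        best_cost, best_d = float("inf"), 0
        # Only disparities that keep the window inside the right line.
        for d in range(min(max_disp, x - half_win) + 1):
            cost = sum(abs(left[x + k] - right[x - d + k])
                       for k in range(-half_win, half_win + 1))
            if cost < best_cost:
                best_cost, best_d = cost, d
        disp[x] = best_d
    return disp
```

In a hardware pipeline the D candidate costs are computed in parallel and reduced by a comparator tree, which is what lets an FPGA sustain the real-time rates a general-purpose processor cannot.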

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): H04N13/128, H04N13/106, H04N13/156, H04N13/161, H04N13/194
Inventor: 李述, 肖瑶
Owner HARBIN UNIV OF SCI & TECH