
Light-vision fusion integrated system

A lidar-and-camera technology applied in the field of intelligent transportation. It addresses the problems of long data-output delay, difficulty of large-scale deployment, and the inability to fully realize deep fusion between sensors.

Pending Publication Date: 2022-04-12
苏州思卡信息系统有限公司


Problems solved by technology

The multi-sensor fusion technology described above is a post-fusion event detection method commonly used in the construction of smart expressways and intelligent transportation. In practical applications it has the following problems:
1. Deep fusion between the sensors cannot be fully realized.
2. Event detection accuracy is relatively low and the data-output delay is long, making it difficult to meet the requirements of subsequent vehicle-road coordination.
3. A large number of acquisition devices and data processors must be deployed, so the overall project is large in scale, requires substantial follow-up maintenance, and is difficult to promote at scale.


Image

Figure 1: Light-vision fusion integrated system

Examples


Embodiment 1

[0025] A light-vision fusion integrated system, as shown in Figure 1. The system includes a lidar, a camera, and a data processing unit, all arranged in the same device, with the lidar and the camera connected to the data processing unit. The data processing unit includes: a point cloud processing algorithm module, an image processing algorithm module, a PTP time-calibration service device, a spatial calibration algorithm module, an event analysis logic module, and a data configuration and forwarding module. The data from the lidar and the camera first undergo preliminary analysis in the point cloud processing algorithm module and the image processing algorithm module, respectively. The PTP time-calibration service device then time-calibrates the lidar and camera sensors together with the core processor, achieving synchronized frame rates across all three. Spatial calibration is then performed through the...
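The patent does not publish its synchronization code, but once the lidar, camera, and processor share a PTP-disciplined clock as described above, frame association typically reduces to nearest-timestamp matching. A minimal sketch, assuming timestamps in seconds and a hypothetical tolerance of 50 ms:

```python
from bisect import bisect_left

def pair_frames(lidar_ts, camera_ts, tol=0.05):
    """Pair each lidar scan with the nearest camera frame.

    Both timestamp lists are in seconds and sorted ascending. Pairs
    farther apart than `tol` are dropped; on a PTP-disciplined clock
    the residual offset between sensors should be well under `tol`.
    """
    pairs = []
    for t in lidar_ts:
        i = bisect_left(camera_ts, t)
        # Candidates: the camera frames just before and just after t.
        best = min(
            camera_ts[max(i - 1, 0):i + 1],
            key=lambda c: abs(c - t),
        )
        if abs(best - t) <= tol:
            pairs.append((t, best))
    return pairs

# 10 Hz lidar vs. 30 Hz camera on a shared PTP clock.
lidar = [0.0, 0.1, 0.2]
camera = [round(k / 30, 4) for k in range(10)]
print(pair_frames(lidar, camera))  # → [(0.0, 0.0), (0.1, 0.1), (0.2, 0.2)]
```

This illustrates only the time-alignment step; the spatial calibration and fusion stages operate on each matched pair.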



Abstract

The invention provides a light-vision fusion integrated system comprising a lidar, a camera, and a data processing unit, with the lidar and the camera connected to the data processing unit. The data from the lidar and the camera undergo preliminary analysis in a point cloud processing algorithm module and an image processing algorithm module, respectively. A PTP timing service device time-calibrates the lidar, the camera sensor, and the core processor, so that the frame rates of all three are synchronized. Spatial calibration is then carried out by a spatial calibration algorithm module, achieving bottom-layer fusion of the data detected by the lidar and the camera sensor. The event analysis logic module compares the traffic participant information with preset traffic event logic information, and if the two are inconsistent, a parking event is automatically reported. The data configuration and forwarding module uploads the raw sensor data and the perceived event information to the end-user platform.
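The spatial calibration mentioned in the abstract amounts to expressing lidar points in the camera's frame and projecting them into the image. The patent does not disclose its calibration parameters, so the rotation, translation, and intrinsic matrix below are placeholder values; this is only a sketch of the standard pinhole projection, not the patented procedure:

```python
import numpy as np

# Placeholder calibration (the patent does not publish these values).
K = np.array([[1000.0,    0.0, 640.0],   # fx,  0, cx
              [   0.0, 1000.0, 360.0],   #  0, fy, cy
              [   0.0,    0.0,   1.0]])
R = np.eye(3)                  # lidar-to-camera rotation
t = np.array([0.0, 0.0, 0.0])  # lidar-to-camera translation (meters)

def project_points(points_lidar):
    """Project Nx3 lidar points into pixel coordinates.

    Pinhole model: p = K (R x + t), then divide by depth. Points behind
    the camera are discarded.
    """
    cam = points_lidar @ R.T + t        # into the camera frame
    depth = cam[:, 2]
    keep = depth > 0                    # only points in front of the camera
    uvw = cam[keep] @ K.T
    uv = uvw[:, :2] / uvw[:, 2:3]       # perspective divide
    return uv, depth[keep]

uv, depth = project_points(np.array([[1.0, 0.5, 10.0]]))
print(uv)  # → [[740. 410.]], i.e. (cx + fx*x/z, cy + fy*y/z)
```

Once each lidar point carries a pixel coordinate, its range and velocity can be attached to the image detection it falls inside, which is the "bottom-layer fusion" the abstract refers to.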

Description

technical field

[0001] The present invention relates to the technical field of intelligent transportation, and more specifically to an integrated system for lidar-vision fusion.

Background technique

[0002] With the continuous development of network communication technology, the entire field of intelligent transportation is evolving from intelligent transportation systems toward intelligent collaboration systems, that is, from today's single-vehicle intelligence toward the coordinated development of vehicles and roads. In this process, besides improving the intelligence of individual vehicles, road infrastructure itself must also be transformed into smart expressways and smart transportation. The first problem smart roads need to solve is improving the roadside perception network to support vehicle-road coordination and automatic driving. At present, single-vehicle intelligence can satisfy conditional automatic driving, but due to factors such as the s...

Claims


Application Information

Patent Timeline
no application
Patent Type & Authority: Application (China)
IPC(8): G01S17/931; G01S17/86; G01S7/497; H04N5/225; G06V20/10; G06V20/40; G06V10/80; G06K9/62
Inventors: 徐锦锦, 张小磊, 姚洪伟
Owner 苏州思卡信息系统有限公司