
Method and device

A technology relating to moving directions and sensors, applied in the fields of self-driving vehicles and virtual driving, which can solve problems such as the heavy consumption of computing resources, the consumption of communication resources, and the power consumption of self-driving vehicles.

Active Publication Date: 2020-08-07
STRADVISION

AI Technical Summary

Problems solved by technology

[0007] Therefore, there is a problem that even sensors that are unnecessary for the current driving environment are kept running, so that the self-driving vehicle consumes a large amount of power.
[0008] To solve this problem, conventional methods employ sensor fusion so that only the sensors best suited to detecting the current driving environment are used.
[0009] In addition, although autonomous vehicles have recently become able to share information with one another through V2V (vehicle-to-vehicle) communication, sending and receiving large amounts of sensor data consumes a large amount of communication resources, and because all of the transmitted and received sensor data must be processed, the consumption of computing resources is also very large.

Method used



Examples


Embodiments

[0059] In order to enable those skilled in the art to easily implement the present invention, exemplary embodiments of the present invention are described in detail below with reference to the accompanying drawings.

[0060] FIG. 1 schematically shows a reinforcement-learning-based learning device for learning a sensor fusion network for multi-object sensor fusion in cooperative driving according to an embodiment of the present invention. Referring to FIG. 1, the learning device 100 may include: a memory 120 for storing instructions for learning, based on reinforcement learning, a sensor fusion network for multi-object sensor fusion in the cooperative driving of a subject self-driving vehicle; and a processor 130 for executing processing corresponding to the instructions in the memory 120.

[0061] Specifically, the learning device 100 can generally achieve the desired system performance by using a combination of at least one computing device and at least one piece of computer software, such...
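As a rough illustration only, the learning device described in paragraphs [0060]-[0061] could be organized in code along the following lines. This is a minimal sketch assuming PyTorch; the SensorFusionNetwork and LearningDevice classes, layer sizes, and optimizer choice are hypothetical stand-ins for the learning device 100, memory 120, and processor 130, not the patented implementation.

```python
# Minimal sketch only (assumption: PyTorch). Class names, layer sizes, and the
# use of a small fully connected network are illustrative, not the patent's design.
import torch
import torch.nn as nn


class SensorFusionNetwork(nn.Module):
    """Maps a driving-image feature vector and per-sensor status flags to
    per-sensor probabilities that each sensor's values should be transmitted."""

    def __init__(self, image_feature_dim: int, num_sensors: int, hidden_dim: int = 128):
        super().__init__()
        self.body = nn.Sequential(
            nn.Linear(image_feature_dim + num_sensors, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, num_sensors),
        )

    def forward(self, image_feature: torch.Tensor, sensor_status: torch.Tensor) -> torch.Tensor:
        x = torch.cat([image_feature, sensor_status], dim=-1)
        return torch.sigmoid(self.body(x))  # sensor fusion probabilities in (0, 1)


class LearningDevice:
    """Rough analogue of learning device 100: the stored model parameters play the
    role of the instructions in memory 120, the torch device that of processor 130."""

    def __init__(self, image_feature_dim: int, num_sensors: int):
        self.processor = torch.device("cuda" if torch.cuda.is_available() else "cpu")
        self.fusion_net = SensorFusionNetwork(image_feature_dim, num_sensors).to(self.processor)
        self.optimizer = torch.optim.Adam(self.fusion_net.parameters(), lr=1e-4)
```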



Abstract

The invention provides a method for learning a sensor fusion network for sensor fusion of an autonomous vehicle performing cooperative driving. The method includes the following steps: a learning device (a) inputs (i) a driving image including the autonomous vehicle, m cooperatively-driving vehicles, and second virtual vehicles and (ii) sensor status information on the n sensors of the m cooperatively-driving vehicles into the sensor fusion network, to generate sensor fusion probabilities that the sensor values of the n sensors will be transmitted; (b) inputs a road-driving video into a detection network, to detect the second virtual vehicles, pedestrians, and lanes and output nearby object information, and inputs the sensor values and the nearby object information into a drive network, to generate moving direction probabilities and drive the autonomous vehicle; and (c) acquires traffic condition information, generates a reward, and learns the sensor fusion network using the reward.
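Read as a single training iteration, steps (a)-(c) could be sketched roughly as follows, again assuming PyTorch. The simulator, detection network, drive network, and every method called on them are hypothetical placeholders, and the reward-weighted log-likelihood (REINFORCE-style) update is only one plausible reading of "generates a reward, and learns the sensor fusion network", not the patent's exact rule.

```python
# Hedged sketch of one training iteration following steps (a)-(c) above
# (assumption: PyTorch). `simulator`, `detection_net`, `drive_net`, and every
# method called on them are hypothetical placeholders.
import torch


def training_step(processor, fusion_net, detection_net, drive_net, simulator, optimizer):
    # (a) sensor fusion probabilities for the n sensors of the m cooperating vehicles
    image_feat, sensor_status = simulator.observe()            # hypothetical simulator API
    fusion_probs = fusion_net(image_feat.to(processor), sensor_status.to(processor))
    transmit = torch.bernoulli(fusion_probs)                   # sample which sensor values to send

    # (b) detect nearby vehicles / pedestrians / lanes, then pick a moving direction
    road_video = simulator.road_driving_video()
    nearby_objects = detection_net(road_video.to(processor))
    sensor_values = simulator.sensor_values(mask=transmit)     # only the transmitted sensors
    direction_probs = drive_net(sensor_values, nearby_objects)
    simulator.drive(direction_probs.argmax(dim=-1))

    # (c) reward from the resulting traffic condition information; policy-gradient update
    reward = simulator.traffic_condition_reward()
    log_prob = (transmit * torch.log(fusion_probs + 1e-8)
                + (1.0 - transmit) * torch.log(1.0 - fusion_probs + 1e-8)).sum()
    loss = -reward * log_prob
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return reward
```

Sampling a transmit mask from the fusion probabilities is what keeps the load down: only the selected sensor values need to be exchanged over V2V and processed downstream, which addresses the communication- and computing-resource problems described in paragraphs [0007]-[0009].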

Description

Technical field

[0001] The present invention relates to a method and apparatus for autonomous driving vehicles, virtual driving, and the like; more particularly, to a method and apparatus for performing multi-agent sensor fusion based on reinforcement learning in autonomous driving.

Background art

[0002] The vehicle industry has recently entered an era of environmentally friendly, advanced vehicles incorporating IT technology. Alongside advances in vehicle technology, smart vehicles are being commercialized in which technologies such as accident prevention, accident avoidance, collision safety, convenience improvement, vehicle informatization, and autonomous driving are applied.

[0003] Such intelligent vehicles provide convenience functions, such as assistive technology and voice recognition that compensate for driver distraction and unskilled operation, thereby reducing accidents caused by driver negligence, and are also expected to reduce time, fuel w...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06K9/62; G06N3/04; G06N3/08; G06V10/776
CPC: G06N3/08; G06N3/044; G06N3/045; G06F18/2411; G06F18/251; H04W4/46; G06N5/043; G06N3/006; G06V20/56; G06V10/82; G06V10/803; G06V10/776; G06N3/04; G08G1/091; G08G1/20; G05D1/0221; G06N3/049; G06F18/2155
Inventors: 金桂贤, 金镕重, 金鹤京, 南云铉, 夫硕熏, 成明哲, 申东洙, 吕东勋, 柳宇宙, 李明春, 李炯树, 张泰雄, 郑景中, 诸泓模, 赵浩辰
Owner STRADVISION