
Device and method for generating label objects for the surroundings of a vehicle

A vehicle and label-object technology, applied in the field of devices and methods for generating label objects for vehicles, which addresses the problems of comparatively time- and computation-intensive processing and the high cost of manual methods, and achieves the effect of increasing the accuracy of the automatic generation of reference labels and of the generated labels.

Active Publication Date: 2020-06-04
ROBERT BOSCH GMBH
Cites: 0 · Cited by: 1

AI Technical Summary

Benefits of technology

The patent describes a system for labeling objects using sensor data collected from a vehicle. The system uses a combination of pattern detection and multi-target tracking algorithms to identify attributes of the objects. The labels are generated by holistic processing of the observations, which takes into account the present, past, and future of each observation. The system can also use reference sensors, such as calibrated or additional sensors, to improve the accuracy of the labels. The classification of the objects can be performed using an artificial neural network or a Bayesian filter. The system can store the sensor data on a persistent storage medium and transfer it to a server computer or cluster for further analysis and labeling. Overall, the system achieves higher accuracy and reliability of the labels and allows the use of more advanced algorithms and machine learning methods.
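As a rough sketch of how such offline, holistic classification could look in code, the following Python example combines per-observation class likelihoods from an entire recorded track with a simple Bayesian update, so that early, ambiguous frames benefit from later evidence. The class names, scores, and the specific update rule are illustrative assumptions, not the patent's actual implementation.

```python
from dataclasses import dataclass
from typing import Dict, List


@dataclass
class Observation:
    """One detection of a tracked object at a given time (illustrative)."""
    t: float                        # timestamp of the observation
    class_scores: Dict[str, float]  # per-class likelihoods from a detector


def bayes_update(prior: Dict[str, float], likelihood: Dict[str, float]) -> Dict[str, float]:
    """One Bayesian filter step: combine the prior belief with a new likelihood."""
    post = {c: prior[c] * likelihood.get(c, 1e-6) for c in prior}
    norm = sum(post.values()) or 1e-12
    return {c: p / norm for c, p in post.items()}


def label_track(track: List[Observation]) -> str:
    """Holistic (offline) classification of one track: because the whole
    recording is available, observations before and after any point in time
    all contribute to the final label assigned to that track."""
    classes = track[0].class_scores.keys()
    belief = {c: 1.0 / len(classes) for c in classes}  # uniform prior
    for obs in sorted(track, key=lambda o: o.t):       # pass over the full recording
        belief = bayes_update(belief, obs.class_scores)
    return max(belief, key=belief.get)


# Usage: the early frames are ambiguous, but later frames clearly indicate a
# pedestrian; offline processing still labels the whole track consistently.
track = [
    Observation(0.0, {"pedestrian": 0.4, "vehicle": 0.6}),
    Observation(0.1, {"pedestrian": 0.7, "vehicle": 0.3}),
    Observation(0.2, {"pedestrian": 0.9, "vehicle": 0.1}),
]
print(label_track(track))  # -> "pedestrian"
```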

Problems solved by technology

The detected and "labeled" object may be called a "label object." Such manual labeling methods, however, may be expensive in terms of time and cost.
In one specific embodiment (e.g., offline processing), the iterative process does not have to fulfill real-time conditions and may therefore be comparatively time-intensive and computation-intensive.

Method used



Examples


Embodiment Construction

[0065]FIG. 1 schematically shows a system including a labeling system 200 according to a specific embodiment of the present invention. The labeling system 200 is connected to a sensor set 300, which has a plurality of individual sensors 301, 302. Individual sensors 301, 302 may have or use various contrast mechanisms. In the specific embodiment shown, a first individual sensor 301 is developed as a camera (schematically indicated by a lens) and the second individual sensor 302 is developed as a radar sensor; the second individual sensor 302 thus being of a different sensor type than individual sensor 301 and using different contrast mechanisms. Sensor set 300 is designed to detect raw measurement values of an object 120 in an environment or a surroundings 100 and to transmit these via the line or interface 310 to a memory 400. The type of raw measurement values is specific to the sensor; these may be for example pixel values, positional data (e.g., radar locations) or also specific ...
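To make the data flow of FIG. 1 concrete, the following Python sketch models sensor set 300 with a camera-like sensor 301 and a radar-like sensor 302 that write sensor-specific raw measurement values into a stand-in for memory 400 via the "interface 310" call path. All class names, payload formats, and values are hypothetical and only illustrate the architecture described above.

```python
from dataclasses import dataclass
from typing import Any, List


@dataclass
class RawMeasurement:
    """Sensor-specific raw measurement value (pixel values, radar locations, ...)."""
    sensor_id: int
    timestamp: float
    payload: Any  # e.g. an image array for a camera, a list of locations for radar


class Memory:
    """Stand-in for memory 400: collects raw measurements received via interface 310."""
    def __init__(self) -> None:
        self.records: List[RawMeasurement] = []

    def store(self, m: RawMeasurement) -> None:
        self.records.append(m)


class CameraSensor:
    """Stand-in for individual sensor 301 (camera contrast mechanism)."""
    def __init__(self, sensor_id: int) -> None:
        self.sensor_id = sensor_id

    def measure(self, t: float) -> RawMeasurement:
        image = [[0] * 4 for _ in range(3)]  # dummy pixel values
        return RawMeasurement(self.sensor_id, t, image)


class RadarSensor:
    """Stand-in for individual sensor 302 (radar contrast mechanism)."""
    def __init__(self, sensor_id: int) -> None:
        self.sensor_id = sensor_id

    def measure(self, t: float) -> RawMeasurement:
        locations = [(12.3, -1.5)]  # dummy radar locations (x, y)
        return RawMeasurement(self.sensor_id, t, locations)


# Sensor set 300 with two individual sensors of different types
sensor_set = [CameraSensor(301), RadarSensor(302)]
memory = Memory()
for t in (0.0, 0.1):
    for sensor in sensor_set:
        memory.store(sensor.measure(t))  # transmit via "interface 310"
print(len(memory.records))  # -> 4
```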



Abstract

A method and a labeling system for generating a label object for the symbolic description of an object of an environment of a mobile device, e.g., a robot or a vehicle. The label object includes at least one attribute of the object at a first point in time, ascertained from observations of this object. The method includes selecting, from the observations, a first observation recorded at the first point in time, a second observation recorded at a second point in time, the second point in time being before the first point in time, and a third observation recorded at a third point in time, the third point in time being after the first point in time; and ascertaining, using the selected observations, the at least one attribute of the object.
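A minimal sketch of the selection and ascertainment step described in the abstract, assuming position measurements and velocity plus a smoothed position as the attributes of interest (both illustrative choices, not specified by the abstract): one observation is picked at the first point in time, one before it, and one after it, and the attributes are then estimated from all three.

```python
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class Observation:
    t: float                       # recording time of the observation
    position: Tuple[float, float]  # measured object position (illustrative)


def select_observations(obs: List[Observation], t1: float):
    """Select one observation at t1, one before it, and one after it
    (assumes t1 lies strictly between the first and last recording times)."""
    first = min(obs, key=lambda o: abs(o.t - t1))
    before = max((o for o in obs if o.t < first.t), key=lambda o: o.t)
    after = min((o for o in obs if o.t > first.t), key=lambda o: o.t)
    return before, first, after


def ascertain_attributes(before: Observation, first: Observation, after: Observation):
    """Ascertain attributes at the first point in time from all three
    observations: velocity via central difference of the before/after
    positions, and a smoothed position that averages the direct measurement
    with the interpolation of its neighbours."""
    dt = after.t - before.t
    vel = ((after.position[0] - before.position[0]) / dt,
           (after.position[1] - before.position[1]) / dt)
    w = (first.t - before.t) / dt  # interpolation weight at time first.t
    interp = (before.position[0] + w * (after.position[0] - before.position[0]),
              before.position[1] + w * (after.position[1] - before.position[1]))
    pos = ((first.position[0] + interp[0]) / 2.0,
           (first.position[1] + interp[1]) / 2.0)
    return pos, vel


observations = [
    Observation(0.0, (10.0, 2.0)),
    Observation(0.5, (12.4, 2.1)),
    Observation(1.0, (15.1, 2.2)),
]
before, first, after = select_observations(observations, t1=0.5)
print(ascertain_attributes(before, first, after))
# -> smoothed position ~ (12.475, 2.1), velocity ~ (5.1, 0.2)
```

The use of an observation recorded after the first point in time is what distinguishes this offline estimate from a purely causal, real-time one, and is why the selection of past, present, and future observations can increase label accuracy.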

Description

CROSS REFERENCE

[0001] The present application claims the benefit under 35 U.S.C. § 119 of German Patent Application No. DE 102018220892.1 filed on Dec. 4, 2018, which is expressly incorporated herein by reference in its entirety.

FIELD

[0002] The present invention relates to a method and a labeling system for generating a label object for the symbolic description of an object of an environment, or a surroundings, of a mobile device, e.g., a robot or a vehicle, in particular of a mobile device that is movable at least partially in automated fashion. The present invention furthermore relates to a program element, a computer-readable medium and a use.

BACKGROUND INFORMATION

[0003] For a mobile device, e.g., for a robot or a vehicle, a symbolic description of an object, or a plurality of objects, of an environment or a surroundings of the mobile device may be important. For the symbolic description of the object, the object is recorded, e.g., by a camera and is provided with attributes, e.g., ...

Claims


Application Information

Patent Type & Authority: Application (United States)
IPC (8): G06K9/62; G06K9/00; G05D1/00
CPC: G06K9/6259; G05D2201/0213; G06K9/6278; G06K9/00805; G05D1/0088; G06V20/56; G06V20/584; G06V2201/08; G06F18/241; G06F18/2431; G06V20/58; G06V10/95; G06F18/24155; G06F18/2155
Inventors: FEYERABEND, ACHIM; PANCERA, ELENA; HERTLEIN, HEINZ; PINK, OLIVER; GOEPPEL, THOMAS
Owner: ROBERT BOSCH GMBH