
Refining Synthetic Data With A Generative Adversarial Network Using Auxiliary Inputs

Active Publication Date: 2019-03-14
FORD GLOBAL TECH LLC
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Benefits of technology

The patent describes a refinement method in which a generative adversarial network (GAN) uses additional data, such as semantic maps and depth maps, to ensure the correct textures are applied to the corresponding parts of a virtual image. This results in a more realistic and accurate image. The refined images retain their annotations and can be used to train other models, such as computer vision or autonomous driving systems.
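
For illustration only, here is a minimal sketch of a refiner network that takes such auxiliary channels as extra inputs, assuming PyTorch; the class name RefinerNet, the layer sizes, and the choice of a one-hot semantic map plus a single depth channel are assumptions made for this example, not details taken from the patent.

```python
import torch
import torch.nn as nn

class RefinerNet(nn.Module):
    """Illustrative refiner: refines a synthetic RGB image while conditioning
    on auxiliary semantic and depth maps (hypothetical architecture)."""

    def __init__(self, num_semantic_classes: int = 20):
        super().__init__()
        # 3 RGB channels + one-hot semantic map + 1 depth channel.
        in_channels = 3 + num_semantic_classes + 1
        self.body = nn.Sequential(
            nn.Conv2d(in_channels, 64, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(64, 64, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(64, 3, kernel_size=3, padding=1),  # refined RGB output
        )

    def forward(self, rgb, semantic_onehot, depth):
        # The auxiliary maps act as hints about which texture belongs where
        # (e.g., road vs. vegetation vs. vehicle, near vs. far surfaces).
        x = torch.cat([rgb, semantic_onehot, depth], dim=1)
        return self.body(x)
```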

Problems solved by technology

The process of annotating and labeling relevant portions of image training data (e.g., still images or video) for training machine learning models can be tedious, time-consuming, and expensive.

Embodiment Construction

[0009]The present invention extends to methods, systems, and computer program products for refining synthetic data with a Generative Adversarial Network using auxiliary inputs.

[0010]Aspects of the invention include using Generative Adversarial Networks (“GANs”) to refine synthetic data. Refined synthetic data can be rendered more realistically than the original synthetic data. Refined synthetic data also retains annotation metadata and labeling metadata used for training of machine learning models. GANs can be extended to use auxiliary channels as inputs to a refiner network to provide hints about increasing the realism of synthetic data. Refinement of synthetic data enhances the use of synthetic data for additional applications.
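
As a hedged sketch of the adversarial setup described above (not the patent's implementation), the refiner can be trained against a discriminator that tries to tell real images from refined synthetic ones. The discriminator architecture, losses, optimizers, and the self-regularization term below are assumptions, and RefinerNet refers to the illustrative sketch given earlier.

```python
import torch
import torch.nn as nn

refiner = RefinerNet(num_semantic_classes=20)   # illustrative refiner from the earlier sketch
discriminator = nn.Sequential(                  # real vs. refined patch classifier (assumed)
    nn.Conv2d(3, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
    nn.Conv2d(64, 1, 4, stride=2, padding=1),
)
bce = nn.BCEWithLogitsLoss()
opt_r = torch.optim.Adam(refiner.parameters(), lr=1e-4)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=1e-4)

def train_step(synthetic_rgb, semantic_onehot, depth, real_rgb):
    # 1) Discriminator step: push real images toward 1, refined images toward 0.
    refined = refiner(synthetic_rgb, semantic_onehot, depth)
    d_real = discriminator(real_rgb)
    d_fake = discriminator(refined.detach())
    loss_d = bce(d_real, torch.ones_like(d_real)) + bce(d_fake, torch.zeros_like(d_fake))
    opt_d.zero_grad(); loss_d.backward(); opt_d.step()

    # 2) Refiner step: fool the discriminator while staying close to the input,
    #    so annotations made on the synthetic image remain valid after refinement.
    d_fake = discriminator(refined)
    loss_r = bce(d_fake, torch.ones_like(d_fake)) + (refined - synthetic_rgb).abs().mean()
    opt_r.zero_grad(); loss_r.backward(); opt_r.step()
```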

[0011]In one aspect, a GAN is used to refine a synthetic (or virtual) image, for example, an image generated by a gaming engine, into a more realistic refined synthetic (or virtual) image. The more realistic refined synthetic image retains annotation metadata...
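
To illustrate how annotation metadata carries over, a small hypothetical example follows; the tensor shapes and the sample dictionary layout are assumptions for the example, and RefinerNet again refers to the illustrative sketch above. Because refinement changes pixel appearance rather than scene geometry, labels generated with the synthetic image remain usable with the refined image.

```python
import torch

refiner = RefinerNet(num_semantic_classes=20)   # illustrative refiner from the earlier sketch
sample = {
    "image": torch.randn(1, 3, 256, 256),       # synthetic RGB rendered by a gaming engine
    "semantic": torch.zeros(1, 20, 256, 256),   # one-hot semantic map (auxiliary input)
    "depth": torch.randn(1, 1, 256, 256),       # per-pixel depth map (auxiliary input)
    "boxes": [[120, 40, 260, 190]],             # e.g., a vehicle bounding box in pixels
    "labels": ["vehicle"],
}
# Replace only the image; the annotation metadata is left untouched.
sample["image"] = refiner(sample["image"], sample["semantic"], sample["depth"])
# sample["boxes"] and sample["labels"] can now be paired with the refined image
# to train a computer vision or autonomous driving model.
```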


Abstract

The present invention extends to methods, systems, and computer program products for refining synthetic data with a Generative Adversarial Network (GAN) using auxiliary inputs. Refined synthetic data can be rendered more realistically than the original synthetic data. Refined synthetic data also retains annotation metadata and labeling metadata used for training of machine learning models. GANs can be extended to use auxiliary channels as inputs to a refiner network to provide hints about increasing the realism of synthetic data. Refinement of synthetic data enhances the use of synthetic data for additional applications.

Description

CROSS-REFERENCE TO RELATED APPLICATIONS

[0001]Not applicable.

BACKGROUND

1. Field of the Invention

[0002]This invention relates generally to the field of formulating realistic training data for training machine learning models, and, more particularly, to refining synthetic data with a generative adversarial network using auxiliary inputs.

2. Related Art

[0003]The process of annotating and labeling relevant portions of image training data (e.g., still images or video) for training machine learning models can be tedious, time-consuming, and expensive. To reduce these annotating and labeling burdens, synthetic data (e.g., virtual images generated by gaming or other graphical engines) can be used. Annotating synthetic data is more straightforward as annotation is a direct by-product of generating the synthetic data.

BRIEF DESCRIPTION OF THE DRAWINGS

[0004]The specific features, aspects and advantages of the present invention will become better understood with regard to the following description...

Claims


Application Information

IPC (IPC(8)): G06K9/62; G06K9/00; G06T7/11; G06T7/13; G06T11/60
CPC: G06K9/6264; G06K9/00798; G06T7/11; G06T7/13; G06T2207/20081; G06T2207/10016; G06T2207/30256; G06T2207/10028; G06T11/60; G06F18/241; G06F18/214; G06V20/56; G06V20/588; G06F18/2185
Inventors: HOTSON, GUY; PUSKORIUS, GINTARAS VINCENT; NARIYAMBUT MURALI, VIDYA
Owner: FORD GLOBAL TECH LLC