
Target scene simulation model construction method, data set generation method and electronic equipment

A technology for simulation models of target scenes, applied in the field of computer vision. It addresses the problems that the textures of simulation models in existing data sets differ greatly from real scenes and that real-scene data sets are small in scale and limited in accuracy and density, achieving the effects of improving texture consistency and accuracy, facilitating generalization and optimized training, and improving data richness.

Active Publication Date: 2022-04-12
ZHEJIANG LAB

AI Technical Summary

Problems solved by technology

[0006] The purpose of the embodiments of the present application is to provide a method for constructing a target scene simulation model, a method for generating a data set, and an electronic device, so as to solve the technical problems in the related art that the texture of a data-set simulation model differs greatly from that of the real scene and that real-scene data sets are small in scale and limited in precision and density. The method makes full use of the scale and distribution characteristics of texture information to obtain a large-scale binocular depth data set that is closer to the real scene and meets the training and application requirements of binocular stereo matching neural networks.



Examples


Embodiment Construction

[0057] Exemplary embodiments will be described in detail herein, examples of which are illustrated in the accompanying drawings. Where the following description refers to the drawings, the same numerals in different drawings refer to the same or similar elements unless otherwise indicated. The implementations described in the illustrative examples below are not intended to represent all implementations consistent with this application.

[0058] The terminology used in this application is for the purpose of describing particular embodiments only and is not intended to limit the application. As used in this application, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly dictates otherwise. It will also be understood that the term "and/or" as used herein refers to and includes any and all possible combinations of one or more of the associated listed items.

[0059] It should be understood that although the terms...



Abstract

The invention discloses a method for constructing a simulation model of a target scene, a method for generating a data set, and an electronic device. The method includes: hierarchically segmenting the target scene according to texture scale to obtain segmented objects at multiple levels; constructing a simulation model corresponding to each segmented object; acquiring image data of the target scene; cropping from the image data, according to the type of each segmented object, the low-resolution texture images corresponding to that object; synthesizing the low-resolution texture images to obtain a high-resolution texture image for each segmented object; mapping each high-resolution texture image onto the corresponding simulation model to obtain textured simulation models; and unifying the textured simulation models of all objects into one scene to obtain the simulation model of the target scene. It is suitable for generating large-scale binocular data sets in scenarios such as extraterrestrial exploration and automatic driving.
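To make the sequence of steps concrete, here is a minimal, hedged sketch of how such a pipeline could be organized. It is not the patent's implementation; every function, class, and data-structure name below is hypothetical, and the bodies are placeholders that only illustrate the data flow from segmentation to the final textured scene model.

```python
# Hypothetical sketch of the pipeline summarized in the abstract. Every name
# here is illustrative; the bodies are placeholders showing data flow only.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class SegmentedObject:
    level: int                         # hierarchy level derived from texture scale
    obj_type: str                      # e.g. "terrain", "rock", "road"
    geometry: Optional[dict] = None    # stand-in for the object's simulation model
    texture: Optional[list] = None     # stand-in for its synthesized texture

def segment_by_texture_scale(scene_description: dict) -> List[SegmentedObject]:
    # Step 1: hierarchically segment the target scene according to texture scale.
    return [SegmentedObject(level=i, obj_type=t)
            for i, t in enumerate(scene_description.get("objects", []))]

def build_geometry(obj: SegmentedObject) -> dict:
    # Step 2: construct a simulation model (geometry) for each segmented object.
    return {"type": obj.obj_type, "mesh": obj.obj_type + "_mesh"}

def crop_texture_patches(images: list, obj_type: str) -> list:
    # Step 3: crop low-resolution texture images of this object type
    # from the captured image data of the target scene.
    return [img for img in images if img.get("label") == obj_type]

def synthesize_high_res_texture(patches: list) -> list:
    # Step 4: synthesize a high-resolution texture from the low-resolution patches.
    return patches  # placeholder; a real system would run texture synthesis here

def assemble_scene(objects: List[SegmentedObject]) -> dict:
    # Step 6: unify all textured object models into one target-scene model.
    return {"objects": objects}

def build_target_scene_model(scene_description: dict, images: list) -> dict:
    objects = segment_by_texture_scale(scene_description)
    for obj in objects:
        obj.geometry = build_geometry(obj)
        patches = crop_texture_patches(images, obj.obj_type)
        obj.texture = synthesize_high_res_texture(patches)
        # Step 5: map the synthesized texture onto the object's simulation model.
        obj.geometry["texture"] = obj.texture
    return assemble_scene(objects)

if __name__ == "__main__":
    scene = {"objects": ["terrain", "rock"]}
    imgs = [{"label": "terrain"}, {"label": "rock"}]
    print(build_target_scene_model(scene, imgs))
```

A textured scene model built this way can then be rendered from two virtual camera poses to produce the stereo image pairs and disparity ground truth that make up the binocular data set.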

Description

Technical field

[0001] The present application relates to the technical field of computer vision, and in particular to a method for constructing a simulation model of a target scene, a method for generating a data set, and an electronic device.

Background technique

[0002] Obtaining 3D depth information by stereo matching with binocular cameras is a hot research topic in the field of computer vision. Compared with traditional binocular stereo matching algorithms, neural-network-based binocular stereo matching algorithms show increasingly strong results in both accuracy and operating efficiency, but such algorithms rely on high-quality, large-scale binocular stereo matching depth data sets.

[0003] At present, the mainstream real-scene data sets in the field of binocular stereo matching research mainly include the KITTI data set, which uses lidar to collect depth information, and the Middlebury dataset t...
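For context on the binocular stereo matching described in [0002]: in a rectified stereo pair, depth follows from disparity via Z = f·B/d (focal length f in pixels, baseline B, disparity d in pixels), which is why the density and accuracy of the disparity ground truth in a data set directly determine the quality of the recovered depth. A minimal sketch with illustrative numbers, not values from the patent:

```python
# Standard depth-from-disparity relation for a rectified binocular camera pair.
# The numeric values below are illustrative only.
def disparity_to_depth(disparity_px: float, focal_px: float, baseline_m: float) -> float:
    """Z = f * B / d."""
    return focal_px * baseline_m / disparity_px

# Example: 720 px focal length, 0.12 m baseline, 36 px disparity -> 2.4 m depth.
print(disparity_to_depth(36.0, 720.0, 0.12))
```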


Application Information

Patent Type & Authority: Patent (China)
IPC (8): G06T17/00, G06T15/04, G06T7/11
CPC: G06T17/00, G06T15/04, G06T7/11
Inventors: 许振宇, 宋俊男, 朱世强, 李月华
Owner: ZHEJIANG LAB