
Systems and methods for processing 2D/3D data for structures of interest in a scene and wireframes generated therefrom

A technology relating to scenes and wireframes, applied in the field of systems and methods for processing 2D/3D data for structures of interest in a scene and the wireframes generated therefrom. It addresses the problems that high-quality results cannot be achieved using automatic methods, that the effectiveness and quality of algorithms and techniques for extracting information from images of structures (or objects, etc.) remain insufficient, and that the geometric features of structures, objects, or elements present in a scene cannot be fully resolved.

Pending Publication Date: 2021-10-07
POINTIVO

AI Technical Summary

Problems solved by technology

However, current methodologies do not readily allow 3D data derived from structures, objects, or elements in a scene to be automatically processed to generate wireframe renderings that are accurate in relation to the structure or object for which the renderings are being generated.
In this regard, while computer vision technology has advanced quickly in recent years, the effectiveness and quality of algorithms and techniques for extracting information from imaging of structures (or objects, etc.), such as buildings, has not progressed to the point that high-quality results can be obtained using automatic methods, especially when the structure is even modestly complex in form.
Existing algorithms used to analyze 3D data, such as point clouds, and extract wireframe renderings in suitable detail are often unable to fully resolve the geometric features of structures, objects, or elements present in a scene without attendant manual evaluation and/or manual manipulation of the output.
This means that existing algorithms are incapable of automatically generating high-quality reconstructions for many structures and/or objects of interest.
The lack of accurate wireframe renderings generated from 3D data by automatic methods is particularly acute when the structure being modeled dates from a time when building designs were produced using non-computer methods, because there is likely no design or engineering information retrievable for use as 3D data.
Another problem faced in the generation of accurate wireframe renderings from 3D data arises when a structure or object is “arbitrary,” in that it is non-standard and/or has not previously been analyzed in the same context.
Such arbitrary structures or objects will therefore have no accurate baseline information retrievable from an available library of structural elements, or other relevant information, to serve as a reference for automatic generation of accurate wireframe renderings.
The cost and complexity of creating high-quality wireframe renderings for objects being examined for the first time is therefore high today, because current state-of-the-art methodology requires manual intervention to produce high-quality wireframe renderings.
The data-driven approach has the advantage of detecting basic elements of the building, but the quality of 3D modeling or wireframe generation can be limited by the algorithm applied for segmentation.
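For illustration, the following is a minimal sketch of the kind of data-driven segmentation step this passage refers to: iteratively peeling planar patches (e.g., roof faces) off a point cloud with RANSAC. It assumes Open3D is available; the input file name and all thresholds are illustrative assumptions, not values taken from the patent.

```python
# Minimal data-driven segmentation sketch: RANSAC plane extraction.
# "scan.ply" and all thresholds are hypothetical.
import open3d as o3d

def segment_planes(pcd, max_planes=10, distance_threshold=0.05,
                   min_inliers=200):
    """Peel planar segments off `pcd` until too few inliers remain."""
    planes = []
    remaining = pcd
    for _ in range(max_planes):
        if len(remaining.points) < min_inliers:
            break
        # Fit one plane (ax + by + cz + d = 0) to the remaining points.
        model, inliers = remaining.segment_plane(
            distance_threshold=distance_threshold,
            ransac_n=3,
            num_iterations=1000)
        if len(inliers) < min_inliers:
            break
        planes.append((model, remaining.select_by_index(inliers)))
        remaining = remaining.select_by_index(inliers, invert=True)
    return planes

pcd = o3d.io.read_point_cloud("scan.ply")  # hypothetical input file
for model, patch in segment_planes(pcd):
    print("plane %.2fx + %.2fy + %.2fz + %.2f = 0, %d points"
          % (*model, len(patch.points)))
```

Note how the thresholds embody the limitation described above: geometry represented by fewer than `min_inliers` points, or lying within `distance_threshold` of a fitted plane, is silently absorbed into a larger segment or discarded, capping the quality of any wireframe derived from the segmentation.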
Automatic 3D modeling or wireframe generation is nonetheless problematic when small features are present in the structure and/or substructures are contained within larger structures.
However, the model-driven approach requires that the needed model element be stored in the library, which makes 3D modeling or wireframe generation problematic for non-standard roofs or other non-ideal structures.
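As a toy illustration of that library dependence, the sketch below fits each template in a two-entry primitive library to sampled roof heights and keeps the best match. The templates and the residual threshold are assumptions made for illustration, not the patent's method.

```python
# Model-driven matching sketch: a roof shape absent from the library
# (anything but flat/shed or gable here) simply cannot be matched.
import numpy as np

def fit_plane(xy, z):
    """Least-squares plane z = a*x + b*y + c; returns RMS residual."""
    A = np.c_[xy, np.ones(len(xy))]
    coef, *_ = np.linalg.lstsq(A, z, rcond=None)
    return np.sqrt(np.mean((A @ coef - z) ** 2))

def fit_gable(xy, z):
    """Two planes split at the ridge (taken as median x): a crude gable."""
    left = xy[:, 0] <= np.median(xy[:, 0])
    r1, r2 = fit_plane(xy[left], z[left]), fit_plane(xy[~left], z[~left])
    return np.sqrt((r1 ** 2 + r2 ** 2) / 2)

LIBRARY = {"flat/shed": fit_plane, "gable": fit_gable}

def match_primitive(xy, z, max_rms=0.1):
    """Return (template_name, rms) for the best fit, or (None, rms)."""
    name, rms = min(((n, f(xy, z)) for n, f in LIBRARY.items()),
                    key=lambda t: t[1])
    return (name, rms) if rms <= max_rms else (None, rms)

# A synthetic gable roof matches; a curved or mansard roof would not.
xy = np.random.rand(500, 2)
z = 1.0 - np.abs(xy[:, 0] - 0.5)  # perfect gable, ridge at x = 0.5
print(match_primitive(xy, z))     # -> ('gable', ~0.0)
```

A roof shape outside the library fails every fit and returns `None`, which is exactly the model-driven failure mode described above.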
While the hybrid approach can produce greater wireframe quality than use of either the data-driven or model-driven approaches individually, the inherent problems in each of the approaches (e.g., algorithm quality and library limitations) nonetheless reduce the overall 3D modeling and wireframe rendering quality of these methods.
While the hybrid approach has been applied using automatic methods with some success for simple structures, where simple building forms are applicable and pre-defined building primitives can be applied, for structures and objects where non-planar features and/or a plurality of planar building features are present, wireframe renderings that do not require at least some manual interaction and/or manual result validation remain elusive.
In this regard, while some percentage of buildings are likely to comprise at least some standard features (e.g., roof parts, windows, doors, etc.), many buildings comprise non-standard features or complex arrangements of standard features (e.g., multi-faceted roof structures) that will be difficult to accurately resolve due to the complexity of the arrangements therein.
Further, non-building structures in a scene are more likely to comprise non-standard components or otherwise be arbitrary.
Accordingly, current state-of-the-art methodologies cannot suitably generate accurate wireframe renderings for such structures and objects without manual intervention in the generation of the 3D information and/or manual validation of the results thereof.
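The control flow the hybrid discussion implies can be summarized in a short, hypothetical sketch: data-driven segments are tested against a model library, and anything that matches nothing is routed to the manual review that current methods still require. The function names and callables are assumptions standing in for routines like the sketches above.

```python
# Hybrid control-flow sketch: segment (data-driven), match (model-driven),
# and fall back to manual review for unmatched segments.
from typing import Any, Callable, Iterable, List, Optional, Tuple

def hybrid_reconstruct(
    segments: Iterable[Any],
    match_primitive: Callable[[Any], Tuple[Optional[str], float]],
) -> Tuple[List[Tuple[str, Any]], List[Tuple[float, Any]]]:
    """Split data-driven segments into primitive-backed parts and leftovers."""
    modeled: List[Tuple[str, Any]] = []
    needs_review: List[Tuple[float, Any]] = []
    for seg in segments:
        name, rms = match_primitive(seg)
        if name is not None:
            modeled.append((name, seg))       # library element found
        else:
            needs_review.append((rms, seg))   # today: manual intervention
    return modeled, needs_review
```

The `needs_review` bucket is the crux of the passage: for complex or arbitrary structures it stays non-empty, so the pipeline cannot run end-to-end without a human in the loop.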



Examples


Embodiment Construction

[0020]Many aspects of the disclosure can be better understood with reference to the Figures presented herewith. The Figures are intended to illustrate the various features of the present disclosure. While several implementations may be described in connection with the included drawings, there is no intent to limit the disclosure to the implementations disclosed herein. To the contrary, the intent is to cover all alternatives, modifications, and equivalents.

[0021]The term “substantially” is meant to permit deviations from the descriptive term that do not negatively impact the intended purpose. All descriptive terms used herein are implicitly understood to be modified by the word “substantially,” even if the descriptive term is not explicitly modified by the word “substantially.”

[0022]The term “about” is meant to account for variations due to experimental error. All measurements or numbers are implicitly understood to be modified by the word “about,” even if the measurement or number is not explicitly so modified.



Abstract

The inventions herein relate generally to improvements in the generation of wireframe renderings derived from 2D and/or 3D data that includes at least one structure of interest in a scene. Such wireframe renderings and similar formats can be used in, among other things, 2D/3D CAD drawings, designs, drafts, models, building information models, augmented reality or virtual reality, and the like. Measurements, dimensions, geometric information, and semantic information generated according to the inventive methods can be accurate in relation to the actual structures. The wireframe renderings can be generated from a combination of a plurality of 2D images and point clouds, from processing of point clouds to generate virtual/synthetic views to be used with the point clouds, or from 2D image data that has been processed in a machine learning process to generate 3D data. In some aspects, the wireframe renderings are accurate in relation to the actual structure of interest, automatically generated, or both.
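One concrete step implied by this kind of pipeline, sketched here under assumed plane coefficients rather than anything specified in the patent, is recovering a wireframe edge as the 3D intersection line of two fitted planes (e.g., a roof ridge where two pitched faces meet).

```python
# Wireframe-edge sketch: two fitted planes meet in a 3D line; such lines,
# clipped to the data, become wireframe edges. Coefficients are made up.
import numpy as np

def plane_intersection_line(p1, p2, eps=1e-9):
    """Intersect planes a*x + b*y + c*z + d = 0 given as (a, b, c, d).

    Returns (point_on_line, unit_direction) or None if near-parallel.
    """
    n1, d1 = np.asarray(p1[:3], float), float(p1[3])
    n2, d2 = np.asarray(p2[:3], float), float(p2[3])
    direction = np.cross(n1, n2)
    norm = np.linalg.norm(direction)
    if norm < eps:
        return None  # parallel planes: no shared edge
    # Minimum-norm solution of the two plane equations lies on the line.
    A = np.vstack([n1, n2])
    b = -np.array([d1, d2])
    point, *_ = np.linalg.lstsq(A, b, rcond=None)
    return point, direction / norm

# Example: two pitched roof faces meeting at a ridge along the x-axis.
ridge = plane_intersection_line((0.0, 0.6, 0.8, -4.0), (0.0, -0.6, 0.8, -4.0))
print(ridge)  # point near (0, 0, 5), direction (1, 0, 0)
```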

Description

CROSS-REFERENCE TO RELATED APPLICATIONS

[0001]This application is a continuation application claiming priority to, and the benefit of, U.S. Non-Provisional application Ser. No. 15/881,795 filed Jan. 28, 2018, which claims priority to U.S. Provisional Application No. 62/451,700 filed Jan. 28, 2017. The contents of both applications are incorporated herein in their entireties by this reference.

STATEMENT OF GOVERNMENT INTEREST

[0002]This invention was made with government support under contract numbers 1519971 and 1632248 awarded by the National Science Foundation. The Government has certain rights to the invention.

FIELD OF THE INVENTION

[0003]The inventions herein relate generally to improvements in the generation of wireframe renderings derived from 2D and/or 3D data that includes at least one structure of interest in a scene. Such wireframe renderings and similar formats can be used in, among other things, 2D/3D CAD drawings, designs, drafts, models, building information models, augmented reality or virtual reality, and the like.


Application Information

Patent Type & Authority: Application (United States)
IPC (8): G06T17/20; G06T17/05; G06N20/00; G06F30/13
CPC: G06T17/20; G06T17/05; G06T2207/20081; G06F30/13; G06N20/00; G06F30/10
Inventor: FATHI, HABIB; CIPRARI, DANIEL L.; GROSS, BRADDEN J.
Owner: POINTIVO