Method for estimating geometric information of a scene from a single image using a GAN (Generative Adversarial Network)
A technology for estimating the geometric information of a scene from a single image, applied to biological neural network models, image enhancement, image analysis, etc. It addresses the problems of low output image resolution, long training time, and the need for large numbers of training samples, achieving the effects of improving performance and accuracy, increasing the number of network layers, and reducing measurement cost.
Embodiment Construction
[0056] In order to better understand the technical solution proposed by the present invention, the invention is further described below in conjunction with the accompanying drawings and specific embodiments.
[0057] Usually, in order to obtain the geometric information of a scene, especially its depth information, a Kinect camera is used to capture a depth image of the scene. However, the measurement range of the Kinect is short and the acquired depth information is sparse, so multiple measurements are required to obtain the complete depth of the scene. We therefore wish to estimate the full depth of the scene from the RGB image captured by an ordinary camera together with the sparse depth image captured by the Kinect, where the ordinary camera and the Kinect capture their images from the same position and angle.
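The text above does not specify how the RGB image and the sparse Kinect depth map are combined before being fed to the network. As one plausible illustration only, the following minimal sketch (using NumPy, with hypothetical names such as `make_generator_input`, `rgb`, and `sparse_depth`) stacks the RGB image, a normalized sparse depth channel, and a validity mask into a single multi-channel array of the kind a GAN generator could take as input; it is not the patent's actual implementation.

```python
import numpy as np

def make_generator_input(rgb: np.ndarray, sparse_depth: np.ndarray) -> np.ndarray:
    """Stack an RGB image and a sparse depth map into one network input.

    rgb          : (H, W, 3) float array in [0, 1], from the ordinary camera.
    sparse_depth : (H, W) float array in metres; 0 marks pixels with no Kinect reading.
    Returns      : (H, W, 5) array = RGB + normalized sparse depth + validity mask.
    """
    valid = (sparse_depth > 0).astype(np.float32)            # 1 where the Kinect measured depth
    max_depth = sparse_depth.max() if valid.any() else 1.0   # avoid division by zero
    depth_norm = sparse_depth / max_depth                    # scale sparse depth to [0, 1]
    return np.concatenate(
        [rgb.astype(np.float32), depth_norm[..., None], valid[..., None]],
        axis=-1,
    )

# Example: a 480x640 scene with depth known at only a few scattered pixels.
rgb = np.random.rand(480, 640, 3).astype(np.float32)
sparse_depth = np.zeros((480, 640), dtype=np.float32)
ys, xs = np.random.randint(0, 480, 200), np.random.randint(0, 640, 200)
sparse_depth[ys, xs] = np.random.uniform(0.5, 4.0, 200)     # short Kinect measurement range
x = make_generator_input(rgb, sparse_depth)
print(x.shape)  # (480, 640, 5) -- candidate input for a generator predicting dense depth
```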
[0058] In the present invention, the image of the scene and the depth of several pixels in the image are...