Method and device for determining depth of object in three-dimensional scene, terminal and storage medium

A technology relating to three-dimensional scenes and depth-determination methods, applied in image data processing, instruments, computation, and the like. It addresses the problems of low depth-value precision and flickering of overlapping images in three-dimensional scenes, achieving high precision and avoiding image flicker in overlapping areas.

Pending Publication Date: 2020-06-12
深圳市华橙数字科技有限公司

AI Technical Summary

Problems solved by technology

[0005] The present application provides a method, device, terminal, and storage medium for determining the depth of an object in a three-dimensional scene, so as to solve the problem that the precision of depth values in existing three-dimensional scenes is not high enough, which causes flickering of overlapping images in the three-dimensional scene.




Detailed Description of the Embodiments

[0032] The technical solutions in the embodiments of the present application are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some of the embodiments of the present application, not all of them. All other embodiments obtained by persons of ordinary skill in the art based on the embodiments of the present application without creative effort fall within the scope of protection of the present application.

[0033] The terms "first", "second", and "third" in this application are used for descriptive purposes only and shall not be understood as indicating or implying relative importance or implicitly indicating the number of the technical features referred to. Thus, a feature defined as "first", "second", or "third" may explicitly or implicitly include at least one such feature. In the description of the present application, "plurality" means at least two...



Abstract

The invention discloses a method and device for determining the depth of an object in a three-dimensional scene, a terminal, and a storage medium. The method comprises: obtaining the distance between a target object in the three-dimensional scene and a virtual camera; detecting whether the three-dimensional scene is a close-range display scene or a long-range display scene; when the three-dimensional scene is a close-range display scene, calculating a depth value of the target object by using the distance together with a preset first minimum observation distance and a preset first maximum observation distance of the virtual camera; and when the three-dimensional scene is a long-range display scene, obtaining the height from the virtual camera to the ground, determining the current minimum observation distance and the current maximum observation distance of the virtual camera based on that height, and calculating the depth value of the target object by combining the distance with the current minimum observation distance and the current maximum observation distance. In this way, the precision of the depth buffer in overlapping image areas of a three-dimensional display scene can be improved, and flickering of images in the overlapping areas can be avoided.
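The branching flow in the abstract can be sketched roughly as follows. This is a minimal, hedged reconstruction: the identifiers (computeDepth, SceneType, kNearPreset, planesFromHeight) and the linear height-to-plane mapping are illustrative assumptions, since the excerpt does not disclose the exact formulas.

```cpp
// Hedged sketch of the depth-determination flow described in the abstract.
// All identifiers and the height-to-plane mapping are assumptions for
// illustration; the patent excerpt does not give the exact formulas.
#include <algorithm>

enum class SceneType { CloseRange, LongRange };

struct Camera {
    double height;  // height of the virtual camera above the ground
};

// Preset (first) minimum/maximum observation distances used for
// close-range display scenes.
constexpr double kNearPreset = 0.1;
constexpr double kFarPreset  = 1000.0;

// Assumed placeholder: derive the current minimum/maximum observation
// distances from the camera height for a long-range display scene.
void planesFromHeight(double height, double& nearPlane, double& farPlane) {
    nearPlane = std::max(0.1, 0.01 * height);
    farPlane  = std::max(nearPlane * 10.0, 100.0 * height);
}

// Map the camera-to-object distance into a normalized [0, 1] depth value
// using the chosen observation distances (a simple linear mapping is shown).
double normalizedDepth(double distance, double nearPlane, double farPlane) {
    double d = (distance - nearPlane) / (farPlane - nearPlane);
    return std::clamp(d, 0.0, 1.0);
}

double computeDepth(double distanceToObject, SceneType scene, const Camera& cam) {
    double nearPlane = kNearPreset;
    double farPlane  = kFarPreset;
    if (scene == SceneType::LongRange) {
        // Long-range display scene: both observation distances follow the
        // camera height instead of the fixed presets.
        planesFromHeight(cam.height, nearPlane, farPlane);
    }
    return normalizedDepth(distanceToObject, nearPlane, farPlane);
}
```

A caller would typically classify the scene first and then evaluate, e.g. computeDepth(distance, SceneType::LongRange, camera); the key design point in the abstract is that only the long-range branch recomputes the observation distances from the camera height.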

Description

Technical Field

[0001] The present application relates to the technical field of three-dimensional display, and in particular to a method, device, terminal and storage medium for determining the depth of an object in a three-dimensional scene.

Background

[0002] With the development of science and technology, three-dimensional scenes are used in more and more industries. In applications with 3D scenes, the computer usually needs to convert the data describing the 3D scene into 2D data that can be viewed on an electronic display screen. This process of converting a 3D scene into a 2D image is usually called rendering. Rendering is generally divided into three stages: the application stage, the geometry stage, and the rasterization stage. In the rasterization stage, interpolation is performed between the vertices of the primitives obtained in the previous stage to generate pixels on the screen, and the final image is rendered.

[0003] The task of the rasterization stage is ...
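As standard graphics background rather than text from the patent, the usual perspective depth mapping depth(z) = f*(z - n) / (z*(f - n)) spends most of the [0, 1] depth range near the near plane n, so a far plane f chosen much larger than the actual viewing range leaves distant overlapping surfaces with almost identical stored depths, which is the flickering (z-fighting) the application targets. The short program below, using assumed example numbers, illustrates how tightening the near/far observation distances widens the depth gap between two nearby distant surfaces.

```cpp
// Background illustration (standard depth-buffer math, not taken from the
// patent): compare the stored depth of two surfaces 0.5 units apart at a
// 5 km viewing distance under a wide and a tighter near/far range.
#include <cstdio>

// Conventional [0, 1] perspective depth for view-space distance z with
// near plane n and far plane f.
double perspectiveDepth(double z, double n, double f) {
    return f * (z - n) / (z * (f - n));
}

int main() {
    const double zA = 5000.0, zB = 5000.5;  // two overlapping distant surfaces

    // Wide preset range: the two depths differ only around the 9th decimal,
    // below the resolution of a typical depth buffer, so the surfaces flicker.
    std::printf("wide  range: %.9f vs %.9f\n",
                perspectiveDepth(zA, 0.1, 100000.0),
                perspectiveDepth(zB, 0.1, 100000.0));

    // Range adapted to the viewing distance: the depth gap grows by roughly
    // three orders of magnitude, which is the effect of choosing the
    // observation distances to match the scene.
    std::printf("tight range: %.9f vs %.9f\n",
                perspectiveDepth(zA, 100.0, 20000.0),
                perspectiveDepth(zB, 100.0, 20000.0));
    return 0;
}
```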


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T3/00; G06T7/50
CPC: G06T3/0037; G06T7/50
Inventors: 张烨妮; 刘华; 陈继超
Owner: 深圳市华橙数字科技有限公司