
Monocular depth estimation method based on electro-hydraulic focusable lens, corresponding camera and storage medium

A depth estimation technology based on an electro-hydraulic adjustable focus lens, applied in the field of computer vision, which addresses problems such as complex structure, high price and high upfront cost in existing ways of obtaining depth information, and achieves a simple and practical model-building process, low cost, and a simple method and algorithm.

Active Publication Date: 2021-10-22
SHANGHAI UNIV

AI Technical Summary

Problems solved by technology

[0003] At present, there are three main ways to obtain depth information: direct measurement with infrared or radar cameras, inferring object depth from binocular parallax, and depth estimation based on a monocular camera. Equipment for direct measurement is usually bulky, expensive and power-hungry. Methods based on binocular parallax are structurally more complicated than a monocular camera, and the range of depths they can estimate is limited by the baseline length between the two cameras. Depth estimation based on a monocular camera is mainly realized with deep neural networks, which require a complex network structure as well as large amounts of data and training time, so the upfront cost is very high.

Method used


Examples

Embodiment 1

[0046] Referring to Figure 1, a monocular depth estimation method based on an electro-hydraulic adjustable focus lens comprises the following steps:

[0047] S1: Use Zemax software to establish the optical imaging system model of the electro-hydraulic adjustable focus lens, and set in Zemax the radius, thickness, curvature, material and other parameters of the lens used.

[0048] S2: Establish the functional relationship between the focal length of the electro-hydraulic adjustable focus lens and the optimal imaging object distance. The specific establishment process is:

[0049] A: Referring to Figure 2, using the optical imaging system model established in step S1, the relationship between the control current and the focal length of the electro-hydraulic adjustable focus lens can be obtained:

f = αI + β    (1)

[0051] where f is the focal length, I is the control current, and α and β are coefficients obtained by curve fitting;
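In practice, α and β can be recovered with an ordinary linear least-squares fit over (control current, focal length) pairs sampled from the model of step S1. A minimal sketch in Python; the sample values and variable names are purely illustrative and not taken from the patent:

```python
import numpy as np

# Illustrative (control current, focal length) samples; the actual values
# would come from the Zemax optical imaging system model of step S1.
currents_mA = np.array([0.0, 20.0, 40.0, 60.0, 80.0, 100.0])
focal_lengths_mm = np.array([60.1, 55.8, 51.7, 47.3, 43.2, 39.0])

# Linear least-squares fit of f = alpha * I + beta  (Equation (1))
alpha, beta = np.polyfit(currents_mA, focal_lengths_mm, deg=1)

def focal_length(current_mA: float) -> float:
    """Predict the lens focal length for a given control current."""
    return alpha * current_mA + beta
```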

[0052] B: Referring to Figure 3, ...
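The remainder of step B is truncated in this record. One common way such a focal-length-to-object-distance relationship can be expressed (an assumption offered for illustration, not the patent's actual derivation) is the thin-lens equation with a fixed lens-to-sensor distance v: 1/f = 1/u + 1/v, so the best-focused object distance is u = f·v / (v − f).

```python
def optimal_object_distance(f_mm: float, image_distance_mm: float) -> float:
    """Thin-lens estimate (illustrative assumption): with a fixed
    lens-to-sensor distance v, the object distance u in best focus for
    focal length f satisfies 1/f = 1/u + 1/v, i.e. u = f*v / (v - f)."""
    return f_mm * image_distance_mm / (image_distance_mm - f_mm)
```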

Embodiment 2

[0074] A monocular depth estimation method based on an electro-hydraulic adjustable focus lens, the only difference from Embodiment 1 being the rule by which the sliding window slides with step size λ in step S8:

[0075] a: The sliding window starts with the lower-right corner pixel of its central λ×λ block placed on the pixel at the initial position of the image. After the depth value of that window has been estimated, the window slides along the negative x direction with step size λ and continues the depth estimation until it reaches the leftmost border of the image;

[0076] b: The sliding window returns to the rightmost boundary, level with the pixel at the initial position of the image, slides once along the negative y direction with step size λ, and then continues to slide according to rule a above; rules a and b are repeated until the depth estimation of all pixels of the image is completed (a traversal sketch follows below).
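A minimal sketch of this traversal order; the x-right/y-down coordinate convention and the helper name are assumptions, not taken from the patent:

```python
def window_anchor_positions(width: int, height: int, lam: int):
    """Yield sliding-window anchor positions for Embodiment 2:
    scan from the rightmost boundary toward the left with step lam
    (rule a), then take one step of lam in the y direction and repeat
    (rule b) until the whole image is covered.  The x-right/y-down
    coordinate convention used here is an assumption."""
    y = 0
    while y < height:
        x = width - 1                  # start at the rightmost boundary
        while x >= 0:
            yield x, y                 # estimate depth for the window anchored here
            x -= lam                   # rule a: slide along the negative x direction
        y += lam                       # rule b: one step of lam in y, then repeat rule a
```

Each yielded anchor corresponds to one λ×λ window whose depth value is estimated as in Embodiment 1.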

Embodiment 3

[0078] A camera comprises an electro-hydraulic adjustable focus lens connected to a control module; the control module is connected to a processor module and a storage module. One or more programs are stored on the storage module, and when the one or more programs are executed by the processor module, they cause the one or more processors to implement any step of the monocular depth estimation method based on the electro-hydraulic adjustable focus lens described in Embodiment 1 or Embodiment 2.


Abstract

The invention belongs to the technical field of computer vision and discloses a monocular depth estimation method based on an electro-hydraulic focusable lens. The method comprises the following main steps: establishing an optical imaging system model of the electro-hydraulic focusable lens; establishing a functional relationship between the focal length of the lens and the optimal imaging object distance; collecting images at different focal lengths and recording the corresponding focal lengths; registering the collected images; pre-processing the images with a gradient operator; obtaining, through a definition (sharpness) evaluation function, the focal length of the image in which the sliding window is sharpest, converting it to the optimal imaging object distance, and taking that distance as the depth estimate of the space point corresponding to the center pixel of the sliding window; and calculating the depth estimates of all pixels by sliding the window with step length λ. The invention solves the problems of low efficiency, complex algorithms and high cost of depth information acquisition in the prior art.
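As a rough picture of how such a depth-from-focus pipeline fits together, the sketch below combines the steps listed in the abstract. The Laplacian used as the gradient/definition measure, the `object_distance` helper standing in for the focal-length-to-object-distance relationship of step S2, and all names are assumptions for illustration, not the patent's exact algorithm:

```python
import numpy as np
import cv2

def depth_from_focus_stack(images, focal_lengths, lam, object_distance):
    """Estimate a per-pixel depth map from a registered focus stack.

    images          : list of registered grayscale frames taken at different focal lengths
    focal_lengths   : focal length recorded for each frame
    lam             : sliding-window size and step, in pixels
    object_distance : callable mapping a focal length to its optimal imaging
                      object distance (illustrative stand-in for step S2)
    """
    # Pre-process each frame with a gradient operator; the Laplacian is used
    # here as an illustrative definition (sharpness) measure.
    sharpness = [np.abs(cv2.Laplacian(img.astype(np.float32), cv2.CV_32F))
                 for img in images]

    h, w = images[0].shape
    depth = np.zeros((h, w), dtype=np.float32)

    # Slide a lam x lam window with step lam; for each window, pick the frame
    # in which that window is sharpest and take the optimal imaging object
    # distance of its focal length as the depth estimate for those pixels.
    for y in range(0, h, lam):
        for x in range(0, w, lam):
            scores = [s[y:y + lam, x:x + lam].sum() for s in sharpness]
            best = int(np.argmax(scores))
            depth[y:y + lam, x:x + lam] = object_distance(focal_lengths[best])
    return depth
```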

Description

Technical field

[0001] The invention relates to the technical field of computer vision, and in particular to a monocular depth estimation method based on an electro-hydraulic adjustable focus lens, a corresponding camera and a storage medium.

Background technique

[0002] Computer vision is a research field that enables computers to imitate the human visual system in processing and analyzing acquired visual information. Its goal is to enable robots to perceive object information in three-dimensional space as effectively as humans, including the depth and color of objects, and then to analyze and understand the perceived information. Depth information plays a very important role in fields such as automatic driving and 3D scene reconstruction, so obtaining the depth information of 3D scenes is particularly important.

[0003] At present, there are three main ways to obtain depth information: direct measurement wi...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06T 7/50; G06T 7/00; G06T 7/33
CPC: G06T 7/50; G06T 7/0002; G06T 7/33; G06T 2207/10028; G06T 2207/30168
Inventors: 李恒宇, 韩爽, 刘靖逸, 谢永浩, 岳涛, 谢少荣, 罗均
Owner: SHANGHAI UNIV