
A lens distortion correction and feature extraction method and system

A lens-distortion-correction and feature-extraction technology, applied to television systems, character and pattern recognition, and colour-television components. It addresses the problems of large data-processing volume, high CPU usage, and the resulting inability of robots or drones to perform real-time positioning, with the effects of improving computing efficiency, reducing hardware cost, and achieving high-precision positioning.

Inactive Publication Date: 2019-06-04
DONGGUAN PANRAYS INC LTD

AI Technical Summary

Problems solved by technology

[0008] The present invention aims to solve the technical problem that prior-art distortion correction and feature extraction methods and systems require a large amount of data processing and high CPU usage, which prevents real-time positioning of robots or drones, and to provide a new lens distortion correction and feature extraction method and system with high computing efficiency.



Examples


Embodiment 1

[0081] The application scenario of this embodiment is positioning in an indoor area where ground features can be extracted. First, the camera captures feature information of the edge lines of the indoor floor tiles; referring to Figure 4, the X-axis and Y-axis directions are established, and a fast transform yields the relative position from the camera centre point to the straight line in the captured picture.
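Once a tile-edge line is known in ground-plane coordinates, the relative position of the camera centre to that line reduces to a point-to-line distance. A minimal sketch of that final step, with illustrative coordinates and helper names that are not from the patent:

```python
import math

def line_from_points(p1, p2):
    """Return (a, b, c) for the line a*x + b*y + c = 0 through p1 and p2."""
    (x1, y1), (x2, y2) = p1, p2
    a = y2 - y1
    b = x1 - x2
    c = x2 * y1 - x1 * y2
    return a, b, c

def point_line_distance(point, line):
    """Perpendicular distance from a point to a line in a*x + b*y + c = 0 form."""
    a, b, c = line
    x, y = point
    return abs(a * x + b * y + c) / math.hypot(a, b)

# Hypothetical example: a tile edge along y = 2 in ground-plane units,
# camera centre projected to the origin -> lateral offset of 2 units.
edge = line_from_points((0.0, 2.0), (5.0, 2.0))
offset = point_line_distance((0.0, 0.0), edge)
```

Two such distances, one per tile-edge direction, give the X and Y offsets relative to the grid of floor tiles.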

[0082] S101: Acquire a lens image. Capture pictures for subsequent module processing.

[0083] S102: Image preprocessing. Convert the collected lens image from a colour image to a grayscale image; perform edge detection to reveal the outlines in the image; and crop the image to remove interfering contour regions.

[0084] Converting the picture from a colour image to a grayscale image reduces the amount of subsequent data to be processed and speeds up computation, and a grayscale image is also easier to pr...
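The data reduction from the grayscale step is concrete: one luminance channel replaces three colour channels. A minimal sketch using the standard ITU-R BT.601 luma weights (the patent does not specify which conversion it uses; this pure-Python form is illustrative, with a real pipeline using an image library instead of nested lists):

```python
def to_grayscale(rgb_image):
    """Convert an RGB image, given as rows of (r, g, b) tuples, to grayscale
    with BT.601 luma weights. One channel instead of three cuts the data
    volume for the later edge-detection stage by two thirds."""
    return [[0.299 * r + 0.587 * g + 0.114 * b for (r, g, b) in row]
            for row in rgb_image]

# Tiny 1x2 image: one white pixel, one black pixel.
gray = to_grayscale([[(255, 255, 255), (0, 0, 0)]])
```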



Abstract

The invention discloses a lens distortion correction and feature extraction method and system. The method mainly comprises the following steps: acquiring lens picture information, preprocessing pictures, extracting picture contour points, correcting lens distortion, extracting graphic feature key points, performing key-point inverse perspective transformation, and calculating a key-point physical-world equation. The system mainly comprises the hardware and software modules providing an operation platform for the method, namely an input image module, a picture preprocessing module, an image contour-point extraction module, a contour-point distortion correction module, a graph feature key-point extraction module, a key-point inverse perspective transformation module, and a feature-graph physical-world equation obtaining module. Compared with a traditional method and system, the method and system can reduce CPU computation by more than a factor of six, remarkably reducing hardware cost while improving calculation efficiency; they can quickly obtain physical feature information related to the ground and realize indoor high-precision positioning, and are particularly suitable for embedded systems with few computing resources.
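The abstract's contour-point distortion correction module corrects only the extracted contour points rather than every pixel of the frame, which is where much of the claimed CPU saving comes from. The patent page does not publish its distortion model, but contour-point correction is commonly done by inverting a radial lens model; a minimal sketch assuming a two-coefficient radial model with illustrative values of k1 and k2:

```python
def undistort_point(xd, yd, k1, k2):
    """Iteratively invert the radial distortion model
    x_d = x * (1 + k1*r^2 + k2*r^4) for one normalized image point.
    Applying this per contour point, instead of remapping the whole
    image, keeps the per-frame workload small."""
    x, y = xd, yd
    for _ in range(20):  # fixed-point iteration; converges for mild distortion
        r2 = x * x + y * y
        scale = 1.0 + k1 * r2 + k2 * r2 * r2
        x = xd / scale
        y = yd / scale
    return x, y
```

A few hundred contour points per frame then cost a few hundred of these evaluations, versus one per pixel for full-image undistortion.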

Description

Technical Field

[0001] The invention belongs to the field of computer vision detection, and in particular relates to a method and system for lens distortion correction and feature extraction that have high calculation efficiency and can effectively reduce CPU running time.

Background Technique

[0002] Computer vision detection has broad application prospects in the fields of robots, unmanned vehicles, and drones, where the amount of data to be processed is large and the real-time requirements are high. Lens distortion calibration and feature extraction are widely used in computer vision inspection, especially in indoor positioning systems. High-precision indoor positioning systems more commonly use radio ranging, such as ultra-wideband positioning or WiFi received-signal-strength positioning, and then apply triangulation principles to determine the absolute pose of the camera. Radio positioning needs to know the environment information in advance, and its posi...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): H04N5/232; H04N5/357
CPC: G01C11/00; H04N23/60; H04N25/61
Inventors: 王峰, 肖飞, 汪进, 黄祖德, 邱文添, 李诗语, 曹彬
Owner DONGGUAN PANRAYS INC LTD