
Visual positioning method based on ground texture, chip and mobile robot

A mobile-robot visual positioning technology, applied in processor architecture/configuration, instrumentation, image data processing, etc. It addresses the problems of sparse feature points, low reliability, and difficult feature selection and extraction, with the effects of faster matching, reduced computation time and load, and suppression of the natural background.

Pending Publication Date: 2020-12-29
AMICRO SEMICON CORP

AI Technical Summary

Problems solved by technology

However, for images of complex scenes, feature complexity is high and feature selection and extraction are difficult, so the accuracy of SLAM navigation that relies on image feature information is low. Images of simple scenes, on the other hand, have fewer feature points and lower reliability, which also degrades SLAM navigation accuracy.




Embodiment Construction

[0016] The technical solutions in the embodiments of the present invention will be described in detail below with reference to the drawings in the embodiments of the present invention.

[0017] It should be noted that relative terms such as "first" and "second" are used only to distinguish one entity or operation from another, and do not necessarily require or imply any actual relationship or order between them. Furthermore, the terms "comprises", "comprising", or any variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus comprising a set of elements includes not only those elements but may also include elements not expressly listed, or elements inherent to such a process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not...



Abstract

The invention discloses a visual positioning method based on ground texture, a chip, and a mobile robot, and belongs to the technical field of monocular visual navigation. Compared with the prior art, the camera disclosed by the invention is mounted at the bottom of the machine body, so that it is not affected by external light sources and can be used in both indoor and outdoor working areas. No extra image acquisition or processing around the vehicle body or above the ground is needed during navigation and positioning. In this visual positioning method, during navigation and positioning of the mobile robot, a mean weighting operation is performed on the gray values of first ground texture feature points using pre-configured texel intervals and a gray-level distribution relationship, and a feature-value difference matching search is performed against a pre-configured ground texture feature library. The natural background is thereby effectively suppressed, the ground texture is highlighted, and effective local texture information is extracted through matching to complete high-precision visual positioning.
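The matching pipeline the abstract describes can be sketched roughly as follows. This is an illustrative reconstruction, not the patented algorithm: the texel interval width, the particular weighting scheme (here, weighting each gray value by its share of the interval's gray-level mass), and the feature-library format are all assumptions introduced for the example.

```python
# Illustrative sketch only: the texel interval, the weighting scheme,
# and the library format are assumptions, not the patent's definitions.

def weighted_mean_descriptor(patch, step=4):
    """Summarize a 1-D strip of gray values over texel intervals of
    width `step`, weighting each sample by its relative share of the
    interval's gray-level sum (a simple stand-in for the patent's
    'mean weighting with a gray-level distribution relationship')."""
    descriptor = []
    for start in range(0, len(patch), step):
        interval = patch[start:start + step]
        total = sum(interval)
        if total == 0:
            descriptor.append(0.0)
            continue
        # weight each gray value g by its distribution mass g / total
        descriptor.append(sum(g * (g / total) for g in interval))
    return descriptor

def match_in_library(descriptor, library):
    """Find the library entry with the smallest summed feature-value
    difference (sum of absolute differences) to the query."""
    best_name, best_cost = None, float("inf")
    for name, ref in library.items():
        cost = sum(abs(a - b) for a, b in zip(descriptor, ref))
        if cost < best_cost:
            best_name, best_cost = name, cost
    return best_name, best_cost

if __name__ == "__main__":
    patch = [10, 12, 11, 13, 200, 210, 205, 198]  # toy gray values
    query = weighted_mean_descriptor(patch)
    library = {
        "tile_edge": weighted_mean_descriptor([10, 12, 11, 13, 201, 209, 204, 199]),
        "carpet":    weighted_mean_descriptor([90, 92, 91, 93, 95, 94, 92, 90]),
    }
    name, cost = match_in_library(query, library)
    print(name)
```

Because a uniform background yields nearly flat intervals while ground texture yields contrasting ones, a weighted mean of this kind tends to emphasize textured regions over the background, which is the suppression effect the abstract claims.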

Description

Technical Field

[0001] The invention belongs to the technical field of monocular vision navigation, and in particular relates to a ground-texture-based visual positioning method, a chip, and a mobile robot.

Background Technique

[0002] In the prior art, the visual SLAM navigation of a robot mainly uses the ORB algorithm to detect feature points and realize simultaneous localization and mapping. However, for images of complex scenes, feature complexity is high and feature selection and extraction are difficult, so the accuracy of SLAM navigation that relies on image feature information is low. On the other hand, images of simple scenes have fewer feature points and lower reliability, which also affects the accuracy of SLAM navigation.

Contents of the Invention

[0003] In order to solve the navigation and positioning accuracy problem of current visual SLAM technology, the present invention discloses a visual positioning method based on groun...
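For context, the prior-art pipeline the Background describes matches ORB keypoints between frames; ORB produces binary descriptors that are compared by Hamming distance. The sketch below uses toy integer descriptors rather than real ORB output, and the brute-force search is a minimal stand-in for a production matcher.

```python
# Sketch of prior-art ORB-style matching: binary descriptors compared
# by Hamming distance. Descriptors here are toy integers, not real
# 256-bit ORB output.

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two binary descriptors."""
    return bin(a ^ b).count("1")

def brute_force_match(query_descs, train_descs):
    """For each query descriptor, return the index of the closest
    train descriptor under Hamming distance."""
    matches = []
    for q in query_descs:
        best_idx = min(range(len(train_descs)),
                       key=lambda i: hamming(q, train_descs[i]))
        matches.append(best_idx)
    return matches
```

In a simple scene, few keypoints survive detection, so the match set is small and unreliable; in a cluttered scene, many ambiguous descriptors produce false matches. These are the two failure modes the invention targets by restricting the camera to the ground texture.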

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06T7/73; G06T7/40; G06T1/20; G01C21/32
CPC: G06T7/73; G06T7/40; G06T1/20; G01C21/32; G06T2207/20056
Inventors: 许登科, 赖钦伟
Owner: AMICRO SEMICON CORP