
Indoor article searching and positioning method for visually impaired people

A technology for visually impaired people, applied in the field of target item feature extraction; it solves the problem of feature information not being considered and achieves the effect of reducing image distortion.

Inactive Publication Date: 2021-01-15
SHANGHAI MARITIME UNIVERSITY

AI Technical Summary

Problems solved by technology

This method only applies the attention mechanism to different video features, that is, it extracts the overall feature information of the different video features through the attention mechanism and fuses them; it does not consider the feature information of the different videos at the overall (object), local (component), and minimum constituent unit (pixel) levels.
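
To make the contrast concrete, the sketch below shows one way attention can be applied at all three granularities named above (object, component, pixel) and then fused. It is only an illustrative PyTorch stand-in: the layer shapes, the 8x8 region grid, and the element-wise fusion are assumptions, not the patented architecture.

```python
# Illustrative sketch (not the patented network): attention computed at three
# granularities -- object-level (global), component-level (regional), and
# pixel-level -- then fused by element-wise weighting. Sizes are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiLevelAttention(nn.Module):
    def __init__(self, channels: int = 64):
        super().__init__()
        # Pixel-level: a 1x1 convolution produces a per-pixel attention map.
        self.pixel_att = nn.Conv2d(channels, 1, kernel_size=1)
        # Component-level: attention over coarse regions (here an 8x8 grid).
        self.region_att = nn.Conv2d(channels, 1, kernel_size=1)
        # Object-level: channel attention from globally pooled features.
        self.object_att = nn.Linear(channels, channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, height, width) feature map from a backbone CNN.
        pixel_w = torch.sigmoid(self.pixel_att(x))                    # (B,1,H,W)
        region = F.adaptive_avg_pool2d(x, output_size=8)              # coarse regions
        region_w = torch.sigmoid(self.region_att(region))             # (B,1,8,8)
        region_w = F.interpolate(region_w, size=x.shape[-2:], mode="nearest")
        global_vec = F.adaptive_avg_pool2d(x, 1).flatten(1)           # (B,C)
        object_w = torch.sigmoid(self.object_att(global_vec))[..., None, None]
        # Fuse the three attention levels by simple element-wise weighting.
        return x * pixel_w * region_w * object_w

if __name__ == "__main__":
    feats = torch.randn(2, 64, 32, 32)
    print(MultiLevelAttention(64)(feats).shape)   # torch.Size([2, 64, 32, 32])
```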




Embodiment Construction

[0093] The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by persons of ordinary skill in the art without creative effort fall within the scope of protection of the present invention.

[0094] The present invention provides a method for finding and locating indoor items for visually impaired persons that integrates a multi-level attention mechanism neural network. The overall flow chart is shown in figure 1. The method includes the following steps:

[0095] S1. The visually impaired person inputs the name or characteristics of the item to be searched for through the voice recognition module (only a single item can be input at a time), and collects ...
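
A minimal sketch of what step S1 might look like in code is given below, assuming the binocular camera is exposed as two ordinary video devices and using the third-party speech_recognition package for the voice query; the device indices and the language code are illustrative assumptions, not details from the patent.

```python
# Sketch of step S1: take one spoken item name, then grab a stereo image pair.
# Device indices 0/1 and the zh-CN language code are assumptions.
import cv2
import speech_recognition as sr

def get_target_name() -> str:
    """Record a single spoken item name from the user."""
    recognizer = sr.Recognizer()
    with sr.Microphone() as source:
        audio = recognizer.listen(source)
    return recognizer.recognize_google(audio, language="zh-CN")

def capture_stereo_pair(left_index: int = 0, right_index: int = 1):
    """Grab one frame from each lens of the binocular camera."""
    left_cam = cv2.VideoCapture(left_index)
    right_cam = cv2.VideoCapture(right_index)
    ok_l, left = left_cam.read()
    ok_r, right = right_cam.read()
    left_cam.release()
    right_cam.release()
    if not (ok_l and ok_r):
        raise RuntimeError("binocular camera capture failed")
    return left, right

if __name__ == "__main__":
    target = get_target_name()              # a single item per query, as in S1
    left_img, right_img = capture_stereo_pair()
```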



Abstract

The invention discloses an indoor article searching and positioning method for visually impaired people. The method comprises the following steps: S1, the visually impaired person inputs the name of a target article through a voice module, and images are then collected indoors through a binocular camera; S2, an adaptive sigmoid transfer algorithm (ASTF) based on a neural network is designed and combined with a Laplace operator to enhance the brightness of the acquired images and reduce their distortion; S3, a variable-scale convolutional neural network is designed to convolve the images obtained in step S2 to the same size; S4, a convolutional neural network fused with a multi-level attention mechanism is designed to extract feature information from the images obtained in S3, and the feature information is matched against the feature data of the target object in a database; S5, if the matching succeeds, the position of the target object is obtained and its position information is output through the voice module; if the matching is unsuccessful, a message that the article was not found is output through the voice module. The invention can effectively help visually impaired people accurately search for articles in a low-light environment.
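
The page does not give the exact ASTF formula, so the following sketch of step S2 uses a generic sigmoid transfer whose midpoint adapts to the image mean, followed by Laplacian sharpening with OpenCV; the gain value and the sharpening weight are illustrative assumptions rather than the patented algorithm.

```python
# Hedged sketch of step S2: adaptive sigmoid brightening (assumed form) plus
# Laplacian-based sharpening to limit the distortion of low-light images.
import cv2
import numpy as np

def adaptive_sigmoid_transfer(gray: np.ndarray, gain: float = 10.0) -> np.ndarray:
    """Brighten a low-light grayscale image with a sigmoid curve whose
    midpoint is the image's own mean intensity (an assumed behaviour)."""
    x = gray.astype(np.float32) / 255.0
    midpoint = float(x.mean())
    y = 1.0 / (1.0 + np.exp(-gain * (x - midpoint)))
    y = (y - y.min()) / (y.max() - y.min() + 1e-6)
    return (255.0 * y).astype(np.uint8)

def enhance(image_bgr: np.ndarray) -> np.ndarray:
    """Brightness enhancement plus Laplacian sharpening, in the spirit of S2."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    bright = adaptive_sigmoid_transfer(gray)
    lap = cv2.Laplacian(bright, ddepth=cv2.CV_16S, ksize=3)
    sharpened = bright.astype(np.float32) - 0.5 * lap.astype(np.float32)
    return np.clip(sharpened, 0, 255).astype(np.uint8)
```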

Description

Technical field

[0001] The present invention relates to the technical field of image processing and recognition, and in particular to a new adaptive sigmoid transfer algorithm (ASTF), a variable-scale convolutional neural network, and a method for extracting features of target items that fuses a multi-level attention mechanism neural network.

Background technique

[0002] At present, there are many visually impaired people in our country. Because they lack visual information, visually impaired people cannot perceive and recognize common objects in daily life, which brings great challenges to their everyday activities. Although many methods have emerged in recent years to help the visually impaired overcome visual difficulties, such as guide dogs and white canes, these solutions only provide navigation; they cannot effectively identify objects and guide the blind to them, and they are even less effective in low-light environments. When the visually impaired...
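
The "variable-scale convolutional neural network" of step S3 is not detailed on this page. As one possible interpretation, the sketch below lets images of any resolution pass through shared convolutions and then adaptively pools them to a single fixed size; the layer widths and the 56x56 output size are assumptions, not values from the patent.

```python
# Hedged sketch of step S3: shared convolutions followed by adaptive pooling,
# so inputs of any resolution come out at one common feature-map size.
import torch
import torch.nn as nn

class VariableScaleCNN(nn.Module):
    def __init__(self, out_size: int = 56):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
        )
        # Adaptive pooling maps any input resolution to out_size x out_size.
        self.pool = nn.AdaptiveAvgPool2d(out_size)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.pool(self.conv(x))

if __name__ == "__main__":
    small = torch.randn(1, 3, 120, 160)
    large = torch.randn(1, 3, 480, 640)
    net = VariableScaleCNN()
    print(net(small).shape, net(large).shape)   # both (1, 64, 56, 56)
```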

Claims


Application Information

IPC(8): G06T7/70; G06T7/90; G06T7/13; G06T17/00; G06K9/62; G06N3/04; G06N3/08
CPC: G06T7/70; G06T7/90; G06T7/13; G06T17/00; G06N3/08; G06T2207/10024; G06T2207/20081; G06T2207/20084; G06V10/751; G06N3/045; G06F18/23; G06F18/25; G06F18/241
Inventor: 罗东升, 韩德志
Owner: SHANGHAI MARITIME UNIVERSITY