
Interactive blind guiding system and method based on improved Yolov2 target detection and voice recognition

A blind-guiding system and target detection technology, applied in speech recognition, character and pattern recognition, speech analysis, etc. It addresses problems such as low safety, lack of intelligent interactivity, and heavy network restrictions, thereby improving image detection speed, providing a good scene description function, and improving travel safety.

Pending Publication Date: 2020-01-24
SOUTH CHINA UNIV OF TECH
Cites: 0 | Cited by: 6

AI Technical Summary

Problems solved by technology

[0003] The intelligent blind guiding systems currently on the market rely mainly on infrared assistance and guide canes to help blind people travel. They provide no intelligent interaction and offer low safety, since decisions depend almost entirely on the blind user's own judgment, which leads to a higher accident rate.
The smart glasses for the blind that have recently appeared require human customer-service staff for remote interaction, which makes universal adoption difficult; they are also costly, resource-intensive, and heavily restricted by network availability.
[0004] Intelligent systems that use deep learning to help blind people find objects are not yet mature, mainly because object finding demands high accuracy and precision, and portable devices cannot supply such large computing power.
However, some low-power target detection networks now achieve accuracy and precision similar to common target detection networks while requiring far fewer computing resources, which makes it possible to deploy deep neural networks on portable devices.
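To illustrate the point about on-device deployment, the following is a minimal Python sketch of running a lightweight YOLOv2-family detector on a CPU-only portable device with OpenCV's DNN module. The file names, input size, and confidence threshold are assumptions for illustration only; this is not the patent's improved network.

```python
# Minimal sketch: a lightweight YOLOv2-family detector on a low-power device.
# "yolov2-tiny.cfg" / "yolov2-tiny.weights" are assumed publicly available files.
import cv2
import numpy as np

net = cv2.dnn.readNetFromDarknet("yolov2-tiny.cfg", "yolov2-tiny.weights")
net.setPreferableBackend(cv2.dnn.DNN_BACKEND_OPENCV)
net.setPreferableTarget(cv2.dnn.DNN_TARGET_CPU)   # CPU-only, low-power target

def detect(frame, conf_threshold=0.5):
    """Return (class_id, confidence, box) tuples for one BGR frame."""
    blob = cv2.dnn.blobFromImage(frame, 1 / 255.0, (416, 416),
                                 swapRB=True, crop=False)
    net.setInput(blob)
    outputs = net.forward(net.getUnconnectedOutLayersNames())
    h, w = frame.shape[:2]
    results = []
    for out in outputs:
        for det in out:                 # det = [cx, cy, bw, bh, objectness, scores...]
            scores = det[5:]
            class_id = int(np.argmax(scores))
            confidence = float(scores[class_id])
            if confidence > conf_threshold:
                cx, cy, bw, bh = det[0:4] * np.array([w, h, w, h])
                box = (int(cx - bw / 2), int(cy - bh / 2), int(bw), int(bh))
                results.append((class_id, confidence, box))
    return results
```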




Embodiment Construction

[0057] The present invention will be further described in detail through specific embodiments below, but the embodiments of the present invention are not limited thereto.

[0058] To better describe the present invention: in the research and implementation of the interactive blind guiding system, the deep-learning and neural-network training methods and design principles cited in the related papers are used, and for all symbols that appear the corresponding theoretical basis and source code can be found, so they are not repeated here.

[0059] An interactive blind guiding system based on improved Yolov2 target detection and speech recognition, as shown in figures 1 and 2, includes a central processing unit and, connected to it, a depth camera, a high-end speech synthesis device, a microphone and a power supply, wherein:

[0060] Central processing unit: used for system control, data processing and signal transmission to ensure the stable operation of the whole system...
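As a rough illustration of the control-software layout described in this embodiment (a central processing unit coordinating a target detection unit, a voice recognition unit and a road planning unit, with the depth camera, microphone and speech synthesis device as peripherals), here is a hypothetical Python sketch. All class and method names are assumptions for illustration, not the patent's implementation.

```python
# Hypothetical layout of the blind-guiding control software; names are illustrative.
from dataclasses import dataclass

@dataclass
class Detection:
    label: str          # object class predicted by the detector
    distance_m: float   # distance taken from the depth map
    bearing_deg: float  # horizontal angle relative to the camera axis

class TargetDetectionUnit:
    def detect(self, rgb_image, depth_map) -> list[Detection]:
        """Run the detector on the RGB image and attach depth-map distances."""
        raise NotImplementedError

class VoiceRecognitionUnit:
    def transcribe(self, audio) -> str:
        """Convert the user's spoken request (from the microphone) to text."""
        raise NotImplementedError

class RoadPlanningUnit:
    def plan(self, detections: list[Detection]) -> str:
        """Produce a short spoken description of a safe walking direction."""
        raise NotImplementedError

class CentralProcessingUnit:
    """Controls the system, processes data and routes signals between devices."""
    def __init__(self, camera, microphone, speech_synthesizer):
        self.camera = camera
        self.microphone = microphone
        self.speech = speech_synthesizer
        self.detector = TargetDetectionUnit()
        self.recognizer = VoiceRecognitionUnit()
        self.planner = RoadPlanningUnit()

    def handle_request(self):
        command = self.recognizer.transcribe(self.microphone.record())
        rgb, depth = self.camera.capture()            # RGB image + depth map
        detections = self.detector.detect(rgb, depth)
        if "find" in command:                         # object-searching request
            reply = ", ".join(f"{d.label} at {d.distance_m:.1f} m" for d in detections)
        else:                                         # road-planning request
            reply = self.planner.plan(detections)
        self.speech.speak(reply or "nothing detected")
```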



Abstract

The invention belongs to the technical field of computer vision and voice recognition, and relates to an interactive blind guiding system and method based on improved Yolov2 target detection and voice recognition. The interactive blind guiding system comprises a central processing unit as well as a depth camera, a high-end voice synthesis device, a microphone and a power supply connected with the central processing unit. The central processing unit is used for system control, data processing and signal transmission, and the control software of the interactive blind guiding system, deployed on the central processing unit, comprises a target detection unit, a voice recognition unit and a road planning unit. The depth camera performs image acquisition on the current scene to generate an RGB image and a depth map; the high-end voice synthesis device synthesizes the voice information output by the central processing unit and plays the object-searching result or the road-planning condition; the microphone collects user voice information and transmits it to the central processing unit; and the power supply supplies power to the central processing unit. The system can help blind people live better and improves their quality of life.
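To illustrate how the depth map mentioned in the abstract could turn a detection box into the spoken object-searching result, here is a small sketch. The pixel alignment between the depth map and the RGB image, the millimetre depth units, and the function names are assumptions for illustration, not details stated in the patent.

```python
# Minimal sketch: estimate object distance from a detection box and a depth map.
import math
import numpy as np

def object_distance_m(depth_map: np.ndarray, box) -> float:
    """Median depth inside the detection box (x, y, w, h), in metres."""
    x, y, w, h = box
    region = depth_map[y:y + h, x:x + w]
    valid = region[region > 0]                # drop missing depth readings
    return float(np.median(valid)) / 1000.0 if valid.size else math.nan

def spoken_result(label: str, depth_map: np.ndarray, box) -> str:
    """Sentence passed to the speech synthesis device for one found object."""
    d = object_distance_m(depth_map, box)
    if math.isnan(d):
        return f"{label} detected, distance unknown"
    return f"{label} about {d:.1f} metres ahead"
```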

Description

Technical field [0001] The invention belongs to the technical field of computer vision and speech recognition, and relates to an interactive blind guiding system and method based on improved Yolov2 target detection and speech recognition. Background technique [0002] In recent years, with the development of computer science and technology and under the strong impetus of deep learning, a new intelligent technology, the various technologies of artificial intelligence, such as speech recognition, image recognition and data mining, have made substantial progress and have been successfully applied in many products. Deep learning is the current focus and hotspot of computer vision research, and is also one of the commonly used methods for solving complex environmental problems. As a milestone in the history of human science and technology, computer vision plays a pivotal role in the development of intelligent technology...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K 9/62; G10L 13/04; G10L 15/22; G10L 15/26
CPC: G10L 15/22; G10L 2015/223; G10L 13/00; G10L 15/26; G06F 18/2411
Inventors: 彭文杰, 余菲, 林坤阳, 林泽锋, 郑东润, 范智博, 罗家祥
Owner: SOUTH CHINA UNIV OF TECH