
Information acquisition and transfer method of auxiliary vision system

The invention relates to auxiliary vision systems and stereoscopic vision technology in the field of information technology. It addresses problems of existing methods, such as a huge computational load, erroneous conclusions, and the immature state of pattern recognition and intelligent systems, and achieves the effect of a small computational load.

Status: Inactive; Publication Date: 2009-01-07
XIDIAN UNIV

AI Technical Summary

Problems solved by technology

[0005] (1) Since this method introduces algorithms such as electronic image stabilization and moving-target segmentation, the computational load is huge;
[0006] (2) Because pattern recognition and intelligent systems are not yet mature, the environment cannot be recognized reliably, so wrong conclusions are often drawn.



Examples


Embodiment Construction

[0027] The present invention will be described in further detail below with reference to the accompanying drawings.

[0028] Referring to Figure 1, the information acquisition steps of the present invention are as follows:

[0029] Step 1: Obtain image information.

[0030] Two original digital images of the measured object are obtained simultaneously from different angles by two cameras, giving images I1 and I2, as shown in Figure 3(a) and Figure 3(b).
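The patent does not specify the capture hardware or API. As a minimal sketch, assuming two cameras exposed through OpenCV at device indices 0 and 1 (both the indices and the grab/retrieve pattern are illustrative assumptions, not part of the patent), a roughly simultaneous pair of frames could be obtained as follows:

    # Minimal sketch: grab one frame from each of two cameras at (almost) the same time.
    # The OpenCV API and the device indices 0 and 1 are assumptions for illustration.
    import cv2

    cam_a = cv2.VideoCapture(0)   # camera producing image I1
    cam_b = cv2.VideoCapture(1)   # camera producing image I2

    # grab() both devices first, then retrieve(), so the two exposures are as
    # close together in time as the API allows.
    cam_a.grab()
    cam_b.grab()
    ok1, I1 = cam_a.retrieve()
    ok2, I2 = cam_b.retrieve()

    if ok1 and ok2:
        I1 = cv2.cvtColor(I1, cv2.COLOR_BGR2GRAY)  # grayscale copies for corner detection
        I2 = cv2.cvtColor(I2, cv2.COLOR_BGR2GRAY)

    cam_a.release()
    cam_b.release()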

[0031] Step 2: Extract the feature points of the image information.

[0032] Using the Harris corner detection method, the feature points in Figure 3(a) and Figure 3(b) are extracted respectively, and the extraction steps are as follows:

[0033] 2.1 Compute the gradient images of image I1 using the following formula:

[0034] X1 = I1 ⊗ ...
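The formula above is truncated in the available text. In the conventional Harris formulation, the horizontal gradient X1 is obtained by convolving I1 with the kernel (-1, 0, 1) and the vertical gradient Y1 with its transpose, after which a corner response is computed from smoothed gradient products. A minimal sketch under that conventional formulation (the kernel, Gaussian window, constant k = 0.04 and threshold are illustrative assumptions, not values taken from the patent):

    # Minimal sketch of Harris corner extraction under the conventional formulation.
    # Kernel, window size, k and threshold are illustrative, not the patent's values.
    import cv2
    import numpy as np

    def harris_corners(I1, k=0.04, thresh_ratio=0.01):
        img = np.float32(I1)                       # grayscale image as float
        kernel = np.array([[-1.0, 0.0, 1.0]])      # horizontal derivative kernel
        X1 = cv2.filter2D(img, -1, kernel)         # X1 = I1 (*) (-1, 0, 1)
        Y1 = cv2.filter2D(img, -1, kernel.T)       # Y1 = I1 (*) (-1, 0, 1)^T

        # Smoothed products of gradients (entries of the structure matrix M)
        A = cv2.GaussianBlur(X1 * X1, (5, 5), 1.0)
        B = cv2.GaussianBlur(Y1 * Y1, (5, 5), 1.0)
        C = cv2.GaussianBlur(X1 * Y1, (5, 5), 1.0)

        # Corner response R = det(M) - k * trace(M)^2, with M = [[A, C], [C, B]]
        R = (A * B - C * C) - k * (A + B) ** 2

        # Keep points whose response exceeds a fraction of the maximum response
        ys, xs = np.where(R > thresh_ratio * R.max())
        return list(zip(xs.tolist(), ys.tolist()))  # (x, y) feature point coordinates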



Abstract

The invention discloses an information acquisition and transfer method for assistant vision systems. The method comprises the following steps: (1) acquiring two original digital images of an object from different angles with two cameras at the same time; (2) extracting characteristic points of the two original digital images by means of Harris corner detection; (3) extracting three-dimensional geometrical information of the characteristic points by using the two cameras; (4) constructing a rectangular region centered on each characteristic point, finding the position of the characteristic point in the next frame, and calculating the motion vector of the characteristic point; (5) segmenting the road surface in the original digital image with a color histogram according to the chromatic information, and calculating the road information; (6) coding the motion information of the characteristic points of the original image, the three-dimensional geometrical information of the characteristic points, and the road information respectively; and (7) transferring the coded information to a person with vision disorders via the information transfer array unit in the assistant vision system. The method extracts the three-dimensional geometrical information of the object accurately and helps patients with vision disorders to walk directionally and safely.
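Step (4) tracks each characteristic point by matching the rectangular region centered on it against the next frame; the matching criterion is not stated in the text shown here. A minimal sketch assuming a sum-of-absolute-differences (SAD) block match over a small search range (block size, search range and the SAD criterion are illustrative assumptions):

    # Minimal sketch of step (4): estimate a feature point's motion vector by matching
    # a rectangular block around it in the next frame. SAD matching, block size and
    # search range are assumptions for illustration.
    import numpy as np

    def motion_vector(prev, curr, pt, half=8, search=12):
        """Return the (dx, dy) displacement of feature point pt = (x, y) between frames."""
        h, w = prev.shape[:2]
        x, y = pt
        if not (half <= x < w - half and half <= y < h - half):
            return (0, 0)  # point too close to the image border for a full block
        block = prev[y - half:y + half + 1, x - half:x + half + 1].astype(np.float32)

        best_sad, best_dxdy = None, (0, 0)
        for dy in range(-search, search + 1):
            for dx in range(-search, search + 1):
                cx, cy = x + dx, y + dy
                if not (half <= cx < w - half and half <= cy < h - half):
                    continue  # candidate window would fall outside the image
                cand = curr[cy - half:cy + half + 1, cx - half:cx + half + 1].astype(np.float32)
                sad = float(np.abs(block - cand).sum())  # sum of absolute differences
                if best_sad is None or sad < best_sad:
                    best_sad, best_dxdy = sad, (dx, dy)
        return best_dxdy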

Description

Technical Field

[0001] The invention belongs to the field of information technology and relates to a method for acquiring and transmitting environmental information. The method can effectively acquire environmental information and can be used to help visually impaired people realize directional walking, including directional walking in special environments.

Background Technique

[0002] Since the 1970s, scholars from various countries have designed and explored electronic walking-assistance systems to help blind people obtain environmental information, such as the Laser Cane (Bolgiano D, Meeks E J. A laser cane for the blind [J]. IEEE Journal of Quantum Electronics, 1967, 3(6): 268.), the system of Fish R (Fish R. Auditory display for the blind [P]. USA: 3800082, 1974-03-26.), the VOICE system (Meijer P. Image-audio transformation system [P]. USA: 5097326, 1992-03-17.), the Tyflos intelligent assistant for the blind, etc. These systems use sensing devices to obtain environmental data inf...


Application Information

IPC(8): A61F 9/08; G08B 21/00
Inventors: GUO Baolong (郭宝龙), SUN Wei (孙伟), CHEN Long (陈龙)
Owner: XIDIAN UNIV