
A UAV human-computer interaction method based on binocular vision and deep learning

A deep learning and binocular vision technology, applied to neural learning methods, user/computer interaction input/output, mechanical mode conversion, and the like. It addresses the problems of high computational complexity, the lack of acceleration algorithms, and slow recognition speed, and offers a wide working range, robustness against camera drift and complex background interference, and high accuracy.

Status: Inactive
Publication Date: 2019-05-21
TIANJIN UNIV

AI Technical Summary

Problems solved by technology

[0005] Traditional action recognition algorithms have high computational complexity; lacking the necessary acceleration algorithms, their recognition speed is slow and their accuracy is low.

Method used




Detailed Description of the Embodiments

[0032] The present invention will be further described in detail below with reference to the accompanying drawings and specific embodiments. The following embodiments are descriptive only, not restrictive, and do not limit the protection scope of the present invention.

[0033] A human-computer interaction method for drones based on binocular vision and deep learning comprises the following specific steps:

[0034] 1) When the system starts, the navigator is framed, according to the content shown by the camera, in the single-camera view displayed on the ground station, and is then followed with a fast tracking algorithm, which yields a low-resolution video sequence centered on the navigator.
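
The patent does not name the specific fast tracking algorithm. Purely as an illustrative sketch, the Python/OpenCV snippet below frames the navigator in the ground-station view and follows them with a KCF tracker; the choice of KCF, the camera index, the window name, and the 128x128 crop size are assumptions, not details from the patent.

```python
# Illustrative sketch only: frame the navigator in the ground-station view and
# track them with a fast tracker (KCF assumed; the patent does not name one).
# Requires opencv-contrib-python; the camera source (index 0) is an assumption.
import cv2

cap = cv2.VideoCapture(0)                      # left camera of the binocular rig (assumed index)
ok, frame = cap.read()
if not ok:
    raise RuntimeError("could not read a frame from the camera")

# The operator draws a box around the navigator on the ground-station display.
bbox = cv2.selectROI("ground station", frame, showCrosshair=True)

# Fast correlation-filter tracker; recent OpenCV builds expose it under cv2.legacy.
tracker = (cv2.legacy.TrackerKCF_create() if hasattr(cv2, "legacy")
           else cv2.TrackerKCF_create())
tracker.init(frame, bbox)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    ok, bbox = tracker.update(frame)
    if ok:
        x, y, w, h = [int(v) for v in bbox]
        if w > 0 and h > 0:
            # Low-resolution sub-window centered on the navigator for later stages.
            roi = cv2.resize(frame[y:y + h, x:x + w], (128, 128))
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("ground station", frame)
    if cv2.waitKey(1) & 0xFF == 27:            # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```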

[0035] 2) A stereo matching algorithm based on block matching is applied to the low-resolution part, and this stereo matching stage is accelerated by the graphics processor. At the same time, the parameters of this step provide the maximum and minimum parallax...
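
As a hedged illustration of block-matching stereo with an explicit disparity search range, the sketch below uses OpenCV's StereoBM on the CPU; the disparity range and block size are example values, and reproducing the GPU acceleration described here would require a CUDA-enabled OpenCV build (e.g. its cuda stereo module) or an equivalent, which is not shown.

```python
# Illustrative sketch only: block-matching stereo on the low-resolution region.
# Disparity range and block size are example values, not parameters from the patent.
import cv2
import numpy as np

def block_matching_disparity(left_roi, right_roi, min_disp=0, num_disp=64, block=15):
    """Compute a disparity map for a rectified low-resolution stereo pair."""
    left_gray = cv2.cvtColor(left_roi, cv2.COLOR_BGR2GRAY)
    right_gray = cv2.cvtColor(right_roi, cv2.COLOR_BGR2GRAY)

    stereo = cv2.StereoBM_create(numDisparities=num_disp, blockSize=block)
    stereo.setMinDisparity(min_disp)           # lower bound of the disparity search

    # StereoBM returns fixed-point disparities scaled by 16.
    disp = stereo.compute(left_gray, right_gray).astype(np.float32) / 16.0
    return disp

# The resulting minimum/maximum disparity can then bound later matching stages:
# disp = block_matching_disparity(left_roi, right_roi)
# valid = disp[disp > 0]
# d_min, d_max = float(valid.min()), float(valid.max())
```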



Abstract

The invention relates to a human-computer interaction method for drones based on binocular vision and deep learning. The drone is equipped with an embedded image processing platform and a binocular camera; the embedded image processing platform is connected to the flight controller through an interface, and the drone communicates with the ground station through this platform. The platform is equipped with a graphics processor and runs a convolutional neural network deep learning algorithm, performing parallel computation on the images while the camera captures video. The invention ports the convolutional neural network to an embedded platform configured with a dedicated graphics processing unit (GPU) and accelerates its running speed through parallel computing.
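
The abstract does not disclose the network architecture or framework. Purely as a sketch of running convolutional inference on the platform's GPU, the PyTorch snippet below classifies a gesture frame on CUDA when available; the layer sizes, input resolution, five gesture classes, and the use of PyTorch are all assumptions for illustration, not details from the patent.

```python
# Illustrative sketch only: convolutional gesture classification on an embedded GPU.
# Architecture, input size, and the five gesture classes are assumed for illustration.
import torch
import torch.nn as nn

class SmallGestureCNN(nn.Module):
    def __init__(self, num_classes=5):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 32 * 32, num_classes)   # for 128x128 inputs

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(1))

# Run on the platform's GPU when one is present, otherwise fall back to CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model = SmallGestureCNN().to(device).eval()

# One 128x128 RGB frame cropped around the navigator (random data as a stand-in).
frame = torch.rand(1, 3, 128, 128, device=device)
with torch.no_grad():
    gesture_id = model(frame).argmax(dim=1).item()
print("predicted gesture class:", gesture_id)
```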

Description

Technical Field

[0001] The method belongs to the field of multimedia information processing and specifically involves computer vision, deep learning, human-computer interaction and related technologies, in particular a method for human-computer interaction with drones based on binocular vision and deep learning.

Background Technique

[0002] Human-computer interaction technology was born together with the computer and has developed alongside computer hardware and software; the emergence of new technologies has continuously simplified the process of human-computer interaction. In recent years, with the emergence and development of artificial intelligence technology and the continuous progress and innovation of related software and hardware technologies, achieving more convenient human-computer interaction has become a research hotspot, and various new human-computer interaction technologies keep emerging. At the same time, with the rise and popularization...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC (8): G06F3/01, G06N3/08, G06T7/11, G06T7/194, G06T7/254, G06K9/00
CPC: G06F3/017, G06N3/08, G06T2207/10016, G06V40/23
Inventors: 侯永宏, 叶秀峰, 侯春萍, 刘春源, 陈艳芳
Owner: TIANJIN UNIV