
End-to-end human action recognition method, equipment and medium from the perspective of UAV

A motion recognition technology for unmanned aerial vehicles (UAVs), applied in the field of action recognition; it addresses problems that degrade the performance of human action recognition and achieves the effect of avoiding a large amount of repeated computation.

Active Publication Date: 2022-05-17
杭州晨鹰军泰科技有限公司
Cites: 8 | Cited by: 0

AI Technical Summary

Problems solved by technology

Moreover, changes in the drone's shooting angle cause corresponding changes in how people appear in the frame, which also degrades the performance of human action recognition.




Detailed Description of the Embodiments

[0044] The following describes the technical solutions in the embodiments of the present invention clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by persons of ordinary skill in the art based on these embodiments without creative effort fall within the protection scope of the present invention.

[0045] The present invention provides an end-to-end human action recognition method from the perspective of an unmanned aerial vehicle which, as shown in Figure 1 and Figure 2, includes the following steps:

[0046] S101. Construct and train a human action recognition network model; the human action recognition network model includes a feature extraction network (Extractor), a human target detection sub-network (Detector), a multi-target tracking sub-network...
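The shared-backbone design described in S101, one feature extraction network whose output feeds all three sub-networks, can be sketched as a minimal Python mock-up. Everything here is an illustrative assumption (class names, method signatures, dummy boxes and labels), not the patent's actual implementation:

```python
# Hedged sketch of the S101 architecture: one Extractor computes a feature
# map per frame ONCE, and the three sub-networks all consume that shared
# feature map. All names and return values are illustrative placeholders.

class Extractor:
    """Backbone: frame -> shared feature map (computed once per frame)."""
    def __call__(self, frame):
        # Stand-in for a CNN backbone; returns a dummy "feature map".
        return {"frame_id": frame["id"], "features": sum(frame["pixels"])}

class Detector:
    """Predicts person bounding boxes from the shared feature map."""
    def __call__(self, feat):
        return [{"box": (10, 10, 50, 90), "score": 0.9}]  # dummy detection

class Tracker:
    """Associates detections across frames (appearance + box overlap)."""
    def __call__(self, feat, detections):
        return [{**d, "track_id": 0} for d in detections]  # dummy identity

class ActionRecognizer:
    """Aggregates a track's per-frame features into an action label."""
    def __call__(self, feat, tracks):
        return [{**t, "action": "walking"} for t in tracks]  # dummy label

class ActionRecognitionNet:
    """End-to-end pipeline: feature extraction is shared, never repeated."""
    def __init__(self):
        self.extractor = Extractor()
        self.detector = Detector()
        self.tracker = Tracker()
        self.recognizer = ActionRecognizer()

    def forward(self, frame):
        feat = self.extractor(frame)          # computed ONCE per frame
        dets = self.detector(feat)            # head 1: detection
        tracks = self.tracker(feat, dets)     # head 2: tracking
        return self.recognizer(feat, tracks)  # head 3: action recognition
```

The point of the sketch is the data flow: because `feat` is computed once and passed to all three heads, the repeated feature extraction of running three separate networks is avoided, which is the redundancy the abstract claims to eliminate.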


PUM

No PUM available.

Abstract

This application discloses an end-to-end human action recognition method, device, and medium from the perspective of an unmanned aerial vehicle. The method includes: constructing and training a human action recognition network model comprising a feature extraction network, a human target detection sub-network, a multi-target tracking sub-network, and a human action recognition sub-network; inputting the image to be tested into the model, where the feature extraction network extracts a feature map that is shared by the three sub-networks; using the human target detection sub-network to detect targets in the current frame with bounding boxes; using the multi-target tracking sub-network to perform inter-frame multi-target tracking according to each target's appearance feature vector and bounding box; and using the human action recognition sub-network to integrate the motion information of the same target across frames and identify the action type of each target in the current frame. In this way, the three sub-tasks of detection, tracking, and recognition are integrated into a single neural network, avoiding a large amount of repeated feature extraction and redundant computation and achieving real-time behavior recognition.
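The inter-frame association step described in the abstract, matching targets across frames by appearance feature vector plus bounding box, can be sketched as an affinity-plus-greedy-matching routine. The weighting, threshold, and greedy strategy below are assumptions for illustration, not the patent's disclosed method:

```python
import math

def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    union = area(a) + area(b) - inter
    return inter / union if union else 0.0

def cosine(u, v):
    """Cosine similarity between two appearance feature vectors."""
    dot = sum(x * y for x, y in zip(u, v))
    nu = math.sqrt(sum(x * x for x in u))
    nv = math.sqrt(sum(x * x for x in v))
    return dot / (nu * nv) if nu and nv else 0.0

def associate(tracks, detections, w_app=0.5, thresh=0.3):
    """Greedily match tracks to detections on a combined affinity score.

    Each track/detection is a dict with "box" and "feat" keys (assumed
    layout). Returns a list of (track_index, detection_index) matches.
    """
    pairs = []
    for ti, t in enumerate(tracks):
        for di, d in enumerate(detections):
            score = (w_app * cosine(t["feat"], d["feat"])
                     + (1 - w_app) * iou(t["box"], d["box"]))
            pairs.append((score, ti, di))
    pairs.sort(reverse=True)          # best affinity first
    used_t, used_d, matches = set(), set(), []
    for s, ti, di in pairs:
        if s < thresh or ti in used_t or di in used_d:
            continue
        used_t.add(ti)
        used_d.add(di)
        matches.append((ti, di))
    return matches
```

Production trackers typically replace the greedy loop with optimal (Hungarian) assignment; the greedy version is used here only to keep the sketch short and dependency-free.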

Description

Technical Field

[0001] The present invention relates to the technical field of action recognition, and in particular to an end-to-end human action recognition method, device, and medium from the perspective of a drone.

Background

[0002] Human action recognition is a key technology of intelligent monitoring and analysis systems. Combined with advanced drones and high-definition cameras, it can form a long-range cruise warning system and enhance reconnaissance and counter-strike capabilities. Action recognition here is a spatio-temporal action localization task: the location where a person's action occurs must be found in each video frame, and the start and end times of the action must be determined at the same time.

[0003] With improvements in hardware performance and the application of GPU-accelerated computing, methods based on deep learning have achieved great success in the field of computer vision. At present, in order to ensure the accuracy of ...
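The temporal side of the localization task described in the background, determining when each person's action starts and ends given per-frame labels, can be illustrated with a small helper that collapses per-frame (track, action) labels into temporal segments. The function and its data layout are hypothetical, not taken from the patent:

```python
def action_segments(frame_labels):
    """Collapse per-frame action labels into temporal segments.

    frame_labels: list indexed by frame number; each element is a dict
    mapping track_id -> action label for the people present in that frame.
    Returns [(track_id, action, start_frame, end_frame_inclusive), ...].
    """
    open_segs = {}   # track_id -> [current action, start frame]
    segments = []
    for f, labels in enumerate(frame_labels):
        for tid, act in labels.items():
            cur = open_segs.get(tid)
            if cur and cur[0] == act:
                continue                       # same action continues
            if cur:                            # action changed: close old
                segments.append((tid, cur[0], cur[1], f - 1))
            open_segs[tid] = [act, f]          # start a new segment
        # close segments for tracks absent from this frame
        for tid in [t for t in open_segs if t not in labels]:
            act, start = open_segs.pop(tid)
            segments.append((tid, act, start, f - 1))
    # close whatever is still open at the end of the video
    for tid, (act, start) in open_segs.items():
        segments.append((tid, act, start, len(frame_labels) - 1))
    return sorted(segments, key=lambda s: (s[0], s[2]))
```

For example, a track labeled "walk" in frames 0 and 1 and "run" in frame 2 yields the segments (0, "walk", 0, 1) and (0, "run", 2, 2), i.e. both where (via the track) and when each action occurs.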

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06V20/40, G06V40/20, G06V10/46, G06N3/08
CPC: G06N3/08
Inventors: 周斯忠, 郑成俊, 蒋祁
Owner: 杭州晨鹰军泰科技有限公司