
Action recognition method and its neural network generation method, device and electronic equipment

A neural-network action recognition technology in the field of image recognition, which addresses problems such as the poor action-recognition performance of prior-art networks and achieves improved stability and accuracy.

Active Publication Date: 2022-03-29
BEIJING KUANGSHI TECH CO LTD


Problems solved by technology

[0005] In view of this, the object of the present invention is to provide an action recognition method, a neural network generation method for it, a device, and electronic equipment, so as to solve the prior-art technical problem that image recognition neural networks perform poorly at action recognition.



Examples


Embodiment 1

[0059] An embodiment of the present invention provides a neural network generation method for action recognition, namely a method for generating a deformable-convolution-kernel neural network that fuses optical flow information. As shown in Figure 1, the method includes:

[0060] S11: Extract the target image to obtain optical flow features.

[0061] The target image may be a dynamic video or a static picture acquired by an image acquisition device such as an ordinary camera or a depth camera. In this embodiment, optical flow feature information is first extracted from the target image input to the image recognition neural network, so as to obtain the optical flow features.

[0062] It should be noted that optical flow is the instantaneous velocity of the pixel motion of a spatially moving object on the observation imaging plane. Simply put, optical flow arises from the movement of foreground objects in the scene, the movement of the ca...
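The patent itself contains no code. As an illustrative sketch only, the classical Lucas-Kanade least-squares formulation below estimates a single flow vector for a patch from two consecutive grayscale frames; the function name and the toy ramp images are assumptions for demonstration, not part of the patent's method.

```python
import numpy as np

def lucas_kanade_flow(prev, curr):
    """Estimate one (u, v) optical-flow vector for a patch using the
    Lucas-Kanade least-squares formulation. `prev` and `curr` are
    same-shape float grayscale patches."""
    # Spatial gradients of the first frame and the temporal gradient
    Ix = np.gradient(prev, axis=1)
    Iy = np.gradient(prev, axis=0)
    It = curr - prev
    # Solve [Ix Iy] [u v]^T = -It in the least-squares sense
    A = np.stack([Ix.ravel(), Iy.ravel()], axis=1)
    b = -It.ravel()
    flow, *_ = np.linalg.lstsq(A, b, rcond=None)
    return flow  # (u, v)

# Toy example: a horizontal intensity ramp shifted one pixel to the right
x = np.arange(16, dtype=float)
prev = np.tile(x, (16, 1))
curr = np.tile(np.concatenate([[0.0], x[:-1]]), (16, 1))
u, v = lucas_kanade_flow(prev, curr)  # u is close to 1, v is 0
```

Production systems would typically use a dense, pyramidal flow estimator rather than this single-patch solve; the point here is only the brightness-constancy least-squares step.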

Embodiment 2

[0071] An embodiment of the present invention provides a neural network generation method for action recognition, namely a method for generating a deformable-convolution-kernel neural network that fuses optical flow information. As shown in Figure 2, the method includes:

[0072] S21: Extract the target image to obtain optical flow information.

[0073] In this embodiment, the optical flow information is first extracted from the target image input to the image recognition neural network. The optical flow information expresses changes in the image; since optical flow contains information about the target's motion, it can be used by an observer to determine how the target is moving.

[0074] Preferably, the target image may be processed by an optical flow method to obtain the optical flow information. It should be noted that the optical flow method uses the changes of pixels in the image sequence in the time d...
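The embodiment does not specify how raw optical flow information is condensed into features. As one illustrative possibility (an assumption, not the patent's descriptor), a dense flow field can be summarized into a fixed-length feature vector via a magnitude-weighted orientation histogram:

```python
import numpy as np

def flow_histogram_feature(flow, bins=8):
    """Summarize a dense flow field of shape (H, W, 2) into a
    magnitude-weighted orientation histogram -- one simple way to turn
    raw optical-flow information into a fixed-length feature vector.
    Illustrative only; the patent does not specify this descriptor."""
    u, v = flow[..., 0], flow[..., 1]
    mag = np.hypot(u, v)                    # per-pixel flow magnitude
    ang = np.arctan2(v, u) % (2 * np.pi)    # flow direction in [0, 2*pi)
    hist, _ = np.histogram(ang, bins=bins, range=(0, 2 * np.pi), weights=mag)
    total = hist.sum()
    return hist / total if total > 0 else hist

# Uniform rightward motion: all weight lands in the first orientation bin
flow = np.zeros((4, 4, 2))
flow[..., 0] = 1.0
feat = flow_histogram_feature(flow)
```

A learned feature extractor (e.g. a small convolutional sub-network over the flow field) would be the more likely choice in the patent's setting; the histogram merely makes the "information to feature" step concrete.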

Embodiment 3

[0091] This embodiment provides an application example based on the above neural network generation method for action recognition. In one implementation, the initial convolutional neural network is a two-dimensional convolutional neural network.

[0092] Preferably, the action recognition method of the two-dimensional deformable-convolution-kernel neural network may include: first, extracting the target image to obtain optical flow information; then extracting the optical flow information to obtain optical flow features; then generating the feature vector; then obtaining the spatial-dimension offset vector and the time-dimension offset vector from the feature vector, based on the two-dimensional convolution kernel in the two-dimensional convolutional neural network; after that, spatially offsetting the two-dimensional convolution kernel in the two-dimensional convolutional neural network according to the spatial-dimension offset vector, to obtain the two-d...
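The core of the spatial offsetting step is that each kernel tap samples the feature map at its regular grid position plus a learned offset, with bilinear interpolation for fractional positions. The sketch below illustrates that mechanism for a single 3x3 kernel at one location; all function names are hypothetical, and a real implementation would use a batched deformable-convolution operator rather than this scalar loop.

```python
import numpy as np

def bilinear_sample(img, y, x):
    """Bilinearly sample img at fractional (y, x), clamping to borders."""
    H, W = img.shape
    y = float(np.clip(y, 0, H - 1)); x = float(np.clip(x, 0, W - 1))
    y0, x0 = int(np.floor(y)), int(np.floor(x))
    y1, x1 = min(y0 + 1, H - 1), min(x0 + 1, W - 1)
    wy, wx = y - y0, x - x0
    return ((1 - wy) * (1 - wx) * img[y0, x0] + (1 - wy) * wx * img[y0, x1]
            + wy * (1 - wx) * img[y1, x0] + wy * wx * img[y1, x1])

def deformable_conv_at(img, weights, cy, cx, offsets):
    """One 3x3 deformable-convolution step centred at (cy, cx): each
    kernel tap k samples at its regular grid location plus a learned
    offset (dy, dx), i.e. the spatial deformation described in the
    embodiment. `offsets` has shape (9, 2)."""
    grid = ((-1, -1), (-1, 0), (-1, 1),
            (0, -1),  (0, 0),  (0, 1),
            (1, -1),  (1, 0),  (1, 1))
    out = 0.0
    for k, (dy, dx) in enumerate(grid):
        oy, ox = offsets[k]
        out += weights[k] * bilinear_sample(img, cy + dy + oy, cx + dx + ox)
    return out

img = np.arange(25, dtype=float).reshape(5, 5)
w = np.zeros(9); w[4] = 1.0                      # identity kernel (centre tap)
zero = np.zeros((9, 2))
shift = np.zeros((9, 2)); shift[4] = (0.0, 0.5)  # half-pixel rightward offset
a = deformable_conv_at(img, w, 2, 2, zero)       # samples img[2, 2] exactly
b = deformable_conv_at(img, w, 2, 2, shift)      # interpolates halfway to img[2, 3]
```

In the embodiment, the offsets would be predicted per position from the optical-flow-derived feature vector (with a separate time-dimension offset when frames are stacked) rather than set by hand as here.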



Abstract

The invention provides an action recognition method, a neural network generation method for it, a device, and electronic equipment, relating to the technical field of image recognition. The neural network generation method for action recognition includes: extracting a target image to obtain optical flow features; obtaining convolution kernel offset information from the optical flow features; and generating a deformed convolutional neural network from an initial convolutional neural network according to the convolution kernel offset information. This solves the prior-art technical problem that image recognition neural networks are less effective for action recognition.

Description

Technical field

[0001] The present invention relates to the technical field of image recognition, in particular to an action recognition method and its neural network generation method, device and electronic equipment.

Background technique

[0002] At present, action recognition, as an important basis for automatic video analysis, plays an important role in a series of application scenarios such as intelligent monitoring, new retail, human-computer interaction, and education.

[0003] For example, in security monitoring scenarios, reliably identifying abnormal behaviors such as pickpocketing, lock picking, and fighting can help reduce manual monitoring costs and maintain public security; in the new retail field, action recognition can help better understand user behavior, automatically analyze customer preferences, and improve user experience.

[0004] However, current neural networks for action recognition mainly focus...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06N3/04, G06V40/20, G06V10/82
CPC: G06V40/20, G06N3/045
Inventors: 张弛, 吴骞
Owner: BEIJING KUANGSHI TECH CO LTD