Inter-frame difference-based violent behavior detection method, system and device, and medium

A technology based on inter-frame difference for behavior detection, applied to neural learning methods, instruments, biological neural network models, etc. It addresses the problem that certain features are difficult to extract, and achieves the effect of avoiding the fatigue and negligence of manual monitoring.

Active Publication Date: 2019-11-26
SHANDONG NORMAL UNIV
17 Cites · 10 Cited by

AI Technical Summary

Problems solved by technology

Existing local and global methods are subjectively designed for specific tasks, while multi-feature-based violent behavior detection needs to extract features such as sound and texture, which are difficult to extract in hospitals, schools and other places



Examples


Embodiment 1

[0035] Embodiment 1: this embodiment provides a violent behavior detection method based on inter-frame difference;

[0036] As shown in Figure 1, the violent behavior detection method based on inter-frame difference includes:

[0037] All frame images of the video to be detected are input into the pre-trained first convolutional neural network, and the appearance features of each frame image are output;

[0038] Use the inter-frame difference method to process the video to be detected and extract several difference frame images; input each difference frame image into the pre-trained second convolutional neural network, and output the action features of each difference frame image;
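The inter-frame difference step described above can be sketched as follows. This is a minimal illustration, not the patent's exact procedure: the grayscale input, the fixed per-pixel threshold, and the function name are all assumptions for demonstration.

```python
import numpy as np

def difference_frames(frames, threshold=25):
    """Compute binary inter-frame difference images from a grayscale video.

    frames: array of shape (T, H, W), pixel values 0..255.
    Returns (T-1, H, W) binary maps marking pixels whose temporal
    change exceeds `threshold` (a hypothetical fixed value).
    """
    frames = frames.astype(np.int16)          # avoid uint8 wrap-around on subtraction
    diffs = np.abs(frames[1:] - frames[:-1])  # per-pixel difference between adjacent frames
    return (diffs > threshold).astype(np.uint8)

# toy example: a bright square that shifts one pixel, then holds still
video = np.zeros((3, 8, 8), dtype=np.uint8)
video[0, 2:4, 2:4] = 255
video[1, 3:5, 3:5] = 255
video[2, 3:5, 3:5] = 255

masks = difference_frames(video)
print(masks.shape)        # (2, 8, 8)
print(int(masks[1].sum()))  # 0 — no motion between the last two frames
```

In the patent's pipeline, each such difference image would then be fed to the second convolutional neural network to extract action features.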

[0039] Input the appearance features of each frame image into the pre-trained first classifier, and output the first classification label of the current frame image;

[0040] Input the action feature of each differential frame image into the pre-trained second classifier, and output the s...

Embodiment 2

[0087] Embodiment 2: this embodiment provides a violent behavior detection system based on inter-frame difference;

[0088] A violent behavior detection system based on inter-frame difference, comprising:

[0089] The appearance feature extraction module is configured to: input all frame images of the video to be detected into the pre-trained first convolutional neural network, and output the appearance feature of each frame image;

[0090] The action feature extraction module is configured to: use the inter-frame difference method to process the video to be detected and extract several difference frame images; input each difference frame image into the pre-trained second convolutional neural network, and output the action features of each difference frame image;

[0091] The first classification module is configured to: input the appearance feature of each frame image into a pre-trained first classifier, and output the first classification label of the current frame image;

[0092] The ...


PUM

No PUM

Abstract

The invention discloses an inter-frame difference-based violent behavior detection method, system, device, and medium. The method comprises the steps of: inputting all frame images of a to-be-detected video into a first convolutional neural network and outputting the appearance features of each frame image; processing the to-be-detected video with an inter-frame difference method and extracting a plurality of difference frame images; inputting each difference frame image into a second convolutional neural network and outputting the action features of each difference frame image; inputting the appearance features of each frame image into a first classifier and outputting a first classification label for the current frame image; inputting the action features of each difference frame image into a second classifier and outputting a second classification label for the current difference frame image; fusing the first classification label and the second classification label and outputting a violent behavior detection result for the current frame image; and, when the number of frames containing violent behavior exceeds a set threshold, concluding that violent behavior exists in the to-be-detected video.
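The abstract describes a two-stream decision: per-frame fusion of the two classification labels, followed by a video-level threshold on the count of violent frames. A minimal sketch of that decision logic is below; the AND fusion rule, the function names, and the threshold value are assumptions for illustration (the patent's actual fusion strategy is not given in this excerpt).

```python
def fuse_labels(appearance_label, action_label):
    """Fuse the two per-frame labels (1 = violent, 0 = non-violent).

    The AND rule here is an assumption: flag the frame only when the
    appearance stream and the action stream both agree it is violent.
    """
    return int(appearance_label and action_label)

def video_is_violent(appearance_labels, action_labels, frame_threshold=3):
    """Video-level decision: violent if the number of fused violent
    frames exceeds a set threshold, as the abstract describes."""
    violent_frames = sum(
        fuse_labels(a, m) for a, m in zip(appearance_labels, action_labels)
    )
    return violent_frames > frame_threshold

# usage: 4 of 6 frames are fused as violent; with threshold 3, 4 > 3
appearance = [1, 1, 1, 1, 0, 1]
action     = [1, 1, 1, 1, 1, 0]
print(video_is_violent(appearance, action))   # True
```

Raising `frame_threshold` trades sensitivity for robustness to isolated misclassified frames, which matches the abstract's frame-count criterion for the final video-level verdict.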

Description

technical field

[0001] The present disclosure relates to the technical field of violent behavior detection, and in particular to a method, system, device, and medium for violent behavior detection based on inter-frame difference.

Background technique

[0002] The statements in this section merely mention background art related to the present disclosure and do not necessarily constitute prior art.

[0003] In the process of realizing the present disclosure, the inventors found the following technical problems in the prior art:

[0004] Human behavior recognition based on surveillance video has long attracted the attention of researchers at home and abroad, not only because video-based human behavior recognition has great practical significance in fields such as human-computer interaction, security monitoring, and medical diagnosis, but also because the wide application of behavior recognition in many fields makes it h...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00, G06N3/04, G06N3/08
CPC: G06N3/08, G06V20/46, G06V20/41, G06N3/045
Inventor: 吕蕾, 陈梓铭
Owner: SHANDONG NORMAL UNIV