Tumble detection method based on video articulation points and hybrid classifier

A fall detection technology based on video joint points and a hybrid classifier, applied in the field of fall detection. It addresses problems such as overly simple models with low accuracy and overly complex models whose detection time is hard to reduce, achieving reduced detection time and high accuracy.

Active Publication Date: 2019-12-03
HANGZHOU DIANZI UNIV

AI Technical Summary

Problems solved by technology

However, traditional video-based fall detection algorithms rely on manually extracted fall features and detect falls with a linear discriminant classifier; the model is simple but the accuracy is low. Existing deep-learning-based detection algorithms use complex models, and it is difficult to reduce their detection time.



Examples


Embodiment 2

[0087] As shown in Figure 3, a fall detection method based on video joint points and a hybrid classifier differs from Embodiment 1 as follows:

[0088] In step 2, instead of performing Gaussian denoising processing on each frame image of the detected video segment, grayscale processing is performed.
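As a minimal illustration of this modified step 2, the snippet below reads a detected video clip frame by frame and applies grayscale conversion in place of Gaussian denoising. The use of OpenCV and the function name are assumptions for illustration only, not details given in the patent.

import cv2

def preprocess_frames(video_path):
    """Read a detected video clip and grayscale each frame (modified step 2)."""
    frames = []
    cap = cv2.VideoCapture(video_path)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # Grayscale processing replaces the Gaussian denoising of Embodiment 1.
        frames.append(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))
    cap.release()
    return frames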

[0089] In step 6, the secondary classifier differs from that of Embodiment 1. The secondary classifier in this embodiment is a multi-scale convolutional neural network (MultiCNN), comprising a convolutional layer, a pooling layer, and three fully connected layers connected sequentially through the ReLU activation function. In the convolution operation of the convolutional neural network, the padding parameter is set to 'valid'. Four convolution kernels with scales of 3×3, 5×5, 7×7, and 9×9 are set in the convolutional layer; the pooling layer uses four pooling operators of sizes 8×1, 6×1, 4×1, and 2×1; the 3×3, 5×5, 7×7, and 9×9 convolution kernels and...
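The embodiment text above is truncated, so the following is only a minimal sketch of how such a multi-scale network could be assembled in Python/PyTorch. The channel count (16), the fully connected layer widths (128, 64), the number of classes, and the behavior-matrix input shape (window × joints) are assumptions not stated in the patent; only the four kernel scales, the 'valid' padding, the four pooling operators, and the three fully connected layers come from the source.

import torch
import torch.nn as nn

class MultiCNN(nn.Module):
    def __init__(self, in_channels=1, num_classes=2, window=32, joints=36):
        super().__init__()
        kernel_sizes = [3, 5, 7, 9]                    # the four scales named in [0089]
        pool_sizes = [(8, 1), (6, 1), (4, 1), (2, 1)]  # the four pooling operators
        self.branches = nn.ModuleList(
            nn.Sequential(
                nn.Conv2d(in_channels, 16, kernel_size=k, padding=0),  # padding='valid'
                nn.ReLU(),
                nn.MaxPool2d(kernel_size=p),
            )
            for k, p in zip(kernel_sizes, pool_sizes)
        )
        # Infer the concatenated feature size from a dummy behavior matrix.
        with torch.no_grad():
            dummy = torch.zeros(1, in_channels, window, joints)
            feat_dim = sum(b(dummy).flatten(1).shape[1] for b in self.branches)
        # Three fully connected layers, as stated in the embodiment.
        self.fc = nn.Sequential(
            nn.Linear(feat_dim, 128), nn.ReLU(),
            nn.Linear(128, 64), nn.ReLU(),
            nn.Linear(64, num_classes),
        )

    def forward(self, x):
        # x: (batch, 1, window, joints) behavior matrix from step 3
        feats = [b(x).flatten(1) for b in self.branches]
        return self.fc(torch.cat(feats, dim=1))

# Usage: classify one 32-frame behavior matrix of 36 joint coordinates.
model = MultiCNN()
out = model(torch.randn(1, 1, 32, 36))
print(out.shape)  # torch.Size([1, 2])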



Abstract

The invention discloses a tumble (fall) detection method based on video joint points and a hybrid classifier. Traditional video-based fall detection algorithms depend on manually extracted fall features and detect falls by means of a linear discriminant classifier; the model is simple but the accuracy is low. The method comprises the following steps: 1, extracting each frame of image of a detected video clip; 2, acquiring a human joint data matrix; 3, establishing a plurality of behavior matrices; 4, calculating time characteristic parameters and space characteristic parameters; 5, carrying out primary classification; 6, carrying out secondary classification. By extracting human skeleton joint points, the method overcomes the problem that human posture cannot be accurately estimated from features such as the human aspect ratio and projection area extracted by traditional methods. The behavior matrix is constructed with a fixed-size sliding window, so modeling can be performed on the time and space axes simultaneously, and fall behavior characteristics are fully expressed.
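As an illustration of steps 2 and 3, the sketch below builds behavior matrices from per-frame joint coordinates with a fixed-size sliding window. The window length, stride, and joint count are assumptions; the abstract states only that a fixed-size sliding window is used so each matrix spans the time and space axes simultaneously.

import numpy as np

def build_behavior_matrices(joint_frames, window=32, stride=1):
    """joint_frames: (num_frames, num_joints * 2) array of per-frame (x, y)
    joint coordinates (the joint data matrix from step 2). Returns an array
    of shape (num_windows, window, num_joints * 2): rows are time, columns
    are space, one behavior matrix per sliding-window position."""
    matrices = [joint_frames[s:s + window]
                for s in range(0, len(joint_frames) - window + 1, stride)]
    return np.stack(matrices)

# Usage: 100 frames of 18 joints tracked as (x, y) pairs.
frames = np.random.rand(100, 18 * 2)
behavior = build_behavior_matrices(frames)
print(behavior.shape)  # (69, 32, 36)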

Description

Technical Field

[0001] The invention belongs to the technical field of fall detection, and in particular relates to a fall detection method based on video joint points and a hybrid classifier.

Background Technique

[0002] Scholars at home and abroad have done a great deal of research on falls among the elderly. There are three mainstream fall detection approaches: fall detection based on wearable sensors, fall detection based on environmental sensors, and detection based on video images. Fall detection based on wearable sensors mainly sets thresholds on the data collected by the sensors to detect falls, and inaccurate threshold settings affect the final detection results. Fall detection based on environmental sensors mainly judges falls from data obtained by pressure sensors on the ground or by sound detection equipment; if the environmental noise is too large, the data may be abnormal. Video-based fall detection mainly uses ordinary cameras or depth cameras...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00; G06K9/62; G06N3/04; G06N3/08
CPC: G06N3/08; G06V40/20; G06V20/41; G06N3/045; G06F18/2411
Inventors: 蔡文郁, 郑雪晨, 郭嘉豪
Owner: HANGZHOU DIANZI UNIV