Multi-target behavior action recognition and prediction method, electronic equipment and storage medium

An action recognition and prediction method, applied in character and pattern recognition, measurement devices, instruments, etc. It addresses prior-art problems such as the small number of behavior categories that can be recognized or predicted, high cost, and inaccuracy, with the effect of improving both recognition accuracy and the number of behavior categories covered.

Pending Publication Date: 2020-01-14
陈羽旻


Problems solved by technology

[0004] To overcome the deficiencies of the prior art, a first object of the present invention is to provide a multi-target behavior action recognition and prediction method, which solves the prior-art problems that human or animal behaviors can be recognized or predicted only in limited categories, inaccurately, and at high cost.
[0005] A second object of the present invention is to provide an electronic device.

Method used



Examples


Embodiment 1

[0078] Example 1: Scratching an itch

[0079] Mosquito bites, bacteria, trauma, nervous tension, and similar causes produce localized itching of the skin at different positions. When itching occurs, humans and some animals scratch. Recording the time, location, and frequency of this behavior provides valuable input for later analysis of environmental, pharmaceutical, and psychological conditions. This embodiment targets the recognition and prediction of basic behavior actions: their time span is small, and the user's behavior can be read directly from the data detected by the portable monitoring device carried by the person.
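As a hedged illustration of recognizing such a short, repetitive action from a wearable sensor (the patent does not specify a concrete algorithm), the sketch below assumes a single-axis wrist accelerometer stream and uses the rough heuristic that scratching produces a repetitive oscillation in roughly the 2-8 Hz band. The function names, sampling rate, and frequency band are illustrative assumptions, not the invention's method.

```python
import numpy as np

def dominant_frequency(window, fs=50.0):
    """Return the dominant frequency (Hz) of a 1-D accelerometer window."""
    window = window - window.mean()            # remove gravity / DC offset
    spectrum = np.abs(np.fft.rfft(window))
    freqs = np.fft.rfftfreq(len(window), d=1.0 / fs)
    return freqs[np.argmax(spectrum[1:]) + 1]  # skip the DC bin

def looks_like_scratching(window, fs=50.0, lo=2.0, hi=8.0):
    """Heuristic (assumed, not from the patent): scratching shows a
    repetitive oscillation in the lo-hi Hz band."""
    return lo <= dominant_frequency(window, fs) <= hi
```

A real system would combine many such features across sensor axes and feed them to a trained classifier; this sketch only shows why short behaviors can be detected directly from raw device data.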

[0080] With existing technology, the most direct approach is to film the user with a camera and judge by human-pose recognition. However, this requires the user to remain within the camera's field of view, and multiple cameras shooting from different angles are needed in order to accurately obtain the scrat...

Embodiment 2

[0122] Example 2: Using the toilet

[0123] Actions such as washing, urinating, and defecating are extended behavior actions (hereinafter also called long-term behavior actions); a person with sound physical functions repeats them every day. Whether one washes one's hands before meals, and how often one urinates and defecates each day, are closely tied to health. Washing and toileting generally take place in a bathroom and usually last a long time, from tens of seconds to tens of minutes. It is difficult to recognize these actions directly from sensor data, or even from some extracted feature data: first, they have no fixed time range; second, their time span is large and the raw data volume is huge. Recording the time, location, and frequency of such behavior greatly helps later analysis of diet, exercise, and health status.
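One common way to tame the "huge raw data" problem described above (a generic technique, not necessarily the patent's) is to summarize the long sensor stream into per-window feature vectors with a sliding window, so that minutes of raw samples become a short sequence a model can consume. The window and hop lengths below are illustrative assumptions.

```python
import numpy as np

def window_features(signal, fs=50.0, win_s=10.0, hop_s=5.0):
    """Summarize a long 1-D sensor stream into per-window features
    (mean, std, mean energy), reducing a long recording to a short
    feature sequence. win_s/hop_s are window and hop lengths in seconds."""
    win = int(win_s * fs)
    hop = int(hop_s * fs)
    feats = []
    for start in range(0, len(signal) - win + 1, hop):
        w = signal[start:start + win]
        feats.append((w.mean(), w.std(), float(np.sum(w ** 2) / win)))
    return np.array(feats)
```

A one-minute recording at 50 Hz (3000 samples) collapses to eleven 3-dimensional feature vectors, which is what makes long-duration behaviors tractable for recognition.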

[...

Embodiment 3

[0200] The present invention also provides an electronic device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor; when the processor executes the program, it implements the steps of the multi-target behavior action recognition and prediction method described herein.



Abstract

The invention discloses a multi-target behavior action recognition and prediction method. The method comprises the following steps: firstly, establishing a behavior action model according to characteristic data information of an individual or an individual animal, sensor data collected by various portable monitoring devices carried by the individual or the individual animal and behavior action information of the individual or the individual animal; processing the sensor data or the action flow data of the individual or the individual animal in a preset time period to generate a data fragment graph; and calculating the data fragment graph by using the corresponding behavior action model to realize recognition of behavior actions of individuals or individual animals within a preset time and/or prediction of behavior actions after a preset time period. The method can achieve the recognition or prediction of various types of behavior actions, and provides an intelligent recognition, prediction and recording method for daily life behavior activities for individuals or individual animals. The invention further discloses electronic equipment and a storage medium.
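The three steps in the abstract (build a behavior action model, turn sensor data into a data fragment representation, then compute against the model to recognize or predict) can be sketched in miniature. This is a hedged approximation: the patent's "data fragment graph" is stood in for by a plain per-fragment feature vector, and the "behavior action model" by per-class feature centroids, which is far simpler than whatever model the invention actually uses.

```python
import numpy as np

def fit_centroids(fragments, labels):
    """Step 1 (approximation): build a 'behavior action model' as the
    mean feature vector (centroid) of each labelled behavior class."""
    classes = sorted(set(labels))
    return {c: np.mean([f for f, l in zip(fragments, labels) if l == c], axis=0)
            for c in classes}

def recognize(model, fragment):
    """Step 3 (approximation): recognize a new data fragment by
    assigning it to the nearest class centroid."""
    return min(model, key=lambda c: np.linalg.norm(fragment - model[c]))
```

Step 2 (producing the fragments themselves) would be the windowing/feature-extraction stage; prediction of future behavior would additionally condition on the time and sequence of recognized fragments.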

Description

Technical Field

[0001] The present invention relates to behavior recognition and prediction, and in particular to a multi-target behavior action recognition and prediction method, electronic equipment, and a storage medium.

Background Technique

[0002] As people's requirements for quality of life rise, it becomes necessary to identify, monitor, and analyze daily activities, diet, exercise, and other behaviors in order to improve quality of life and change bad habits. Existing behavior recognition generally uses image recognition: a camera photographs a person's actions and posture, and the captured image or video is processed to identify the user's current behavior and posture. This method requires third-party equipment such as cameras. In actual use, besides increasing equipment cost, cameras are not allowed to be installed in some are...

Claims


Application Information

IPC(8): G06K9/62, G06K9/00, G06K9/46, G01D21/02
CPC: G01D21/02, G06V40/20, G06V10/40, G06F18/241, G06F18/214
Inventor 陈羽旻
Owner 陈羽旻