
A vision-based user specific behavior analysis method and a self-service medical terminal

A behavior-analysis and self-service medical technology, applied in the fields of instruments, character and pattern recognition, and computer components, which addresses the problem that conventional surveillance cannot record emergencies in real time or collect evidence automatically.

Inactive Publication Date: 2019-05-10
深圳前海默比优斯科技有限公司

AI Technical Summary

Problems solved by technology

When emergencies occur, existing surveillance systems serve almost exclusively as recording tools for after-the-fact evidence collection; they cannot record emergencies in real time or collect evidence automatically.



Examples


Embodiments

[0076] Step 1: Collect video and obtain the coordinates of the human body joints.

[0077] A convolutional neural network (CNN) is used to obtain the position coordinates of the human body joints in each frame of the video. Let the coordinates of the j-th joint in the i-th frame be p_ij = (x_ij, y_ij).
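The sketch below illustrates this step under stated assumptions: the source does not name a specific pose network, so estimate_joints is a hypothetical placeholder for the CNN described in [0077], and OpenCV is assumed only for reading the video file.

```python
# Minimal sketch of Step 1: build the N x K x 2 array of joint coordinates p_ij.
# `estimate_joints` is a hypothetical placeholder for the 2D pose CNN; any model
# returning K (x, y) pairs per frame could stand in for it.
import cv2
import numpy as np

def estimate_joints(frame_bgr):
    """Placeholder for the CNN pose estimator of [0077].
    Expected to return an array of shape (K, 2) with (x, y) per joint."""
    raise NotImplementedError("plug in a 2D human-pose CNN here")

def collect_joint_coordinates(video_path):
    cap = cv2.VideoCapture(video_path)
    coords = []  # p_ij for every frame i and joint j
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        coords.append(estimate_joints(frame))  # shape (K, 2)
    cap.release()
    return np.asarray(coords)  # shape (N, K, 2): N frames, K joints
```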

[0078] Step 2: Extract the pose features of the human skeleton and calculate ten pose spatiotemporal features for the sample; the method is not limited to the ten spatiotemporal features described below.

[0079] Feature 1: Matrix of joint position vectors relative to the center of mass of the human body

[0080] Define the center of gravity of all relevant nodes as the center of mass of the human body, and the coordinates of the center of mass are:

[0081]

[0082] That is, p_ic = (x_ic, y_ic). The feature matrix of the joint coordinate trajectory is T = (t_ij)_{N×K}, where t_ij = p_ij − p_ic = (x_ij − x_ic, y_ij − y_ic). Figure 3 is a schematic diagram of the coordinates of the center of mass of...
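A short sketch of Feature 1, assuming the N x K x 2 coordinate array produced in Step 1. The centroid formula in [0081] is not reproduced in the source, so the plain mean of the K joint positions is assumed here; a weighted center of gravity is equally consistent with [0080].

```python
# Sketch of Feature 1 ([0079]-[0082]): relative-position matrix T.
# Assumption: p_ic is taken as the unweighted mean of the joints in frame i.
import numpy as np

def relative_position_feature(coords):
    """coords: array of shape (N, K, 2) holding p_ij = (x_ij, y_ij).
    Returns T of shape (N, K, 2) with t_ij = p_ij - p_ic."""
    centroid = coords.mean(axis=1, keepdims=True)  # p_ic, shape (N, 1, 2)
    return coords - centroid                        # t_ij = p_ij - p_ic
```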



Abstract

The invention discloses a vision-based user-specific behavior analysis method and a self-service medical terminal. Video images of a plurality of human body posture samples are sampled, the corresponding comparison feature data for each human body posture is obtained through calculation and learning, and the comparison feature data is stored in a comparison feature database. The method comprises the following steps: calculating the user spatiotemporal feature data of a human body posture in an image; comparing the data with the data in the comparison feature database and searching for the comparison feature data with the minimum difference from the user spatiotemporal feature data; and when the difference between the user spatiotemporal feature data and the found comparison feature data is smaller than a preset threshold, judging that the current user exhibits the human body posture corresponding to that comparison feature data. Specific user behaviors are taken as prior knowledge through training, and user behavior analysis is realized by comparison against this prior knowledge. With this system, monitoring is more intelligent, user interaction is more user-friendly, and the development of human-machine interaction is promoted.
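The matching step described above can be pictured with the following sketch. The difference measure is not fixed by the abstract, so Euclidean distance is assumed, and the database layout (a label-to-reference-vector mapping) is an illustrative choice.

```python
# Minimal sketch of the comparison step: find the closest reference feature
# vector in the comparison feature database and accept it only if the
# difference is below the preset threshold.
import numpy as np

def classify_posture(user_features, database, threshold):
    """database: dict mapping posture label -> reference feature vector."""
    best_label, best_dist = None, float("inf")
    for label, ref in database.items():
        dist = np.linalg.norm(user_features - ref)  # difference value (assumed Euclidean)
        if dist < best_dist:
            best_label, best_dist = label, dist
    # accept only when the smallest difference is under the preset threshold
    return best_label if best_dist < threshold else None
```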

Description

Technical Field

[0001] The invention relates to the field of video surveillance, and in particular to a vision-based user-specific behavior analysis method and a self-service medical terminal.

Background Art

[0002] Today's society is developing rapidly. At the same time, emergencies are becoming more frequent, and the difficulty and importance of monitoring public places are becoming increasingly prominent. Most current video surveillance systems only detect and track moving objects in the scene and rarely go further to identify and understand those objects. When emergencies occur, such systems serve almost exclusively as recording tools for after-the-fact evidence collection and cannot record emergencies in real time or collect evidence automatically. Therefore, detecting and understanding the specific behavior of a moving target in real time is very important in a video surveillance system.

[0003] Specific behavior detection and gesture recognition...


Application Information

IPC(8): G06K9/00; G06K9/46; G06K9/62
Inventor: 韦韧, 黄伟, 张浩杰, 郁建生, 周伟, 徐志宏, 张志勇, 郝云来, 李嘉诚
Owner 深圳前海默比优斯科技有限公司