Patient behavior multi-modal perception and analysis system based on deep learning

An analysis system based on deep learning technology, applied in the field of multi-modal perception and analysis of patient behavior. It addresses problems such as the difficulty of expressing temporal cues, gradient problems, and the loss of texture information, and achieves the effects of accurately perceiving patient needs, improving the efficiency and level of diagnosis and treatment, and improving prediction accuracy.

Active Publication Date: 2020-11-10
FUDAN UNIV

AI Technical Summary

Problems solved by technology

[0004] In complex medical scenarios such as the emergency department, ICU, nursing wards, isolation wards or metabolic chambers, the drawback of traditional deep-learning-based multi-dimensional perception algorithms for patient behavior is that they cannot effectively perceive fine, fine-grained patient behavior, so compliance with prescribed behavior cannot be accurately judged. At the same time, most hospitals and medical data centers are still at the stage of manual sample collection plus automated single-modality analysis in their research on patient behavior; although some institutions have launched perceptual analysis of multi-modal data, the lack of compatibility processing and joint consideration of the multi-modal data greatly restricts data analysis and research on patient behavior and subsequent medical outcomes.
[0005] Most of the existing deep learning methods are applied to image information processin...

Method used



Examples


Embodiment 1

[0043] As shown in Figure 1, this embodiment provides a deep-learning-based multi-modal perception and analysis system for patient behavior, comprising a data acquisition unit, a patient body posture recognition unit, a patient physiological signal recognition unit, a patient image information recognition unit, a patient voice information recognition unit, a deep fusion unit and a display module. The data acquisition unit is used to acquire multi-modal patient data and is connected to the patient body posture recognition unit, the patient physiological signal recognition unit, the patient image information recognition unit and the patient voice information recognition unit, respectively. The deep fusion unit is connected to the patient body posture recognition unit, the patient physiological signal recognition unit, the patient image information recognition unit, the patient voice information recognition unit and the display module, respectively.
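To make the connections described in paragraph [0043] concrete, here is a minimal structural sketch in Python. All class and method names (DataAcquisitionUnit, Recognizer, DeepFusionUnit, DisplayModule and their methods) are hypothetical illustrations of the described wiring, not the patented implementation.

```python
# Hypothetical sketch of the unit wiring in [0043]: one acquisition unit feeds
# four modality-specific recognition units, whose outputs meet in a deep fusion
# unit that drives the display module. Names and data shapes are illustrative.

class DataAcquisitionUnit:
    """Collects raw multi-modal patient data (posture, physiology, image, voice)."""
    def acquire(self) -> dict:
        # A real system would read from cameras, wearable sensors, microphones, etc.
        return {"posture": [0.1, 0.2], "physiology": [72, 98], "image": [[0]], "voice": [0.0]}


class Recognizer:
    """Stand-in for one of the four modality-specific recognition units."""
    def __init__(self, modality: str):
        self.modality = modality

    def recognize(self, data: dict) -> dict:
        # Placeholder for modality-specific preprocessing and feature extraction.
        return {"modality": self.modality, "features": data[self.modality]}


class DeepFusionUnit:
    """Fuses the outputs of all recognition units into one behavior assessment."""
    def fuse(self, recognized: list) -> dict:
        return {m["modality"]: m["features"] for m in recognized}


class DisplayModule:
    """Presents the fused result."""
    def show(self, result: dict) -> None:
        print("fused patient-behavior result:", result)


# Wiring: acquisition -> four recognizers -> deep fusion -> display.
acquisition = DataAcquisitionUnit()
recognizers = [Recognizer(m) for m in ("posture", "physiology", "image", "voice")]
fusion = DeepFusionUnit()
display = DisplayModule()

raw = acquisition.acquire()
display.show(fusion.fuse([r.recognize(raw) for r in recognizers]))
```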

[0044] ...


PUM

No PUM data available.

Abstract

The invention relates to a deep-learning-based multi-modal perception and analysis system for patient behavior. The system comprises a data acquisition unit, a patient body posture recognition unit, a patient physiological signal recognition unit, a patient image information recognition unit, a patient voice information recognition unit and a deep fusion unit. Preprocessing, region-of-interest extraction and diagnosis are performed on the collected multi-modal data, such as the patients' postures, physiological signals, images and voice. The deep fusion unit adopts a network structure that fuses multi-modal two-dimensional and three-dimensional features: a preliminary segmentation result is obtained through a 2D deep learning network, and a patient behavior detection result is then obtained from the preliminary segmentation result through a 3D deep learning network. Compared with the prior art, the invention evaluates patient behavior more accurately, locates the focus precisely, remarkably improves the accuracy of predicting patients' pathological trends, and provides a solid foundation for implementing scientific intervention in patient behavior and intelligent optimization of the medical process.
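The two-stage fusion described above, in which a 2D deep learning network yields a preliminary segmentation and a 3D deep learning network turns the stacked segmentations into a behavior detection result, can be sketched as follows. This is a hedged illustration in PyTorch: the toy layer sizes, the single-channel masks and the five behavior classes are assumptions chosen only to show the data flow, not the patented architecture.

```python
# Illustrative 2D-then-3D pipeline: per-frame segmentation masks from a 2D CNN
# are stacked along time and classified by a small 3D CNN. All sizes are toy
# assumptions chosen only to show the data flow.
import torch
import torch.nn as nn


class Simple2DSegmenter(nn.Module):
    """Stage 1: per-frame preliminary segmentation (one soft foreground mask)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, kernel_size=1), nn.Sigmoid(),
        )

    def forward(self, frames):            # frames: (B*T, 3, H, W)
        return self.net(frames)           # masks:  (B*T, 1, H, W)


class Simple3DDetector(nn.Module):
    """Stage 2: behavior detection over the stack of preliminary segmentations."""
    def __init__(self, num_behaviors: int = 5):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv3d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool3d(1), nn.Flatten(),
            nn.Linear(8, num_behaviors),
        )

    def forward(self, masks):             # masks: (B, 1, T, H, W)
        return self.net(masks)            # logits: (B, num_behaviors)


B, T, H, W = 2, 8, 64, 64                 # batch, frames, height, width
video = torch.randn(B, T, 3, H, W)

seg2d, det3d = Simple2DSegmenter(), Simple3DDetector()
masks = seg2d(video.reshape(B * T, 3, H, W))                 # 2D preliminary segmentation
masks = masks.reshape(B, T, 1, H, W).permute(0, 2, 1, 3, 4)  # stack along time
logits = det3d(masks)                                        # 3D behavior detection
print(logits.shape)                                          # torch.Size([2, 5])
```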

Description

Technical field

[0001] The invention relates to the field of patient behavior analysis, and in particular to a deep-learning-based multi-modal perception and analysis system for patient behavior.

Background technique

[0002] With the continuous development of deep learning technology, deep neural networks have shown great advantages over traditional information processing methods on many single-modal perception tasks. For example, recurrent and recursive neural networks (RNNs), proposed for sequence problems, have been applied with great engineering success to medical diagnosis based on patient medical record text and voice information; models such as AlexNet and ResNet even surpass human performance on tasks involving patient behavior video information.

[0003] Applying deep learning technology to the...

Claims


Application Information

IPC(8): G06K9/62, G06N3/04, G06N3/08, G06K9/00, G06K9/46, A61B5/11, A61B5/00, A61B5/0488, A61B5/0402, A61B5/0496, A61B5/0205, A61B5/055, A61B5/053, A61B6/00, A61B6/03, A61B8/00
CPC: G06N3/08, A61B5/1118, A61B5/1116, A61B5/1121, A61B5/02055, A61B5/08, A61B5/055, A61B5/053, A61B5/7203, A61B5/7235, A61B5/726, A61B5/7267, A61B5/7264, A61B5/7275, A61B6/00, A61B6/032, A61B6/52, A61B8/52, G06V40/10, G06V10/44, G06N3/045, G06F18/253, G06F18/254
Inventor: 张立华, 杨鼎康, 翟鹏, 董志岩
Owner: FUDAN UNIV