Interaction area and interaction time period identification method, storage equipment and mobile terminal

A method for identifying interaction areas and interaction time periods, applied in character and pattern recognition, image data processing, instruments, and related fields. It addresses problems such as the lack of identification of human interaction activity areas, the loss of computing efficiency and real-time performance, and the absence of a complete record of a person's activity areas together with the times of activity within them.

Active Publication Date: 2017-11-07
CHONGQING UNIV OF POSTS & TELECOMM

AI Technical Summary

Problems solved by technology

[0003] Most existing human-related algorithms are model-based detection algorithms that must apply specific operators or models in matching calculations over the entire image, which greatly increases computational cost and sacrifices computing efficiency and real-time performance.
At present, algorithms that generate a priori regions of human activity mostly cannot satisfy real-time and accuracy requirements at the same time. When determining regions, they draw no clear distinction between human displacement activities (such as walking, running, and crawling) and interactive activities (which appear in the image as transitions between a dynamic state and a static state, where movement stops while the person interacts with objects in a certain space, such as standing up and sitting down, going out and entering a door, or drinking water), and they do not identify the areas in which interactive activity occurs.
Moreover, most existing algorithms focus only on spatial calibration: they do not record the time corresponding to the activity, and therefore fail to fully record both a person's activity area and the period of activity within that area.


Examples


Embodiment approach

[0057] Please refer to figure 1, which is a flow chart of a preferred embodiment of the method of the present invention for identifying the interaction area and interaction time period of differential pixels. A preferred implementation of the method comprises the following steps:

[0058] Step S1: receiving an input video signal and performing frame-splitting processing on it to generate single-frame images.

[0059] The frame rate is the number of still pictures played per second in a video format, so a video signal can be disassembled into a sequence of still pictures; this is frame splitting (deframing). Many existing software tools provide this function, so the details are not repeated here.
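The patent does not name a specific tool for this step; as a minimal sketch (not the patented implementation), frame splitting can be done with OpenCV's VideoCapture. The helper name split_frames and the file name sample.mp4 are assumptions for illustration:

```python
import cv2  # OpenCV; one of many tools that can split a video into frames


def split_frames(video_path):
    """Yield single-frame images from a video file (step S1 of the method)."""
    cap = cv2.VideoCapture(video_path)
    if not cap.isOpened():
        raise IOError(f"Cannot open video: {video_path}")
    while True:
        ok, frame = cap.read()  # read() returns (success flag, BGR image)
        if not ok:
            break  # end of stream
        yield frame
    cap.release()


# Usage: iterate over the still pictures produced from the input signal.
n = sum(1 for _ in split_frames("sample.mp4"))  # hypothetical file name
print(f"video split into {n} still pictures")
```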


Abstract

The invention discloses a method for identifying an interaction area and an interaction time period, suitable for execution on computing equipment. The method comprises the following steps: receiving an input video signal and performing frame-splitting processing on it to generate single-frame images; performing human shape detection on each acquired single-frame image; performing whole-image differencing and recording the differential data; determining whether the pixel difference data of the first N frames satisfies a human body interaction characteristic model, and recording the triggered interaction area and time breakpoint as well as the terminated interaction area and time breakpoint; and recording the human body interaction area and time period. The invention also discloses a storage device and a mobile terminal.
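The abstract leaves the individual steps unspecified; the sketch below is one hypothetical way the pipeline could fit together, assuming OpenCV: absdiff for whole-image differencing, the built-in HOG people detector standing in for the unspecified human shape detection, and a toy static/dynamic rule in place of the patent's human body interaction characteristic model. The thresholds DIFF_THRESHOLD, STILL_RATIO, and N are illustrative, not values from the patent:

```python
import cv2

DIFF_THRESHOLD = 25   # illustrative: per-pixel change counted as "motion"
STILL_RATIO = 0.002   # illustrative: changed-pixel fraction below which the scene is "static"
N = 10                # number of recent frames whose difference data is examined

# Stand-in human shape detector; the patent does not fix a detection method.
hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())


def identify_interactions(video_path):
    """Return (area, start_time, end_time) records, following the abstract's steps."""
    cap = cv2.VideoCapture(video_path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 30.0  # fall back if FPS is unavailable
    prev_gray, history, active, records = None, [], None, []
    frame_idx = 0
    while True:
        ok, frame = cap.read()  # step S1: single-frame images
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        boxes, _ = hog.detectMultiScale(frame)  # human shape detection
        if prev_gray is not None:
            diff = cv2.absdiff(gray, prev_gray)       # whole-image differencing
            moving = (diff > DIFF_THRESHOLD).mean()   # recorded differential data
            history = (history + [moving])[-N:]       # keep the last N frames
            # Toy interaction model: motion near a detected person dies down for
            # N frames -> interaction triggered; motion resumes -> terminated.
            is_static = len(history) == N and max(history) < STILL_RATIO
            if active is None and is_static and len(boxes) > 0:
                active = (tuple(boxes[0]), frame_idx / fps)      # area + time breakpoint
            elif active is not None and not is_static:
                area, t0 = active
                records.append((area, t0, frame_idx / fps))      # terminated breakpoint
                active = None
        prev_gray = gray
        frame_idx += 1
    cap.release()
    return records
```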

Description

Technical field

[0001] The invention belongs to the field of computer vision recognition, and in particular relates to a method for recognizing the interaction area and interaction time period of difference pixels, as well as to a storage device and a mobile terminal capable of realizing the above functions.

Background technique

[0002] With the development of science and technology and the wide application of modern video technology, image processing and pattern recognition methods based on machine vision are used more and more in pattern recognition, motion analysis, video surveillance, and artificial intelligence.

[0003] Most existing human-related algorithms are model-based detection algorithms that must apply specific operators or models in matching calculations over the entire image, which greatly increases computational cost and sacrifices computing efficiency and real-time performance. At present, there are algorithms for generat...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00; G06T5/00; G06T7/254; G06K9/46; G06K9/62
CPC: G06T7/254; G06T2207/20024; G06T2207/10016; G06T2207/30196; G06V40/103; G06V10/50; G06F18/22; G06F18/2411; G06T5/90; G06T5/70
Inventor: 赵志强, 邵立智, 崔盈, 冉鹏, 徐光侠, 钱鹰, 周贤菊, 田健
Owner: CHONGQING UNIV OF POSTS & TELECOMM