
Action recognition device and action recognition method

A technology relating to an action recognition device and an action recognition method, applied in character and pattern recognition, instruments, computing, etc., which can solve problems such as the inability to recognize compound worker actions from a center-of-gravity trajectory alone

Inactive Publication Date: 2020-06-09
RICOH KK

AI Technical Summary

Problems solved by technology

These actions cannot be identified merely by following the trajectory of the center-of-gravity position, as in the prior method described above.



《First Embodiment》

[0033] FIG. 1 is a functional configuration block diagram of the action recognition device according to the first embodiment.

[0034] The action recognition device 100 in this embodiment includes an image acquisition unit 101, a spatiotemporal feature extraction unit 102, an action recognition unit 103, an integrated processing unit 104, a dictionary creation unit 105, and a recognition result output unit 106.
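The flow between these units can be sketched as a minimal Python class. The unit names and numbers (101 to 106) come from the document; all method bodies below are invented placeholders for illustration, not the patented logic.

```python
# Hypothetical sketch of the functional configuration of FIG. 1.
# Only the unit names follow the document; every implementation
# detail here is a stand-in.

class ActionRecognitionDevice:
    def __init__(self):
        self.frames = []                        # filled by image acquisition unit 101
        self.dictionary = {"walk", "carry"}     # stand-in for dictionary creation unit 105

    def acquire_image(self, frame):
        """Image acquisition unit 101: collect frames (real time or offline)."""
        self.frames.append(frame)

    def extract_features(self):
        """Spatiotemporal feature extraction unit 102 (placeholder features)."""
        return [len(str(f)) for f in self.frames]

    def recognize_actions(self, features):
        """Action recognition unit 103: per-frame (element action, reliability P)."""
        return [("walk" if x % 2 else "carry", min(x / 10.0, 1.0)) for x in features]

    def integrate(self, per_frame):
        """Integrated processing unit 104: pick the most reliable element action."""
        action, _p = max(per_frame, key=lambda ar: ar[1])
        return action

    def output(self, result):
        """Recognition result output unit 106."""
        return f"recognized action: {result}"
```

This only illustrates how the units hand data to one another; the actual feature extraction and recognition in the patent operate on camera images.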

[0035] The image acquisition unit 101 acquires, in real time or offline, images from a camera 203 installed in a workplace 201 as shown in FIG. 2, for example. The installation location of the camera 203 is arbitrary; any location will do as long as it can capture the actions of a worker 202 working in the workplace 201. The images from the camera 203 may be sent directly from the camera 203 to the action recognition device 100, for example by wire or wirelessly, or may be transmitted to the action recognition device 100 via a reco...

《Second Embodiment》

[0109] In the second embodiment, integrated processing is carried out by adding, to the first embodiment described above, the condition that the reliability P of an element action must be higher than a preset threshold value Thre as a judgment criterion.

[0110] The basic configuration of the action recognition device 100 in this embodiment is the same as that of the first embodiment above, shown in FIG. 1. In the second embodiment, the action determination unit 104a of the integrated processing unit 104 has the function of comparing, frame by frame, the reliability P of each element action and determining as the target work action the element action whose reliability P is both higher than the threshold value Thre and the highest. The processing operation of the second embodiment will be described in detail below.
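The per-frame selection rule just described can be sketched in a few lines. The dict-of-reliabilities interface is an assumption for illustration; only the "above Thre AND maximum" rule comes from the document.

```python
def select_action(reliabilities, thre):
    """Second-embodiment rule (sketch): among the element actions of one
    frame, return the one whose reliability P is the highest AND exceeds
    the preset threshold Thre; return None if nothing clears Thre.
    `reliabilities` maps element action name -> reliability P (assumed)."""
    action, p = max(reliabilities.items(), key=lambda kv: kv[1])
    return action if p > thre else None

frame = {"pick": 0.42, "carry": 0.81, "walk": 0.55}
print(select_action(frame, thre=0.6))   # carry
print(select_action(frame, thre=0.9))   # None
```

Returning None models the case where no element action is reliable enough in that frame, so no work action is determined for it.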

[0111] FIG. 15 is a flowchart for explaining the integrated processing operation performed by the integrated processing unit 104 ...

《Third Embodiment》

[0131] In the third embodiment, a judgment time Tw is provided for judging element actions, and element actions are judged at intervals of the judgment time Tw.

[0132] The basic configuration of the action recognition device 100 is the same as that of the above-mentioned first embodiment, shown in FIG. 1. In the third embodiment, the action determination unit 104a of the integrated processing unit 104 has the function of comparing the reliability P of each element action at each judgment time Tw and determining the element action with the highest reliability P as the target work action. The processing operation of the third embodiment will be described in detail below.
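The window-based judgment can be sketched as follows. The list-of-tuples interface and the tie-breaking behavior are assumptions for illustration; only the "one decision per Tw-frame interval, highest P wins" rule comes from the document.

```python
def judge_per_window(frame_results, tw):
    """Third-embodiment rule (sketch): split per-frame recognition results
    into consecutive windows of Tw frames and, for each window, emit the
    element action with the highest reliability P seen in that window.
    `frame_results` is a list of (element_action, reliability) tuples
    (assumed interface)."""
    decisions = []
    for start in range(0, len(frame_results), tw):
        window = frame_results[start:start + tw]
        best_action, _p = max(window, key=lambda ar: ar[1])
        decisions.append(best_action)
    return decisions

results = [("pick", 0.3), ("carry", 0.9), ("walk", 0.8), ("pick", 0.4)]
print(judge_per_window(results, tw=2))   # ['carry', 'walk']
```

Compared with the per-frame rule of the second embodiment, judging once per Tw frames smooths out isolated noisy frames at the cost of coarser time resolution.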

[0133] FIG. 16 is a flowchart for explaining the integrated processing operation performed by the integrated processing unit 104 of the action recognition device 100 in the third embodiment. The integrated processing shown in this flowchart is executed in step S27 of FIG. 13.

[013...



Abstract

The invention relates to an action recognition device and an action recognition method, and aims to recognize a series of actions of an operator during work with high precision. The action recognition device recognizes a standard job, predetermined as a monitoring target, from images capturing a worker, and is provided with: an image acquisition unit (101) that acquires the plurality of frames included in the images; an action recognition unit (103) that recognizes a plurality of element actions included in the standard job from the characteristic changes of each frame and determines the reliability of these element actions; and an action determination unit (104a) that comprehensively processes the reliability of each element action and determines, from among the element actions, the work action of the worker.

Description

Technical Field

[0001] The present invention relates to an action recognition device and an action recognition method.

Background Art

[0002] In workplaces such as offices and factories, it is important to improve work efficiency by using images captured by cameras to visualize and analyze operator actions.

[0003] A previous action recognition method (see, for example, Patent Document 1, Japanese Patent Laid-Open No. 2011-100175) identifies a person from multiple frames of images continuously obtained by a camera, extracts the trajectory of the person's center-of-gravity position as a feature quantity, and compares it with previously registered center-of-gravity trajectories of actions to identify the person's action.

[0004] However, a worker performs not just one but many actions during work, for example, walking while holding something. Such actions cannot be identified by following the center-of-gravity trajectory described above. ...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06K 9/00, G06K 9/62
CPC: G06V 40/20, G06F 18/23213, G06F 18/24
Inventor 关海克
Owner RICOH KK