
Method for categorizing digital video data

A digital video and data technology, applied in the field of digital video data categorization, which can solve the problems of inconvenient use, the limited utility or tagging range of each training set, and high cost.

Active Publication Date: 2021-08-17
PROCTER & GAMBLE CO
Cites: 7 | Cited by: 0

AI Technical Summary

Benefits of technology

The patent describes a computer method for extracting information about actions or activities captured in digital data. The method involves analyzing data sets, creating clusters of features based on probability density functions and reference function distance calculations, and matching features from different data sets based on their similarity. The technical effect of this patent is an improved ability to extract information about actions and activities from digital data.

Problems solved by technology

Recognition of actions captured in images remains a challenge: actions may require abstract conceptualization, are not easily captured in still images, and defining the target action to be tagged can be nuanced, for example: walking versus limping versus running.
Each training set may be of limited utility or tagging range.
Each activity may require the construction of an individual data set requiring cost prohibitive amounts of time, and resources, including human supervision, to create.

Method used




Embodiment Construction

[0006]As used herein, the term “action” refers to the movement of an object; exemplary actions include hand gestures and facial expressions.

[0007]As used herein, the term “activity” means a set of actions directed toward a task, including: walking, running, sweeping, cleaning, dusting, mopping, vacuuming, shaving, oral care tasks, child care tasks, etc.

[0008]As used herein, the term: “probability density function” refers to a function which provides an indication of the relative likelihood of different possible values of a variable in a sample space.
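The relative-likelihood idea in this definition can be illustrated with a one-dimensional Gaussian density. This is a sketch for illustration only; the function name and parameters are not taken from the patent:

```python
import math

def gaussian_pdf(x, mean=0.0, std=1.0):
    """Relative likelihood of value x under a normal density."""
    coeff = 1.0 / (std * math.sqrt(2.0 * math.pi))
    return coeff * math.exp(-0.5 * ((x - mean) / std) ** 2)

# Values near the mean are relatively more likely than values in the tails:
ratio = gaussian_pdf(0.0) / gaussian_pdf(2.0)
```

The function returns a density value, not a probability; only ratios and integrals over the sample space carry probabilistic meaning.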

[0009]As used herein, the term: “reference probability density function”, refers to probability density functions having the same number of variable dimensions as the probability density functions of interest and having values used to define a sample space for comparing variable probability density function values in the defined sample space.
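The patent does not disclose its exact distance calculation. As a hedged sketch of the idea, a feature density can be characterized by its distances to a small set of reference densities evaluated on a shared grid that defines the sample space; all names and the choice of L1 distance here are illustrative assumptions:

```python
import math

def gaussian(mean, std):
    # Build a 1-D Gaussian probability density function (illustrative reference).
    def pdf(x):
        return math.exp(-0.5 * ((x - mean) / std) ** 2) / (std * math.sqrt(2 * math.pi))
    return pdf

def discretize(pdf, grid):
    # Evaluate a density on the shared grid and normalize so values sum to 1.
    vals = [pdf(x) for x in grid]
    total = sum(vals)
    return [v / total for v in vals]

def reference_signature(pdf, references, grid):
    """Distances from one density to each reference density (L1 on the grid)."""
    p = discretize(pdf, grid)
    sig = []
    for ref in references:
        q = discretize(ref, grid)
        sig.append(sum(abs(a - b) for a, b in zip(p, q)))
    return sig

grid = [i / 10.0 for i in range(-50, 51)]          # shared sample space: [-5, 5]
refs = [gaussian(-2.0, 1.0), gaussian(2.0, 1.0)]   # two reference PDFs
sig = reference_signature(gaussian(-2.0, 1.0), refs, grid)
# Distance to the first (identical) reference is 0; to the second it is large.
```

The resulting signature vector is what later steps could cluster and compare, since densities with similar shapes produce similar distances to the same references.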

[0010]Reference distributions ideally represent orthogonal random variables, in the sense that t...



Abstract

In one aspect, a computer implemented method for extracting information on actions or activities captured in digital data includes steps of: providing a first digital data set, extracting and characterizing first features from the first digital data set, creating first clusters from the extracted first features according to probability density functions and reference function distance calculations, providing a second digital data set, extracting and characterizing second features from the second digital data set, characterizing the extracted second features according to probability density functions and reference function distance calculations, and matching the second extracted features to a portion of the first extracted features clusters according to similarities in reference function distance calculations.
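The abstract's final step, matching second-set features to first-set clusters by similarity of their reference-function distance calculations, could be sketched as a nearest-centroid lookup in signature space. All names and values below are hypothetical; the patent does not specify this exact procedure:

```python
def euclidean(a, b):
    # Straight-line distance between two reference-distance signatures.
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def match_to_clusters(signature, centroids):
    """Assign a feature's reference-distance signature to the nearest cluster."""
    return min(centroids, key=lambda label: euclidean(signature, centroids[label]))

# First data set: cluster centroids in reference-distance space (made-up values).
centroids = {"walking": [0.1, 1.8], "running": [1.7, 0.2]}

# Second data set: a new feature's signature lands nearest one cluster.
label = match_to_clusters([0.3, 1.6], centroids)
```

Matching in signature space, rather than in raw feature space, is what lets features from two independently collected data sets be compared without a shared labeled training set.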

Description

FIELD OF THE INVENTION

[0001]The invention relates to methods for categorizing digital data. The invention relates particularly to categorizing digital video data by tagging actions captured in the data.

BACKGROUND OF THE INVENTION

[0002]The automated recognition of objects and elements in images is known in the art. Faces, smiles, people, and animals may each be detected in images using automated computer analysis systems. Recognition of actions captured in images remains a challenge: actions may require abstract conceptualization, are not easily captured in still images, and defining the target action to be tagged can be nuanced, for example: walking versus limping versus running. Typical methods for utilizing artificial intelligence systems, including deep learning systems, require providing training data sets with both input data and identified output results. In the case of image tagging, such a system may require extensive manual intervention to construct a trai...

Claims


Application Information

Patent Type & Authority: Patent (United States)
IPC(8): G06F17/00, G06F16/75, G06F16/74, G06K9/62, G06K9/00, G06V10/764
CPC: G06F16/75, G06F16/74, G06K9/00744, G06K9/6215, G06K9/6221, G06K9/6277, G06V20/46, G06V10/761, G06V10/763, G06V10/764, G06F18/2321, G06F18/22, G06F18/2415
Inventor ORTEGA, JOSE M
Owner PROCTER & GAMBLE CO