
Obtaining metrics for a position using frames classified by an associative memory

A technology involving associative memory and classification, applied in the fields of memory systems, memory address/allocation/relocation, and instrumentation, capable of solving problems such as the undesirability of measuring additional metrics.

Active Publication Date: 2016-03-16
THE BOEING CO

AI Technical Summary

Problems solved by technology

[0003] However, in some cases it may not be desirable to measure additional metrics about the person

Method used

Examples


Embodiment approach

[0083] In this embodiment, the user sets up a predefined database and populates it with training data captured by the motion sensor. Each training result is labeled with the pose for which a metric is expected. The user then ingests this data into the associative memory, which uses it to classify new observations. The ingested data thereby serves as a general classifier.
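
As a rough illustration of this ingestion step, the sketch below builds a small labeled database in Python. The attribute-set representation and the names MotionFrame, AssociativeMemory, and ingest are assumptions for illustration only, not the patent's implementation.

```python
# A minimal sketch, assuming a simple attribute-set representation of each
# labeled training frame. The names (MotionFrame, AssociativeMemory, ingest)
# are illustrative stand-ins, not the patent's or any product's API.
from dataclasses import dataclass
from collections import defaultdict

@dataclass(frozen=True)
class MotionFrame:
    label: str             # pose label attached to the training data, e.g. "reach out"
    attributes: frozenset  # relative attributes extracted from the sensor frame

class AssociativeMemory:
    """Stores labeled frames so new observations can later be matched against them."""
    def __init__(self):
        self.frames = []
        self.labels_by_attribute = defaultdict(set)  # attribute -> labels seen with it

    def ingest(self, frame: MotionFrame) -> None:
        """Add one labeled training frame to the predefined database."""
        self.frames.append(frame)
        for attr in frame.attributes:
            self.labels_by_attribute[attr].add(frame.label)

# Populate the predefined database with labeled training captures.
memory = AssociativeMemory()
memory.ingest(MotionFrame("reach out", frozenset({"arm_extended", "torso_upright"})))
memory.ingest(MotionFrame("bend over", frozenset({"torso_forward", "knees_bent"})))
```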

[0084] Once the data is ingested, the user can have the system periodically capture motion data from the motion sensor and perform an entity comparison of the captured data to locate similar motions. The category of the entity comparison is set to "result", so a new observation adopts the motion result it most closely resembles, as shown in Figure 10. Thus, for example, the set of common attributes 1002 of the results 1004 belonging to "reach out" matches those attributes of...
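
The entity comparison can be illustrated in the same spirit: the new observation adopts the label of the stored result that shares the most attributes with it. The overlap-count score and the names below are assumptions for illustration, not the matching rule specified in the patent.

```python
# A minimal sketch of the entity comparison: the new observation adopts the
# label of the stored "result" whose attribute set overlaps it most.
stored_results = {
    "reach out": frozenset({"arm_extended", "torso_upright"}),
    "bend over": frozenset({"torso_forward", "knees_bent"}),
}

def classify(observation: frozenset) -> str:
    """Return the stored result label sharing the most attributes with the observation."""
    label, score = max(
        ((lbl, len(attrs & observation)) for lbl, attrs in stored_results.items()),
        key=lambda pair: pair[1],
    )
    return label if score > 0 else "unknown"

# A periodically captured observation that shares common attributes with "reach out".
print(classify(frozenset({"arm_extended", "torso_upright", "wrist_rotated"})))  # reach out
```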



Abstract

A method (1200) for identifying a motion of interest of an individual. The method includes collecting, at a computer (1304), motion sensor input data of motions of the individual from a motion sensor (1302) for an interval of time. The method further includes analyzing, using the computer (1304), the motion sensor input data using an analysis application (1308) having a set of classified predetermined motions of interest. The analysis application (1308) classifies a movement captured during the interval of time as a motion corresponding to one of a plurality of pre-determined motions of interest based on shared relative attributes. The method further includes generating an output providing notice of an identified predetermined motion of interest to a monitoring system (1412).
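
Read as a pipeline, the abstract describes collecting sensor frames for an interval, classifying the captured movement against predetermined motions of interest by shared attributes, and notifying a monitoring system. A minimal sketch of that flow, with hypothetical stand-ins for the motion sensor and the monitoring system, might look like this:

```python
# End-to-end sketch of the abstract's flow: collect frames for an interval,
# classify the movement against predetermined motions of interest, notify a
# monitoring system. The sensor and monitor callables are hypothetical stand-ins.
from typing import Callable, Iterable, List

PREDETERMINED_MOTIONS = {
    "reach out": frozenset({"arm_extended", "torso_upright"}),
    "bend over": frozenset({"torso_forward", "knees_bent"}),
}

def analyze(frames: Iterable[frozenset]) -> str:
    """Classify the interval's movement as the motion sharing the most attributes."""
    observed = frozenset().union(*frames)
    label, score = max(
        ((lbl, len(attrs & observed)) for lbl, attrs in PREDETERMINED_MOTIONS.items()),
        key=lambda pair: pair[1],
    )
    return label if score > 0 else "no motion of interest"

def run_interval(collect: Callable[[], List[frozenset]],
                 notify: Callable[[str], None]) -> None:
    frames = collect()            # motion sensor input for one interval of time
    motion = analyze(frames)      # classification against predetermined motions
    if motion != "no motion of interest":
        notify(motion)            # output providing notice to the monitoring system

# Example wiring with trivial stand-ins for the motion sensor and monitoring system.
run_interval(
    collect=lambda: [frozenset({"arm_extended"}), frozenset({"torso_upright"})],
    notify=lambda motion: print(f"motion of interest detected: {motion}"),
)
```

Running this prints a single notice for the interval; a real system would repeat the collect/analyze/notify cycle on each interval.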

Description

Technical Field

[0001] The present disclosure relates to methods and apparatus for determining which movements of a person or object generate metrics.

Background

[0002] A classification system receives data, analyzes the data, and then uses a classifier to assign the data to a known set, where one or more elements of the data correspond to one or more elements in the known set. For example, in a human motion detection and classification system, sensors can measure human activity. These sensors can feed their data into a classification system, which analyzes the data to determine which action the data most closely resembles. Such a classification might determine whether a person is sitting, standing, walking, holding a phone, bending over, or taking some other action. In other examples, the classification system may analyze input from sensors on an aircraft and then classify some aspect of the aircraft's operation, such as whether the air...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00
CPC: G06V40/20; G06V40/103; G06T7/70; G06F3/017; A61B5/1116; A61B5/1118; G06F12/08; G06T7/20; G06T1/60; G06F12/082; A61B2562/0219; G06V40/23; G06F18/22
Inventor: J. D. Whelan
Owner THE BOEING CO