Method and device for identifying action types, and method and device for broadcasting programs

A technology relating to action type recognition, applied in character and pattern recognition, user/computer interaction input/output, television, and similar fields, which can solve problems such as inflexible program broadcasting methods and program broadcast errors.

Active Publication Date: 2013-04-03
IDEAPOOL CULTURE & TECH CO LTD

AI Technical Summary

Problems solved by technology

[0005] The embodiments of the present invention provide an action type recognition method, a program broadcasting method, and corresponding devices, which are used to solve the problems of program broadcast errors and inflexible program broadcasting that may be caused by the broadcast controller's playback of control templates in existing online packaging applications.



Examples


Embodiment 1

[0045] As shown in Figure 3, which is a flow chart of the action type recognition method in Embodiment 1 of the present invention, the method includes the following steps:

[0046] Step 101: Receive the skeleton data frame collected by the somatosensory device.

[0047] The skeleton data frame includes the coordinates of a reference skeleton node and at least one set skeleton node in a three-dimensional coordinate space formed by the mutually perpendicular horizontal, vertical and depth-of-field directions;

[0048] The set skeleton node is determined according to the needs of the actual action recognition, and may be any one of the above-mentioned 20 joint points in the skeleton data frame.
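The skeleton data frame can be modeled as a simple container of named joint coordinates. The sketch below is a minimal illustration only; the names `SkeletonFrame` and `Joint3D` are hypothetical, and the patent itself does not prescribe any particular data structure or API.

```python
from dataclasses import dataclass
from typing import Dict

@dataclass
class Joint3D:
    """Coordinates of one skeleton node in the three-dimensional space
    formed by the horizontal (x), vertical (y) and depth-of-field (z) axes."""
    x: float
    y: float
    z: float

@dataclass
class SkeletonFrame:
    """One skeleton data frame collected by the somatosensory device."""
    frame_number: int
    reference_node: Joint3D          # e.g. a torso/hip reference joint
    set_nodes: Dict[str, Joint3D]    # the joints chosen for recognition,
                                     # any of the 20 joints in the frame
```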

[0049] Step 102: Perform denoising processing on the received skeleton data frame.

[0050] It should be noted that step 102 is a preferred step in Embodiment 1 of the present invention. Salt and pepper noise and other types of noise will be generated during image acquisition ...
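The text is truncated here, so the exact denoising method is not shown. One common way to suppress salt-and-pepper-like spikes in a per-joint coordinate stream is a median filter over a short window of recent frames; the sketch below illustrates that general idea only and is not taken from the patent.

```python
import statistics
from typing import List

def denoise_coordinate(history: List[float], window: int = 5) -> float:
    """Median-filter the most recent `window` samples of one coordinate
    (e.g. the x value of a set skeleton node across consecutive frames).
    The median discards isolated spikes such as salt-and-pepper noise."""
    recent = history[-window:]
    return statistics.median(recent)

# Usage: smooth the x coordinate of a joint across the last frames.
x_history = [0.41, 0.42, 0.97, 0.43, 0.44]   # 0.97 is a noise spike
print(denoise_coordinate(x_history))          # -> 0.43
```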

Embodiment 2

[0151] As shown in Figure 5, which is a flow chart of the program broadcasting method in Embodiment 2 of the present invention, the program broadcasting method includes the following steps:

[0152] Step 201: Use the action type recognition method in Embodiment 1 to determine the action type of at least one combined action of the set skeleton nodes in the currently received skeleton data frame;

[0153] Step 202: Determine whether the action type of the determined combined action is a valid action type; if yes, execute step 203; if not, go back to step 201.

[0154] In this step 202, if the determined action type of the combined action belongs to the preset action type set, it is determined that the determined action type is a valid action type; otherwise, it is determined that the determined action type is an invalid action type.

[0155] Step 203: According to the stored correspondence between the action type of the combined action and the special effect animation, ...
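Steps 202 and 203 amount to a membership test against a preset set of valid action types followed by a lookup in a stored action-to-animation mapping. The following sketch only illustrates that control flow; the set contents, animation identifiers and function name are hypothetical, not taken from the patent.

```python
from typing import Optional

# Hypothetical preset set of valid combined-action types (step 202).
VALID_ACTION_TYPES = {"raise_left_hand", "raise_right_hand", "push_forward"}

# Hypothetical stored correspondence between combined-action types and
# special-effect animations (step 203).
ACTION_TO_EFFECT = {
    "raise_left_hand": "effect_weather_panel.anim",
    "raise_right_hand": "effect_score_board.anim",
    "push_forward": "effect_transition_wipe.anim",
}

def select_effect(action_type: str) -> Optional[str]:
    """Return the special-effect animation to broadcast, or None if the
    recognized action type is invalid (in which case go back to step 201)."""
    if action_type not in VALID_ACTION_TYPES:     # step 202
        return None
    return ACTION_TO_EFFECT[action_type]          # step 203
```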

Embodiment 3

[0168] Based on the same inventive concept as Embodiment 1 of the present invention, Embodiment 3 of the present invention provides an action type recognition device. As shown in Figure 6, the device includes: a receiving module 101, a first determining module 102, a first judging module 103, a second determining module 104, an action type determining module 105 and a combined action type determining module 106, wherein:

[0169] The receiving module 101 is used to receive the skeleton data frame collected by the somatosensory device, where the skeleton data frame includes the coordinates of the reference skeleton node and at least one set skeleton node in a three-dimensional coordinate space formed by the mutually perpendicular horizontal, vertical and depth-of-field directions;

[0170] The first determining module 102 is used to determine the horizontal, vertical and depth-of-field coordinates of the set skeleton node in the current skeleton data ...
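The device of Embodiment 3 is described as a composition of modules, each wrapping one step of the Embodiment 1 method. The sketch below only illustrates that modular layout; the class and method names are hypothetical, and the modules whose descriptions are truncated here are omitted.

```python
class ActionTypeRecognitionDevice:
    """Illustrative layout of the module composition in Embodiment 3.
    Only the two modules described above are sketched."""

    def __init__(self):
        self.current_frame = None

    def receiving_module(self, frame):
        # Module 101: receive the skeleton data frame collected by the
        # somatosensory device (reference node plus set nodes, each with
        # horizontal, vertical and depth-of-field coordinates).
        self.current_frame = frame

    def first_determining_module(self):
        # Module 102: read the horizontal, vertical and depth-of-field
        # coordinates of each set skeleton node in the current frame
        # (reusing the SkeletonFrame sketch from Embodiment 1).
        return {name: (joint.x, joint.y, joint.z)
                for name, joint in self.current_frame.set_nodes.items()}
```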



Abstract

The invention discloses a method and a device for identifying action types, and a method and a device for broadcasting programs. The action type identification method judges action types for acquired skeleton data frames in the horizontal, vertical and depth-of-field directions. It mainly includes, firstly, judging the direction of the current skeleton data frame according to the position relations among the set skeleton nodes in adjacent skeleton data frames; and secondly, judging the action types of the set skeleton nodes by means of the coordinate values of the set skeleton nodes in the current skeleton data frame and in the marked skeleton data frames, when the difference between the frame number of the current skeleton data frame and the frame number of each marked skeleton data frame is within a set range. With the method and device for identifying action types and the method and device for broadcasting programs, the action types of the set skeleton nodes can be identified, the identification method can be applied to various scenes, and the purpose of controlling application scenes through the actions of characters is achieved.
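The second condition in the abstract is a window test: an action type is only judged against a marked skeleton data frame whose frame number lies within a set range of the current frame number. A minimal sketch of that check follows, assuming a hypothetical function name and range value; it is an illustration, not the patent's exact criterion.

```python
FRAME_RANGE = 30  # hypothetical set range, in frames

def within_set_range(current_frame_number: int,
                     marked_frame_number: int,
                     frame_range: int = FRAME_RANGE) -> bool:
    """True if the marked skeleton data frame is recent enough for its
    set-node coordinates to be compared with the current frame's."""
    return abs(current_frame_number - marked_frame_number) <= frame_range
```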

Description

Technical field

[0001] The invention relates to the technical field of image processing and pattern recognition, and in particular to an action type recognition method, a program broadcasting method, and corresponding devices.

Background technique

[0002] Online packaging means that templates, text, pictures, real-time information and other content can be combined, rendered and broadcast in real time on the broadcast line during program broadcasting in studios, OB trucks, broadcasting machine rooms, and the like. In terms of function, online packaging can be applied to news topics, sports reports, variety entertainment, financial information, weather forecasts, etc., to provide extensive, professional and accurate instant information, to enhance the visual effects of video and audio signals in real time, and to improve the quality and viewability of programs. With the rapid development of computer graphics and image technology, audiences have higher and higher demands for watching TV progra...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/62; G06F3/01; H04N5/262
Inventor: 吴雷, 李金楠, 邸楠
Owner: IDEAPOOL CULTURE & TECH CO LTD