
Video interpretation apparatus and method

A technology for a video interpretation apparatus and method, applied in the field of video interpretation equipment. It addresses the problems that an event classification model trained on one domain or application cannot recognize events occurring in other domains or applications, and that events in user-uploaded video are too diverse to be interpreted using only a learned event classification model.

Inactive Publication Date: 2019-11-12
ELECTRONICS & TELECOMM RES INST
15 Cites · 0 Cited by

AI Technical Summary

Benefits of technology

Enables the interpretation of various types of videos by capturing dynamic spatial relations and generating meaningful event descriptions, enhancing the ability to recognize and describe events in diverse video content.

Problems solved by technology

However, since the event classification model generated in this way defines its event set only within the learned domain or application, it cannot recognize events occurring in domains or applications different from the one on which the dataset was learned.
However, when ordinary users upload videos captured by smartphones to a Social Networking Service (SNS) site, the events are too diverse to be interpreted using only a learned event classification model.
However, Korean Patent Application Publication No. 10-2005-0016741 does not present a method for generating video information using dynamic spatial relations between objects in a video.
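The "dynamic spatial relations between objects" that the publication emphasizes can be illustrated with a toy sketch: given bounding-box tracks for two objects across frames, classify whether the pair is approaching, receding, or holding its distance. All names, the `Box` type, and the `eps` threshold below are illustrative assumptions, not taken from the patent.

```python
from dataclasses import dataclass
import math

@dataclass
class Box:
    """Axis-aligned bounding box of a detected object in one frame (assumed format)."""
    x: float
    y: float
    w: float
    h: float

    def center(self) -> tuple[float, float]:
        return (self.x + self.w / 2, self.y + self.h / 2)

def distance(a: Box, b: Box) -> float:
    """Euclidean distance between the centers of two boxes."""
    (ax, ay), (bx, by) = a.center(), b.center()
    return math.hypot(ax - bx, ay - by)

def dynamic_relation(a_track: list[Box], b_track: list[Box], eps: float = 1.0) -> str:
    """Classify how the distance between two tracked objects changes over time."""
    d_start = distance(a_track[0], b_track[0])
    d_end = distance(a_track[-1], b_track[-1])
    if d_end < d_start - eps:
        return "approaching"
    if d_end > d_start + eps:
        return "receding"
    return "staying"

# Example: object A moves toward a stationary object B over three frames.
a = [Box(0, 0, 10, 10), Box(20, 0, 10, 10), Box(40, 0, 10, 10)]
b = [Box(100, 0, 10, 10)] * 3
print(dynamic_relation(a, b))  # approaching
```

A relation like "approaching" is time-varying rather than a single-frame predicate, which is what distinguishes it from the static spatial relations the cited prior art relies on.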




Embodiment Construction

[0038]The present invention will be described in detail below with reference to the accompanying drawings. Repeated descriptions, and descriptions of known functions and configurations that would unnecessarily obscure the gist of the present invention, will be omitted below. The embodiments of the present invention are intended to fully describe the present invention to a person having ordinary knowledge in the art to which the present invention pertains. Accordingly, the shapes, sizes, etc. of components in the drawings may be exaggerated to make the description clearer.

[0039]Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the attached drawings.

[0040]FIG. 1 is a block diagram showing a video interpretation apparatus according to an embodiment of the present invention.

[0041]Referring to FIG. 1, the video interpretation apparatus according to the embodiment of the present invention includes an object informat...



Abstract

Disclosed herein are a video interpretation apparatus and method. The video interpretation apparatus includes an object information generation unit for generating object information based on objects in an input video, a relation generation unit for generating a dynamic spatial relation between the objects based on the object information, a general event information generation unit for generating general event information based on the dynamic spatial relation, a video information generation unit for generating video information including any one of a sentence and an event description based on the object information and the general event information, and a video descriptor storage unit for storing the object information, the general event information, and the video information.
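The five units enumerated in the abstract form a sequential pipeline: objects, then relations, then general events, then sentences, with all descriptors stored together. The sketch below is a minimal, hypothetical rendering of that data flow with toy stand-ins for each unit; a real implementation would use detectors, trackers, and learned models, and every function name here is assumed, not taken from the patent.

```python
def generate_object_info(frames):
    """Object information generation unit (toy): frames already carry object labels."""
    return [{"frame": i, "objects": objs} for i, objs in enumerate(frames)]

def generate_relations(object_info):
    """Relation generation unit (toy): a fixed pairwise 'near' relation per frame."""
    rels = []
    for rec in object_info:
        for a in rec["objects"]:
            for b in rec["objects"]:
                if a != b:
                    rels.append((a, "near", b, rec["frame"]))
    return rels

def generate_general_events(relations):
    """General event information generation unit (toy): collapse repeated relations."""
    return sorted({(a, r, b) for a, r, b, _ in relations})

def generate_video_info(object_info, events):
    """Video information generation unit (toy): one simple sentence per event."""
    return [f"A {a} is {r} a {b}." for a, r, b in events]

def interpret_video(frames):
    object_info = generate_object_info(frames)
    relations = generate_relations(object_info)
    events = generate_general_events(relations)
    video_info = generate_video_info(object_info, events)
    # Video descriptor storage unit: keep all three descriptor kinds together.
    return {"objects": object_info, "events": events, "sentences": video_info}

result = interpret_video([["person", "car"], ["person", "car"]])
print(result["sentences"])
```

The point of the sketch is the interface between the units: each stage consumes only the descriptors produced upstream, which is what lets the stored object, event, and video descriptors be queried independently.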

Description

CROSS REFERENCE TO RELATED APPLICATION[0001]This application claims the benefit of Korean Patent Application No. 10-2016-0053853, filed May 2, 2016, which is hereby incorporated by reference in its entirety into this application.BACKGROUND OF THE INVENTION1. Technical Field[0002]The present invention generally relates to technology for interpreting a video using a general event generated based on objects in the video.2. Description of the Related Art[0003]In order to interpret a video, there is a need to recognize events corresponding to objects in the video. Conventional event recognition technology is capable of recognizing an important event in a learned domain or application. In order to recognize an event, an event classification model for extracting video features from a time interval and spatial region of an object occurrence and recognizing the event from the extracted video features via machine learning is generated.[0004]However, since the event classification model genera...

Claims


Application Information

Patent Type & Authority Patents(United States)
IPC (8): H04N21/234; H04N21/8405; G06K9/00; G06V10/26
CPC: H04N21/8405; G06K9/00; G06K9/00744; G06K2009/00738; H04N21/23418; G06V20/44; G06V20/49; G06V20/46; G06V10/26; G06Q50/01; H04N21/23412; H04N21/44008; H04N21/472
Inventor: MOON, JIN-YOUNG; KANG, KYU-CHANG; KWON, YONG-JIN; PARK, KYOUNG; PARK, JONG-YOUL; LEE, JEUN-WOO
Owner ELECTRONICS & TELECOMM RES INST