
Video annotation method and device

A video annotation technology, applied in the field of computers, that addresses the problem of poor generalization in video annotation.

Pending Publication Date: 2021-03-02
HANGZHOU HIKVISION DIGITAL TECH

AI Technical Summary

Problems solved by technology

[0005] The purpose of the embodiments of the present application is to provide a video annotation method and device to solve the problem of poor generalization in video annotation.



Examples


Embodiment Construction

[0058] The technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the accompanying drawings. Apparently, the described embodiments are only some, not all, of the embodiments of the present application. Based on the embodiments in this application, all other embodiments obtained by persons of ordinary skill in the art without creative effort fall within the scope of protection of this application.

[0059] An embodiment of the present application provides a video annotation method, which is applied to an electronic device. The electronic device may be any electronic device with a data processing function, for example, a mobile phone, a computer, or a tablet computer. Before annotating a video containing a target object, the electronic device can shoot the target object to obtain multiple images containing the target object, and then generate an annotation model for annotating the ta...
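A minimal sketch of what the captured "annotation information" could look like, assuming a simple in-memory representation; the names AnnotatedImage, AnnotationInfo, and the (x, y, width, height) box convention are illustrative and do not come from the patent:

    # Hypothetical structure for the annotation information described above:
    # several images of the target object shot from different angles, each
    # paired with the object's position in that image.
    from dataclasses import dataclass
    from typing import List, Tuple

    @dataclass
    class AnnotatedImage:
        image_path: str                      # one photo of the target object
        bbox: Tuple[int, int, int, int]      # (x, y, width, height) of the object
        shooting_angle_deg: float            # angle at which the photo was taken

    @dataclass
    class AnnotationInfo:
        target_name: str
        samples: List[AnnotatedImage]        # multiple views of the same object

    # Example: three views of the same object captured by the device's camera.
    info = AnnotationInfo(
        target_name="target_object",
        samples=[
            AnnotatedImage("front.jpg", (120, 80, 200, 160), 0.0),
            AnnotatedImage("side.jpg",  (100, 90, 180, 150), 90.0),
            AnnotatedImage("back.jpg",  (110, 85, 190, 155), 180.0),
        ],
    )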



Abstract

The embodiment of the invention provides a video annotation method and device, belonging to the technical field of computers. The method comprises the steps of: obtaining annotation information of a target object before annotating the target object in a to-be-annotated video, the annotation information comprising a plurality of images and position information of the target object in each image, wherein the plurality of images include the target object shot from different shooting angles; training a preset annotation model based on the annotation information to obtain a first annotation model capable of annotating the target object; and annotating the target object in the to-be-annotated video through the first annotation model to obtain an annotated video. By adopting the technical scheme provided by the embodiment of the invention, the problem of poor generalization of video annotation can be solved.
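A minimal, hypothetical sketch of the workflow summarized in the abstract (obtain annotation information, train the preset model into a "first annotation model", then annotate each frame of the video). The names FirstAnnotationModel, train_preset_model, and annotate_video are stand-ins for whatever detector the electronic device actually trains and are not taken from the patent:

    from typing import List, Optional, Tuple

    Box = Tuple[int, int, int, int]  # (x, y, width, height) annotation frame

    class FirstAnnotationModel:
        """Stands in for the preset model after training on the multi-angle
        images and position information of the target object."""
        def __init__(self, annotation_info):
            self.annotation_info = annotation_info

        def detect(self, frame) -> Optional[Box]:
            # Placeholder: a real model would localize the target object here.
            return None

    def train_preset_model(annotation_info) -> FirstAnnotationModel:
        # Train (or fine-tune) the preset annotation model on the annotation
        # information, yielding the "first annotation model".
        return FirstAnnotationModel(annotation_info)

    def annotate_video(frames: List, model: FirstAnnotationModel) -> List[Optional[Box]]:
        # Annotate the target object in every frame of the to-be-annotated video.
        return [model.detect(frame) for frame in frames]

As a design note, training on images of the specific target object, rather than relying only on generic pre-stored models, is what the abstract credits with improving generalization to objects the device has not seen before.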

Description

Technical field

[0001] The present application relates to the field of computer technology, and in particular to a video annotation method and device.

Background technique

[0002] Video annotation is to identify, for each video frame included in a video to be annotated, the object appearing in that frame; the identification generally takes the form of generating an annotation frame containing the object.

[0003] In the related art, an electronic device used for video annotation may pre-store multiple general annotation models, each of which corresponds to a preset object. The specific process of video annotation by the electronic device includes: identifying an object in the video to be annotated as the target object to be marked, and then comparing the target object with the multiple general annotation models to determine whether the target object is the preset object corresponding to a general annotation model; if the target object is the corresponding ...
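A rough sketch of the related-art flow described in paragraph [0003], under the assumption that model selection amounts to matching the recognized target against a fixed set of preset objects; the model names here are purely illustrative:

    # Each pre-stored general annotation model handles exactly one preset object.
    GENERAL_MODELS = {
        "person": "person_annotation_model",
        "car": "car_annotation_model",
    }

    def pick_general_model(target_label: str):
        # Returns None when the target is not one of the preset objects, which
        # is the poor-generalization problem the present application addresses.
        return GENERAL_MODELS.get(target_label)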


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00, G06K9/62
CPC: G06V20/41, G06F18/23
Inventor: 亓先军, 郭竹修
Owner: HANGZHOU HIKVISION DIGITAL TECH