Approximate repeated video retrieval method incorporating global R features

A near-duplicate video retrieval technology incorporating global features, applied in the field of approximate repeated video retrieval, which addresses the problems of relying on single local texture information, ignoring the global information of feature points, and the resulting low video retrieval accuracy.

Active Publication Date: 2017-05-10
XIAN UNIV OF TECH

AI Technical Summary

Problems solved by technology

[0006] At present, most near-duplicate video retrieval methods are based on local features and the BOF retrieval model; however, these methods use only single local texture information and ignore the global information of feature points, resulting in low video retrieval accuracy.


Embodiment Construction

[0074] The present invention will be described in detail below with reference to the accompanying drawings and specific embodiments.

[0075] As shown in Figure 1, the framework of the approximate repeated video retrieval method fused with global R features can be divided into two parts: an offline part and an online part. The offline part processes the target video database and generates the inverted index table required by the online query; the online part mainly completes the query of the query video against the target video database.

[0076] The processing object of the offline part is the reference video library. It performs key frame extraction, SIFT feature extraction, R feature extraction, feature clustering analysis, and quantization from feature vectors to visual vocabulary, generating the visual vocabulary and the inverted index table used by the online query.
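The quantization and indexing step described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: it assumes a precomputed codebook (the cluster centroids produced by the feature clustering step) and uses a brute-force nearest-centroid search; the function names and data layout are hypothetical.

```python
from collections import defaultdict

def quantize(descriptor, codebook):
    """Assign a feature descriptor to its nearest codebook centroid,
    i.e. map the feature vector to a visual word id."""
    best_word, best_dist = 0, float("inf")
    for word_id, center in enumerate(codebook):
        dist = sum((a - b) ** 2 for a, b in zip(descriptor, center))
        if dist < best_dist:
            best_word, best_dist = word_id, dist
    return best_word

def build_inverted_index(videos, codebook):
    """videos: {video_id: [descriptor, ...]} extracted from key frames.
    Returns {visual_word_id: set of video_ids containing that word},
    the inverted index table consulted by the online query."""
    index = defaultdict(set)
    for video_id, descriptors in videos.items():
        for desc in descriptors:
            index[quantize(desc, codebook)].add(video_id)
    return index
```

In practice the codebook would come from k-means clustering of SIFT descriptors, and the posting lists would also store per-frame occurrence counts for BOF scoring; the sketch keeps only the structural idea.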

[0077] The online part completes the que...



Abstract

The invention discloses an approximate repeated video retrieval method incorporating global R features. The method first extracts local SIFT features from the videos in a database; establishes a global R feature based on the coordinate information of the local SIFT features; uses the descriptor information of the local SIFT features to build a BOF retrieval model; builds a voting retrieval model on top of the BOF model; and finally applies an information fusion strategy to fuse the global geometric distribution information into the BOF model, enabling accurate retrieval of near-duplicate videos in large-scale data.
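The abstract states that the global R feature is built from the coordinate information of the SIFT keypoints, but this excerpt does not give its exact definition. As a hypothetical sketch of such a coordinate-based global descriptor, one could histogram the radial distances of keypoints from their centroid, which captures the global geometric distribution of feature points in a frame; the function below is an assumption for illustration only.

```python
import math

def r_feature(points, n_bins=4):
    """Hypothetical global R feature: a normalized histogram of
    keypoint distances from the keypoint centroid.
    points: list of (x, y) SIFT keypoint coordinates in one frame."""
    cx = sum(x for x, _ in points) / len(points)
    cy = sum(y for _, y in points) / len(points)
    radii = [math.hypot(x - cx, y - cy) for x, y in points]
    r_max = max(radii) or 1.0  # avoid division by zero if all points coincide
    hist = [0] * n_bins
    for r in radii:
        bin_id = min(int(r / r_max * n_bins), n_bins - 1)
        hist[bin_id] += 1
    total = len(points)
    return [h / total for h in hist]
```

A descriptor of this kind is invariant to translation and, because radii are normalized by the maximum, to uniform scaling, which is the sort of global geometric cue a fusion strategy could combine with BOF scores.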

Description

technical field

[0001] The invention belongs to the technical field of video analysis and retrieval methods, and in particular relates to an approximate repeated video retrieval method fused with global R features.

Background technique

[0002] With the rapid development of communication technology, video capture equipment, and video editing software, the number of online videos has grown exponentially. At the same time, video-related services such as advertising, video sharing, recommendation, and monitoring stimulate online users' interest in participating in video-related activities such as searching, uploading, downloading, and commenting.

[0003] Today, a large number of videos are uploaded and shared on the Internet every day, and many of them are near-duplicates. The emergence of large numbers of near-duplicate videos has spawned many new applications, such as video result reordering, copyright protection, online video usag...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F17/30; G06K9/00
CPC: G06F16/783; G06V20/48; G06V20/46
Inventors: 廖开阳, 王玮, 郑元林, 曹从军, 赵凡, 蔺广逢
Owner XIAN UNIV OF TECH