
Video highlight extraction method and system and storage medium

A technology for extracting highlight clips from video, applied in neural learning methods, character and pattern recognition, and speech analysis. It addresses the time-consuming, labor-intensive nature of manual editing and the low interpretability of end-to-end editing models, achieving strong interpretability and convenient parameter adjustment.

Pending Publication Date: 2022-01-11

AI Technical Summary

Problems solved by technology

[0002] With the development of remote classroom technology, students, teachers, and course sales staff hope to evaluate and promote the highlights of a class through short videos. Manual editing is time-consuming, and because editing standards are unclear, the quality of the edited content is strongly affected by human factors; educational institutions therefore also hope to produce course highlight reels automatically, both to promote their courses and to reduce the cost of manual editing. Existing automatic video editing technologies include end-to-end video editing with neural networks. This approach suits videos with rapid shot changes, such as movies, but for online course videos that emphasize course content, these end-to-end editing models are not very interpretable and must be retrained whenever the requirements for the output videos change, which entails a heavy workload and high cost.




Embodiment Construction

[0056] The following will clearly and completely describe the technical solutions in the embodiments of the present invention with reference to the accompanying drawings in the embodiments of the present invention. Obviously, the described embodiments are only some, not all, embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by persons of ordinary skill in the art without making creative efforts belong to the protection scope of the present invention.

[0057] In order to make the above objects, features and advantages of the present invention more comprehensible, the present invention will be further described in detail below in conjunction with the accompanying drawings and specific embodiments.

[0058] To realize automatic editing of highlight moments, it is first necessary to correctly identify the students' emotions during the course and convert those emotions into a form that can be recognized by the m...
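The excerpt cuts off here, but as a rough, non-authoritative sketch of the kind of per-frame facial analysis the abstract describes, the Python snippet below samples frames from a target video and collects face crops for an expression model. It assumes OpenCV is available; `score_expression` is a hypothetical placeholder for whatever trained expression classifier the actual system uses, and none of the names come from the patent.

```python
# Minimal sketch (not the patent's actual implementation): sample frames from a
# target video and gather per-frame facial-expression scores for the students.
import cv2

def score_expression(face_img):
    # Placeholder: a real system would run a trained expression classifier here
    # and return, e.g., the probability of a "positive/engaged" expression.
    return 0.0

def extract_visual_features(video_path, sample_every_n=10):
    # Haar cascade face detector shipped with OpenCV; good enough for a sketch.
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    cap = cv2.VideoCapture(video_path)
    features = []  # one entry per sampled frame
    frame_idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if frame_idx % sample_every_n == 0:
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
            scores = [score_expression(frame[y:y + h, x:x + w])
                      for (x, y, w, h) in faces]
            features.append({"frame": frame_idx, "expression_scores": scores})
        frame_idx += 1
    cap.release()
    return features
```

Eye-movement attention and gesture/body-movement analysis would feed into the same per-frame feature set in an analogous way.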



Abstract

The invention relates to a video highlight extraction method and system and a storage medium. The method comprises the following steps: acquiring a to-be-processed network class video and teacher-student interaction characteristics, and dividing the to-be-processed network class video into a plurality of target videos; performing facial expression analysis, eye movement attention analysis, and gesture and body movement analysis on the pictures corresponding to all the frames of each target video to obtain a visual feature set of the students and a visual feature set of the teacher for those frames; determining the timeliness of student feedback according to the time interval between the voice segment corresponding to the student and the voice segment corresponding to the teacher in the audio of the target video; performing voice recognition and keyword extraction on the voice segments corresponding to the students and the teacher to determine the fluency of the teacher's speech, the fluency of the students' speech, and the correctness of the taught knowledge; and determining a highlight in the to-be-processed network class video according to the priority of each target video. The method and system have high interpretability, and a user can conveniently adjust parameters according to requirements.
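To make the scoring flow in the abstract concrete, here is a minimal, hypothetical sketch of how the per-segment features might be combined into a priority and the top segments picked as highlights. The feature names, the timeliness formula, and the weights are illustrative assumptions rather than the patent's actual parameters; the adjustable `weights` tuple only illustrates the abstract's point that a user can conveniently tune parameters to their requirements.

```python
# Minimal sketch of the scoring/selection flow described in the abstract; all
# names, formulas, and weights are illustrative assumptions, not the patent's.
from dataclasses import dataclass

@dataclass
class SegmentFeatures:
    start: float                 # segment start time (s) in the full video
    end: float                   # segment end time (s)
    visual_score: float          # aggregated expression/attention/gesture score
    teacher_end: float           # end time of the teacher's utterance (s)
    student_start: float         # start time of the student's reply (s)
    teacher_fluency: float       # 0..1, from speech recognition
    student_fluency: float       # 0..1, from speech recognition
    knowledge_correct: float     # 0..1, from keyword extraction

def timeliness(teacher_end, student_start, max_gap=5.0):
    # Faster student feedback (smaller gap) -> higher score, clipped to [0, 1].
    gap = max(0.0, student_start - teacher_end)
    return max(0.0, 1.0 - gap / max_gap)

def priority(seg, weights=(0.4, 0.2, 0.15, 0.15, 0.1)):
    # Weighted combination of the visual and audio-derived features.
    w_vis, w_time, w_tf, w_sf, w_kc = weights
    return (w_vis * seg.visual_score
            + w_time * timeliness(seg.teacher_end, seg.student_start)
            + w_tf * seg.teacher_fluency
            + w_sf * seg.student_fluency
            + w_kc * seg.knowledge_correct)

def pick_highlights(segments, top_k=3):
    # Rank target videos by priority and return the top_k as highlight clips.
    return sorted(segments, key=priority, reverse=True)[:top_k]
```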

Description

Technical field
[0001] The invention relates to the field of video analysis and clipping, and in particular to a method, system and storage medium for extracting highlights of a video.
Background technique
[0002] With the development of remote classroom technology, students, teachers, and course sales staff hope to evaluate and promote the highlights of a class through short videos. Manual editing is time-consuming, and because editing standards are unclear, the quality of the edited content is strongly affected by human factors; educational institutions therefore also hope to produce course highlight reels automatically, both to promote their courses and to reduce the cost of manual editing. Existing automatic video editing technologies include end-to-end video editing with neural networks. This approach suits videos with rapid shot changes, such as movies, but for online course videos that emphasize course content, these end-to-end editing models are not very interpretable and must be retrained whenever the requirements for the output videos change, which entails a heavy workload and high cost.


Application Information

IPC(8): G06V40/10; G06V40/16; G06V40/20; G06V10/40; G06N3/04; G06N3/08; G10L15/04; G10L15/26
CPC: G06N3/08; G10L15/26; G10L15/04; G06N3/045; G06V40/176; G06V40/20; G06V40/18; G06V20/52; G06V20/46; G06V20/49; G06V20/41; G06V10/806; G06F18/253; G06V40/169; G10L15/005; G10L15/08; G10L2015/088
Inventor: 罗冠, 吴超尘, 凌淳, 仲燕杰
Owner: 北京领格卓越科技有限公司