Real-time special effect processing method in large-scale live video

A video live-broadcast and special-effect processing technology, applied in the field of live video, which solves the problems of time-consuming face processing and lack of coverage in the prior art, and achieves a good viewing effect, smooth replacement, and a good user experience.

Pending Publication Date: 2022-04-12
E SURFING VIDEO MEDIA CO LTD

AI Technical Summary

Problems solved by technology

[0003] Face processing in the existing technology takes a long time, and some approaches require manual adjustment before tracking and mosaic processing can begin. Some chat apps with built-in filters can achieve real-time effects only because the number of faces on screen is limited. In large-scale live broadcasts, however, crowds often gather, and the prior art does not cover this scenario.




Embodiment Construction

[0015] The present invention will be described in detail below in conjunction with the accompanying drawings and specific embodiments. The embodiments are carried out on the premise of the technical solution of the present invention, and a detailed implementation and a specific operation process are given, but the protection scope of the present invention is not limited to the following embodiments.

[0016] After the video stream is connected, a deep convolutional network is first used to obtain the face thumbnail. Unlike OpenCV and other models commonly used in the market, the lighter and faster dlib model is used for target recognition. Once a face is detected, a Kalman filter tracks it and outputs its coordinates. A fuzzy (blur) model with an adjustable effect then processes the face thumbnail according to the coordinates. A neural network with three convolutional layers, pooling, and two further convolutional layers is established, in which one layer uses a 7×7 filter with a step...
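As an illustration of the detection, tracking, and blur portion of this step, the following is a minimal single-face sketch assuming dlib and OpenCV. The function name process_frame, the constant-velocity state model, and the Gaussian blur used as a stand-in for the adjustable fuzzy model are assumptions for illustration only, not the patented implementation.

```python
import cv2
import dlib
import numpy as np

# dlib's HOG-based frontal face detector, as referenced in the embodiment
detector = dlib.get_frontal_face_detector()

def make_kalman():
    """Constant-velocity Kalman filter tracking the face centre (x, y)."""
    kf = cv2.KalmanFilter(4, 2)
    kf.transitionMatrix = np.array([[1, 0, 1, 0],
                                    [0, 1, 0, 1],
                                    [0, 0, 1, 0],
                                    [0, 0, 0, 1]], dtype=np.float32)
    kf.measurementMatrix = np.array([[1, 0, 0, 0],
                                     [0, 1, 0, 0]], dtype=np.float32)
    kf.processNoiseCov = np.eye(4, dtype=np.float32) * 1e-2
    kf.errorCovPost = np.eye(4, dtype=np.float32)
    return kf

kf = make_kalman()

def process_frame(frame, blur_strength=31):
    """Detect, track, and blur one face per frame (single-face sketch).

    blur_strength is an odd Gaussian kernel size; larger means a stronger,
    mosaic-like blur.
    """
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = detector(gray, 0)

    # Predict the face centre even when detection misses a frame
    prediction = kf.predict()
    cx, cy = int(prediction[0, 0]), int(prediction[1, 0])
    w = h = 120  # nominal box size when only the prediction is available

    if len(faces) > 0:
        r = faces[0]
        cx, cy = (r.left() + r.right()) // 2, (r.top() + r.bottom()) // 2
        kf.correct(np.array([[np.float32(cx)], [np.float32(cy)]]))
        w, h = r.right() - r.left(), r.bottom() - r.top()

    x0, y0 = max(cx - w // 2, 0), max(cy - h // 2, 0)
    x1, y1 = min(cx + w // 2, frame.shape[1]), min(cy + h // 2, frame.shape[0])
    if x1 > x0 and y1 > y0:
        roi = frame[y0:y1, x0:x1]
        frame[y0:y1, x0:x1] = cv2.GaussianBlur(roi, (blur_strength, blur_strength), 0)
    return frame
```

In a live pipeline this function would be applied to each decoded frame before re-encoding the output stream; extending it to many faces would require one tracker per detected face.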



Abstract

The invention relates to a method for real-time special effect processing in large-scale live video broadcasts. The method comprises the following steps: video stream access, in which the video stream is first fed into a deep convolutional network to recognize faces and obtain face thumbnails; Kalman filtering, applied once a face is detected, to detect and track the face in the video simultaneously and output its coordinates; processing of the tracked face's thumbnail according to the coordinates using an effect-adjustable fuzzy (blur) model; and obtaining a feature tag through the trained model, using the tag to extract the best-matching item in the IP database, calling a faceswap module to carry out face replacement and coverage, and finally generating a new video stream. Face detection, face tracking, and face mosaicing are realized in large-scale live video with many people on camera; the replacement is smooth, the time consumed is short, and a near real-time effect can be achieved under GPU configuration. By combining existing IP resources, the original face is automatically replaced with the face special effect of a popular IP according to the features of the original face data, making the broadcast more attractive and realizing two-way traffic attraction and resource integration.
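For the tag-matching and replacement step described above, the sketch below matches a face feature vector against an in-memory stand-in for the IP database by cosine similarity and hands the best match to a caller-supplied swap function. The feature dimensionality, the database contents, and the function names are assumptions, since the source names a faceswap module without specifying its interface.

```python
import numpy as np

# Hypothetical in-memory "IP database": feature vectors for licensed IP character faces.
# Real entries would come from the trained model, not random data.
ip_database = {
    "ip_character_a": np.random.rand(128).astype(np.float32),
    "ip_character_b": np.random.rand(128).astype(np.float32),
}

def best_ip_match(face_features: np.ndarray) -> str:
    """Return the IP entry whose stored feature vector is closest by cosine similarity."""
    def cosine(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-8))
    return max(ip_database, key=lambda name: cosine(face_features, ip_database[name]))

def replace_face(frame, face_box, face_features, swap_fn):
    """Pick the best-matching IP face and delegate the actual replacement to swap_fn.

    swap_fn stands in for the faceswap module mentioned in the abstract; its real
    interface is not specified in the source, so it is passed in by the caller here.
    """
    ip_name = best_ip_match(face_features)
    return swap_fn(frame, face_box, ip_name)
```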

Description

Technical field

[0001] The invention relates to live video technology, and in particular to a method for real-time special effect processing in large-scale live video.

Background technique

[0002] The live video business is currently expanding rapidly. During live broadcasts, many ordinary people appear in the video, which can leak personal privacy, and the protection of personal privacy has gradually attracted public attention. Live video broadcasting is popular because its scenes are close to everyday life and the public. How to balance personal anonymity and video effect is a problem that needs to be solved.

[0003] Face processing in the existing technology takes a long time, and some approaches require manual adjustment before tracking and mosaic processing can begin. Some chat apps with built-in filters can achieve real-time effects only because the number of faces on screen is limited. In large-scale live broadcasts, however, crowds often gather, and the prior art does not cover this scenario.

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): H04N21/44; H04N21/2187; G06V40/16; G06V10/82; G06N3/04; G06N3/08
Inventors: 宫苏辉, 肖伟, 冯振华
Owner: E SURFING VIDEO MEDIA CO LTD