Human face key point tracking method, application and device thereof
A technology concerning face key points and their corresponding positions, applied in the field of image processing, which can solve problems such as shaking of face key points during tracking
Active Publication Date: 2018-02-16
WUHAN DOUYU NETWORK TECH CO LTD
AI Technical Summary
Problems solved by technology
[0004] By providing a face key point tracking method, application and device, the embodiments of the present invention solve the technical problem that face key points located by existing face key point positioning algorithms shake during video tracking.
Method used
Examples
Embodiment 1
[0082] S1031. Obtain each pixel in the current video image frame.

[0083] S1032. According to the difference between each pixel in the current video image frame and the pixel at the corresponding position in the previous video image frame, determine the first pixel difference mean of the current video image frame relative to the previous video image frame.

[0084] Specifically, calculate the pixel difference between each pixel in the current video image frame and the pixel at the corresponding position in the previous video image frame, and take the average of these pixel differences as the first pixel difference mean of the current video image frame relative to the previous video image frame.

[0085] S1033. Determine whether the first pixel difference mean is greater than the pixel difference threshold.

[0086] The value of the pixel difference threshold needs to be set according to the actual situation, for example, according to th...
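Steps S1031 to S1033 can be sketched as follows. This is a minimal illustration, not the patent's implementation: the function name and the placeholder threshold value are assumptions (the patent only says the threshold "needs to be set according to the actual situation").

```python
import numpy as np

def face_moved_by_pixel_diff(curr_frame, prev_frame, pixel_diff_threshold=10.0):
    """Sketch of steps S1031-S1033: decide whether the scene has moved by
    comparing the mean per-pixel difference of two consecutive frames.

    curr_frame, prev_frame: grayscale frames as equal-shape uint8 arrays.
    pixel_diff_threshold: tuning parameter; 10.0 is only a placeholder.
    """
    # S1031/S1032: per-pixel absolute difference, then its mean over all pixels
    diff = np.abs(curr_frame.astype(np.float32) - prev_frame.astype(np.float32))
    first_mean = float(diff.mean())
    # S1033: the face is treated as having moved when the mean exceeds the threshold
    return first_mean > pixel_diff_threshold, first_mean

prev = np.zeros((4, 4), dtype=np.uint8)
curr = np.full((4, 4), 20, dtype=np.uint8)
moved, mean_diff = face_moved_by_pixel_diff(curr, prev)
# mean_diff is 20.0, so moved is True for the placeholder threshold of 10.0
```

Converting to float32 before subtracting avoids uint8 wrap-around for pixels that got darker between frames.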
Embodiment 2

[0089] In Embodiment 2, whether the human face has moved is judged according to the positions of the face key points located in the current video image frame and in the previous video image frame. The implementation specifically includes the following steps:

[0090] S1031'. According to the difference between each face key point located in the current video image frame and the effective face key point at the corresponding position in the previous video image frame, determine the second pixel difference mean of the current video image frame relative to the previous video image frame.
[0091] Specifically, the second pixel difference mean is calculated as:

[0092] L = (1/m) · Σ_{i=0}^{m-1} √( (A.x₂(i) − A.x₁(i))² + (A.y₂(i) − A.y₁(i))² )

[0093] where L is the second pixel difference mean, m is the number of face key points, i ranges over [0, m−1], and A.x₂(i) is the x-coordinate of the i-th effective face key point in the last video im...
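The mean displacement L can be sketched as below. Note the extract does not reproduce the patent's formula image, so the Euclidean (straight-line) distance per key point is an assumption; the function and argument names are illustrative.

```python
import math

def second_pixel_diff_mean(curr_points, prev_points):
    """Sketch of step S1031': mean displacement of face key points
    between consecutive frames.

    curr_points: key points (x, y) located in the current frame.
    prev_points: effective key points (x, y) of the previous frame,
                 matched by index (corresponding positions).
    Euclidean distance per point is assumed, since the patent's
    formula image is not reproduced in this extract.
    """
    m = len(curr_points)  # m = number of face key points
    total = 0.0
    for i in range(m):  # i ranges over [0, m-1] as in the patent text
        x1, y1 = curr_points[i]
        x2, y2 = prev_points[i]
        total += math.hypot(x2 - x1, y2 - y1)
    return total / m  # L, the second pixel difference mean

# Two key points, each displaced by a (3, 4) offset: L = 5.0
L = second_pixel_diff_mean([(0, 0), (1, 1)], [(3, 4), (4, 5)])
```

As in Embodiment 1, L would then be compared against a threshold to decide whether the face has moved.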
Abstract
The invention discloses a human face key point tracking method, an application and a device thereof. The method comprises the following steps: acquiring a current video image frame; locating the human face key points in the current video image frame; and judging whether the human face in the current video image frame has moved relative to the human face in the previous video image frame. If the human face has moved, the human face key points located in the current video image frame are determined as the effective human face key points of the current video image frame. If the human face has not moved, the weighted sum of each human face key point located in the current video image frame and the effective human face key point at the corresponding position in the previous video image frame is determined as the effective human face key point of the current video image frame. According to the invention, the shaking of human face key points in a video during the tracking process is completely avoided.
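The update rule described in the abstract can be sketched as follows. The function name and the `weight` parameter are illustrative assumptions; the patent's abstract specifies a weighted sum but not the weight value.

```python
def track_effective_keypoints(located_points, prev_effective, face_moved, weight=0.5):
    """Sketch of the abstract's per-frame update rule:
    - if the face moved, the freshly located key points are used as-is;
    - otherwise, each point is a weighted sum with the previous frame's
      effective key point at the corresponding position, which suppresses
      frame-to-frame jitter ("shaking") of the key points.

    located_points / prev_effective: index-matched lists of (x, y) tuples.
    weight: blending factor for the current frame; 0.5 is a placeholder.
    """
    if face_moved or prev_effective is None:
        # Face moved (or first frame): located points become the effective points
        return list(located_points)
    # Face static: blend each located point with the previous effective point
    return [
        (weight * x + (1 - weight) * px, weight * y + (1 - weight) * py)
        for (x, y), (px, py) in zip(located_points, prev_effective)
    ]
```

With `face_moved=False`, a located point (2, 2) blended against a previous effective point (0, 0) at weight 0.5 yields (1.0, 1.0); with `face_moved=True` it stays (2, 2).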
Description
Technical Field

[0001] The invention relates to the field of image processing, and in particular to a method, application and device for tracking key points of a human face.

Background

[0002] Recently, cascaded shape regression models have made a major breakthrough in the task of face key point localization. This type of method uses a regression model to directly learn the mapping function from a face image to the face key point positions, thereby establishing the correspondence relation from input to output. Such methods are simple and efficient, and have achieved good key point positioning results both in controllable scenes (faces collected under laboratory conditions) and in uncontrollable scenes (network face images, etc.). In addition, facial feature point localization methods based on deep learning have also achieved impressive results.

[0003] Although relatively mature face key point positioning algorithms already exist, the current face key point pos...
Claims