
Three-dimensional attitude classification method based on two-dimensional key points and related device

A three-dimensional pose classification technology applied in the field of image processing. It addresses problems such as high cost, large data requirements, and demanding input-camera requirements, and achieves savings in computation and cost, with good economy and practicality.

Pending Publication Date: 2022-02-18
深圳市心象智能科技有限公司

AI Technical Summary

Problems solved by technology

[0003] Direct estimation method: three-dimensional key points are estimated directly from the image by an end-to-end network, which requires the input to carry 3D annotations. This approach has few constraints, but because both the input and the training set must carry 3D annotations, it places high demands on the input camera and is costly. Such data sets are rare and captured in controlled environments, so the trained models often do not apply to real-world scenarios.
[0004] Two-stage estimation method: first obtain the two-dimensional key point information of the target object, then feed the two-dimensional key point sequence into a subsequent network that "lifts" it to three-dimensional key points, i.e. outputs the three-dimensional pose. The two-dimensional key point sequence may include additional information, such as 3D annotations, 2D key points at different times, or 2D key points from different viewing angles. In human-body pose applications, a parametric human body model can also be used: the model is instantiated from the 2D human key point information, and its projection is then optimized to fit the 2D key points. In summary, the two-stage estimation method first obtains a 2D key point sequence and then estimates 3D key points from it. Its accuracy and effect are relatively good, its data set requirements are broad, and its application range is wide; however, the second stage, lifting 2D key points to 3D, requires a large amount of data, so the computation is heavy and slow.
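To make the two-stage approach concrete, the following is a minimal illustrative sketch (not the patent's implementation): stage 1 stands in for a trained 2D keypoint detector, and stage 2 "lifts" the flattened 2D keypoints to 3D with a one-hidden-layer MLP. The 17-joint layout and the random weights are assumptions for illustration only; a real system would use trained networks.

```python
import numpy as np

N_JOINTS = 17  # common human-body keypoint count; an assumption for this sketch

def stage1_detect_2d(image):
    """Stand-in for a trained 2D keypoint detector: returns (N_JOINTS, 2)."""
    rng = np.random.default_rng(0)
    return rng.uniform(0.0, 1.0, size=(N_JOINTS, 2))

def stage2_lift_to_3d(kpts_2d, w1, b1, w2, b2):
    """Lift flattened 2D keypoints to 3D with a one-hidden-layer MLP."""
    x = kpts_2d.reshape(-1)            # (N_JOINTS * 2,)
    h = np.maximum(w1 @ x + b1, 0.0)   # ReLU hidden layer
    return (w2 @ h + b2).reshape(N_JOINTS, 3)

# Untrained, randomly initialized weights, just to show the data flow.
rng = np.random.default_rng(1)
w1 = rng.normal(size=(64, N_JOINTS * 2)); b1 = np.zeros(64)
w2 = rng.normal(size=(N_JOINTS * 3, 64)); b2 = np.zeros(N_JOINTS * 3)

kpts_2d = stage1_detect_2d(None)
kpts_3d = stage2_lift_to_3d(kpts_2d, w1, b1, w2, b2)
print(kpts_2d.shape, kpts_3d.shape)  # (17, 2) (17, 3)
```

The heavy cost described above lives in stage 2: the lifting network must learn depth from large amounts of paired 2D/3D data.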



Examples


Embodiment 1

[0134] See Figure 1, a schematic flowchart of the three-dimensional pose classification method based on two-dimensional key points of a target object, provided in the first embodiment of this application. This embodiment applies to performing three-dimensional pose classification from the two-dimensional key point information of a target object in a two-dimensional image. The method may be performed by a three-dimensional pose classification device based on two-dimensional key points of a target object; the device may be implemented in software and/or hardware and integrated in an electronic device, such as a mobile terminal or a computer. As shown in Figure 1, the three-dimensional pose classification method provided in this embodiment may include:

[0135] Identify the target object area where the target obj...

Embodiment 2

[0155] This embodiment refines and improves Embodiment 1. Embodiment 1 uses a three-key-point relative expression structure. Owing to its invariance to translation and rotation, this expression is very sensitive to changes in the depth direction, but less sensitive to changes in the horizontal and vertical directions. To achieve better three-dimensional pose perception, with accurate perception in all three directions (horizontal, vertical, and depth), this embodiment adds a body-coordinate normalization structure expression; see the network structure model in Figure 7. The resulting three-dimensional pose classification method based on two-dimensional key points of a target object may include:

[0156] Identify the target object area where the target object is located from a target two-dimensional image, that is, determine the area of the target object to be classified, and obtain the two-dime...
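The two feature expressions described in this embodiment can be sketched as follows. This is an illustrative interpretation, not the patent's code: `triple_angle` computes the angle at the middle joint of a three-key-point triple, which is invariant to 2D translation and rotation (matching the relative expression of Embodiment 1), and `body_normalize` re-expresses keypoints in a body frame (root-centred, scaled by a reference bone), which is one plausible form of the body-coordinate normalization added here. The root index and scaling pair are assumptions.

```python
import numpy as np

def triple_angle(a, b, c):
    """Angle at b formed by keypoints a-b-c; invariant to translation/rotation."""
    u, v = a - b, c - b
    cos = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.arccos(np.clip(cos, -1.0, 1.0))

def body_normalize(kpts, root_idx, scale_pair):
    """Express keypoints in a body frame: root-centred, scaled by a reference bone."""
    i, j = scale_pair
    scale = np.linalg.norm(kpts[i] - kpts[j])
    return (kpts - kpts[root_idx]) / scale

# Toy 2D keypoints: the angle at kpts[1] formed by kpts[0] and kpts[2].
kpts = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [2.0, 1.0]])
theta = triple_angle(kpts[0], kpts[1], kpts[2])
normed = body_normalize(kpts, root_idx=1, scale_pair=(0, 3))
print(round(float(np.degrees(theta)), 1))  # 90.0
```

The angle feature is unchanged if all keypoints are rotated or shifted together, which is exactly why (as noted above) it perceives depth-driven shape changes well but is less sensitive to pure horizontal/vertical shifts; the normalized coordinates recover that sensitivity.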

Embodiment 3

[0185] Figure 2 is a schematic structural diagram of a three-dimensional pose classification device based on two-dimensional key points of a target object, provided in the third embodiment of this application. The device can execute the three-dimensional pose classification method provided by the embodiments of this application, with the corresponding functional modules and effects. As shown in Figure 2, the three-dimensional pose classification apparatus 200 based on two-dimensional key points may include:

[0186] A two-dimensional key point information acquisition module 201, configured to acquire two-dimensional key point information in a target two-dimensional image through a pre-trained deep learning network;

[0187] A key point structure feature determining module 202, configured to determine the key point structure feature of the target object according to the two-dimensional key point information;

[0188] The ...



Abstract

The invention belongs to the technical field of image processing and provides a three-dimensional pose classification method based on two-dimensional key points, and a related device, so that the three-dimensional pose classification of a target object can be obtained from a two-dimensional image of the target object alone. The method mainly comprises the steps of: obtaining two-dimensional key point information in a target two-dimensional image through a pre-trained deep learning network; determining key point structure features of the target object according to the two-dimensional key point information; and inferring the three-dimensional pose classification of the target object from the key point structure features.
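The three steps of the abstract can be sketched end-to-end as below. This is a hedged illustration under stated assumptions, not the patent's method: the detector is a random stand-in for the pre-trained network, pairwise distances stand in for the key point structure features, the linear classifier is untrained, and the pose class names are hypothetical examples.

```python
import numpy as np

POSE_CLASSES = ["standing", "sitting", "lying"]  # hypothetical labels

def get_2d_keypoints(image, n_joints=17):
    """Step 1 stand-in: a pre-trained network would produce these."""
    rng = np.random.default_rng(42)
    return rng.uniform(0.0, 1.0, size=(n_joints, 2))

def structure_features(kpts):
    """Step 2 stand-in: all pairwise inter-keypoint distances."""
    d = np.linalg.norm(kpts[:, None, :] - kpts[None, :, :], axis=-1)
    return d[np.triu_indices(len(kpts), k=1)]

def classify(features, weights):
    """Step 3 stand-in: a linear classifier over the structure features."""
    logits = weights @ features
    return POSE_CLASSES[int(np.argmax(logits))]

kpts = get_2d_keypoints(None)
feats = structure_features(kpts)
rng = np.random.default_rng(7)
w = rng.normal(size=(len(POSE_CLASSES), feats.size))
print(classify(feats, w) in POSE_CLASSES)  # True
```

Because classification replaces full 3D keypoint regression, the final stage needs only a small feature-to-class mapping, which is consistent with the computation and cost savings claimed above.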

Description

technical field [0001] The embodiments of this application belong to the technical field of image processing, and in particular relate to a three-dimensional pose classification method based on two-dimensional key points and a related device. Background technique [0002] A two-dimensional image captured by a camera or video camera loses depth information, so a computer processing the image cannot directly perceive the three-dimensional pose of a target object. However, industries such as security monitoring, autonomous driving, smart elderly care, and aviation and airports have a large demand for computer recognition of the three-dimensional pose of people, vehicles, aircraft, and so on in two-dimensional images, and even for recognition of physical-space orientation, such as the spatial orientation of corridors, tunnels, and roads. In order to enable the computer to dir...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06T7/73; G06V10/764; G06V10/774; G06V10/82; G06V40/10; G06N3/04; G06N3/08; G06K9/62
CPC: G06T7/73; G06N3/084; G06T2207/20081; G06T2207/20084; G06T2207/30196; G06N3/047; G06N3/048; G06N3/045; G06F18/2415; G06F18/214
Inventor: 谭品超
Owner: 深圳市心象智能科技有限公司