
Rocket booster separation motion parameter measurement method based on deep learning

A parameter-measurement and separation-motion technology, applied to neural learning methods, instruments, image analysis, etc., which addresses problems such as interference with geometry-based methods, degraded measurement accuracy, and the inability of 3D object detection networks to complete the attitude-measurement task, achieving the effect of fast operation speed.

Pending Publication Date: 2022-06-24
BEIJING INSTITUTE OF TECHNOLOGY +1

AI Technical Summary

Problems solved by technology

Second, when the rocket and booster separate, fire, smoke and similar effects are generated, and the image quality captured by the camera changes accordingly, which degrades the measurement accuracy.
Third, complex backgrounds also strongly interfere with methods based on geometric features.
Moreover, the attitude of the rocket booster requires three degrees of freedom to describe, a task that a conventional 3D object detection network cannot complete.




Embodiment Construction

[0023] In order to make the purpose, features and advantages of the present invention more apparent and understandable, the technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings. Obviously, the embodiments described below are only some of the embodiments of the present invention, not all of them. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention without creative work fall within the protection scope of the present invention.

[0024] (1) Preprocess the received image or video data.

[0025] An input image I ∈ ℝ^(W×H×3) is received, where ℝ^(W×H×3) denotes the set of image pixel values, W is the image width and H is the image height. The image is first normalized with x_i′ = (x_i − μ)/σ, where x_i represents a pixel value of the image, μ is the mean over all samples, and σ is the standard deviation of a...
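The normalization formula itself does not survive the text extraction; from the surrounding definitions it is presumably the standard standardization x_i′ = (x_i − μ)/σ. A minimal sketch of this preprocessing step, assuming NumPy and per-channel statistics (the variable names are illustrative, not taken from the patent):

```python
import numpy as np

def standardize_image(image: np.ndarray, mu: np.ndarray, sigma: np.ndarray) -> np.ndarray:
    """Standardize an input image I in R^(W x H x 3).

    mu and sigma are assumed to be the per-channel mean and standard
    deviation estimated over the training samples; the patent text is
    truncated before it specifies exactly how they are computed.
    """
    image = image.astype(np.float32)
    # x_i' = (x_i - mu) / sigma, applied element-wise
    return (image - mu) / sigma
```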



Abstract

The invention relates to a rocket booster separation motion parameter measurement method based on deep learning. The method comprises the following steps: carrying out normalization and standardization preprocessing on received image or video data; inputting the preprocessed data into a feature extraction network and down-sampling the picture to obtain a feature map; regressing the center point, scale, depth and attitude respectively with a decoding network consisting of convolutional layers and activation functions to obtain the output; calculating the three-dimensional position of the center of the rocket booster; if the received data is a video or a time-ordered image sequence, smoothing the predicted three-dimensional position and attitude; and calculating the velocity and angular velocity of the rocket booster at separation. The obtained data is input into the network model to realize an end-to-end, single-step pose measurement algorithm that requires no post-processing such as non-maximum suppression, which improves the operation speed of the algorithm; the motion parameters of rocket booster separation are then obtained by calculation. Compared with the prior art, the method does not need specific geometric features for pose measurement, requires fewer known conditions, and offers high processing speed, high precision and strong robustness in complex environments.
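As a rough illustration of the pipeline the abstract describes (a feature extraction network that down-samples the image, convolution-plus-activation decoding heads that regress center point, scale, depth and attitude, and finite-difference motion parameters computed from the smoothed pose sequence), a hedged PyTorch-style sketch is given below. The backbone, head layout, channel counts and angle parameterization are assumptions made for illustration and are not specified by this summary.

```python
import torch
import torch.nn as nn

class DecodingHeads(nn.Module):
    """Per-task regression heads built from convolution + activation layers,
    applied to the down-sampled feature map. Channel counts and the sigmoid
    on the center heatmap are assumptions, not taken from the patent."""

    def __init__(self, in_ch: int = 64):
        super().__init__()

        def head(out_ch: int) -> nn.Sequential:
            return nn.Sequential(
                nn.Conv2d(in_ch, in_ch, kernel_size=3, padding=1),
                nn.ReLU(inplace=True),
                nn.Conv2d(in_ch, out_ch, kernel_size=1),
            )

        self.center = head(1)    # heatmap of the booster center point
        self.scale = head(2)     # 2D scale of the target
        self.depth = head(1)     # depth of the center point
        self.attitude = head(3)  # three attitude angles (3 degrees of freedom)

    def forward(self, feat: torch.Tensor) -> dict:
        return {
            "center": torch.sigmoid(self.center(feat)),
            "scale": self.scale(feat),
            "depth": self.depth(feat),
            "attitude": self.attitude(feat),
        }


def motion_parameters(positions: torch.Tensor, attitudes: torch.Tensor, dt: float):
    """Finite-difference estimate of separation velocity and angular velocity
    from a smoothed sequence of 3D center positions (T x 3) and attitude
    angles (T x 3), assumed to be sampled at a fixed frame interval dt."""
    velocity = (positions[1:] - positions[:-1]) / dt
    angular_velocity = (attitudes[1:] - attitudes[:-1]) / dt
    return velocity, angular_velocity
```

Because each quantity is read directly from the regressed maps at the detected center location, no non-maximum suppression or other post-processing is needed, which is consistent with the single-step, end-to-end character claimed in the abstract.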

Description

Technical Field

[0001] The invention relates to the technical field of computer image processing, and in particular to a method for measuring the separation motion parameters of a rocket booster based on deep learning.

Background Technique

[0002] When a rocket is launched, the rocket booster provides power for the rocket to reach a predetermined speed and then quickly separates from the rocket body. Accurately measuring the motion parameters of the booster after separation can provide data support for predicting the landing position of the booster and recovering it, and can also be used to judge whether the separation was successful. Vision-based motion parameter measurement requires no contact with the measured object, offers high measurement accuracy, and has been widely used in public health care, aerospace, automotive electronics and other fields. The measurement of target pose parameters is the core content of vision-based motion parameter measurement, and is ...


Application Information

Patent Type & Authority Applications(China)
IPC IPC(8): G06T7/60G06T7/73G06N3/04G06N3/08
CPCG06T7/60G06T7/73G06N3/08G06T2207/20081G06T2207/20084G06T2207/30164G06N3/048G06N3/045
Inventor 宫久路谌德荣王泽鹏刘邵荣
Owner BEIJING INSTITUTE OF TECHNOLOGYGY