7042 results about "Model parameters" patented technology

A model parameter is a configuration variable that is internal to the model and whose value can be estimated from data. Parameters are required by the model when making predictions, their values define the skill of the model on your problem, and they are estimated or learned from data.
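A minimal sketch of this definition, assuming a simple linear model y = w*x + b: the parameters w and b are internal to the model, are estimated from data rather than set by hand, and are required at prediction time.

```python
def fit_linear(xs, ys):
    """Estimate the model parameters w (slope) and b (intercept)
    from data by ordinary least squares."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Closed-form least-squares estimates.
    w = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
        / sum((x - mean_x) ** 2 for x in xs)
    b = mean_y - w * mean_x
    return w, b

def predict(w, b, x):
    """The learned parameters are what the model needs to predict."""
    return w * x + b

# Data generated by y = 2x + 1; the fit recovers those parameter values.
w, b = fit_linear([0, 1, 2, 3], [1, 3, 5, 7])
```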

Compressed low-resolution image restoration method based on combined deep network

The present invention provides a compressed low-resolution image restoration method based on a combined deep network, belonging to the field of digital image/video signal processing. The method starts from the joint processing of compression artifacts and downsampling factors in order to restore a degraded image affected by an arbitrary combination of compression artifacts and low resolution. The proposed network comprises 28 convolutional layers arranged in a slender (deep and narrow) structure; following the idea of transfer learning, a pre-trained model is fine-tuned so that this very deep network converges during training, mitigating vanishing and exploding gradients. The network model parameters are set with the aid of feature visualization, and the end-to-end learned mapping between degraded features and ideal features removes the need for separate pre-processing and post-processing. Finally, three important fusions are performed: the fusion of feature maps of the same size, the fusion of residual images, and the fusion of high-frequency information with the initial high-frequency estimate. The method can thereby solve the super-resolution restoration problem for low-resolution images containing compression artifacts.
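The residual-fusion idea the abstract ends on can be sketched in a greatly simplified form (this is illustrative only, not the patented 28-layer network): the restored image is a coarse upsampled estimate plus a residual image, and the fusion is element-wise addition.

```python
def upsample_nearest(img, factor):
    """Nearest-neighbour upsampling of a 2-D image (list of lists),
    standing in for the initial high-resolution estimate."""
    out = []
    for row in img:
        wide = [v for v in row for _ in range(factor)]
        for _ in range(factor):
            out.append(list(wide))
    return out

def fuse(initial, residual):
    """Element-wise fusion of the initial estimate with a residual image."""
    return [[a + b for a, b in zip(ra, rb)]
            for ra, rb in zip(initial, residual)]

low = [[1, 2], [3, 4]]
initial = upsample_nearest(low, 2)        # coarse 4x4 estimate
residual = [[0.5] * 4 for _ in range(4)]  # stands in for the network's output
restored = fuse(initial, residual)
```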
Owner:BEIJING UNIV OF TECH

System and methodology for adaptive, linear model predictive control based on a rigorous, nonlinear process model

A methodology for process modeling and control, and a software system implementing it, comprising a rigorous, nonlinear process simulation model, the generation of appropriate linear models derived from the rigorous model, and an adaptive, linear model predictive controller (MPC) that utilizes the derived linear models. A state-space, multivariable MPC is the preferred choice, since the nonlinear simulation model is analytically translated into a set of linear state equations, which simplifies translating the linearized simulation equations into the modeling format required by the controller. Other MPC modeling forms, such as transfer functions, impulse-response coefficients, and step-response coefficients, may also be used; the methodology is general in that any model predictive controller using one of these forms can serve as the controller. The methodology also includes modules that improve reliability and performance: a data pretreatment module pre-processes the plant measurements for gross-error detection; a data reconciliation and parameter estimation module corrects for instrumentation errors and adjusts model parameters based on current operating conditions; an order reduction module can reduce the full-order state-space model to obtain fewer states for the controller model; and automated MPC tuning further improves control performance.
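One common way to obtain linear state equations from a nonlinear model dx/dt = f(x, u) is numerical linearization at an operating point; the patent describes an analytic translation, so the finite-difference sketch below is only an illustration, and the model `tank` is a hypothetical example.

```python
def linearize(f, x0, u0, eps=1e-6):
    """Finite-difference Jacobians A = df/dx and B = df/du of a
    nonlinear model dx/dt = f(x, u) at the operating point (x0, u0)."""
    n, m = len(x0), len(u0)
    f0 = f(x0, u0)
    A = []
    for k in range(n):
        row = []
        for j in range(n):
            xp = list(x0)
            xp[j] += eps
            row.append((f(xp, u0)[k] - f0[k]) / eps)
        A.append(row)
    B = []
    for k in range(n):
        row = []
        for j in range(m):
            up = list(u0)
            up[j] += eps
            row.append((f(x0, up)[k] - f0[k]) / eps)
        B.append(row)
    return A, B

def tank(x, u):
    """Hypothetical one-state nonlinear model: dx/dt = -x^3 + u."""
    return [-x[0] ** 3 + u[0]]

# Linearizing at x0 = 1, u0 = 0 gives A ~ [[-3]], B ~ [[1]].
A, B = linearize(tank, [1.0], [0.0])
```

The resulting (A, B) pair is exactly the linear state-space form a multivariable MPC consumes.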
Owner:ABB AUTOMATION INC

Single image super-resolution reconstruction method based on conditional generative adversarial network

The invention discloses a single-image super-resolution reconstruction method based on a conditional generative adversarial network. A conditioning input, namely the original real image, is added to the discriminator network of the generative adversarial network, and a deep residual learning module is added to the generator network to learn high-frequency information and alleviate the vanishing-gradient problem. A single low-resolution image to be reconstructed is input into a pre-trained conditional generative adversarial network, and super-resolution reconstruction yields the reconstructed high-resolution image. Training the conditional generative adversarial network model proceeds as follows: the high-resolution and low-resolution training sets are input into the conditional generative adversarial network model, with pre-trained model parameters used as the initialization for training; the convergence of the whole network is judged through a loss function; and when the loss function has converged, the finally trained conditional generative adversarial network model is obtained and its model parameters are stored.
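The training procedure described (initialize from pre-trained parameters, iterate until the loss converges, then keep the final parameters) can be sketched generically. The toy quadratic loss and all names below are illustrative assumptions, not the patent's network.

```python
def train(init_params, loss_fn, grad_fn, lr=0.1, tol=1e-10, max_steps=10000):
    """Gradient-descent loop: start from pre-trained parameters and
    stop when the change in the loss indicates convergence."""
    params = list(init_params)      # pre-trained parameters as initialization
    prev = loss_fn(params)
    for _ in range(max_steps):
        g = grad_fn(params)
        params = [p - lr * gi for p, gi in zip(params, g)]
        cur = loss_fn(params)
        if abs(prev - cur) < tol:   # judge convergence via the loss function
            break
        prev = cur
    return params                   # the stored, finally trained parameters

# Toy example: minimise (p0 - 3)^2 + (p1 + 1)^2 from a "pre-trained" start.
loss = lambda p: (p[0] - 3) ** 2 + (p[1] + 1) ** 2
grad = lambda p: [2 * (p[0] - 3), 2 * (p[1] + 1)]
trained = train([0.5, 0.5], loss, grad)
```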
Owner:NANJING UNIV OF INFORMATION SCI & TECH

A combined deep learning training method based on a privacy protection technology

The invention belongs to the technical field of artificial intelligence and relates to a combined deep learning training method based on privacy protection technology, achieving efficient joint training under privacy protection. Each participant first trains a local model on a private data set to obtain a local gradient, then perturbs the local gradient with Laplace noise, encrypts it, and sends the encrypted gradient to a cloud server. The cloud server performs an aggregation operation over all received local gradients and the ciphertext parameters of the previous round, and broadcasts the resulting ciphertext parameters. Finally, each participant decrypts the received ciphertext parameters and updates its local model for subsequent training. By combining a homomorphic encryption scheme with differential privacy, the method provides secure and efficient deep learning training that preserves the accuracy of the trained model while preventing the server from inferring model parameters or the privacy of the training data, and preventing internal attackers from obtaining private information.
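The Laplace-noise step of the scheme (the differential-privacy half; the homomorphic encryption half is omitted) can be sketched as below. The noise scale sensitivity/epsilon is the standard Laplace-mechanism calibration; parameter names are illustrative.

```python
import math
import random

def laplace_noise(scale, rng):
    """Draw one sample from Laplace(0, scale) by inverse-CDF sampling."""
    u = rng.random() - 0.5
    sign = 1.0 if u >= 0 else -1.0
    return -scale * sign * math.log(1.0 - 2.0 * abs(u))

def perturb_gradient(grad, epsilon, sensitivity, rng):
    """Perturb each gradient component with Laplace noise whose scale
    (sensitivity / epsilon) follows the standard Laplace mechanism."""
    b = sensitivity / epsilon
    return [g + laplace_noise(b, rng) for g in grad]

# A participant's local gradient, perturbed before encryption and upload.
rng = random.Random(42)
noisy = perturb_gradient([0.1, -0.2, 0.3], epsilon=1.0, sensitivity=1.0, rng=rng)
```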
Owner:UNIV OF ELECTRONICS SCI & TECH OF CHINA

Speech recognition method using a long short-term memory (LSTM) recurrent neural network

The invention discloses a speech recognition method using a long short-term memory (LSTM) recurrent neural network, comprising training and recognition. The training process introduces speech data and text data to jointly train acoustic and language models, and an RNN transducer performs decoding to produce the model parameters. The recognition process converts the speech input into a spectrogram via the Fourier transform, uses the LSTM recurrent network to perform beam-search decoding, and finally produces the recognition result. The method adopts recurrent neural networks (RNNs) trained end-to-end with connectionist temporal classification (CTC). LSTM units, which provide both long- and short-term memory, work well and, combined with multi-level representations, prove effective in deep networks. Only one neural network model (an end-to-end model) stands between the speech features (the input) and the character string (the output), and the network can be trained directly on an objective function that serves as a proxy for the word error rate (WER), avoiding wasted effort optimizing separate objective functions.
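The CTC decoding rule used in such end-to-end systems can be shown in a few lines: per-frame label predictions are collapsed by merging repeated labels and removing blanks. This greedy-decoding sketch is illustrative of CTC in general, not the patent's specific network; the blank index is an assumption.

```python
BLANK = 0  # index of the CTC blank label (assumption for this sketch)

def ctc_collapse(path, blank=BLANK):
    """CTC collapsing rule: merge consecutive repeats, then drop blanks."""
    out = []
    prev = None
    for label in path:
        if label != prev and label != blank:
            out.append(label)
        prev = label
    return out

def greedy_decode(frame_probs, blank=BLANK):
    """Greedy CTC decoding: take the argmax label per frame, then collapse."""
    path = [max(range(len(p)), key=p.__getitem__) for p in frame_probs]
    return ctc_collapse(path, blank)
```

For example, the frame-label path [1, 1, 0, 2, 2, 0, 2] collapses to [1, 2, 2]: the blank between the two runs of label 2 is what lets CTC emit the same character twice in a row.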
Owner:SHENZHEN WEITESHI TECH