1173 results about "Low resource" patented technology

Answer: The resources a computer has are mainly processing speed, hard disk storage, and memory. The phrase "low on resources" usually means the computer is running out of memory. The best way to prevent this error is to install more RAM on the machine.

Scalable low resource dialog manager

Inactive · US6513009B1 · Easy to carry · Easily scalable architecture · Speech recognition · Spoken language · Data set
A spoken language interface between a user and at least one application or system includes a dialog manager operatively coupled to the application or system, an audio input system, an audio output system, a speech decoding engine and a speech synthesizing engine; and at least one user interface data set operatively coupled to the dialog manager, the user interface data set representing spoken language interface elements and data recognizable by the application. The dialog manager enables connection between the audio input system and the speech decoding engine such that a spoken utterance provided by the user is passed from the audio input system to the speech decoding engine. The speech decoding engine decodes the spoken utterance to generate a decoded output, which is returned to the dialog manager. The dialog manager uses the decoded output to search the user interface data set for a corresponding spoken language interface element and its data, which are returned to the dialog manager when found, and provides the spoken language interface element's associated data to the application for processing. The application, on processing that element, provides a reference to an interface element to be spoken. The dialog manager then enables connection between the audio output system and the speech synthesizing engine such that the speech synthesizing engine, accepting data from that element, generates a synthesized output that expresses it, and the audio output system audibly presents the synthesized output to the user.
Owner:NUANCE COMM INC
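The decode → look up → dispatch → synthesize loop the abstract describes can be sketched as follows. This is a minimal illustration, not the patented implementation: every class, function, and data-set shape here is a hypothetical stand-in, since the patent does not specify an API.

```python
# Toy sketch of the dialog-manager loop: route a decoded utterance to a
# table of spoken-language interface elements, forward the match to the
# application, and synthesize the element the application returns.
class DialogManager:
    def __init__(self, decoder, synthesizer, ui_data, app):
        self.decoder = decoder          # stands in for the speech decoding engine
        self.synthesizer = synthesizer  # stands in for the speech synthesizing engine
        self.ui_data = ui_data          # {decoded text: interface element} (assumed shape)
        self.app = app                  # target application callback

    def handle_utterance(self, audio):
        decoded = self.decoder(audio)
        element = self.ui_data.get(decoded)    # search the UI data set
        if element is None:
            return self.synthesizer("Sorry, I did not understand.")
        reply_element = self.app(element)      # app returns the element to be spoken
        return self.synthesizer(reply_element)

# Toy wiring: lowercase "decoder", tagging "synthesizer", echoing application.
dm = DialogManager(
    decoder=lambda audio: audio.lower(),
    synthesizer=lambda text: f"<speak>{text}</speak>",
    ui_data={"open mail": "MAIL_OPEN"},
    app=lambda element: f"done: {element}",
)
print(dm.handle_utterance("Open Mail"))  # prints <speak>done: MAIL_OPEN</speak>
```

The point of the structure is that the manager owns only routing; decoding, lookup data, and synthesis are pluggable, which is what makes the architecture scalable across applications.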

Design method of hardware accelerator based on LSTM recursive neural network algorithm on FPGA platform

The invention discloses a method for accelerating an LSTM neural network algorithm on an FPGA platform. The FPGA (field-programmable gate array) platform comprises a general-purpose processor, a field-programmable gate array body and a storage module. The method comprises the following steps: an LSTM neural network is constructed using TensorFlow, and the parameters of the neural network are trained; the parameters of the LSTM network are compressed by a compression means, solving the problem of insufficient FPGA storage resources; according to the prediction process of the compressed LSTM network, the calculation part suitable for running on the field-programmable gate array platform is determined; according to the determined calculation part, a software-and-hardware collaborative calculation mode is determined; and according to the calculation logic resources and bandwidth of the FPGA, the number and type of IP core firmware are determined, and acceleration is carried out on the field-programmable gate array platform using hardware operation units. A hardware processing unit for accelerating the LSTM neural network can thus be quickly designed according to the available hardware resources, and the processing unit offers high performance and low power consumption compared with the general-purpose processor.
Owner:SUZHOU INST FOR ADVANCED STUDY USTC
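The two ingredients in the abstract — compressing trained weights to fit FPGA storage, and the per-timestep LSTM computation the accelerator executes — can be sketched in software. The patent does not name its compression means; simple 8-bit linear quantization is used below purely as an illustrative stand-in, and all dimensions are made up.

```python
import numpy as np

def quantize_int8(w):
    """Illustrative weight compression: symmetric 8-bit linear quantization."""
    scale = np.abs(w).max() / 127.0
    return np.round(w / scale).astype(np.int8), scale

def dequantize(q, scale):
    return q.astype(np.float32) * scale

def lstm_step(x, h, c, W, U, b):
    """One LSTM time step; W, U, b stack the i, f, o, g gates row-wise."""
    z = W @ x + U @ h + b
    i, f, o, g = np.split(z, 4)
    i, f, o = (1.0 / (1.0 + np.exp(-v)) for v in (i, f, o))  # sigmoid gates
    g = np.tanh(g)
    c_new = f * c + i * g            # cell-state update
    h_new = o * np.tanh(c_new)       # hidden-state output
    return h_new, c_new

rng = np.random.default_rng(0)
n, d = 4, 3                                   # hidden size, input size (arbitrary)
W = rng.standard_normal((4 * n, d))
U = rng.standard_normal((4 * n, n))
b = np.zeros(4 * n)
Wq, s = quantize_int8(W)                      # compressed copy, as stored on-chip
h, c = np.zeros(n), np.zeros(n)
h, c = lstm_step(rng.standard_normal(d), h, c, dequantize(Wq, s), U, b)
print(h.shape)  # prints (4,)
```

On the FPGA, the matrix–vector products in `lstm_step` are the "calculation part suitable for running on the platform"; control flow and data movement stay on the general-purpose processor in the collaborative mode.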

Purpose domain for in-kernel virtual machine for low overhead startup and low resource usage

Embodiments of the present invention provide an architecture for securely and efficiently executing byte code generated from a general-purpose programming language. In particular, a computer system is divided into a hierarchy comprising multiple types of virtual machines. A thin layer of software, known as a virtual machine monitor, virtualizes and emulates the hardware of the computer system to form a first type of virtual machine. This first type of virtual machine implements a virtual operating domain that can run its own operating system. Within a virtual operating domain, a byte code interpreter may further implement a second type of virtual machine that executes byte code generated from a program written in a general-purpose programming language. The byte code interpreter is incorporated into the operating system running in the virtual operating domain. The byte code interpreter implementing the virtual machine that executes byte code may be divided into a kernel component and one or more user-level components. The kernel component of the virtual machine is integrated into the operating system kernel. The user-level component supports execution of an applet and couples the applet to the operating system. In addition, an operating system running in a virtual operating domain may be configured as a special-purpose operating system that is optimized for the functions of a particular byte code interpreter.
Owner:RED HAT
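The "second type of virtual machine" — a byte code interpreter — is, at its core, a dispatch loop over an instruction stream. Below is a toy stack-based interpreter that illustrates only that loop; the patent's interpreter runs partly in-kernel and executes a real byte code format, neither of which is modeled here.

```python
# Minimal stack-based byte-code interpreter: fetch an opcode, dispatch,
# manipulate an operand stack. Opcodes and encoding are invented for
# illustration.
PUSH, ADD, MUL, RET = range(4)

def run(code):
    stack, pc = [], 0
    while pc < len(code):
        op = code[pc]
        pc += 1
        if op == PUSH:               # PUSH takes an inline operand
            stack.append(code[pc])
            pc += 1
        elif op == ADD:
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == MUL:
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
        elif op == RET:              # return top of stack
            return stack.pop()

# (2 + 3) * 4
print(run([PUSH, 2, PUSH, 3, ADD, PUSH, 4, MUL, RET]))  # prints 20
```

Splitting such an interpreter into a kernel component (the dispatch loop and memory management) and user-level components (applet loading and OS coupling) is the division the abstract describes.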

An easily realized method and device for all-digital frequency conversion

The invention discloses an all-digital frequency conversion method, and a device thereof, that are easily realized in hardware. The method and the device are mainly used for rational-factor sample-rate conversion of baseband signals and for conversion between baseband signals and intermediate-frequency signals in digital communication. Under the coordination of control signals and enable signals, the sample-rate conversion and the baseband-to-intermediate-frequency conversion are accomplished through the reasonable combination of variable integer-factor filtering and fractional interpolation. The system mainly comprises a frequency mixer, a cascaded integrator-comb filter, a fractional interpolator, a half-band filter, a signal-shaping filter, a power-detection module and a control interface. The configurable hardware structure of the invention is applicable to a plurality of modulation methods, has the advantages of low resource consumption and good portability, and can be used in various wireless communication systems such as multilevel phase shift keying (MPSK), orthogonal frequency division multiplexing (OFDM), direct sequence spread spectrum (DSSS) and continuous phase modulation (CPM).
Owner:ZHEJIANG UNIV
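One block from the chain above can be made concrete: the cascaded integrator-comb (CIC) filter, the standard hardware-cheap structure for integer-factor sample-rate conversion (it needs only adders and delays, hence the low resource consumption the abstract claims). The sketch below is a plain reference model, not the patented circuit; the stage count and decimation factor are arbitrary.

```python
# Reference model of a CIC decimator: N integrator stages at the high
# rate, decimate by R, then N comb stages (differential delay 1) at the
# low rate.
def cic_decimate(x, R=4, N=3):
    for _ in range(N):               # integrator stages
        acc, out = 0.0, []
        for v in x:
            acc += v
            out.append(acc)
        x = out
    x = x[R - 1::R]                  # decimate by R
    for _ in range(N):               # comb stages: y[k] = x[k] - x[k-1]
        x = [b - a for a, b in zip([0.0] + x[:-1], x)]
    return x

y = cic_decimate([1.0] * 16, R=4, N=3)
print(y[-1])  # prints 64.0 — the DC gain R**N of the filter
```

In hardware the integrators run at the input clock and the combs at the divided clock, so the only multipliers in the whole conversion chain sit in the mixer and the fractional interpolator.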

Intelligent resource optimization method of container cloud platform based on load prediction

The invention discloses an intelligent resource optimization method for a container cloud platform based on load prediction, and belongs to the field of container cloud platforms. The method comprises the following steps: based on a grey model, predicting the load of each container instance in the next time window from the instance's historical load; judging whether the load of a node is too high or too low according to the load prediction values of all containers on each physical node; and then executing the corresponding scheduling algorithm, migrating some containers on an overloaded node to other nodes so that the node's load returns to the normal range, and migrating all container instances on an underloaded node to other nodes so that the node is emptied. Aiming at the problems of unbalanced resource utilization and delayed resource scheduling in existing data centers, the invention introduces load-prediction analysis to schedule and optimize the data center's load in advance, avoiding both the performance loss caused by node overload and the low resource utilization caused by underload, thereby improving the resource utilization efficiency of the platform.
Owner:杭州谐云科技有限公司