121 results for patented technology related to "improve resource usage efficiency"

Neural network accelerator for bit width partitioning and implementation method of neural network accelerator

The present invention provides a neural network accelerator based on bit-width partitioning and a method of implementing it. The accelerator comprises a plurality of computing and processing units with different bit widths, input buffers, weight buffers, output buffers, data shifters, and an off-chip memory. Each computing and processing unit fetches data from its input buffer and weight buffer and processes, in parallel, the data of the neural network layer whose bit width matches its own. The data shifters convert the bit width of data output by the current computing and processing unit to match that of the next unit, and the off-chip memory stores both data not yet processed and data already processed by the units. Because multiply-accumulate operations can be performed on several short-bit-width operands at once, DSP utilization is increased; and because computing and processing units (CPs) of different bit widths compute the layers of the neural network in parallel, the accelerator's computing throughput is improved.
Owner:TSINGHUA UNIV
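The packing trick behind the DSP-utilization claim can be sketched in a few lines: two short-bit-width operands share one wide multiplication, and the two partial products are separated afterwards. This is a generic illustration of the idea, not the patent's circuit; the function name and all widths are assumptions.

```python
def packed_dual_multiply(a: int, b: int, w: int, width: int = 4):
    """Compute (a*w, b*w) with a single wide multiplication.

    a, b and w are unsigned `width`-bit values. Packing a and b with a
    2*width-bit gap keeps the two partial products from overlapping, so
    one DSP-style multiplier yields both results at once.
    """
    shift = 2 * width                  # gap wide enough for a full product
    packed = (a << shift) | b         # one wide operand holding both inputs
    product = packed * w              # a single multiplication
    return product >> shift, product & ((1 << shift) - 1)
```

For 4-bit operands, `packed_dual_multiply(7, 5, 9)` performs one multiply on the packed operand and returns the two products 63 and 45.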

Neural Network Accelerator for Bit Width Partitioning and Its Implementation Method

The present invention provides a neural network accelerator based on bit-width partitioning and a method of implementing it. The accelerator comprises a plurality of computing and processing units with different bit widths, input buffers, weight buffers, output buffers, data shifters, and an off-chip memory. Each computing and processing unit fetches data from its input buffer and weight buffer and processes, in parallel, the data of the neural network layer whose bit width matches its own. The data shifters convert the bit width of data output by the current computing and processing unit to match that of the next unit, and the off-chip memory stores both data not yet processed and data already processed by the units. Because multiply-accumulate operations can be performed on several short-bit-width operands at once, DSP utilization is increased; and because computing and processing units (CPs) of different bit widths compute the layers of the neural network in parallel, the accelerator's computing throughput is improved.
Owner:北京芯力技术创新中心有限公司

Boiler system with U-shaped flue and boiler water charging system

The invention discloses a boiler system with a U-shaped flue and a boiler water charging system. The boiler system comprises a boiler body, a boiler flue, and a U-shaped flue device connected to the boiler body through the boiler flue. The U-shaped flue device comprises a front-end vertical flue, a heat exchanger, and a rear-end J-shaped flue. The front-end vertical flue is sealed to the boiler flue, and the flue gas discharged by the boiler flows downward through it; the heat exchanger is sealed to the front-end vertical flue, and the flue gas discharged by the front-end vertical flue flows downward over the heat exchanger's water pipes; the rear-end J-shaped flue is sealed to the heat exchanger, and the flue gas discharged by the heat exchanger flows upward through it before being discharged. Together, the front-end vertical flue, the heat exchanger, and the rear-end J-shaped flue form a U shape. The system solves the problems of incomplete heat exchange caused by excessive flue-gas velocity, flue corrosion, and condensate recovery, and allows the residual heat of the flue gas to be recovered deeply.
Owner:京能科技(易县)有限公司
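As a rough sense of scale for the "residual heat" being recovered, the sensible heat released by cooling the flue gas follows the standard Q = m · cp · ΔT relation; cooling below the dew point adds latent heat on top of this. All numbers below are illustrative assumptions, not values from the patent.

```python
def recovered_heat_kw(mass_flow_kg_s: float, cp_kj_per_kg_k: float,
                      t_in_c: float, t_out_c: float) -> float:
    """Sensible heat recovered from flue gas: Q = m * cp * (T_in - T_out), in kW."""
    return mass_flow_kg_s * cp_kj_per_kg_k * (t_in_c - t_out_c)

# e.g. 10 kg/s of flue gas cooled from 150 C to 60 C, with cp ~ 1.05 kJ/(kg K)
q = recovered_heat_kw(10, 1.05, 150, 60)
```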

Cloud resource scheduling method and device

The invention discloses a cloud resource scheduling method. The method comprises: generating a resource scheduling model based on an elastic scaling strategy and a dynamic migration strategy, combined with the scale of the cloud platform; and performing transverse and/or longitudinal multi-dimensional scheduling, through the two strategies in the model, on virtual resources whose resource utilization exceeds a preset value. By using this model to dynamically schedule the platform's virtual resources, the migration and poor dynamic-loading effects that an instantaneous peak in resource utilization could otherwise cause are avoided, the resource usage efficiency of the cloud platform is effectively improved, the platform becomes more stable, safe, and reliable, and the user experience is improved. The invention further discloses a cloud resource scheduling device with the same advantages.
Owner:ZHENGZHOU YUNHAI INFORMATION TECH CO LTD
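The two strategies the abstract combines can be sketched as a small decision rule: smooth the utilization signal so an instantaneous peak does not trigger anything, then scale longitudinally (in place) when the current host has headroom and migrate transversely otherwise. The thresholds, the EMA smoothing, and the action names are assumptions for illustration; the patented model is not specified here.

```python
def smoothed(samples, alpha=0.5):
    """Exponential moving average, so a single instantaneous utilization
    peak (the problem the abstract notes) does not trigger scheduling."""
    value = samples[0]
    for x in samples[1:]:
        value = alpha * x + (1 - alpha) * value
    return value

def schedule(load_samples, host_free_fraction, limit=0.8):
    """Longitudinal scale-up when the current host has headroom,
    transverse migration otherwise; no action below the preset limit."""
    if smoothed(load_samples) <= limit:
        return "none"
    return "scale-up" if host_free_fraction >= 0.2 else "migrate"
```

A lone spike such as `[0.3, 0.3, 0.95]` smooths to well under the limit, so it is ignored rather than causing a migration.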

Apparatus and method for migrating virtual machine, where client is located, among different hosts

The invention discloses an apparatus and method for migrating a virtual machine in which a client is located among different hosts, aiming to ensure business continuity and shorten migration delay. The apparatus comprises a resource scheduler module and a resource configuration module. The resource scheduler module obtains a mirror image of a first virtual machine, which is located in a first host and runs a server, while the client runs in a second virtual machine of the same host; the image is installed and run on a second host, to which the client is migrated, generating a third virtual machine in which the server runs. The resource configuration module adds a forwarding flow entry in the second host. By installing and running the image, the information the client needs to access data in a network storage medium is migrated to the second host, and the old information remains usable after migration, ensuring business continuity; by adding the forwarding flow entry, the client's data-access requests are forwarded to the new virtual machine, so the client need not re-establish a connection with a new server, shortening the migration duration.
Owner:HUAWEI TECH CO LTD
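The "forwarding flow entry" idea can be illustrated with a dict-based flow table that rewrites the destination of the client's data-access requests, so traffic still addressed to the old server VM reaches the recreated one without a new connection. The addresses and the table representation are made-up stand-ins for real switch flow entries.

```python
flow_table = {}

def add_forwarding_entry(old_dst: str, new_dst: str) -> None:
    """The forwarding flow entry installed on the host the client moved to."""
    flow_table[old_dst] = new_dst

def forward(packet: dict) -> dict:
    """Rewrite the destination of a client data-access request so it
    reaches the recreated server VM; unmatched traffic is untouched."""
    packet = dict(packet)
    packet["dst"] = flow_table.get(packet["dst"], packet["dst"])
    return packet

add_forwarding_entry("10.0.0.5", "10.0.1.7")   # old server VM -> new VM
```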

Application process management method and application process management device

The invention provides an application process management method. The method comprises: receiving an application process switching instruction; obtaining, through a switching-process function in the startup management service, the foreground switching information of the process being switched to the foreground and the background switching information of the process being switched to the background, both corresponding to the instruction; performing the foreground switching operation on the former according to the foreground switching information; and performing the background switching operation on the latter according to the background switching information. The invention also provides an application process management device. Because processes are switched between foreground and background according to this switching information, rather than being closed and reopened repeatedly, the method and device improve both the user's application usage efficiency and the resource usage efficiency of the system.
Owner:GUANGDONG OPPO MOBILE TELECOMM CORP LTD
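The resource saving the abstract describes comes from demoting the current foreground process to the background and promoting the target process, instead of killing and relaunching either. A minimal sketch, with assumed class and method names:

```python
class ProcessManager:
    """Keeps one foreground process and a set of suspended background ones."""

    def __init__(self):
        self.foreground = None
        self.background = set()

    def switch(self, app: str) -> None:
        """Handle a switching instruction: move the current foreground
        process to the background and bring `app` forward, resuming it
        if it was already backgrounded rather than restarting it."""
        if self.foreground is not None:
            self.background.add(self.foreground)
        self.background.discard(app)
        self.foreground = app
```

Switching back to an app that was backgrounded simply promotes it again; nothing is ever closed.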

Communication network early-warning analysis system based on big data analysis

The invention discloses a communication network early-warning analysis system based on big data analysis. Using service-oriented architectures (SOAs), the backbone transmission networks are divided into a network control and data collection layer, a platform layer, and a management application layer, so that backbone transmission networks at different levels are managed comprehensively. Through system interconnection between the SOAs, horizontal information sharing and application coordination across the backbone transmission networks are realized between peer levels and between the upper and lower levels of the communication management system. By synthesizing the communication management system's resume information, ledger information, and service utilization information, and following the level distribution of communication network resource occupation, resource utilization is comprehensively analyzed across optical fiber resources, multiplexing section resources, channel resources, and other aspects. The resource utilization data of communication devices and communication optical fibers is stored in real time; by analyzing resource occupation at different levels and applying big data technology to the massive historical operation and maintenance data, potential hidden dangers in the communication network can be predicted in advance.
Owner:中国电力技术装备有限公司郑州电力设计院
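One simple stand-in for the predictive step the abstract describes is trend-based early warning over historical utilization data: extrapolate recent resource occupation and flag a channel whose projection would exceed capacity. The linear trend, threshold, and horizon are assumptions for illustration, not the patented analysis.

```python
def early_warning(history, capacity: float = 1.0, horizon: int = 3) -> bool:
    """Linearly extrapolate a utilization series; warn if the projected
    value would reach capacity within `horizon` future periods."""
    slope = (history[-1] - history[0]) / (len(history) - 1)
    projected = history[-1] + slope * horizon
    return projected >= capacity
```

A steadily climbing series such as `[0.5, 0.6, 0.7, 0.8]` triggers a warning before capacity is actually hit, while a flat series does not.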

Simple and precise mixing device of injection molding machine

The invention discloses a simple and precise mixing device for an injection molding machine. The device comprises a mixing bin, above which a fresh material hopper and an old material hopper are arranged. The fresh material hopper is connected with a fresh material channel in which an impeller A is rotatably arranged; one end face of impeller A is connected with a locating rod, which is sleeved by a locating sleeve, the two being joined by a dowel pin so that they rotate together; a driving gear is arranged at the end of the locating sleeve. The old material hopper is connected with an old material channel in which an impeller B is rotatably arranged; a gear shaft at the end of impeller B meshes with the driving gear for transmission, so that impeller B rotates together with impeller A. Compared with the prior art, the device is simple in structure and low in manufacturing cost; a fixed quantity of reclaimed waste material can be added to the fresh material, resource usage efficiency is improved to the greatest extent, and the proportion of fresh material to waste material is adjustable.
Owner:宁波海洲机械有限公司
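Because the two metering impellers are geared together, the fresh-to-waste proportion follows from the gear ratio and each impeller's delivery per revolution. The arithmetic below is a hypothetical illustration of that relationship; the tooth counts and volumes are not from the patent.

```python
def mix_ratio(teeth_drive: int, teeth_driven: int,
              vol_fresh: float, vol_waste: float) -> float:
    """Fresh:waste volume delivered per revolution of the driving impeller.

    The driven (waste) impeller turns teeth_drive/teeth_driven times for
    each revolution of the driving (fresh) impeller.
    """
    waste_revs = teeth_drive / teeth_driven
    return vol_fresh / (waste_revs * vol_waste)

# e.g. a 20-tooth drive gear meshing with a 40-tooth gear halves the
# waste impeller's speed, doubling the fresh share for equal impellers
```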