55 results about "Scalable computing" patented technology

The Scalable Computing research program builds knowledge in high performance computing (HPC) and couples this with scalable cloud computing to support scalable big-data application processing.

System and method for a hierarchical system management architecture of a highly scalable computing system

A modular computer system includes at least two processing functional modules, each including a processing unit adapted to process data and to input/output data to other functional modules through at least two ports, with each port including a plurality of data lines. At least one routing functional module is adapted to route data and to input/output data to other functional modules through at least two ports, with each port including a plurality of data lines. At least one input or output functional module is adapted to input or output data and to input/output data to other functional modules through at least one port including a plurality of data lines. Each processing, routing, and input or output functional module includes a local controller adapted to control the local operation of the associated functional module, wherein the local controller is adapted to input and output control information over control lines connected to the respective ports of its functional module. At least one system controller functional module is adapted to communicate with one or more local controllers and to provide control at a level above the local controllers. Each of the functional modules is adapted to be cabled to the others with a single cable that includes a plurality of data lines and control lines, such that the control lines in each module are connected together and the data lines in each module are connected together. Each of the local controllers is adapted to detect the other local controllers to which it is connected and thereby to collectively determine the overall configuration of the system.
Owner:HEWLETT-PACKARD ENTERPRISE DEV LP +1

System and method for a hierarchical system management architecture of a highly scalable computing system

A modular computer system includes at least two processing functional modules, each including a processing unit adapted to process data and to input/output data to other functional modules through at least two ports, with each port including a plurality of data lines. At least one routing functional module is adapted to route data and to input/output data to other functional modules through at least two ports, with each port including a plurality of data lines. At least one input or output functional module is adapted to input or output data and to input/output data to other functional modules through at least one port including a plurality of data lines. Each processing, routing, and input or output functional module includes a local controller adapted to control the local operation of the associated functional module, wherein the local controller is adapted to input and output control information over control lines connected to the respective ports of its functional module. At least one system controller functional module is adapted to communicate with one or more local controllers and to provide control at a level above the local controllers. Each of the functional modules is adapted to be cabled to the others with a single cable that includes a plurality of data lines and control lines, such that the control lines in each module are connected together and the data lines in each module are connected together. Each of the local controllers is adapted to detect the other local controllers to which it is connected and thereby to collectively determine the overall configuration of the system.
Owner:MORGAN STANLEY +1
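
The two results above describe the same hierarchical management architecture: a local controller on each module discovers its cabled neighbors over the control lines, and a system controller one level up assembles the overall configuration. The short sketch below is only an illustration of that idea; the class and method names (LocalController, SystemController, discover_neighbors, overall_configuration) are assumptions introduced here, not terms from the patent.

```python
# Illustrative sketch only: names and structure are assumptions, not the patented design.
from dataclasses import dataclass, field

@dataclass
class LocalController:
    """Per-module controller that talks over the control lines of its ports."""
    module_id: str
    module_kind: str                           # "processing", "routing", or "io"
    links: list = field(default_factory=list)  # controllers reachable over cabled ports

    def cable_to(self, other: "LocalController") -> None:
        # A single cable carries both data and control lines, so cabling two
        # modules also connects their local controllers.
        self.links.append(other)
        other.links.append(self)

    def discover_neighbors(self) -> dict:
        # Each local controller reports what it can see over its control lines.
        return {self.module_id: sorted(n.module_id for n in self.links)}

class SystemController:
    """Controller one level above the local controllers."""
    def __init__(self, local_controllers: list):
        self.local_controllers = local_controllers

    def overall_configuration(self) -> dict:
        # Merge the per-module neighbor reports into a system-wide topology map.
        topology = {}
        for lc in self.local_controllers:
            topology.update(lc.discover_neighbors())
        return topology

if __name__ == "__main__":
    cpu0 = LocalController("cpu0", "processing")
    cpu1 = LocalController("cpu1", "processing")
    router = LocalController("rtr0", "routing")
    io = LocalController("io0", "io")
    cpu0.cable_to(router)
    cpu1.cable_to(router)
    router.cable_to(io)
    sysctl = SystemController([cpu0, cpu1, router, io])
    print(sysctl.overall_configuration())
    # {'cpu0': ['rtr0'], 'cpu1': ['rtr0'], 'rtr0': ['cpu0', 'cpu1', 'io0'], 'io0': ['rtr0']}
```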

Realtime processing of streaming data

This invention is intended to enhance the technology domain of real-time, high-performance distributed computing. It provides a connotative and intuitive grammar that lets users define how data is automatically encoded and decoded for transport between computing systems, eliminating the need to hand-craft custom solutions for every combination of platform and transport medium. The software framework can serve as a basis for real-time capture, distribution, and analysis of large volumes and varieties of data moving at rapid or real-time velocity. It can be used as-is or extended as a framework that filters and extracts data from a system for distribution to other systems (including other instances of the framework). Users control all capture, filtering, distribution, analysis, and visualization features through configuration files that are read at program startup, rather than through software programming. The framework enables large-scale computation over high-velocity data on distributed, heterogeneous platforms. Compared with conventional approaches to data capture, which extract data in proprietary formats and rely on post-run standalone analysis programs that do not operate in real time, this invention allows data to stream in real time to an open range of analysis and visualization tools. Data treatment options are specified in end-user configuration files rather than through hard-coded software revisions.
Owner:FISHEYE PROD
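
A minimal sketch of the configuration-driven encode/decode idea described in the abstract above. The three-field grammar, the CONFIG/encode/decode names, and the use of Python's struct module are invented here for illustration and are not the format defined by the patented framework.

```python
# Illustrative sketch only: the config grammar and names here are assumptions,
# not the grammar defined by the patented framework.
import struct

# A user-editable "grammar": field name -> struct format code,
# read at startup instead of being hard-coded in the software.
CONFIG = [
    ("timestamp", "d"),   # float64 seconds
    ("sensor_id", "H"),   # uint16
    ("value",     "f"),   # float32
]

FMT = ">" + "".join(code for _, code in CONFIG)   # big-endian wire format

def encode(record: dict) -> bytes:
    """Pack a record into wire bytes according to the configured field order."""
    return struct.pack(FMT, *(record[name] for name, _ in CONFIG))

def decode(payload: bytes) -> dict:
    """Unpack wire bytes back into a record using the same configuration."""
    values = struct.unpack(FMT, payload)
    return {name: value for (name, _), value in zip(CONFIG, values)}

if __name__ == "__main__":
    msg = {"timestamp": 1700000000.25, "sensor_id": 42, "value": 3.5}
    wire = encode(msg)
    print(len(wire), decode(wire))
    # 14 {'timestamp': 1700000000.25, 'sensor_id': 42, 'value': 3.5}
```

Changing how records are laid out on the wire then only requires editing CONFIG, which mirrors the abstract's point about configuration files replacing hand-crafted, per-platform transport code.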

Simulation application-oriented universal extensible computing system

The invention discloses a simulation application-oriented universal extensible computing system. The system comprises a simulation model database 1, a simulation application management node 2, and a simulation computing node 3. The simulation model database 1 is the storage center for simulation models; a user can submit a newly developed model to the simulation model database 1 through the simulation application management node 2, or search the simulation model database 1 for an existing model through the same node and download it for use. The system has the following advantages: the simulation application integration platform is universal in function and extensible in scale; the universality of the platform and the plug-and-play nature of the simulation models allow the simulation computing load and resource usage to be adjusted dynamically; the efficiency of integrating complex simulation systems oriented to different applications is improved, as is the efficiency of developing and integrating simulation application systems; and the availability, reliability, and resource utilization of the simulation application system are improved.
Owner:NO 709 RES INST OF CHINA SHIPBUILDING IND CORP
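
The database / management node / computing node split described above can be pictured with a short sketch. All class and method names below are hypothetical and are introduced only to illustrate the submit, search, download, and run workflow, not the patented implementation.

```python
# Illustrative sketch only: names are assumptions used to picture the
# database / management node / computing node split in the abstract.

class SimulationModelDatabase:
    """Central store for simulation models (component 1 in the abstract)."""
    def __init__(self):
        self._models = {}

    def store(self, name: str, model: str) -> None:
        self._models[name] = model

    def search(self, keyword: str) -> list:
        return [n for n in self._models if keyword.lower() in n.lower()]

    def fetch(self, name: str) -> str:
        return self._models[name]

class ManagementNode:
    """User-facing node (component 2): submit, search, and download models."""
    def __init__(self, db: SimulationModelDatabase):
        self.db = db

    def submit(self, name: str, model: str) -> None:
        self.db.store(name, model)

    def download(self, keyword: str) -> str:
        matches = self.db.search(keyword)
        if not matches:
            raise KeyError(f"no model matching {keyword!r}")
        return self.db.fetch(matches[0])

class ComputingNode:
    """Plug-and-play execution node (component 3) that runs a downloaded model."""
    def run(self, model: str) -> str:
        return f"running {model}"

if __name__ == "__main__":
    db = SimulationModelDatabase()
    mgmt = ManagementNode(db)
    mgmt.submit("radar-propagation-v2", "<model payload>")
    node = ComputingNode()
    print(node.run(mgmt.download("radar")))   # running <model payload>
```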

Server and working method thereof

Active · CN102521046A · Solve processing performance bottlenecks · Eliminate overhead · Energy efficient ICT · Resource allocation · Scalable computing · Current load
The invention provides a working method for a server, comprising the following steps: turning on a timer with its initial elapsed time t set to 0; judging whether the depth of the current load queue lies between a low-load threshold and a high-load threshold; if it does not, judging whether the depth of the current load queue is less than the low-load threshold; if it is, judging whether the elapsed time t of the timer exceeds a low-load time threshold; and if it does, switching from the current computing module to another computing module at a lower energy level, which then begins receiving new load. The method has the advantages that the performance of the computing modules can be extended, storage space can be extended flexibly, the energy consumption of the computing modules can be regulated according to performance requirements, and the idle-load energy consumption of the system is minimized, enabling highly efficient service.
Owner:HUAZHONG UNIV OF SCI & TECH
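
The switching logic described in the abstract above (load-queue depth compared against low and high thresholds, guarded by a timer) can be sketched as follows. The threshold values, the names, and the handling of the normal-load case are assumptions made for illustration; the abstract only spells out the low-load branch.

```python
# Illustrative sketch only: threshold values, names, and the module model are
# assumptions; this just walks through the low-load switching logic from the abstract.
import time

LOW_LOAD_THRESHOLD = 10        # queue depth below which the server is under-loaded
HIGH_LOAD_THRESHOLD = 100      # queue depth above which the server is over-loaded
LOW_LOAD_TIME_THRESHOLD = 5.0  # seconds the low-load condition must persist

class Server:
    def __init__(self, energy_level: int = 2):
        self.energy_level = energy_level       # index of the active computing module
        self.timer_start = time.monotonic()    # "turn on a timer, t = 0"

    def tick(self, queue_depth: int) -> None:
        t = time.monotonic() - self.timer_start        # elapsed time t of the timer
        if LOW_LOAD_THRESHOLD <= queue_depth <= HIGH_LOAD_THRESHOLD:
            self.timer_start = time.monotonic()        # load is normal: restart the timer (assumption)
            return
        if queue_depth < LOW_LOAD_THRESHOLD and t > LOW_LOAD_TIME_THRESHOLD:
            # Sustained low load: switch to a lower-energy-level computing
            # module, which then starts accepting new load.
            if self.energy_level > 0:
                self.energy_level -= 1
            self.timer_start = time.monotonic()
        # (A symmetric branch for sustained high load would switch upward.)

if __name__ == "__main__":
    server = Server()
    server.tick(queue_depth=3)    # low load, but not yet sustained
    print(server.energy_level)    # 2
```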