125 results for patented technology related to "Reduce system performance"

Method, System and Server of Removing a Distributed Caching Object

The present disclosure describes a method, a system and a server for removing a distributed caching object. In one embodiment, the method receives a removal request, where the removal request includes an identifier of an object. The method may further apply consistent hashing to the identifier of the object to obtain a hash result value of the identifier, locate a corresponding cache server based on the hash result value, and designate the corresponding cache server as the present cache server. In some embodiments, the method determines whether the present cache server is in an active status and has an active period greater than an expiration period associated with the object. In response to determining that the present cache server is in an active status and has an active period greater than the expiration period associated with the object, the method removes the object from the present cache server. By comparing the active period of the located cache server with the expiration period associated with the object, the exemplary embodiments precisely locate the cache server that holds the object to be removed and perform the removal operation only there, saving the other cache servers from wasting resources on removal operations and hence improving the overall performance of the distributed cache system.
Owner:ALIBABA GRP HLDG LTD
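For illustration, here is a minimal Python sketch of the removal flow the abstract describes: consistent hashing locates the cache server responsible for an object identifier, and the object is removed only when that server is active and its active period exceeds the object's expiration period. The names CacheServer, ConsistentHashRing and remove_cached_object are hypothetical and not taken from the patent.

    import bisect
    import hashlib
    import time

    class CacheServer:
        def __init__(self, name, started_at, active=True):
            self.name = name
            self.started_at = started_at      # time the server became active
            self.active = active
            self.objects = {}

        def active_period(self):
            return time.time() - self.started_at

    class ConsistentHashRing:
        def __init__(self, servers):
            # place every server on the hash ring, sorted by its hash point
            self.ring = sorted(((self._hash(s.name), s) for s in servers),
                               key=lambda entry: entry[0])
            self.points = [point for point, _ in self.ring]

        @staticmethod
        def _hash(key):
            return int(hashlib.md5(key.encode()).hexdigest(), 16)

        def locate(self, object_id):
            # the owning server is the first one clockwise from the object's hash
            idx = bisect.bisect(self.points, self._hash(object_id)) % len(self.ring)
            return self.ring[idx][1]

    def remove_cached_object(ring, object_id, expiration_period):
        server = ring.locate(object_id)
        # remove only when the located server can actually hold the object:
        # it is active and has been active longer than the object's expiration period
        if server.active and server.active_period() > expiration_period:
            server.objects.pop(object_id, None)
            return True
        return False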

Open loop MIMO method, base station and user equipment based on direction of arrival

The invention discloses an open-loop MIMO method based on direction of arrival, together with a base station and user equipment, for use in the field of wireless transmission. The method comprises the following steps: user equipment is classified as low-speed, medium-speed or high-speed according to its moving speed; low-speed user equipment is set to use a closed-loop MIMO mode, while medium-speed and high-speed user equipment are set to use an open-loop MIMO mode; the transmitting end then measures the direction of arrival of the feedback link and estimates a precoding matrix index for the transmit link from the measured direction of arrival; the rank of the MIMO system is decided; the MIMO transmission mode is decided from the rank and the precoding matrix index; the MIMO mode and the rank are then sent to the receiving end; the receiving end detects the MIMO mode and the rank; and finally, the receiving end provides feedback according to the detected information. The invention provides an open-loop MIMO system and device based on direction of arrival for medium-speed and high-speed user equipment, and is characterized by a simple design and good system performance.
Owner:SHARP KK
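As an illustration of one step in this scheme, the sketch below estimates a precoding matrix index from a measured direction of arrival by choosing the codebook precoder best aligned with the array steering vector for that angle, alongside a simple speed-based mode split. The codebook, antenna spacing and speed thresholds are assumptions made for the example, not values from the patent.

    import numpy as np

    def steering_vector(doa_deg, n_tx=4, spacing=0.5):
        """Steering vector of a uniform linear array (spacing in wavelengths)."""
        n = np.arange(n_tx)
        return np.exp(-2j * np.pi * spacing * n * np.sin(np.deg2rad(doa_deg)))

    def estimate_pmi(doa_deg, codebook):
        """Index of the codebook precoder most correlated with the DOA."""
        a = steering_vector(doa_deg, n_tx=codebook.shape[1])
        scores = [abs(np.vdot(w, a)) for w in codebook]
        return int(np.argmax(scores))

    def select_mimo_mode(speed_kmh, low=15.0):
        """Speed-based split: closed loop for low speed, open loop otherwise."""
        return "closed-loop" if speed_kmh < low else "open-loop"

    # example: a toy codebook built from steering vectors at four angles
    codebook = np.array([steering_vector(theta) / 2 for theta in (-45, -15, 15, 45)])
    pmi = estimate_pmi(doa_deg=20.0, codebook=codebook)
    mode = select_mimo_mode(speed_kmh=90.0)   # medium/high speed -> open loop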

System for maintaining a buffer pool

In a multi-threaded computing environment, a shared cache system reduces the amount of redundant information stored in memory. A cache memory area provides both globally readable data and private writable data to processing threads. A particular processing thread accesses data by first checking its private views of modified data and then its global views of read-only data. Uncached data is read into a cache buffer for global access. If write access is required by the processing thread, the data is copied into a new cache buffer, which is assigned to the processing thread's private view. The particular shared cache system supports generational views of data. The system is particularly useful in on-line analytical processing of multi-dimensional databases. In one embodiment, a dedicated collector thread reclaims cache memory blocks for the processing threads, so any processing penalty encountered during reclamation is absorbed by the dedicated collector rather than by the user session threads, which continue to operate normally; the reclamation of cache memory blocks is thus transparent to them. In an alternative embodiment, the process of reclaiming page buffers is distributed among the user processes sharing the memory: each user process includes a user thread collector for reclaiming a page buffer as needed, and multiple user processes can reclaim page buffers concurrently.
Owner:ORACLE INT CORP
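A minimal sketch, assuming a page-keyed buffer pool, of the lookup and copy-on-write order described above: a thread consults its private view of modified pages first, then the shared read-only view, and only then loads the page from storage; a write copies the page into a new buffer held in the thread's private view. The class and method names are illustrative, not the patent's.

    import threading

    class SharedBufferPool:
        def __init__(self, storage):
            self.storage = storage        # backing store: page_id -> bytes
            self.global_view = {}         # read-only pages shared by all threads
            self.private_views = {}       # thread id -> {page_id: private copy}
            self.lock = threading.Lock()

        def _private(self):
            return self.private_views.setdefault(threading.get_ident(), {})

        def read(self, page_id):
            private = self._private()
            if page_id in private:                     # 1. private (modified) view
                return private[page_id]
            with self.lock:
                if page_id not in self.global_view:    # 3. load uncached page
                    self.global_view[page_id] = self.storage[page_id]
                return self.global_view[page_id]       # 2. global read-only view

        def write(self, page_id, mutate):
            private = self._private()
            if page_id not in private:
                # copy-on-write: give this thread its own buffer for the page
                private[page_id] = bytearray(self.read(page_id))
            mutate(private[page_id])
            return private[page_id]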

Reduced complexity detector for multiple-antenna systems

A reduced-complexity maximum-likelihood detector that provides a high degree of signal detection accuracy while maintaining high processing speeds. A communication system implementing the present invention comprises a plurality of transmit sources operable to transmit a plurality of symbols over a plurality of channels, wherein the detector is operable to receive symbols corresponding to the transmitted symbols. The detector processes the received symbols to obtain initial estimates of the transmitted symbols and then uses the initial estimates to generate a plurality of reduced search sets. The reduced search sets are then used to generate decisions for detecting the transmitted symbols. In various embodiments of the invention, the decisions for detecting the symbols can be hard decisions or soft decisions. Furthermore, in various embodiments of the invention, the initial estimates can be obtained using a plurality of linear equalization techniques, including zero-forcing equalization and minimum-mean-squared-error equalization. The initial estimates can also be obtained using nulling and canceling techniques. In various embodiments of the invention, the data output corresponding to the transmitted symbols can be obtained using a log-likelihood probability ratio. The method and apparatus of the present invention can be applied to any communication system with multiple transmit streams.
Owner:AVAGO TECH WIRELESS IP SINGAPORE PTE
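A hedged sketch of the detection pipeline the abstract outlines, for a small MIMO system: a zero-forcing initial estimate, a reduced search set built from the constellation points nearest each estimated symbol, and a hard-decision maximum-likelihood search restricted to that set. The constellation, search-set size and noise level are assumptions made for the example.

    import itertools
    import numpy as np

    def detect_reduced_ml(H, y, constellation, set_size=2):
        """Reduced-complexity ML detection: H is (n_rx, n_tx), y is (n_rx,)."""
        # 1. initial estimate via zero-forcing equalization (channel pseudo-inverse)
        x_zf = np.linalg.pinv(H) @ y

        # 2. reduced search set: the few constellation points closest to each
        #    zero-forcing estimate, one short candidate list per transmit stream
        search_sets = []
        for est in x_zf:
            order = np.argsort(np.abs(constellation - est))
            search_sets.append(constellation[order[:set_size]])

        # 3. hard-decision ML search over the reduced candidate space only
        best, best_metric = None, np.inf
        for candidate in itertools.product(*search_sets):
            x = np.array(candidate)
            metric = np.linalg.norm(y - H @ x) ** 2
            if metric < best_metric:
                best, best_metric = x, metric
        return best

    # example with QPSK symbols over a 2x2 channel
    qpsk = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)
    H = (np.random.randn(2, 2) + 1j * np.random.randn(2, 2)) / np.sqrt(2)
    x_true = qpsk[np.random.randint(0, 4, size=2)]
    y = H @ x_true + 0.05 * (np.random.randn(2) + 1j * np.random.randn(2))
    x_hat = detect_reduced_ml(H, y, qpsk)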