
42 results about "Reduce cache pressure" patented technology

Internet of Vehicles content cache decision optimization method

The invention relates to an Internet of Vehicles (IoV) content cache decision optimization method, belonging to the technical field of mobile communication. The IoV is provided with a plurality of content cache nodes in which content requested by a vehicle can be stored; if a nearby vehicle or a roadside unit (RSU) has already cached the content requested by the current vehicle, the current vehicle obtains the content from that cache node over a V2V or V2I link. A mobile edge computing server deployed on the RSU side provides storage and computing capabilities for content storage and processing. Because vehicles move at high speed, a content-requesting vehicle may not be able to obtain all of the needed content within the current RSU's coverage range, and must continue to obtain the remaining content within the next RSU's coverage range. The method aims to reduce the total delay for the requesting vehicle to acquire the required content: it solves the vehicle association problem and takes content pre-caching into account, so that the optimal content caching decision is obtained.
Owner:CHONGQING UNIV OF POSTS & TELECOMM
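A minimal Python sketch of the decision described in this abstract: pick the lowest-delay cache source for the request, and pre-cache at the next RSU whatever cannot be downloaded within the current coverage range. All names, delay figures and chunk counts are illustrative assumptions, not the patented algorithm.

```python
def choose_source(sources):
    """Return the cache node with the lowest per-chunk delay."""
    return min(sources, key=lambda s: s["delay"])

def plan_download(total_chunks, chunks_per_rsu, sources):
    """Split the request between the current and next RSU coverage areas."""
    best = choose_source(sources)
    now = min(total_chunks, chunks_per_rsu)  # fetched under the current RSU
    precache = total_chunks - now            # pre-cached at the next RSU
    delay = total_chunks * best["delay"]     # total delay with this source
    return {"source": best["name"], "now": now,
            "precache": precache, "delay": delay}

plan = plan_download(total_chunks=10, chunks_per_rsu=6,
                     sources=[{"name": "V2V", "delay": 2},
                              {"name": "V2I", "delay": 3},
                              {"name": "core", "delay": 10}])
```

Here the nearby vehicle (V2V) wins the source selection, six chunks are fetched under the current RSU, and the remaining four are pre-cached at the next one.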

Flight location information playback system and playback method

The invention discloses a flight location information playback system and a playback method. The system comprises a database, a data processing module, a playback control terminal and a playback display terminal, wherein the data processing module is independently connected to each of the other three components. The system has the following advantages: (a) the database stores the flight location information with an index configured on the playback-condition data items, improving playback query speed and enabling real-time data playback; (b) only effective information is displayed during playback initialization: flight information for flights whose location was updated within the appointed time is retained and displayed, while flight information (including historical information) for flights without a location update within that time is deleted; and (c) adaptive time-sliced playback is carried out, in which playback cache data is read one time period at a time, greatly lowering the data caching pressure on the playback control terminal; the time period length is adaptively adjusted during fast forward or slow forward, guaranteeing playback fluency.
Owner:CIVIL AVIATION UNIV OF CHINA
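A rough Python sketch of the time-sliced reading in point (c). One plausible reading of "adaptively adjusted with playback speed" is modeled here: the data window per read scales with the speed so each read covers about the same wall-clock playback time. The base period and function names are assumptions for illustration only.

```python
BASE_PERIOD_S = 10.0  # recorded seconds per read at 1x playback speed

def period_length(speed):
    """Scale the per-read data window with playback speed, so each read
    covers roughly the same amount of wall-clock playback time."""
    return BASE_PERIOD_S * speed

def playback_windows(start, end, speed):
    """Split [start, end] of recorded time into per-read cache windows,
    instead of loading the whole range into the control terminal at once."""
    step = period_length(speed)
    windows, t = [], start
    while t < end:
        windows.append((t, min(t + step, end)))
        t += step
    return windows
```

At 1x speed a 25-second recording is read in three slices; at 2x fast forward the window doubles, so the same range needs only two reads.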

Storage system write cache data issuing method and related components

The invention discloses a method for issuing write cache data in a storage system. The method comprises: determining a concurrent processing threshold for scheduling transactions to the write cache; when write cache data issuing starts, judging whether the write cache's pending data volume has reached the concurrent processing threshold, based on the issued data volume and the write cache's completed data volume; if so, pausing data issuing to the write cache; and if not, continuing to issue data to the write cache. The method provides a controllable strategy for issuing data from transactions to the write cache: a concurrency threshold on transaction-to-write-cache IO is introduced, and the transaction IO issuing thread is scheduled only when the number of concurrent IOs is below the threshold. This reduces the data processing pressure on the write cache, avoids IO timeouts on the upper-layer host, and improves the user experience. The invention further provides a device and equipment for issuing write cache data in a storage system and a readable storage medium, which have the above beneficial effects.
Owner:BEIJING INSPUR DATA TECHNOLOGY CO LTD
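A minimal sketch of the gating rule this abstract describes, in Python: issued and completed volumes are tracked, and new data is dispatched to the write cache only while the in-flight volume is below the concurrency threshold. The class and method names, and the threshold value, are illustrative assumptions.

```python
class WriteCacheGate:
    """Gate transaction IO to the write cache behind a concurrency threshold."""

    def __init__(self, threshold):
        self.threshold = threshold
        self.issued = 0     # data units issued to the write cache
        self.completed = 0  # data units the write cache has finished

    def in_flight(self):
        """Pending volume = issued volume minus completed volume."""
        return self.issued - self.completed

    def try_issue(self, units=1):
        """Issue only while pending volume stays below the threshold;
        otherwise pause, avoiding upper-layer host IO timeouts."""
        if self.in_flight() + units > self.threshold:
            return False
        self.issued += units
        return True

    def complete(self, units=1):
        """A write cache completion frees up issuing capacity."""
        self.completed += units
```

With a threshold of 3, the fourth issue attempt is paused until a completion arrives.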

Message scheduling method and device and network chip

The invention provides a message scheduling method and device and a network chip, applied to the network chip. The network chip comprises a classification layer, which in turn comprises a plurality of classification nodes. The method comprises the following steps: after a classification node acquires a scheduling task for a network message, acquiring the node identifier of that classification node; determining the queue identifier corresponding to the node identifier according to a stored mapping between node identifiers and output-queue identifiers; judging whether the output queue corresponding to the determined queue identifier meets a back-pressure condition; when the back-pressure condition is met, giving up the scheduling task; and when it is not met, executing the scheduling task so that the network message is written into the determined output queue. With this method, scheduling of network messages is performed at the L3 classification layer, the congestion caused by excessive buffering in the output queues is effectively avoided, and the cache pressure on the corresponding output queues is reduced.
Owner:NEW H3C SEMICONDUCTOR TECHNOLOGY CO LTD
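A hypothetical Python sketch of the scheduling rule above: each classification node looks up its output queue through the stored node-to-queue mapping, checks that queue's back-pressure condition (modeled here simply as a depth limit), and gives up the task when the condition holds. The mapping, limit, and names are assumptions, not the chip's actual design.

```python
NODE_TO_QUEUE = {"node_a": "q0", "node_b": "q1"}  # stored node-to-queue mapping
QUEUE_LIMIT = 4                                   # back-pressure threshold (assumed)

def schedule(node_id, message, queues):
    """Write the message to the node's output queue unless that queue
    meets the back-pressure condition, in which case give up the task."""
    qid = NODE_TO_QUEUE[node_id]
    if len(queues[qid]) >= QUEUE_LIMIT:
        return False                # back pressure: abandon the scheduling task
    queues[qid].append(message)     # no back pressure: execute the task
    return True
```

Once `q0` holds four buffered messages, further tasks routed to it are dropped while `q1` remains unaffected.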

Fire-fighting monitoring method based on internet of things and fire-fighting monitoring system thereof

Inactive · CN107221118A · Reduce casualties · Timely notification information · Fire alarms · Monitoring system · The Internet
The invention relates to a fire-fighting monitoring method based on the Internet of Things and a fire-fighting monitoring system thereof. The method comprises the following steps: fire-fighting monitors are installed to monitor the fire safety situation in each user's home; a monitoring node processor is installed to receive and process the monitoring information; and a fire-fighting center system judges the fire situation according to the abnormal information and the fire scene, determines the optimal rescue plan and the coverage scope of the fire scene accordingly, and transmits the rescue plan and coverage scope information to the monitoring node processor, which broadcasts escape signals to the users within the coverage scope. The invention also relates to the corresponding system, which comprises the fire-fighting monitors, the monitoring node processor and the fire-fighting center system. With this method and system, the optimal escape route can be communicated to users in time so that casualties are reduced, fire-fighting personnel can be promptly notified to carry out a rescue, and fire safety problems can be addressed.
Owner:SHENZHEN SHENGLU IOT COMM TECH CO LTD

Delay loading detection method and device, electronic equipment, storage medium and product

The invention provides a detection method and device for delayed (lazy) loading, electronic equipment, a storage medium and a computer program product. The method comprises the following steps: when a call operation triggering delayed loading is detected, obtaining the class name of the data to be lazily loaded; acquiring corresponding first parameter data from a mapping relation stored in a data structure according to the class name; and, if the first parameter data indicates that the data has already been lazily loaded and new data has been added since, triggering the delayed-loading call operation again and obtaining corresponding second parameter data from the mapping relation according to the class name, the second parameter data being the new data added after the initial delayed load. With this method and device, whenever a call operation triggering delayed loading is detected, the class name of the data determines whether the call should be re-triggered to obtain the new data; this ensures data integrity while also reducing local cache pressure and improving system performance.
Owner:BEIJING DAJIA INTERNET INFORMATION TECH CO LTD
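A rough Python sketch, under assumed names, of the re-trigger idea above: a per-class-name mapping records what has already been lazily loaded, and a repeated lazy-load call for the same class fetches only the records added since the first load, preserving completeness while keeping the local cache small.

```python
loaded = {}  # class name -> records already lazily loaded (the stored mapping)

def lazy_load(cls_name, backing_store):
    """Return only the records for cls_name that have not been loaded yet;
    remember them so a re-triggered call fetches just the new delta."""
    seen = loaded.setdefault(cls_name, [])
    fresh = [r for r in backing_store.get(cls_name, []) if r not in seen]
    seen.extend(fresh)
    return fresh
```

The first call for a class loads everything; after new records appear, the re-triggered call returns only the additions.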

A method for evenly distributing code streams of each program when constructing multi-program ts streams

The invention relates to a method for evenly distributing the code streams of each program when constructing a multi-program TS stream. Each time TS packets are transmitted, N memory blocks are allocated for the N audio/video data streams to be transmitted, and each memory block stores list information and an integral number of TS packets. At transmission time, the total number of TS packets across the N memory blocks is counted, the percentage of that total held by each memory block is calculated, and a range of consecutive serial numbers is allocated to each memory block according to its percentage. A uniformly distributed random number with a value range smaller than the total number of TS packets is then generated, and a TS packet is read and transmitted from the memory block whose serial-number range contains the generated random number. Uniformly distributed random numbers are generated repeatedly until all TS packets are transmitted. Thus a sudden burst of data in any one program is avoided, the stability of broadcasting is improved, and the audio/video data caching pressure at the cable television receiving end is relieved.
Owner:XI AN JIAOTONG UNIV
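A Python sketch of the proportional draw described above: each program's memory block receives a contiguous serial-number range sized by its share of the pending TS packets, and a uniform random number selects which block transmits next. Block names and counts are illustrative; the seeded generator is only for reproducibility.

```python
import random

def build_ranges(counts):
    """Map each memory block to a half-open serial-number range whose width
    equals that block's pending TS packet count."""
    ranges, start = {}, 0
    for block, n in counts.items():
        ranges[block] = (start, start + n)
        start += n
    return ranges, start

def pick_block(counts, rng):
    """Draw a uniform serial number and return the block whose range holds it,
    so blocks are selected in proportion to their packet counts."""
    ranges, total = build_ranges(counts)
    r = rng.randrange(total)  # uniform over [0, total)
    for block, (lo, hi) in ranges.items():
        if lo <= r < hi:
            return block
```

With counts {"p0": 3, "p1": 1}, block p0 owns serial numbers 0-2 and p1 owns 3, so p0 is picked about three quarters of the time, smoothing out per-program bursts.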

Method and system for preloading page information

The application discloses a page information preloading method and system. The method comprises: searching the current webpage and its historical operation records for content related to the next operation; detecting and acquiring the moving velocity of a touch point within the current page; obtaining the moving direction of the touch point from its position within the current page and its position within the previous page in the historical operation records; based on the moving velocity, forming at the touch point a preloading area with a horizontal-axis length along the moving direction and a longitudinal-axis length perpendicular to the moving direction; further adjusting the preloading area after optimizing the horizontal- and longitudinal-axis lengths based on the moving velocity; and, based on the preloading area, sending a preloading content request to the backend and fetching the next operation's content from the backend for preloading. The method and system search the backend for information data within the predicted trend range and preload it into the page, eliminating the waiting time for loading information data when the user performs the next page operation.
Owner:ALIBABA (CHINA) CO LTD
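An illustrative Python sketch of constructing the preload area: the touch point's displacement between pages gives the moving direction, its speed sets the area's extent along that direction, and a narrower extent is used perpendicular to it. The scale factors and function names are assumptions, not values from the application.

```python
import math

ALONG_PER_SPEED = 2.0  # preload reach (px) per px/s of touch speed (assumed)
ACROSS_RATIO = 0.25    # perpendicular half-width relative to reach (assumed)

def preload_area(prev_pt, cur_pt, dt):
    """Return (unit direction, reach along direction, reach across it)
    for the preloading area anchored at the current touch point."""
    dx, dy = cur_pt[0] - prev_pt[0], cur_pt[1] - prev_pt[1]
    dist = math.hypot(dx, dy)
    speed = dist / dt                                  # touch point velocity
    direction = (dx / dist, dy / dist) if dist else (0.0, 0.0)
    along = speed * ALONG_PER_SPEED                    # horizontal-axis length
    across = along * ACROSS_RATIO                      # longitudinal-axis length
    return direction, along, across
```

A fast upward swipe thus produces a long, narrow preload box pointing up the page, and the content request is scoped to that box.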

Data processing method and device

The invention provides a data processing method and device applied to a web application program. The method comprises the following steps: the web application program obtains an access request for a target file; according to the access request, it stores the accessed target file in a designated storage area in the storage medium of the electronic equipment, the designated storage area being an area outside the area occupied by the web application program's own cache; and the web application program loads and displays the target file from the designated storage area. With this method and device, the caching pressure on the web application program during operation can be reduced; when a file the web application program accesses already exists on the local disk, it can be loaded directly from the local disk, reducing downloads from the network, improving the running fluency of the web application program and the user experience.
Owner:GUIYANG LONGMASTER INFORMATION & TECHNOLOGY CO LTD
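A minimal Python sketch of the storage split described above: fetched files are persisted to a designated directory outside the web app's own cache area, and later accesses check that local area before downloading again. The directory and function names are illustrative assumptions.

```python
import os
import tempfile

# Designated storage area outside the web app's cache (illustrative path)
STORE = tempfile.mkdtemp(prefix="webapp_files_")

def fetch(name, download):
    """Load the target file from the designated local area if present;
    otherwise download it once, persist it there, and return it."""
    path = os.path.join(STORE, name)
    if os.path.exists(path):           # local hit: no network, no app-cache use
        with open(path) as f:
            return f.read(), "local"
    data = download(name)              # miss: download, then persist to disk
    with open(path, "w") as f:
        f.write(data)
    return data, "network"
```

The first access downloads the file; every later access is served straight from the local disk, so the app's own cache stays unburdened.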