
Improvement method of distributed cache system Memcached based on BP neural network

A BP neural network and distributed cache technology, applied in neural learning methods, biological neural network models, transmission systems, etc., that addresses problems such as poor user experience, low read and write speed, and weak concurrency, with the effect of improving access speed and working efficiency.

Active Publication Date: 2017-12-01
NANJING UNIV OF POSTS & TELECOMM

AI Technical Summary

Problems solved by technology

[0002] With the advent of the era of cloud computing and big data, the ever-increasing demands of user groups have brought massive data growth, forcing traditional caching technologies to increasingly show disadvantages such as poor capacity scalability, low read and write speed, poor user experience, and weak concurrency.



Examples


Specific Embodiments

[0028] The technical solutions of the present invention will be further described in detail below in conjunction with the accompanying drawings and specific embodiments. The specific implementation is described as follows:

[0029] Step 1: Initialize HC Model.

[0030] The HC Model consists of two randomly selected Memcached nodes. One is designated as the hot server (Hot Server), which mainly stores the key values and routing information of hot objects; the other is designated as the cold server (Cold Server), which stores temporarily reclaimed key values and routing information. When the HC Model is initialized, both the hot data server node (Hot Server) and the cold data server node (Cold Server) are empty.
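As an illustration of Step 1, the following is a minimal sketch of an HC Model holding an empty hot node and an empty cold node. The class and field names (CacheNode, HCModel, entries) are hypothetical and not taken from the patent.

```python
# Hypothetical sketch of the HC Model in Step 1: two Memcached-style nodes,
# one for hot data and one for cold data, both empty at initialization.

class CacheNode:
    """One server node: maps keys to (value, routing-info) pairs."""
    def __init__(self, name):
        self.name = name
        self.entries = {}  # key -> (value, route)

    def put(self, key, value, route):
        self.entries[key] = (value, route)

    def get(self, key):
        return self.entries.get(key)


class HCModel:
    """Hot/Cold model: a hot server for hotspot objects and a cold server
    for temporarily reclaimed objects."""
    def __init__(self):
        self.hot_server = CacheNode("Hot Server")    # empty on initialization
        self.cold_server = CacheNode("Cold Server")  # empty on initialization


hc = HCModel()
assert not hc.hot_server.entries and not hc.cold_server.entries
```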

[0031] Step 2: Use the three-layer BP neural network classifier to train on the sample data and obtain the classification basis, as shown in Figure 3:

[0032] (1) Initialization: set the number of nodes H in the hidden layer, the weight w between the input layer and the h...
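The excerpt of step (1) is truncated above. As a rough, non-authoritative sketch of the kind of three-layer BP (back-propagation) classifier Step 2 describes, the following trains an input-hidden-output network with sigmoid activations; the hidden-layer size, learning rate, and feature layout are assumptions, not values from the patent.

```python
import numpy as np

# Minimal three-layer BP neural network (input -> hidden -> output) trained by
# back-propagation; a sketch of the classifier in Step 2, not the patent's code.

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class BPClassifier:
    def __init__(self, n_in, n_hidden, lr=0.5, seed=0):
        rng = np.random.default_rng(seed)
        self.w1 = rng.normal(0.0, 0.5, (n_in, n_hidden))  # input -> hidden weights
        self.b1 = np.zeros(n_hidden)
        self.w2 = rng.normal(0.0, 0.5, (n_hidden, 1))     # hidden -> output weights
        self.b2 = np.zeros(1)
        self.lr = lr

    def forward(self, x):
        self.h = sigmoid(x @ self.w1 + self.b1)
        self.o = sigmoid(self.h @ self.w2 + self.b2)
        return self.o

    def train(self, X, y, epochs=3000):
        for _ in range(epochs):
            out = self.forward(X)                                # forward pass
            err_o = (out - y) * out * (1 - out)                  # output-layer delta
            err_h = (err_o @ self.w2.T) * self.h * (1 - self.h)  # hidden-layer delta
            self.w2 -= self.lr * self.h.T @ err_o                # back-propagate
            self.b2 -= self.lr * err_o.sum(axis=0)
            self.w1 -= self.lr * X.T @ err_h
            self.b1 -= self.lr * err_h.sum(axis=0)

    def is_hot(self, x):
        """Return True for hotspot objects, False for cold ones."""
        return bool(self.forward(np.atleast_2d(x))[0, 0] >= 0.5)

# Example: features might be (access frequency, recency); label 1 = hot, 0 = cold.
X = np.array([[0.9, 0.8], [0.1, 0.2], [0.8, 0.9], [0.2, 0.1]])
y = np.array([[1.0], [0.0], [1.0], [0.0]])
clf = BPClassifier(n_in=2, n_hidden=4)
clf.train(X, y)
print(clf.forward(np.array([[0.85, 0.9]])))  # values near 1 indicate a hot object
```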



Abstract

The invention discloses an improvement method for the distributed cache system Memcached based on a BP neural network. The method comprises the following steps: initializing the HC Model; training on sample data with a three-layer BP neural network classifier to obtain the classification basis; feeding actual data into the successfully trained three-layer BP neural network classifier to separate hotspot data objects from cold data objects; and loading the real data of the HC Model, that is, loading the hotspot data objects and cold data objects onto the corresponding server nodes. By using the HC Model and the method of distinguishing hotspot data objects from cold data objects with the three-layer BP neural network classifier provided by the invention, a hotspot data server and a cold data server are added alongside Memcached's own distributed nodes, and the routing and key value information of the corresponding data are stored in these servers, thereby greatly improving the system's subsequent access speed to the data and the working efficiency of the Memcached distributed cache system.
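To make the claimed speed-up concrete, here is a minimal, self-contained sketch of the lookup order the abstract implies: check the hot server first, then the cold server, then fall back to an ordinary Memcached node chosen by hashing the key. The function name and the use of plain dicts as stand-ins for server nodes are assumptions, not from the patent.

```python
# Illustrative lookup order implied by the abstract (assumed, not the patent's
# code): hot server first, then cold server, then a regular Memcached node.

def lookup(key, hot, cold, memcached_nodes):
    """hot/cold: dicts acting as the hot and cold server stores;
    memcached_nodes: list of dicts standing in for ordinary nodes."""
    if key in hot:
        return hot[key]                # hotspot objects: fastest path
    if key in cold:
        return cold[key]               # temporarily reclaimed objects
    node = memcached_nodes[hash(key) % len(memcached_nodes)]
    return node.get(key)               # ordinary distributed lookup

# Example usage with toy data:
hot = {"user:1": "alice"}
cold = {"user:2": "bob"}
nodes = [{"user:3": "carol"}, {}]
print(lookup("user:1", hot, cold, nodes))  # served from the hot server
```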

Description

Technical field

[0001] The invention belongs to the field of distributed cache data efficiency, and relates to a data exchange model and a data classification method.

Background technique

[0002] With the advent of the era of cloud computing and big data, the ever-increasing demands of user groups have brought massive data growth, forcing traditional caching technologies to increasingly show disadvantages such as poor capacity scalability, low read and write speed, poor user experience, and weak concurrency. The distributed cache technology of cloud computing provides a solution, with the advantages of high-speed reading and writing, rapid expansion, support for concurrency, and fast response. The distributed cache system Memcached is high-performance and distributed. It manages cache data by maintaining a unified, very large hash table in memory, aiming to reduce the database load of dynamic applications and improve the access speed of ca...
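As a toy illustration of the background's description of Memcached distributing a large in-memory hash table across nodes, the following shows how a client-side hash of the key can select a server node. The node addresses and the simple modulo scheme are assumptions for illustration; real Memcached clients commonly use consistent hashing.

```python
import hashlib

# Toy illustration (assumed, not from the patent): a Memcached-style client
# hashes the key and picks one of the distributed server nodes.

def pick_node(key, nodes):
    digest = hashlib.md5(key.encode("utf-8")).hexdigest()
    return nodes[int(digest, 16) % len(nodes)]

nodes = ["10.0.0.1:11211", "10.0.0.2:11211", "10.0.0.3:11211"]
print(pick_node("user:42:profile", nodes))  # deterministic node choice
```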


Application Information

Patent Type & Authority: Application (China)
IPC(8): H04L29/08; G06N3/08
CPC: H04L67/1097; G06N3/084; H04L67/568; H04L67/5682; H04L67/60
Inventor: Jin Xianli, Zhao Xingwang
Owner: NANJING UNIV OF POSTS & TELECOMM