
Method for optimizing throughput rate and response time of decision engine

A method concerning response time and throughput rate, applied in the field of decision engines, that addresses problems such as reduced system response efficiency, frequent fluctuation of system TPS and response time, and degraded service TPS and response time.

Pending Publication Date: 2020-07-07
无锡智道安盈科技有限公司

AI Technical Summary

Problems solved by technology

[0003] In actual business scenarios, high concurrency causes fact objects to be frequently inserted into tasks and then destroyed through the API. Such operations trigger frequent JVM garbage collection (GC), which reduces system response efficiency and causes system TPS and response time to fluctuate frequently. This is because JVM GC consumes CPU resources and briefly interrupts the system at the millisecond level, thereby affecting the TPS and response time of the service.
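The GC pressure described above can be illustrated with a minimal Java sketch (class and method names here are hypothetical, for illustration only): every request allocates a short-lived fact object that immediately becomes garbage, so high request rates translate directly into frequent collection cycles.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical illustration of the per-request pattern the patent identifies
// as the GC bottleneck: a fresh fact container is created and discarded on
// every single call.
class PerRequestAllocation {
    static String serve(String request) {
        List<Object> facts = new ArrayList<>(); // fact object allocated per request
        facts.add(request);                     // rules would evaluate against it
        return "ok:" + request;                 // `facts` becomes garbage here
    }

    public static void main(String[] args) {
        // Under high concurrency this loop runs continuously, and the
        // short-lived `facts` allocations drive frequent JVM GC cycles.
        for (int i = 0; i < 1000; i++) serve("r" + i);
        System.out.println(serve("demo"));
    }
}
```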

Method used



Examples


Embodiment Construction

[0022] As can be seen in figure 1, the present invention includes the following processing flow:

[0023] Step 1: Use a blocking queue to create a task buffer pool, initialize the tasks and put them into the queue, then take out the first task and assign it to the current task; initialization is complete.

[0024] Step 2: When a request arrives, process it with the current task; when the request counter modulo the threshold is 0, log off the current task.

[0025] Step 3: Generate a new task and put it into the queue, take the task at the head of the task buffer pool and assign it to the current task, and continue to process rule-execution requests.

[0026] Step 4: Merge the logged-off tasks and the newly generated tasks into the queue.
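The four steps above can be sketched in Java (the JVM language implied by the GC discussion). This is a hypothetical illustration, not the patent's implementation: the class and method names are invented, and a `LinkedBlockingQueue` is assumed as the task buffer pool.

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

// Minimal sketch of the task buffer pool in steps 1-4 (names hypothetical).
public class TaskPool {
    // A "task" stands in for the engine's working-memory/session object.
    static class Task {
        String process(String request) { return "ok:" + request; }
    }

    private final BlockingQueue<Task> pool = new LinkedBlockingQueue<>();
    private final int threshold;   // recycle the current task every `threshold` requests
    private Task current;
    private long requestCounter = 0;
    private int recycled = 0;      // number of log-off cycles, for illustration

    TaskPool(int poolSize, int threshold) throws InterruptedException {
        this.threshold = threshold;
        for (int i = 0; i < poolSize; i++) pool.put(new Task()); // Step 1: pre-fill pool
        this.current = pool.take();                              // Step 1: assign current task
    }

    synchronized String handle(String request) throws InterruptedException {
        String result = current.process(request);   // Step 2: serve the request
        if (++requestCounter % threshold == 0) {    // Step 2: counter modulo threshold == 0
            Task retired = current;                 // log off the current task
            pool.put(new Task());                   // Step 3: generate a new task into the queue
            pool.put(retired);                      // Step 4: merge the logged-off task back in
            current = pool.take();                  // Step 3: head of pool becomes current
            recycled++;
        }
        return result;
    }

    int recycledCount() { return recycled; }
}
```

Because tasks are only retired once per `threshold` requests, object churn (and hence GC work) is batched rather than incurred on every request, which is the effect the abstract attributes to the modulo operation.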

[0027] In use, as shown in the attached figure 1, after introducing the above method, the JMeter test curve of the rule-engine stress test shows TPS stable at around 8000, the system i...



Abstract

The invention provides a method for optimizing the throughput rate and response time of a decision engine. The method comprises the following steps: 1, a task buffer pool is established using a blocking queue, the tasks are initialized and put into the queue, the first task is taken out and assigned to the current task, and initialization is finished; 2, when a request arrives, it is processed using the current task, and the current task is logged off when the request counter modulo a threshold value is 0; 3, a new task is generated and put into the queue; and 4, the logged-off task and the generated task are merged into the queue. By constructing the task buffer pool, the method avoids the JVM garbage-collection operations caused by the large amount of garbage generated when each request deletes a task; batch centralized destruction via the modulo operation avoids the rule-engine response-time instability caused by garbage collection under high system concurrency.

Description

technical field [0001] The invention mainly relates to the field of decision engines, in particular to a method for optimizing the throughput rate and response time of a decision engine. Background technique [0002] The decision engine originated from rule-based expert systems and developed from the reasoning engine. It is an independent module or component widely used in the Internet industry. It separates business decisions from application code, expresses business decisions using predefined semantic modules, accepts data input, interprets business rules, and makes business decisions according to those rules. The decision engine mainly comprises three parts: the rule base, the fact base (working memory), and the reasoning engine. The functional positioning of decision-engine middleware determines that the decision engine shoulders important tasks. Most decision-engine frameworks have implemented this work, but when specific business scenarios ...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F9/50, G06F9/48, G06F9/52
CPC: G06F9/5027, G06F9/5022, G06F9/4881, G06F9/52, G06F2209/5018
Inventor: 王聪, 郑高峰
Owner: 无锡智道安盈科技有限公司