44 results about "Aggregate level" patented technology

The aggregate level cost method is an actuarial accounting method that matches and allocates the costs and benefits of a pension plan over the span of the plan's life. It typically takes the present value of future benefits minus the asset value and spreads the excess over the future payroll of the participants.
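As a rough illustration of the spreading step, the excess can be expressed as a normal-cost rate applied to payroll. The sketch below is a minimal, hypothetical example; the discount rate, cash flows, and helper names are assumptions for illustration only, not figures from the text.

```python
# Minimal sketch of the aggregate level cost method's spreading step.
# All figures and the discount rate are hypothetical illustrations.

def present_value(cash_flows, rate):
    """Discount a list of year-end cash flows back to today."""
    return sum(cf / (1 + rate) ** (t + 1) for t, cf in enumerate(cash_flows))

discount_rate = 0.05                        # assumed valuation interest rate
pv_future_benefits = present_value([120_000] * 20, discount_rate)
plan_assets = 900_000                       # assumed current asset value
pv_future_payroll = present_value([1_000_000] * 20, discount_rate)

# Spread the unfunded amount over expected future payroll.
normal_cost_rate = (pv_future_benefits - plan_assets) / pv_future_payroll
current_payroll = 1_000_000
annual_contribution = normal_cost_rate * current_payroll

print(f"normal cost rate: {normal_cost_rate:.2%}")
print(f"annual contribution: {annual_contribution:,.0f}")
```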

Method for reducing fetch time in a congested communication network

Congestion within a communication network is controlled by rate limiting packet transmissions over selected communication links within the network and modulating the rate limiting according to buffer occupancies at control nodes within the network. Preferably, though not necessarily, the rate limiting of the packet transmissions is performed at an aggregate level for all traffic streams utilizing the selected communication links. The rate limiting may also be performed dynamically in response to measured network performance metrics, such as the throughput of the selected communication links input to the control points and/or the buffer occupancy level at the control points. The network performance metrics may be measured according to at least one of: a moving average of the measured quantity, a standard average of the measured quantity, or another filtered average of the measured quantity. The rate limiting may be achieved by varying an inter-packet delay time over the selected communication links at the control points. The control points themselves may be located upstream or downstream (or both) of congested nodes within the network and need be located on only a few of the communication links that are coupled to a congested node within the network. More generally, the control points need only be associated with a fraction of the total number of traffic streams applied to a congested node within the network.
Owner:RIVERBED TECH LLC
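As a rough sketch of the mechanism described in this abstract, the fragment below adjusts an aggregate inter-packet delay from an exponentially weighted moving average of buffer occupancy at a control point. The thresholds, gains, and class/function names are hypothetical assumptions, not details from the patent.

```python
# Hypothetical sketch: modulate an aggregate inter-packet delay from a
# moving average of buffer occupancy at a control point.

class RateLimiter:
    def __init__(self, target_occupancy=0.5, alpha=0.2,
                 min_delay_s=0.0, max_delay_s=0.050):
        self.target = target_occupancy      # desired buffer fill fraction
        self.alpha = alpha                  # EWMA smoothing factor
        self.min_delay = min_delay_s
        self.max_delay = max_delay_s
        self.avg_occupancy = 0.0
        self.delay = min_delay_s            # current inter-packet delay

    def update(self, occupancy_sample):
        """Fold in a new buffer-occupancy sample (0.0 .. 1.0) and
        nudge the inter-packet delay toward the target occupancy."""
        self.avg_occupancy = (self.alpha * occupancy_sample
                              + (1 - self.alpha) * self.avg_occupancy)
        error = self.avg_occupancy - self.target
        # Increase delay when the buffer runs hot, decrease when it drains.
        self.delay += 0.005 * error
        self.delay = max(self.min_delay, min(self.max_delay, self.delay))
        return self.delay

limiter = RateLimiter()
for sample in [0.3, 0.6, 0.8, 0.9, 0.7]:    # measured occupancy samples
    print(f"occupancy {sample:.1f} -> delay {limiter.update(sample)*1000:.2f} ms")
```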

Data quality management using business process modeling

Inactive · US20070198312A1 · Impacts the overall quality of sales data · Low cost · Resources · Complex mathematical operations · Information processing · Dashboard
A business process modeling framework is used for data quality analysis. The modeling framework represents the sources of transactions entering the information processing system, the various tasks within the process that manipulate or transform these transactions, and the data repositories in which the transactions are stored or aggregated. A subset of these tasks is identified as potential error-introduction sources, and the rate and magnitude of various error classes at each such task are probabilistically modeled. This model can be used to predict how changes in transaction volumes and business processes impact data quality at the aggregate level in the data repositories. The model can also account for the presence of error-correcting controls and assess how the placement and effectiveness of these controls alter the propagation and aggregation of errors. Optimization techniques are used for the placement of error-correcting controls that meet target quality requirements while minimizing the cost of operating these controls. This analysis also contributes to the development of business “dashboards” that allow decision-makers to monitor and react to key performance indicators (KPIs) based on aggregation of the transactions being processed. Data quality estimation in real time provides the accuracy of these KPIs (in terms of the probability that a KPI is above or below a given value), which may condition the action undertaken by the decision-maker.
Owner:DOORDASH INC
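A minimal sketch of the kind of propagation such a model performs is shown below: each task contributes an error rate, an optional control catches a fraction of those errors, and the repository accumulates the expected error count. The task list, error rates, and control effectiveness values are illustrative assumptions, not figures from the patent.

```python
# Hypothetical sketch: propagate expected error counts through a sequence
# of tasks, some of which are followed by error-correcting controls.

tasks = [
    # (name, probability a transaction picks up an error at this task,
    #  effectiveness of the control placed after it, or 0.0 if none)
    ("order entry", 0.020, 0.90),
    ("enrichment",  0.010, 0.00),
    ("aggregation", 0.005, 0.75),
]

def expected_errors(volume, tasks):
    """Expected number of erroneous transactions reaching the repository."""
    errors = 0.0
    for name, p_error, control_eff in tasks:
        introduced = volume * p_error                 # errors injected here
        surviving = introduced * (1 - control_eff)    # errors a control misses
        errors += surviving
    return errors

monthly_volume = 100_000
print(f"expected errors at the aggregate level: "
      f"{expected_errors(monthly_volume, tasks):,.0f}")
```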

Data quality management using business process modeling

Inactive · US20080195440A1 · Impacts the overall quality of sales data · Low cost · Resources · Complex mathematical operations · Dashboard · Information processing
A business process modeling framework is used for data quality analysis. The modeling framework represents the sources of transactions entering the information processing system, the various tasks within the process that manipulate or transform these transactions, and the data repositories in which the transactions are stored or aggregated. A subset of these tasks is identified as potential error-introduction sources, and the rate and magnitude of various error classes at each such task are probabilistically modeled. This model can be used to predict how changes in transaction volumes and business processes impact data quality at the aggregate level in the data repositories. The model can also account for the presence of error-correcting controls and assess how the placement and effectiveness of these controls alter the propagation and aggregation of errors. Optimization techniques are used for the placement of error-correcting controls that meet target quality requirements while minimizing the cost of operating these controls. This analysis also contributes to the development of business “dashboards” that allow decision-makers to monitor and react to key performance indicators (KPIs) based on aggregation of the transactions being processed. Data quality estimation in real time provides the accuracy of these KPIs (in terms of the probability that a KPI is above or below a given value), which may condition the action undertaken by the decision-maker.
Owner:DOORDASH INC
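This abstract also describes estimating, in real time, the probability that a KPI lies above or below a given value. A minimal sketch under assumed inputs is shown below: the error model's uncertainty is treated as Gaussian noise on the measured KPI, purely for illustration; the distributional assumption, figures, and function name are not taken from the patent.

```python
# Hypothetical sketch: probability that a KPI exceeds a threshold when
# measurement error from data-quality issues is modeled as Gaussian noise.
import math

def prob_kpi_above(measured_kpi, error_std, threshold):
    """P(true KPI > threshold), assuming true KPI ~ Normal(measured, error_std)."""
    z = (threshold - measured_kpi) / error_std
    return 1 - 0.5 * (1 + math.erf(z / math.sqrt(2)))

measured_sales = 1_250_000      # KPI computed from aggregated transactions
error_std = 40_000              # uncertainty implied by the error model
target = 1_200_000              # decision threshold shown on the dashboard

print(f"P(sales KPI above target) = "
      f"{prob_kpi_above(measured_sales, error_std, target):.1%}")
```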

Generalized likelihood ratio test (GLRT) based network intrusion detection system in wavelet domain

An improved system and method for detecting network anomalies comprises, in one implementation, a computer device and a network anomaly detector module executed by the computer device arranged to electronically sniff network traffic data at an aggregate level using a windowing approach. The windowing approach is configured to view the network traffic data through a plurality of time windows, each of which represents a sequence of a feature such as packets per second or flows per second. The network anomaly detector module is configured to execute a wavelet transform for capturing properties of the network traffic data, such as long-range dependence and self-similarity. The wavelet transform is a multiresolution transform and can be configured to decompose the statistics of the network traffic data, yielding a simplified and fast algorithm. The network anomaly detector module is also configured to execute a bivariate Cauchy-Gaussian mixture (BCGM) statistical model for processing and modeling the network traffic data in the wavelet domain. The BCGM statistical model is an approximation of the α-stable model and offers a closed-form expression for its probability density function, which increases accuracy and analytical tractability and facilitates parameter estimation compared to the α-stable model. Finally, the network anomaly detector module is further configured to execute a generalized likelihood ratio test for detecting the network anomalies.
Owner:AMIRMAZLAGHANI MARYAM +2
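As a rough illustration of the pipeline this abstract describes, the sketch below computes first-level Haar wavelet detail coefficients over sliding windows of a packets-per-second series and applies a simplified GLRT for a variance change. The BCGM model is replaced here by a Gaussian stand-in, and the window size, threshold, and injected burst are assumptions made purely for illustration.

```python
# Hypothetical sketch: windowed Haar wavelet detail coefficients plus a
# simplified GLRT for a variance change (Gaussian stand-in for BCGM).
import numpy as np

def haar_details(x):
    """First-level Haar detail coefficients of a 1-D signal."""
    x = np.asarray(x, dtype=float)
    if len(x) % 2:                       # drop a trailing sample if odd length
        x = x[:-1]
    return (x[0::2] - x[1::2]) / np.sqrt(2)

def glrt_statistic(coeffs, baseline_var):
    """2 * log likelihood ratio for H1: variance = MLE vs H0: variance = baseline."""
    n = len(coeffs)
    mle_var = np.mean(coeffs ** 2) + 1e-12
    return n * (np.log(baseline_var / mle_var) + mle_var / baseline_var - 1.0)

rng = np.random.default_rng(0)
packets_per_sec = rng.poisson(100, 512).astype(float)   # background traffic
packets_per_sec[300:330] += 80                          # injected burst

baseline_var = np.var(haar_details(packets_per_sec[:128]))
window, threshold = 64, 10.0                            # assumed settings
for start in range(0, len(packets_per_sec) - window, window):
    d = haar_details(packets_per_sec[start:start + window])
    stat = glrt_statistic(d, baseline_var)
    flag = "ANOMALY" if stat > threshold else "ok"
    print(f"window {start:4d}-{start + window:4d}: GLRT={stat:7.2f}  {flag}")
```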