Remote channel message compression method and system for electricity consumption collection system

A technology relating to electricity-consumption information collection, applied to transmission systems, digital transmission systems, adjusting channel coding, and related fields. It addresses problems such as data whose real-time delivery cannot be guaranteed, transmission that consumes a lot of time, and large transmission delay, with the effect of fewer algorithm steps, high compression performance, and low implementation cost.

Status: Inactive · Publication Date: 2016-05-04
CHONGQING UNIV OF POSTS & TELECOMM
Cites: 2 · Cited by: 4

AI-Extracted Technical Summary

Problems solved by technology

The communication volume is very large, and transmitting the data to the master station takes a lot of time.
The power grid department not only has to pay high operating costs, but also suffers low communication rates and large transmiss...

Abstract

The invention discloses a remote channel message compression method and system for an electricity consumption collection system. The method comprises the following steps: the master station issues a command to the concentrator; the concentrator responds to the command, generates an uplink message, compresses the uplink message, and transmits the compressed uplink message to the master station; after receiving the compressed uplink message, the master station carries out the decompression process; and the downlink message transmission process from the master station to the concentrator is the inverse process. A model is built online according to the distribution properties of the contexts of each order in the uplink message; the symbol probability of the next arriving character is predicted according to the context information established in the probability distribution model, and the character is encoded and output with that probability; and the probability distribution model is adaptively updated according to the distribution properties of the contexts of each order of the already encoded characters. By adopting the remote channel message compression method and system of the technical scheme of the invention, the data transmission delay of the remote channel of the electricity consumption collection system is shortened, the data transmission efficiency is improved, remote channel communication bandwidth is saved, and the data transmission cost between the master station and the concentrator is reduced.

Application Domain

Technology Topic

Image

  • Remote channel message compression method and system for electricity consumption collection system

Examples

  • Experimental program (4)

Example Embodiment

[0026] Example one. The present invention provides a communication message compression system between the master station and the concentrator of an electricity information collection system, as shown in figure 1.
[0027] Probability model: this method uses a multi-order context-dependent probability model with the following characteristics. For a character to be encoded in the message, the cumulative frequency of that character in each order of context is obtained from the profile of the context tree built step by step from the already-encoded string. There are two mechanisms for computing the predicted probability of the character.
[0028] The first is the back-off mechanism: look up whether the character to be encoded appears in the current, longest context. If it does, output its cumulative frequency together with the cumulative frequency of the preceding character. If it does not, output an escape character and fall back to the next shorter context, down to the order-0 context, where the probability assigned to the character is the ratio of the number of occurrences of the symbol to the number of symbols read so far. If the character has never appeared before, the model falls back to an order -1 context and assigns it a fixed probability of 1/256, because the hexadecimal message data character set has 256 symbols. The second is a hybrid mechanism, which combines the predicted probabilities of the character in every order of context as a weighted sum, where each weight is a function of the number of times the symbol has appeared in that context. This method uses the first mechanism. In addition, an exclusion method is used in the back-off mechanism: when the model falls back from a higher-order to a lower-order context, the characters that appeared in the higher-order context are excluded from the lower-order one, because if the character to be encoded were one of them the model would not have fallen back. This raises the predicted probability of the remaining characters in the lower-order context and improves compression.
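The back-off mechanism with exclusion described above can be sketched in Python. This is a minimal illustration, not the patent's implementation: contexts are stored as a plain dict from a context string to per-symbol counts, the escape symbol is given a fixed count of 1, and all names are invented for illustration.

```python
def predict(contexts, history, symbol, max_order=2):
    """Walk from the longest matched context down to order 0, emitting
    escapes until `symbol` is found.  Returns a list of
    (numerator, denominator) probabilities for the arithmetic coder."""
    excluded = set()          # exclusion: symbols already seen in higher orders
    steps = []
    for order in range(max_order, -1, -1):
        ctx = history[len(history) - order:] if order else ""
        counts = contexts.get(ctx)
        if not counts:
            continue          # this context never occurred: nothing to emit
        seen = {s: c for s, c in counts.items() if s not in excluded}
        total = sum(seen.values()) + 1           # +1 reserved for escape
        if symbol in seen:
            steps.append((seen[symbol], total))  # symbol found here
            return steps
        steps.append((1, total))                 # emit escape, back off
        excluded.update(seen)                    # exclude for lower orders
    # order -1: uniform over the 256-symbol byte alphabet, minus exclusions
    steps.append((1, 256 - len(excluded)))
    return steps

contexts = {"ab": {"c": 3}, "b": {"c": 1, "d": 2}, "": {"c": 4, "d": 2, "e": 1}}
print(predict(contexts, "ab", "d"))  # escape out of "ab", then hit "d" in "b"
```

Note how the exclusion set shrinks the denominator at lower orders: after escaping from context "ab", the symbol "c" no longer competes in context "b", so "d" gets a larger share of the probability mass.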
[0029] Learning mechanism: the context-tree data structure used in this method keeps the probability distribution of symbols in all distinct contexts. After the current character is encoded, the learning mechanism keeps learning the character statistics of the real data stream by updating the information of certain context-tree nodes. Two learning mechanisms are common: deep update and shallow update. With deep update, after a character is encoded, its current context node and all shorter context nodes (including the empty context) are updated, i.e. the character's frequency is incremented by 1 in each. With shallow update, only the highest-order context currently matched is updated; if the character is observed for the first time, all shorter context nodes are updated recursively. The learning mechanism and the probability model are clearly inseparable, and no single learning algorithm suits all probability models; this method prefers the deep update scheme.
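The deep update scheme the method prefers can be sketched as follows, using the same illustrative dict-of-counts representation as above (names are invented, not from the patent):

```python
def deep_update(contexts, history, symbol, max_order=2):
    """Deep update: after coding `symbol`, increment its count in the
    current context and in every shorter context, down to and
    including the empty (order-0) context."""
    for order in range(max_order, -1, -1):
        ctx = history[len(history) - order:] if order else ""
        counts = contexts.setdefault(ctx, {})    # create missing nodes
        counts[symbol] = counts.get(symbol, 0) + 1

contexts = {}
deep_update(contexts, "ab", "c")
print(contexts)  # "c" counted under "ab", "b", and the empty context
```

A shallow update would instead touch only the longest matched context, recursing to shorter ones only when the symbol is new there.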
[0030] Arithmetic coding: in this method, adaptive arithmetic coding is applied after the probability model has predicted the character probability. That is, the character is coded from its own cumulative probability distribution and the cumulative probability distribution of the preceding character.
[0031] The data compression processing flow is described below at two levels. On one hand, the data transmission process is described from the application scenario of the data compression module in the electricity consumption information collection system; on the other hand, the character processing flow of the data compression module algorithm is described from its concrete implementation. Finally, specific message examples are used for comparison.

Example Embodiment

[0032] Example two.
[0033] As shown in figures 2 and 3, the present invention provides the communication environment of the method in an actual electricity consumption information collection system. For convenience of description, only the parts related to the examples of the present invention are shown. The application communication environment of the present invention comprises three parts: a concentrator, a remote communication channel, and a master station system. The concentrator is responsible for collecting and storing the electric-energy data and power-quality data of the meters connected under a transformer area. The master station system is responsible for issuing various data requests, parameter settings, and control commands to the concentrator according to actual business needs.
[0034] A characteristic of the remote channel of the electricity information collection system is that uplink traffic is much greater than downlink traffic, because in uplink communication the concentrator has a large amount of data to upload to the master station. This example therefore only illustrates the data compression processing flow of the uplink channel.
[0035] As shown in figure 2, after the concentrator receives a call-in command from the master station, it generates the corresponding 376.1 response message according to the function code. The message is compressed by the data compression/decompression module and then sent out by the concentrator's remote communication module. After the master station receives the compressed message data, it performs the inverse decompression process, consistent with the concentrator, and finally obtains the actual response message. It should be understood that the above process is reversible.
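The compress, transmit, decompress round trip described above can be sketched in Python. This is a minimal sketch only: `zlib` stands in for the patent's context-model coder, the function names are invented, and the message bytes are a placeholder rather than a real 376.1 frame.

```python
import zlib

def concentrator_respond(message: bytes) -> bytes:
    """Concentrator side: compress the uplink response message before
    handing it to the remote communication module.  zlib is a
    stand-in for the patent's context-model compressor."""
    return zlib.compress(message)

def master_station_receive(payload: bytes) -> bytes:
    """Master-station side: the inverse decompression process."""
    return zlib.decompress(payload)

# Placeholder payload, not a real 376.1 frame.
msg = b"uplink 376.1 response: meter readings ..."
assert master_station_receive(concentrator_respond(msg)) == msg  # reversible
```

Whatever coder is used, the essential property is the one the paragraph above states: the master station's decompression is the exact inverse of the concentrator's compression, so the original response message is recovered bit for bit.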

Example Embodiment

[0036] Example three;
[0037] In the embodiment of the present invention, the data to be compressed is compressed online; that is, the message data can be compressed while it is being generated, without waiting for all of the message data to be generated first, which reduces the transmission delay of the data inside the concentrator.
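The chunk-by-chunk pattern of online compression can be illustrated with Python's streaming zlib API. The patent's own coder is different, but the idea is the same: each chunk is compressed as soon as it is generated, and compressed bytes can be sent before the whole message exists.

```python
import zlib

# Compress each chunk as soon as it is generated, instead of
# buffering the whole message first.  Chunk contents are illustrative.
comp = zlib.compressobj()
compressed = b""
for chunk in (b"meter block 1; ", b"meter block 2; ", b"meter block 3"):
    compressed += comp.compress(chunk)   # may emit bytes immediately
compressed += comp.flush()               # flush any buffered remainder
```

The context-model coder of this method is naturally suited to this pattern, since it processes one character at a time and its model updates depend only on characters already seen.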
[0038] As shown in Figure 4, the present invention provides a schematic flow diagram of the specific data compression algorithm, detailed as follows:
[0039] Step S1, initialize the search tree, and set the root node to be empty.
[0040] Step S2, obtain the next character S to be encoded.
[0041] Steps S3~S6, judge whether S must be coded as an escape character (i.e. whether S appears as a child of the current context node). If so, output an escape character, add the characters that appear in the current context to the excluded character set, and move to the next shorter context node; execute S3~S6 in a loop until S is no longer an escape character.
[0042] Step S7, if the character is not an escape character, encode the character.
[0043] Step S8, update the status of the context nodes of each order, adding any context node that does not yet exist.
[0044] In step S9, the excluded character set is cleared.
[0045] In step S10, if it is a termination character, the encoding is terminated.
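The steps S1 through S10 above can be sketched as a single Python loop, combining the back-off prediction, deep update, and exclusion clearing. This is an illustrative sketch under invented names: it returns the (count, total) pairs that would drive the arithmetic coder, rather than actual coded bits.

```python
def encode_stream(data, max_order=2):
    """Sketch of the S1-S10 loop: for each character, walk down the
    contexts emitting escapes (S3-S6), code the symbol (S7),
    deep-update the tree (S8), and discard the exclusion set (S9)."""
    contexts = {}                        # S1: empty context tree
    coded = []                           # (numerator, denominator) pairs
    for i, symbol in enumerate(data):    # S2 / S10: until input exhausted
        history = data[max(0, i - max_order):i]
        excluded = set()
        for order in range(min(i, max_order), -1, -1):    # S3-S6
            ctx = history[len(history) - order:] if order else ""
            counts = contexts.get(ctx)
            if not counts:
                continue
            seen = {s: c for s, c in counts.items() if s not in excluded}
            total = sum(seen.values()) + 1        # +1 reserved for escape
            if symbol in seen:
                coded.append((seen[symbol], total))        # S7
                break
            coded.append((1, total))               # escape, back off
            excluded.update(seen)
        else:
            coded.append((1, 256 - len(excluded))) # order -1: novel symbol
        for order in range(min(i, max_order), -1, -1):     # S8: deep update
            ctx = history[len(history) - order:] if order else ""
            contexts.setdefault(ctx, {}).setdefault(symbol, 0)
            contexts[ctx][symbol] += 1
        # S9: the exclusion set is a loop-local variable, so it is
        # effectively cleared at the start of each iteration
    return coded
```

For example, `encode_stream("aa")` codes the first "a" as a novel order -1 symbol and the second from the updated order-0 counts, which is exactly the adaptivity paragraph [0046] below relies on: the decoder can rebuild the same `contexts` because it sees the same already-decoded characters.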
[0046] A characteristic of the compression algorithm proposed by this method is that, when decoding, the decoder can construct the same data structure and context statistics as the encoder, even though the predicted probability distribution of each character changes adaptively, because for each character both encoder and decoder have access to the same already-encoded character information.
