
Design of background automatic computing engine of data management system

A data management system and automatic computing technology, applied to database management systems, electronic digital data processing, special-purpose data processing applications, and the like. It solves the problem that the storage system cannot store big data, and achieves the effects of convenient use and a good user experience.

Pending Publication Date: 2021-10-15
深圳弘星智联科技有限公司
Cites: 0; Cited by: 0

AI Technical Summary

Problems solved by technology

[0005] Aiming at the deficiencies of the prior art, the present invention provides a design for the background automatic computing engine of a data management system, which solves the problem that the storage system cannot store large amounts of data.


Image

  • Design of background automatic computing engine of data management system

Examples


Embodiment 1

[0025] As shown in Figure 1, the embodiment of the present invention provides a design for the background automatic computing engine of a data management system, which adopts a combination of Spark and the Hadoop distributed architecture to process large-scale data computation. The design includes the following implementation steps:

[0026] Step 1. The data producer writes data into Kafka. The Kafka cluster contains multiple broker servers, and Flume can be either Flume-ng or Flume-og;
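
As a rough illustration of Step 1, the sketch below shows a minimal data producer in Python. The kafka-python client, the broker addresses, and the topic name "events" are assumptions made for illustration; the patent does not specify them.

```python
# Minimal sketch of a producer writing JSON records into a Kafka topic.
# Broker addresses and the "events" topic name are hypothetical placeholders.
import json

from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers=["broker1:9092", "broker2:9092"],  # cluster with multiple brokers
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

record = {"user_id": 42, "action": "login", "ts": "2021-10-15T08:00:00Z"}
producer.send("events", value=record)  # publish one record
producer.flush()                       # make sure it is actually delivered
```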

[0027] Step 2. The offline architecture uses Flume to collect the data from Kafka and transfer it to HDFS;
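
Flume itself is driven by an agent properties file rather than code. The Python sketch below simply writes out one hypothetical minimal Flume-ng configuration (Kafka source, memory channel, HDFS sink); all host names, the topic, and the HDFS path are assumptions, not values from the patent.

```python
# Writes a minimal, hypothetical Flume-ng agent configuration that pulls records
# from a Kafka topic and lands them in HDFS (Step 2 of the offline architecture).
FLUME_CONF = """\
agent.sources  = kafka-src
agent.channels = mem-ch
agent.sinks    = hdfs-sink

agent.sources.kafka-src.type = org.apache.flume.source.kafka.KafkaSource
agent.sources.kafka-src.kafka.bootstrap.servers = broker1:9092,broker2:9092
agent.sources.kafka-src.kafka.topics = events
agent.sources.kafka-src.channels = mem-ch

agent.channels.mem-ch.type = memory

agent.sinks.hdfs-sink.type = hdfs
agent.sinks.hdfs-sink.hdfs.path = hdfs://namenode:8020/raw/events/%Y-%m-%d
agent.sinks.hdfs-sink.hdfs.fileType = DataStream
agent.sinks.hdfs-sink.hdfs.useLocalTimeStamp = true
agent.sinks.hdfs-sink.channel = mem-ch
"""

with open("kafka-to-hdfs.conf", "w") as f:
    f.write(FLUME_CONF)
```

The agent could then be started with something like `flume-ng agent --name agent --conf-file kafka-to-hdfs.conf`.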

[0028] Step 3. The real-time architecture uses Spark Streaming to consume the data from Kafka, formats the raw data, and writes it into HBase or Elasticsearch;
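
A minimal Structured Streaming sketch of Step 3 in PySpark is shown below. It assumes the spark-sql-kafka package and, for the write, the elasticsearch-hadoop connector are available on the classpath; the topic, schema, broker and node addresses, and index name are all hypothetical.

```python
# Sketch of the real-time path: consume Kafka, parse the raw JSON payload,
# and write each micro-batch to Elasticsearch (HBase would need its own connector).
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StringType, StructField, StructType

spark = SparkSession.builder.appName("realtime-ingest").getOrCreate()

schema = StructType([
    StructField("user_id", StringType()),
    StructField("action", StringType()),
    StructField("ts", StringType()),
])

raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker1:9092,broker2:9092")
       .option("subscribe", "events")
       .load())

# Kafka delivers the payload as bytes; cast to string and pull out the JSON fields.
events = (raw.select(from_json(col("value").cast("string"), schema).alias("e"))
             .select("e.*"))

def write_batch(batch_df, batch_id):
    # Each micro-batch is appended to a (hypothetical) "events" index.
    (batch_df.write
     .format("org.elasticsearch.spark.sql")
     .option("es.nodes", "es-node1")
     .mode("append")
     .save("events"))

query = events.writeStream.foreachBatch(write_batch).start()
query.awaitTermination()
```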

[0029] Step 4. SQL scripts are written to format the raw data in HDFS, and the result is finally written into the offline data warehouse Hive using Snappy compression;
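
The step can be expressed in PySpark with the formatting written as SQL; the sketch below shows one possible way to do it, and the HDFS path, database, and table names are assumptions.

```python
# Sketch of the offline path: read the raw data Flume landed in HDFS, format it
# with a SQL script, and write it into a Hive table as Snappy-compressed Parquet.
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("offline-format")
         .enableHiveSupport()        # so saveAsTable targets the Hive metastore
         .getOrCreate())

raw = spark.read.json("hdfs://namenode:8020/raw/events/")
raw.createOrReplaceTempView("raw_events")

formatted = spark.sql("""
    SELECT user_id,
           action,
           to_timestamp(ts) AS event_time
    FROM raw_events
    WHERE user_id IS NOT NULL
""")

(formatted.write
 .mode("overwrite")
 .format("parquet")
 .option("compression", "snappy")    # Snappy compression, as described in the step
 .saveAsTable("dw.events"))          # "dw.events" is a hypothetical warehouse table
```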

[0030] Step 5. The formatted data is analyzed and processed, and the final result is written into HBase to be called by the foreground service system.
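
As a final sketch, Step 5 could aggregate the warehouse table and push the result rows into HBase for the foreground service to read. The Thrift host, the "results" table, and the "cf" column family are assumptions, and happybase is just one possible HBase client.

```python
# Sketch of Step 5: analyze the formatted data and write the final result to HBase.
import happybase
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("result-export")
         .enableHiveSupport()
         .getOrCreate())

# Example analysis: count events per user from the offline warehouse table.
result = spark.sql("SELECT user_id, COUNT(*) AS cnt FROM dw.events GROUP BY user_id")

# happybase talks to HBase through its Thrift gateway.
connection = happybase.Connection("hbase-thrift-host")
table = connection.table("results")

for row in result.collect():           # assumes a small result set; collect to driver
    table.put(
        str(row["user_id"]).encode("utf-8"),
        {b"cf:event_count": str(row["cnt"]).encode("utf-8")},
    )
connection.close()
```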



Abstract

The invention provides a design of a background automatic computing engine for a data management system, and relates to the field of MDM background computing engines. The design adopts a combination of Spark and the Hadoop distributed architecture to process large-scale data computation, and comprises the following implementation steps: a data producer writes data into Kafka; the offline architecture uses Flume to collect the data from Kafka into HDFS; the real-time architecture uses Spark Streaming to consume the data from Kafka, formats the raw data, and writes it into HBase; SQL scripts format the raw data in HDFS, which is then written into the offline data warehouse Hive using Snappy compression; and the formatted data is analyzed and processed, after which the final result is written into HBase to be called by the foreground service system. Compared with previous MDM systems, even when the data volume is large, the problem of data not being storable can be solved simply by adding servers; at the same time, multi-path data writing can be realized, so the user experience is better.

Description

Technical Field

[0001] The invention relates to the field of MDM background computing engines, in particular to the design of a background automatic computing engine for a data management system.

Background Technique

[0002] MDM (Mobile Device Management) provides complete life-cycle management of mobile devices, from device registration and activation through use and retirement. Mobile device management (MDM) can realize functions such as user and device management, configuration management, security management, and asset management. MDM can also provide comprehensive security protection, managing and protecting mobile devices, mobile apps, and mobile documents.

[0003] In the existing technology, the MDM background cannot store a large amount of data and cannot support large-scale data computation, resulting in a poor user experience; in addition, the data writing path is single, which imposes relatively large limitations.


Application Information

Patent Type & Authority: Application (China)
IPC (IPC8): G06F16/172; G06F16/182; G06F16/25; G06F16/27; G06F16/28; G06F16/242; G06F9/48
CPC: G06F16/172; G06F16/182; G06F16/258; G06F16/27; G06F16/283; G06F16/2433; G06F9/4843
Inventor 刘春华向程周华陈华殷王晓栋
Owner 深圳弘星智联科技有限公司