
Multi-level cache processing method of drive program in embedded type operation system

An operating-system and driver technology, applied to memory systems, multiprogramming devices, program start-up/switching, etc. It addresses problems such as reduced operating efficiency and data loss, with the effects of saving hardware resource investment, improving operating efficiency and improving stability.

Active Publication Date: 2013-04-10
珠海拓普智能电气股份有限公司

AI Technical Summary

Problems solved by technology

[0002] In existing real-time data acquisition equipment constrained by limited hardware resources such as the CPU, if the driver in the embedded operating system does not perform multi-level cache processing, handling the embedded operating system's many concurrent tasks while completing real-time data acquisition may cause the CPU to enter the high-priority data-acquisition interrupt routine so frequently that lower-priority acquisition routines are interrupted and samples are missed, and acquired data may be overwritten by new data before it can be processed. These drawbacks greatly reduce operating efficiency.




Embodiment Construction

[0011] Referring to figure 1, the multi-level cache processing method for a driver in an embedded operating system according to the present invention improves the operating efficiency of the embedded system by using software to set up storage units that make full use of the existing hardware peripheral resources. The method proceeds as follows. First, the peripheral hardware registers of the CPU are used as the first-level hardware cache; for example, an A/D sampling CPU has a cache space of 16 words. Secondly, the DMA storage space of the CPU is used as the second-level hardware cache, whose storage space is typically on the order of kilobytes; when the first-level hardware cache is full, a DMA interrupt is generated to read the data in the first-level hardware cache into the second-level hardware cache. Then, part of the RAM space is used as the first-level software cache; because the storage space of the DMA is small, a CPU interrupt is generated when the second-level hardware cache is full, and its data is written into the first-level software cache to await processing. Finally, another part of the RAM space is used as the second-level software cache, and a corresponding data-processing task is started to remove invalid data from the first-level software cache and store the valid data in the second-level software cache.
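The steps above can be pictured as one interrupt handler feeding a ring buffer in RAM and one task draining it. The C sketch below is only an illustration of that structure, not code from the patent: the buffer sizes, the validity test and the rearm_dma() call are assumptions standing in for the real peripheral and HAL, and the 16-word A/D FIFO (first-level hardware cache) is assumed to be drained by the DMA controller without CPU involvement.

```c
/*
 * Illustrative sketch of the buffering chain described above (not from the
 * patent). Sizes, names and the validity test are assumptions.
 */
#include <stdint.h>
#include <stdbool.h>

#define DMA_BLOCK_WORDS  1024u                    /* second-level hardware cache (DMA RAM) */
#define SW_L1_WORDS      (8u * DMA_BLOCK_WORDS)   /* first-level software cache in RAM     */
#define SW_L2_WORDS      SW_L1_WORDS              /* second-level software cache in RAM    */

static uint16_t dma_block[DMA_BLOCK_WORDS];       /* filled by the DMA controller          */

static uint16_t sw_l1[SW_L1_WORDS];               /* raw samples waiting to be processed   */
static volatile uint32_t sw_l1_head, sw_l1_tail;  /* ring-buffer indices                   */

static uint16_t sw_l2[SW_L2_WORDS];               /* validated samples for the application */
static uint32_t sw_l2_count;

/* CPU interrupt raised when the DMA block (second-level hardware cache) is
 * full: copy it into the first-level software cache, then rearm the DMA. */
void dma_block_full_isr(void)
{
    for (uint32_t i = 0; i < DMA_BLOCK_WORDS; ++i) {
        if (sw_l1_head - sw_l1_tail == SW_L1_WORDS)
            sw_l1_tail++;                         /* buffer full: drop the oldest sample   */
        sw_l1[sw_l1_head % SW_L1_WORDS] = dma_block[i];
        sw_l1_head++;
    }
    /* rearm_dma(dma_block, DMA_BLOCK_WORDS);  hypothetical HAL call */
}

/* Example validity test (assumption): discard saturated readings. */
static bool sample_is_valid(uint16_t s) { return s != 0xFFFFu; }

/* Low-priority data-processing task: drain the first-level software cache,
 * drop invalid data and keep valid data in the second-level software cache. */
void data_processing_task(void)
{
    while (sw_l1_tail != sw_l1_head) {
        uint16_t s = sw_l1[sw_l1_tail % SW_L1_WORDS];
        sw_l1_tail++;
        if (sample_is_valid(s) && sw_l2_count < SW_L2_WORDS)
            sw_l2[sw_l2_count++] = s;
    }
}
```

Under this reading, the interrupt handler only performs a fast block copy, while the slower filtering work runs in an ordinary task that can be preempted without losing samples.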



Abstract

The invention discloses a multi-level cache processing method for a driver program in an embedded operating system. The method comprises the steps of: using a peripheral register of a CPU (Central Processing Unit) as a first-level hardware cache; using a DMA (Direct Memory Access) storage space of the CPU as a second-level hardware cache; when the first-level hardware cache is full, generating a DMA interrupt so that data in the first-level hardware cache is written into the second-level hardware cache; using part of the RAM (Random Access Memory) space as a first-level software cache and, when the second-level hardware cache is full, generating a CPU interrupt so that data in the second-level hardware cache is written into the first-level software cache to await processing; and using another part of the RAM space as a second-level software cache and starting a corresponding data processing task so that invalid data in the first-level software cache is removed and valid data is stored in the second-level software cache. With this multi-level cache processing method for the driver program in the embedded operating system, the operating efficiency of the embedded system can be effectively increased.
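As a rough illustration of why the layered caches raise efficiency, the CPU's interrupt load drops roughly in proportion to the ratio of the DMA block size to the 16-word FIFO. The sampling rate and block size below are assumed values chosen only to make the arithmetic concrete, not figures from the patent.

```c
/* Back-of-the-envelope illustration of the interrupt-rate reduction. */
#include <stdio.h>

int main(void)
{
    const double sample_rate_hz = 100000.0; /* assumed A/D sampling rate               */
    const double fifo_words     = 16.0;     /* first-level hardware cache (FIFO), per patent */
    const double dma_words      = 1024.0;   /* assumed second-level hardware cache size */

    /* Without DMA, the CPU would be interrupted every time the FIFO fills. */
    double irq_per_s_no_dma   = sample_rate_hz / fifo_words;

    /* With DMA draining the FIFO, the CPU is only interrupted per DMA block. */
    double irq_per_s_with_dma = sample_rate_hz / dma_words;

    printf("CPU interrupts/s without DMA: %.0f\n", irq_per_s_no_dma);
    printf("CPU interrupts/s with DMA:    %.2f\n", irq_per_s_with_dma);
    printf("Reduction factor:             %.0fx\n",
           irq_per_s_no_dma / irq_per_s_with_dma);
    return 0;
}
```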

Description

Technical field
[0001] The invention relates to an embedded operating system, and in particular to a multi-level cache processing method for a driver in the embedded operating system.

Background technique
[0002] In existing real-time data acquisition equipment constrained by limited hardware resources such as the CPU, if the driver in the embedded operating system does not perform multi-level cache processing, handling the embedded operating system's many concurrent tasks while completing real-time data acquisition may cause the CPU to enter the high-priority data-acquisition interrupt routine so frequently that lower-priority acquisition routines are interrupted and samples are missed, and acquired data may be overwritten by new data before it can be processed. These drawbacks greatly reduce operating efficiency.

Contents of the invention
[0003] The purpose of the present invention is to provide a multi-level cache processing method for a driver in an embedded operating system.


Application Information

IPC(8): G06F9/48, G06F12/08, G06F12/0802
Inventors: 秦宇, 李安兵, 李正恒
Owner: 珠海拓普智能电气股份有限公司