33 results about "Asynchronous I/O" patented technology

In computer science, asynchronous I/O (also non-sequential I/O) is a form of input/output processing that permits other processing to continue before the transmission has finished. Input and output (I/O) operations on a computer can be extremely slow compared to the processing of data. An I/O device can incorporate mechanical devices that must physically move, such as a hard drive seeking a track to read or write; this is often orders of magnitude slower than the switching of electric current. For example, during a disk operation that takes ten milliseconds to perform, a processor that is clocked at one gigahertz could have performed ten million instruction-processing cycles.
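The idea that other processing continues while a slow I/O operation is pending can be sketched with Python's `asyncio` (the function names here are illustrative, and `asyncio.sleep` stands in for a slow device operation):

```python
import asyncio

async def slow_io():
    # Stand-in for a slow device operation, e.g. a 10 ms disk seek.
    await asyncio.sleep(0.01)
    return "data"

async def other_work():
    # Processing that continues while the I/O is still pending.
    return sum(range(1000))

async def main():
    # Start the I/O, do other work, then collect the I/O result.
    io_task = asyncio.create_task(slow_io())
    result = await other_work()
    data = await io_task
    return result, data

result, data = asyncio.run(main())
print(result, data)
```

With synchronous I/O, `other_work` could not begin until the 10 ms operation finished; here it runs to completion while the "device" is still busy.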

Kernel Bus System to Build Virtual Machine Monitor and the Performance Service Framework and Method Therefor

Some embodiments concern a kernel bus system for building at least one virtual machine monitor. The kernel bus system is based on a kernel-based virtual machine (KVM) and is configured to run on a host computer comprising one or more processors, one or more hardware devices, and memory. The kernel bus system can include: (a) a hyperbus; (b) one or more user space components; (c) one or more guest space components configured to interact with the one or more user space components via the hyperbus; (d) one or more VMM components having one or more frontend devices configured to perform I/O operations with the one or more hardware devices of the host computer using a zero-copy method or non-pass-thru method; (e) one or more para-virtualization components having (1) a virtual interrupt configured to use one or more processor instructions to swap the one or more processors of the host computer between a kernel space and a guest space, and (2) a virtual I/O driver configured to enable synchronous I/O signaling, asynchronous I/O signaling and payload delivery, and pass-through delivery independent of QEMU emulation; and (f) one or more KVM components. The hyperbus, user space components, guest space components, VMM components, para-virtualization components, and KVM components are configured to run on the one or more processors of the host computer. Other embodiments are disclosed.
Owner:TRANSOFT
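The zero-copy I/O method the abstract mentions is a kernel-level technique; a rough user-space analogue (an assumption for illustration, not the patent's implementation) is `os.sendfile`, which asks the kernel to move bytes from a file descriptor to a socket without copying them through a user-space buffer:

```python
import os
import socket
import tempfile

# Illustrative analogue of zero-copy I/O: the kernel transfers the bytes
# from the file to the socket directly; user space never holds the payload.
with tempfile.NamedTemporaryFile() as src:
    src.write(b"payload bytes to move")
    src.flush()

    a, b = socket.socketpair()
    sent = os.sendfile(a.fileno(), src.fileno(), 0, 21)  # offset 0, 21 bytes
    data = b.recv(64)
    a.close()
    b.close()

print(sent, data)
```

A conventional read/write loop would copy the payload into a user buffer and back out again; `sendfile` avoids both copies, which is the same motivation as the patent's zero-copy frontend path.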

Intelligence education E-card system platform based on internet of things and cloud computation

The invention discloses an intelligent education E-card system platform based on the Internet of Things and cloud computing. The platform comprises an IaaS (Infrastructure as a Service) unit, a PaaS (Platform as a Service) unit, an SaaS (Software as a Service) unit, a data collector, and a sensing terminal. The IaaS unit is responsible for transferring and processing the information obtained by a sensing layer, providing infrastructure as a service; the PaaS unit combines RFID (Radio Frequency Identification) and data communication technology into a comprehensive service platform, with cloud computing as its fundamental platform; the SaaS unit supports multiple front-end browsers and connects to the infrastructure through connectivity access points, adopting industry-leading technical standards and specifications and an SOA (Service-Oriented Architecture); the data collector supports multiple communication protocols and communication modes, adopts the IOCP (I/O Completion Port) I/O model, and uses a thread pool to process asynchronous I/O requests; and the sensing terminal comprises a POS (Point-of-Sale) machine, a building machine, a recognizing machine, multimedia, a channel machine, a vehicle-mounted machine, a water controller, and other special-purpose terminal equipment.
Owner:王向东
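The IOCP-style thread-pool model described above can be sketched in a platform-neutral way (a minimal illustration, not the patent's implementation): completed I/O events are posted to a shared queue, and a pool of worker threads dequeues and handles them, so no thread is tied to a single connection.

```python
import queue
import threading

completion_port = queue.Queue()   # plays the role of the I/O completion port
results = []
results_lock = threading.Lock()

def worker():
    while True:
        event = completion_port.get()
        if event is None:          # shutdown sentinel
            break
        with results_lock:
            results.append(f"handled {event}")

pool = [threading.Thread(target=worker) for _ in range(4)]
for t in pool:
    t.start()

for request_id in range(8):        # simulate 8 completed async I/O requests
    completion_port.put(request_id)
for _ in pool:                     # one sentinel per worker
    completion_port.put(None)
for t in pool:
    t.join()

print(sorted(results))
```

The key property, as with a real completion port, is that a fixed number of workers services an unbounded stream of I/O completions.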

Large concurrent encrypted communication algorithm for secure authentication gateway

Inactive · CN109639619A · Increase the number of concurrent connections · Coping with shock · Transmission · Secure transmission · Network communication
The invention discloses a large concurrent encrypted communication algorithm for a secure authentication gateway, related to a data security transmission method. A traditional communication encryption library adopts a synchronous processing method that tightly couples communication and encryption, so concurrency can only be increased by adding threads or processes; limited by the number of threads and processes, neither the network communication bandwidth nor the processing power of the CPU can be fully utilized. In the disclosed algorithm, network communication and SSL processing are implemented independently and can be optimized independently; the asynchronous I/O mechanism of the operating system is fully exploited to improve throughput; thread pool technology makes full use of the computing power of the CPU; and queue technology buffers peak data. The main advantage is SSL encrypted communication with high concurrency (greater than 50,000 connections), a high number of connections per second (greater than 500), high throughput (greater than 800 Mb/s), and tolerance of high peak loads.
Owner:北京安软天地科技有限公司
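The decoupling the abstract describes — asynchronous I/O for the network, a thread pool for CPU-bound crypto, and a queue to absorb bursts — can be sketched as follows (all names are illustrative, and a byte-XOR stands in for the real SSL processing):

```python
import asyncio
from concurrent.futures import ThreadPoolExecutor

def encrypt(chunk: bytes) -> bytes:
    # Stand-in for CPU-bound SSL processing.
    return bytes(b ^ 0x5A for b in chunk)

async def main():
    loop = asyncio.get_running_loop()
    pool = ThreadPoolExecutor(max_workers=4)   # crypto thread pool
    buffer = asyncio.Queue(maxsize=100)        # absorbs peak traffic

    async def receiver():                      # simulated async network reads
        for i in range(5):
            await buffer.put(f"packet {i}".encode())
        await buffer.put(None)                 # end-of-stream marker

    async def crypto_worker(out):
        # Drains the queue and offloads encryption to the thread pool,
        # so the event loop is never blocked by CPU-bound work.
        while (chunk := await buffer.get()) is not None:
            out.append(await loop.run_in_executor(pool, encrypt, chunk))

    out = []
    await asyncio.gather(receiver(), crypto_worker(out))
    pool.shutdown()
    return out

ciphertexts = asyncio.run(main())
print(len(ciphertexts))
```

Because the receiver and the crypto workers share only the bounded queue, each side can be tuned (queue depth, pool size) without touching the other — the independent optimization the abstract claims.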

Asynchronous input and output signal processing method

Inactive · CN101158912A · Solve the problem of asynchronous input and output · Versatile · Multiprogramming arrangements · Asynchronous I/O · Computer programming
The invention discloses an asynchronous input and output signal processing method, which provides an event-based signal processing mechanism; the driver interface is specifically designed to match this signal processing. The method comprises the following steps: step 1, a subscriber thread, as a user of the driver, requests input and output signal processing through the driver interface; step 2, the driver creates an event instance and exports it to the subscriber thread; step 3, the subscriber thread obtains input and output processing signals through the event instance and completes the relevant operations according to the signal definitions given by the driver; the driver completes the device input and output operation and notifies the subscriber thread of processing signals through the event instance. Some high-level synchronization mechanisms are simplified for use by driver designers; a subscriber thread can simultaneously conduct asynchronous I/O operations with multiple devices; and the driver can define the signal semantics of its asynchronous I/O itself, thereby providing richer functionality.
Owner:KORTIDE LTD
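The driver/subscriber event mechanism described above can be sketched with a simple event object (all names here are hypothetical, not the patent's API): the driver exports an event instance, the subscriber starts an I/O operation and keeps working, and the driver later signals completion through the event.

```python
import threading

class DriverEvent:
    # Plays the role of the event instance the driver exports (step 2).
    def __init__(self):
        self._done = threading.Event()
        self.result = None

    def signal(self, result):       # driver side: device I/O finished
        self.result = result
        self._done.set()

    def wait(self, timeout=None):   # subscriber side: collect the result
        self._done.wait(timeout)
        return self.result

def driver_start_io(event):
    # Simulated driver: completes the device read on another thread.
    def do_io():
        event.signal(b"device data")
    threading.Thread(target=do_io).start()

event = DriverEvent()
driver_start_io(event)              # step 1: subscriber requests async I/O
other_work = sum(range(100))        # subscriber keeps running meanwhile
data = event.wait(timeout=5)        # step 3: subscriber consumes the signal
print(other_work, data)
```

A subscriber could hold several such event instances at once, one per device, which matches the abstract's claim that one thread can drive asynchronous I/O against multiple devices simultaneously.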