
Deep learning based caching system and method for self-driving car in multi-access edge computing

A multi-access edge computing and learning-based caching technology, applied in computing models, selective content distribution, instruments, etc. It addresses problems such as the unsatisfactory provision of content and achieves the effect of reducing content download delays.

Active Publication Date: 2020-05-07
UNIV IND COOP GRP OF KYUNG HEE UNIV
Cites: 0 · Cited by: 3

AI Technical Summary

Benefits of technology

This approach minimizes content download delays and provides real-time, low-latency content matching passenger demands, especially during high-traffic periods.

Problems solved by technology

However, providing infotainment contents from existing data centers may entail prolonged communication delays between the car and the data center and may degrade the provision of contents.




Embodiment Construction

[0014]Descriptions of specific structures or functions relating to the embodiments set forth in this specification are provided merely as examples for explaining the concept of the invention. The embodiments can be practiced in a variety of implementations and are not limited to the embodiments described herein.

[0015]FIG. 1 illustrates a caching system according to an embodiment of the present invention.

[0016]Referring to FIG. 1, a caching system 1 according to an embodiment of the present invention may include a data center 100, an MEC server 200, and an object 300.

[0017]A caching system 1 according to an embodiment of the present invention can be applied to an MEC (multi-access edge computing) environment. MEC refers to a network environment in which computing is performed at each network edge instead of at a centralized cloud, so as to reduce cloud load and shorten data processing times.
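The patent does not disclose code, but the caching decision described in the abstract (an MEC server caching the contents with the highest "first prediction value", which may include a request probability and a predicted rating) can be illustrated with a minimal sketch. Combining the two components by multiplication and the fixed `capacity` cutoff are assumptions for illustration only; the patent does not specify how the value is computed.

```python
from dataclasses import dataclass

@dataclass
class Content:
    name: str
    request_prob: float   # probability the content is requested within the server's allotted area
    pred_rating: float    # prediction rating of the content

def select_caching_contents(catalog, capacity):
    """Rank contents by an (assumed) combined first prediction value and keep
    as many as the MEC server's cache capacity allows."""
    ranked = sorted(catalog, key=lambda c: c.request_prob * c.pred_rating,
                    reverse=True)
    return ranked[:capacity]

catalog = [
    Content("news", 0.9, 3.5),    # score 3.15
    Content("movie", 0.6, 4.8),   # score 2.88
    Content("music", 0.2, 4.0),   # score 0.80
]
cached = select_caching_contents(catalog, capacity=2)
print([c.name for c in cached])  # → ['news', 'movie']
```

In an actual deployment the selected contents would then be downloaded from the content provider and cached at the edge, as the abstract describes.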



Abstract

A caching system according to the invention can include an object requiring a content and an MEC server configured to determine caching contents based on a first prediction value, which may include the probability of the content being requested by the object within an allotted area and a prediction rating of the content, and to download and cache the determined caching contents from a content provider. The object can include a recommendation module configured to recommend a content from among the caching contents by applying a k-means algorithm and binary classification to the first prediction value and a second prediction value, which may include a prediction value associated with a characteristic of a user. The object can further include a deep learning based caching module configured to search available MEC servers on a movement path of the object, select an optimal MEC server, and download and cache the recommended content from the optimal MEC server.
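The recommendation step above (k-means plus binary classification over the two prediction values) can be sketched as follows. This is an illustrative reconstruction, not the patented method: the two-cluster 1-D k-means, the multiplicative combination of the prediction values, and the rule "recommend the cluster with the higher centroid" are all assumptions, since the patent text here does not specify these details.

```python
def kmeans_1d(values, iters=20):
    """Plain two-centroid 1-D k-means, centroids initialized at min and max."""
    centroids = [min(values), max(values)]
    for _ in range(iters):
        clusters = ([], [])
        for v in values:
            idx = 0 if abs(v - centroids[0]) <= abs(v - centroids[1]) else 1
            clusters[idx].append(v)
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids

def recommend(contents, first_pred, second_pred):
    """Binary classification: keep each content whose combined score lies
    in the cluster with the higher centroid (assumed 'recommend' class)."""
    scores = [f * s for f, s in zip(first_pred, second_pred)]
    centroids = kmeans_1d(scores)
    hi, lo = max(centroids), min(centroids)
    return [c for c, sc in zip(contents, scores)
            if abs(sc - hi) <= abs(sc - lo)]

contents = ["news", "movie", "music", "podcast"]
first = [0.9, 0.8, 0.2, 0.1]   # first prediction value (area demand), normalized
second = [0.7, 0.9, 0.3, 0.2]  # second prediction value (user characteristic)
print(recommend(contents, first, second))  # → ['news', 'movie']
```

The recommended contents would then be fetched from the optimal MEC server on the object's movement path, per the abstract.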

Description

CROSS-REFERENCE TO RELATED APPLICATIONS[0001]This application claims the benefit under 35 USC § 119(a) of Korean Patent Application No. 10-2018-0133873 filed on Nov. 2, 2018 in the Korean Intellectual Property Office, the entire disclosure of which is incorporated herein by reference for all purposes.TECHNICAL FIELD[0002]The present invention relates to a deep learning based caching system and method, more particularly to a deep learning based caching system and method for autonomous driving in a multi-access edge computing environment.BACKGROUND ART[0003]The self-driving car was introduced to save lives by preventing accidents resulting from human error and improper behavior and can relieve the user of the stress of controlling a car and driving while keeping watch of the surrounding environment. It is anticipated that the self-driving car will gradually be introduced to public transportation. When the time comes at which a driver operating a car no longer needs to worry about the ...

Claims


Application Information

Patent Type & Authority: Application (United States)
IPC(8): H04N21/2183; H04N21/25; H04N21/258; G06F15/18; G06N5/04
CPC: H04N21/25841; H04N21/2183; G06N20/00; G06N5/046; H04N21/251; H04L67/306; G06Q30/0269; H04L67/5681; H04L67/5682; H04N21/4331; H04N21/41422; H04N21/4668; H04N21/4666; G06N3/08; G06Q50/10
Inventors: HONG, CHOONG SEON; NDIKUMANA, ANSELME
Owner: UNIV IND COOP GRP OF KYUNG HEE UNIV