
A named data network adaptive caching strategy for vehicle networking

A technology relating to Named Data Networking and caching strategies, applied in electrical components, transmission systems, etc. It addresses problems such as low utilization of cache resources, reduced cache diversity, and low cache efficiency, and achieves the effects of avoiding broadcast storms and releasing cache space.

Active Publication Date: 2019-01-22
CHANGAN UNIV

AI Technical Summary

Problems solved by technology

Existing NDN caching strategies produce a large number of redundant copies at vehicle nodes, which lowers the utilization of caching resources in the network and reduces cache diversity.




Embodiment Construction

[0026] The present invention is described in further detail below in conjunction with the accompanying drawings:

[0027] As shown in Figure 1, the present invention is a named data network adaptive caching strategy oriented to the Internet of Vehicles.

[0028] Let the source request node be Vi and the data source node be Vj1. The set of vehicle nodes on the Data return path is Vj = {Vj1, Vj2, Vj3, ..., Vjn}, where Vj2 is the next node after the data source node Vj1 on the return path, Vj3 through Vjn are the subsequent nodes in order, and the last node of the return path, Vjn, is the source request node Vi itself.
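The node notation above can be modeled with a small sketch. The names Vi, Vj1 ... Vjn come from the text; the list representation and the `return_path` helper are illustrative assumptions, not the patent's implementation:

```python
# Hypothetical model of the Data return path described above.
# The source request node V_i issues an Interest; the data source
# node V_j1 returns a Data packet along V_j = {V_j1, ..., V_jn},
# where the last hop V_jn is the requester V_i itself.

def return_path(source_request_node, intermediate_nodes, data_source_node):
    """Build the ordered return path V_j, from V_j1 back to V_i."""
    # V_j1 first, then V_j2 .. V_j(n-1), ending at V_jn == V_i.
    return [data_source_node] + intermediate_nodes + [source_request_node]

path = return_path("V_i", ["V_j2", "V_j3"], "V_j1")
# path[0] is V_j1 (the data source) and path[-1] is V_i (the requester)
```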

[0029] Specifically include the following steps:

[0030] Step 1): Add fields to the Interest packet and the Data packet.

[0031] Add a hop count (Hop) field to the Interest packet format of the Named Data Network to record the number of nodes the Interest packet has passed through; the Hop field is incremented by 1 at each node. Also add a field for the number of occurrences of the request pr...
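Step 1) can be sketched as follows. The Hop field and its per-node increment follow the text above; the dict-based packet layout and the `make_interest`/`forward_interest` helpers are illustrative assumptions, not the patent's actual packet format:

```python
# Minimal sketch of Step 1): extend the NDN Interest packet with a
# Hop field that counts the nodes the Interest has passed through.
# Each forwarding node increments Hop by 1, per the description.

def make_interest(name_prefix):
    # Hop starts at 0 at the source request node V_i.
    return {"name": name_prefix, "Hop": 0}

def forward_interest(interest):
    """Called at each vehicle node the Interest passes through."""
    interest["Hop"] += 1  # Hop field +1 per node traversed
    return interest

interest = make_interest("/traffic/roadA")
for _node in ["V_j3", "V_j2", "V_j1"]:  # three hops to the data source
    forward_interest(interest)
# interest["Hop"] is now 3
```

The Hop value carried in the Interest lets the returning Data path know how far the content traveled, which the strategy later uses when deciding at which hops to cache.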



Abstract

The invention discloses a named data network adaptive caching strategy for vehicle networking. By extracting the number of request-prefix occurrences at Vj1, Vj2, and Vj3 on the Data packet return path in the previous time window, the content popularity is calculated as the average of the request-prefix occurrence counts, and from it the interval hop count for caching Data packets at nodes is derived. The invention combines cached content with the dynamic characteristics of traffic flow and proposes a cache decision based on content popularity and vehicle density, caching Data packets by hop interval. The interval hop count of a node's Data packet cache is calculated from the content popularity in the named data network; the hop count is then adaptively adjusted according to vehicle density, and Data packets are cached by hop interval to effectively release cache space. This solves the problems of homogeneous caching and low cache efficiency, optimizes the overall performance of the Internet of Vehicles, reduces congestion, network overhead, and cache redundancy in the network, avoids broadcast storms, and improves the utilization of network resources and overall network performance.
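The caching decision in the abstract can be sketched roughly as below. Only the structure comes from the text: popularity is taken as the average request-prefix occurrence count over the previous time window, and the cache interval (in hops) is assumed to shrink as popularity and vehicle density rise, so popular content in dense traffic is cached at more nodes. The specific formulas are illustrative assumptions; the excerpt does not give them:

```python
# Illustrative sketch of the cache decision: popularity from the
# average request-prefix occurrence count, an interval hop count
# derived from popularity and vehicle density, then caching a Data
# packet only every k-th hop on the return path. The concrete
# formulas are assumptions, not the patent's.

def content_popularity(prefix_counts):
    """Average request-prefix occurrences over V_j1, V_j2, V_j3 ...
    in the previous time window."""
    return sum(prefix_counts) / len(prefix_counts)

def cache_interval_hops(popularity, vehicle_density, max_interval=8):
    """Assumed rule: higher popularity/density -> smaller interval,
    so hot content in dense traffic is cached at more nodes."""
    interval = max_interval / (1 + popularity * vehicle_density)
    return max(1, round(interval))

def should_cache(hop_index, interval):
    """Cache the Data packet only every `interval`-th hop on the
    return path, releasing cache space at the skipped nodes."""
    return hop_index % interval == 0

k = cache_interval_hops(content_popularity([4, 6, 5]), vehicle_density=0.5)
cached_hops = [h for h in range(1, 11) if should_cache(h, k)]
# -> caches at hops 2, 4, 6, 8, 10 on a 10-hop return path
```

Caching only every k-th node, rather than at every node on the path, is what reduces the redundant copies and homogeneous caching that the background section identifies as the weakness of existing NDN strategies.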

Description

Technical Field

[0001] The invention relates to the field of named data networking for the Internet of Vehicles, and in particular to an adaptive caching strategy for the named data network oriented to the Internet of Vehicles.

Background Technique

[0002] At present, named data networking (NDN) is widely used in the field of the Internet of Vehicles. In-network caching is an important feature of NDN. With the rapid growth of content volume, how to use the limited cache space to make reasonable caching choices for each user has become a key issue for NDN caching strategies. Existing NDN caching strategies produce a large number of redundant copies at vehicle nodes, which lowers the utilization of caching resources in the network and reduces cache diversity.

Contents of the Invention

[0003] The purpose of the present invention is to provide a named data network adaptive caching strategy oriented to the Internet of Vehicles, so as to overcome the defici...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): H04L29/08
CPC: H04L67/12; H04L67/5682; H04L67/63
Inventor: 段宗涛, 樊娜, 张天洋, 朱依水, 张俊哲
Owner CHANGAN UNIV