Content delivery (or distribution) networks (CDNs) are generally perceived as being of concern only for wide area networks (WANs). The idea is that relatively static content can be distributed and held as close to the user as possible, reducing the load that would fall on a single server if videos, music, and other files were all streamed from one place. The CDN stores copies of the files in multiple locations around the world, leading to a more balanced network load and lower latency for those needing to access the files.
However, there is nothing to stop a CDN being implemented in a more local environment, and indeed, this may be a good idea.
Although local area network (LAN) bandwidths are growing at a reasonably fast rate, the traffic traversing these networks is growing faster still. High-definition video conferencing, high-definition voice over IP (VoIP), increasing use of images in documents, and other large single-file activities have moved traffic volumes up by orders of magnitude in some organizations. The future will bring more data as we move toward an Internet of Things, with machine-to-machine (M2M) and other data migrating from proprietary networks to the standardized realm of the corporate IP network. Add to this the need to deal with the world of Big Data, drawing on sources not only within the organization but also along its extended value chain and directly from the web, and it is apparent that LAN bandwidth may remain insufficient to support the business's actual needs.
Sure, priority and quality of service based on packet labeling can help, but getting them right requires considerable forethought and continuous monitoring. Much of the M2M and other Internet of Things data will consist of small packets pushed through the network at frequent intervals, and this sort of chatter can cause problems. Some of this data will need to be near real time; some can be more asynchronous. The key is to ensure that it does not impact the traffic that has to be as close to real time as possible, such as the video conferencing and voice traffic.
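As an illustration of the packet-labeling side of this, the sketch below marks a UDP socket's outbound traffic with the DSCP Expedited Forwarding code point, the marking commonly used for voice and video so that LAN switches and routers can prioritize it. This is a minimal Python example of the general technique, not taken from any particular vendor's system:

```python
import socket

# DSCP "Expedited Forwarding" (EF, code point 46) is conventionally used
# for latency-sensitive traffic such as VoIP and video conferencing.
DSCP_EF = 46
TOS_VALUE = DSCP_EF << 2  # the DSCP sits in the top six bits of the ToS byte

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

# Label every packet this socket sends; the network must still be
# configured to honour the marking for it to have any effect.
sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, TOS_VALUE)

# Confirm the option took effect before sending media traffic.
assert sock.getsockopt(socket.IPPROTO_IP, socket.IP_TOS) == TOS_VALUE
sock.close()
```

Marking the packets is the easy part; the forethought mentioned above goes into deciding which classes of traffic get which code points, and into policing those decisions at every hop.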
A large proportion of an organization's data is relatively unchanging. Reports, contracts, standard operating procedures, handbooks, user guides, and other information in constant use by employees change only occasionally and are ideal candidates for a CDN. These files can be pushed out once to a data store close to the user, so that access does not trigger a long chain of data requests across the LAN and WAN to the detriment of other, more dynamic data assets. Only when the original file changes will a user's request trigger a fresh download from the central repository, at which point the new copy is placed in the local store again, so that the next user requesting the file gets that copy.
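The pattern described above can be sketched in a few lines. This is a hypothetical, minimal model, using a simple version number as the "has the original changed?" check, where a real system would use something like ETags or modification timestamps, and the origin dictionary stands in for the central repository:

```python
class LocalContentStore:
    """Sketch of the caching pattern described above: serve files from a
    local store, re-fetching from the central repository only when the
    original has changed."""

    def __init__(self, origin):
        self.origin = origin   # central repository: name -> (version, bytes)
        self.cache = {}        # local store: name -> (version, bytes)
        self.downloads = 0     # full fetches that had to cross the WAN

    def get(self, name):
        version, _ = self.origin[name]        # cheap metadata check only
        cached = self.cache.get(name)
        if cached and cached[0] == version:
            return cached[1]                  # unchanged: serve the local copy
        _, data = self.origin[name]           # changed or new: pull from origin
        self.downloads += 1
        self.cache[name] = (version, data)
        return data


# Illustrative usage with a hypothetical file:
origin = {"handbook.pdf": (1, b"v1 contents")}
store = LocalContentStore(origin)
store.get("handbook.pdf")                 # first request crosses the WAN
store.get("handbook.pdf")                 # repeat requests stay local
origin["handbook.pdf"] = (2, b"v2 contents")
store.get("handbook.pdf")                 # original changed: re-fetched once
```

After those four calls only two downloads have crossed the WAN; every other request was satisfied from the local store, which is the whole point of bringing the CDN idea onto the LAN.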
Many of the WAN acceleration vendors already have the equivalent of a CDN built in to their systems through the use of intelligent caching. In many cases, there is little to do to set up the capability — the systems are self-learning and understand what changes and what doesn’t. In some cases, there may be a need for some simple rules to be created, but this should be a once-only activity.
Extending the use of WAN acceleration into the LAN could bring solid benefits to organizations looking to move towards a more inclusive network, and now may be the best time to investigate such an approach, before traffic growth outpaces the network.
Image credit: Wikimedia Commons