What’s the Future of Application Delivery?

Do you think of application delivery as being staid?  Perhaps you think that application delivery was a dynamic topic several years ago, back when the industry first discovered that running a chatty protocol such as CIFS over a wide area network leads to really bad application performance and that techniques such as de-duplication can reduce some of the performance problems associated with bulk file transfers.  As 2013 draws to a close, one thing is crystal clear — if you think of application delivery as being staid, you need to think again.
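The core idea behind de-duplication is simple: split the data into chunks, fingerprint each chunk, and send a chunk across the WAN only the first time its fingerprint is seen.  A minimal sketch of that chunk-hashing idea (the function name and fixed-size chunking are illustrative; real products typically use variable-size, content-defined chunks):

```python
import hashlib

def dedupe_chunks(data, chunk_size=4096, seen=None):
    """Split data into fixed-size chunks and return only the chunks whose
    SHA-256 digest has not been seen before, plus the updated digest set.
    Chunks already in 'seen' would be referenced by digest instead of resent."""
    seen = set() if seen is None else seen
    new_chunks = []
    for i in range(0, len(data), chunk_size):
        chunk = data[i:i + chunk_size]
        digest = hashlib.sha256(chunk).hexdigest()
        if digest not in seen:
            seen.add(digest)
            new_chunks.append(chunk)
    return new_chunks, seen

# A repetitive payload: two identical 4 KB chunks, so only one is "sent",
# and retransferring the same payload sends nothing new at all.
payload = b"A" * 8192
first, cache = dedupe_chunks(payload)
second, cache = dedupe_chunks(payload, seen=cache)
```

The payoff for bulk file transfers comes from the `seen` cache persisting across transfers, so repeated or near-identical files cost almost no WAN bandwidth the second time around.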

One reason why application delivery has traditionally been such an important topic is that, as discussed in the 2013 Application and Service Delivery Handbook, when the performance of a business-critical application begins to degrade, the company typically loses revenues and often loses clients.  In addition, the number of business-critical applications that the typical company has to support has been increasing.  Whereas it used to be rare to find a company that had more than a handful of business-critical applications, today roughly thirty percent of companies have more than twenty applications that they consider to be business-critical.

It is not just the number of business-critical applications that is increasing; so is the volume of WAN traffic.  This increase is driven by a number of factors including the increasing mobility of employees, the expanding use of public cloud computing, and the increasing utilization of varying forms of video; e.g., videoconferencing, telepresence, and video surveillance.  Video traffic is particularly challenging, both because it tends to consume significant WAN bandwidth and because it is sensitive to delay and jitter.  Big data is just beginning to emerge as another factor that has the potential to make stringent demands on the WAN.  The phrase big data refers to working with a collection of data sets that is so large and complex that it is difficult, if not impossible, to process them using traditional techniques.  In an increasing percentage of instances, big data applications are real-time, which often means that large volumes of data must transit the WAN quickly enough not to impact application performance.

The traditional approach that IT organizations took to WAN optimization was to implement dedicated appliances.  While that is still a viable option, IT organizations increasingly implement WAN optimization functionality in software running on a general-purpose computer or on a virtual machine.  Implementing WAN optimization in software should be looked at as part of the larger movement to Software-Defined Everything (SDE), whether that takes the form of a Software-Defined Data Center (SDDC) or a Software-Defined Network (SDN).

In an SDDC all infrastructure is virtualized and delivered as a service, and the control of this data center is entirely automated by software.  While it’s true that few, if any, IT organizations have currently implemented an SDDC, it’s also true that the steps that the majority of IT organizations have already taken to implement virtualization and private cloud computing are key steps on the path to implementing an SDDC, and that most companies will travel further down that path in 2014.

While an SDN can be a component of an SDDC, it’s also possible to use an SDN in the WAN or in a campus or branch office.  One of the interesting use cases for an SDN is to enable applications to dynamically signal the network for the types of services they need; e.g., optimization or security.  Cisco’s recent Application Centric Infrastructure (ACI) announcement takes this core concept of application signaling and expands it so that the application signals the entire infrastructure for the services it wants.  Neither SDN nor ACI will see broad adoption in 2014.  However, both of these emerging approaches point to a near-term future in which applications dynamically interact with one or more components of the infrastructure to ensure acceptable levels of performance and security.
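To make the signaling idea concrete, here is a minimal sketch of the kind of message an application might hand to an SDN controller’s northbound interface to request a network service.  The message schema, field names, and the example application are purely illustrative assumptions, not any particular controller’s API:

```python
import json

def build_service_request(app_id, service, params):
    """Build a JSON message an application could send northbound to an
    SDN controller to request a network service such as optimization or
    security.  The schema here is hypothetical, for illustration only."""
    request = {
        "application": app_id,          # who is asking
        "requested_service": service,   # what the app wants from the network
        "parameters": params,           # service-specific constraints
    }
    return json.dumps(request, sort_keys=True)

# e.g., a videoconferencing application asking for jitter-sensitive handling
msg = build_service_request(
    "video-conf-01", "qos-optimization", {"max_jitter_ms": 30}
)
```

The significant point is not the schema but the direction of the interaction: the application declares what it needs, and the controller translates that into device-level configuration.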

Given the importance of application delivery, IT organizations need a plan for how they will ensure acceptable levels of it.  That plan has to be able to respond to the existing challenges associated with mobile workers, cloud computing, and demanding applications such as video and big data.  That plan, however, also has to be able to respond to the emerging challenges associated with an environment in which applications dynamically make requests for the types of services they need, and the infrastructure has to automatically respond to those requests.

Image credit: Jacob Bøtter (flickr)

About the author
Jim Metzler
Jim has a broad background in the IT industry. This includes serving as a software engineer, an engineering manager for high-speed data services for a major network service provider, a product manager for network hardware, a network manager at two Fortune 500 companies, and the principal of a consulting organization. In addition, Jim has created software tools for designing customer networks for a major network service provider and directed and performed market research at a major industry analyst firm. Jim’s current interests include both cloud networking and application and service delivery. Jim has a Ph.D. in Mathematics from Boston University.