What’s The Real Measure Of An MSP?

There’s always been a certain yin-versus-yang tension in the decision of whether to do IT in-house or offload the capabilities to an outsourcer or cloud provider. Personally, when I ran IT departments I preferred the “do everything in-house” model, but I also realize IT was a lot simpler then.

However, like most things in life, IT doesn’t need to be an all-or-nothing proposition. Managed services provide a good balance between total outsourcing and do-it-yourself. The infrastructure stays on premises, the IT staff remains in place, and a third-party group comes in and runs the infrastructure based on a set of best practices.

But the world of IT is changing, and nowhere is that more apparent than in the data center. Traditionally, managing data center resources means the MSP runs the customer’s existing workloads under service level agreements (SLAs) measured on things like uptime, network performance, and mean time to repair.

Are these the right things for MSPs to be measured on today, though? The whole face of the data center has changed. Instead of being measured on whether it maintained the promised uptime, or fixed a problem quickly in a highly redundant data center, the MSP should be measured on metrics that reflect where data centers are today: factors such as how quickly a virtual machine can be provisioned, or how fast VM, storage, and network resources migrate when a workload moves.
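To make the contrast concrete, here’s a rough sketch in Python of how legacy and next-generation SLA metrics might be computed. The function names and numbers are purely illustrative assumptions, not any particular MSP’s reporting.

```python
from datetime import timedelta

def uptime_percentage(total_time: timedelta, downtime: timedelta) -> float:
    """Legacy SLA metric: percentage of the period the service was available."""
    return 100.0 * (1 - downtime / total_time)

def mean_time_to_repair(repairs: list) -> timedelta:
    """Legacy SLA metric: average time to restore service after a failure."""
    return sum(repairs, timedelta()) / len(repairs)

def mean_provisioning_time(provisions: list) -> timedelta:
    """Next-generation SLA metric: average time from request to a ready
    VM, storage volume, or network segment."""
    return sum(provisions, timedelta()) / len(provisions)

# A month with 20 minutes of downtime still reads as "four nines" uptime,
# yet provisioning that averages days tells the customer a different story.
print(round(uptime_percentage(timedelta(days=30), timedelta(minutes=20)), 3))  # 99.954
print(mean_provisioning_time([timedelta(days=2), timedelta(days=4)]))          # 3 days, 0:00:00
```

The point of the sketch is simply that a data center can look flawless on legacy availability metrics while still being slow at the things modern customers actually wait on.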

Here’s where the vision of the software-defined data center almost becomes a requirement for the MSP. Let’s say the MSP takes advantage of server virtualization and is able to rapidly provision or migrate virtual workloads in near real time. That’s great, but it only solves part of the problem for the customer. If a traditional network is still in place, provisioning network services can still take the 30-90 days it typically does today. Hardly the nimbleness customers would be looking for in a next-generation data center.

Software-defined networks solve this problem by providing the automation required to orchestrate the configuration of network, storage, and compute resources, bringing that rapid provisioning to the data center. I think this offers a great opportunity for managed service providers to grab some share from their competitors: implement a software-defined data center and give customers new SLAs with metrics that are meaningful to where the industry is going, rather than legacy SLAs that don’t mean much in today’s world.
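As a rough illustration of what that orchestration looks like, the sketch below wires compute, storage, and network provisioning into a single automated request. Every class and method name here is a hypothetical placeholder, not a real SDN controller API.

```python
from dataclasses import dataclass

@dataclass
class WorkloadRequest:
    name: str
    vcpus: int
    memory_gb: int
    storage_gb: int
    network_segment: str

class SoftwareDefinedDataCenter:
    """Illustrative orchestrator that drives compute, storage, and
    network controllers through software APIs."""

    def provision(self, req: WorkloadRequest) -> dict:
        vm = self._create_vm(req.name, req.vcpus, req.memory_gb)
        volume = self._create_volume(req.name, req.storage_gb)
        segment = self._create_network_segment(req.network_segment)
        # Because every step is an API call, the whole workflow can finish
        # in minutes instead of waiting out a 30-90 day manual change window.
        return {"vm": vm, "volume": volume, "network": segment}

    def _create_vm(self, name, vcpus, memory_gb):
        return f"vm/{name}({vcpus}vCPU,{memory_gb}GB)"

    def _create_volume(self, name, size_gb):
        return f"vol/{name}({size_gb}GB)"

    def _create_network_segment(self, segment):
        return f"net/{segment}"

sddc = SoftwareDefinedDataCenter()
print(sddc.provision(WorkloadRequest("web-tier", 4, 16, 200, "tenant-42")))
```

An MSP could then write SLAs against the end-to-end time of that workflow, which is exactly the kind of metric a software-defined data center makes measurable.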

I’ve maintained that the bulk of interest in software-defined networks and network virtualization has been from large web providers.  MSPs building new services around the concept of “software defined” could bring SDNs down-market to a larger group of companies.  I think the idea offers a great opportunity for managed service providers, but it does require a bit of boldness to set service level agreements that are out of the comfort zone of where the industry is today.  However, success often comes to those companies willing to move out of their comfort zone first.

Image credit: aussiegal (flickr)

About the author
Zeus Kerravala
Zeus Kerravala is the founder and principal analyst with ZK Research. He provides clients with a mix of tactical advice for the current business climate and long-term strategic advice. Kerravala provides research and advice to the following constituents: end-user IT and network managers; vendors of IT hardware, software, and services; and the financial community looking to invest in the companies he covers.