It’s hard to go anywhere today without being inundated with the cloud. I see billboards on Highway 101 between San Jose and San Francisco, there are cloud signs in airports, and it seems every TV show has some kind of cloud commercial embedded in it. This has put tremendous pressure on IT leaders to quickly develop a cloud strategy and start leveraging the cloud’s flexibility and transformative powers.
However, most IT leaders I’ve talked to struggle with what should go into the cloud and what should stay on-premises. While very small businesses may choose to use the cloud for everything, any company of significant size is likely to adopt a hybrid model in which public cloud services and a private cloud are both used.
Last week I attended the IT Roadmap event in Denver, where one of the speakers was Jamie Cutler, CIO of QEP Resources, who gave a presentation titled “Cloud Migration: Sending the Right Apps and Processes into the Cloud.” In it, Cutler walked through his company’s best practices for putting applications and data in the cloud.
First Steps and Next Steps
The first step in making the journey to the cloud is to understand what applications the organization already has running in the cloud. I don’t know of a single business that has a good handle on all of the cloud applications running in its environment today. Cutler used a service from a company called Skyhigh Networks to reveal them all, and the Skyhigh application discovered approximately 750 cloud services beyond what IT knew about. This is typical of other businesses that have gone through a similar process.
The discovery process uncovered a number of cloud-based workloads that the business would consider “non-core” and mainly departmental in nature. These cloud services include HR applications such as payroll and timekeeping, accounts payable, travel and expense, contract management, and non-critical IT infrastructure. These are fairly standardized applications that require very little in the way of customization. There was one application that QEP wound up bringing back in-house because the cloud provider could not make the changes needed to fit the business’s workflow in a timely manner.
Looking to the future, Cutler plans to migrate disaster recovery infrastructure and data analytics into the cloud. Both of these make sense to me. Analytics requires speed and agility, and cloud platforms can offer performance in both areas that exceeds what most organizations can achieve in-house. Disaster recovery is an area that few businesses do well. I’ve always joked that most organizations are fantastic at backing things up; it’s the recovery phase they struggle with.
Cutler also discussed which workloads would not go into the cloud. At QEP, any core system or business-critical data should remain on-premises in a private cloud. And based on his experience with the application that was brought back in-house, his recommendation was to avoid migrating applications that require a lot of development or flexibility. Cloud providers are commodity-based and cannot deliver the turnaround times on changes that businesses may require.
Additionally, while great strides have been made with respect to cloud security, most large organizations have better security practices than the majority of cloud providers, particularly the smaller ones.
To summarize Cutler’s talk, businesses should focus on putting capabilities in the cloud that:
- Need lean, fast delivery
- Do not require extensive customization or application integration
- Do not have extensive customer data or strong compliance requirements
- May be more consumer or business unit focused
- Do not have high levels of security requirements
Everything else should remain on-premises in a private cloud.
The cloud is certainly the next big frontier for IT. Cutler echoed my belief that the cloud is powerful and transformative, but not everything should go into it.