Feb 9, 2015
It’s prediction season for the year ahead, and one of the most popular forecasts this time around is that 2015 will be the year hybrid clouds become practical. This isn’t one of those forecasts that comes around time and again, though – like “this is the year of desktop Linux” – no, this one is pretty solid.
Some might even call it a no-brainer. That’s because if you are running IT for pretty much any organization, you are going to have a hard time NOT dealing with the cloud in one form or another. And unless you are one of the brave few willing to actually go all-cloud, that means figuring out ways for your on-premises services, storage and other resources to coexist happily with their remote cousins out in the cloud.
So as you start planning your hybrid cloud strategy, here are some questions you might want to ask, to make sure you get value overall while minimizing risk on the cloud side.
To truly be a hybrid cloud, the two sides need to do more than coexist: they need to interact in useful ways. Think of it as a kind of mash-up, like creating a unique new (and profitable!) service by layering your own data on Google’s maps. So while the most popular enterprise use for cloud storage today is probably off-site backup and/or archiving, that is about as hybrid as using a tape library. The two are simply performing different roles in the IT infrastructure – the cloud is just a cheap way to get stuff off-site.
One alternative is a cloud-connected local storage device that transparently migrates data between your local storage and the cloud, dynamically taking advantage of each technology’s strengths. Current and frequently used data is cached locally, increasingly often on solid-state storage. Quite a few companies offer appliances of this kind, though of course they approach the problem in different ways and have different strengths. They include Avere, Ctera, EMC, Nasuni, Panzura, and Microsoft with Azure StorSimple. Storage virtualization software such as DataCore’s SANsymphony can also achieve something similar.
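The tiering behaviour these appliances share can be sketched in a few lines. This is purely an illustrative toy under simple assumptions – least-recently-used eviction, a dictionary standing in for the cloud object store – not any vendor’s actual algorithm:

```python
from collections import OrderedDict

class TieredStore:
    """Toy sketch of cloud-tiered storage: hot blocks are cached
    locally, and the least recently used blocks are migrated out
    to a (here simulated) cloud tier."""

    def __init__(self, local_capacity):
        self.local = OrderedDict()  # block_id -> data, kept in LRU order
        self.cloud = {}             # stand-in for the cloud object store
        self.capacity = local_capacity

    def write(self, block_id, data):
        self.local[block_id] = data
        self.local.move_to_end(block_id)  # mark as most recently used
        self._evict_cold()

    def read(self, block_id):
        if block_id in self.local:        # local cache hit
            self.local.move_to_end(block_id)
            return self.local[block_id]
        data = self.cloud[block_id]       # cache miss: recall from cloud
        self.write(block_id, data)        # re-cache locally
        return data

    def _evict_cold(self):
        while len(self.local) > self.capacity:
            cold_id, data = self.local.popitem(last=False)  # coldest block
            self.cloud[cold_id] = data    # migrate it to the cloud tier
```

The point is that, unlike plain off-site backup, reads and writes keep flowing through the local tier, with the cloud absorbing whatever no longer fits.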
Giving your users access to an enterprise-grade file sync & share (FSS) service, whether on one of the commercial platforms or via a self-hosted or managed service such as AeroFS, Cornerstone or TeamDrive, is not a true hybrid cloud either. It can be a really good idea, as it meets an important need in a safe and compliant way, but unless it does more than just sync, all you are doing is duplicating your stuff in the cloud. Duplication leads to complication, fragmentation, waste and confusion.
For many organizations, the cloud storage element is the big problem. Trusting your company’s data to a third party carries all sorts of worries and potential risks. For a start, will it be stored securely and protected properly? Some cloud service providers assume that backups are the client’s responsibility. They might keep revisions and deletions for 30 days, say, but after that you’re on your own. And while consolidation allows a cloud service to implement far stronger protection than any individual client could on their own, catastrophes can still happen, as they did for some Salesforce customers back in 2013. A hybrid strategy can actually help here, if it allows you to run your own backups. Alternatively, there are now cloud services specifically designed to back up other cloud services, such as Asigra and Backupify.
If something does go wrong, or if your cloud-side provider collapses (as Nirvanix famously did) or gets hacked, or if it gets raided and has equipment taken away (whether by cops or robbers), can you get your data back? Because while you can outsource the hosting, you cannot outsource your legal and regulatory responsibility. Some vendors have seen an opportunity here. For instance, NetApp has a hybrid scheme called NetApp Private Storage, where you place a filer that you own in a colo facility near the cloud data center you plan to use for your cloud computing, and connect the two via a private low-latency link such as AWS Direct Connect. Of course, owning the filer also makes it easier to do replication and snapshots.
Everything – even Software-as-a-Service – has to physically live somewhere. Depending on the rules and laws that apply to your organization, it may need to live on servers and storage located in the same country or region as you. Even the big cloud hosts understand this need. However, Microsoft’s principled battle to stop a US court from seizing data held in its Irish data center shows that extraterritoriality is still a problem. That’s why in many countries you can now find local cloud providers, with no overseas presence that could be leaned on by foreign judges or politicians.
The cloud was sold to us as this great leveler – a tool that enabled almost everyone to compete on equal terms. The snag is, once you have outsourced an enterprise application, getting it back (whether because you have decided to in-source or to move to a different cloud service) can be tough, if not impossible. Even if you have enough local storage to pull the data back to and WAN optimization in place to speed the data transfers, how long will it take?
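To make that question concrete, here is a back-of-the-envelope calculation. The data set size, link speed and the 70% efficiency derating below are illustrative assumptions, not measurements from any real migration:

```python
def transfer_hours(data_tb, link_mbps, efficiency=0.7):
    """Rough time to pull a data set back over a WAN link.

    data_tb    -- data set size in (decimal) terabytes
    link_mbps  -- nominal link speed in megabits per second
    efficiency -- assumed usable fraction of the link after protocol
                  overhead and contention (an illustrative guess)
    """
    bits = data_tb * 1e12 * 8                        # terabytes -> bits
    seconds = bits / (link_mbps * 1e6 * efficiency)  # bits / effective rate
    return seconds / 3600

# Example: 50 TB over a dedicated 1 Gbps link at 70% efficiency
print(round(transfer_hours(50, 1000)))  # about 159 hours – nearly a week
```

Numbers like these are why repatriating tens of terabytes is a project, not an afternoon’s job – and why the data-mover services discussed below exist.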
If it is just data, there are quite a few data-mover services that will do all the donkey-work in the cloud, with no need for you to download and upload. The task is rather tougher for workloads though, partly because of incompatibility between cloud platforms, and the result is cloud vendor lock-in. The Open Data Center Alliance is working to improve cloud interoperability. Another route forward could be to Dockerize the application, putting it into a portable Docker container; then it is just data to move.
Image credit: Wikimedia Commons