Best practice #1: Scrutinize service-level agreements
Proceed with caution when negotiating a service-level agreement (SLA) with a cloud provider: read the SLA closely before committing.
"There are a few major providers offering SLAs that are very vague about things like guaranteed recovery and assured destruction of data," Beth Israel Deaconess Medical Center (BIDMC) storage architect Michael Passe said at a Storage Decisions session last year. "You want to look behind the wizard's curtain to see what is really there."
Lauren Whitehouse, a senior analyst at Milford, Mass.-based Enterprise Strategy Group (ESG), said data access is one area that bears close examination in an SLA.
"Generally, SLAs have to do with access to the service, not to data," she said. "Generally, the service has to be down more than 10 minutes before it's considered an outage, so two nine-minute outages in an hour don't count as an outage. If there's an outage of the service, they just adjust the bill -- that's the kind of game that gets played. You have to ask, 'What about access to data?'"
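The outage-counting game Whitehouse describes can be made concrete with a short sketch. The figures and the 10-minute threshold below are illustrative assumptions, not any real provider's terms:

```python
from datetime import timedelta

# Hypothetical log of service interruptions in one hour
outages = [timedelta(minutes=9), timedelta(minutes=9), timedelta(minutes=25)]

# Assumed SLA fine print: only interruptions longer than 10 minutes
# count as an "outage" eligible for a billing credit
SLA_THRESHOLD = timedelta(minutes=10)

# Downtime users actually experienced
actual_downtime = sum(outages, timedelta())

# Downtime the provider counts against its SLA
billable_downtime = sum((o for o in outages if o > SLA_THRESHOLD), timedelta())

print(actual_downtime)    # what users felt
print(billable_downtime)  # what the SLA credit covers
```

Here users lose 43 minutes of access, but only the single 25-minute interruption qualifies under the SLA; the two nine-minute outages cost the provider nothing.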
Best practice #2: Follow your business needs
Lantmännen, a collective owned by 40,000 Swedish farmers, saved more than $6 million in the first year after building an internal private cloud with EMC Corp. storage and Riverbed Technology WAFS devices, said Dennis Jansson, Lantmännen's chief security officer.
Jansson said users choose what type of application they need through a web interface, and each service has a fee, SLA and integrated enterprise security management application.
"We're able to actually follow business needs," Jansson said of the cloud. "It doesn't make decisions on applications the users need."
He called the cloud "an easier way to say consolidation, virtualization and standardization."
Best practice #3: Repurpose your own resources
Online advertising sales rep firm Gorilla Nation Media LLC built an external customer-facing cloud and an internal cloud for employees using servers it already owned. With cloud vendor ParaScale Inc.'s Hyper-scale Storage Cloud software, it turned those servers into an object-based clustered NAS system for unstructured data. Alex Godelman, vice president of technology at Gorilla Nation, said the cloud replaced a more expensive NAS setup.
"To grow the internal cloud, we just add more nodes," he said. "The design of the system is also very simple -- we just kind of use it. And it allows us to breathe some life into a huge existing investment, which means we created the system virtually for free."
Best practice #4: Prepare for the future
Even if you're not ready for the cloud now -- or the cloud's not ready for you -- start thinking about how it may help you down the road.
Charles Shepard, director of systems architecture at the MGM Mirage in Las Vegas, said he will consider an external private cloud when technology advances make it feasible.
"When Fibre Channel over Ethernet [FCoE] becomes completely adaptable and adopted over the next five years, and when it is completely standardized, that is the pathway to develop a full cloud outside our data center," he said. "If you have a big enough pipe, like 10 Gigabit Ethernet [10 GbE] or even 100 [Gigabit] Ethernet, you might be able to take a database and write from it to the cloud."
He said FCoE would be well suited to multitenancy, which is a crucial component of the cloud.
"It inherently subsegments networks for internal and external multitenant environments," he said.
Best practice #5: Beware of hidden costs
Cloud storage providers will quote a basic cost per gigabyte up front, which lets you estimate a monthly bill based on how much data you need to store. But that base rate is only part of the picture: providers may also charge extra for data transfers, metadata functions, or copying and deleting files. And don't forget the cost of connecting to the cloud, perhaps with a T1 line.
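A back-of-the-envelope model shows how the extras can rival the advertised rate. Every rate and volume below is a hypothetical placeholder; substitute your provider's actual price sheet:

```python
# Assumed monthly usage (placeholders)
stored_gb = 5_000            # data at rest, in GB
transfer_gb = 800            # data moved in/out, in GB
request_ops = 2_000_000      # metadata/copy/delete operations

# Assumed rates (placeholders)
storage_rate = 0.15          # $/GB-month -- the advertised headline number
transfer_rate = 0.10         # $/GB transferred
request_rate = 0.01 / 1000   # $ per 1,000 operations
t1_line = 400.00             # fixed monthly connectivity cost (e.g., a T1 line)

base_cost = stored_gb * storage_rate
hidden_cost = (transfer_gb * transfer_rate
               + request_ops * request_rate
               + t1_line)

print(f"Advertised storage cost: ${base_cost:,.2f}/month")
print(f"Additional charges:      ${hidden_cost:,.2f}/month")
```

With these made-up numbers, transfer fees, per-operation charges and the network line add $500 a month on top of a $750 storage bill, which is exactly the gap a per-gigabyte quote hides.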