Capacity in the cloud

IT managers grapple with moving large stores of data into the cloud


Jeff Kubacki, CIO at Kroll, set a goal for the risk management consulting firm to reduce its storage costs by 25% over the next three years.

With some 13 petabytes of stored data to date, Kubacki plans to attack the problem with a mix of tiered storage, business process changes and newer options -- including cloud storage.

Though in its infancy, cloud storage seems like an attractive option, with its elasticity, utility-like billing, multiple storage locations and ability to pull data directly from the storage device. But the cloud is still uncharted territory when it comes to sending large chunks of data through the ether.

"Cloud is one of those things that we've been talking to our vendors about to see when it might make sense for us to put our toe in the water," Kubacki says. "We're still just figuring out if it's going to be right for us."

Kroll's IT architects will be investigating ways to migrate about 25% of the firm's eligible data through its Internet "pipes" and into the cloud. (The majority of data, mostly legal discovery documents, is considered too sensitive to store in the cloud, Kubacki says.) While storage capacity in the cloud is expandable, the limited capacity of network connections to the cloud can create challenges for enterprises with multiple petabytes of data to move back and forth.

Enterprises are asking whether their pipes are big enough to transfer their stored data to the cloud, and often, the answer is no. "The latency is the big inhibitor for what you can use [cloud] storage for," says Adam Couture, an analyst at Gartner. "Right now, for enterprises, we see the [use restricted to] archiving, backup, maybe some collaboration."

But most cloud providers say there are easy ways around capacity issues when migrating data to the cloud -- starting with physically shipping the initial data set to the datacentre.

It's relatively easy to host and transfer large amounts of data from a day-to-day, user-level perspective, says Rob Walters, general manager of the Dallas office of cloud hosting company The Planet. But moving 20TB to 25TB of data in a chunk continues to daunt current systems. "The networks that we have [today] just aren't good at it. It's just a weak point right now, and everybody is looking at dealing with that," Walters says.

For enterprises, the "initial ingestion" of backup data to the cloud can be done by copying data to the cloud over a WAN or LAN link, but "that initial backup, depending on how much data you have on your server, could take weeks," Couture cautions.
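Couture's "weeks" estimate is easy to verify with back-of-envelope arithmetic. As an illustrative sketch (the data volume, link speed and 80% efficiency figure below are assumptions, not numbers from the article), moving a 20TB backup set over a dedicated 100Mbit/sec. link works out to roughly three weeks:

```python
def transfer_days(terabytes, megabits_per_sec, efficiency=0.8):
    """Rough wall-clock days to move `terabytes` over a link rated at
    `megabits_per_sec`, derated by `efficiency` for protocol overhead."""
    bits = terabytes * 1e12 * 8                    # decimal TB -> bits
    seconds = bits / (megabits_per_sec * 1e6 * efficiency)
    return seconds / 86400                         # seconds -> days

# Hypothetical example: 20TB over a 100Mbit/sec. link at 80% efficiency
print(round(transfer_days(20, 100), 1))  # ~23.1 days
```

The same arithmetic explains why providers favour shipping the initial data set on physical media and reserving the network link for smaller incremental transfers.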

Doctors' offices that hire Arvada, Colo.-based Nuvolus to create private cloud storage for their sensitive patient data don't like data to be copied and physically taken out of their offices, says Nuvolus CEO Kevin Ellis. So the company requires its health care industry clients to have "a decent Internet connection" -- typically 500Kbit/sec. -- to transfer the backup data over the pipes, Ellis says.

"Depending on the office, we could be looking at pretty long upload times," he says. "You're uploading overnight. We're trying to make sure we're not impacting the doctor's office during the day as well."

"Recommended For You"

AT&T gets on the cloud Proceed carefully with cloud storage, says Gartner